using raspistill to only display red pixels in Real Time
Is there a way to set up Raspistill to display in real time only the Red Pixels on the 8mp Sony camera?
Re: using raspistill to only display red pixels in Real Time
Tempted to ask - define RED, as it's a range of temperatures.
Not sure if you can get anything from manipulating the U/V channels with --colfx (e.g. --colfx 128:128 would give you B&W), but I would be tempted to use a physical red filter in front of the lens and change the contrast / saturation.
Note the output will be dependent on the ambient light levels and temperature...
More command line options are at https://www.raspberrypi.org/documentati ... /camera.md

Re: using raspistill to only display red pixels in Real Time
No, there's no simple way to achieve that. Getting the equivalent output from all the green pixels would be achievable as you can control the red and blue gains in the white balance block. Do be aware that demosaicing will still be interpolating green values for the red and blue bayer pixels.
If using raspiraw or the v4l2 drivers such that you get the raw data and can tweak the register set, then it would be possible to read out just the red pixels by setting the skip registers appropriately. You've then got to deal with exposure and gain control yourself, and handle the funny 10bit packing that is used on the csi2 bus.
Swings and roundabouts. It depends on what your use case is.
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.
Re: using raspistill to only display red pixels in Real Time
If you want to go the raspiraw route @6by9 mentioned, you can find running code to extract a 320x240 blue or green sub-image of a 640x480 raw Bayer frame in this posting:
https://www.raspberrypi.org/forums/view ... 1#p1218763
You can see the blue/green sub-image on the left/right display here:

I wanted to do what you want (take just the red pixels) from raspividyuv (raspistill gives a single frame only, but you want "in real time").
You can pipe the raspividyuv output through the filter described below into a yuv display program to achieve what you want.
I used the YUV2RGB and RGB2YUV formulas at the top of this page:
https://www.fourcc.org/fccyvrgb.php
I determined R from Y and V and set G and B to 0, then calculated back to YUV.
These are the formulas for the red filter (new values on the left, computed from the old Y and V):
Code: Select all
Y = 0.3×Y + 0.41×V - 41.28
V = 0.51×Y + 0.7×V + 30.143
U = -0.172×Y - 0.236×V + 161
You have to clip all values to the 0-255 range.
Y has dimension W×H, V and U have dimension (W/2)×(H/2).
All values are 8bit.
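To make the per-pixel work concrete, here is a minimal C sketch (illustration only, not code from any repo; function and variable names are made up) that applies the formulas above in place to one I420 frame:
Code: Select all
/* Illustration only: apply the red-filter formulas to one I420 frame in place.
 * Layout: w*h Y bytes, then (w/2)*(h/2) U bytes, then (w/2)*(h/2) V bytes (w, h even). */
#include <stdint.h>
#include <stddef.h>

static uint8_t clip(double d) { return d < 0 ? 0 : d > 255 ? 255 : (uint8_t)d; }

void red_filter_i420(uint8_t *frame, int w, int h)
{
    uint8_t *Y = frame;
    uint8_t *U = Y + (size_t)w * h;
    uint8_t *V = U + (size_t)(w / 2) * (h / 2);

    for (int cy = 0; cy < h / 2; cy++) {
        for (int cx = 0; cx < w / 2; cx++) {
            double v = V[cy * (w / 2) + cx];   /* old V, shared by a 2x2 block of Y samples */
            double ysum = 0.0;

            /* new Y from old Y and old V; keep the old Y average for the chroma update */
            for (int dy = 0; dy < 2; dy++)
                for (int dx = 0; dx < 2; dx++) {
                    uint8_t *py = &Y[(2 * cy + dy) * w + (2 * cx + dx)];
                    double y = *py;
                    ysum += y;
                    *py = clip(0.3 * y + 0.41 * v - 41.28);
                }

            double y = ysum / 4.0;             /* representative old Y for this chroma sample */
            V[cy * (w / 2) + cx] = clip(0.51 * y + 0.7 * v + 30.143);
            U[cy * (w / 2) + cx] = clip(-0.172 * y - 0.236 * v + 161.0);
        }
    }
}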
Re: using raspistill to only display red pixels in Real Time
Thanks all. I’m working with some dyes that fluoresce at 633 nm and wanted to be able to isolate that general “red” color from the image. It would be useful to blend the other colors, so I want to avoid a physical color filter. I am encouraged and will try some of the helpful examples. Thanks HermannSW for the links!
Re: using raspistill to only display red pixels in Real Time
I wanted to create a gstreamer plugin and realized that the plugin writer's guide is broken (at least for Raspbian); here is how to fix that:
https://twitter.com/HermannSW/status/11 ... 3488919552
Without the three lines added the newly created plugin did not compile.
Adding all three defines with "none" resulted in the plugin not being listed by gst-inspect-1.0.
It turned out that the number of blacklisted plugins had increased by 1.
From a Google search I learned that "gst-inspect-1.0 src/.libs/libredfilter.so" reveals the problem:
Code: Select all
appears to be a GStreamer plugin, but it failed to initialize
Further search showed that this command reveals the detailed reason:
Code: Select all
$ GST_DEBUG=4 gst-inspect-1.0 src/.libs/libredfilter.so
And the reason was:
Code: Select all
...
0:00:00.134567821 17366 0x1cb2e00 WARN GST_PLUGIN_LOADING gstplugin.c:500:gst_plugin_register_func: plugin "/usr/local/lib/gstreamer-1.0/libredfilter.so" has invalid license "none", not loading
...
Using a valid license for GST_LICENSE resolved the issue.
P.S:
I opened a gitlab issue on this:
https://gitlab.freedesktop.org/gstreame ... e/issues/5
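For reference, a minimal sketch of what such defines can look like (example values only, not taken from the repo; GST_LICENSE must be one of the license strings GStreamer accepts, such as "LGPL" or "GPL" - the string "none" gets the plugin blacklisted, as the log above shows):
Code: Select all
/* Example only - in the template these values normally come from config.h generated by autotools. */
#ifndef GST_PACKAGE_NAME
#define GST_PACKAGE_NAME "GStreamer template Plug-ins"
#endif
#ifndef GST_PACKAGE_ORIGIN
#define GST_PACKAGE_ORIGIN "https://gstreamer.freedesktop.org"
#endif
#ifndef GST_LICENSE
#define GST_LICENSE "LGPL"   /* a valid license string, not "none" */
#endif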
Re: using raspistill to only display red pixels in Real Time
I went a step further; until now my projects were on github.com.
Today I registered on gitlab.freedesktop.org and forked the gst-template repo:
https://gitlab.freedesktop.org/HermannSW/gst-template
I had to learn to deal with gitlab, working with git over https and oauth2 for now. The Github README.md is the gitlab README, and there are some more differences. I did the initial commit of the fork, then committed a temporary fix for issue 5, added the redfilter plug-in directory tree and finally committed a description of what redfilter is to the top level README:
https://gitlab.freedesktop.org/HermannS ... ccbd2018c0

This is work in progress; currently redfilter only does the demo code from gst-plugin.
My Raspberry is not converted to use English yet, but the "I'm plugged ..." messages from the redfilter plug-in can be seen:
Code: Select all
$ gst-launch-1.0 -v -m fakesrc ! redfilter ! fakesink silent=TRUE
Leitung wird auf PAUSIERT gesetzt ...
...
new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING;
Leitung ist vorgelaufen …
Nachricht #24 wurde von Element »pipeline0« (async-done) erhalten: GstMessageAsyncDone, running-time=(guint64)18446744073709551615;
Leitung wird auf ABSPIELEN gesetzt ...
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
Nachricht #26 wurde von Element »pipeline0« (new-clock) erhalten: I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
GstMessageNewClock, clock=(GstClock)"\(GstSystemClock\)\ GstSystemClock";
New clock: GstSystemClock
...
P.S:
I found an online tool that displays the color for a provided wavelength:
https://academo.org/demos/wavelength-to ... ationship/
633nm is this color (█ █ █ █ █ █ █ █ █ █ ), and the website stated:
Helium-neon lasers emit at 632.8nm, which is a bright red.
Re: using raspistill to only display red pixels in Real Time
Just realized that I need a raspividyuv gstreamer pipeline for the yuv redfilter.
It should be possible to determine the frame format in the redfilter plugin, and make redfilter work for 32bit rgba as well.
It was not that easy, until I stumbled over the rawvideoparse element:
https://gstreamer.freedesktop.org/docum ... parse.html
The 2nd example pipeline (with slight changes) works like a charm:
Code: Select all
raspividyuv -t 0 -w 640 -h 480 -fps 90 -n -o - | gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=90/1" ! rawvideoparse use-sink-caps=true ! videoconvert ! fbdevsink device="/dev/fb0"

Later I replaced
Code: Select all
... ! fbdevsink device="/dev/fb0"
by
Code: Select all
... ! fpsdisplaysink text-overlay=false video-sink='fbdevsink device="/dev/fb0"'
and measured that 52fps really were achieved from raspividyuv to fbdevsink:
Code: Select all
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1205, dropped: 0, current: 52,22, average: 52,43
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1232, dropped: 0, current: 52,41, average: 52,43
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1259, dropped: 0, current: 52,82, average: 52,44
Re: using raspistill to only display red pixels in Real Time
Another preparation for redfilter testing:
Visible spectrum animation
https://jsfiddle.net/HermannSW/pao7qnym/5/
https://youtu.be/5zTBKHFLBlg
Re: using raspistill to only display red pixels in Real Time
I looked into appsrc/appsrc-launch5.c to see how to get access to the frame data.
This is the small diff (besides including <string.h> at the top) that allows seeing the 640*480*1.5=460800 byte buffer size, and getting READWRITE memory in data for manipulating the frame:
Code: Select all
@@ -236,8 +237,16 @@ gst_red_filter_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
   filter = GST_REDFILTER (parent);
-  if (filter->silent == FALSE)
+  if (filter->silent == FALSE) {
+    GstMapInfo info;
+
+    gst_buffer_map (buf, &info, GST_MAP_READWRITE);
+    g_print("buf(%d) ",info.size);
+    memset(info.data,64,640*100);
+    gst_buffer_unmap (buf, &info);
+
     g_print ("I'm plugged, therefore I'm in.\n");
+  }
Filling the first 100 lines with 25% luma (value 64; the 640x480 Y bytes are followed by 320x240 U bytes and 320x240 V bytes) can be seen in the screenshot, and was the simplest demo besides just making the area white (255) or black (0):

This was just to test how to modify frames in gstreamer with the help of GST_MAP_READWRITE mapping (and a later unmap), and it works nicely with a high framerate gstreamer pipeline (requested 90fps as yesterday, for sure a bit slower than the 52fps measured yesterday).
Next is to do the real YUV red filtering as described before and check that in.
P.S:
I just commented out the two g_print statements and measured the framerate achieved as described before.
The framerate drops from 52fps to 45fps, mostly because of map/unmap.
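For context, the complete chain function then looks roughly like this (a sketch reconstructed from the gst-template boilerplate plus the diff above; struct and field names are assumptions, not a copy of the repo code):
Code: Select all
/* Sketch only: map the incoming buffer read/write, overwrite the first 100 Y rows,
 * unmap, and push the (modified) buffer downstream as the template does. */
static GstFlowReturn
gst_red_filter_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
  GstRedfilter *filter = GST_REDFILTER (parent);

  if (filter->silent == FALSE) {
    GstMapInfo info;

    if (gst_buffer_map (buf, &info, GST_MAP_READWRITE)) {
      g_print ("buf(%" G_GSIZE_FORMAT ") ", info.size);   /* 640*480*1.5 = 460800 for I420 */
      memset (info.data, 64, 640 * 100);                  /* first 100 rows of the Y plane -> 25% luma */
      gst_buffer_unmap (buf, &info);
    }
    g_print ("I'm plugged, therefore I'm in.\n");
  }

  /* just push out the incoming buffer without touching it further */
  return gst_pad_push (filter->srcpad, buf);
}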
Re: using raspistill to only display red pixels in Real Time
The redfilter works with this commit:
https://gitlab.freedesktop.org/HermannS ... 8a999ab41d
But there is a bug in updating the gstreamer output.
The frames get processed at 90fps without any difficulty (nanosecond timestamps, deltas of >11 million ns):
Code: Select all
...
640x480(I420) 11188888777
640x480(I420) 11199999888
640x480(I420) 11211110999
640x480(I420) 11222222110
640x480(I420) 11233333221
640x480(I420) 11244444332
640x480(I420) 11255555443
This is how the pipeline got started, together with a screenshot of redfilter at work:
Code: Select all
$ raspividyuv -t 0 -md 7 -w 640 -h 480 -fps 90 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=90/1" ! rawvideoparse use-sink-caps=true ! redfilter ! videoconvert ! fbdevsink device="/dev/fb0"

Next step is to find out what I did wrong with the buffer modification (perhaps a missing signal?) and fix it.
Then enable redfilter for the rgba video format as well, and compare the performance of both gstreamer pipelines end to end.
Re: using raspistill to only display red pixels in Real Time
I just synced redfilter on Ubuntu 18.04.2 and built it.
One warning on an assignment used as truth value, and one warning on using %lld for output of the buf->pts timestamp (no warnings on the Raspbian I used for development), but the plugin gets built and works (compiler warnings fixed with this commit).
I have no camera at the laptop, so I used videotestsrc and "video/x-raw,format=I420" to get YUV as input for redfilter. Unlike on Raspbian the display gets updated immediately and fast, as can be seen with the videotestsrc default snow pattern:
Code: Select all
$ gst-launch-1.0 videotestsrc ! video/x-raw,format=I420 ! redfilter ! autovideosink

This seems to confirm the assumption of a missing signal for the raspividyuv gstreamer pipeline on Raspbian with redfilter ...
Just for comparison, a former screenshot of the "normal" snow videotestsrc display:

Re: using raspistill to only display red pixels in Real Time
It was no bug in the redfilter gstreamer plugin, it was just total overload of the gstreamer pipeline.
No change was needed to get a "real time" 28fps display with this command (60fps mode 7 raspividyuv, gstreamer specifies 30fps):
Code: Select all
$ raspividyuv -t 0 -md 7 -w 640 -h 480 -fps 60 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=30/1" ! rawvideoparse use-sink-caps=true ! redfilter ! videoconvert ! fpsdisplaysink text-overlay=false video-sink='fbdevsink device="/dev/fb0"'
I uploaded a smartphone video to youtube in order to show that all is smooth now:
https://www.youtube.com/watch?v=LBLRBYL ... e=youtu.be

Re: using raspistill to only display red pixels in Real Time
No need for that, I just learned (again) that raspivid encoding is I420 as well.
With raspivid, use of h264parse and (hardware accelerated) omxh264dec is needed; besides that redfilter just works. With mode 7's minimal framerate of 60fps the gstreamer pipeline gets overloaded again. Using v2 mode 6 (1280x720) and scaling to 640x480 (done by the GPU, options -w 640 -h 480) does overload the gstreamer pipeline at framerate 30fps, but works at framerate 20fps. This is a 30fps raspivid redfilter pipeline that works; the frame is just 50% of mode 6 linearly (640x360), which is 25% smaller than 640x480. This is the command that gives a 30fps "real time" redfilter video display:
Code: Select all
$ raspivid -t 0 -md 6 -w 640 -h 360 -fps 30 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! h264parse ! omxh264dec ! redfilter ! videoconvert ! fpsdisplaysink text-overlay=false video-sink='fbdevsink device="/dev/fb0"'
Although omxh264dec is hardware accelerated, "... ! h264parse ! omxh264dec ! ..." introduces a small latency.
Summary:
Either framesize or framerate has to be reduced a bit for raspivid to achieve the same performance as raspividyuv with redfilter.
In addition the raspivid redfilter pipeline has higher latency than the raspividyuv pipeline.
Therefore the raspividyuv redfilter pipeline is preferable.
SanDiegoPiGuy wrote: ↑Sun Jul 28, 2019 2:24 pm
    Thanks all. I'm working with some dyes that fluoresce at 633 nm and wanted to be able to isolate that general "red" color from the image. It would be useful to blend the other colors, so I want to avoid a physical color filter.
Can you please try the redfilter (git clone my gitlab repo, build and install with "cd gst-template/redfilter; ./autogen.sh; make; sudo make install", then use the raspividyuv redfilter pipeline from the previous posting)?
633nm wavelength corresponds to color #ff4200; does redfilter achieve what you are interested in?
Re: using raspistill to only display red pixels in Real Time
Recently I determined the times for capturing a 640x480 frame for both Raspberry cameras with the help of hardware camera sync pulses:
https://www.raspberrypi.org/forums/view ... 7#p1516097
Capturing a frame takes 4.7ms for the v2 camera, and 10.21ms for the v1 camera.
Today I wanted to look deeper into why only 30fps were achievable for the redfilter gstreamer pipeline with the v1 camera.
With this commit, the milliseconds used for just the redfilter processing get reported for each frame:
https://gitlab.freedesktop.org/HermannS ... 279946465c
I used the same raspividyuv pipeline as before, only "framerate=31/1" was used this time for best results:
Code: Select all
$ raspividyuv -t 0 -md 7 -w 640 -h 480 -fps 60 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=31/1" ! rawvideoparse use-sink-caps=true ! redfilter ! videoconvert ! fpsdisplaysink text-overlay=false video-sink='fbdevsink device="/dev/fb0"'
During the run the gstreamer pipeline consumed 100% CPU (a full core), and "vcgencmd measure_clock arm" showed 1.4GHz.
The v1 camera was able to achieve slightly more than 31fps on average, and the redfilter processing time on the Pi3A+ was 14.3ms:
Code: Select all
...
640x480(I420) 4064516064 14,27ms
640x480(I420) 4096774128 14,34ms
640x480(I420) 4129032192 14,29ms
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 129, dropped: 0, current: 31,00, average: 31,23
640x480(I420) 4161290256 14,30ms
640x480(I420) 4193548320 14,47ms
640x480(I420) 4225806384 14,24ms
...
The same command with a change to "framerate=32/1" for the v2 camera resulted in slightly more than 32fps on average, with the same redfilter processing time (same computations):
Code: Select all
640x480(I420) 23500000000 14,26ms
640x480(I420) 23531250000 14,27ms
640x480(I420) 23562500000 14,27ms
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 755, dropped: 0, current: 31,98, average: 32,04
640x480(I420) 23593750000 14,21ms
640x480(I420) 23625000000 14,20ms
640x480(I420) 23656250000 14,23ms
...
For the v2 camera adding frame capture time plus redfilter time (ignoring all other processing) gives 19ms, so 52.6fps is an upper bound on what might be doable for capturing with redfilter. For the v1 camera that addition gives 24.51ms (40.8fps).
The gst-template Makefile compiles optimized with -O2, but I implemented the helper functions functionally as needed, not looking for best runtime. Inlined functions or macros might help, as well as using "cheaper" datatypes for the computations.
Re: using raspistill to only display red pixels in Real Time
I used "... ! redfilter ! fakesink" in the posting below to determine what framerate can be achieved by redfilter without videoconvert and output to fbdevsink:
https://www.raspberrypi.org/forums/view ... 6#p1519206
There was a problem: the (gstreamer reported) timestamps suggested that the requested 200fps framerate was possible, but 200*14.3ms=2.86s of processing does not fit into 1 second.
With today's commit, real time timestamps at nanosecond resolution get output as well:
https://gitlab.freedesktop.org/HermannS ... 69b5ff0183
This allows seeing whether a given requested framerate can really be achieved or not.
It turned out that the Pi3A+ with fakesink output allows for a 54fps framerate; as you can see, the gstreamer buf->pts timestamps are bigger than the real timestamps:
Code: Select all
pi@raspberrypi3Aplus:~/gst-template/redfilter $ raspividyuv -t 0 -md 7 -w 640 -h 480 -fps 54 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=54/1" ! rawvideoparse use-sink-caps=true ! redfilter ! fakesink > out
Code: Select all
...
redfilter: 640x480(I420) buf->pts=4351851730ns 14,29ms real=4336337649ns
redfilter: 640x480(I420) buf->pts=4370370248ns 14,32ms real=4354555036ns
redfilter: 640x480(I420) buf->pts=4388888766ns 14,31ms real=4371308655ns
redfilter: 640x480(I420) buf->pts=4407407284ns 14,31ms real=4391392665ns
redfilter: 640x480(I420) buf->pts=4425925802ns 14,34ms real=4407780042ns
redfilter: 640x480(I420) buf->pts=4444444320ns 14,31ms real=4429087617ns
redfilter: 640x480(I420) buf->pts=4462962838ns 14,31ms real=4446170864ns
redfilter: 640x480(I420) buf->pts=4481481356ns 14,30ms real=4465258855ns
handling interrupt.
...
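The real timestamps could be obtained, for example, like this (a sketch only, the actual commit may differ; it assumes the usual gst/glib headers are already included by the plugin source):
Code: Select all
/* Sketch: a monotonic "real" nanosecond timestamp to print next to buf->pts. */
#include <time.h>

static guint64
real_ns (void)
{
  struct timespec ts;
  clock_gettime (CLOCK_MONOTONIC, &ts);
  return (guint64) ts.tv_sec * G_GUINT64_CONSTANT (1000000000) + ts.tv_nsec;
}

/* in the chain function, e.g.:
 * g_print ("buf->pts=%" G_GUINT64_FORMAT "ns real=%" G_GUINT64_FORMAT "ns\n",
 *          GST_BUFFER_PTS (buf), real_ns ());
 */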
P.S:
In total that is 54*14.3ms=772ms of gstreamer buffer processing per 1 second, or 77.2%.
P.P.S:
The difference of buf->pts and real timestamp grows the longer the pipeline runs, but slightly erratically
(horizontal: frame number, vertical: difference in nanoseconds):

Re: using raspistill to only display red pixels in Real Time
HermannSW, you are amazing! Thank you so much. I will give you some feedback shortly.
Re: using raspistill to only display red pixels in Real Time
Hi HermannSW:
HermannSW wrote: ↑Mon Aug 05, 2019 2:42 pm
    Can you please try the redfilter (git clone my gitlab repo, build and install with "cd gst-template/redfilter; ./autogen.sh; make; sudo make install", then use the raspividyuv redfilter pipeline from the previous posting)?
    633nm wavelength corresponds to color #ff4200; does redfilter achieve what you are interested in?
Thank you so much! We ran your 'redfilter' on our Raspbian Stretch OS and it actually worked. The only concern is that there is too much latency (delay) in the image itself. Is that expected or does the 'redfilter' need more tuning?
Re: using raspistill to only display red pixels in Real Time
Hi HermannSW:
I work with 'SanDiegoPiGuy' and we were able to implement the red filter feature in OpenCV.
OpenCV is very powerful with color detection and processing.
Thanks again for all your help.
Re: using raspistill to only display red pixels in Real Time
Hi,
regardless of whether you do the redfilter in a gstreamer plugin or in OpenCV, the amount of per-pixel work is big.
In this posting I explained the per-pixel code in detail:
https://www.raspberrypi.org/forums/view ... 0#p1519206
In that posting's P.S I calculated the time per pixel as 65 CPU cycles:
Average computation time per pixel is 14.3ms/(640*480)=46.6ns (or 65 clock cycles at 1.4GHz).
So yes, there is a latency addition of 14.3ms for using redfilter with a 640x480 video stream.
But I doubt that you will be able to do much about it when doing the redfilter in C code without help of the GPU.
And @6by9 stated in the 3rd comment that for a red filter (unlike a green filter) you cannot do much about that:
https://www.raspberrypi.org/forums/view ... 4#p1508965
No, there's no simple way to achieve that.
You may be able to reduce the work per pixel from 65 CPU cycles a bit with more tuning.
But by the nature of the YUV2RGB and RGB2YUV computation used, with r=R and b=g=0, there will always be a millisecond-range latency for mode 7 video.
P.S:
You can probably reduce the cycles per pixel a bit by speeding up these functions, e.g. by getting rid of the gdouble computations somehow:
https://gitlab.freedesktop.org/HermannS ... ter.c#L232
Code: Select all
gdouble clamp(gdouble d) {
if (d<0.0) { d=0.0; }
else if (d>255.0) { d=255.0; }
return d;
}
gdouble YUV2R(gdouble y, gdouble v) { return clamp(1.164*(y-16)+1.596*(v-128));}
gdouble R2Y(gdouble r) { return 0.257*r+16;}
gdouble R2V(gdouble r) { return 0.439*r+128;}
gdouble R2U(gdouble r) { return 128-0.148*r;}
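One possible direction (an untested sketch, not from the repo): replace the gdouble math with integer fixed-point math, e.g. with the coefficients scaled by 256:
Code: Select all
/* Untested sketch: the same formulas with coefficients scaled by 256 (8.8 fixed point).
 * 1.164*256~298, 1.596*256~409, 0.257*256~66, 0.439*256~112, 0.148*256~38 */
static inline guint8 clamp_u8 (gint v) { return v < 0 ? 0 : v > 255 ? 255 : (guint8) v; }

static inline gint yuv2r_fx (gint y, gint v)
{
  gint r = (298 * (y - 16) + 409 * (v - 128)) / 256;
  return r < 0 ? 0 : r > 255 ? 255 : r;
}
static inline guint8 r2y_fx (gint r) { return clamp_u8 ((66  * r) / 256 + 16); }
static inline guint8 r2v_fx (gint r) { return clamp_u8 ((112 * r) / 256 + 128); }
static inline guint8 r2u_fx (gint r) { return clamp_u8 (128 - (38 * r) / 256); }
Whether that is measurably faster than the -O2 compiled gdouble code on the Pi would have to be measured.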
Re: using raspistill to only display red pixels in Real Time
Thank you so much for bringing this up.
So far I created redfilter based on the YUV input requirement from the original poster.
But as we have seen, doing the red filter on YUV data needs quite some transformation work per pixel.
Yesterday I looked at the raspividyuv doc and found the option "--rgb, -rgb" that outputs RGB format instead of I420 (YUV).
I just committed making redfilter applicable to RGB format in addition to I420:
https://gitlab.freedesktop.org/HermannS ... 878edb42c5
This is how to execute the command:
Code: Select all
$ raspividyuv -rgb -t 0 -md 7 -w 640 -h 480 -fps 100 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=RGB, framerate=100/1" ! rawvideoparse use-sink-caps=true ! redfilter ! fakesink > /dev/shm/out
As you can see, the raspividyuv option -rgb says to use RGB format, and in the gstreamer pipeline we now see format="RGB". What you will notice as well is a much higher framerate for raspividyuv as well as in the gstreamer pipeline. The reason is that the redfilter can now be done in 1.39ms per frame, a more than 10× speedup! And that on an "only" 1.2GHz Raspberry 3B compared to the 1.4GHz Raspberry 3A+ before. 1.39ms per frame is less than 6 CPU cycles per 640x480 frame pixel(!):
Code: Select all
redfilter: 640x480(RGB) buf->pts=9570000000ns 1.37ms real=9550868926ns
redfilter: 640x480(RGB) buf->pts=9580000000ns 1.39ms real=9560937012ns
redfilter: 640x480(RGB) buf->pts=9590000000ns 1.36ms real=9570771400ns
redfilter: 640x480(RGB) buf->pts=9600000000ns 1.38ms real=9580576100ns
redfilter: 640x480(RGB) buf->pts=9610000000ns 1.37ms real=9590676687ns
redfilter: 640x480(RGB) buf->pts=9620000000ns 1.36ms real=9600544199ns
redfilter: 640x480(RGB) buf->pts=9630000000ns 1.36ms real=9610577389ns
redfilter: 640x480(RGB) buf->pts=9640000000ns 1.37ms real=9620691361ns
redfilter: 640x480(RGB) buf->pts=9650000000ns 1.36ms real=9630505957ns
redfilter: 640x480(RGB) buf->pts=9660000000ns 1.37ms real=9640425657ns
handling interrupt.
The code for redfilter with RGB format is nearly trivial:
Code: Select all
} else if (strcmp(fmt,"RGB")==0) {
GstMapInfo info;
guint8 *p;
gst_buffer_map (buf, &info, GST_MAP_READWRITE);
assert(w*h*3 == info.size);
for(p=info.data+info.size; p>info.data; ) {
*--p=0; // b
*--p=0; // g
--p; // r
}
assert(p==info.data);
...
P.S:
Below you can see redfilter in action with output on fbdevsink.
Before, 31fps was possible with the v2 camera and redfilter; now it is 70fps(!) with this command:
Code: Select all
$ raspividyuv -rgb -t 0 -md 7 -w 640 -h 480 -fps 70 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=RGB, framerate=70/1" ! rawvideoparse use-sink-caps=true ! redfilter ! videoconvert ! fbdevsink device=/dev/fb0

(screenshot was taken with raspi2png)
P.P.S:
In this thread we have seen gstreamer pipelines dealing with YUV/I420 and RGB formats.
In this posting you can see how to deal with gstreamer GRAY8 format data via the raspividyuv "--luma" option (e.g. for robot control, when color is not needed):
https://www.raspberrypi.org/forums/view ... 5#p1522575
P.P.P.S:
In case a gstreamer plugin like redfilter is used for robot control only, it might be better to create an appsink instead of a plugin like redfilter, where a fakesink is needed afterwards just for cleanup.
This thread discussed how to create an appsrc that takes a raspiraw stream and sends it into a gstreamer pipeline, i.e. just the other way around:
https://www.raspberrypi.org/forums/view ... p?t=197124
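For completeness, a minimal appsink sketch (illustration only; the pipeline string and names are just an example modeled on the raspividyuv I420 pipeline above, not code from any repo) of how frames could be pulled into application code for robot control:
Code: Select all
/* Illustration only: pull I420 frames from a pipeline ending in an appsink. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "fdsrc ! queue ! video/x-raw,width=640,height=480,format=I420,framerate=30/1 "
      "! rawvideoparse use-sink-caps=true ! appsink name=sink", &err);
  if (!pipeline) { g_printerr ("parse error: %s\n", err->message); return 1; }

  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  for (;;) {
    GstSample *sample = gst_app_sink_pull_sample (GST_APP_SINK (sink));
    if (!sample) break;                       /* EOS */
    GstBuffer *buf = gst_sample_get_buffer (sample);
    GstMapInfo info;
    if (gst_buffer_map (buf, &info, GST_MAP_READ)) {
      /* info.data / info.size hold one 640x480 I420 frame; feed robot control here */
      gst_buffer_unmap (buf, &info);
    }
    gst_sample_unref (sample);
  }

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (sink);
  gst_object_unref (pipeline);
  return 0;
}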
Re: using raspistill to only display red pixels in Real Time
HermannSW wrote: ↑Tue Aug 20, 2019 9:20 pm
    Below you can see redfilter in action with output on fbdevsink.
I noticed that the RPi is now overheating (see the top right red thermometer). We made more experiments with OpenCV and it looks like this is the direction we are going to take.
Re: using raspistill to only display red pixels in Real Time
The thermal warning only happened because the Pi3B had no heatsinks, no fan, and was lying flat on the desktop.
Just making the Pi3B stand vertically is enough to get rid of the thermal warning.
    We made more experiments with OpenCV and it looks like this is the direction we are going to take.
That is OK.