User avatar
HermannSW
Posts: 4838
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Capturing with Arducam MIPI monochrome global shutter cameras

Tue Mar 10, 2020 6:56 pm

In 2018 I worked on getting a Raspberry v1 camera (clone) to do global external shutter capturing.
In the end, quite impressive global-shutter-type videos and frames were captured:
https://github.com/Hermann-SW/Raspberry ... al-shutter

Thread "Arducam cameras fitting Raspberry CSI-2 interface" showed that Arducam color and monochrome cameras, as well as the stereo hat, can be used via the Pi CSI-2 interface:
https://lb.raspberrypi.org/forums/viewt ... 7b79e54da0

In this thread only the monochrome global shutter cameras will be looked at.
In this posting I work with the ov9281-based 1MP 1280x800 monochrome global shutter MIPI camera.
The STROBE pin (when used as OUTPUT) provides a signal when the pixel array starts integration.
The Raspberry camera hardware sync pulses trigger a bit later, when the camera starts to send the frame to the Pi.
https://www.uctronics.com/arducam-ov928 ... amera.html
Image


In the previous work with the v1 camera it was essential that the very bright light (5000lm or more) was active only for a single-digit-microsecond duration, in order to achieve the global shutter effect. With a real global shutter camera like the ov9281 that is not necessary.

As a first experiment I set up the Pi-controlled 5000lm LED from the v1 camera work:
https://github.com/Hermann-SW/Raspberry ... al-shutter

This is the complete setup: ov9281 camera and fast-rotating mini drone propeller in the clamps of a soldering third hand, camera connected to the Pi4B CSI-2 interface below, GND/3V3/GPIO18 connected to the MOSFET's GND/VCC/SIG pins (16MP photo for detail view):
Image


I played with the Arducam opencvGui demo app. That app allows changing exposure in 200 steps over the range 0x0000-0xFFFF.
The only change I made was to replace 0xFFFF with 0x00C8, allowing the exposure register to be set to values in the 0..200 range:
https://github.com/ArduCAM/MIPI_Camera/ ... i.cpp#L183


After some experiments with slow and faster propeller speeds (the propeller motor is connected to a constant-voltage power supply) I settled on the normal 3.3V for the motor. A laser tachometer measured the propeller at 25100rpm for that voltage. That is quite fast: 44.7m/s or 160.9km/h tip speed at 34mm blade-tip diameter.

For the first experiment I did not use the 5000lm led, but a normal 1000lm light directly in front of the propeller, shining the whole time.
The minimal exposure at which a little bit of the rotating propeller can be seen was 5.
Even that very low exposure value is not fast enough; the propeller is not displayed sharply:
Image


Next I used the 5000lm led, shining only for a few seconds so it would not get too hot, allowing me to click the opencvGui demo "Snapshot" button to capture a single frame. Now the propeller is captured sharply. What I cannot explain yet is why the frame shows a very light white disk where the propeller rotates; I would have expected to see only the sharp blade:
Image


In the previous frame the 5000lm led can be seen mirrored in the plastic box in the background. I moved a cardboard box between the 5000lm led and the plastic box to keep light from behind the propeller from affecting the captured frame. There is a change, but a very light white disk can still be seen in addition to the sharp blade captured during the exposure=1 exposure time:
Image


Summary so far:
  • exposure=1 is fast enough for capturing 45m/s moving objects sharply
  • 5000lm led just in front of propeller is bright enough
  • bright light can shine the whole time, since the hardware exposure time is very short

Next I want to capture airsoft pistol pellets in flight.
A single pellet flies at 36m/s, so it will be captured sharply.
I think that exposure=1 is the time a single 1280-pixel row needs to be transferred to the Pi via CSI-2.
Assuming a 60fps capture rate, that means the exposure time was 1s/60/800=21µs.
In that time a pellet flies 21*0.036mm=0.76mm; the pellet has a diameter of 6mm.
When recording 1280x800@60fps video, there are 16.6ms between frames.
A pellet flies 60cm in that time, so shooting several pellets will likely capture pellet(s) in a 60fps video.
https://github.com/Hermann-SW/memrun
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Wed Apr 01, 2020 4:35 pm

Recently I captured video with the v1 front camera of raspcatbot driving backward at 1.14m/s along a tape line.
The scene was lighted with the 5000lm led from outside; shutter speed was 100µs:
https://www.raspberrypi.org/forums/view ... 0#p1633505

Even though the shutter time was really short, the frames are still shaky (right: with center camera):
Image


So I wanted to mount one of the monochrome global shutter cameras onto the robot and see the difference compared to a rolling shutter camera. Because I later want to do live frame processing for autonomous robot driving, I was interested in a low pixel count. So I decided to go with the smallest and cheapest ($25.99) ov7251 Arducam MIPI camera (the monochrome version of the ov7750 color sensor):
https://www.uctronics.com/arducam-ov725 ... amera.html

There was a raw_callback.c demo in the Arducam MIPI_Camera repo that just counted the frames recorded.
I enhanced that demo a bit, and just committed and pushed to my fork:
https://github.com/Hermann-SW/MIPI_Came ... fff641aa2a

First, "raw_callback" now creates a "frame.pts" (microsecond precision) timestamp file in the current directory, in the same format the raspivid "-pts" option writes.

Then, similar to the raspiraw high-framerate work, I utilized "/dev/shm" to optionally write out N raw frames as "/dev/shm/frame.%04d.raw10".

The arducamstill demo output reveals that the stored format is Y10P:

Code: Select all

...
Current mode: 0, width: 640, height: 480, pixelformat: Y10P, desc: (null)
...
https://www.kernel.org/doc/html/v4.19/m ... -y10p.html
This is a packed grey-scale image format with a depth of 10 bits per pixel. Every four consecutive pixels are packed into 5 bytes. Each of the first 4 bytes contains the 8 high order bits of a pixel, and the 5th byte contains the 2 least significant bits of each pixel, in the same order.
My Pi3A+ has 447756KB free in /dev/shm, which is sufficient to store 1190 640x480 frames (384000 bytes each).

I created a new netpbm-like tool, y10ptopgm.c, which converts a (640x480 only, as of now) Y10P frame to a portable gray map:
https://github.com/Hermann-SW/MIPI_Came ... 10ptopgm.c

Calling "raw_callback 500 1000" captures 640x480 frames and stores the first 1000 of them, taken with exposure 500, in /dev/shm. Timestamps of all frames captured in 10s get stored in frame.pts. With exposure 500 and below, the ov7251 captures 640x480 at a 198fps framerate, faster than the ov7251 datasheet specifies:
https://cdn.datasheetspdf.com/pdf-down/ ... Vision.pdf

Capturing with exposure 1000 results in a 100fps framerate, and experimenting shows that 620 is the lowest exposure with zero frameskips in a 10s recording (161fps framerate). I used the ptsanalyze tool for frame skip and delta analysis:
https://github.com/Hermann-SW/userland/ ... ptsanalyze

This is the output of a raw_callback execution:

Code: Select all

pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ ./raw_callback 500 1000
Open camera...
Hardware Platform: a020d3
Found sensor ov7251 at address 60
Setting the resolution...
Can't open the file
mmal: Failed to fix lens shading, use the default mode!
Current resolution is 640x480
Notice:You can use the list_format sample program to see the resolution and control supported by the camera.
Read 0x3662 value = 0x01
Start raw data callback...
Total frame count = 1850
TimeElapsed = 10.000090
Stop raw data callback...
Close camera...
pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ 

Here you can see the analysis of the generated "frame.pts" timestamp file: 6% frame skips, no frame delta bigger than 10.090ms:

Code: Select all

pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ time ( ~/ptsanalyze frame.pts 0 | grep -v "^>" )
creating tstamps.csv
1850 frames were captured at 198fps
frame delta time[us] distribution
      1 5038
      2 5039
      3 5040
      3 5041
     12 5042
     85 5043
    666 5044
    804 5045
    108 5046
     21 5047
      4 5048
      2 5049
      3 5050
      1 5051
      1 5052
      1 10083
      1 10084
      1 10085
      1 10086
      7 10087
     33 10088
     81 10089
      6 10090
after skip frame indices (middle column)
131 frame skips (6%)

real	0m16.868s
user	0m12.426s
sys	0m8.294s
pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ 

Last night I recorded frames with raw_callback while moving a finger quickly in front of the camera. The scene was lighted from 1m above with a 1000lm lamp. Lux (lumen/area) falls off quadratically with distance. I just bought 10W 1000lm leds because of that; they will be mounted on the raspcatbot front to light the scene and get brighter frames ("bring your own light"). This is an animation created from frames 0900-0999 of the capture, recorded at a 198fps framerate (1861 frames in 10 seconds) and played at 20fps, roughly 10× slower than real time (frames will get brighter with the robot-mounted 1000lm leds):
Image


Here are the two simple tools I used to create the animation from the 100 pgm files.

"togg" converts the pgm frames to png format using the netpbm tool pnmtopng, and then creates an ogg video with gstreamer:

Code: Select all

🍓 cat togg 
#!/bin/bash

for f in frame.09*pgm; do pnmtopng $f > $f.png; echo $f; done

echo "now creating .ogg"
gst-launch-1.0 multifilesrc location="frame.%04d.raw10.pgm.png" index=900 caps="image/png,framerate=\(fraction\)20/1" ! pngdec ! videorate ! videoconvert ! videorate ! theoraenc ! oggmux ! filesink location="x.ogg"
🍓 
"pngs2anim" creates an animated gif with nice colors (see doc link) at a 20fps framerate, scaled to 640 width (100%):

Code: Select all

🍓 cat pngs2anim 
#!/bin/bash

echo "now creating .anim.gif"
# doc:   http://blog.pkh.me/p/21-high-quality-gif-with-ffmpeg.html
# needs: ffmpeg

palette="/tmp/palette.png"

#filters="fps=15,scale=320:-1:flags=lanczos"
filters="fps=20,scale=640:-1:flags=lanczos"

ffmpeg -v warning -i x.ogg -vf "$filters,palettegen" -y $palette
ffmpeg -v warning -i x.ogg -i $palette -lavfi "$filters [x]; [x][1:v] paletteuse" -y x.anim.gif
🍓 

P.S:
If you look at my commit diff, you will see a new section commented out, with a link into the datasheet.
https://github.com/Hermann-SW/MIPI_Came ... fff641aa2a
Register 0x3662 allows choosing the raw8 format instead of raw10.
If that works, 5/4*1190=1487 frames would fit into /dev/shm.
With no loss in image quality, since y10ptopgm.c just discards the lowest two bits per pixel.
I will follow up with Lee from Arducam on that.

I was surprised that the values written to the exposure register seem to correspond to 10 microseconds each.
The values are multiples of t_row, the time it takes to transfer one frame row.
Unfortunately the datasheet mentions t_row in several places, but without any numbers.
With exposure 500 I saw a 198fps framerate, so t_row is 10.52µs:

Code: Select all

1,000,000/198/480=10.52µs

P.P.S:
Just realized that I will need an exposure of 20 for 200µs shutter time; bright light is needed.


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Thu Apr 02, 2020 7:20 am

In the raspcatbot thread I posted yesterday about the ov7251 monochrome global shutter camera mounted as front camera on the raspcatbot caterpillar robot. The animation from the 198fps frames was nice, but a bit dark. Just now I created the spreadpgm tool, which spreads the gray levels of a portable gray map to the full range. Details here; the animation (of the scene lighted with the 5000lm led from 50cm above), played at 20fps, looks nice:
https://www.raspberrypi.org/forums/view ... 8#p1636118
Image


I just closed the roller shutter to get a dark room and lighted the scene from 1m above with only a normal 1000lm lamp. That is 2*2*5=20 times less lux on the ground. The maximal gray level of the frame was only 7(!), but after applying the spreadpgm tool the details can be seen. With a 1000lm led mounted on the front of the robot, raspcatbot will be able to follow the 1.5cm-wide black line even in a dark room:
Image


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Thu Apr 02, 2020 8:02 pm

I heard back from Lee: directly setting register 0x3662 for the raw8 format does not work.
But I should use camera modes instead, and that worked.

With this commit raw_callback.c accepts the mode as an optional 3rd argument.
spreadpgm.c is adapted to deal with pgm files from all modes.
The new rawtopgm.c can convert a raw file of any mode to a portable grey map (it looks at the raw data size).
https://github.com/Hermann-SW/MIPI_Came ... 541b2a8e81

These are the 6 modes; the first two are 10bit, the others 8bit.
The framerates are cool, up to 353fps for 640x120.
All modes capture the full camera FoV.
Over 10 seconds there are now very few frameskips:
ov7251.modes.jpg
This is an execution with exposure 20 (200µs shutter time), capturing 300 frames into /dev/shm with mode 4:

Code: Select all

pi@raspberrypi08X:~/MIPI_Camera/RPI $ ./raw_callback 20 300 4
Open camera...
Hardware Platform: 9000c1
Found sensor ov7251 at address 60
Can't open the file
mmal: Failed to fix lens shading, use the default mode!
Current mode: 4, width: 640, height: 120, pixelformat: GREY, desc: (null)
Start raw data callback...
Total frame count = 3529
TimeElapsed = 10.000310
Stop raw data callback...
Close camera...
pi@raspberrypi08X:~/MIPI_Camera/RPI $ 
ptsanalyze shows a single frameskip:

Code: Select all

pi@raspberrypi08X:~/MIPI_Camera/RPI $ ~/ptsanalyze frame.pts 0
creating tstamps.csv
3529 frames were captured at 353fps
frame delta time[us] distribution
      2 2823
      1 2825
      1 2827
      3 2828
     53 2829
    543 2830
   2543 2831
    337 2832
     36 2833
      2 2834
      1 2835
      1 2837
      2 2839
      1 8492
after skip frame indices (middle column)
> 8492,5282517
1 frame skips (0%)
pi@raspberrypi08X:~/MIPI_Camera/RPI $ 

This is a sample mode5 frame captured earlier today.
320x240 with full FoV sounds promising for line following: only 1/4 of the pixels of VGA:
frame.0300.5.raw10.pgm.png


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Fri Apr 03, 2020 8:04 am

The tilted-camera 320x240 view is a good compromise: only an 8cm blind spot in front of the robot, but >1m forward view:
https://www.raspberrypi.org/forums/view ... 0#p1636740
raspcatbot.ov7251.fixed_tilt.yt.jpg

P.S:
2045 mode5 raw frames of a 10s recording (150MB) even fit into the Pi Zero W's small /dev/shm.


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Fri Apr 03, 2020 9:27 pm

In the room there are 4 bright lights, 3m high on the wall, that show the powerline frequency.
And a desk light that does not.

Below is an animation of 10 frames captured with
  • wall lights only (left)
  • wall lights and desk light (middle)
  • desk light only (right)
and normalized with spreadpgm.

As can be seen, the wall lights have to be turned off so that they have no visible effect.
The 320x240 animation was recorded at 205fps; the German powerline frequency is 50Hz:
Image


P.S:
I took one of the 10 dark frames, painted the only bright spot black, and then applied spreadpgm.
The wall lights are too weak, but interestingly some details can still be seen:
x.enh.pgm.png


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Sat Apr 04, 2020 1:22 pm

Last year I got some camera samples from Lee (CEO of Arducam) for testing.
While searching for a second ov7251 0.3MP monochrome global shutter camera for the faster 1500rpm raspcatbot, I found camera UC-621.
I looked up that model, and it turned out to be my only color global shutter camera (2MP!):
https://www.uctronics.com/camera-module ... -html.html
(I thought it was monochrome because last year I captured with mode 0 instead of mode 1.)

It has 6 modes as well, so I used raw_callback.c from my fork to determine the framerate and frameskip count for each mode. Not framerates as high as with the small modes of the ov7251 camera, but big color global shutter captures:
OG02B10.modes.jpg

I took a sample image with mode 1 for color; the camera has a 110° horizontal FoV:
Image


I will use the ov7251 camera on raspcatbot for highspeed autonomous driving, but the OG02B10 will be mounted the next time I let one of the raspcatbots drive outdoors under Wifi joystick control.


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Sun Apr 05, 2020 9:05 pm

For robot control based on live processing of ov7251 320x240@205fps frames, I wanted consistent light.
So I added a 6W led and powered it via an IRF520 MOSFET module from a 16.1V lipo (see photo), with PWM=192 on the module's SIG pin.
That gives the led the 12V it needs:
20200405_223901.small.jpg

Then I captured with "raw_callback 20 1600 5" from my fork (1600 mode5 frames into /dev/shm, with 200µs shutter time).
This is an animation created from frames 850-899, converted to pgm with rawtopgm, then enhanced by spreadpgm.
Seeing this effect I would normally say "mains powerline frequency", but everything is powered by a 4S 1300mAh 95C lipo.
Not sure whether the IRF520 module is responsible, or the Pi software PWM on GPIO4?!
Image


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Sun Apr 05, 2020 9:49 pm

It was a "Layer 8" error (the error sits in front of the screen) ;-)

I didn't know the default frequency for software PWM in the pigpio library:
http://abyz.me.uk/rpi/pigpio/pigs.html#PFS
[EDIT: a logic analyzer showed an 800Hz default PWM frequency]

I executed this command

Code: Select all

$ pigs pfs 4 100000
8000
$
which set the maximal PWM frequency (8000Hz with the default sample rate of 5µs).

Now there is no mysterious effect anymore in recording 320x240 frames at 205fps framerate:
Image


This is a good basis for making the robot do a full stop when the straight black line ends, by analyzing every frame, each <5ms apart (205fps framerate).
Even when the robot runs at 5m/s, it will move at most 5*5mm=2.5cm between frames.
I will compare raspcatbot brake efficiency next, while still having the cable car as a safety net.
The assumption is that setting both L298N motor direction pins to 0 (as currently done) gives the longest braking distance,
followed by setting both motor direction pins to 1.
The best (shortest) braking distance is expected from simply reversing both caterpillar gear motors for some time.


P.S:
Adding thresholding to the normalization is the basis for the autonomous driving code that follows.
A threshold of 31% of the maximal grey value (80/255 after normalization) works with the onboard bulb led:
frame.1174.raw.thr_80.pgm.png
See this posting for details, and for an animation of the thresholded view of the front ov7251 camera:
https://www.raspberrypi.org/forums/view ... 1#p1638841


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Sat Apr 11, 2020 12:10 pm

HermannSW wrote:
Tue Mar 10, 2020 6:56 pm
Next I want to capture airsoft pistol pellets in flight.
A single pellet does fly with 36m/s, so will be captured sharp.
I think that exposure=1 is the time that a single 1280 pixel row needs to be transferred to Pi via CSI-2.
Assuming capturing speed 60fps that means capturing time was 1s/60/800=21µs.
In that time pellet does fly 21*0.036mm=0.76mm, pellet has diameter 6mm.
Now recording 1280x800@60fps video, there is 16.6ms between frames.
Pellet does fly 60cm in that time, so shooting several pellets will likely capture pellet(s) in 60fps video.
I remembered that todo and used the Arducam OG02B10 2MP color global shutter camera today.
The Pi4B with the OG02B10 camera is on the right, the 5000lm led near the pellet flight path in the middle, and the pellet catcher on the left:
20200411_115227.15%.jpg

I captured mode0 (GREY) 1600x1300 frames at a 60fps framerate.
I know from previous global (external) shutter multiple-exposure captures with the Raspberry v1 camera that the pellet speed is 36m/s(!):
https://github.com/Hermann-SW/Raspberry ... bottom_led
I captured with exposure 20, which corresponds to roughly 210µs shutter time, knowing it would not result in a sharp image.
The 6mm ⌀ pellet, smeared over a 36m/s*210µs=7.56mm range, can be seen in flight just right of the window in this 400% scaled-up crop:
174.part.400%.png
This image shows two other important items:
  1. using the color global shutter camera with GREY is not good; better to use mode1 BA81 and demosaic in post-processing
  2. the house more than 30m away can be clearly seen in sunlight with 210µs shutter time, without the 5000lm led

I created an animation playing at 2fps, 30× slower than real time. The first frame with the capture of the pellet during its 36m/s flight is shown 4 times. The pellet catcher was not ideal; the pellet left it and fell down (much more slowly):
x.anim.gif


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Sun Apr 12, 2020 5:46 pm

HermannSW wrote:
Sat Apr 11, 2020 12:10 pm
2. the house more than 30m away can be clearly seen in sunlight with 210µs shutter time, without 5000lm led
I went out to the terrace, into sunlight, in order to capture a 36m/s pellet in flight without a 5000lm led.

This is my mobile setup: Pi3B+, powered from a 14.8V 1300mAh 95C lipo through an LM2596 stepdown converter at 5.0V; Arducam ov9281 1MP monochrome global shutter camera ($29.99, with 166° diagonal FoV) plugged into the Pi CSI-2 slot; $26 9" HDMI lcd; airsoft pistol and pellet catcher (16MP photo, right click for details):
Image


I recorded with raw_callback.c from my github fork, mode0 1280x800 frames at 60fps. raw_callback's 1st parameter is the exposure time, a multiple of t_row, the time to transfer a single row from the camera over CSI-2 to the Pi. t_row is <1,000,000µs/60/800=20.83µs. It turns out that exposure=1 works in direct sunlight! So the images below are all captured with 20.83µs shutter time. During that time the 6mm ⌀ pellet flies 36m/s*20.83µs=0.75mm. This is not perfect, but as you can see, the captured pellets all look sharp.
[t_row for the 0.3MP ov7251 monochrome global shutter camera is <1,000,000µs/204/480=10.21µs; I use exposure=20 with that camera for controlling raspcatbot with live frame processing]

Next, what does the 60fps framerate mean for what gets captured? The pellet flies 36m/s*(1/60)s=60cm(!) between frames. Therefore, with the setup shown, the pellet can be captured in flight at most once before hitting the pellet catcher. With multiple exposures the pellet can be captured in a single frame as often as you like; not in sunlight, but with a bright flash strobe effect. I did that with v1 camera global external shutter capturing last year:
https://github.com/Hermann-SW/Raspberry ... bottom_led

The first pellet captured with the setup shown above is this one. The (scaled to 50%) animation shows
  • the frame before the pellet was shot
  • the frame with the pellet in flight, just before reaching the pellet catcher
  • the frame after the pellet moved into the pellet catcher
Interesting in the 2nd frame is that, although the pellet is still outside the pellet catcher, there is already a visible effect on the paper in the pellet catcher. Since the paper's shape does not change, I assume the pellet's reflected sunlight hits the paper in the 2nd frame:
frames3.anim.gif

The sun moved and I had to move the setup as well, hence the new perspective. Here are the captured pellets in flight, marked with white rectangles. These captures are from 4 different shots. I kept the original 1280x800 resolution to show otherwise unmodified (besides the white rectangles) frames:
Image


I was really happy that I recorded a trick shot. My shot hit a pellet lying on the ground in front of the pellet catcher. The shot pellet moves into the pellet catcher after the collision; the pellet from the ground goes up and up and up, nearly vertically in the frame:
trick.anim.gif

During one shooting session (I captured 600 frames in 10s, and was able to take many shots in that period), a cloud moved between the sun and the terrace. The frame with the pellet captured in flight is a bit dark:
Image


Again I normalized that frame's grey values, with a modified version of my spreadpgm.c tool that can deal with 1MP frames. As you can see, the pellet captured in flight is easy to spot near the middle, between the airsoft pistol muzzle on the left and the pellet catcher on the right:
Image


Summary:
  1. capturing 36m/s pellets in flight is possible with global shutter cameras
  2. without a sound trigger, the camera has to be far away from the pellet trajectory, and the captured pellet is small in the frame
  3. sunlight is very bright (32,000-100,000 lux)
  4. when capturing with the 2MP color global shutter camera and demosaicing the frame, color in-flight captures are possible


Re: Capturing with Arducam MIPI monochrome global shutter cameras

Wed Apr 29, 2020 3:56 pm

I found a new application for the monochrome 0.3MP global shutter ov7251 camera.

Until now I used a Raspberry v2 camera capturing at 100fps from the side when recording raspcatbot fast drive-bys.
This is just one frame; the captured animation can be seen here:
viewtopic.php?f=37&t=267999&start=25#p1634378
Image


Because of the only-200µs shutter time, the scene was lighted with the 5000lm led.
I just replaced the v2 camera with the ov7251, not because it is global shutter (this time), but because its mode4 allows a 353fps framerate (640x120; its mode5 allows 320x240@204fps, which is used by the raspcatbot onboard camera).

This is the setup: raspcatbot jacked up, powered (temporarily, for jacked-up testing, instead of the 4S 1300mAh 95C lipo) from a constant-voltage power supply; the ov7251 camera connected to a Pi3B+ with a 1m cable; left of it the 5000lm led with big heatsink lighting the scene:
20200429_150050.15%.jpg

All ov7251 modes capture the full camera FoV; this is captured frame 18, compressed 4× vertically:
frame.0018.raw.pgm.jpg

It is a bit hard for me to spot the vertical details, so I wrote a small utility to double each frame row:

Code: Select all

$ cat dblpgm.c 
/* gcc -Wall -Wextra -pedantic dblpgm.c -o dblpgm
*/
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>

int main(int argc, char *argv[])
{
  unsigned char buf[15+640*480];
  int width, height, height0, y;
  FILE *src;

  assert(argc <= 2 || !"dblpgm [file.pgm]");
  src = (argc == 1 || strcmp(argv[1], "-") == 0) ? stdin : fopen(argv[1], "rb");
  assert(src);

  /* 15-byte header fits "P5\n640 480\n255\n" */
  assert(1 == fread(buf, 15, 1, src));
  assert(2 == sscanf((const char *)buf, "P5\n%d %d\n255\n", &width, &height));
  height0 = height;
  if (height%16)                 /* raw frames are padded to a multiple of 16 rows */
    height += (16-(height%16));
  assert((size_t)(15+width*height) <= sizeof buf);
  assert(1 == fread(buf+15, width*height, 1, src));
  assert(0 == fread(buf, 1, 1, src));
  assert(feof(src));
  fclose(src);

  /* write each of the height0 real rows twice */
  printf("P5\n%d %d\n255\n", width, 2*height0);
  for(y=0; y<height0; ++y)
  {
    fwrite(buf+15+width*y, width, 1, stdout);
    fwrite(buf+15+width*y, width, 1, stdout);
  }

  return 0;
}
$
Applying this twice in series (e.g. "./dblpgm frame.pgm | ./dblpgm > out.pgm") to the previous frame results in this better-to-view 640x480 frame:
frame.0018.raw.q.pgm.jpg

You see the right caterpillar track of raspcatbot; I drove it forwards with PWM=255.
On the right front wheel you can see the reflective marker I added long ago for measuring speed with a laser tachometer.
That is not needed this time: a full turn takes 13 frames in a phase without frameskips.
The animation below consists of frames 18-56 (in frame 57 the marker is at the same position as in frame 18).
So the time taken for 3 full turns of the front wheel is 158533-48126=110407µs (from the frame.pts file written by my fork's raw_callback.c).
Therefore the rotational speed seen in the animation is 1000000/(110407/3)*60=1630rpm.

I do not capture these free-running videos to determine speed, but to debug mechanical issues with the continuous track on the other side. Here is that side's animation, captured (without frameskips in this range) at 353fps and played at 2fps, 176× slower than real time:
Image


The reason the 12V 1500rpm gear motor can run at 1630rpm is that mechanically all is fine (and it is powered with overvoltage).
I always have to remind myself that the caterpillar track is driving forward (easy when spotting the reflective marker: clockwise turn).
My eyes tell me the endless track is moving backward when looking at the back wheel.

This application of the ov7251 will give me the needed insight into the mechanical issues, and will likely be part of initial raspcatbot calibration from now on. Nice to be able to inspect the endless tracks with a wheel rotating at more than 27 turns per second(!).
https://github.com/Hermann-SW/memrun
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

HermannSW

Re: Capturing with Arducam MIPI monochrome global shutter cameras

Thu Apr 30, 2020 6:34 pm

I did the same capturing of the caterpillar's right endless track at full speed backwards in this posting (1737rpm vs. the 1630rpm forward shown here):
viewtopic.php?f=37&t=267999&p=1652038#p1652038

The animation is nice, but has the same disturbing effect that tells the eye the rotation direction is the other way around.
This time I found a simple countermeasure: adding a small reflective marker to the endless track.
Frames 1-74 were captured at 353fps without frameskips.
Frames 8 and 73 had the new reflective marker at the same position, so the animation consists of frames 8-72.
In those frames the new marker travels twice around all caterpillar wheels.
Looking up the timestamps of the 8th and 73rd frames, the whole track does 1,000,000/((203829-19817)/2)=10.9 rotations per second, or 652rpm.
Looking at the right front wheel marker, slightly more than 12 frames are needed for a full turn, i.e. a period of just over 1000*(12/353)=34ms, so the wheel spins at just under 353/12=29.4rps, or just under 1765rpm. I had seen slightly more than 1900rpm with the laser tachometer before, but never more than 2000 (the motor is really a 12V 1500rpm gear motor, with obviously headroom for more when powered with 16.8V).
Image

Rustysnake
Posts: 1
Joined: Wed Mar 17, 2021 6:30 pm

Re: Capturing with Arducam MIPI monochrome global shutter cameras

Wed Mar 17, 2021 8:29 pm

Hi
I want to use a 2MP color Arducam with global shutter for a project. Obviously it is sold out everywhere. Would you have any idea where I might find one?
Cheers and thanks in advance

😊

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 12153
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Capturing with Arducam MIPI monochrome global shutter cameras

Thu Mar 18, 2021 8:31 am

Rustysnake wrote:
Wed Mar 17, 2021 8:29 pm
Hi
I want to use a 2MP color Arducam with global shutter for a project. Obviously it is sold out everywhere. Would you have any idea where I might find one?
Cheers and thanks in advance
Which module are you looking at? All the global shutter cameras listed at https://www.arducam.com/product-categor ... er-camera/ are mono.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.
