User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Thu Mar 26, 2020 8:25 pm

I did move the good 1500rpm gear motor and continuous track from the other T101 over to the big raspcatbot.
Then I realized that it will never be controllable by joystick, because its minimal speed is likely 1m/s (no problem when running autonomously).
I wanted a slower 2nd T101 for working on wifi joystick control, so I used the original 330rpm gear motors.
The new lipo has not arrived yet, so I cannibalized a 3S 25C 1000mAh from the big raspcatbot.
And I used the new design shown two postings ago.
To my surprise, powering both motors as well as the step-down converter for the Pi from a single lipo works without issues.
The new raspcatbot already works, although currently it can only drive forward and backward.
Because the motors behave very similarly, it drives nearly straight, see the diagram below.
I wanted to capture a video of the new raspcatbot driving, but currently the heartbeat emergency stop always gets triggered. There is a simple reason: the Pi ZeroW had not been booted for 5 months, and the "sudo apt-get update; sudo apt-get upgrade" I started is still running, which takes a long time on a Pi ZeroW. It will be interesting to see whether the heartbeat signal interferes with the first person view from the camera installed in front (via uv4l, to be installed after the update completes). This is the new design, so much smaller than the old one (the 6.7MP image has the details):
Image


318rpm corresponds to 1m/s, which is roughly the maximal speed at 12V with the 330rpm gear motors. But raspcatbot can now drive at much slower speeds:
Image


P.S:
Interestingly, raspistill did not work during and after the upgrade; after a reboot it worked again.
This is raspcatbot's view from the camera mounted in front; I measured that the first 14cm of ground in front of the camera are a blind spot.

Code:

raspistill -md 7 -w 640 -h 480 -o tst.jpg -br 60
tst.jpg

P.P.S:
Only 778g (compared to 1227g of big raspcatbot):
20200327_015341.jpg
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html


Re: raspcatbot

Sat Mar 28, 2020 5:55 pm

Wifi joystick now works fine.
I posted the ESP32 Arduino sketch as well as the code running on the Pi in this posting:
https://esp32.com/viewtopic.php?f=19&t= ... 513#p57338
For driving outside, the dependency on the home router needed to be removed.
It was difficult to determine the connected devices with the ESP32 acting as Wifi AP, and making the Pi the AP is not that easy either.
So I chose the simplest solution and let an Android smartphone be the AP.
That way I can log in to raspcatbot's Pi (ZeroW) with the JuiceSSH app and start/stop things while outside.
The posting also contains a video of raspcatbot driving on the hardboard target ground.
Image


Today the 4S 1300mAh 95C lipos arrived(!):
https://twitter.com/HermannSW/status/12 ... 5687433216

I just tested one 4S lipo on the 330rpm gear motor (slow) new design raspcatbot.
Before that I took some videos of the 3S 1000mAh 25C lipo raspcatbot and will post the speed video analysis later.
This is the new design 12V 330rpm raspcatbot with the right continuous track running full speed backward, overvolted at 16.5V.
The laser tachometer showed 430rpm, that is 1.35m/s, a speed controllable with joystick (after some more hands-on practice).
The next step is to rebuild the "big" 1500rpm raspcatbot to the new design with a single 4S lipo.
In the background you can see the 75W lipo safety bag I bought together with the new 4S lipos.
Image


P.S:
The robot weight increased by 101g with the new lipo, to 879g.


Re: raspcatbot

Sun Mar 29, 2020 9:52 am

Still working on video speed analysis of the videos taken yesterday, with some satisfying results.

Today I took some more videos. First, raspcatbot front camera video of an inhouse test drive. The captured 640x480@90fps video looks surprisingly unshaken; the reason seems to be the 90fps framerate. The uploaded video is played at 25fps, 3.6× slower than real time:
https://www.youtube.com/watch?v=UphiyM1acPU
raspcatbot.inhouse.front-camera.png

3 years ago I took some outdoor videos with/from the previous caterpillar robot platform, starting with this posting:
https://forum.arduino.cc/index.php?topi ... msg3234127

There was a ramp that the previous robot was not able to drive up completely, so I tested that ramp today. raspcatbot had lost some Lego pieces before (vibrations), as well as the left side middle wheel (not really needed; I just found the two missing washers and the nut outdoors). With the slow 330rpm gear motors that ramp uphill was no problem anymore:
https://www.youtube.com/watch?v=6zQk2KK ... e=youtu.be
raspcatbot.uphill.png

Driving down the ramp at full speed, the robot was not that easy to control:
https://www.youtube.com/watch?v=y6uM8IdrKiM
raspcatbot.downhill.png


Re: raspcatbot

Sun Mar 29, 2020 4:41 pm

Now the promised video speed analysis.

  1. full speed center turn comparison on hardboard/carpet
  2. full speed comparison straight on carpet/free running
  3. full speed slippage analysis/plan

full speed center turn comparison on hardboard/carpet

In the video recorded two days ago on the hardboard course, at the end I made raspcatbot do two full speed center turns to the right. The recorded smartphone video was 1080x1920@30fps. I counted 26 frames for a full 360° rotation. Then I did the same full speed U-turn on carpet, and to my surprise a full rotation there takes 26 frames as well. That means that the U-turn on carpet I did 3 years ago with the previous caterpillar robot platform should work for raspcatbot on the hardboard target ground as well:
Image
Image
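The frame counting above directly gives the rotation rate; a quick sketch of the arithmetic (the function name is mine):

```c
/* degrees per second for one full rotation taking `frames` frames at `fps` */
double rotation_deg_per_s(double frames, double fps)
{
    return 360.0 / (frames / fps);
}
```

26 frames at 30fps is about 415°/s, i.e. a full turn in 0.87s, on hardboard and carpet alike.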

full speed comparison straight on carpet/free running
This was done with only the 3S lipo, so slower than with the 4S lipo used now. In the top animation you can see raspcatbot drive straight at full speed. I used the reflective mark on the front right wheel to determine that 25 frames are needed for 3 full turns of the wheel. Next I recorded the jacked-up raspcatbot, where 15 frames are needed for two full turns of the wheel. So straight-line real speed is (π*0.060*3/(25/30))/(π*0.060*2/(15/30))=90% of free running speed(!). I have already seen free running speeds of 5.65m/s with the gear motors running at 1800rpm, therefore full speed in a straight line with the 1500rpm gear motors and 16.8V overvoltage should result in 0.9*5.65=5.08m/s, nice, and what I targeted.
Image
Image

full speed slippage analysis/plan
This is definitely not something that can really be done anymore with a 30fps-only smartphone camera. I will use an external Raspberry v2 camera capturing 640x480@100fps for nice timestamps. The shutter time needs to be adjusted to get clear images of raspcatbot passing at full speed, at least for the 1500rpm gear motor one. I took two frames and copied the raspcatbot images together after the robot did 2 full wheel turns. The location on the left side is roughly 107cm, on the right side roughly 72cm. So raspcatbot did 35cm with two full wheel turns. Without slippage it should do π*0.060*2=37.7cm:
Image

It was really difficult to drive straight forward. The reason is that the right motor driving forward is faster than the left motor driving forward, see the diagram of motor speeds 3 postings before. Driving backward, both motor speeds are very similar, therefore I drove raspcatbot backwards for the last analysis. Later, when live raspcatbot front camera video processing is in place together with a nice PID algorithm, raspcatbot should easily follow a straight black line on the hardboards for speed measurements. For now I will try whether enforcing straight-line movement (for measurements only) by making it run like a cable car will work. That could allow determining high speeds even inhouse. I already superglued M3 nuts below the left and right T101 gear motors in middle position; let's see whether the high speed cable car will work:
20200329_174638.small.jpg


Re: raspcatbot

Mon Mar 30, 2020 7:58 am

The "cable car" idea really worked.
This is a 100fps 640x480 v2 camera video from the side (no frameskips, "nice" timestamps).
In order to get sharp frames I used only 100µs shutter time (--shutter 100).
That needed bright light; I used a 5000lm led from the side.
The robot drives on an ellipse with eccentricity near 1, which is "straight line" enough for the measurement:
Image


The robot drives full speed backward (for the reason stated in the previous posting) with a fully loaded 4S lipo at 16.75V.
This is frame 1218 of the captured video, reflective marker at the right front wheel at its lowest point, roughly at 71cm:
raspcatbot.5.1218.png.jpg

This is frame 1232 of the captured video, reflective marker at the right front wheel at its lowest point, roughly at 55cm:
raspcatbot.5.1232.png.jpg

1st observation:
1 full wheel turn in 14/100s, that is 60/(14/100)=428rpm (12V 330rpm gear motor running with 16.75V).

2nd observation:
In 0.14s the robot moves slightly more than "only" 16cm.
But it should be π*0.060=18.8cm.
Currently I have no explanation for the discrepancy.

3rd observation:
Real speed is 0.16m/(14/100)=1.14m/s or 4.1km/h.
With 1500rpm gear motors (to be measured!) that should give real speed 1.14*1500/330=5.2m/s.

4th observation:
Once it touches the hardboard, the continuous track keeps in place, no slippage.


As shown in the previous posting, the cable goes through the two M3 nuts superglued below the two gear motors.
This is the setup: hooks in the wooden cabinet and in the carpet baseboard on both sides, and the 5000lm led near the recorded scene.
I was not able to tighten the cable enough for a really straight line drive.
But the robot driving on an ellipse with eccentricity near 1 is "straight line" enough for the measurement.
The yellow box fills the gap between the baseboard and the last of nine 50cmx50cm plates, giving a 4.5m hardboard runway:
raspcatbot.5.setup.jpg


Re: raspcatbot

Mon Mar 30, 2020 4:57 pm

I tried to measure the speed with the onboard camera.
Same scenario as yesterday, cable car drive, just a little less voltage (15.9V today, 16.75V yesterday).

raspcatbot's camera is a v1 camera, so I had to reduce the framerate to 90fps.
I turned the 5000lm led 90° to shine in the direction the robot was coming from.
And I placed it a bit higher to shine a bit more distant onto the hardboard plates.

Initially I tried with 200µs shutter time -- very shaky:
Image


Next I tried with 100µs shutter time -- better, but still shaky:
Image


Then I realized what the cause could be. I had moved the v1 camera directly in front of the right continuous track; perhaps that caused the vibrations. So I moved the camera back to the center position where it was before. Definitely a bit better than the last:
Image


In order to compare the "100" and "100c" (center camera) animations, I scaled the frames down to 50% size and concatenated them. This is the result:
Image


Now the time has come to test a (monochrome) global shutter camera, first only for this measurement task. I have 0.3/1.0/2.0MP monochrome global shutter cameras. For the later robot control video processing at high framerate the number of pixels should not be too big, so I will go with the 0.3MP ov7251:
https://www.uctronics.com/arducam-ov725 ... amera.html


Re: raspcatbot

Tue Mar 31, 2020 6:55 am

Still no idea where the discrepancy of observation 2 from the next to last posting comes from.
I measured the wheel+track diameter as 58mm.
But I don't see π*58mm=18.22cm between the reflective markers of frames 1218 and 1232:
raspcatbot.58mm.jpg


Re: raspcatbot

Wed Apr 01, 2020 10:12 pm

I made a lot of progress with Arducam ov7251 640x480 monochrome global shutter camera, for details:
"Re: Capturing with Arducam MIPI monochrome global shutter cameras"
https://www.raspberrypi.org/forums/view ... 2#p1635792

After that I removed the v1 camera from raspcatbot and mounted the ov7251:
20200401_223218.10perc.jpg

This time I put the 5000lm led 50cm above the ground, in order to shine over a wider area.
Again full speed backward drive of raspcatbot, with a fully loaded 4S lipo (16.74V).
The frames, taken with the enhanced raw_callback.c demo into /dev/shm, were captured at 198fps framerate and 200µs shutter time.
With the two tools from the posting linked above I created an animation from frames 390-500.
The animation plays at 20fps, slightly less than 10× slower than real time:
Image


One of the enhancements I made to raw_callback.c was to write out microsecond precision timestamps, like those written by the raspivid "-pts" option. This allows determining the real full speed backward speed exactly.

This is frame 414; lookup in the generated frame.pts gives timestamp 3838.886, 119cm at the bottom frame border:
Image


This is frame 449; lookup in the generated frame.pts gives timestamp 4181.914, 79cm at the bottom frame border:
Image


Speed is 1.17m/s, very similar to the 1.14m/s determined before (timestamps in milliseconds, distance converted from cm to m):

Code:

1000/(4181.914-3838.886)*(119-79)/100 = 1.17

I have already ordered 10W 1000lm leds. One or more will be mounted on raspcatbot's front, shining forward (bring your own light). The 95C 4S lipo will easily be able to power the led(s) in addition to motors and Pi.

The animation above is a global shutter animation, with no more skewed frame parts, exactly what is needed for autonomous robot driving based on live frame analysis.

Foam rubber arrived today; I will see whether using it to mount the camera not directly on the robot platform reduces the shaking during driving a bit.


Re: raspcatbot

Thu Apr 02, 2020 7:04 am

It was late last night and I was happy about what had been accomplished.
But the frames are a bit dark, despite the 5000lm led.
I looked into the two frames with gimp's Histogram function.
I found that the maximal gray value for frame 414 is 28, and for frame 449 it is 44.
Then I created a simple spreadpgm.c tool which spreads the gray levels to the full range:
https://github.com/Hermann-SW/MIPI_Came ... bb6edbd30d
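The idea behind spreadpgm.c is simple linear scaling; this is a hedged sketch of the core (not the committed tool, which works on pgm files):

```c
#include <stddef.h>

/* spread 8-bit gray levels so that the maximal value maps to 255 */
void spread_gray(unsigned char *px, size_t n)
{
    size_t i;
    unsigned char m = 0;
    for (i = 0; i < n; ++i)      /* first pass: find maximal gray value */
        if (px[i] > m)
            m = px[i];
    if (m == 0)
        return;                  /* completely black frame: nothing to spread */
    for (i = 0; i < n; ++i)      /* second pass: rescale every pixel */
        px[i] = (unsigned char)((px[i] * 255) / m);
}
```

With a frame maximum of 28 (frame 414) every pixel effectively gets multiplied by roughly 9.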

I found no existing function in gimp that does the same simple scaling, but the results are nice!

This is enhanced frame 414 from last posting:
Image


This is enhanced frame 449 from last posting:
Image


I enhanced the frames in the 390..500 range and created the animation again, just beautiful:
Image


While these enhancements, nice for humans, might not be needed for the line following algorithm, I can see that scaling gray levels to the full range while processing a single frame might be useful as "normalization".


P.S:
I just closed the roller shutter and lit the scene with only the 1000lm lamp, at double the distance from 1m above. That is 2*2*5=20 times less lux on the ground than yesterday night. The captured frame just looks completely black, but it is not. gimp's Histogram says the maximal gray level is 7(!), so I ran spreadpgm, and while the result is not as nice as the previous frames, the relevant information is now visible! What you see between the tape line and the black line is the cable and its shadow for the raspcatbot "cable car". With a 1000lm led mounted on the front of the robot, raspcatbot will be able to follow the 1.5cm wide black line even in a dark room:
Image


Re: raspcatbot

Fri Apr 03, 2020 6:24 am

With Lee's help I was able to capture raw 8bit frames as well:
https://www.raspberrypi.org/forums/view ... 3#p1636573
I added "mode" handling to raw_callback.c in addition to exposure (exp=x means 10*x µs shutter time):
https://github.com/Hermann-SW/MIPI_Came ... 541b2a8e81
All modes capture the full camera FoV, and for all formats the framerates are cool.
I am not sure yet whether Pi3A+ live frame processing for autonomous line following can be done in 5ms.
These are the modes of the ov7251 monochrome global shutter camera (frame skips counted over 10s record time):
Image
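For the 5ms question: the per-frame time budget at a given framerate is simply the frame period (sketch, the function name is mine):

```c
/* per-frame processing budget in microseconds at a given framerate */
double frame_budget_us(double fps)
{
    return 1e6 / fps;
}
```

frame_budget_us(205) is about 4878µs, so "done in 5ms" really means done within a single 205fps frame period.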


I experimented with the camera not being mounted vertically, in order to reduce the blind spot in front of the robot.
This angle is a good compromise: only 8cm of blind spot in front of the robot, with >1m view forward:
raspcatbot.ov7251.fixed_tilt.jpg

This is a 320x240 frame of the same scene captured with mode5 (exposure=20, spreadpgm applied).
The right robot continuous track touches the line tape, so no right turn is needed before a hard left turn:
frame.0100.enh.5.png

This is a 640x480 frame of the same scene captured with mode2 (exposure=20, spreadpgm applied).
This will be my 8bit fallback mode in case the preferred 320x240 is not good enough for some reason:
frame.0100.enh.2.png

P.S:
The 12V 0.68A led is good enough for slightly more than 50cm of forward view in a dark room, but not 1m:
Image


Re: raspcatbot

Fri Apr 03, 2020 4:33 pm

I remembered that I have some reflectors.
With 12V the led now draws 0.459A, roughly 5.5W:
raspcatbot.ov7251.12V_680mA.reflector.jpg

The led with reflector is now good enough for a 1m forward view in a dark room:
raspcatbot.reflector.enh.pgm.jpg

In contrast to the previously captured image, the hardboard on the left is dark because of the reflector focus:
Image


Summary for robot onboard led:
  • with reflector 1m forward view in dark room is possible
  • view is restricted left and right because of reflector
  • with normal room light, robot onboard led is not needed because of spreadpgm.c


Re: raspcatbot

Sun Apr 05, 2020 12:43 pm

This is the 1500rpm raspcatbot, brought to the single 4S lipo new design as well:
Image
The camera is tilted and encapsulated in a front Lego cage (with only 5cm blind spot), after a successful prototype test yesterday:
https://twitter.com/HermannSW/status/12 ... 0033321984
The 4S lipo is directly connected to the L298N motor driver via the big XT60 connector.
And it is connected via the outermost charging connector female headers to the step down converter powering the Pi3A+ (front left) with 5.0V.
Measured free running with 15.5V, the speed is 1650rpm/1540rpm forward and 1510rpm/1590rpm backward for the left/right gear motors (measured with the laser tachometer). Since the speeds for full backward are not identical for the left/right motors, and as a safety measure, raspcatbot drove as cable car again.

This time I did not only record with the Raspberry v2 camera at 100fps framerate and 200µs shutter time from the side (scene lit with the 5000lm led). In addition, a smartphone recorded from the desk and shows some meters of the hardboard test ground. Most importantly, it recorded audio. That allowed me to identify the position where the loud motor sound disappeared (because I entered the braking command on the laptop, too early) and braking happened. It is exactly at this frame, just when raspcatbot entered the v2 camera scene. The robot started with its rear at the 4m mark and enters the v2 camera scene at the 1.70m mark, so 2.3m of acceleration. The lipo was nearly empty (after many tests), loaded to only 14.9V instead of 16.6V, and PWM=200 (80%) was used to avoid a crash:
https://www.youtube.com/watch?v=hp7pdkD ... e=youtu.be
raspcatbot.7.yt.jpg

Because braking happens all the time while raspcatbot passes the v2 camera scene, I determined the speed from the first two frames, allowing for that: the white marker on the right front wheel does 0.15 of a full rotation between two successive frames 0.01s apart:
Image
raspcatbot has a speed of 2.73m/s or 9.8km/h at that moment:

Code:

$ bc -ql
pi=4*a(1)
s=2.30
d=0.058
fps=100
print rps=0.15*fps, "\n"
15.00
print v=d*pi*rps, "\n"
2.73318560862312011730
print a=v^2/(2*s), "\n"
1.62397903721402946471
print t=v/a, "\n"
1.68301778901774378065
print v*3.6, "\n"
9.83946819104323242228

This is the complete drive through the scene, played at 3fps, 33.3× slower than real time:
Image


P.S:
Finally raspcatbot got its own 12V 6W led onboard (controlled by IRF520 mosfet module from the Pi):
Image


There was a mysterious problem, but it turned out to be a "layer 8" error (the layer that sits in front of the display).
After setting the pigpio PWM frequency to 8000Hz, capturing 320x240 frames at 205fps framerate does not result in any problems for the normalized (spreadpgm) pgm frames:
Image


Re: raspcatbot

Mon Apr 06, 2020 7:33 pm

This is raspcatbot with the onboard led bulb mounted tilted (with a small heatsink, superglued below a tilted Lego piece).
I placed a small mirror on the ground, and the camera sees the bright led light (the room was dark) through welding shield safety glass.
For humans it is easy to see that only 2 of the 9 bulb leds are working; a smartphone camera cannot show that, not even in the original 16MP photo.
20200406_203944.jpg
So this is the 4th(!) of those bulbs that got damaged (normally all 9 leds light up).
The reason is that I had no 1.5Ω resistor, as mentioned in the product description of the led bulb:
Power: 10W
Luminance: 750-850LM
Current: 600-700mA
Voltage: DC 9-12V(if use DC 12,Recommended Series a about 1.50 ohm resistor to use)
Integrated approach: Series 3 and 3
Color: Daylight white (6000 - 6500K Kalt/Cold or 3000K-3500K Warm color temperature)
Size (L x W): Approx. 1.15 x 0.8 inch / 2.9 x 2 cm
Light-emitting part of the size: 11mm * 11mm
LifeSpan Time : > 50,000 hours
Application: Landscape scene / spot lighting, Architectural lighting
Surprisingly this damaged bulb (I still have the other 3 damaged bulbs) shines brightly enough.

I looked at the normalized (with spreadpgm) frames yesterday and played with gimp algorithms.
The simple "Threshold" algorithm was nice to play with.
With today's lighting I did that again, and a threshold of 80 for the separation between black and white was good.
New thresholdpgm.c committed and pushed:
https://github.com/Hermann-SW/MIPI_Came ... 42f1541f24
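The core of such thresholding is one comparison per pixel; a hedged sketch (not the committed thresholdpgm.c, which works on pgm files):

```c
#include <stddef.h>

/* binarize 8-bit gray pixels: values above thr become white, the rest black */
void threshold_gray(unsigned char *px, size_t n, unsigned char thr)
{
    for (size_t i = 0; i < n; ++i)
        px[i] = (px[i] > thr) ? 255 : 0;
}
```

With thr=80, as found above, the normalized frames separate cleanly into black line and white ground.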

I set the led bulb IRF520 mosfet PWM to 144, which resulted in powering the bulb with 9.6V.
Then I recorded 320x240 frames at 205fps framerate and made raspcatbot drive slowly (PWM=120) backward.
After applying "threshold frame 80" to each frame, I created this 25fps framerate animation (8× slower than real time).
This is the view I will base the braking test code and later the line following on.
At the end of the animation you can easily see that the black straight line ends, which is easy to recognize in code as well (the 1.5cm wide straight line is roughly 25 pixels wide):
Image
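The PWM=144 to 9.6V relation is the usual duty-cycle averaging over pigpio's 0..255 range; a sketch (the 17V supply value here is my assumption to make the arithmetic line up, the real pack voltage was close to that):

```c
/* average voltage seen by the load for a pigpio dutycycle in the 0..255 range */
double pwm_avg_voltage(double supply_v, int dutycycle)
{
    return supply_v * dutycycle / 255.0;  /* supply_v is an assumed value */
}
```

pwm_avg_voltage(17.0, 144) gives 9.6, matching the measured bulb voltage.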


While I have a heartbeat in dead-man_button.c for controlling raspcatbot over the Wifi joystick, I currently have nothing like that in the Pi-only experiments. While recording this video exactly what I feared happened: when I executed the command to stop raspcatbot, the Wifi connection had a problem and raspcatbot drove at that speed into the filing cabinet. Because of the relatively slow speed nothing bad happened. Without a heartbeat I will use a cushion at the end of the course as an air bag from now on.
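The heartbeat logic mentioned above can be sketched in a few lines (my sketch of the concept, not the actual dead-man_button.c):

```c
#include <stdbool.h>

/* dead-man heartbeat: the controller feeds it periodically,
   the robot brakes as soon as it expires */
typedef struct {
    long last_ms;     /* time of last received heartbeat */
    long timeout_ms;  /* allowed silence before emergency stop */
} heartbeat_t;

void heartbeat_feed(heartbeat_t *h, long now_ms)
{
    h->last_ms = now_ms;
}

bool heartbeat_expired(const heartbeat_t *h, long now_ms)
{
    return now_ms - h->last_ms > h->timeout_ms;
}
```

The motor loop would call heartbeat_expired() on every iteration and stop both motors as soon as it returns true.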


Re: raspcatbot

Tue Apr 07, 2020 9:41 pm

Things slowed down today since raspcatbot crashed into a chair by accident.
Somehow the led bulb killed the onboard IRF520 mosfet.
First I had to determine which of my remaining 4 mosfets was good.
And then I had to learn that the characteristic curves are quite different for mosfet modules that have been used before:

Because of the crash there is now a cushion at both ends of the hardboard route.

Next the led bulb made problems. I learned the reason for the light flickering by looking at the bulb through welding helmet safety glass:
led.bulb.1.png
Normally only the two bottom leds of the 3x3 light up, but sometimes others chime in.
In order to remove possible issues I soldered cables to the led bulb; now everything is screwed firmly.

The scripts "fwd" and "bck" I used to move the robot had only one speed parameter. The different speeds of the left and right gear motors at the same voltage were the reason raspcatbot moved away from the black line as far as the cable permits. Today I used new scripts fwd2 and bck2, allowing to specify the speed for each motor (later the visual feedback will do all this handling automatically). I captured 320x240 frames again at 205fps framerate, and then executed "./bck2 120 140". As can be seen in the video, today the robot followed the cable nearly perfectly with those different motor speeds. At the start you can see the cushion at one end of the route. The animation plays at 50fps this time, 4× slower than real time:
Image


I already have the code that analyzes the bottom three rows for more than 20 consecutive black dots. If 2 or 3 of the rows have that, the black line is considered to be seen. This is signaled in the top left 8x8 square:
x.pgm.png

It is good that specifying the speed per motor allowed driving on the line easily today. The analysis tool correctly reports (in the top left 8x8 square) that there is no black line, because in a run with the same speed for both motors the line moved out of the lowest 3 rows to the side:
Image


This is a split view: the left side besides the bottom 3 rows is the normalized view, the right side and the bottom 3 rows are the threshold view:
z.48.pgm.b.png

Not nice yet, work in progress; it needs to be integrated with the pigpio C interface code that controls the robot:

Code:

$ cat braking2.c
/* gcc -Wall -Wextra -pedantic braking2.c -o braking2
*/
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>

#define size 8
#define min 20

int line(unsigned char *p, int width)
{
  unsigned char *q, *r;
  for(q=p+width-1; q>=p; --q)
  {
    if (*q)  continue;

    for(r=q; q>=p && !*q; --q)  {}

    if (r-q>=min)
      return 1;
  }

  return 0;
}

int main(int argc, char *argv[])
{
  unsigned char buf[15+640*480],*p,*q, *a, *b, *c, m=0; /* m must start at 0 */
  int width, height, stop=0, thr;
  double sc;
  FILE *src;

  assert(argc == 3 || !"braking2 file.pgm thr");
  src = (strcmp(argv[1], "-") == 0) ? stdin : fopen(argv[1], "rb");
  thr = atoi(argv[2]);

  assert(1 == fread(buf, 15, 1, src));
  assert(2 == sscanf((const char *)buf, "P5\n%d %d\n255\n", &width, &height));
  if (height%16)
    height += (16-(height%16));
  assert(width*height <= 640*480);
  assert(1 == fread(buf+15, width*height, 1, src));
  assert(0 == fread(buf, 1, 1, src));
  assert(feof(src));
  fclose(src);


  for(p=buf+15; p<buf+15+width*height; ++p)
    if (*p > m)
      m = *p;

  sc = 254.0 / m;

  for(p=buf+15; p<buf+15+width*(height-3); ++p)
   if ((p-buf)%width>=15 && (p-buf)%width<15+width/2)
    *p = sc * *p;
   else
    *p = (sc * *p) > thr ? 255 : 0;

  for(p=buf+15+width*(height-3); p<buf+15+width*height; ++p)
    *p = (sc * *p) > thr ? 255 : 0;


  a=buf+15+width*(height-1);
  b=a-width;
  c=b-width;

  stop = line(a, width) + line(b, width) + line(c, width) >= 2 ? 0x00 : 0xff;
 
  for(q=buf+15; q<buf+15+size; ++q)
    *q=(stop ^0xff);
  for(p=buf+15+width; p<buf+15+size*width; p+=width)
  {
    *p=(stop ^0xff);
    for(q=p+1; q<p+size-1; ++q)
      *q=stop;
    *q=(stop ^0xff);
  }
  for(q=p; q<p+size; ++q)
    *q=(stop ^0xff);

  fwrite(buf, 15+width*height, 1, stdout);

  return 0;
}
$

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Wed Apr 08, 2020 9:09 pm

I now have the C code running that captures frames as well as controls the motors.
Unfortunately the led bulb is dead now.
I will receive new led bulbs in 1 or 2 days.
I ordered 1.5Ω 10W resistors as well; they will arrive next week.
Until then I will have to find reliable external lighting (for capturing frames with 200µs shutter time).

What I learned today: frame normalization, when done for the full frame in the camera callback, reduces the framerate from 205fps to 102fps, although the C code was compiled with -O3. What did work was to visit all pixels once in order to determine the maximal grey value, and then normalize only the bottom three rows -- only 4 frame skips during a 2s long recording at 205fps. External light was missing; this is just a sample frame:
frame.0405.raw.pgm.png
frame.0405.raw.pgm.png (14.38 KiB) Viewed 4824 times
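The scheme just described (one full pass to find the maximal grey value, then rescaling only the bottom three rows) can be sketched roughly like this -- function name and layout are mine, not the actual callback code:

```c
#include <stddef.h>

/* Sketch of the cheap normalization described above (naming is mine):
   scan the whole GREY frame once for the maximal grey value, but rescale
   only the bottom three rows -- the rows the line decision looks at. */
static void normalize_bottom_rows(unsigned char *frame, int width, int height)
{
    unsigned char m = 1;                      /* >=1 avoids division by zero */
    for (size_t i = 0; i < (size_t)width * height; ++i)
        if (frame[i] > m)
            m = frame[i];

    double sc = 254.0 / m;                    /* spread to nearly full range */
    unsigned char *p = frame + (size_t)width * (height - 3);
    for (size_t i = 0; i < (size_t)width * 3; ++i)
        p[i] = (unsigned char)(sc * p[i]);
}
```

This keeps the per-frame cost to one read pass plus three rewritten rows, which is what made the high framerate feasible again.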

Code: Select all

$ ~/ptsanalyze frame.pts 0
creating tstamps.csv
405 frames were captured at 204fps
frame delta time[us] distribution
      1 4888
    152 4889
    244 4890
      1 4891
      3 9779
      1 9780
after skip frame indices (middle column)
> 9779,136909
> 9779,303156
> 9779,601423
> 9780,1114833
4 frame skips (0%)
$

Today I took photos of the (for now) finalized new design of the 1500rpm gear motor raspcatbot.
The 4S lipo is connected via a big XT60 plug to the L298N motor driver, to the IRF520 mosfet for the led bulb, and to the voltage step-down converter powering the Pi3A+.
The IRF520 switches the led bulb above the Arducam ov7251 0.3MP monochrome global shutter camera, which is connected to the Pi3A+.
The Pi3A+ controls the L298N and the IRF520 mosfet (16MP photo, right click for details).
Image


All parts are firmly screwed to the Lego plate, through holes I dremeled at the needed locations.
In the front view you can see the corresponding nuts below the Lego plate,
as well as the tilted camera and the tilted led bulb.
Below the front gear motor you can see an M3 nut superglued to the motor; together with another M3 nut superglued to the back gear motor it allows for a "cable car" mode of raspcatbot, until the automatic control is good enough to try without cable:
Image
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Thu Apr 09, 2020 8:40 am

With a 204fps framerate, less than 5ms are available in the camera callback.
I wanted to know how much of that time is taken by writing the raw frame to /dev/shm.
The temporary diff to raw_callback.c below provided that information.
I captured 1600 320x240 mode5 frames with raw_callback.
This is the frame delay and skip analysis; only 16 (double time) frame skips among the 1600 frames:

Code: Select all

$ ~/ptsanalyze g.pts 0
creating tstamps.csv
1600 frames were captured at 204fps
frame delta time[us] distribution
      1 4879
      1 4882
      1 4883
      2 4884
      1 4886
      3 4887
     13 4888
    594 4889
    941 4890
     12 4891
      5 4892
      2 4893
      2 4895
      1 4896
      1 4898
      1 4901
     15 9779
      1 9780
after skip frame indices (middle column)
> 9779,982813
> 9779,1799378
> 9779,1902060
> 9780,2518152
> 9779,2963106
> 9779,3437399
> 9779,3848126
> 9779,4258854
> 9779,4669581
> 9779,5080309
> 9779,5491036
> 9779,5593718
> 9779,5901764
> 9779,6517855
> 9779,6933472
> 9779,7481109
16 frame skips (0%)
$

The 1600 reported durations for writing raw frames are in the range 189µs..739µs, with an average of 396.7µs.
In case a time consuming frame processing task has to be done, part of the 0.4ms can be reclaimed by storing into an array of 76800 byte entries and writing to files after raspcatbot has stopped. I measured that as well: durations for memcpy into an array of 76800 byte entries are in the range 118µs..739µs (nearly the same range), with an average of 203.8µs. So only 0.2ms can be reclaimed on average if one wants to be able to "see" afterwards what the robot saw.

Code: Select all

diff --git a/RPI/raw_callback.c b/RPI/raw_callback.c
index 6727526..8b47100 100644
--- a/RPI/raw_callback.c
+++ b/RPI/raw_callback.c
@@ -27,17 +27,23 @@ int raw_callback(BUFFER *buffer) {
            fprintf(fd,"# timecode format v2\n");
            ptsbase = buffer->pts;
         }
-        fprintf(fd,"%lld.%03lld\n",(buffer->pts-ptsbase)/1000,(buffer->pts-ptsbase)%1000);
         frame_count++;

         if (save >= frame_count)
         {
             char buf[99];
+    struct timespec start, end;
+    clock_gettime(CLOCK_REALTIME, &start);
+
             sprintf(buf, "/dev/shm/frame.%04d.raw", frame_count);
             FILE *tgt = fopen(buf, "wb");
             fwrite(buffer->data, buffer->length, 1, tgt);
             fclose(tgt);
+    clock_gettime(CLOCK_REALTIME, &end);
+            fprintf(fd,"%lld.%03lld,%ld\n",(buffer->pts-ptsbase)/1000,(buffer->pts-ptsbase)%1000, 1000000*(end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec)/1000);
         }
+        else
+            fprintf(fd,"%lld.%03lld\n",(buffer->pts-ptsbase)/1000,(buffer->pts-ptsbase)%1000);
     }
     return 0;
 }
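The buffering alternative measured above can be sketched like this (array size and function names are my assumptions, not the real raw_callback.c):

```c
#include <stdio.h>
#include <string.h>

#define FRAME_BYTES (320 * 240)  /* one raw 320x240 GREY frame = 76800 bytes */
#define MAX_FRAMES  1600         /* ~117MB, fits into Pi3A+ RAM */

static unsigned char store[MAX_FRAMES][FRAME_BYTES];
static int stored = 0;

/* called from the camera callback: only a memcpy (~0.2ms) instead of
   fopen/fwrite/fclose to /dev/shm (~0.4ms) */
static void keep_frame(const unsigned char *data)
{
    if (stored < MAX_FRAMES)
        memcpy(store[stored++], data, FRAME_BYTES);
}

/* called once after raspcatbot has stopped: write everything out */
static void flush_frames(const char *dir)
{
    for (int i = 0; i < stored; ++i) {
        char name[128];
        snprintf(name, sizeof name, "%s/frame.%04d.raw", dir, i);
        FILE *tgt = fopen(name, "wb");
        if (tgt == NULL)
            continue;
        fwrite(store[i], FRAME_BYTES, 1, tgt);
        fclose(tgt);
    }
}
```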

P.S:
To be on the safe side: the 204fps framerate leaves 4.88ms between callback calls, and the worst case for writing a frame is 0.74ms. Therefore single frame processing for robot control needs to complete in less than 4.14ms.
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Thu Apr 09, 2020 6:32 pm

Today 4 new 10W led bulbs arrived.
The product information says:
Voltage: DC 9-12V (if use DC 12, Recommended Series a about 1.50 ohm resistor to use)
The 1.5Ω 10W resistors will arrive at the end of next week, so 12V is a no-go for now.
I was willing to risk one of the 4 bulbs, but played it safe:
I increased the voltage of the constant current power supply from 0V in 1V steps.
The first led dot began to light at 3V, then more and more; even at 8V not all 9 led dots lit up, but at 9V they did.
I think running a 10W led bulb at less than 1.1W, even without the recommended resistor, is an acceptable risk.
Even at 5V or 6V the light is already bright for human eyes.
I took a photo through welding helmet safety glass again, and you can see all 9 led dots lighting:
Attachments
20200409_200118.jpg
20200409_200118.jpg (62.24 KiB) Viewed 4784 times
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Fri Apr 10, 2020 7:33 pm

I wanted to make sure that never more than 9V reach the led bulb, and I noticed that the characteristic curves of the IRF520 mosfet drifted while the module was in use. So I decided to use the IRF520 mosfet as a programmable switch only, and put an LM2596 voltage step-down converter between the IRF520 and the led bulb. It is the same LM2596 module I already use for powering the Pi, but without the digital voltmeter. You can see it at the top left of the robot, just above the 4S lipo. Again I used a mirror on the ground and looked through welding helmet safety glass to verify that all 9 bulb led dots light up:
Image


Just to give you an idea how bright the 12V 10W led bulb already is when powered with only 9V and 1.1W (in a dark room):
Image


I did a test run with raw_callback and took the 1200th raw frame.
This is it after rawtopgm, then converted to .png.
Even for the human eye at least the first 20-30cm of the black line are already visible:
1200.pgm.png
1200.pgm.png
1200.pgm.png (19.11 KiB) Viewed 4719 times

This is the same frame normalized with spreadpgm.c:
1200.enh.pgm.png
1200.enh.pgm.png
1200.enh.pgm.png (19.14 KiB) Viewed 4719 times

And finally this is the frame converted with thresholdpgm.c at threshold 80.
A solid visual basis for the autonomous driving algorithm work of the coming days:
1200.thr80.pgm.png
1200.thr80.pgm.png
1200.thr80.pgm.png (989 Bytes) Viewed 4719 times
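The thresholding step can be sketched as follows (a hypothetical reimplementation of the idea, not the actual thresholdpgm.c):

```c
/* Sketch of the thresholding step (not the actual thresholdpgm.c):
   every pixel brighter than thr becomes white, everything else black.
   With thr=80 the black line separates cleanly from the lit hardboard. */
static void threshold_frame(unsigned char *pix, int npixels, int thr)
{
    for (int i = 0; i < npixels; ++i)
        pix[i] = (pix[i] > thr) ? 255 : 0;
}
```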
Last edited by HermannSW on Thu Apr 23, 2020 6:34 pm, edited 1 time in total.
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Sun Apr 12, 2020 9:42 pm

Another 12V 10W led bulb arrived, which claims to provide 1000lm (the other claimed 600-800lm):
https://www.ebay.de/itm/402088332171

I did some short tests, but will go with the previous bulb for the first raspcatbot autonomous control algorithms.

I ran the new led at 12V for 30 seconds, then reduced the voltage to 10V (no welding helmet safety glass needed for 10V) and measured the temperature contactlessly. The temperature did not reach 40°C at any spot of the led. There are 28x4=112 led dots:
20200412_224147.jpg
20200412_224147.jpg
20200412_224147.jpg (113.09 KiB) Viewed 4628 times

I used this as test setup, without soldering the power connections are fragile:
20200412_231318.jpg
20200412_231318.jpg
20200412_231318.jpg (84.85 KiB) Viewed 4628 times

Raspcatbot's endless track front was located at 145cm.
These are the normalized views for voltages 10.8V (top left), 11.2V (top right), 11.6V (bottom left) and 12.0V (bottom right):
10Wled.Vs.png.png
10Wled.Vs.png.png
10Wled.Vs.png.png (128.15 KiB) Viewed 4628 times

These are the amps measured by constant voltage power supply, and the calculated watts:

Code: Select all

$ bc -ql
10.8*0.15
1.620
11.2*0.434
4.8608
11.6*0.818
9.4888
12*1.289
15.468

15.5W is above spec, so 12V is not an option.
10.8V looks worse than the others.
11.6V does not look that much better than 11.2V, so I will go with 11.2V and <5W in case I use this led.

P.S:
The threshold=80 views for all 4 voltages look nearly identical; this is the 11.2V view.
What can be seen reaches to 180cm, that is 35cm of lookahead:
Image
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Mon Apr 13, 2020 7:27 pm

A last test run before the autonomous code work, now that the raspcatbot setup is complete with the 9 led dot bulb powered at 9V.
A simple, slow backwards drive ("./bck2 120 140"), until stopped with "./fwd 0".
1000 raw 320x240 frames got recorded into /dev/shm with "./raw_callback20 1000 5".
In post processing the frames got converted to pgm, normalized with spreadpgm and finally thresholded at 80 with thresholdpgm.
I inspected those frames with "eog frame.*.enh.thr80.pgm" and determined frame 470 as a good start frame for the animation.
The adjusted togg script converted those frames to png format, and gstreamer turned frames 470..999 into an ogg video at 25fps, 8× slower than real time.
At the start of the animation you can see a cushion lying on the black line; I use one at both ends of the hardboard course as "airbag".

Analysis shows 204fps framerate:

Code: Select all

pi@raspberrypi3Aplus:~/MIPI_Camera/RPI $ ~/ptsanalyze frame.pts 0
creating tstamps.csv
1388 frames were captured at 204fps
frame delta time[us] distribution
      1 4874
      1 4875
      2 4883
      1 4885
      3 4886
      2 4887
      1 4888
    519 4889
    831 4890
      3 4891
      5 4893
      1 4894
      1 4896
      1 4897
      1 4904
      1 4905
      1 9777
     10 9779

Code: Select all

after skip frame indices (middle column)
> 9779,596533
> 9779,2322567
> 9777,2381242
> 9779,3496074
> 9779,3823678
> 9779,4244185
> 9779,4684250
> 9779,4997186
> 9779,5984888
> 9779,6395615
> 9779,6806343
11 frame skips (0%)
pi@raspberrypi3Aplus:~/MIPI_Camera/RPI $ head -471 frame.pts | tail -1
2298.119
pi@raspberrypi3Aplus:~/MIPI_Camera/RPI $ head -1000 frame.pts | tail -1
4914.062
pi@raspberrypi3Aplus:~/MIPI_Camera/RPI $ 

So there were 11 frame skips for the 1388 frames captured.
The timestamps for frame 470 and frame 999 show that only 6 frame skips happened during the animation,
and all of them are just double the average frame time.
While the raw 320x240 GREY pixel frame size is 76800 bytes, the average animation (b&w) frame size is 5.5KB:

Code: Select all

$ du -sb raspcatbot.10.anim.gif 
2943303	raspcatbot.10.anim.gif
$ echo "2943303/530" | bc -ql
5553.40188679245283018867
$ 
And this is the raspcatbot view of the test drive animation, played at 25fps (for youtube upload), 8× slower than real time.
The thin line that can sometimes be seen in the center is the cable; raspcatbot still drives in "cable car" mode.
I uploaded the ogg video the animated gif was created from:
https://www.youtube.com/watch?v=pSc6VRL ... e=youtu.be
Image


Next step:
Use the C code described 5 postings ago for an autonomous stop after the end of the thick black line is visually detected.
Then compare stops:
  • setting both L298N direction signals to 0 (long slide phase: https://www.youtube.com/watch?v=hp7pdkD ... e=youtu.be)
  • setting both L298N direction signals to 1 (shorter slide phase)
  • reversing both motor directions and driving (full speed!) for a short amount of time (expected to show the shortest braking distance)
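The three stop variants can be written down as L298N input combinations. The sketch below (my own encoding and names, not from the robot code) just maps each variant to the IN1/IN2 levels and PWM duty that would then be sent out via pigpio; it assumes IN2 is the inverse of IN1 while driving:

```c
/* Sketch of the three braking variants as L298N inputs per motor
   (encoding and names are mine, not from the actual robot code).
   fwd_in1 is the IN1 level used while driving forward. */
typedef struct { int in1, in2, pwm; } l298n_cmd;
typedef enum { DIR_00, DIR_11, REVERSE_FULL } brake_mode;

static l298n_cmd brake_cmd(brake_mode m, int fwd_in1)
{
    l298n_cmd c = {0, 0, 255};
    switch (m) {
    case DIR_00:       c.in1 = 0;        c.in2 = 0;       break; /* both low  */
    case DIR_11:       c.in1 = 1;        c.in2 = 1;       break; /* both high */
    case REVERSE_FULL: c.in1 = !fwd_in1; c.in2 = fwd_in1; break; /* invert    */
    }
    return c;
}
```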
P.S:
This is how a smartphone sees raspcatbot with its own led bulb light in a completely dark room (cushion as "airbag"):
20200413_215834.jpg
20200413_215834.jpg (10% size)
20200413_215834.jpg (53.96 KiB) Viewed 4557 times
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Tue Apr 14, 2020 9:43 pm

I just committed the initial version of braking_eol.c; the pigpio library was added to the Makefile:
https://github.com/Hermann-SW/MIPI_Came ... 012e2932b9

Raspcatbot drives backward at the specified gear motor speeds and threshold.
Initial runs were done with "./braking_eol 120 140 80",
later, faster runs with "./braking_eol 150 170 80".

As discussed before, determining the maximal grey value in a frame for normalization is possible.
But normalizing the whole frame as well does not fit into the <5ms available to process a frame.
Therefore the frames show only the bottom three lines normalized (those the end-of-line decision is based on).
The top left square shows whether the line is detected (black) or not (white).
This is an animation of 100 frames, from before detecting eol until after, when braking already takes place.
Although the animation (played at 20fps, 10× slower than real time) has 100 frames, the .gif size is only 128KB(!):
braking_test.1.anim.gif
braking_test.1.anim.gif
braking_test.1.anim.gif (125.26 KiB) Viewed 4492 times

The animation is very dark, so I made a special normalization version of spreadpgm.c that works on lines 8..(height-3) only.
Not perfect, but now you can see what raspcatbot sees:
Image


The end of the line is at 145cm.
Because of the 5cm blind spot raspcatbot starts braking at 140cm.
The end position of the endless track front is 70cm, so raspcatbot needed 70cm of braking distance at this speed.

I took a smartphone video of the whole test run and uploaded it to youtube:
https://www.youtube.com/watch?v=PIDuKCs ... e=youtu.be
The smartphone was held by a selfie stick jammed in a drawer, because I had to walk to the laptop to execute braking_eol on raspcatbot's Pi3A+ in an ssh session.

In the video you can see that I turn the light off, resulting in a dark room, and start raspcatbot. Raspcatbot then drives through the camera view with its own bright light, detects eol and brakes. Finally I turn the light on again, and you can see that raspcatbot's front is at 70cm:
braking_test.1.jpg
braking_test.1.jpg
braking_test.1.jpg (24.68 KiB) Viewed 4492 times

Results:
I knew from years ago that it makes a difference for DC motor braking whether both L298N direction pins are set to 0 or to 1.

I compared the 00 vs. 11 setting, with braking PWM speed set to 0 and to 255, but the distances measured were nearly the same:
"./braking_eol 120 140 80"
L298N direction bits 00, PWM=0 for both gear motors: 92cm
L298N direction bits 11, PWM=0 for both gear motors: 93cm
L298N direction bits 00, PWM=255 for both gear motors: 97cm
L298N direction bits 11, PWM=255 for both gear motors: 101cm
"./braking_eol 150 170 80"
L298N direction bits 00, PWM=255 for both gear motors: 69cm
L298N direction bits 11, PWM=255 for both gear motors: 70cm

So there is no real difference in braking behavior for the caterpillar robot gear motors.


Next:
For braking, reverse the direction of both gear motors and set PWM=255 for a short time.
This should result in a much shorter braking distance.

After that, probably add code that automatically detects that the movement has come to an end, and stops the motors then, rather than after a fixed time.
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Wed Apr 15, 2020 7:26 am

More data for the last raspcatbot drive (./braking_eol 150 170 80) from last night.

ptsanalyze of the stored "frame.pts" showed a 204fps average framerate, with only 20 frame skips (3%).
frame delta time[µs] distribution

Code: Select all

      1 4874
      1 4887
      3 4888
    174 4889
    284 4890
      1 4891
      1 4893
      1 4905
     16 9779
      4 9780
For the start I always placed the front of raspcatbot's endless tracks at position 345cm.
Directly after both motors start running, the camera callback is enabled, so time=0s is when the motors started:
https://github.com/Hermann-SW/MIPI_Came ... a087b9R165
I determined the position whenever the transition from one 50cmx50cm hardboard to the next was in the bottom 3 lines of a frame.
After braking, the thick horizontal black line was in the middle of a hardboard, so at position 145-25=120cm.
Speed is determined as position delta (50cm/25cm) divided by time delta, so the real speed at a given position is not known.
braking1.svt.png
braking1.svt.png
braking1.svt.png (23.72 KiB) Viewed 4442 times
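The speed estimate just described amounts to nothing more than this (positions in cm, timestamps in seconds; my function, purely illustrative):

```c
/* Speed estimate as described above: the robot drives from higher to lower
   positions, so speed is the (positive) position delta divided by the
   timestamp delta of the two frames where board transitions were seen. */
static double speed_cm_per_s(double pos0_cm, double t0_s,
                             double pos1_cm, double t1_s)
{
    return (pos0_cm - pos1_cm) / (t1_s - t0_s);
}
```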

Because of 5cm camera blind spot in front of robot, the camera position at start was 350cm.
braking1.pvt.png
braking1.pvt.png
braking1.pvt.png (18.52 KiB) Viewed 4442 times

P.S:
Watching the youtube video of the last run at playback speed 0.25 (under Settings) provides an interesting audio experience:
https://www.youtube.com/watch?v=PIDuKCsJX4w
Image
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Wed Apr 15, 2020 4:14 pm

Playing a youtube video at 0.25 playback speed is not just fun:
it provides side channel information captured from a robot drive.
E.g. you can hear exactly when the motors drive, and when they brake.

There is no youtube API URL parameter that allows setting the playback speed.
Therefore this small script starts the video at 11.4s, just before the motors start,
and sets the playback speed to 0.25.
But it requires you to click the displayed play button in the video window after opening this URL:
https://stamm-wilbrandt.de/raspcatbot_audio.html
braking1.yt.jpg
braking1.yt.jpg
braking1.yt.jpg (7.7 KiB) Viewed 4389 times

Code: Select all

<div id="player"></div>
<script>
// https://developers.google.com/youtube/player_parameters
var tag = document.createElement('script'), player;
tag.src = "https://www.youtube.com/iframe_api";

var firstScriptTag = document.getElementsByTagName('script')[0];
firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);

function onYouTubeIframeAPIReady() {
  player = new YT.Player('player', {
    height: '300', width: '170', videoId: 'PIDuKCsJX4w',
    events: { 'onReady': onPlayerReady  }
  });
}
	
function onPlayerReady(event) {
  player.setPlaybackRate(0.25);
  player.loadPlaylist('PIDuKCsJX4w', 0, 11.4);
  player.playVideo();
}
</script>

I run Linux, and used Firefox and Chrome to hear the sound.
While both browsers provide exactly the same sound at playback speed 1, the audio is completely different at speed 0.25.
Not sure yet whether that might be helpful for something ...


Btw, today my 3rd T101 platform arrived at home (Germany), 21 days after ordering on aliexpress.
Because I will replace its motors with 1500rpm gear motors anyway, I chose 9V/150rpm gear motors with hall sensor encoders.
I did not have any motors with hall sensor encoders before; this will allow me to play with them -- not for the high speed raspcatbot.
https://www.aliexpress.com/item/32813434124.html
20200415_130351.15%.jpg
20200415_130351.15%.jpg
20200415_130351.15%.jpg (188.75 KiB) Viewed 4389 times
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Wed Apr 15, 2020 7:42 pm

Motor direction reversal at full speed does not work on hardboard like it did on carpet -- a fail.
The braking distance was even slightly longer than yesterday, although the lipos have more voltage today.
I started with 0.1s, then 0.5s and finally 1.0s of full speed reversed motors.
The video shows the same test run three times, with 1.0s:
https://www.youtube.com/watch?v=9l37nVw ... e=youtu.be
braking2.yt.jpg
braking2.yt.jpg
braking2.yt.jpg (14.85 KiB) Viewed 4358 times

I have to rethink what to do next: redo the carpet braking test from 2.5 years ago, or teach raspcatbot to follow a straight black line without cable and cable car mode:
Image
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

User avatar
HermannSW
Posts: 5182
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: raspcatbot

Thu Apr 16, 2020 8:52 am

I did the "braking on carpet" tests last night.
In order to avoid problems of carpet appearing dark and dealt as "line present", I did choose a terminating hardboard where the straight black line is long, but does not reach the end. Cable car raspcatbot is forced moving straight, and the curve is detected as end of line. Braking starts on hardboard, but only few centimeters, the remaining braking is done of carpet:
carpet_braking_temination_hardboard.png
carpet_braking_temination_hardboard.png
carpet_braking_temination_hardboard.png (72.21 KiB) Viewed 4309 times

In the first braking test on carpet raspcatbot lost its front camera ;-) -- no real damage:
https://www.youtube.com/watch?v=9DMnrwG ... e=youtu.be
loosing_front_camera_carpet.png
loosing_front_camera_carpet.png
loosing_front_camera_carpet.png (55.87 KiB) Viewed 4309 times

Braking test with motor direction reversal at full speed; 1 second was too long for "stand still".
The complete video with audio is here:
https://www.youtube.com/watch?v=EYTyzPB ... e=youtu.be
Image


My previous experience of a difference in braking distance between the 00 and 11 direction settings, and of better braking distance for full reversal, must have come from other motor tests in my motor test station years ago. There I was able to brake a robot with simple DC motors (and a motor controller different from the L298N) from 14.37m/s or 51.7km/h to a standstill with 1.8s of full motor reversal:
https://forum.arduino.cc/index.php?topi ... msg2430545
Image


I did many experiments yesterday; findings for the caterpillar robot gear motors:
  • braking distances for all types of braking (00, 11, reversal) are much shorter than on hardboard (no surprise)
  • while full reversal has the shortest braking distance, it is only a few centimeters shorter compared to 00 and 11 direction braking

Finally I went back to the hardboard course, with, as expected, no change in the too long braking distance.
I forgot to put the cushion back as airbag on that end of the course, but braking was good.
braking2.totem.jpg
braking2.totem.jpg
braking2.totem.jpg (65.25 KiB) Viewed 4309 times

Next I will try whether continuously slowing the motor speed to 0 and then increasing it from 0 to full reversal achieves better braking distance results for the caterpillar gear motors. Followed by getting rid of the cable and cable car mode, by making raspcatbot (easily) follow a straight black line.
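The ramp idea can be sketched as a duty-cycle schedule (step size and sign convention are my assumptions; negative duty means "direction pins reversed"):

```c
/* Sketch of the planned braking profile: ramp the PWM duty from full
   forward down through 0 to full reverse in fixed steps. Negative values
   mean "direction pins reversed". Returns the number of entries written. */
static int ramp_profile(int duty[], int max_entries, int full, int step)
{
    int n = 0;
    for (int d = full; d >= -full && n < max_entries; d -= step)
        duty[n++] = d;
    return n;
}
```

With full=255 and step=51 this yields 255, 204, ..., 51, 0, -51, ..., -255; each entry would be held for a few milliseconds before the next duty value is sent to the motor driver.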
https://hermann-sw.github.io/planar_graph_playground
https://stamm-wilbrandt.de/en#raspcatbt
https://github.com/Hermann-SW/memrun
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

Return to “Automation, sensing and robotics”