User avatar
HermannSW
Posts: 4827
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

2-wheel balancing (fast) line following robot

Fri Jan 22, 2021 9:42 pm

I had this idea 14 years ago, and now I want to do it as a small project between so many others.


Some history:
Back in 2007 I soldered an ASURO robot and flashed C programs onto it -- it had two wheels and half a table tennis ball below the PCB front:
(only in German language) https://de.wikipedia.org/wiki/ASURO
Image


This is a list of projects I did with myIrAsuro (German language; photos, animations, links to a German robot forum):
https://stamm-wilbrandt.de/myIrAsuro.html


Flashing code was done from the PC via an IR connection. Mounting the IR LED and IR sensor facing forward and separated allowed measuring the distance to obstacles (here a small cell phone even allowed capturing videos from the moving myIrAsuro):
Image


With very simple code (no obstacle in front => both wheels full speed forward / obstacle in front => left wheel stop, right wheel full speed forward) this allowed the robot to drive rounds in our kitchen. Two ASUROs with different distance thresholds were able to drive as a "2-robot swarm" autonomously (the ASURO with big wheels attached had 1.8m/s maximal speed, and really did slide into the curves):
Image Image
Image Image


I mounted the IR LED and sensor so that the robot, "standing" on its two wheels only, was able to measure the distance to the desk. At first I did not get it completely right, and created a "runaway robot" ;-)
Image


Later I was able to balance on two wheels: the left wheel was stopped the whole time, and changing the threshold distance for the right wheel's forward/backward drive made the robot slanted. That way myIrAsuro drove forward (fast):
Image


In this (German language) posting from 2007 I discussed the idea in this thread's title:
https://www.roboternetz.de/community/th ... post246683



State of the art:
I found many 2-wheel balancing robots on YouTube, but only this one followed a line (slowly) with a camera:
https://www.youtube.com/watch?v=AZ8HnyVm6Bw
out.anim.gif (249.26 KiB)


Plan:
I plan to use the VL53L0X time-of-flight ranging sensor for balancing.
I found many Python libraries, but I prefer C.
This GitHub repo makes the ST Microelectronics library available for the Raspberry Pi; building worked as documented:
https://github.com/cassou/VL53L0X_rasp


I burned my VL53L0X and ordered a new one, which will arrive tomorrow.


I will use these simple motors with wheels as the basis:
20210121_174519.15%.jpg (50.33 KiB)

I have already set up a small test course (German language test course construction: https://stamm-wilbrandt.de/RobotChallenge/):
IMG_20210121_180430_740.50%.jpg (14.61 KiB)

As camera I will use a v1 or v2 camera without lens, a (v1) spy camera, or the Arducam 0.3MP monochrome global shutter camera.


Milestones:
  1. build new robot with motors, Pi0, camera, VL53L0X, 1S Lipo
  2. make it balance
  3. make it move slowly (slanted)
  4. add line following
  5. maximise forward speed while still following line

P.S:
Mid last year my raspcatbot project (Raspberry caterpillar robot) got interrupted by the new HQ camera (microscope with 0.21µm/pixel distance, super-long exposures (60min), attaching the HQ camera to a cheap telescope) and many other projects. It definitely will be resumed (maximal speed of that robot is 5m/s, with 2.55m/s already seen inhouse; line following with a 25.99$ 0.3MP monochrome global shutter camera; frame information extraction will use just-in-front-of-robot (present) as well as far-away (near future) frame information to control the fast-moving robot autonomously):
viewtopic.php?f=37&t=267999&start=50#p1683666
Image
https://github.com/Hermann-SW/memrun
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

blimpyway
Posts: 638
Joined: Mon Mar 19, 2018 1:18 pm

Re: 2-wheel balancing (fast) line following robot

Fri Jan 22, 2021 11:58 pm

Hi,
The robot idea is cool.

The motors look like standard Arduino robot (aka TT) motors. If so, they are quite slow/low torque. They are rated 6V.

About the ToF range sensor for balance: I suppose this is intended to work by measuring the distance to the floor at an inclined angle.
Why use this instead of the usual IMU most balance bots use?
Will it work on inclined or uneven terrain, or when a small object (sitting on the floor) is in its view?

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sun Jan 24, 2021 11:45 am

blimpyway wrote:
Fri Jan 22, 2021 11:58 pm
The motors look like standard Arduino robot (aka TT) motors. If so, they are quite slow/low torque. They are rated 6V.
Yes, they are slow for a racing robot (the 12V 1500rpm gear motors of my raspcatbot can do more than 5m/s when overpowered with 16.8V from a 4S lipo; a laser tachometer showed 1770rpm free running). But for balancing, the simple motors I showed should be good enough. I chose them because they are widely available, cheap, and most likely already present for people who have bought a simple robot set. I measured 3rps at 4V when powered from a 1S lipo; with a 65mm diameter wheel that is still 0.61m/s or 2.2km/h.

About the ToF range sensor for balance: I suppose this is intended to work by measuring the distance to the floor at an inclined angle.
Yes, because that most easily allows setting up an angle for the robot (90° for vertical) and keeping the robot in that position.

Why use this instead of the usual IMU most balance bots use?
I can try that later; I have 6-axis as well as 9-axis IMUs at home.

Will it work on inclined or uneven terrain, or when a small object (sitting on the floor) is in its view?
Not with the laser ranging sensor, but my first test courses consist of flat hardboards as shown in the initial posting.
With an IMU, likely.

HermannSW wrote:
Fri Jan 22, 2021 9:42 pm
This github repo makes ST Microelectronics library available for Raspberry Pi.
Building worked as documented:
https://github.com/cassou/VL53L0X_rasp


I burned my VL53L0X and ordered a new one, which will arrive tomorrow.
I received 3 sensors yesterday, but 3 new Picos arrived earlier that day and I got sidetracked.

I just installed the cassou repo and the ST Microelectronics drivers, and built the library and examples successfully on the Pi400.

This time I did not mix up rows of the Pi400 GPIO header like yesterday, when the module got burned by connecting it to 5V pins ...

The module was detected after connecting:

Code: Select all

pi@raspberrypi400:~/VL53L0X_rasp $ i2cdetect -y 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- -- 
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
20: -- -- -- -- -- -- -- -- -- 29 -- -- -- -- -- -- 
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
70: -- -- -- -- -- -- -- --                         
pi@raspberrypi400:~/VL53L0X_rasp $ 

I captured readings into file "out" from the continuous ranging example:

Code: Select all

$ bin/vl53l0x_ContinuousRanging_Example | tee out

Then I used trend to display them:

Code: Select all

pi@raspberrypi400:~/VL53L0X_rasp $ grep "In loop meas" out | cut -f2 -d: | trend -v - 600x1

uniq analysis (194-211 range, i.e. 17mm spread without changing the sensor position):

Code: Select all

pi@raspberrypi400:~/VL53L0X_rasp $ sort -n n | uniq -c
      1  194
      4  195
      2  196
     16  197
     25  198
     37  199
     56  200
     72  201
     83  202
     74  203
     49  204
     53  205
     31  206
     19  207
     14  208
      5  209
      4  210
      1  211
pi@raspberrypi400:~/VL53L0X_rasp $
trend.png (13.09 KiB)

blimpyway

Re: 2-wheel balancing (fast) line following robot

Tue Jan 26, 2021 3:43 am

A few thoughts again

- I had to check whether anybody else used the same motors for a balancing bot, and found this Russian site, which is excellent at explaining the principles, with physics and equations included: https://zizibot.ru/articles/programming ... 5-mpu6050/
I don't understand Russian, but Google's translation was quite good. Their videos are also interesting; unfortunately there is no translation there. https://youtu.be/3N-8OLf_ofs - these motors, although shaky, do a decent job.

- regarding measuring inclination "visually" - I think you can use the same camera needed to see the line, plus a laser pointer, to measure the incline of the robot. It is a kind of "stereoscopy", but instead of two cameras you use camera + laser pointer, and it measures distance for only one pixel - the one hit by the laser. Unlike a full depth map, calculating depth for a single point should be very quick.

More so,
- regarding inclined terrain - here the camera can help to seek a still image: the point where the standing robot tilts neither back nor forward is the balance point. Maybe picamera's H264 motion vectors could be used to sense not only the whole body's gyration around the wheel axis but also its speed. So the robot "knows" it is closest to vertical when its angular acceleration (tilt) is lowest. As with a gyro.

HermannSW

Re: 2-wheel balancing (fast) line following robot

Wed Jan 27, 2021 3:45 am

Thanks for the investigation+ideas.

Good to see those motors+wheels in the Russian video; I think the shaky movement is caused by the control algorithm.

I think one of the design criteria should be "only needed" and "cheap" components where possible.
So given a camera, a cheap laser LED should be preferred over a fully fledged laser ranging sensor or IMU.
I currently do not see how a fixed camera and a fixed LED should allow distance determination, but I will give it a try.

I have several cheap 3.3V and 5V 5mW laser LEDs that can be directly connected to GPIO pins (they already have an SMD resistor).
They do not give a precise dot, but I will try.
This is a v2 camera 1007fps recording of a "laser clock" consisting of 5 LEDs:
https://forum.arduino.cc/index.php?topi ... msg3973844
Image


I got sidetracked again and created several threads on the Pico/SDK:
viewforum.php?f=145
With a simple benchmark I created long ago, I learned that the Pico is faster than all Arduinos tested, but slower than the ESPs.


I started construction of the balancing robot, used my favorite connection method (supergluing) and attached small Lego pieces to the motors. They will allow for rapid prototyping, and for building a high or low robot, whatever is needed:
20210127_034513.15%.jpg (47.44 KiB)

As can be seen I will go with a cheap v1 camera clone first (4$ on AliExpress with free shipping), and switch to another camera if needed. During dev work I will use a Pi0W; later a simple Pi0 will suffice.

HermannSW

Re: 2-wheel balancing (fast) line following robot

Wed Jan 27, 2021 2:10 pm

The two 2x1 Lego pieces were superglued above a small plastic pin of the motor.
That pin prevented connecting a Lego piece from the bottom, which is needed for stability (as well as one from above).
I used a Dremel to cut small openings into the flat 2x6 bottom Lego piece, and now the balancing robot platform base is firm.

I turned on the laser LED on GPIO17:

Code: Select all

pi@raspberrypi0W:~ $ sudo pigpiod
pi@raspberrypi0W:~ $ pigs m 17 w
pi@raspberrypi0W:~ $ pigs w 17 1

You can see the spot on white paper in front of the left wheel (as said, the laser LED has an SMD resistor attached in line):
20210127_150153.15%.jpg (33.77 KiB)

HermannSW

Re: 2-wheel balancing (fast) line following robot

Wed Jan 27, 2021 5:52 pm

I wanted to see the robot moving now, without Pi0 control.
This is the original speed when powering one wheel with 6V from a constant voltage supply:
Image

HermannSW

Re: 2-wheel balancing (fast) line following robot

Wed Jan 27, 2021 9:32 pm

The robot is getting more complete. I added a Pi0W with v1 camera clone, and a lipo from a quadcopter with cage, which already had a flat Lego piece superglued on from a previous project. The robot can stand because of the blue Lego piece behind/under the right wheel:
20210127_222458.15%.jpg (49.39 KiB)

I SSHed into the Pi0W, captured a 5MP image with "raspistill -o tst.jpg" and downloaded it -- not bad:
tst.20%.jpg (13.43 KiB)

The next step is connecting the laser LED and seeing if/how the distance between the LED and the beam's ground hit can be computed from a camera frame.

HermannSW

Re: 2-wheel balancing (fast) line following robot

Thu Jan 28, 2021 7:11 pm

For the laser LED camera pitch determination tests it was good to have a design based on Lego pieces: I was able to quickly move any component where I wanted it. After trying out alternatives, I ended up with a laser LED angle (laser LED superglued to a Lego piece just above the left wheel) that allows easily telling the pitch between 45° and 105° (normal vertical stance is 90°), 16MP smartphone photo: https://stamm-wilbrandt.de/en/forum/20210128_174831.jpg
Scaled to 15%:
20210128_174831.15%.jpg (26.01 KiB)

This is a frame taken at 90° robot pitch, from video recorded with "raspivid -md 7 -w 640 -h 480 -fps 90 -t 0 -o tst.h264"; the laser dot on the hardboard can easily be seen on the left side:
90°.png (176.52 KiB)

I tried to upload the video to YouTube: at 90fps, converted to 25fps with ffmpeg, converted to only 5fps with ffmpeg -- all uploads hung in YouTube "SD creation"?!? So I took every 90th frame with ffmpeg and created an animated .gif from that, scaled to 33% with the togg and pngs2anim tools. Small, but the moving laser dot can be seen. Robot pitch started at 45° and ended at 105°. I chose the laser position so that the dot was visible for 90°-105° as well (that will allow the robot to (slowly) drive backwards -- not sure whether that is needed):
x.anim.gif (218.64 KiB)

So blimpyway's "use camera + laser pointer" really works.
And it is much simpler than I thought: there is no need to determine the distance from the laser LED to the ground.
Each robot pitch corresponds to a specific vertical position of the laser dot in the frame!

HermannSW

Re: 2-wheel balancing (fast) line following robot

Thu Jan 28, 2021 8:14 pm

I think I will go with my i420toh264 framework:
https://github.com/Hermann-SW2/userland ... i420toh264

The i420toh264 tool converts a raspividyuv i420 stream of frames to .h264 utilizing the GPU.
Before i420toh264, code in the bash pipe can process (and modify if needed) the frames.
The trick is to process only the Y plane of each YUV frame (the first 2/3 of the data) as luminosity; no conversion from RGB is needed.

"Locating an airplane" does nearly what I need (it detects a dark spot against the bright sky, the airplane, and marks it with a 2x2 white square):
https://github.com/Hermann-SW2/userland ... g-airplane
Image


For balancing I need to determine the brightest spot (it does not look that bright to my eyes with the room light on, but it is bright for the v1 camera). To reduce the number of pixels I will capture mode 7 at 90fps, but scaled down to 320x240 pixels. This is a sample frame (robot pitch 90°):
frame.png (26.51 KiB)

I used the gimp threshold tool and increased the threshold so that only part of the laser dot remained white; this is the same frame converted to b/w with threshold 150:
frame.150.png (379 Bytes)

The gimp histogram for frame.150.png says there are 76800 pixels in total, and 50 of them are white. So I will first go with "determine the roughly 50 brightest pixels in the frame and compute their average y-coordinate" for determining the "laser beam position" in the frame. After adding a motor controller, left motor braked and right motor driving forward/backward depending on some threshold for the laser beam position should balance the robot, and it should move forward for pitch <90° as done 14 years ago:
Image

HermannSW

Re: 2-wheel balancing (fast) line following robot

Thu Jan 28, 2021 9:13 pm

I have many motor drivers and will use a simple L9110S module here.
It can drive two DC motors in the 2.5-12V range, with 800mA per channel.

The yellow motors are rated 3-12V, and draw roughly 200mA free running when powered with 4V.
Slowing the rotation by hand quickly increases the current draw.
But there is nothing in this use case that could make a motor draw more than 800mA, so the L9110S is sufficient.

In order to avoid spikes doing bad stuff to the Pi0[W] I will use two lipos, one for the Pi and the other for the motors.
Two drops of superglue and I got a double lipo cage:
20210128_220333.15%.jpg (31.95 KiB)

blimpyway

Re: 2-wheel balancing (fast) line following robot

Thu Jan 28, 2021 10:12 pm

Ha, I'm glad it works!

You are very fast, I'm not sure the robot will be able to keep up with you :D

If the camera-laser frame is rigid enough, then the laser's red dot in the view will follow a single line. That means checking only a few hundred pixels in numpy, which should be quite fast. Even faster if the algorithm starts searching for the spot from the last known (previous frame) position.

The fact that the dot isn't "small" as in one pixel isn't much of an impediment; its center should be easy to calculate. In fact, having more red pixels (a spot instead of a dot) allows sub-pixel precision of the spot's center position.

blimpyway

Re: 2-wheel balancing (fast) line following robot

Thu Jan 28, 2021 10:26 pm

You said you prefer C over python. If you change your mind I would gladly attempt to write a real time laser pointer tracker in python with picamera library.

PS: out of curiosity I tested a "red dot position calculator" in a few lines of numpy.
I generated a random-noise 240x360 RGB image (as the picamera module produces).
It yields ~130fps on the whole image, and ~400fps when it crops a 240-rows x 50-columns area of interest.
On a Zero W, of course.

HermannSW

Re: 2-wheel balancing (fast) line following robot

Fri Jan 29, 2021 4:15 am

blimpyway wrote:
Thu Jan 28, 2021 10:26 pm
You said you prefer C over python. If you change your mind I would gladly attempt to write a real time laser pointer tracker in python with picamera library.
That might be an option; the i420toh264 framework I pointed to for processing raspividyuv i420 frames in a bash pipeline definitely allows for Python plugins as well.

Yes, I might be able to further reduce the frame size (perhaps 160x120 might suffice). But in parallel to determining the laser dot, the line to follow needs to be tracked later as well, for generating the motor control output, which might be challenging for Python at 90fps (of course the framerate can be reduced as well).

blimpyway wrote:
Thu Jan 28, 2021 10:12 pm
Ha, I'm glad it works!

You are very fast, I'm not sure the robot will be able to keep up with you :D
4 years ago I played with my first caterpillar robot, which was able to do 2.4m/s (8.64km/h).
I had to run while controlling the robot with a wired 2m-cable joystick; video from the (v1) front camera:
https://forum.arduino.cc/index.php?topi ... msg3239076
Image

The newer caterpillar robot base I used for raspcatbot can do up to 5m/s (18km/h) -- no chance for me to keep up with that one running autonomously outside (I plan to follow the robot by bicycle, with a dead-man's button for emergency stop). The red dead-man's button is on the left side, the joystick with display on the right side of my ESP32-based "wireless joystick" (the joystick will not be used for the autonomous robot):
https://esp32.com/viewtopic.php?f=19&t= ... 513#p57291
Image


While the 2-wheel balancing robot will run autonomously, making use of the wireless joystick's red dead-man's button for emergencies during dev might be useful.


The balancing robot videos will be important for dev work on the balancing and line following code. Maybe there will be funny videos of this robot as well, like the old caterpillar robot overturning and ending upside down ;-)
Image

RedMarsBlueMoon
Posts: 354
Joined: Mon Apr 06, 2020 3:49 am

Re: 2-wheel balancing (fast) line following robot

Fri Jan 29, 2021 5:14 am

Awesome project! :)
It reminds me of the OpenAI Gym 'Cartpole' example that I never did! :)

https://youtu.be/hQK_3C6S4Ak?t=93

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 1:53 pm

@RedMarsBlueMoon interesting.


I cloned my userland fork, which contains the i420toh264 framework, onto the Pi0W:
https://github.com/Hermann-SW2/userland/

I built it according to the instructions without issues:
https://github.com/Hermann-SW2/userland ... ding-on-pi

Finally I did some simple tests, and later played with the "simple modifying pipeline", where sample_yuv2grey.c converts the raspividyuv i420 frames to grey frames (keep the Y plane, set the complete U and V planes to 128):
https://github.com/Hermann-SW2/userland ... g-pipeline

That was the first time I saw frameskips when recording at 90fps. I use the ptsanalyze tool to analyze timestamp files written with the raspivid[yuv] "-pts" option:
https://github.com/Hermann-SW/userland/ ... ptsanalyze

It turned out that two changes were needed to run the pipeline at 90fps without frameskips:
  • redirect the stderr output of the i420toh264 tool into a file
  • do not show the preview window of raspividyuv
This is the pipeline I used, resulting in a 90fps 320x240 tst.h264 without frameskips:

Code: Select all

$ raspividyuv -md 7 -w 320 -h 240 -fps 90 -pts tst.pts -n -o - | \
> ./sample_yuv2grey 320 240 | ./i420toh264 tst.h264 320 240 2> err
$ 
This is proof of no frameskips:

Code: Select all

$ ptsanalyze tst.pts 
creating tstamps.csv
452 frames were captured
majority framerate 90fps
  frame delta time[us] distribution
      1 11085
      2 11086
      3 11087
      9 11088
     34 11089
     81 11090
    183 11091
     93 11092
     33 11093
      5 11094
      3 11095
      1 11096
      1 11097
> after skip frame indices (middle column)
0 frame skips (0%)
average framerate 90fps
$ 

This is a frame from the "Minimal use pipeline", with colors:
https://github.com/Hermann-SW2/userland ... n-pipeline
color.png (45.04 KiB)

This is a frame from the modifying pipeline shown above, completely grey:
grey.png (28.83 KiB)


Because the modifying pipeline did not show any frameskips, we know that sample_yuv2grey.c took less than 11.1ms per frame (1000ms/90fps = 11.1ms/frame):
https://github.com/Hermann-SW2/userland ... yuv2grey.c

Next I will start with that code, use a count array "cnt" to count how many pixels have grey value x, for x in 0..255, and determine m such that "sum_{i>m} cnt[i] < 50" and "sum_{i>=m} cnt[i] >= 50". Then I will determine the mean y-coordinate of all pixels at least m bright, and add a (blue?) horizontal line to the frame to indicate the determined y-coordinate for each frame. When going through robot pitches 45°..105°, the recorded .h264 video will show whether this computation is good enough.


P.S:
Currently there is a problem when the laser beam hits the ground on the black line: in that case the dot becomes too dark to be detected. On the other hand, if the robot follows the line with the line mostly under the robot's center, the laser beam should not hit the black line ...


P.P.S:
Just in case the brightness of the laser LED really becomes an issue, I will simply replace the laser LED with the very bright green laser from a 2$ laser pointer:
viewtopic.php?f=43&t=281646#
That was able to shine >500m into the night, and is even bright on a 44m distant roof, so it should be more than bright enough over 20cm or so:
Image

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 4:02 pm

I am building sample_laserBeam.c, having started from sample_yuv2grey.c.
I wondered why the determined lines were so near the top of the frame and not near the laser beam.
So after determining the maximal m with "sum_{i>=m} cnt[i] >= 50", I simply set Y pixels to 255 if Y>=m, and to 0 otherwise. From the created tst.h264 I extracted the first second (90 frames) and created an animated .gif playing at 30fps. As can be seen there is bright junk at the top right, so 50 pixels is too high a threshold:
x.anim.gif (53.32 KiB)

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 4:52 pm

I did a quick test with the green laser LED mentioned, connected to a 3.3V constant voltage power supply. Then I took mode 7 320x240 frames, with raspistill this time: one with the green laser on black tape, one on white hardboard. The red laser can be seen as well -- no comparison. The green laser dot is even brighter on black tape than the red laser on white hardboard:
x.anim.gif (52.97 KiB)

I will replace the red laser LED with the LED half of the green laser pen. It will be superglued to a Lego piece of the robot like the red LED was. As can be seen, powering the green laser is most easily done using logic analyzer probes:
green_laser.png (119.08 KiB)

So it was not good enough for the robot to bring its own laser light; that light also needs to be bright enough for the task. Since the laser dot will not be as close to the black tape as just shown, the black tape should remain easy to see for line following as well.

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 8:06 pm

I tried to power the green laser from the lipo directly -- it worked nicely despite the more than 4V. But after some time the green laser got really hot, so I stopped that. Then I used a Pololu U1V11F3 voltage regulator to bring the lipo voltage down to 3.3V, but it did not really work. So I decided to go with the whole green laser pen for now, to make progress on balancing and line following. I taped the on/off switch down firmly so that the laser is always on. Turning it off can be done by unscrewing the battery half of the pen. I superglued it with the same slope as the red laser LED before:
20210130_201211.15%.jpg (41.6 KiB)

Next I wanted to reduce the radius of the laser dot on the hardboard a bit. I covered the front with black insulation tape and drilled a small hole into it. Now the laser dot radius on the hardboard is reduced:
20210130_201235.15%.jpg (28.32 KiB)

RedMarsBlueMoon

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 8:39 pm

You might get a better average contrast image if you put a colored gel in front of the lens, I'm guessing? (red for red laser, green for green laser)
At the very low cost in latency of the light travelling slightly slower in that medium, for about a 10th of a mm! :)

HermannSW

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 9:04 pm

RedMarsBlueMoon wrote:
Sat Jan 30, 2021 8:39 pm
At the very low cost for latency of the light travelling slightly slower in that medium for about a 10th of a mm! :)
:)


I created the same type of video as I did with the red laser LED before, now with the green laser. I took 9 frames from 45° to 105° pitch of the robot and created a 320x240 animation from them -- so much more contrast with that green laser(!), and the room lights are on:
x.anim.gif (246 KiB)

RedMarsBlueMoon
Posts: 354
Joined: Mon Apr 06, 2020 3:49 am

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 9:25 pm

That green laser might be stronger than you need? You're getting quite a lot of contaminating 'spread' around the core there; do you know what that is? Is it light scattering around the camera lens, light hitting dust particles in the room, or the green laser diffracting against the sides of your aperture?
Could a lower-intensity light give you a more contained dot?

blimpyway
Posts: 638
Joined: Mon Mar 19, 2018 1:18 pm

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 9:36 pm

If you have a short video of the moving robot with the red dot & track in view, showing all situations:
- max incline back and forth, passing through the vertical position
- the laser dot both beside the track and on it

then I would attempt to write the code to "read" both the laser dot and the dark track positions within a frame. I'm curious how fast it could be.

Regarding the laser reflection being attenuated on the dark track, here are some thoughts:
- the laser dot might still be traceable; that's why I want to look at actual "footage".
- adding the colors up into a gray shade, instead of filtering for a specific color, certainly makes things worse. E.g. in RGB format a crude filter is to discard the green and blue channels when searching for "high" red.
- if the track is shiny (e.g. duct tape), most of the light is reflected away, because the laser beam hits the surface obliquely, not perpendicularly.
- matt paint in a lighter color than black will certainly help.
- a too-powerful laser might similarly impede correctly locating the darker track.
- the size of the dot, within reasonable limits, should not be too big a concern.
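The crude red filter from the list above can be sketched in a few lines of Python. This is only an illustration of the idea, not code from the thread; the 240/120 cut-off values are arbitrary assumptions, and the extra green/blue check (to reject near-white floor pixels) is an addition beyond simply discarding those channels:

```python
# Crude "high red" filter: keep only pixels with a strong red channel,
# using the green and blue channels only to reject near-white pixels.

def high_red_mask(frame, red_min=240, other_max=120):
    """frame: 2D list of (r, g, b) tuples; returns a 2D boolean mask."""
    return [[r >= red_min and g <= other_max and b <= other_max
             for (r, g, b) in row]
            for row in frame]

# One-row test frame: laser-red, near-white floor, laser-red.
frame = [[(255, 30, 30), (200, 200, 200), (250, 60, 60)]]
# high_red_mask(frame)[0] -> [True, False, True]
```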

PS: green is fine too, but I'm curious what is left of the track where the beam shines on it.

A workaround to address both problems (low visibility of either track or beam when they overlap) would be to have a pair of lasers, left and right, spaced far enough apart that at least one of them is away from the track. On every odd frame turn on only the left laser, and on every even frame only the right one.

That gives at least every other frame a "clean view" of both laser and track.
It might also help to simply detect when the track goes left or right: the corresponding dot will drop in intensity a lot.
Actually, having two laser dots might provide both tilt and track detection, similar to the simplest line-following robots with a pair of IR reflective sensors.
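A minimal Python sketch of that odd/even laser scheduling (driving the actual laser GPIOs is left out, and the function name is made up for illustration):

```python
# Alternate which laser fires on each camera frame so that every
# other frame shows each dot against a clean background.

def laser_for_frame(frame_index):
    """Odd frames pulse the left laser, even frames the right one."""
    return "left" if frame_index % 2 else "right"

schedule = [laser_for_frame(i) for i in range(6)]
# -> ['right', 'left', 'right', 'left', 'right', 'left']
```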

blimpyway
Posts: 638
Joined: Mon Mar 19, 2018 1:18 pm

Re: 2-wheel balancing (fast) line following robot

Sat Jan 30, 2021 11:35 pm

RedMarsBlueMoon wrote:
Sat Jan 30, 2021 9:25 pm
That green laser might be stronger that you need? You're getting quite a lot of contaminating 'spread' around the core there do you know what that is? Is it light scattering around the camera lens? Or light hitting dust particles in the room? Or the green laser diffracting against your slit sides?
Could a lower intensity light give you a more contained dot?
As long as the pixels form a consistent "shape" that should not be a problem; a program should select a set of the brightest ones.

Here are the results for the green animated gif (9 frames):

Code: Select all

[167.98724954  12.07103825] 549 253
[149.19260401  18.57935285] 649 253
[122.43059937  29.20662461] 634 253
[96.53932584 39.43258427] 534 253
[72.05957447 48.77446809] 470 253
[51.12742382 56.98060942] 361 253
[37.3971831  62.52957746] 355 253
[20.23148148 69.37654321] 324 253
[11.10646388 72.46768061] 263 253
I ignored the red and blue channels and measured only green.
The last two integers on each line are the number of pixels and the light-intensity threshold (255 is the maximum color brightness in RGB format);
the first two floats are the [line, column] position of the CoG of the pixels above the threshold.
The top left corner is [0,0].

The speed reaches ~360 fps for the algorithm alone in Python. It assumes the gif covers the full range of tilt, so it looks for the spot only in the first 180 rows x 80 columns.
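The search described above might look roughly like this. It is a pure-Python sketch on a synthetic frame: the 253 threshold and the 180x80 window are taken from the post, everything else (names, frame layout) is an assumption; the real code would run numpy over 320x240 RGB frames.

```python
THRESHOLD = 253        # green-channel cut-off, 255 is maximum brightness
ROWS, COLS = 180, 80   # search window covering the dot's full tilt range

def spot_cog(frame):
    """frame: 2D list of (r, g, b) tuples.
    Returns ([row, col] centre of gravity, pixel count) of the pixels
    whose green channel exceeds THRESHOLD inside the search window."""
    row_sum = col_sum = count = 0
    for r in range(min(ROWS, len(frame))):
        for c in range(min(COLS, len(frame[r]))):
            if frame[r][c][1] > THRESHOLD:   # green channel only
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None, 0
    return [row_sum / count, col_sum / count], count

# Synthetic 180x80 frame with a 3-pixel bright spot near (10, 20):
frame = [[(0, 0, 0)] * 80 for _ in range(180)]
for r, c in [(10, 20), (10, 21), (11, 20)]:
    frame[r][c] = (0, 255, 0)
cog, n = spot_cog(frame)   # cog ≈ [10.33, 20.33], n == 3
```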

The real problem will be seeing the track. The gradient from the spot is so large that many pixels on the track close to the spot are brighter than most of the floor.

blimpyway
Posts: 638
Joined: Mon Mar 19, 2018 1:18 pm

Re: 2-wheel balancing (fast) line following robot

Sun Jan 31, 2021 12:09 am

Another way to alleviate the high intensity of the laser is to mount it exactly in the center.

That leads to two useful results:
- the bright spot (almost) does not move sideways as in the animated gif, so the tilt measurement can be sped up even further by searching only a rectangle a few pixels wide.
- when the black track is off-center, e.g. to the right of the image, the right side of the picture becomes slightly darker than the left side, quite reliably.
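A sketch of that left/right darkness test, assuming a grayscale frame (pure Python, untuned, names invented for illustration):

```python
def track_side(frame):
    """frame: 2D list of gray values 0..255.
    Returns 'left' or 'right' for the darker half (where the black
    track presumably sits), or 'center' if both halves match."""
    half = len(frame[0]) // 2
    left = sum(px for row in frame for px in row[:half])
    right = sum(px for row in frame for px in row[half:])
    if left < right:
        return "left"
    if right < left:
        return "right"
    return "center"

# Uniform gray floor with a dark track column in the right half:
frame = [[128] * 8 for _ in range(4)]
for row in frame:
    row[6] = 10
# track_side(frame) -> "right"
```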

Return to “Automation, sensing and robotics”