Re: Creating a RPI-assisted microscope
@GRBL: There are variants for up to 6 axes, with servo support etc., but you'd need an ATmega2560 (any 3D printer board will do) or an STM32. Stepper motors are usually driven with current control, not voltage control, so you can use 12V for your 5V steppers. To be precise, a rating of "5V 0.1A" only defines the coil resistance (e.g. 50 Ohm).
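The coil-resistance point follows directly from Ohm's law; a minimal check with the rating from the post (the 12V/0.4A second case is my own illustrative value):

```python
# Ohm's law: a "5V 0.1A" rating only fixes the coil resistance R = V/I.
# A current-limiting driver can then chop a higher supply voltage (e.g.
# 12V) while still keeping the coil current at its rated value.
def coil_resistance(v_rated, i_rated):
    return v_rated / i_rated

assert coil_resistance(5.0, 0.1) == 50.0   # Ohm, as stated in the post
```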
Looking for professional help? https://klepp.biz
Re: Creating a RPI-assisted microscope
Wow!
Entering coordinates in a Linux screen session to an Arduino Uno connected to a Raspberry Pi 4B on /dev/ttyUSB0 allows (x,y) positioning of the two-axis microscope sliding table in steps of 5µm. The preview magnification was chosen so that one HDMI display preview pixel corresponds 1:1 to one pixel of the 12MP frame. The left console window shows the coordinates reaching the final position; micrometer division centers are 10µm apart. The object visible in the Figure of Merit rectangle is slightly more than 10µm large in both dimensions. When positioned on a new origin, pressing the Arduino reset button sets the new origin coordinate as (0,0) -- never was positioning at micrometer range easier. More details tomorrow.
(right click 1360x768 raspi2png snapshot for 100% view)
P.S:
Part of a 16MP smartphone detail photo of the Raspberry HQ camera microscope with the microscope two-axis sliding table, a shard with micrometer superglued on top. 50% scaled photo below, 100% here:
https://pbs.twimg.com/media/EabwN_KWsAI ... name=large
P.P.S:
There is a simple explanation why the 4× factor works.
12MP frame dimension is 4056x3040, and 3040/4=760.
My "luck" was that my HDMI (TV) display is small (1360x768) compared to modern monitors, and 768 nearly perfectly matches 760.
Instead of
Code: Select all
🍓 ~/userland/build/bin/raspistill --focus -t 0 --roi 0.375,0.375,0.25,0.25
I will use this for perfect centered 1:1 pixel mapping from the 12MP frame to the HDMI display from now on (768/3040 ≈ 0.2526):
Code: Select all
🍓 ~/userland/build/bin/raspistill --focus -t 0 --roi 0.3737,0.3737,0.2526,0.2526
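The two --roi parameter sets above differ only in rounding; the exact fractions can be derived from the frame and display heights (a quick sketch, values from the post):

```python
# Centered ROI for 1:1 pixel mapping: show disp_h display rows of the
# frame_h-row 12MP frame, i.e. a height fraction of disp_h/frame_h,
# with the remaining rows split equally above and below (centered).
def centered_roi(frame_h, disp_h):
    frac = disp_h / frame_h          # fraction of the frame shown
    off = (1.0 - frac) / 2.0         # centered offset
    return round(off, 4), round(frac, 4)

assert centered_roi(3040, 768) == (0.3737, 0.2526)  # the --roi values
```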
P.P.P.S:
Wow, although 768 is really close to 760, using the correct "--roi" as per P.P.S results in a better preview:

https://github.com/Hermann-SW/RSA_numbers_factored
https://stamm-wilbrandt.de/GS_cam_1152x192@304fps
https://hermann-sw.github.io/planar_graph_playground
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/
Re: Creating a RPI-assisted microscope
I did publish Arduino Uno sketch allowing to control X and Y direction of sliding-table with 5µm steps:
https://forum.arduino.cc/index.php?topi ... msg4641633
Apart from all the user input parsing, this is the simple code that moves the sliding table from the current position (x,y) to the new position read from Serial (nx,ny):
Code: Select all
…
while ((x!=nx) || (y!=ny)) {
if (x<nx) step(0, ++x);
if (x>nx) step(0, --x);
if (y<ny) step(1, ++y);
if (y>ny) step(1, --y);
delay(d);
}
…
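A quick Python simulation of the loop above (my sketch, not the author's code; the sketch's step(coil, position) call is replaced by recording into a list), showing that diagonal moves interleave both axes and the loop terminates exactly at the target:

```python
# Simulation of the Arduino move loop: each axis advances one step per
# iteration until it reaches its target; unequal distances just mean
# the shorter axis finishes first.
def move(x, y, nx, ny):
    trace = []                       # (axis, position) pairs, like step()
    while x != nx or y != ny:
        if x < nx: x += 1; trace.append((0, x))
        if x > nx: x -= 1; trace.append((0, x))
        if y < ny: y += 1; trace.append((1, y))
        if y > ny: y -= 1; trace.append((1, y))
    return x, y, trace

x, y, trace = move(0, 0, 3, -2)
assert (x, y) == (3, -2)
assert len(trace) == 5               # 3 x-steps + 2 y-steps
```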
Now that the X and Y directions can be controlled in single-digit micrometer steps, I found it annoying to have to control the Z direction via the microscope stand wheel by hand, especially given the much reduced depth of field with the macro lens. So I just tried it, and it worked: a superglued 28BYJ-48 stepper motor now controls the microscope wheel. One full rotation moves 18mm vertically, so with 2048 full steps per rotation a single step in the Z direction is 8.8µm(!):
https://www.youtube.com/watch?v=SE6mB6ZdpRI
The next step is to implement microstepping. Currently the sketch uses half-stepping with 5µm per step in the X and Y directions.
7.1µm per step with full-stepping is more reliable, so I will go with that.
The sketch uses pins 2-5 and 8-11; of those, Uno digital pins 3, 5, 9, 10 and 11 allow for PWM.
I will use String.toFloat(), which only gives 2 decimal digits of precision after the decimal point.
But 7.1µm/100 = 0.071µm(!) per step in the X and Y directions of the sliding table should be more than good enough

[the 12MP frame has 0.21µm/px, so a single stepper motor microstep would move less than one pixel ...]
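The step sizes quoted above check out (18mm of Z travel per 2048 full steps; the 1/100 microstep factor comes from the two-decimal toFloat() resolution):

```python
# Step sizes from the post: 18mm of travel per 2048 full steps (Z),
# and a 7.1µm full step divided by a 1/100 microstep factor (XY).
z_um_per_step = 18000 / 2048         # µm per Z full step
assert round(z_um_per_step, 1) == 8.8

xy_um_per_microstep = 7.1 / 100      # µm per 1/100 microstep
assert xy_um_per_microstep < 0.21    # below one 0.21µm/px camera pixel
```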
Re: Creating a RPI-assisted microscope
I changed the Arduino Uno sketch for the sliding table to process only one input (Z axis), and to use the Stepper library instead:
https://forum.arduino.cc/index.php?topi ... msg4642386
I started the Raspberry Pi HQ camera microscope view; in the left bottom terminal you see my inputs. I moved 1 stepper motor step deeper each time, i.e. 8.8µm per step. I took 1360x768 screenshots after each move (from a laptop ssh session) and created an animated .gif from the captured frames. Right click to see the animation at 100% size:

This is a new kind of focusing: if you have moved too deep, just move 100 steps up and then down again. This avoids issues with gravity and with moving a single step up.
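The "always approach from above" trick can be illustrated with a toy backlash model (the model and the value B are my assumptions, not measurements from the post): up to B commanded steps vanish into mechanical play whenever the direction reverses, so after a long downward move, further downward steps track exactly while a single upward step moves nothing.

```python
# Toy backlash model: the stage position is clamped to [cmd - B, cmd],
# i.e. up to B commanded steps are absorbed by play after a reversal.
B = 3  # steps of mechanical play, hypothetical value

def follow(stage, cmd):
    return min(max(stage, cmd - B), cmd)

# the focusing recipe from the post: overshoot upward, come back down,
# then do the fine approach downward only
cmd = stage = 500
for delta in (+100, -100, -20):
    cmd += delta
    stage = follow(stage, cmd)
assert stage == cmd == 480           # downward approach tracks 1:1

# whereas a single step up from here is swallowed by the play:
assert follow(stage, cmd + 1) == stage
```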
Re: Creating a RPI-assisted microscope
You could use the standard deviation (I think that's what you show in the rectangle) to implement an autofocus - at least that's what I did for automatic solar observation. And you could also use the stack of images to create an overall crisp image ...
Re: Creating a RPI-assisted microscope
The rectangle is the new "Figure of Merit (FoM)" focus feature of raspistill ("--focus"):
viewtopic.php?f=43&t=273804
Interesting. (The FoM values are not accessible with the 4.7 kernel on Buster.)
I did rpi-update my 2GB Pi4B and have the 5.4 kernel on it. From 6by9 in another thread:
6by9 wrote: ↑ Tue May 19, 2020 9:13 am, about the rpi-5.4.y kernel:
You'll have /dev/video13 to /dev/video16. 13 is the input, 14 is the high res output, 15 is the low res output, and 16 is the stats output.
The format of the stats is as documented in https://github.com/raspberrypi/linux/bl ... -stats.rst and the struct at https://github.com/raspberrypi/linux/bl ... isp.h#L310
I tried to access /dev/video16 but was not able to read even a single byte.

A small sample would be really helpful.
If I could read them, autofocus could be done. In that case, implementing microstepping for the microscope stand wheel stepper motor would allow much finer vertical positioning, less than 0.1µm per step ...
Re: Creating a RPI-assisted microscope
What I experienced is that moves to far-away positions with the microscope two-axis sliding table work reliably (5µm per step in X and Y direction), but for small distances, especially single steps, positioning is not that good. I learned that the reason is most likely backlash. So I simulated a longer distance by first moving away 8 steps with both steppers, and then moving to the target position.
I used this command to create the video (unlike before, where I used the Linux "screen" command to control the Arduino Uno connected to the Pi4B, stty allows setting the baud rate for /dev/ttyUSB0; after that the device can be used like a file, and the positions echoed to it are executed as commands by the Arduino Uno sketch):
Code: Select all
🍓 stty ospeed 115200 --file=/dev/ttyUSB0 &&
> for((i=0;i<100;++i))
> do
> echo "8,8"; sleep 0.1; echo "0,0"; sleep 0.5;
> echo "8,8"; sleep 0.1; echo "0,1"; sleep 0.5;
> echo "8,8"; sleep 0.1; echo "1,1"; sleep 0.5;
> echo "8,8"; sleep 0.1; echo "1,0"; sleep 0.5;
> done > /dev/ttyUSB0
^C
🍓
The 4:19min video is here (centers of micrometer divisions are 10µm apart):
https://www.youtube.com/watch?v=C2Orst9qGrI
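The shell loop above can also be sketched programmatically; this only builds the "x,y" command strings the Arduino sketch parses (actually writing them to /dev/ttyUSB0, e.g. with pyserial, is not shown, and the (8,8) overshoot value is taken from the script above):

```python
# Backlash compensation as in the shell loop: before each target,
# first command an (8,8) overshoot so every final approach comes
# from the same direction.
def compensated_moves(targets, overshoot=(8, 8)):
    cmds = []
    for x, y in targets:
        cmds.append("%d,%d" % overshoot)   # take up the play first
        cmds.append("%d,%d" % (x, y))      # then go to the target
    return cmds

square = [(0, 0), (0, 1), (1, 1), (1, 0)]
assert compensated_moves(square) == [
    "8,8", "0,0", "8,8", "0,1", "8,8", "1,1", "8,8", "1,0"]
```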
Re: Creating a RPI-assisted microscope
New work on getting xy two-axis sliding-table stepper motors from 5µm/step and z axis stepper motor from 8.8µm/step down to 1µm/step happens here:
"Uno" µm xyz positioning system for 0.21µm/pixel microscope(Raspberry HQ camera)
https://forum.arduino.cc/index.php?topic=691123.0
From the last posting in that new thread, the bill of materials (prices listed only if >5$):
- Raspberry Pi (5$ Pi0, 25$ Pi3A+ or 35$ 2GB Pi4B)
- 26$ 1024x600 9" HDMI display (if needed)
- --
- 20$ Arduino Uno (or cheap clone)
- --
- 50$ Raspberry HQ camera (without lens)
- 20$ Arducam M12 180° lens with CS- to M12-mount adapter
- 2× CS to C-mount extension rings
- --
- 15$ aluminum alloy microscope stand
- 28BYJ48 stepper motor
- few Lego pieces, superglue
- --
- microscope two-axis sliding-table
- --
- 3× uln2003 stepper motor drivers
- 2× 74HC00N, 1×74HC02N and 1×74HC04N ICs
- breadboard, cables, boxes
To control the three stepper motors' 12 coil ends with a single Arduino Uno, which has only 6 PWM pins, the four 74xx ICs are needed.
A prototype for one stepper motor is done and already works (two PWM pins and two non-PWM pins controlling the 4 coil ends):
https://www.youtube.com/watch?v=B9wgbDBsAP4
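The post lists the 74HC00/02/04 ICs but not the wiring; purely as an illustration (my assumed scheme, not the author's circuit), here is how inverters and NANDs can multiply pins: one Arduino pin, gated by a shared enable line, yields two complementary active-high driver inputs.

```python
# Gate models for the listed ICs (each IC contains several such gates).
def nand(a, b): return int(not (a and b))  # one 74HC00 gate
def inv(a):     return int(not a)          # one 74HC04 gate

# Hypothetical wiring: pin p drives two complementary coil-driver
# inputs, both forced off when the shared enable line is low.
def phase(p, en):
    in_a = inv(nand(p, en))        # = p AND en
    in_b = inv(nand(inv(p), en))   # = (NOT p) AND en
    return in_a, in_b

assert phase(1, 1) == (1, 0)       # pin high: end A energized
assert phase(0, 1) == (0, 1)       # pin low: end B energized
assert phase(1, 0) == (0, 0)       # disabled: both ends off
```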

Re: Creating a RPI-assisted microscope
Great work!
In order to get small precise movement without backlash you may consider flexures.
If you have a 3D printer, there are some examples on Thingiverse; that's the easiest way to make them.
Here's a good introduction if you're not familiar with flexures: https://www.youtube.com/watch?v=PaypcVFPs48
Re: Creating a RPI-assisted microscope
Interesting, this "XY Table Stage flexure" may be worth a try:
https://www.thingiverse.com/thing:1072440
This flexure demonstrates a 15:1 reduction:
https://www.thingiverse.com/thing:1424169
With flexure reduction there is no need for micro-stepping; a 15:1 reduction would bring 5µm per step down to 0.33µm per step.
The 8.8µm per step for the Z axis was with full-stepping, so with half-stepping it is 4.4µm per step, and <0.3µm per step with a 15:1 reduction.
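The reduction arithmetic above, checked (all values from the post):

```python
# Flexure-reduction arithmetic: step sizes divided by a 15:1 motion
# reduction make micro-stepping unnecessary.
xy = 5.0 / 15          # µm per xy half step after 15:1 reduction
z  = 8.8 / 2 / 15      # Z half-stepping, then 15:1 reduction
assert round(xy, 2) == 0.33
assert z < 0.3         # µm per step, as stated
```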
I will see how microstepping works, and look into flexures when my son has access to the 3D printers at university again (currently he has not, because of COVID-19).
Perhaps autofocus will become available for Z axis, see:
"diy autofocus with /dev/video16 and stepper motor?"
viewtopic.php?f=43&t=278268
Since yesterday I have the correct way to drive the tiny bipolar steppers for the xy axes.
Currently they are driven directly from Arduino digital pins; later an L293D will be used:
https://forum.arduino.cc/index.php?topi ... msg4653923
P.S:
I mentioned it before: the minimal resolution of an optical microscope is d=0.2µm:
https://en.wikipedia.org/wiki/Optical_m ... imitations
I now found that the maximal magnification without oil immersion is 800×.
Not sure whether that is what the microscope with 0.21µm/px currently has:
https://en.wikipedia.org/wiki/Magnifica ... nification

Re: Creating a RPI-assisted microscope
I have noticed the microscope view of the micrometer shaking when I entered the room before.
Today I took a video:
https://www.youtube.com/watch?v=VCirNaVANKc
"inhouse heel-strikes seen by 0.21µm/pixel microscope"
Shakes are in the tens-of-micrometers range, even those of normal walking. Centers of micrometer divisions are 10µm, or 48 pixels, apart. I walked from the hallway into the room, turned left before the microscope and walked 3m away to the window, then returned to the microscope and entered the hallway again. Finally I went in and did a small and then a big heel-strike while standing (those sound loud).
They are all of the form "tiny rise, then big step down, and finally return" in the Figure of Merit values.
I cut out 1 second of a 30fps 1920x1080 smartphone camera video; the animation, scaled to 640x360, plays at 2fps, 15× slower than real time.
Here you can see a constant phase at 862, a slight rise to 863, a fall to 792 during the shake, and finally a return above 800, then the loop repeats:

If you watch the walking steps in the first part of the video (stop the youtube video, single-frame step forward/backward with the "."/"," keys) you will notice that the minimal value of a shake is deeper the closer the heel-strike is to the microscope. Even 3m away, the heel-strikes can be recognized in the Figure of Merit values. While the absolute values have no real meaning, the relative changes have. Naushir said that readout sample code for the FoM values will be available soon (viewtopic.php?f=43&t=278268#p1685252).
This seems to be a "Mission Impossible"-like floor movement detection system: no sensors attached to the floor, just the HQ camera ...
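The "tiny rise, big step down, slow return" signature could be picked out of a FoM time series with a simple baseline-drop rule; a toy sketch (my illustration only, the 20-count threshold is an assumption, not from the post):

```python
# Toy heel-strike detector for a FoM time series: flag any sample that
# drops far below the tracked baseline, then re-arm during recovery.
def heel_strikes(fom, drop=20):
    baseline = fom[0]
    hits = []
    for i, v in enumerate(fom):
        if baseline - v >= drop:     # big step down from baseline
            hits.append(i)
            baseline = v             # wait for recovery before re-arming
        elif v > baseline:
            baseline = v             # track the slow return upward
    return hits

# shape from the post: 862 constant, rise to 863, fall to 792, recover
series = [862, 862, 863, 792, 805, 820, 861, 862]
assert heel_strikes(series) == [3]
```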

P.S:
The experiment is done on the 1st floor of our house, and we have reinforced concrete floors. They have so much steel inside that the WiFi signal from the 1st floor cannot reach the basement; therefore our router is on the ground floor, to reach all three floors. There is floor pavement on top of the reinforced concrete, and carpet on top of that as well.
Last edited by HermannSW on Fri Jun 26, 2020 10:02 pm, edited 1 time in total.
Re: Creating a RPI-assisted microscope
I cut a very small needle with a knife and placed it onto the shard with the micrometer. It crosses the black circle of the micrometer, so I manually focused and searched along the black circle. Then I carefully moved to the inside of the black circle. Unfortunately I found the side that I had cut with the knife, and not the pin point; I will try to move to the other side now. But I measured the width of the needle at a constant-width place as 148 pixels, that is 31.08µm(!). Here is a 1360x768 screenshot for now (right click for original size):
Re: Creating a RPI-assisted microscope
I was wrong, the bright part I measured was only part of the lancet's side (not a needle)!
I moved to the other side and found the pin point, so small(!).
I measured the length of the line between the bright pin point and the dark rest of the lancet with gimp as 63.1px:
Code: Select all
63.1px * 0.21µm/px = 13.251µm
Not sure whether it is just bright light at the pin point, or whether there is different material.
Anyway, I would not have expected to have objects with <10µm size at the pin point at home in numbers.
Type of lancet (uncut):
https://en.wikipedia.org/wiki/Blood_lancet
Re: Creating a RPI-assisted microscope
Until now I lit the microscope with a big 1000lm desk lamp from the side.
The problem with that was that I was not able to look between the HQ camera lens and the two-axis sliding table.
Today I tried to replace the lighting; I used a single 12V LED chip that I use for my raspcatbot as well:
https://www.ebay.de/itm/254442101119
viewtopic.php?f=37&t=267999&start=25#p1640482
I don't like the physically big 1.5Ω resistor I would have to use in series at 12V, so I power the LED with 9V, the first voltage at which all 9 LED dots light up. The constant power supply says it draws 140mA, but that value increases a bit after some time. I glued a small CPU heatsink onto the LED's backside; with that, the LED can run for a long time without overheating. Because the LED is so small, it can be placed near the microscope sliding table and still allows looking at details from the side. Since illuminance increases quadratically as distance shrinks, the LED might not have many lumen when powered with only 9V (800lm when powered with 12V), but the more important lux at the microscope sliding table is really good:
Below is a 1360x768 raspi2png screenshot showing the complete width of the lancet at its thick part, taken while using the new LED to light the scene. I measured from the left bottom border to one of the dark lines on top of the lancet vertically 590 pixels, and from that line vertically to the right top border 343 pixels. Therefore the lancet thickness is (590+343)×0.21 = 195.93µm:
I did the xyz focusing by hand -- it's time now to join the Arduino sketches for 5µm-per-step xy positioning with z positioning, using half steps for z, allowing 4.4µm steps. Then I could place the lancet onto the micrometer divisions, focus on the center of the lancet and note the z coordinate, then move to the side, focus on the micrometer divisions and note the z coordinate as well. The step delta in the z direction multiplied by 4.4µm is then the thickness of the lancet. This method allows measuring vertical distances with up to 4.4µm-per-step precision (later better, with micro-stepping).
P.S:
I powered the LED with 9V from a constant voltage supply. Reducing to lower voltages (with millivolt precision if needed) allows lighting a microscope scene with exactly the amount of light needed.
Re: Creating a RPI-assisted microscope
I mounted another lancet into a 5mm-to-3mm coupler that is mounted onto a 28BYJ-48 stepper motor, which is superglued to a yellow box.
This is the setup, lancet pointing in camera direction. Yesterday we saw that the lancet looks like this from the side:

I used the same Arduino sketch that controlled the microscope height with a stepper motor, allowing to input a number so that the stepper motor moves to that position. I started capturing screenshots at 220, then repeatedly moved by 10 and took a snapshot, until 450. That is a rotation of 230/2038×360° = 40.6° from first to last frame. Yes, because of the coupler (likely) the center of the lancet jumps, but not far. This time I displayed 2x2 pixels of the 12MP frame per display pixel, zoomed out 2× compared to previous screenshots in this thread; otherwise the whole lancet would not have fit into the preview window. The 1024×748 preview window corresponds to 0.430mm × 0.322mm in reality at 0.42µm/pixel. I measured the lancet diameter as 695 pixels, or 292µm. This is a bit thicker than the 195µm diameter measured before. Only a small part of the image is in focus, 30 pixels wide, 12.6µm at 0.42µm/px. That only this part is in focus indicates that a thin triangle sits on top of the round lancet. Here is the animation generated from the screenshots; on the left side you can see the stepper motor position values I entered. The 1360x768 frames were scaled to 50% for the animation:

This is the part of the 27th snapshot where I did the pixel measurements:
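The numbers from the snapshot series, checked (positions, pixel counts and the 2×-binned 0.42µm/px scale are all from the posting above):

```python
# 28BYJ-48 positions 220..450, 2038 steps per rotation
assert round((450 - 220) / 2038 * 360, 1) == 40.6   # degrees rotated

# pixel measurements at 0.42µm/px (2x2 binned preview)
assert round(695 * 0.42) == 292      # lancet diameter, µm
assert round(30 * 0.42, 1) == 12.6   # in-focus band, µm
```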
Re: Creating a RPI-assisted microscope
It was really difficult to get the lancet pin point into focus while the stepper motor axis is inclined.
It turned out that the pin point of the lancet got bent after some collision with something.
I used the program that controls x/y of the two-axis sliding table to drive the spit stepper as well as the microscope height stepper.
This is the setup: I created single screenshots as in the previous posting; this is a 1fps animation created from the frames.
I turned the spit stepper in deltas of 50 steps, and used the 2nd stepper to keep the lancet focused.
This animation gives a good (3D?) idea of how the lancet front looks.
This time the preview was done at full 12MP (raspistill --focus -t 0):

P.S:
With daylight only, the microscope image gets very dark.
But with a 23 second long exposure, daylight is sufficient.
With a 239 second long exposure, the few LEDs of the Arduino Uno and two ULN2003 stepper drivers are sufficient in an otherwise(!) completely dark room:
viewtopic.php?f=43&t=273358&p=1686980#p1686980
P.P.S:
Since the microscope two-axis sliding-table works so well (5µm/half-step precision), I ordered ten more at $1.49/piece:
https://twitter.com/HermannSW/status/12 ... 7770159105
Re: Creating a RPI-assisted microscope
I did some more experiments with the rotating, bent lancet pin point.
I took a 3:38min video which shows exactly which inputs I entered to move to the different positions.
I captured the video from the side with a smartphone camera on a tripod; filmed from in front of the screen it looked sharper.
For all positions 0, 100, 200, ..., 1900, 2000 I determined the best z coordinate where the pin point was in focus (not centered with the Figure of Merit).
The extreme points in z direction were start (2000,210) and end (0,-30).
The 2000 half steps for turning the lancet correspond to a 180° rotation; I positioned at 9° increments, with 0.09° per single step.
z positioning was done with half-stepping as well this time, so 4.4µm per step.
So (210-(-30))*4.4µm = 1056µm = 1.056mm is the distance covered in z direction.
While using the Arduino sketch to move the steppers, 12MP frames were captured, 4056x3040 mapped to 25% on the display:
Code: Select all
🍓 userland/build/bin/raspistill -t 0 --focus -p 342,4,1014,760
So the whole camera preview frame area is 851.76µm × 638.4µm (0.21µm/pixel).
The YouTube video (with audio of the stepper motors as well as the keyboard) is here:
https://www.youtube.com/watch?v=p2MywtRl8Lk
I wanted to add a scale and took the same scaling image of the micrometer:
I then cut everything from the frame besides two small rectangles, scaled the 1360x768 screenshot to 1920x1080 and stored it as a transparent .png. Finally I used gimp to determine the 4 corner coordinates of the 12MP frame in the 1920x1080 video, and used gimp's Perspective tool to turn the scales into an overlay for the video. With this command the scales are overlayed at the correct positions onto the video:
Code: Select all
🍓 time ffmpeg -i 20200629_214343.mp4 -i overlay.png -filter_complex "[0:v][1:v]overlay" mic180.mp4
ffmpeg version 4.1.4-1+rpt7~deb10u1 Copyright (c) 2000-2019 the FFmpeg developers
...
[aac @ 0x186d500] Qavg: 186.462
real 11m14.833s
user 44m4.322s
sys 0m8.367s
🍓
Next I will use a fresh lancet with a <10µm pin point and hope to get better (more localized) rotation (at least in z direction, which would allow keeping a single focus).
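The rotation and focus-travel arithmetic of this post can be sanity-checked with a short Python sketch (all constants are the ones stated in the text):

```python
# Sanity-check the rotation and focus-travel arithmetic of this post.
deg_per_step = 180 / 2000            # 2000 half-steps span 180 degrees
print(deg_per_step)                  # 0.09 deg per single step
print(100 * deg_per_step)            # ~9 deg between measured positions

z_um = 4.4                           # z half-step size in micrometers
z_travel = (210 - (-30)) * z_um      # z range covered while rotating
print(z_travel / 1000, "mm")         # ~1.056 mm

um_per_px = 0.21                     # 12MP frame resolution
print(4056 * um_per_px, "x", 3040 * um_per_px, "um")  # ~851.76 x 638.4 um
```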
Re: Creating a RPI-assisted microscope
I have a double-digit number of 28BYJ-48 stepper motors, but their motor shafts have too much play for what I am after.
I do have one NEMA17 stepper, and its motor shaft seems to have much less play -- already prepared with a lancet for the evening. I took another screenshot using raspistill with the 25% roi view; parts of it can be used as scale overlays for the 0.21µm/pixel microscope:
Code: Select all
🍓 userland/build/bin/raspistill -t 0 --focus -p 342,4,1014,760 --roi 0.375,0.375,0.25,0.25
The 25% roi 1360×768 screenshot is scaled to 50% for forum attachment:
Re: Creating a RPI-assisted microscope
In the morning I was able to control the microscope z direction stepper motor with a ULN2003 driver from the Pi:
viewtopic.php?f=43&t=278268&p=1692610#p1692610
I added two L293D motor drivers and now can control movements in x/y/z direction with a simple Python script and keys. This script is a simple extension of the pigpio library stepper example:
https://github.com/stripcode/pigpio-ste ... right-keys
Code: Select all
$ cat keys.py
import pigpio
import curses
from PigpioStepperMotor import StepperMotor

pi = pigpio.pi()
stdscr = curses.initscr()
stdscr.keypad(True)
curses.cbreak()
curses.noecho()
try:
    # one StepperMotor per axis, each on four GPIO pins (driver inputs)
    z = StepperMotor(pi, 21, 20, 16, 12)
    y = StepperMotor(pi, 26, 19, 13, 6)
    x = StepperMotor(pi, 5, 11, 10, 9)
    while True:
        key = stdscr.getkey()
        if key == "KEY_LEFT":
            x.doClockwiseStep()
        elif key == "KEY_RIGHT":
            x.doCounterclockwiseStep()
        elif key == "KEY_UP":
            y.doCounterclockwiseStep()
        elif key == "KEY_DOWN":
            y.doClockwiseStep()
        elif key == "KEY_NPAGE":
            z.doCounterclockwiseStep()
        elif key == "KEY_PPAGE":
            z.doClockwiseStep()
        elif key == "KEY_END":
            break
except Exception as e:
    print(e)
finally:
    curses.endwin()
    pi.stop()
$
The cables are a jumble, but the rest is clean. Half-steps in x and y direction are 5µm wide, in z direction 4.4µm. If you zoom into the 16MP smartphone photo, you can see the cut lancet lying on top of the micrometer on the two-axis microscope sliding table. The Arduino Uno you can see only provides the 5V motor voltage, and can deliver 450mA. The 28BYJ-48 stepper motor draws 200mA when both coils are powered, and each x/y stepper motor draws 81mA, so 362mA in total:

I captured a video of the first x/y/z journey with "peek" and uploaded it to YouTube. The video makes it totally clear that z axis movement was absolutely necessary, since the depth of focus is so small. "Climbing" onto the lancet from the micrometer (in focus at the start) required many steps in z direction:
https://www.youtube.com/watch?v=s-SjTs8gBPA
While the resolution with 12MP photos is 0.21µm/pixel, the qcam application recorded 2028x1520 with 0.42µm/pixel. qcam scaled that to the display frame, with "only" 1.05µm/pixel resolution. As said, capturing with raspistill allows for 0.21µm/pixel, 5× magnified compared to what you see here.
P.S:
"new pigpio-stepper-motor fork" posting, new keys_xyz.py example allows to start microscope at exactly the X/Y/Z position it stopped before, and not accidentially move up to 35µm wrong on first step in any of the three dimensions:
viewtopic.php?f=32&t=279671
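The resume-at-last-position idea described in the P.S. can be sketched as persisting the step counters between runs. This is only a minimal illustration of the concept; the file name and structure are my assumptions, not the actual fork's code:

```python
import json
import os

STATE = "xyz_state.json"  # hypothetical state file; the real fork may differ

def load_position():
    """Restore the last x/y/z step counters, or start at the origin."""
    if os.path.exists(STATE):
        with open(STATE) as f:
            return json.load(f)
    return {"x": 0, "y": 0, "z": 0}

def save_position(pos):
    """Persist counters so the next run resumes in the same coil phase."""
    with open(STATE, "w") as f:
        json.dump(pos, f)

pos = load_position()
pos["z"] += 3            # e.g. three half-steps up
save_position(pos)       # next run's load_position() returns this position
```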
Re: Creating a RPI-assisted microscope
Microstepping is not that easy to get right.
I created a Python script that lets me enter an angle in degrees; the 4 motor coils then get PWM powered with (cos(a), sin(a), cos(a+180°), sin(a+180°)) for evaluation. Centers of the micrometer divisions were 10µm apart, or 48 pixels at the 0.21µm/pixel resolution of the 12MP frame with the 180° lens mounted reversed. For evaluation I wanted a view magnified 5×, so a single (center) pixel of the 12MP frame maps to a 5x5 square on the display. That way vertical micrometer division centers are 240 pixels apart. This posting contains the details, including the small Python script:
viewtopic.php?f=32&t=279671&p=1695195#p1695195
This is a 1360x768 animation of raspi2png screenshots I created after moving to a new angle in the bottom right terminal. Watching the big white spot at the bottom makes it easy to follow the movements (right click for 100% view):

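The coil drive described above can be sketched as follows. Clamping negative values to zero (i.e. driving the opposite half of each coil pair instead) is my assumption about the script's behavior; the function name is mine:

```python
from math import cos, sin, radians, pi

def coil_duties(a_deg):
    """PWM duty cycles for the 4 coil outputs at electrical angle a:
    (cos a, sin a, cos(a+180°), sin(a+180°)), negatives clamped to 0
    (assumption: the opposite output drives that half of the coil)."""
    a = radians(a_deg)
    raw = (cos(a), sin(a), cos(a + pi), sin(a + pi))
    return tuple(max(0.0, v) for v in raw)

print(coil_duties(0))    # first output fully on, others (near) zero
print(coil_duties(45))   # two adjacent outputs at ~0.707: half-step position
```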
Re: Creating a RPI-assisted microscope
I successfully used a DRV8834 stepper motor driver with 1/8 stepping to move one stepper motor of another xy sliding table with an Arduino Uno. Since half stepping gives 5µm/step, 1/8 stepping gives 1.25µm/step -- and the DRV8834 also offers 1/16 as well as 1/32 stepping!
https://forum.arduino.cc/index.php?topi ... msg4680496
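The resulting step sizes all follow from the 5µm half-step (i.e. 10µm full step) stated above; a quick sketch:

```python
# Step sizes from the 5 um half-step (=> 10 um full step) quoted above.
FULL_STEP_UM = 10.0

for denom in (2, 8, 16, 32):           # half, 1/8, 1/16, 1/32 stepping
    print(f"1/{denom} stepping: {FULL_STEP_UM / denom} um/step")
# 1/2: 5.0, 1/8: 1.25, 1/16: 0.625, 1/32: 0.3125 um/step
```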
Last week I tried to use the DRV8834 for the x stepper of the xyz Raspberry microscope. The change from Arduino Uno to Raspberry, and more likely from 5V to 3.3V (although the DRV8834 can handle 5V as well as 3.3V), was too much, and moving in x direction did not work.
Today I just removed the DRV8834 and got a working xyz Pi microscope again.
So I wanted to look at something small with the microscope.
I have shown images of a very pointed lancet already (<10µm wide at the tip).
Today I cut a pen needle I use for insulin injection and placed it onto the micrometer under the microscope:
The needle is slanted like the lancet before, but of course has a hole in the middle through which the insulin flows from the pen under the skin. The middle hole in the screenshot (scaled to 50%) is roughly 800µm long and 100µm wide (centers of micrometer divisions are 10µm apart). So the needle is more than 100µm high. I captured 1360x768 screenshots for all z coordinates in the range -474..-434. Then I scaled the screenshots 3× and created a 4fps animated .gif from them. It is nice to see the focus move up from the micrometer plane to the maximum, then down again to the micrometer plane, repeatedly. Since each z direction half step is 4.4µm, the vertical movement for the 40 steps is 176µm in total:

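The focus sweep behind that animation (one screenshot per z half-step of 4.4µm through z = -474..-434) can be sketched as a list of focus positions:

```python
# The focus sweep behind the animation: one screenshot per z half-step.
Z_UM_PER_STEP = 4.4                    # z half-step size (from the text)

zs = list(range(-474, -433))           # z = -474 .. -434 inclusive
travel_um = (zs[-1] - zs[0]) * Z_UM_PER_STEP
print(len(zs), "positions,", round(travel_um), "um total travel")
# 41 positions, 176 um total travel over the 40 steps
```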
Re: Creating a RPI-assisted microscope
The lower the lens FoV, the less distortion:
viewtopic.php?f=43&t=249483&p=1710529&h ... 2#p1674386
Today I mounted the 20° M12 lens to finally capture an airgun pellet in flight with the HQ camera, using multiple-exposure single-digit-nanosecond high-speed flash pulses. Again part of the border is not sharp, but the measured 19.5mm diameter means 19.5mm/4056 = 4.8µm/pixel resolution with that lens screwed out as far as I did. The M12 lens is screwed out quite far (nearly 4cm between the outer lens and the HQ sensor). This allows looking at an object 8cm away, with a depth of view of more than 2mm -- today I mounted the M12 normally, not reversed as before:
viewtopic.php?f=43&t=284108&p=1738021#p1738021
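The stated resolution follows directly from the measurement, assuming (as the post's 19.5/4056 arithmetic implies) that the measured 19.5mm diameter spans the full 4056-pixel frame width:

```python
# Resolution of the 20-degree M12 lens setup, from the 19.5 mm span
# measured across the 4056-pixel frame width (assumption from the text).
um_per_px = 19.5 * 1000 / 4056
print(f"{um_per_px:.1f} um/pixel")     # ~4.8 um/pixel
```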
Re: Creating a RPI-assisted microscope
I came across this last night, but I don't know how good this lens is.
https://www.seeedstudio.com/Microscope- ... -4627.html
Re: Creating a RPI-assisted microscope
After a year I will revisit this project and enhance it further -- thanks for unlocking this thread.
First some statements on the results achieved so far, here for "distance resolution" as distance per pixel, not what length can be cleanly captured per pixel of the HQ camera.
Mounting 180° M12 lens reversed as macro lens ...
viewtopic.php?f=43&t=210605&start=75#p1674471

... allows for 150µm/710 pixels = 0.21µm/pixel (measured with a micrometer slide; the distance from one division to the next on the right is 10µm), nearly the resolution that the best optical microscopes can provide (0.2µm). The 4056 horizontal pixels of a 12MP photo cover 0.85mm in total at that distance resolution:
viewtopic.php?f=43&t=210605&start=75#p1677057

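The resolution claim can be verified in two lines (15 micrometer divisions at 10µm spacing measured as 710 pixels, per the text):

```python
# 15 micrometer divisions (10 um apart) measured as 710 pixels wide.
res = 150 / 710                        # um per pixel
print(f"{res:.2f} um/pixel")           # ~0.21 um/pixel
print(f"{4056 * 0.21 / 1000:.2f} mm")  # 4056 pixels cover ~0.85 mm
```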
Re: Creating a RPI-assisted microscope
Supergluing a cheap 28BYJ-48 stepper motor to the microscope stand ...
viewtopic.php?f=43&t=210605&p=1892784#p1679128

... allows for 4.4µm/step vertically (z direction) with half stepping, enabling impressive focusing, which is needed because of the small vertical sharp field of view:
viewtopic.php?f=43&t=210605&start=100#p1704040

Using cheap $1.50 microscope sliding tables from aliexpress.com ...
viewtopic.php?f=43&t=210605&start=75#p1678566

... allows for 5µm/step in x- and y-direction with half stepping:
viewtopic.php?f=43&t=210605&start=100#p1685377

In x/y/z direction the DRV8834 motor driver (allowing for 1/32 stepping) will allow 0.3125µm/step for x/y and 0.275µm/step for z.
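The summary above follows from the per-axis half-step sizes (full step = 2 half-steps); a small sketch:

```python
# Per-axis step sizes at 1/32 microstepping, from the half-step
# sizes quoted above (full step = 2 half-steps).
half_step_um = {"x": 5.0, "y": 5.0, "z": 4.4}

for axis, hs in half_step_um.items():
    print(f"{axis}: {2 * hs / 32:.4f} um/step at 1/32 stepping")
# x/y: 0.3125 um/step, z: 0.2750 um/step
```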