HermannSW
Posts: 6093
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Feb 17, 2018 1:02 am

I wanted to see how far the framerate can be pushed with full VGA capturing on the v2 camera while still getting only a few frame skips.

In this earlier posting in this thread
viewtopic.php?f=43&t=109137&start=375#p1263896

I showed that full VGA capturing with raspiraw can be done without frame skips at 240fps, and sometimes even at 300fps.

Today I modified "tools/640x480" slightly for better statistics; the new file is attached here:

Code: Select all

$ diff ../tools/640x480 640x480 
14c14,17
< echo "$l frames were captured at $((1000000 / $us))fps" 
---
> start=`head -1 tstamps.csv | cut -f3 -d,`
> end=`tail -1 tstamps.csv | cut -f3 -d,`
> echo "$l frames were captured in $(($end-$start))us" 
> echo "majority at $((1000000 / $us))fps, average at $(((1000000*$l)/($end-$start)))fps" 
17c20
< cut -f1 -d, tstamps.csv | sort -n | uniq -c
---
> cut -f1 -d, tstamps.csv | sort -n | uniq -c | sed "s/^/</"
27c30,31
< echo "$per% frame skips"
---
> echo "$skips frame skips ($per%)"
> grep "^[$D]" tstamps.csv | cut -f1 -d, | sort -n | uniq -c | sed "s/^/|/"
$

I did another setup with a classic mouse trap, but this time the Lego piece does not trigger the mouse trap; instead it hangs so that it gets hit by the closing trap. This is the command and its output, excluding the delay distribution analysis and the list of frame skips (filtered out by the last grep):

Code: Select all

pi@raspberrypi02:~/raspiraw/t $ ./640x480 2800 420 | grep -v "^[<>]"
removing /dev/shm/out.*.raw
capturing frames for 2800ms with 420fps requested
1155 frames were captured in 2792271us
majority at 424fps, average at 413fps
frame delta time[us] distribution
after skip frame indices (middle column)
31 frame skips (2%)
|      5 4712
|     26 4713
pi@raspberrypi02:~/raspiraw/t $ 

The majority of frames is captured with a frame delta corresponding to 424fps.
Dividing the total time taken (last timestamp minus first timestamp) by the number of frames captured gives a 413fps average framerate.
The capture was restricted to 2.8s because that is just what fits into the /dev/shm ramdisk:

Code: Select all

pi@raspberrypi02:~/raspiraw/t $ df /dev/shm
Filesystem     1K-blocks   Used Available Use% Mounted on
tmpfs             448400 434280     14120  97% /dev/shm
pi@raspberrypi02:~/raspiraw/t $ 
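
As a rough cross-check, that /dev/shm usage matches the raw frame size (a tiny helper, assuming plain 10-bit-per-pixel frames and ignoring any per-frame header or padding raspiraw may add):

Code: Select all

#include <stdio.h>

int main(void)
{
    /* 640x480 RAW10 is 10 bits per pixel -> ~375 KiB per frame */
    const double frame_kib = 640.0 * 480.0 * 10.0 / 8.0 / 1024.0;
    const int frames = 1155;
    printf("%.0f KiB/frame, ~%.0f MiB for %d frames\n",
           frame_kib, frame_kib * frames / 1024.0, frames);
    return 0;
}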

This is the frame just before the closing mouse trap hits the Lego piece
(you can see the closing part of the mouse trap blurred to the right of the Lego piece):
Image


And this is the first frame after the hit:
Image


Yes, only the top 640x215 area of the frame is correctly updated; in the lower part two frames show up alternately. But as the statistics show, the average framerate is 413fps, which is not bad for a 640x215 video.

I created an .ogg video from all captured frames for playback at 25fps (for the YouTube upload):

Code: Select all

pi@raspberrypi02:~/raspiraw/t $ time ( raw2ogg2anim t 1 1126 25 )
removing old auxiliary files
...
done

real	51m36.653s
user	42m54.330s
sys	0m45.740s

The post processing for .ogg and .anim creation took 52min on a Pi 3B!
This is the .ogg video uploaded to YouTube (2.8s correspond to 45s at 25fps, i.e. playback is slowed down by a factor of 413/25=16.52):
https://www.youtube.com/watch?v=QbgOGrz ... e=youtu.be

The Lego piece is static from 0-10s, then moves from 10-12s, swings back into view from 22-28s, and again from 32-40s and 44-45s. The pendulum motion is the reason the Lego piece appears again and again.

Summary:
A requested framerate of 420fps overloads the v2 camera's capabilities at 640x480.
But surprisingly it gives a clean 640x215 video in the top part, at a >400fps average framerate.


P.S.:
Browser view of the scene shortly before the final setup of the raspiraw capture:
Image
Attachments
640x480.zip
contains 640x480 tool
(765 Bytes) Downloaded 230 times
https://github.com/Hermann-SW/RSA_numbers_factored
https://stamm-wilbrandt.de/GS_cam_1152x192@304fps
https://hermann-sw.github.io/planar_graph_playground
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/

HermannSW
Posts: 6093
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Feb 17, 2018 2:05 am

I revisited capturing 640x480 with 240fps requested, with raspiraw and the v2 camera.
Video capture is fine (at 241fps, no frame skips), measured with the new tool described in the previous posting:

Code: Select all

pi@raspberrypi02:~/raspiraw/t $ ./640x480 2800 240 | grep -v "^[<>]"
removing /dev/shm/out.*.raw
capturing frames for 2800ms with 240fps requested
674 frames were captured in 2791570us
majority at 241fps, average at 241fps
frame delta time[us] distribution
after skip frame indices (middle column)
0 frame skips (0%)
pi@raspberrypi02:~/raspiraw/t $

But only the top 640x400 part of the frame gets updated.
Anyway, 640x400 at 241fps is fine for me.

Frame 250:
Image

Frame 260:
Image



On the other hand you can get the full 640x480 at 180fps with a modified raspivid:
viewtopic.php?f=43&t=204775&p=1270498#p1270498
Image
https://github.com/Hermann-SW/RSA_numbers_factored
https://stamm-wilbrandt.de/GS_cam_1152x192@304fps
https://hermann-sw.github.io/planar_graph_playground
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Feb 17, 2018 7:11 pm

Update, my impedance is really off so I am going to just redo my board.

Question, I am thinking of switching to CSI 1 instead of 0, so I can remove all vias from the path. Anything I should be aware of using CSI 1?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Feb 17, 2018 8:08 pm

grimepoch wrote:
Sat Feb 17, 2018 7:11 pm
Update, my impedance is really off so I am going to just redo my board.

Question, I am thinking of switching to CSI 1 instead of 0, so I can remove all vias from the path. Anything I should be aware of using CSI 1?
I hadn't realised you were using CSI0 instead of 1. All standard Pi variants use CSI1 although there's no particular reason for that.
There's nothing really different other than the fact that it has the potential for 4 lanes, instead of only 2 lanes available on CSI0.
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

xiaoyongtijee
Posts: 5
Joined: Mon Feb 19, 2018 10:12 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 12:15 am

Thanks to 6by9 and all the other users who created this fantastic support for accessing the CSI data. The documentation is also very clear and easy to follow. I was able to set up an RPi 2 and got all the raw data without any issue.
We are trying to use this interface to connect our own CSI device, which is an FPGA based multi-camera interface. I saw some user (deeluk?) has done something similar, but I am not very clear on whether our current design can be readily handled by rawcam. Basically our device outputs multiple streams on the CSI interface using virtual channels. I tried and confirmed that we can create multiple vc.ril.rawcam components. Can we use them with unique image IDs to receive data for each virtual channel? (Our CSI TX is not fully ready, so I cannot experiment with it yet.)

thanks
xiaoyong

xiaoyongtijee
Posts: 5
Joined: Mon Feb 19, 2018 10:12 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 2:03 am

Thanks for this great capability introduced by 6by9. I can easily follow the documents and am able to capture raw data from the Pi camera.
We are developing an FPGA based multi-camera gadget and are trying to use this interface. I see deeluk has a similar device; however, in our case we connect to the Pi as a single I2C device and want to output multiple streams using the CSI virtual channels. I noticed we can create multiple components of vc.ril.rawcam; can we use the combined virtual channel number and data type as the image id property and get a callback for each stream?
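
By "combined" I mean the 8-bit CSI-2 data identifier, i.e. the virtual channel in the top two bits and the data type in the lower six; whether rawcam's image id property accepts the VC bits this way is exactly what I am asking. A sketch of that packing (csi2_data_identifier is just an illustrative helper):

Code: Select all

#include <stdint.h>

/* CSI-2 data identifier (DI): virtual channel VC[1:0] in bits 7:6,
 * data type DT[5:0] in bits 5:0. */
static inline uint8_t csi2_data_identifier(uint8_t vc, uint8_t dt)
{
    return (uint8_t)(((vc & 0x3u) << 6) | (dt & 0x3Fu));
}

/* Example: virtual channel 1 carrying RAW10 (DT 0x2B) gives DI 0x6B. */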

thanks
Xiaoyong

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 1:12 pm

xiaoyongtijee wrote:
Tue Feb 20, 2018 2:03 am
Thanks for this great capability introduced by 6by9. I can easily follow the documents and am able to capture raw data from the Pi camera.
We are developing an FPGA based multi-camera gadget and are trying to use this interface. I see deeluk has a similar device; however, in our case we connect to the Pi as a single I2C device and want to output multiple streams using the CSI virtual channels. I noticed we can create multiple components of vc.ril.rawcam; can we use the combined virtual channel number and data type as the image id property and get a callback for each stream?
Creating multiple instances of vc.ril.rawcam for the same physical peripheral will not help. If you try to enable the second one then it should fail as the hardware is already in use.

The hardware only supports splitting the stream into two: anything that matches the specified CSI data identifier is considered the image stream, and anything else is considered a data stream, as that is normally used for embedded metadata from sensors.

rawcam splits these into buffers without MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO / OMX_BUFFERFLAG_CODECSIDEINFO set for the data that matches the data identifier, and buffers with that flag set for packets that don't match. (AIUI the hardware will actually take 4 data ID values for matching image data, but currently only one is ever set up. Let me know if that is a restriction for you.)

With the way it is currently set up, there is an issue that we only look at the frame start/end packets on the image stream. If you send two non-matching/"data" frames between two matching/"image" frames, then you will lose one of the "data" frames. The length of the data buffer is also misrepresented, so you will need some other technique to determine how much valid data is present in those buffers.
There is an option to get independent interrupts for the image and data frame starts and ends, but it'd require a chunk of rework that I wouldn't be planning to do to rawcam. There may be the potential to rework the V4L2 driver for the hardware to pick up the two streams, but I need to get the basic thing merged first before adding more complications. You'd also get two independent nodes under /dev/video, so I don't know how well that will work.
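
Roughly what that looks like from the client side (a minimal sketch, untested here; handle_image/handle_side_info are just placeholders for your own code):

Code: Select all

#include <stdint.h>
#include <stdio.h>
#include "interface/mmal/mmal.h"

/* placeholders - replace with your own handling */
static void handle_image(uint8_t *data, uint32_t length)
{
   (void)data; printf("image frame, %u bytes\n", (unsigned)length);
}
static void handle_side_info(uint8_t *data, uint32_t length)
{
   (void)data; printf("side info, %u bytes (length may be inaccurate)\n", (unsigned)length);
}

/* rawcam output port callback: buffers that matched the configured data
 * identifier (image_id) arrive without the CODECSIDEINFO flag; everything
 * else (e.g. embedded metadata) arrives with it set. */
static void rawcam_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
   (void)port;
   if (buffer->flags & MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO)
      handle_side_info(buffer->data, buffer->length);
   else
      handle_image(buffer->data, buffer->length);

   /* buffer recycling back to the port is omitted for brevity */
   mmal_buffer_header_release(buffer);
}
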
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 4:06 pm

One thing I am going to attempt experimenting with is creating one more line at the end of the image for metadata about that frame. In my case, I want to know which field it is (even/odd) for an interlaced image.

I imagine you could do something similar for distinguishing between two frames.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 4:18 pm

grimepoch wrote:
Tue Feb 20, 2018 4:06 pm
One thing I am going to attempt experimenting with is creating one more line at the end of the image for meta data about that frame. In my case, I want to know which field it is (even/odd) for an interlaced image.

I imagine you could do something similar for distinguishing between two frames.
Yes, if you're designing your own FPGA CSI-2 source, then inserting metadata as an extra line or lines at the end of the image is fine.
In theory we could handle things like the interlaced mode from the Toshiba TC358743, where it sticks the interlaced fields on different data IDs, but routing the data correctly would be a nightmare.
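
On the receiving side that could look something like this (just a sketch; read_trailing_metadata and the field_id layout are entirely up to whatever your FPGA puts in that line):

Code: Select all

#include <stdint.h>
#include <stddef.h>

/* Sketch: the FPGA appends one extra line after the active image, and the
 * receiver reads the field/frame info back out of that line. */
typedef struct {
    uint8_t field_id;   /* e.g. 0 = even field, 1 = odd field */
} frame_meta_t;

static frame_meta_t read_trailing_metadata(const uint8_t *frame,
                                            size_t stride_bytes,
                                            unsigned active_lines)
{
    /* the metadata line sits immediately after the last active line */
    const uint8_t *meta = frame + (size_t)active_lines * stride_bytes;
    frame_meta_t m = { .field_id = meta[0] };
    return m;
}
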
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

xiaoyongtijee
Posts: 5
Joined: Mon Feb 19, 2018 10:12 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Feb 20, 2018 5:28 pm

6by9 wrote:
Tue Feb 20, 2018 1:12 pm
xiaoyongtijee wrote:
Tue Feb 20, 2018 2:03 am
Thanks for this great capability introduced by 6by9. I can easily follow the documents and am able to capture raw data from the Pi camera.
We are developing an FPGA based multi-camera gadget and are trying to use this interface. I see deeluk has a similar device; however, in our case we connect to the Pi as a single I2C device and want to output multiple streams using the CSI virtual channels. I noticed we can create multiple components of vc.ril.rawcam; can we use the combined virtual channel number and data type as the image id property and get a callback for each stream?
Creating multiple instances of vc.ril.rawcam for the same physical peripheral will not help. If you try to enable the second one then it should fail as the hardware is already in use.

The hardware only supports splitting the stream into two: anything that matches the specified CSI data identifier is considered the image stream, and anything else is considered a data stream, as that is normally used for embedded metadata from sensors.

rawcam splits these into buffers without MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO / OMX_BUFFERFLAG_CODECSIDEINFO set for the data that matches the data identifier, and buffers with that flag set for packets that don't match. (AIUI the hardware will actually take 4 data ID values for matching image data, but currently only one is ever set up. Let me know if that is a restriction for you.)

With the way it is currently set up, there is an issue that we only look at the frame start/end packets on the image stream. If you send two non-matching/"data" frames between two matching/"image" frames, then you will lose one of the "data" frames. The length of the data buffer is also misrepresented, so you will need some other technique to determine how much valid data is present in those buffers.
There is an option to get independent interrupts for the image and data frame starts and ends, but it'd require a chunk of rework that I wouldn't be planning to do to rawcam. There may be the potential to rework the V4L2 driver for the hardware to pick up the two streams, but I need to get the basic thing merged first before adding more complications. You'd also get two independent nodes under /dev/video, so I don't know how well that will work.
Thanks for the clarification. We will embed data in the image stream to identify the frames then. Since we generate the CSI TX data in the FPGA, we can adjust the data ID as needed. Yes, getting this to work with the V4L2 driver will be hacky. We will just output streams consumable by gstreamer and let the user access them that way first.

thanks
Xiaoyong

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Feb 21, 2018 8:30 pm

xiaoyongtijee wrote:
Tue Feb 20, 2018 5:28 pm

Thanks for the clarification. We will embed data in the image stream to identify the frames then. Since we generate the CSI TX data in the FPGA, we can adjust the data ID as needed. Yes, getting this to work with the V4L2 driver will be hacky. We will just output streams consumable by gstreamer and let the user access them that way first.

thanks
Xiaoyong
Just curious, what FPGA did you decide to go with? Currently I am working with a MachX02 myself.

xiaoyongtijee
Posts: 5
Joined: Mon Feb 19, 2018 10:12 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Feb 23, 2018 1:29 am

grimepoch wrote:
Wed Feb 21, 2018 8:30 pm
xiaoyongtijee wrote:
Tue Feb 20, 2018 5:28 pm

Thanks for the clarification. We will embed data in the image stream to identify the frames then. Since we generate the CSI TX data in the FPGA, we can adjust the data ID as needed. Yes, getting this to work with the V4L2 driver will be hacky. We will just output streams consumable by gstreamer and let the user access them that way first.

thanks
Xiaoyong
Just curious, what FPGA did you decide to go with? Currently I am working with a MachX02 myself.
We are using the Xilinx 7 series.

xiaoyongtijee
Posts: 5
Joined: Mon Feb 19, 2018 10:12 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Feb 23, 2018 1:53 am

We connected our CSI source and we get some data. However, there are some questions.
First, about the image data: since our imaging part is not totally ready, we are using 40-bit counter data for now. I do see that the saved data matches the 40-bit (5-byte) pattern and increments one by one. However, the bits are not matching up: I can see some incrementing bits in the middle and the lowest alternating bit separated off somewhere else. I think this might be caused by the pack/unpack settings, however I've set them both to 0. Does it always consider the data as RAW10 and do some pack/unpack? For the CODECSIDEINFO data we don't have this problem; I can see the 32-bit counter data without this unknown shifting/scrambling. (Note: after more experiments I think this might be related to the second one, as I changed image_id to output everything to CODECSIDEINFO and I see 5 bytes there, but still with shifted bits.)
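
(For reference, this is the CSI-2 RAW10 layout that would explain the shuffling I see if the receiver treats my counter bytes as RAW10; unpack_raw10 is just a sketch to illustrate the layout:)

Code: Select all

#include <stdint.h>

/* CSI-2 RAW10: pixels P0..P3 are sent as four bytes holding P[9:2],
 * followed by one byte holding the four 2-bit remainders
 * (P3[1:0] P2[1:0] P1[1:0] P0[1:0]).  A plain 40-bit counter therefore
 * looks bit-shuffled when it is unpacked as RAW10. */
static void unpack_raw10(const uint8_t in[5], uint16_t out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = (uint16_t)((in[i] << 2) | ((in[4] >> (2 * i)) & 0x3));
}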

Secondly, I can see that some bits are occasionally misread. I think we probably need to adjust those timing parameters; is there some documentation on what they are for and how to adjust them in terms of the MIPI spec?

Third is about the buffer length returned. We always get buffers back with the same length as allocated based on the sensor mode, even though our frames have different sizes and carry the corresponding SOF and EOF in MIPI. Does the rawcam/CSI layer only return a buffer once it is completely filled?

I also noticed an issue when experimenting with image_id: when I changed image_id so that it does not match my data type, I still got callbacks with flag 0x84, which indicates non-sideinfo, although the content is all zeros if I zero out all buffers before returning them. Is the flag also set under some other condition?

thanks
Xiaoyong

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 19, 2018 5:49 pm

Second board built up, same problem, so I don't think it was the impedance.

One question: my clock is not continuous like the adv7282's; it turns on and off with the data. Does that matter when trying to use the code and forcing the terminators on?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 19, 2018 5:55 pm

grimepoch wrote:
Mon Mar 19, 2018 5:49 pm
Second board built up, same problem, so I don't think it was the impedance.

One question: my clock is not continuous like the adv7282's; it turns on and off with the data. Does that matter when trying to use the code and forcing the terminators on?
Yes. If you are switching between high and low speed modes then you must leave termination set to auto (0).
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 19, 2018 6:32 pm

Ahhhh, okay. Well, that's good to know. I am inquiring right now about running the MachX02 in clock-always-on mode. Running that way will keep me from having to figure out all those timings for the automatic switching.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Mar 22, 2018 6:44 pm

I got in contact with another person who has used a different FPGA with the CSI port on the Pi. They were not able to get it to work either. While I know you don't feel the clock speed of the incoming data should matter, I think it does. I've not found any evidence of anyone running a slow CSI-2 stream in.

They abandoned trying to get the FPGA to work and used a dedicated parallel->CSI chip running at >400MHz minimum. They also realized that even with the new chip, certain clock timings did not work, even though it was all valid CSI setup data.

I'm in the process of evaluating this option. The fastest I have run the MachX02 is 168MHz for the CSI clock. They were running at a minimum of ~400MHz as the slowest clock they made work. I think the MachX02 is limited by the maximum frequency the PLL can generate, and because I need divisions of that for the byte clock, that is why I cannot get close to 400MHz (although I am still trying different ideas).

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Mar 22, 2018 9:15 pm

grimepoch wrote:
Thu Mar 22, 2018 6:44 pm
I got in contact with another person who has used a different FPGA with the CSI port on the Pi. They were not able to get it to work either. While I know you don't feel the clock speed of the incoming data should matter, I think it does. I've not found any evidence of anyone running a slow CSI-2 stream in.

They abandoned trying to get the FPGA to work and used a dedicated parallel->CSI chip running at >400MHz minimum. They also realized that even with the new chip, certain clock timings did not work, even though it was all valid CSI setup data.

I'm in the process of evaluating this option. The fastest I have run the MachX02 is 168MHz for the CSI clock. They were running at a minimum of ~400MHz as the slowest clock they made work. I think the MachX02 is limited by the maximum frequency the PLL can generate, and because I need divisions of that for the byte clock, that is why I cannot get close to 400MHz (although I am still trying different ideas).
400MHz, or 400Mbit/s? My understanding is that CSI is passing data on both clock edges, so a link frequency of 200MHz will carry 400Mbit/s of data per lane.
I've had it working with the ADV7282M. The datasheet lists it as 432Mbit/s for progressive mode, or 216MHz. I thought I had it syncing in interlaced mode too, although I have no way of signalling the interlacing correctly, so the resulting image isn't terribly useful. I need to resurrect the V4L2 driver work again, so I'll see if I can confirm that.
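
i.e., back-of-envelope, using the numbers above (this assumes the ADV7282M's single data lane):

Code: Select all

#include <stdio.h>

int main(void)
{
    /* D-PHY is DDR: bits are clocked on both edges of the link clock,
     * so per-lane bit rate = 2 * link clock frequency. */
    const double link_mhz = 216.0;                /* ADV7282M progressive mode */
    const int lanes = 1;
    const double mbit_per_lane = 2.0 * link_mhz;  /* 432 Mbit/s */
    printf("%.0f Mbit/s per lane, %.0f Mbit/s total over %d lane(s)\n",
           mbit_per_lane, mbit_per_lane * lanes, lanes);
    return 0;
}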

I will agree that CSI can be a bit fickle sometimes, so using a dedicated chip may be the better approach if possible.
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Mar 23, 2018 6:04 pm

I definitely agree with what you've said, and you are correct: on the CSI clock both edges are used. That basically means the data and clock can change at the same rate, and, as you said, the clock frequency would be half the data rate.

So for progressive you were likely using a 216MHz clock, and for interlaced, 108MHz.

Thinking about this further, what I wonder is whether the delays between all the different states are really the critical part (all the LP modes and the HS mode). I guess what I could do is take the ADV7282M data sheet, set up my timings between modes to be the same, and then try to run at, say, 216MHz. That speed should be possible.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 26, 2018 5:38 pm

Not having much luck with Toshiba and their parallel->MIPI chip. I've sent emails, called, and contacted local distributors. No one has the chips in stock, so I am not too keen on committing to a chip I cannot get in quantity!

That said, I started digging into the ADV7282M that you mentioned. First, I found the following note about interlacing from an application engineer:

Also note that the MIPI CSI-2 link retains the line and frame timing of the ITU656 specification,
e.g. the frame rate output is 50Hz for PAL and 60Hz for NTSC inputs; in interlaced mode odd frames have one extra line compared to even frames.
Which means you should be able to tell the difference between both frames in what you were doing.
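
So distinguishing the fields could be as simple as comparing line counts (a sketch only; is_odd_field is illustrative and the actual per-field line counts depend on the source):

Code: Select all

#include <stdbool.h>

/* Per the note above, in interlaced mode one field carries one more line
 * than the other, so the field can be identified from the number of lines
 * actually received for that frame. */
static bool is_odd_field(unsigned lines_received, unsigned even_field_lines)
{
    return lines_received == even_field_lines + 1;
}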

The second thing is that there is an included I2P (interlaced to progressive) circuit. However, it's not frame-buffer based; it just interpolates a line from the two around it, then applies their own special code to try to prevent flickering and such. Not something I am interested in, but just in case people were wondering what the I2P part of it was.

So, time to redesign the board AGAIN but this time for the ADV7282M.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 15294
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 26, 2018 6:36 pm

grimepoch wrote:
Mon Mar 26, 2018 5:38 pm
Not having much luck with Toshiba and their Parallel->MIPI chip. I've sent emails, called, contacted local distributors. No one has the chips in stock, so I am not too keen on getting into a chip I cannot get quantity for!

That said, I started digging into the ADV7282M that you mentioned. First, I found the following note about interlaced from an application engineer:

Also note that the MIPI CSI-2 link retains the line and frame timing of the ITU656 specification,
e.g. the frame rate output is 50Hz for PAL and 60Hz for NTSC inputs; in interlaced mode odd frames have one extra line compared to even frames.
Which means, you should be able to tell the difference between both frames in what you were doing.
It's a slightly confusing situation in V4L2 land. The CSI-2 receiver shouldn't have to know anything about the source, so it doesn't know about these quirks. It also doesn't help that the write pointer in the peripheral isn't 100% accurate, hence relying more on frame end flags than number of bytes written.
That said, I will try to find a moment to have a play with it again.
grimepoch wrote: The second thing is that there is an included I2P (interlaced to progressive) circuit. However, it's not frame-buffer based; it just interpolates a line from the two around it, then applies their own special code to try to prevent flickering and such. Not something I am interested in, but just in case people were wondering what the I2P part of it was.

So, time to redesign the board AGAIN but this time for the ADV7282M.
Yup, the I2P is the main reason to specify the ADV7282M over the other variants without I2P.
Does the ADV7282M actually do what you want? I thought you were after parallel camera data, so analogue video doesn't help much.
There was a post previously about the TC358746 Toshiba camera bridge chip. I've not got time to dig it out now, but you may be able to ask others about where they sourced it.
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 117
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Mar 26, 2018 7:26 pm

6by9 wrote:
Mon Mar 26, 2018 6:36 pm
It's a slightly confusing situation in V4L2 land. The CSI-2 receiver shouldn't have to know anything about the source, so it doesn't know about these quirks. It also doesn't help that the write pointer in the peripheral isn't 100% accurate, hence relying more on frame end flags than number of bytes written.
That said, I will try to find a moment to have a play with it again.
Understood. I am planning on trying raspiraw and seeing what I can do there. That said, if the V4L2 side could be made to work: I have used that interface with a web camera, so I have a lot of that code written already. Might be worth playing with both anyway.

Right now, even with V4L2 and the cameras I used, they were YUV, so I had to convert them, which I did on the GPU (with a frag shader). So performance-wise, I'd think raspiraw feeding into a texture would be about the same. Not sure.
6by9 wrote: Yup, the I2P is the main reason to specify the ADV7282M over the other variants without I2P.
Does the ADV7282M actually do what you want? I thought you were after parallel camera data, so analogue video doesn't help much.
There was a post previously about the TC358746 Toshiba camera bridge chip. I've not got time to dig it out now, but you may be able to ask others about where they sourced it.
Originally I was going with parallel so I could alter the encoded data myself; however, at this point I am looking for something that actually works! :)

Originally I had not gone down the ADV7282M path because I was going for higher bit depth, and I didn't realize at the time how easy it is in the analogue domain to convert RGB into the YCrCb that the 7282M wants.

The TC358746 is the exact chip I was looking at. The problem is, it seems to be REALLY hard to get at the moment. Zero availability, and no help from Toshiba at the moment.

r4d10n
Posts: 1
Joined: Thu Sep 21, 2017 6:25 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Apr 07, 2018 5:37 pm

The compatible part TC358748XBG was available in small numbers from Arrow a while ago... It seems that too has been exhausted.

On a side note, the Terasic D8M-GPIO Digital Camera Package includes the above chipset with an OV8865 8MP camera. The best part is that their documentation CD contains extensive documentation and FPGA code examples for interfacing with the above bridge chip.

http://d8m.terasic.com/cd (Requires signup)

I'm also looking at the possibility of transferring high-rate data over the MIPI CSI-2 bus to/from the Raspberry Pi and was hoping to use the ice40hx series (for its open toolchain). I am planning to experiment with its LVDS signalling [http://www.latticesemi.com/~/media/Latt ... evices.pdf] capabilities sometime. Anyone with experience, please advise!

HermannSW
Posts: 6093
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Apr 09, 2018 7:03 pm

I want to set shutter time with raspiraw to 0.25ms.
viewtopic.php?f=43&t=201568&p=1299787#p1299787

I tried "--expus" with values 250, 25 and even 5, but all frames have the same brightness.
Then I realized that raspiraw seems not to have a shutter time option, only exposure.

Any chance to set a 0.25ms shutter time with raspiraw?
(I have no problem with a source code change.)
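
For reference, this is the kind of source code change I have in mind (assuming the coarse integration time registers 0x015A/0x015B from the public IMX219 datasheet and the line_time_ns value from the sensor mode table; imx219_exposure_to_regs is a hypothetical helper, not tested yet):

Code: Select all

#include <stdint.h>

/* Exposure ("shutter") on the IMX219 is programmed in integration lines,
 * so 0.25ms has to be converted via the line time of the current mode. */
static void imx219_exposure_to_regs(uint32_t exposure_us, uint32_t line_time_ns,
                                    uint8_t *coarse_hi, uint8_t *coarse_lo)
{
    uint32_t lines = (exposure_us * 1000u + line_time_ns / 2) / line_time_ns;
    *coarse_hi = (uint8_t)(lines >> 8);   /* register 0x015A */
    *coarse_lo = (uint8_t)(lines & 0xFF); /* register 0x015B */
}
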
https://github.com/Hermann-SW/RSA_numbers_factored
https://stamm-wilbrandt.de/GS_cam_1152x192@304fps
https://hermann-sw.github.io/planar_graph_playground
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/

shishirk
Posts: 5
Joined: Thu Apr 19, 2018 11:26 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Apr 19, 2018 11:49 pm

Dear @HermannSW and @6by9,

Thanks for your work on raspiraw.

I am trying to use the IMX219 from the PiCam2 to capture small regions of the sensor at high frame rates. I would like to monitor a small area continuously and, if some object is detected there, move the region of interest (ROI) along with that object's movement.

As you have shown, captures of 640x240 or lower do work, but with the data not positioned correctly (at least sometimes). To achieve this you seem to have used binning (I may be incorrect) and indicated that the height is being set. I would like to do this without binning. It is not clear to me where the height is being set in the IMX219 registers, since the code for the height update only fills OmniVision-specific registers.

In my experiments I was able to get pictures using your scripts at a high fps, but with data positioned in improper locations.
The following problems remain:
1. Setting the registers for top, left, height and width either doesn't do anything or gives no data from the sensor (IMX219 registers 0x0160 to 0x016F; my reading of these registers is sketched after this list). Sometimes setting the registers gives no results even at a very low fps.
2. How do you find line_time_ns? How is vts_min calculated? (Both occur in sensor->mode.)
3. What is the reason for the misplaced data?
4. What function call transfers the data to /dev/shm? You start_streaming, then sleep for a given duration and then stop_streaming. Is there a different thread that is active to transfer the data?
5. How do I go about changing the ROI to different regions continuously?
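
For problem 1, this is my current understanding of the IMX219 window registers (taken from the public datasheet; imx219_roi_regs is only a hypothetical sketch, not verified on hardware):

Code: Select all

#include <stdint.h>

struct reg_write { uint16_t addr; uint8_t val; };

/* IMX219 analogue crop / output size registers as I read the datasheet:
 * 0x0164/65 X_ADD_STA, 0x0166/67 X_ADD_END, 0x0168/69 Y_ADD_STA,
 * 0x016A/6B Y_ADD_END, 0x016C/6D X_OUTPUT_SIZE, 0x016E/6F Y_OUTPUT_SIZE. */
static void imx219_roi_regs(uint16_t x, uint16_t y, uint16_t w, uint16_t h,
                            struct reg_write out[12])
{
    uint16_t x_end = x + w - 1, y_end = y + h - 1;
    const struct reg_write r[12] = {
        { 0x0164, (uint8_t)(x >> 8) },     { 0x0165, (uint8_t)(x & 0xFF) },
        { 0x0166, (uint8_t)(x_end >> 8) }, { 0x0167, (uint8_t)(x_end & 0xFF) },
        { 0x0168, (uint8_t)(y >> 8) },     { 0x0169, (uint8_t)(y & 0xFF) },
        { 0x016A, (uint8_t)(y_end >> 8) }, { 0x016B, (uint8_t)(y_end & 0xFF) },
        { 0x016C, (uint8_t)(w >> 8) },     { 0x016D, (uint8_t)(w & 0xFF) },
        { 0x016E, (uint8_t)(h >> 8) },     { 0x016F, (uint8_t)(h & 0xFF) },
    };
    for (int i = 0; i < 12; i++)
        out[i] = r[i];
}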

Any pointers will be helpful.

thanks and regards,
shishir.
