6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Sep 07, 2016 9:00 pm

Thanks logician - nicely done. I've merged the PR :)
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

Gavinmc42
Posts: 7300
Joined: Wed Aug 28, 2013 3:31 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Sep 08, 2016 8:54 am

Lots of clues here for a baremetal raw image grabber.

Has anyone tried this method for the V2?
Not the same as this?
viewtopic.php?f=43&t=146310&start=25

Trying to do this, but no point until i2c0 is under baremetal control.
viewtopic.php?f=72&t=78922
I'm dancing on Rainbows.
Raspberries are not Apples or Oranges

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Sep 08, 2016 9:16 am

Gavinmc42 wrote:Lots of clues here for a baremetal raw image grabber.

Has anyone tried this method for the V2?
Not the same as this?
viewtopic.php?f=43&t=146310&start=25
No, not the same. That was processing the raw Bayer block that gets tagged on the end of JPEGs, and hence is still using the firmware's camera driver. This has the I2C register data sent from the ARM.
I'm in a slightly awkward position in that, due to docs I've seen under NDA, I can't offer support on I2C command sets for these sensors. The V1 sensor's 5MPix I2C traffic was grabbed "off the line" by jbeale, and there's a lurking PR that I need to tidy up where backinside has grabbed all the settings for the other modes. There's nothing wrong with the data that he's captured, but it's not an implementation I'm happy with.

No one has grabbed the I2C register sets for the V2 sensor as yet, therefore it's not available. If someone were to grab the register sets for the V2, then raspiraw could be updated. You may find that it's not necessary to use an I2C analyser to get the register sets...
There's also https://android.googlesource.com/kernel ... o/imx219.c which exposes a pile of register values for IMX219, but only a couple of modes, and not in a currently useful manner.

You mention the crypto. Hopefully I won't be shot for saying that it restricts which boards are allowed to use the firmware driver. It won't affect direct access from the ARM.
Gavinmc42 wrote:Trying to do this, but no point until i2c0 is under baremetal control.
viewtopic.php?f=72&t=78922
I don't normally follow the bare metal forums so hadn't seen that. I'll reply there.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Sep 08, 2016 11:51 am

Gavinmc42 wrote:Trying to do this, but no point until i2c0 is under baremetal control.
viewtopic.php?f=72&t=78922
If you're on a Pi3 then you don't need i2c0 to get to the camera, you can pinmux i2c1 to it (GPIOS 44&45 alt2).
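For bare-metal readers, the pinmux step amounts to writing the 3-bit ALT2 function code into the right GPFSEL field. A minimal sketch, assuming the BCM2835/BCM2837 GPIO register layout from the public peripherals datasheet (helper names are illustrative, not from any existing library):

```c
#include <stdint.h>

#define GPIO_BASE  0x3F200000u  /* physical address of the GPIO block on Pi 2/3 */
#define FSEL_ALT2  0x6u         /* function-select code 0b110 = ALT2 */

/* Each GPFSELn register holds ten 3-bit function-select fields, one per GPIO. */
static inline unsigned gpfsel_index(unsigned gpio) { return gpio / 10; }
static inline unsigned gpfsel_shift(unsigned gpio) { return (gpio % 10) * 3; }

/* Compute the new GPFSEL register value with `gpio` switched to function `fsel`. */
static uint32_t gpfsel_set(uint32_t reg, unsigned gpio, unsigned fsel)
{
    unsigned shift = gpfsel_shift(gpio);
    reg &= ~(0x7u << shift);              /* clear the 3-bit field */
    reg |= (uint32_t)fsel << shift;       /* select the new function */
    return reg;
}
```

GPIO44 sits in GPFSEL4 at bit 12 and GPIO45 at bit 15; on ALT2 they carry SDA1/SCL1. Actually poking the registers requires a volatile pointer at GPIO_BASE (plus an MMU mapping if paging is enabled).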

Gavinmc42
Posts: 7300
Joined: Wed Aug 28, 2013 3:31 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 6:53 am

Thanks 6by9.
I had put the Pi 3 on the bottom of the list because of the mux i2c expander stuff and no schematic.
Reading/writing the image sensor by i2c needs to work first.
Just got native Laz/FPC plus Ultibo FPC compiling on a Pi 3.
Going to run out of screens and Pi's on my home setup for the weekend.

Already have i2c1 test units in Ultibo somewhere, will try pinmux Alt2 44/45 stuff.
Think I should get some GPIO pinmux test code going too.

Hmm, baremetal camera on Pi3, well the 4 cores should help :P
Wonder how to turn them into an IPS, pipeline image processors?
Think you just might have shown me to the Nepal train station.
I'm dancing on Rainbows.
Raspberries are not Apples or Oranges

Kubi
Posts: 4
Joined: Fri Sep 09, 2016 6:40 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 7:22 am

@6by9
Thank you for the great work and providing raw data access to the CSI-2 peripheral.
I'm using the raw sensor access for capturing HDMI video data via the tc358743 (based on example code of raspi_tc358743.c).
For data processing and video output I'm using OpenGL ES 2.0.

Would it be possible to add MMAL_ENCODING_OPAQUE to the "vc.ril.rawcam" component, so that the returned buffer can be used for creating an EGLImage with eglCreateImageKHR(...), like it's done in the raspistill application with the "vc.ril.camera" component? This would speed up video stream processing a lot!

Many thanks in advance.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 9:24 am

Kubi wrote:I'm using the raw sensor access for capturing HDMI video data via the tc358743 (based on example code of raspi_tc358743.c).
For data processing and video output I'm using OpenGL ES 2.0.

Would it be possible to add MMAL_ENCODING_OPAQUE to the "vc.ril.rawcam" component, so that the returned buffer can be used for creating an EGLImage with eglCreateImageKHR(...), like it's done in the raspistill application with the "vc.ril.camera" component? This would speed up video stream processing a lot!
Sorry, not a chance, and it doesn't gain you anything anyway.

The GPU has image pools that some components allocate. These make it easy to define a worst case allocation that guarantees the specified use case will work, while only using what is actually needed. It means you can tell the camera you might use up to 1080P video encode, only want VGA at the moment, but may dynamically switch at a later stage without disabling the component. Or for video decode the headers may say that the stream needs up to 6 1080P reference frames, so you allocate that.

OPAQUE buffers pass a reference to those underlying images to avoid any form of copying of the data.

rawcam is writing the incoming data directly into the buffer you've provided. You may have noticed that MMAL_PARAMETER_ZERO_COPY is set, which means that the buffer is allocated out of gpu_mem but mapped into the ARM virtual address space. You are accessing exactly the same physical memory addresses as the CSI2 receiver wrote to.
There is no internal image pool within the component to hook an OPAQUE buffer on to, so that bit doesn't work.

There should be a way to take the allocated buffer handle and pass it into GL as a premapped image buffer, but I don't know enough detail of GL to know how to, or whether it can accept 24bpp RGB (no alpha or padding).
I'm working on a bit of code to help out with the upstream OpenGL (not ES) driver where the camera will be writing into dmabufs. It may be possible via that to make all of this work more efficiently, but it's yet another thing that is still a work in progress.

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 32227
Joined: Sat Jul 30, 2011 7:41 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 10:15 am

Gavinmc42 wrote:Thanks 6by9.
I had put the Pi 3 on the bottom of the list because of the mux i2c expander stuff and no schematic.
Reading/writing the image sensor by i2c needs to work first.
Just got native Laz/FPC plus Ultibo FPC compiling on a Pi 3.
Going to run out of screens and Pi's on my home setup for the weekend.

Already have i2c1 test units in Ultibo somewhere, will try pinmux Alt2 44/45 stuff.
Think I should get some GPIO pinmux test code going too.

Hmm, baremetal camera on Pi3, well the 4 cores should help :P
Wonder how to turn them into an IPS, pipeline image processors?
Think you just might have shown me to the Nepal train station.
I wrote a SW ISP for some stuff here at work - it could decode from Bayer, apply filters, save data, etc. on the frame. To give you some idea of the work involved, it took about a month to write the framework and basic stages in C++, and you need to add more time for each extra processing stage. Running on an i7 it was nowhere near real time, but not too bad. It was single threaded though; writing a multithreaded version would be possible but time consuming.
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.

Kubi
Posts: 4
Joined: Fri Sep 09, 2016 6:40 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 10:41 am

@6by9
Thank You for your fast reply.
6by9 wrote:rawcam is writing the incoming data directly into the buffer you've provided. You may have noticed that MMAL_PARAMETER_ZERO_COPY is set, which means that the buffer is allocated out of gpu_mem but mapped into the ARM virtual address space. You are accessing exactly the same physical memory addresses as the CSI2 receiver wrote to.
There is no internal image pool within the component to hook an OPAQUE buffer on to, so that bit doesn't work.
Is there such a big difference between the implementation of "vc.ril.camera" and "vc.ril.rawcam"?
Because if I take the "vc.ril.camera" component to read in video data (sent by the tc358743) I can pass the reference of the buffers directly to EGL (eglCreateImageKHR(...)). This is also used with MMAL_PARAMETER_ZERO_COPY set.
See raspitex_configure_preview_port(RASPITEX_STATE *state, MMAL_PORT_T *preview_port) in https://github.com/6by9/userland/blob/h ... RaspiTex.c,
especially the comment above status = mmal_port_parameter_set_boolean(preview_port, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);

So, actually I want to use "vc.ril.rawcam" the way "vc.ril.camera" is used in raspistill when it is called with OpenGL output. The reason I don't use "vc.ril.camera" is that creating the "vc.ril.camera" component initializes the HDMI to CSI-2 bridge tc358743 in a way that doesn't match my needs: I need to get the HDMI video data in as RGB24 or BGR24. The mismatch of subpixel ordering can be handled in an OpenGL shader.

So another possible solution to my problem would be a change in "vc.ril.camera" where the initialization of the tc358743 is handed over to the user.

Many thanks in advance.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 1:25 pm

Kubi wrote:Is there such a big difference between the implementation of "vc.ril.camera" and "vc.ril.rawcam"?
Yes, huge difference.
vc.ril.camera is running the GPU side camera drivers, CSI2 receiver peripheral, ISP, imaging control loops, and some software processing on the images. It's a total of about 110,000 lines of code, although fair chunks of that are bypassed for TC358743 as the image processing algorithms aren't appropriate (eg AE, AGC, AWB, lens shading).
vc.ril.rawcam just talks to the CSI2 receiver peripheral. It's a total of 1500 lines, and much of that is boilerplate code for making a component.
Kubi wrote:Because if I take the "vc.ril.camera" component to read in video data (sent by the tc358743) I can pass the reference of the buffers directly to EGL (eglCreateImageKHR(...)). This is also used with MMAL_PARAMETER_ZERO_COPY set.
See raspitex_configure_preview_port(RASPITEX_STATE *state, MMAL_PORT_T *preview_port) in https://github.com/6by9/userland/blob/h ... RaspiTex.c,
especially the comment above status = mmal_port_parameter_set_boolean(preview_port, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);

So, actually I want to use "vc.ril.rawcam" the way "vc.ril.camera" is used in raspistill when it is called with OpenGL output. The reason I don't use "vc.ril.camera" is that creating the "vc.ril.camera" component initializes the HDMI to CSI-2 bridge tc358743 in a way that doesn't match my needs: I need to get the HDMI video data in as RGB24 or BGR24. The mismatch of subpixel ordering can be handled in an OpenGL shader.
So you don't need the other half of the GPU loop that using OPAQUE gives you, which is the colour space conversion from YUV to RGB, and where doing that on the GPU has significant performance benefit (especially as this was for Pi1 days).
Using vc.ril.camera also converts the RGB data off the TC358743 into the YUV domain for a fair chunk of processing, and you're then converting it back again to RGB at the back end for GL. Absolutely no point in doing the conversions, but the vc.ril.camera pipe is set up for YUV.
Kubi wrote:So another possible solution to my problem would be a change in "vc.ril.camera" where the initialization of the tc358743 is handed over to the user.
Not going to happen. You'd also have to take over control of the ISP because otherwise the GPU has no idea of what data is being fed in.

The whole point of adding rawcam was to deal with the case of getting access to the raw data with minimal overhead, and removing the complexities of the ISP from the loop.

You have standard RGB888 (or BGR888) images that OpenGL should be able to consume as-is. There is almost zero gain in adding any additional GPU overhead, and MMAL_ENCODING_OPAQUE is just not relevant to rawcam.

Hopefully I'll sort the dmabuf stuff in the next week or so and then you do the magic with the OpenGL stack. There was a conversation a couple of months ago with some of the ex-Broadcom folk which was asking the same sort of questions around dmabuf and the like, so those may get resurrected and solved under OpenGLES too.

Kubi
Posts: 4
Joined: Fri Sep 09, 2016 6:40 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Sep 09, 2016 3:22 pm

@6by9
Thank you again for your reply.
6by9 wrote:So you don't need the other half of the GPU loop that using OPAQUE gives you, which is the colour space conversion from YUV to RGB, and where doing that on the GPU has significant performance benefit (especially as this was for Pi1 days).
Using vc.ril.camera also converts the RGB data off the TC358743 into the YUV domain for a fair chunk of processing, and you're then converting it back again to RGB at the back end for GL. Absolutely no point in doing the conversions, but the vc.ril.camera pipe is set up for YUV.
Maybe I didn't describe my requirements clearly (sorry for that). I do not need any ISP, colour space conversion, or other processing on the CSI-2 input data.
I think the problem is the area where the memory for the captured data is allocated. The area where "vc.ril.rawcam" allocates memory for the CSI-2 raw data isn't accepted as a source for the eglCreateImageKHR(...) function, whereas the area where "vc.ril.camera" allocates memory for the processed data is accepted for creating an OpenGL texture using eglCreateImageKHR(...).

So, if I get you right, there will be no way to get the CSI-2 input data from "vc.ril.rawcam" as a reference (MMAL_BUFFER_HEADER_T *buffer; buffer->data) which can then be passed to eglCreateImageKHR(..., buffer->data, ...) the way it can be via "vc.ril.camera"!?

If so, I'm looking forward to the dmabuf solution to easily copy frames around.

drich
Posts: 80
Joined: Tue Jul 28, 2015 7:36 pm
Location: Paris, France 🇫🇷

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 5:27 pm

Any news about your 'vc.ril.isp' component? I'd really like to connect your rawcam component to video_encode, but it only accepts YUV and RGB formats. (If you wonder why: I'd like to get latency as low as possible for an FPV drone project.)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 7:00 pm

drich wrote:Any news about your 'vc.ril.isp' component? I'd really like to connect your rawcam component to video_encode, but it only accepts YUV and RGB formats. (If you wonder why: I'd like to get latency as low as possible for an FPV drone project.)
I assume this is from one of the two official Bayer sensor modules. In which case it's still not there. In the time I have had available to play I've been getting purest green images (YUV 0,0,0) through, so something is missing in the config but I've so far failed to work out what.

I would say though that I doubt you'll reduce the latency significantly, and are more likely to increase it.
rawcam and vc.ril.isp will only work on full frames, so you'll have to wait for the complete frame to be received from the sensor before rawcam produces the buffer, and then that gets submitted to isp and it'll produce a full frame out. vc.ril.camera is significantly more pipelined, so as each block of about 128 lines comes in from the sensor it is passed through to the ISP. The ISP will produce output in stripes too, and that can also be processed as it is produced. Admittedly video_encode can't process the frame until it is complete, but you will have saved some time.

I did some experimentation a while back on viewtopic.php?t=153410&p=1027417#p1004792 comparing the buffer PTS value (which is the GPU STC at the frame start interrupt) vs the STC when the buffer is received by userspace in raspivid. 1280x960 was giving a latency of 46-48ms to be presented to userspace. 1080P was taking 80-81ms. Seeing as you're looking at 23ms from frame start until the complete frame is received in the 1296x972 mode, to have processed the full image and H264 encoded it in a further 23ms is pretty low latency.
Memory says that the H264 encode of a 1080P frame takes 40ms total, but it is pipelined so that the main processing is done in <33ms. In the 1080P case the frame takes 33ms to reach frame end on the CSI2 bus, so the ISP has completed the frame (80-40-33) = 7ms after frame end. You won't be able to start processing until frame end, and even at the theoretical ISP throughput of 180MPix/s the ISP processing is going to take ~11ms for 1080P.
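The budget above can be sanity-checked in a couple of lines; a sketch using only the numbers quoted in this post (nothing independently measured, function names my own):

```c
/* Latency arithmetic from the post above: 80ms frame-start to userspace,
 * ~40ms H264 encode, 33ms CSI2 readout for 1080P, ISP at 180MPix/s. */

/* Milliseconds for one full-frame ISP pass at the given pixel rate. */
static double isp_pass_ms(unsigned width, unsigned height, double pix_per_s)
{
    return (double)width * (double)height / pix_per_s * 1000.0;
}

/* How long after CSI2 frame end the ISP output was complete. */
static double isp_done_after_frame_end_ms(double total_ms, double encode_ms,
                                          double readout_ms)
{
    return total_ms - encode_ms - readout_ms;  /* 80 - 40 - 33 = 7ms */
}
```

So the striped camera pipeline finishes ~7ms after frame end even though a non-pipelined full-frame pass at 180MPix/s would take ~11.5ms for 1920x1080, which is the gap a rawcam plus full-frame ISP path gives up.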

The other thing is that vc.ril.isp will have no AGC or AWB control loops, so it's either full manual, or you've got to write some control loops.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 7:04 pm

Kubi wrote:Maybe I didn't describe my requirements clearly (sorry for that). I do not need any ISP, colour space conversion, or other processing on the CSI-2 input data.
I think the problem is the area where the memory for the captured data is allocated. The area where "vc.ril.rawcam" allocates memory for the CSI-2 raw data isn't accepted as a source for the eglCreateImageKHR(...) function, whereas the area where "vc.ril.camera" allocates memory for the processed data is accepted for creating an OpenGL texture using eglCreateImageKHR(...).

So, if I get you right, there will be no way to get the CSI-2 input data from "vc.ril.rawcam" as a reference (MMAL_BUFFER_HEADER_T *buffer; buffer->data) which can then be passed to eglCreateImageKHR(..., buffer->data, ...) the way it can be via "vc.ril.camera"!?

If so, I'm looking forward to the dmabuf solution to easily copy frames around.
I did ping a quick query to an ex Broadcom colleague who did work on the YUV to GL conversion path.
What you're needing is handling of an RSO (raster scan order) buffer. Apparently there isn't great support for that currently in the GLES stack, as the OPAQUE handling code will also convert from YUV to the RGB texture format.
Using RSO apparently also has performance implications within GLES unless your fragment shaders are set to handle RSO specifically.
All greek to me, but he'll try to have a look when some more of the framework stuff is in place.

drich
Posts: 80
Joined: Tue Jul 28, 2015 7:36 pm
Location: Paris, France 🇫🇷

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 8:13 pm

Thank you for the fast reply.
I have to admit that I have a very low latency right now: about 80ms glass-to-glass (camera-encoder---wifi---decoder-hdmi) at 720p60, which is still too much for fast-response FPV video. According to the numbers you just gave, it won't be possible to go further without modifying GPU-side code. Actually I would like to get to something <=50ms, reducing the resolution if necessary.
I thought it would be possible to go lower using your rawcam in mode 7 (and only encoding every third frame).

What would be really great is if Broadcom made the whole GPU open-source and not only the v3d side, allowing people to make their own MMAL/OMX components to meet their specific needs.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 8:48 pm

drich wrote:Thank you for the fast reply.
I have to admit that I have a very low latency right now: about 80ms glass-to-glass (camera-encoder---wifi---decoder-hdmi) at 720p60, which is still too much for fast-response FPV video. According to the numbers you just gave, it won't be possible to go further without modifying GPU-side code. Actually I would like to get to something <=50ms, reducing the resolution if necessary.
I thought it would be possible to go lower using your rawcam in mode 7 (and only encoding every third frame).
So mode 7 will take 11ms to receive the frame after the frame start based on the 1/90th sec frame period. You'd still gain some advantage from the pipelining in vc.ril.camera that you won't get with a different solution.
I think I quoted my code changes on that other thread to get it to report the latency. Give it a whirl with mode 7 and see what numbers you get.
You would gain a little bit with the V2 sensor with the 120fps mode, where the frame readout time will be down to 8.3ms.

Throwing away frames won't really gain you much other than reducing bandwidth requirements on the encoded video. If you want to try it out, you can achieve the same by taking the buffers from camera->output[1] back to the application, returning 2 out of 3 of them straight back to the camera, and only forwarding 1 out of 3 on to the video encoder. The extra round trips to the ARM should cost <0.2ms.
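The 1-in-3 forwarding described above is just a counter in the buffer callback; a sketch of the decision logic only (the MMAL plumbing around it is omitted, and the function names are illustrative):

```c
#include <stdbool.h>

/* Decide per frame whether to forward the buffer to video_encode
 * or return it straight to the camera, keeping 1 frame in every 3. */
static bool forward_to_encoder(unsigned frame_no)
{
    return (frame_no % 3) == 0;
}

/* Over a run of frames, count how many reach the encoder; e.g. 90
 * frames of mode 7 (90fps) decimate to a 30fps encoded stream. */
static unsigned frames_forwarded(unsigned total_frames)
{
    unsigned n = 0;
    for (unsigned i = 0; i < total_frames; i++)
        if (forward_to_encoder(i))
            n++;
    return n;
}
```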

If you are really wanting to eke away the msecs, then there are MMAL parameters to disable the denoise algorithm, which is probably the only software stage likely to be active on standard video encode. Do avoid enabling DRC as that is quite a resource hog.
drich wrote:What would be really great is if Broadcom made the whole GPU open-source and not only the v3d side, allowing people to make their own MMAL/OMX components to meet their specific needs.
You can hope, but it's highly unlikely.

I'm not sure where you think there is any significant time to be saved. There may be a chance to switch the H264 encoder into encoding each frame as 2 independent slices, and have the camera deliver the slices as soon as they are available. I seem to recall there was a similar approach taken for Miracast, but that reduces compression efficiency slightly and increases the complexity significantly. Exposure time and frame readout time are almost always the most significant parts of the equation, and you can't change the physics there.

drich
Posts: 80
Joined: Tue Jul 28, 2015 7:36 pm
Location: Paris, France 🇫🇷

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 9:31 pm

I'm missing something somewhere then. After disabling all the algorithms (denoise, face-detect, red-eye, stabilisation, ...), setting mode 7, and just setting up a vc.ril.camera->video_render connection without any compression, I still get 65ms latency.
Anyway, thank you for your help, I will continue to dig into MMAL headers

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 10:05 pm

drich wrote:I'm missing something somewhere then. After disabling all the algorithms (denoise, face-detect, red-eye, stabilisation, ...), setting mode 7, and just setting up a vc.ril.camera->video_render connection without any compression, I still get 65ms latency.
Anyway, thank you for your help, I will continue to dig into MMAL headers
Where and how are you measuring the latency?

The numbers I'm talking about are solely within the camera side. Rendering has latency involved too as it has to wait for VSYNC events on the display before updating, so that's adding up to 16ms assuming 60Hz display update. And your display will have some latency too.

drich
Posts: 80
Joined: Tue Jul 28, 2015 7:36 pm
Location: Paris, France 🇫🇷

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 11, 2016 10:36 pm

I measure the overall latency using the common technique: filming the display itself with a timer overlay, then taking a picture of it with a high-speed shutter camera. I know that this is not the best way to measure it, but it is the closest to what the eyes see.
White balance is disabled, exposure is set to fixed-fps, and everything else is disabled.

That is why I thought that using your vc.ril.rawcam and just converting its format to YUV420/RGB16/24 would get a better result than the current one. I didn't mention that I'm currently using OpenMAX and not MMAL, but as it is a simple wrapper (from what I understood) it should not add any kind of latency.

(edit: btw, maybe I should create a new topic on the forum since my "problem" is not directly related to rawcam)

Charix
Posts: 4
Joined: Fri Aug 12, 2016 10:44 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Sep 25, 2016 7:24 pm

Hi 6by9,

I'm back from vacations. Thanks for your hint about enabling low level logging. What I observe here is that whenever I'm not getting the callback after an image trigger, I get logging output like this:
779239.263: unicam_int_callback: instance=1 status = 0x5021
779239.291: unicam_int_callback: bytes_written = 400, lines_done = 0
779239.302: unicam_int_callback error: CRC error
779239.314: unicam_int_callback error: parity bit error
779239.325: unicam_int_callback error: higher order error
779239.338: unicam_int_callback: Line Interrupt 0
779239.676: unicam_int_callback: instance=1 status = 0x20
779239.701: unicam_int_callback: bytes_written = 432, lines_done = 0
779239.714: unicam_int_callback: Line Interrupt 0
779240.087: unicam_int_callback: instance=1 status = 0x5820
779240.113: unicam_int_callback: bytes_written = 368, lines_done = 0
779240.125: unicam_int_callback error: packet length error
779240.137: unicam_int_callback error: parity bit error
779240.148: unicam_int_callback error: higher order error
779240.160: unicam_int_callback: Line Interrupt 0
779240.598: unicam_int_callback: instance=1 status = 0x20
779240.623: unicam_int_callback: bytes_written = 432, lines_done = 0
779240.636: unicam_int_callback: Line Interrupt 0
779240.909: unicam_int_callback: instance=1 status = 0x6820
779240.934: unicam_int_callback: bytes_written = 432, lines_done = 0
779240.946: unicam_int_callback error: packet length error
779240.959: unicam_int_callback error: single bit error error
779240.970: unicam_int_callback error: higher order error
779240.981: unicam_int_callback: Line Interrupt 0

Does this point to hardware problems? I have a rather adventurous interface board to the imager at the moment, so maybe we should create a better interface PCB first. Or do you see anything else in this logging output?

Thanks in advance,

Charix

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Sep 29, 2016 6:18 pm

Charix wrote:CRC error
parity bit error
higher order error
packet length error
Those all say to me that your device or transmission is seriously corrupted. Either your device is producing rubbish, or you aren't treating the signals with the respect they deserve - 1GHz means RF transmission line, and the two halves of the pair should be impedance and length matched. You can get away with some things (I've previously used pairs out of a CAT5 cable to carry CSI data), but at least try to maintain signal integrity.

logician
Posts: 7
Joined: Thu Sep 01, 2016 4:49 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Oct 07, 2016 9:59 pm

I'm not having any luck with this. Has anyone succeeded in connecting a different camera?

I'm using an oddball CSI camera and trying to capture raw frames. I've modified raspiraw to spit out the i2c commands that I need. I see raspiraw trigger on each frame. However the buffers returned to the callback are unmodified, so no data.

Any ideas?

Thanks!
-Juan

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 14076
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Oct 07, 2016 10:39 pm

logician wrote:I'm not having any luck with this. Has anyone succeeded in connecting a different camera?

I'm using an oddball CSI camera and trying to capture raw frames. I've modified raspiraw to spit out the i2c commands that I need. I see raspiraw trigger on each frame. However the buffers returned to the callback are unmodified, so no data.
I've had OV5647 (Bayer), Toshiba TC358743 HDMI to CSI (RGB888), and another device (can't say what, but producing YUYV) all working with this. Hopefully I'll get an Analog Devices ADV7280-M working against it soon too.

At a guess it'll be the image_id that isn't being set correctly, so image data will be being treated as metadata. You don't say what format your sensor is producing, but look up the format in Table 10 (YUV), Table 16 (RGB), or Table 20 (Bayer) in the spec, and ensure that value is set in rx_cfg.image_id (the default is 0x2B, which is correct for the Bayer Raw 10 that OV5647 produces).
Alternatively look at the data in the buffers with MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO set to see if your image data is in there. If there is data in there, that is confirmation you have the wrong image_id.
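For reference, a sketch of the usual CSI-2 data type codes and the packed line length they imply. Only 0x2B (RAW10) is confirmed in this thread; the other values are my reading of the MIPI CSI-2 spec tables mentioned above, so double-check before relying on them:

```c
#include <stdint.h>

/* CSI-2 data type codes to put in rx_cfg.image_id (spec table numbers
 * as referenced above; verify against your copy of the spec). */
enum csi2_data_type {
    CSI2_YUV422_8BIT = 0x1E,  /* Table 10 (YUV)   */
    CSI2_RGB888      = 0x24,  /* Table 16 (RGB)   */
    CSI2_RAW8        = 0x2A,  /* Table 20 (Bayer) */
    CSI2_RAW10       = 0x2B,  /* rawcam default, OV5647 */
    CSI2_RAW12       = 0x2C,
};

/* Bytes per line on the bus for a packed format: RAW10 packs 4 pixels
 * into 5 bytes, RAW12 packs 2 into 3, RGB888 is 3 bytes per pixel. */
static uint32_t packed_line_bytes(uint32_t width, uint32_t bits_per_pixel)
{
    return (width * bits_per_pixel) / 8;
}
```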

logician
Posts: 7
Joined: Thu Sep 01, 2016 4:49 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Oct 07, 2016 11:04 pm

Hey 6by9,

Thanks for the quick reply! The sensor I'm working with has an oddball 12 bit format that is not in the supported lists. I ended up using MMAL_ENCODING_RGB24, which would be 24 bits per pixel, and then cutting my line size in half. Does this encoding really matter since we are getting data right off the CSI interface without any unpacking?

I will study your suggestions about image_id. But I also wanted to offer the log trace below. I note that bytes_written and lines_done are zero. Does this narrow down the possibilities?

Thanks!
-Juan


726289.829: cm_new_input_frames: start
726289.845: cm_new_input_frames: end - new frames locked 1f6be564 and 1f6be5b0, head buffer now 0
726289.854: cm_process_thread: Buffer swap 3 - task on new buffers
726289.859: cm_set_buffers: start
726289.871: set buf params ptr de934000-de9c7000, stride 384
726289.878: cm_set_buffers: end
726482.110: unicam_int_callback: instance=1 status = 0x8
726482.123: unicam_int_callback: bytes_written = 0, lines_done = 0
726482.140: unicam_int_callback: Frame Start, time = 726482101, frame_int_req=0, frame_int_ready=0
726482.678: unicam_int_callback: instance=1 status = 0x10
726482.689: unicam_int_callback: bytes_written = 0, lines_done = 0
726482.697: unicam_int_callback: Frame End
726482.708: cm_process_thread: events 0010
726482.714: cm_unicam_event - enter
726482.732: cm_unicam_event: switching current metadata 0xDE9C8000 to 0xDE8A0000 and current image 0xDEA5C000 to 0xDE934000
726482.762: cm_unicam_event: FRAME_END complete
726482.770: cm_unicam_event - exit

logician
Posts: 7
Joined: Thu Sep 01, 2016 4:49 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Oct 08, 2016 2:40 am

Hey 6by9,

You were right on. It was the frames flagged by MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO that contained my data. Thanks for the help!

-Juan
