I hope to be head-butting this next week.
Can anyone clarify the relation of RAW to YUV? E.g. I had thought YUV was already available as an option.
Also, YUV generally has 2x Y plus U and V, similar to two green/luminance detectors plus two colour detectors (U and V).
Re: RAW output information
RAW is simply the data straight from the sensor (although it doesn't look quite like linear light to me; some gamma seems to be applied already). The sensor is a BGGR-pattern Bayer type. The data is typically not referred to any particular color space; it needs calibration and a color matrix to become usable as a normal image in a standard RGB color space such as sRGB, which is what most JPEGs you see on the internet use.
YUV data is just a different way of representing color information and can be converted to and from RGB data. There is no direct relationship between RAW sensor data and YUV data. For more information, see http://en.wikipedia.org/wiki/List_of_co ... their_uses
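To make "just a different way of representing color information" concrete, here is a minimal sketch of the BT.601 flavour of the RGB-to-YUV transform. The coefficients are the standard BT.601 ones; whether the Pi's ISP uses exactly these is an assumption, since several variants exist.

```python
def rgb_to_yuv(r, g, b):
    """One common (BT.601) definition of the YUV transform; the exact
    coefficients the Pi's ISP uses may differ."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted sum of RGB
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v

# A neutral white pixel has full luma and zero chroma:
y, u, v = rgb_to_yuv(255, 255, 255)
```

Note that the transform is invertible, which is the point made above: YUV carries no more or less information than RGB, it just separates brightness from colour.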
Re: RAW output information
Raw should be direct off the sensor; I don't think anything is applied. It's before the debayer stage, so very little is done prior to that. But I'll check. jbeale wrote: RAW is simply the data straight from the sensor (although it doesn't look quite like linear-light to me, seems to be some gamma already applied)? The sensor is a BGGR pattern Bayer type. It is typically not referred to any particular color space, needs calibration and color matrix to become usable as a normal image in a standard RGB color space such as sRGB, which is what most JPEG you see on the internet will use.
YUV data is just a different way of representing color information and can be converted to and from RGB data. There is no direct relationship between RAW sensor data and YUV data. For more information, see http://en.wikipedia.org/wiki/List_of_co ... their_uses
YUV has been through the ISP and is simply the uncompressed data of the final result. JPEG is simply a compressed version of the YUV.
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
Personally I think RAW is a bit of a cargo cult, but I’m wondering whether there’s a chance of nopping out specifically the anti-vignetting filter.
Now seeing that the lens can be removed quite easily (that stuff was more waxy than gluey) the compensation of fall-off is counter-productive when the housing is strapped to another lens.
JamesH, do you think there’s any chance of making that optional for all modes, or is this cemented permanently into the post-processing code?
Re: RAW output information
Glad to hear the thread-lock material is not too tough. If you are swapping in another lens, depending on its properties, you might also want to change the default sharpening. I'd agree most general users won't find RAW of interest. Personally I'm thinking about image stacking, for which RAW is helpful to bypass noise reduction among other things. towolf wrote: Personally I think RAW is a bit of a cargo cult, but I’m wondering whether there’s a chance of nopping out specifically the anti-vignetting filter.
Now seeing that the lens can be removed quite easily (that stuff was more waxy than gluey) the compensation of fall-off is counter-productive when the housing is strapped to another lens.
Re: RAW output information
Not sure there is a way of turning off lens shading correction without a new firmware. Since that is a use case that is never encountered in normal operation of a cellphone camera, it's never been implemented except to turn it off in the GPU with a rebuild. I'll try to remember to ask the people who know. towolf wrote: Personally I think RAW is a bit of a cargo cult, but I’m wondering whether there’s a chance of nopping out specifically the anti-vignetting filter.
Now seeing that the lens can be removed quite easily (that stuff was more waxy than gluey) the compensation of fall-off is counter-productive when the housing is strapped to another lens.
JamesH, do you think there’s any chance of making that optional for all modes, or is this cemented permanently into the post-processing code?
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
Quick question:
How fast can you shoot a (long) sequence of RAW stills? In other words, what is the highest FPS you can achieve (for a prolonged period of time) in stills mode with RAW output?
Re: RAW output information
I think less than 1 fps. The limitation is not so much data transfer rate, as it is the autoexposure algorithm stabilizing exposure levels before the shot is taken. If we had true manual control (directly setting ISO and shutter speed) we might do better. I think manual exposure is "on the list" but there is no ETA for that. Perder wrote: what is the highest FPS you can achieve (for a prolonged period of time) in stills mode with RAW output?
The timelapse mode of raspistill leaves the camera on in-between exposures so you might think autoexposure delays would be less of an issue, but there seems to be some exposure bug if you try to do rapid timelapse exposures, and a separate bug or limitation that the --RAW flag seems to be ignored after the first shot of a timelapse sequence.
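For reference, a hypothetical timelapse invocation along the lines discussed above: -t is the total run time in ms, -tl the interval in ms, and -r appends the Bayer data to each JPEG. Flag names are from the raspistill usage text; the achievable rate is limited by AE settling as described, and the command is built into a variable here so it can be inspected before running on a Pi with a camera attached.

```shell
# Roughly one still every 2 s for 30 s, with raw Bayer data appended (-r).
CMD="raspistill -t 30000 -tl 2000 -r -o frame%04d.jpg"
echo "$CMD"
```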
Re: RAW output information
I reckon there might be data transfer implications, and not just that but SD card write delays as well. On mobile phones we have found SD card write speed does limit fps when capturing large images. jbeale wrote: I think less than 1 fps. The limitation is not so much data transfer rate, as it is the autoexposure algorithm stabilizing exposure levels before the shot is taken. If we had true manual control (directly setting ISO and shutter speed) we might do better. I think manual exposure is "on the list" but there is no ETA for that. Perder wrote: what is the highest FPS you can achieve (for a prolonged period of time) in stills mode with RAW output?
The timelapse mode of raspistill leaves the camera on in-between exposures so you might think autoexposure delays would be less of an issue, but there seems to be some exposure bug if you try to do rapid timelapse exposures, and a separate bug or limitation that the --RAW flag seems to be ignored after the first shot of a timelapse sequence.
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
Early days for all this, but wondering why you're comparing a compressed lossy format (JPEG) with RAW?
It would make more sense to compare it with a format like BMP or PNG.
Raspistill can output BMP and PNG too, using the -e option:
Code: Select all
raspistill -e png -o image.png
Note: just using a file name of .png isn't enough, you need to use the -e option.
Despite the docs mentioning JPEG, RAW does get added to the other formats if specified:
Code: Select all
-rw-r--r-- 1 pi pi 15116598 Jun 9 09:01 image.bmp
-rw-r--r-- 1 pi pi 3735046 Jun 9 08:56 image.jpg
-rw-r--r-- 1 pi pi 10204383 Jun 9 08:55 image.png
-rw-r--r-- 1 pi pi 21520695 Jun 9 09:02 imageRAW.bmp
-rw-r--r-- 1 pi pi 10115070 Jun 9 08:56 imageRAW.jpg
-rw-r--r-- 1 pi pi 16636117 Jun 9 08:55 imageRAW.png
BMP, being totally uncompressed, is a constant filesize.
Plus you can edit, re-edit and save BMPs and PNGs multiple times without the quality loss you get doing that with JPG (someone earlier in the thread touted lossless saving as an advantage of RAW).
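For anyone wondering where the extra megabytes in the imageRAW.* files live: raspistill appends the Bayer blob after the normal image data, and the appended block begins with an ASCII 'BRCM' header. A minimal Python sketch of splitting the two apart; the helper name and the toy bytes are made up for illustration.

```python
def split_jpeg_raw(data: bytes):
    """Split raspistill JPEG+RAW output into (jpeg_bytes, raw_block).

    The appended Bayer block starts with an ASCII 'BRCM' header;
    everything else here is illustrative, not raspistill's own code."""
    idx = data.rfind(b"BRCM")
    if idx == -1:
        return data, None           # no raw block was appended
    return data[:idx], data[idx:]   # (image container, raw Bayer block)

# Toy demonstration with fake bytes (not a real capture):
blob = b"\xff\xd8...jpeg...\xff\xd9" + b"BRCM" + b"\x00" * 16
jpeg, raw = split_jpeg_raw(blob)
```

Saving the second half to its own file is how the DNG-conversion tools mentioned later in this thread get at the sensor data.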
Android app - Raspi Card Imager - download and image SD cards - No PC required !
Re: RAW output information
That's a good point about the uncompressed output options, I didn't realize that! Certainly, PNG or BMP is the right choice for an accurate comparison. Do those save the metadata too (exposure, ISO, and so forth)? Note, I suspect most of the difference between a RAW output (once we have that fully worked out) and BMP or PNG or even JPEG will be in the noise reduction and sharpening choices that are made. Apparently the amount of in-device sharpening is user-adjustable, but I think noise reduction is not, and is always applied. I'd be happy to hear I'm wrong though!
Re: RAW output information
JPEG is the only HW-accelerated format; the others are done on the GPU but have no dedicated HW, so they are a lot slower. I'd go with PNG every time, as its lossless compression makes it a better bet than BMP or GIF.
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
Eh? DSLR optics have this aplenty. The Canon 5D Mark III features lens correction in the firmware, both shading and lateral color. And Canon has provided it in their DSLR RAW post-processing software for years. jamesh wrote: DSLR lenses are large enough, for example, that no lens shading correction is required ...
Re: RAW output information
But fall-off on a 35mm full-frame lens is probably insignificant when you are using a sensor with a crop factor of around 10. rkinch wrote: Eh? DSLR optics have this aplenty. The Canon 5D Mark III features lens correction in the firmware, both shading and lateral color. And Canon has provided it in their DSLR RAW post-processing software for years. jamesh wrote: DSLR lenses are large enough, for example, that no lens shading correction is required ...
Re: RAW output information
No, the cos^4 falloff depends on the FOV angle, which is a function only of focal length and sensor size, and is independent of the lens format (SLR, C/CS-mount, M12, etc.). towolf wrote: But fall-off on a 35mm full frame lens is probably insignificant when you are using a sensor with a crop factor of around 10.
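This claim is easy to check numerically: the half-field angle, and hence the cos^4 factor, depends only on the ratio of sensor size to focal length, so scaling both by the same factor leaves the falloff unchanged. A rough sketch; the example numbers are illustrative round values, not specs for any real lens.

```python
import math

def corner_falloff(focal_mm, sensor_diag_mm):
    """Relative corner illumination from the cos^4 law, given focal
    length and sensor diagonal (both in mm)."""
    theta = math.atan(sensor_diag_mm / (2 * focal_mm))  # half-field angle
    return math.cos(theta) ** 4

# Same FOV -> same falloff, regardless of lens format:
wide = corner_falloff(3.6, 4.6)    # short lens, tiny sensor
full = corner_falloff(36.0, 46.0)  # everything scaled 10x: same angle
```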
Re: RAW output information
Ah, interesting. Didn't know that. So for proper RAW processing we would need to provide the lens shading tables? Or some app that does the lens shading correction for you? rkinch wrote: Eh? DSLR optics have this aplenty. The Canon 5D Mark III features lens correction in the firmware, both shading and lateral color. And Canon has provided it in their DSLR RAW post-processing software for years. jamesh wrote: DSLR lenses are large enough, for example, that no lens shading correction is required ...
Presumably people could actually generate their own: use a light box to get a flat field and work it out from that. That's what we do here anyway when we are not using algorithmic correction.
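The light-box idea amounts to inverting a flat-field capture: every pixel's gain is the ratio of the brightest (centre) value to that pixel's value in the flat frame. A toy sketch on plain Python lists; the function names are invented, and a real version would work per Bayer channel on full-resolution data.

```python
def shading_gains(flat):
    """Per-pixel gains derived from a flat-field capture: each gain
    brings that pixel up to the brightest (centre) level."""
    peak = max(max(row) for row in flat)
    return [[peak / v for v in row] for row in flat]

def apply_gains(img, gains):
    """Multiply an image element-wise by a gain table."""
    return [[p * g for p, g in zip(ri, rg)] for ri, rg in zip(img, gains)]

flat = [[100.0, 80.0], [80.0, 64.0]]       # fake vignetted flat field
gains = shading_gains(flat)
flat_corrected = apply_gains(flat, gains)  # flat field becomes uniform
```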
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
OK, but good luck trying to find a 3.6mm lens for Canon or Nikon. rkinch wrote: No, the cos4 falloff depends on the FOV angle which is a function only of focal length and sensor size, and independent of the lens format (SLR, C/CS-mount, M12, etc.). towolf wrote: But fall-off on a 35mm full frame lens is probably insignificant when you are using a sensor with a crop factor of around 10.
Re: RAW output information
Yes, if you want to process to flatten the total response across the field, including lens falloff, then this is going to be different for each lens type, so the firmware and applications will have to be runtime-configurable in that regard. jamesh wrote: So for proper RAW processing we would need to provide the lens shading tables?
It's a pretty rich feature that you do falloff correction for the stock lens. I see that it is an OV5647 hardware feature and I suppose expected of good cellphone cameras to have this. This would be very advanced to have in a hackable camera like the Raspicam. It has only recently become part of Canon's in-camera DSLR firmware, which also corrects lateral color aberrations.
Not sure what you mean by "RAW" in the raspistill software, which seems to make it a synonym for an uncompressed but otherwise processed (deBayered, black-leveled, etc) image, not the actual raw voltages from the physical sensor pixels, which is what RAW means in DSLRs.
My research involves building instrumentation for fluorescence photography of live human retinas and brains, in which I've done some firmware video image processing work, using Apogee cameras. The OV5647/Broadcom Raspicam could be a suitable imager for that sort of thing, if we could hack the firmware, or at least fetch true raw output. So I share your sense of how these cameras seem to present clean images as just naturally coming from the chip pixels, when in fact there is a fright of things going on to calibrate and correct the somewhat messy counting of photons in zillions of micron-scale buckets.
Re: RAW output information
Actually, the RAW we provide when using --raw is indeed the Bayer data direct from the sensor; it has had no processing at all. Note we do not use the OV5647 lens shading correction; it's all done in the GPU.
I don't think it's available for the Raspi yet, but we do have some algorithmic code that does lens shading correction without the need for tables. I'm not exactly sure how it works, but it's much better when there may be slight differences between lenses, as we don't have to produce an 'average' lens table to cope; it's adaptive. A similar scheme was used on the 808 and on recent Samsung phones, and the results are impressive.
Principal Software Engineer at Raspberry Pi Ltd.
Working in the Applications Team.
Re: RAW output information
Here is a post where I show a crop from the RAW data, which has not been debayered. rkinch wrote: Not sure what you mean by "RAW" in the raspistill software, which seems to make it a synonym for an uncompressed but otherwise processed (deBayered, black-leveled, etc) image, not the actual raw voltages from the physical sensor pixels, which is what RAW means in DSLRs.
http://www.raspberrypi.org/phpBB3/viewt ... 18#p357445
We have a rough tool that generates an Adobe DNG format file from the raspistill --RAW output (JPEG+RAW), which can be loaded into any of various software including the free UFRaw program, to do needed debayer, color correction, etc. UFRaw has a choice of many debayer algorithms, and the ability to do lens distortion correction, lateral color etc. but you have to supply your own lens modelling coefficients, of course. Lens distortion can be calculated as a side-effect of stitching a panorama with panotools.
If you play with the raw data, you will see the true noise level of the sensor (the JPEG/BMP/PNG image from raspistill has gone through some significant de-noising process). I presume it can be improved by image stacking + dark frame removal & flat-fielding but I have not had time to try this yet.
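The stacking + dark frame + flat-fielding recipe mentioned above can be sketched in a few lines: average the raw frames to beat down random noise, subtract the dark frame to remove the fixed pattern, then divide by a normalised flat field to undo shading. Flat scalar lists stand in for full Bayer frames here, and all the names are illustrative.

```python
def stack_and_calibrate(frames, dark, flat):
    """Mean-stack raw frames, subtract a dark frame, flat-field correct.
    Each argument is a flat list of pixel values; frames is a list of them."""
    n = len(frames)
    mean = [sum(px) / n for px in zip(*frames)]     # stack: per-pixel mean
    dark_sub = [m - d for m, d in zip(mean, dark)]  # remove dark/fixed pattern
    flat_mean = sum(flat) / len(flat)               # normalise the flat field
    return [p * flat_mean / f for p, f in zip(dark_sub, flat)]

frames = [[110.0, 90.0], [130.0, 110.0]]  # two noisy exposures of a flat scene
dark = [20.0, 20.0]                       # dark-frame levels
flat = [100.0, 80.0]                      # vignetted flat field
result = stack_and_calibrate(frames, dark, flat)
```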
Re: RAW output information
Hello,
For those who, like me, can't get raspiraw to compile on the Pi, I've compiled it for Win32 using cygwin64 (it worked on the first try!):
http://stuff.knackes.com/dld/201311/ras ... 98055E.zip
However, the result is rather ... green (using Photoshop CS6). Is that expected? Does anybody have the settings to get the most out of this data in Photoshop?
Last edited by pixelk on Fri Nov 29, 2013 1:32 pm, edited 1 time in total.
Re: RAW output information
Chrome warns me that your zip seems suspect, so I didn't keep it.
And why can't you compile it on the Pi?
As for the greenish color, why not use the White Balance tool in ACR on a neutral piece of the image?
You might also want to look at this thread: http://www.raspberrypi.org/phpBB3/viewt ... 43&t=59344. It's a way of performing shading correction after the fact; it seems to work quite well with B&W images, though color seems more difficult.
As for the greenish color, why not use the White Balance tool in ARC on a neutral piece of the image?
You might also want to look at this thread: http://www.raspberrypi.org/phpBB3/viewt ... 43&t=59344. It's a way of performing shading after the fact; it seems to work quite well with B&W images, color seems more diffcult.
Re: RAW output information
I don't know why Chrome is offended by the zip; it's just the compiled exe and the cygwin DLLs needed to run it.
As for the auto white balance in ACR, it doesn't quite cut the mustard. As I have a neutral gray board lying around, I will try to create my own color profile.
Thank you.
*** EDIT ***
Firefox doesn't like the download link either; it seems the forum redirects to the URL incorrectly.
Re: RAW output information
If you use the stock lens, it has very heavy vignetting (both in color and in luminosity), which is not corrected when using RAW. Here's a clear example: http://www.raspberrypi.org/phpBB3/viewt ... 25#p457557
IMO RAW is better suited to working with other lenses that do not have vignetting (in their center); then the GPU shading correction becomes a fault and you want to avoid it. It seems the sensor I've used up till now with my Nikon lenses is sub-par, as it has a strange magenta-ish and overexposing right side, so I'm planning on using another one, but I need to be very careful not to have dust problems.
I think the method I posted in the link above could be used to correct the shading of the stock lens in RAW; I'm not sure it's needed with a good lens and a good sensor. Sadly the method is very, very slow.
Re: RAW output information
I've been following this topic for a while and have been using raspiraw and the pgm/ppm converters; they've really helped me, thank you. Since the last comment, a colour matrix has been posted for the OV5647 here: http://users.soe.ucsc.edu/~rcsumner/raw ... trices.txt , specifically { 12782,-4059,-379,-478,9066,1413,1340,1513,5176 }. Putting this into raspiraw as 1.2782, -0.4059, etc. (i.e. divided by 10000) and using { 1.0, 1.4, 1.0 } as the neutral float gives a lot better images than the current default. I'm still a noob and really just copying and pasting, but I thought this might have been missed.
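A sketch of how those posted numbers might be applied per pixel: the matrix entries are divided by 10000 (12782 -> 1.2782, as described above), and the "neutral" values act as per-channel white-balance gains. Whether the gains are applied before the 3x3 matrix is my assumption about raspiraw's internals, not something stated in the thread.

```python
# Posted OV5647 matrix, entries divided by 10000 as described above.
MATRIX = [
    [ 1.2782, -0.4059, -0.0379],
    [-0.0478,  0.9066,  0.1413],
    [ 0.1340,  0.1513,  0.5176],
]
NEUTRAL = (1.0, 1.4, 1.0)  # per-channel WB gains from the post

def correct_pixel(r, g, b):
    """Apply WB gains, then the 3x3 colour matrix, to one RGB pixel.
    (Gain-before-matrix ordering is an assumption.)"""
    r, g, b = r * NEUTRAL[0], g * NEUTRAL[1], b * NEUTRAL[2]
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in MATRIX)
```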