It does NOT apply to those using raspistill/vid, PiCamera, or bcm2835-v4l2, all of which use the firmware camera stack.
Up until now bcm2835-unicam has been a video-node centric driver, ie almost everything happens through a /dev/videoN node. This is the original mode of operation of V4L2, and is relatively simple to use. It has limitations when it comes to more complex devices though, eg CSI2 muxes for multiple cameras, or chips such as ADV748[1|2] which can capture either analogue video or HDMI depending on configuration.
The Media Controller API was added to support these more complex devices, and it allows enumeration of all devices and subdevices, along with allowing configuration of the source and sink pads of each.
eg the relatively simple graph for the IMX219.

The main thing with Media Controller is that each node now needs to be configured by the userspace app, not just /dev/videoN. Whilst this is an extra (and annoying) step, it does allow for configurations such as the one where I have both an IMX219 and an IMX477 connected to the same Pi via a mux (in this case an Arducam Doubleplexer).
Based on which source link into the video-mux is enabled, either device can be streamed (only one at a time). (Yes, it does work.)
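As a rough illustration of selecting between the two sensors, something like the following. This is a sketch only: the entity names ('imx219 10-0010', 'imx477 11-001a', 'video-mux') and pad numbers are assumptions and will differ on your system - run "media-ctl -p" to discover the real graph.

```shell
#!/bin/sh
# Sketch: switch the active mux input between two sensors.
MDEV=/dev/media1

if [ -e "$MDEV" ]; then
    # Print the full topology: entities, pads, and links.
    media-ctl -p -d "$MDEV"
    # Disable the imx477 source link into the mux ("[0]") and enable
    # the imx219 one ("[1]"); only one source link may be active.
    media-ctl -d "$MDEV" -l "'imx477 11-001a':0 -> 'video-mux':1 [0]"
    media-ctl -d "$MDEV" -l "'imx219 10-0010':0 -> 'video-mux':0 [1]"
else
    echo "no media device at $MDEV"
fi
```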
Changes to the bcm2835-unicam driver have just been merged to enable full configuration via the Media Controller API.
Supporting changes to libcamera are about to be posted to libcamera-devel, and then https://github.com/raspberrypi/linux/pull/4645 will be merged to enable Media Controller by default for all sensors that have a merged libcamera tuning. That means that for these devices just setting the format/resolution on /dev/videoN will not work, and you need to either:
- use Media Controller (which the Raspberry Pi libcamera pipeline handler will do for you), or
- use the override "media-controller=0" when loading the dtoverlay (eg dtoverlay=imx219,media-controller=0) to revert to the video-node centric mode of operation.
Drivers that do not have a tuning merged (eg adv728x, tc358743, and currently ov7251) have NOT been switched to use Media Controller API by default, and so can still be controlled in the old way. If you wish to use Media Controller, use the override "media-controller=1" when loading the overlay. There is also a kernel module parameter that allows you to globally enable the use of the Media Controller API. This is mainly intended for debugging, not for general use.
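For reference, the overrides in config.txt would look something like this (the overlay names are just examples - use whichever matches your device):

```
# Sensor with a merged tuning: Media Controller is the new default,
# so opt out if you need the old video-node centric behaviour
dtoverlay=imx219,media-controller=0

# Device without a merged tuning: video-node centric remains the
# default, so opt in to Media Controller explicitly
dtoverlay=tc358743,media-controller=1
```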
I have long resisted this change as it is a pain in the neck to have to configure the resolution in multiple places, and it just fails to stream if you miss one, but there are now benefits to doing this that warrant the pain.
If Media Controller is enabled, what was, for example

Code: Select all
v4l2-ctl -v width=3280,height=2464,pixelformat=pRAA

now also gains

Code: Select all
media-ctl -v -d /dev/media1 -V "'imx219 10-0010':0 [fmt:SRGGB10_1X10/3280x2464 field:none]"

to configure the source pad. (The /dev/mediaN node number may move as they are not fixed allocations.)
You can programmatically check if Media Controller is to be used for configuring resolutions by checking VIDIOC_QUERYCAP for the capability flag V4L2_CAP_IO_MC. The updates to libcamera do check for this flag and throw an error if not found - sorry, supporting both options in libcamera just complicates things too much.
I hope that all makes sense, otherwise please do ask questions.