[libcamera-devel] [PATCH v3 01/12] documentation: Introduce Camera Sensor Model
Laurent Pinchart
laurent.pinchart at ideasonboard.com
Fri Sep 15 17:30:53 CEST 2023
Hi Jacopo,
Thank you for the patch.
On Fri, Sep 15, 2023 at 03:06:39PM +0200, Jacopo Mondi via libcamera-devel wrote:
> Introduce a documentation page about the 'camera sensor model'
> implemented by libcamera.
>
> The camera sensor model serves to provide to applications a reference
> description of the processing steps that take place in a camera sensor
> in order to precisely control the sensor configuration through the
> forthcoming SensorConfiguration class.
>
> Signed-off-by: Jacopo Mondi <jacopo.mondi at ideasonboard.com>
> Reviewed-by: Naushir Patuck <naush at raspberrypi.com>
> Reviewed-by: Kieran Bingham <kieran.bingham at ideasonboard.com>
> ---
> Documentation/binning.png | Bin 0 -> 66004 bytes
This is a direct copy of the MIPI CCS specification, which explicitly
states:

    This material is protected by copyright laws, and may not be
    reproduced, republished, distributed, transmitted, displayed,
    broadcast or otherwise exploited in any manner without the express
    prior written permission of MIPI Alliance
We can't use the image. Unless you tell me you've obtained a license,
which would surprise me :-)
Same for skipping.png.
> Documentation/camera-sensor-model.png | Bin 0 -> 74270 bytes
What did you draw these with ? PNG files can't be edited easily; we need
the source, ideally as an SVG file.
> Documentation/camera-sensor-model.rst | 198 ++++++++++++++++++++++++++
> Documentation/index.rst | 1 +
> Documentation/meson.build | 1 +
> Documentation/skipping.png | Bin 0 -> 67218 bytes
> 6 files changed, 200 insertions(+)
> create mode 100644 Documentation/binning.png
> create mode 100644 Documentation/camera-sensor-model.png
> create mode 100644 Documentation/camera-sensor-model.rst
> create mode 100644 Documentation/skipping.png
[snip]
> diff --git a/Documentation/camera-sensor-model.rst b/Documentation/camera-sensor-model.rst
> new file mode 100644
> index 000000000000..93b23340b309
> --- /dev/null
> +++ b/Documentation/camera-sensor-model.rst
> @@ -0,0 +1,198 @@
> +.. SPDX-License-Identifier: CC-BY-SA-4.0
> +
> +.. _camera-sensor-model:
> +
> +The libcamera camera sensor model
> +=================================
> +
> +libcamera defines an abstracted camera sensor model in order to provide
s/abstracted/abstract/
> +a description of each of the processing steps that result in image data being
> +sent on the media bus and that form the image stream delivered to applications.
I had real trouble parsing this :-S I also don't think it's technically
correct: the sensor model describes the sensor only, it doesn't extend
all the way to applications.
I think the documentation needs considerable improvement. As I don't
want to delay this patch series, this can be done on top.
> +
> +Applications should use the abstracted camera sensor model defined here to
> +precisely control the operations of the camera sensor.
> +
> +The libcamera camera sensor model targets image sensors producing frames in
> +RAW format, delivered through a MIPI CSI-2 compliant bus implementation.
Is there any reason to exclude parallel buses ?
> +
> +The abstract sensor model maps libcamera components to the characteristics and
> +operations of an image sensor, and serves as a reference to model the libcamera
> +CameraSensor class and SensorConfiguration classes and operations.
> +
> +In order to control the configuration of the camera sensor through the
> +SensorConfiguration class, applications should understand this model and map it
> +to the combination of image sensor and kernel driver in use.
If this document is meant for application developers, there should be no
mention of kernel drivers. Hiding kernel drivers from applications is
the whole point of libcamera.
> +
> +The camera sensor model defined here is based on the *MIPI CCS specification*,
> +particularly on *Section 8.2 - Image readout* of *Chapter 8 - Video Timings*.
> +
> +.. image:: camera-sensor-model.png
> +
> +Glossary
> +---------
> +
> +- *Pixel array*: The full grid of pixels, active and inactive ones
> +
> +- *Pixel array active area*: The portion(s) of the pixel array that
> + contains valid and readable pixels; corresponds to the libcamera
> + properties::PixelArrayActiveAreas
> +
> +- *Analog crop rectangle*: The portion of the *pixel array active area* which
> + is read-out and passed to further processing stages
s/read-out/read out/
> +
> +- *Subsampling*: Pixel processing techniques that reduce the image size by
> + interpolating (*binning*) or by skipping adjacent pixels
Binning isn't interpolation.
> +
> +- *Digital crop*: Crop of the sub-sampled image data before scaling
> +
> +- *Digital scaling*: Digital scaling of the image data
> +
> +- *Output crop*: Crop of the scaled image data to form the final output image
There's no digital scaling support below; you can drop digital scaling
and output crop here, as well as from the diagram.
> +
> +- *Frame output*: The frame (image) as output on the media bus by the
> + camera sensor
> +
> +Camera Sensor configuration parameters
> +--------------------------------------
> +
> +The libcamera camera sensor model defines parameters that allow users to
> +control:
The documentation should define the sensor model first and the
parameters next. The model should explain how the abstract sensor
operates. For instance, it could start with:
1. The sensor reads pixels from the *pixel array*. The pixels being read out are
selected by the *analog crop rectangle*.
2. The pixels are subsampled to reduce the image size without affecting the
field of view. Two subsampling techniques are applied:
- Binning combines adjacent pixels of the same colour by averaging or summing
their values, in the analog domain and/or the digital domain.
- Skipping ...
Binning and skipping are applied both horizontally and vertically, with
identical or different factors in the two directions.
3. Digital crop ...
...
X. The pixels are output on the sensor's bus. ...
(with figures and additional explanations for binning and skipping,
including formulas)
The item numbers should refer to numbers in the camera sensor model
diagram.
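To make this concrete, the new section could close with a small sketch of
how the successive steps determine the output frame size. Something along
these lines (a rough C++ sketch, the names are made up and don't map to any
existing libcamera class):

    #include <algorithm>

    struct SensorSize {
        unsigned int width;
        unsigned int height;
    };

    /* Illustration only: compose analog crop, subsampling and digital crop. */
    SensorSize sensorOutputSize(const SensorSize &analogCrop,
                                unsigned int xBin, unsigned int yBin,
                                unsigned int xSkip, unsigned int ySkip,
                                const SensorSize &digitalCrop)
    {
        /* Binning and skipping factors compound multiplicatively. */
        SensorSize subsampled = {
            analogCrop.width / (xBin * xSkip),
            analogCrop.height / (yBin * ySkip),
        };

        /* The digital crop selects a region of the subsampled image. */
        return {
            std::min(digitalCrop.width, subsampled.width),
            std::min(digitalCrop.height, subsampled.height),
        };
    }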
> +
> +1. The image format bit depth
> +
> +2. The size and position of the *Analog crop rectangle*
> +
> +3. The subsampling factors used to downscale the pixel array readout data to a
> + smaller frame size without reducing the image *field of view*. Two
> + configuration parameters are made available to control the downscaling factor
> +
> + - binning
> + - binning reduces the image resolution by combining adjacent pixels
> + - a vertical and horizontal binning factor can be specified, the image
> + will be downscaled in its vertical and horizontal sizes by the specified
> + factor
> +
> + .. figure:: binning.png
> + :height: 350
> + :width: 400
> +
> + Figure 39 from the MIPI CCS Specification (Version 1.1)
> +
> +
> + .. code-block::
> + :caption: Definition: The horizontal and vertical binning factors
> +
> + horizontal_binning = xBin;
> + vertical_binning = yBin;
> +
> +
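A numeric example would help the reader here: with xBin = yBin = 2 (2x2
binning) a 3840x2160 analog crop is read out as a 1920x1080 image.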
> + - skipping
> + - reduces the image resolution by skipping the read-out of a
> + number of adjacent pixels
> + - the skipping factor is specified by the 'increment' number (number of
> + pixels to 'skip') in the vertical and horizontal directions and for
> + even and odd rows and columns
> +
> + .. figure:: skipping.png
> + :height: 400
> + :width: 350
> +
> + Figure 35 from the MIPI CCS Specification (Version 1.1)
> +
> +
> + .. code-block::
> + :caption: Definition: The horizontal and vertical skipping factors
> +
> + horizontal_skipping = (xOddInc + xEvenInc) / 2
> + vertical_skipping = (yOddInc + yEvenInc) / 2
> +
> +
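Same comment as for binning, a short worked example would help: with
xOddInc = 3 and xEvenInc = 1 the sensor reads two adjacent columns and
then skips the next two, giving

    horizontal_skipping = (3 + 1) / 2 = 2

i.e. a 2x horizontal subsampling.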
> + - binning and skipping can be generically combined
> + - different sensors perform the binning and skipping stages in different
> + orders
> + - for the sake of computing the final output image size the order of
> + execution is not relevant.
> +
> + - the overall down-scaling factor is obtained by combining the binning and
> + skipping factors
> +
> + .. code-block::
> + :caption: Definition: The total scaling factor (binning + sub-sampling)
> +
> + total_horizontal_downscale = horizontal_binning + horizontal_skipping
> + total_vertical_downscale = vertical_binning + vertical_skipping
> +
> +
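Shouldn't these be combined by multiplication rather than addition ? 2x
binning followed by 3x skipping (or the other way around) reduces each
dimension by a factor of 6, not 5.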
> +4. The output data frame size
> + - the output size is used to specify any additional cropping on the
> + sub-sampled frame
> + - \todo Add support for controlling scaling in the digital domain
\todo is a doxygen command, not valid .rst syntax. Please compile the
documentation and check that the output looks right.
> +
> +5. The total line length and frame height (*visibile* pixels + *blankings*) as
> + sent on the MIPI CSI-2 bus
> +
> +6. The pixel transmission rate on the MIPI CSI-2 bus
> +
> +The above parameters are combined to obtain the following high-level
> +configurations:
> +
> +- **frame output size**
> +
> + Obtained by applying a crop to the physical pixel array size in the analog
> + domain, followed by optional binning and sub-sampling (in any order),
> + followed by an optional crop step in the output digital domain.
> +
> + \todo Add support for controlling scaling in the digital domain
> +
> +- **frame rate**
> +
> + The combination of the *total frame size*, the image format *bit depth* and
> + the *pixel rate* of the data sent on the MIPI CSI-2 bus allows to compute the
> + image stream frame rate. The equation is the well known:
> +
> + .. code-block::
> +
> + frame_duration = total_frame_size / pixel_rate
> + frame_rate = 1 / frame_duration
> +
> +
> + where the *pixel_rate* parameter is the result of the sensor's configuration
> + of the MIPI CSI-2 bus *(the following formula applies to MIPI CSI-2 when
> + used on MIPI D-PHY physical protocol layer only)*
> +
> + .. code-block::
> +
> + pixel_rate = CSI-2_link_freq * 2 * nr_of_lanes / bits_per_sample
> +
> +
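A fully worked example under the formulas would also help. With made-up
numbers (only to show how the terms combine):

    #include <cstdio>

    int main()
    {
        /* Hypothetical sensor configuration, for illustration only. */
        const double linkFreq = 450e6;         /* CSI-2 D-PHY link frequency [Hz] */
        const unsigned int nrLanes = 4;
        const unsigned int bitsPerSample = 10; /* RAW10 */

        /* D-PHY transfers two bits per lane per link clock cycle. */
        const double pixelRate = linkFreq * 2 * nrLanes / bitsPerSample;

        /* Total frame size, visible pixels plus blankings. */
        const double totalLineLength = 2200;
        const double totalFrameHeight = 1125;

        const double frameDuration = totalLineLength * totalFrameHeight / pixelRate;

        printf("pixel rate: %.0f pixels/s\n", pixelRate);      /* 360000000 */
        printf("frame rate: %.2f fps\n", 1.0 / frameDuration); /* ~145.45 */

        return 0;
    }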
> +The SensorConfiguration class
> +-----------------------------
> +
> +Applications can control the camera sensor configuration by populating the
> +sensorConfig member of the CameraConfiguration class.
> +
> +Camera applications that specify a SensorConfiguration are assumed to be
> +highly-specialized applications that know what sensor they're dealing with and
> +which modes are valid for the sensor in use.
> +
> +\todo The sensorConfig instance should be fully specified in all its fields and
> +it should be applied in full to the camera sensor. For now only consider
> +bit-depth and output size as the kernel interface doesn't allow to fully control
> +the sensor configuration.
> +
> +\todo If the application has populated the sensorConfig with any parameter
> +which can not be applied unchanged on the camera sensor, the
> +CameraConfiguration is considered Invalid and will be rejected by the
> +Camera::configure() call.
> +
> +If the application provides a valid SensorConfiguration, its configuration
> +takes precedence over any conflicting StreamConfiguration request.
> +
> +For example:
> +
> +- If the platform cannot upscale, all the processed streams should be smaller or
> + equal in size of the requested sensor configuration.
> +
> +- If the platform produces RAW streams using the frames from the camera sensor
> + without any additional processing, the RAW stream format should be adjusted to
> + match the one configured by the sensorConfig.
This section belongs in the doxygen documentation even more than the
rest, in my opinion.
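For the record, here is how I understand applications would use this,
going by the API proposed in this series (so the exact field names may
still change):

    #include <memory>

    #include <libcamera/libcamera.h>

    using namespace libcamera;

    void configureSensor(std::shared_ptr<Camera> camera)
    {
        std::unique_ptr<CameraConfiguration> config =
            camera->generateConfiguration({ StreamRole::Raw });

        /* Request a specific sensor mode: 10-bit, 1920x1080 readout. */
        SensorConfiguration sensorConfig;
        sensorConfig.bitDepth = 10;
        sensorConfig.outputSize = { 1920, 1080 };

        config->sensorConfig = sensorConfig;

        /*
         * Per this series, a sensor configuration that can't be applied
         * unchanged is rejected rather than adjusted.
         */
        if (config->validate() == CameraConfiguration::Invalid)
            return;

        camera->configure(config.get());
    }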
> diff --git a/Documentation/index.rst b/Documentation/index.rst
> index 43d8b017d3b4..63fac72d11ed 100644
> --- a/Documentation/index.rst
> +++ b/Documentation/index.rst
> @@ -23,3 +23,4 @@
> Sensor driver requirements <sensor_driver_requirements>
> Lens driver requirements <lens_driver_requirements>
> Python Bindings <python-bindings>
> + Camera Sensor Model <camera-sensor-model>
As you're documenting the libcamera API, doesn't this belong in the
doxygen documentation ?
> diff --git a/Documentation/meson.build b/Documentation/meson.build
> index b2a5bf15e6ea..7c1502592baa 100644
> --- a/Documentation/meson.build
> +++ b/Documentation/meson.build
> @@ -63,6 +63,7 @@ endif
>
> if sphinx.found()
> docs_sources = [
> + 'camera-sensor-model.rst',
> 'coding-style.rst',
> 'conf.py',
> 'contributing.rst',
[snip]
--
Regards,
Laurent Pinchart