[PATCH v1 1/1] Documentation: Add first version of the tuning-guide
Kieran Bingham
kieran.bingham at ideasonboard.com
Tue Aug 6 18:55:06 CEST 2024
Quoting Stefan Klug (2024-08-06 15:44:41)
> This patch adds a initial version of the tuning guide for libcamera. The
> process is established based on the imx8mp and will be improved when we
> do more tuning for different ISPs with our own tooling.
>
> Signed-off-by: Stefan Klug <stefan.klug at ideasonboard.com>
> ---
> Documentation/conf.py | 4 +-
> Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++
> Documentation/index.rst | 1 +
> Documentation/meson.build | 1 +
> 4 files changed, 280 insertions(+), 2 deletions(-)
> create mode 100644 Documentation/guides/tuning/tuning.rst
>
> diff --git a/Documentation/conf.py b/Documentation/conf.py
> index 7eeea7f3865b..5387942b9af5 100644
> --- a/Documentation/conf.py
> +++ b/Documentation/conf.py
> @@ -21,8 +21,8 @@
> # -- Project information -----------------------------------------------------
>
> project = 'libcamera'
> -copyright = '2018-2019, The libcamera documentation authors'
> -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'
> +copyright = '2018-2024, The libcamera documentation authors'
> +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'
>
> # Version information is provided by the build environment, through the
> # sphinx command line.
> diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst
> new file mode 100644
> index 000000000000..1c30f3f20b4e
> --- /dev/null
> +++ b/Documentation/guides/tuning/tuning.rst
> @@ -0,0 +1,276 @@
> +.. SPDX-License-Identifier: CC-BY-SA-4.0
> +
> +Sensor Tuning Guide
> +===================
> +
> +To create visually good images out of the raw data provided by the camera
s/out of the/from the/
> +sensor, a lot of image processing takes place. This is usually done inside an
> +ISP (Image Signal Processor). To be able to do the necessary processing, the
> +corresponding algorithms need to be parameterized according to the used hardware
s/to the used hardware/to the hardware in use/
> +(typically sensor, light and lens). Creating these parameters is called tuning.
s/Creating/Calculating/ ?
s/is called/is a process called/
> +The tuning process results in a tuning file which is then used by libcamera to
> +parameterize the algorithms at runtime.
s/parameterize the/provide calibrated parameters to the/
> +
> +The processing block of an ISP vary from vendor to vendor and can be
s/block/blocks/
> +arbitrarily complex. Never the less a diagram of the most common and important
s/Never the less/Nevertheless/ ?
- based on https://dictionary.cambridge.org/dictionary/english/nevertheless
Is it correct to say these are the most common and important? That
sounds too subjective and dependent upon use cases...
Perhaps:
"a diagram of common blocks frequently found in an ISP design is shown
below:"
But I don't mind your version.
> +blocks is shown below:
> +
> + ::
> +
> + +--------+
> + | Light |
> + +--------+
> + |
> + v
> + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+
> + | Sensor | -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |
> + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+
> +
> +**Light** The light used to light the scene has a crucial influence on the
> +resulting image. The human eye and brain are specialized in automatically
> +adapting to different lights. In a camera this has to be done by an algorithm.
> +Light is a complex topic and to correctly describe light sources you need to
> +describe the spectrum of the light. To simplify things, lights are categorized
> +according to their `color temperature (K)
> +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to
> +keep in mind as it means that calibrations may differ between light sources even
> +though they have the same nominal color temperature. For best results the tuning
> +images need to be taken for a complete range of light sources.
I would put that last statement on its own line as it's a
distinct/important point ("For best results...")
> +
> +**Sensor** The sensor captures the incoming light and produces the raw images.
"and converts a measured analogue signal to a digital representation which conveys
the raw images."
> +Data is commonly filtered by a color filter array into red, green and blue
Data? or light ... I think the light is filtered by a colour filter...
> +channels. As these filters are not perfect some postprocessing needs to be done
> +to recreate the correct color.
> +
> +**BLC** Black level correction. Even without incoming light a sensor produces
> +pixel values above zero. This is due to a artificially added pedestal value
s/due to a/due to an/
> +(Which is added to get a evenly distributed noise level around zero instead of
> +only the upper half) and other effects in the electronics light dark currents.
Rather than bracket that, I'd suggest:
"
This is due to two system artifacts that impact the measured light
levels on the sensor. Firstly, a deliberate pedestal value is added to
get an evenly distributed noise level around zero, avoiding negative
values and clipping of the data.
Secondly, additional underlying electrical noise can be caused by
various external factors including thermal noise and electrical
interference.
"
> +To get good images with real black, that black level needs to be subtracted. As
> +that level is typically known for a sensor it is hardcoded in libcamera and does
> +not need any calibration at the moment.
"As the pedestal value is typically known" (we don't know the electrical noise
component)
Is it hardcoded though? Can't it be overridden by the tuning file still?
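As an aside, the subtraction itself is simple enough to sketch. This is a
hypothetical illustration, not libcamera's code; the 10-bit range and the
pedestal of 64 are made-up values:

```python
# Hypothetical black level correction sketch. The 10-bit range and the
# pedestal value of 64 are illustrative, not taken from a real sensor.

def black_level_correct(pixels, black_level=64, max_value=1023):
    """Subtract the pedestal and rescale to use the full range again."""
    corrected = []
    for p in pixels:
        v = max(p - black_level, 0)  # noise can dip below the pedestal
        # Rescale so (max_value - black_level) maps back to max_value.
        corrected.append(round(v * max_value / (max_value - black_level)))
    return corrected

print(black_level_correct([64, 100, 1023]))  # [0, 38, 1023]
```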
> +
> +**AWB** Auto white balance. For a proper image the color channels need to be
I'm not sure 'proper image' is well defined. But I haven't got a
better option in my head yet...
> +adjusted, to get correct white balance. This means that monochrome objects in
> +the scene appear monochrome in the output image (white is white and gray is
> +gray). In libcamera this is done based on a gray world model. No tuning is
> +necessary for this step.
"Presently in the libipa implementation of libcamera, this is managed by
a grey world model." ?
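For illustration, the gray-world idea boils down to a few lines. This is a
sketch only, not the libipa implementation, and the channel means are
invented:

```python
# Minimal gray-world sketch (illustrative, not the libipa code): assume
# the scene averages to gray and scale red and blue to match green.

def gray_world_gains(means):
    """means: per-channel (r, g, b) averages over the raw image."""
    r, g, b = means
    return (g / r, 1.0, g / b)  # gains that equalize the channel means

# A reddish cast (r mean above g and b) yields a red gain below 1.0.
print(gray_world_gains((200.0, 100.0, 50.0)))  # (0.5, 1.0, 2.0)
```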
> +
> +**LSC** Lens shading correction. The lens in use has a big influence on the
> +resulting image. The typical effects on the image are lens-shading (also called
> +vignetting). This means that due to the physical properties of a lens the
> +transmission of light falls of to the corners of the lens. To make things even
'falls off towards the corners' ?
> +harder, this falloff can be different for different colors/wave lengths. LSC is
s/wave lengths/wavelengths/
> +therefore tuned for a set of light sources and for each color channel
> +individually.
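A rough sketch of how per-cell shading gains could be derived from a
flat-field capture (illustrative only; the 3x3 grid and values are made up,
and real tables are much finer and computed per channel and per light
source):

```python
# Illustrative shading table for one color channel, derived from a
# flat-field capture of an evenly lit surface. The 3x3 grid and the
# values are made up; real tables are much finer and per-channel.

def shading_gains(flat_field):
    """Return per-cell gains lifting each cell to the brightest one."""
    peak = max(max(row) for row in flat_field)  # usually the centre
    return [[round(peak / v, 2) for v in row] for row in flat_field]

# Corners darker than the centre get the largest gains.
flat = [[50, 80, 50],
        [80, 100, 80],
        [50, 80, 50]]
print(shading_gains(flat))  # [[2.0, 1.25, 2.0], [1.25, 1.0, 1.25], [2.0, 1.25, 2.0]]
```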
> +
> +**CCM** Color correction matrix. After the previous processing blocks the grays
> +are preserved, but colors are still not correct. This is mostly due to the
> +color temperature of the light source and the imperfections in the color filter
> +array of the sensor. To correct for this a so called color correction matrix is
s/a so called color correction matrix/a 'color correction matrix'/ ?
> +calculated. This is a 3x3 matrix, that is used to optimize the captured colors
> +regarding the perceived color error. To do this a chart of known and precisely
> +measured colors (macbeth chart) is captured. Then the matrix is calculated
s/(macbeth chart)/commonly called a macbeth chart/
> +using linear optimization to best map each measured colour of the chart to it's
> +known value. The color error is measured in `deltaE
> +<https://en.wikipedia.org/wiki/Color_difference>`_.
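To make the matrix step concrete, here is a sketch of applying a CCM to one
pixel. The matrix values are invented for illustration; real tuning fits
the matrix against the measured chart patches by minimizing deltaE:

```python
# Sketch of applying a 3x3 CCM to one linear RGB pixel. The matrix
# values are invented; real tuning fits the matrix against measured
# chart patches, minimizing deltaE.

def apply_ccm(ccm, rgb):
    """Multiply the 3x3 matrix with a linear (r, g, b) triple."""
    return tuple(sum(ccm[i][j] * rgb[j] for j in range(3)) for i in range(3))

# Each row sums to 1.0 so that grays (after AWB) stay gray.
ccm = [[1.6, -0.4, -0.2],
       [-0.3, 1.5, -0.2],
       [-0.1, -0.4, 1.5]]
print(apply_ccm(ccm, (0.5, 0.5, 0.5)))  # gray in, gray out (up to rounding)
```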
> +
> +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for
> +display. For images to be perceived correctly by the human eye, they need to be
> +encoded with the corresponding inverse gamma. See also
> +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need
> +tuning, but is crucial for correct visual display.
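The encoding step itself can be sketched as a plain power function (real
sRGB encoding additionally has a linear segment near black, omitted here
for simplicity):

```python
# Gamma encoding sketch using a plain power function. Real sRGB adds a
# linear segment near black, omitted here for simplicity.

def gamma_encode(linear, gamma=2.2):
    """Map a linear [0, 1] value to its display-encoded form."""
    return linear ** (1.0 / gamma)

# Linear 18% mid-gray encodes to roughly 0.46, which is why it appears
# mid-bright on a gamma 2.2 display.
print(round(gamma_encode(0.18), 2))  # 0.46
```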
> +
> +Materials needed for the tuning process
> +---------------------------------------
> +
> +Precisely calibrated optical equipment is very expensive and out of the scope of
> +this document. Still it is possible to get reasonably good calibration results
> +at little costs. The most important devices needed are:
> +
> + - A light box with the ability to produce defined light of different color
> + temperatures. Typical temperatures used for calibration are 2400K
> + (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K
> + (daylight). As a first test, keylights for webcam streaming can be used.
> + These are available with support for color temperatures ranging from 2500K
> + to 9000K. For better results professional light boxes are needed.
> + - A ColorChecker chart. These are sold from calibrite and it makes sense to
> + get the original one.
> + - A integration sphere. This is used to create completely homogenious light
> + for the lens shading calibration. We had good results with the use of
> + a homogeneous area light (the above mentioned keylight).
I would perhaps call this "a large light panel with sufficient diffusion
to get an even distribution of light."
> + - An environment without external light sources. Ideally calibration is done
> + without any external lights in a fully dark room. A black curtain is one
> + solution, working by night is the cheap alternative.
> +
Indeed, I think I'm happier with a dark curtain, so I don't have
to stay up so late.
And I've hit 50% through the document, but run out of time so I'll snip
and pause here!
<snip>
--
Kieran
More information about the libcamera-devel
mailing list