[PATCH v2 1/1] Documentation: Add first version of the tuning-guide

Stefan Klug stefan.klug at ideasonboard.com
Tue Aug 27 11:56:44 CEST 2024


Hi Dan,

Thank you for your review.

On Wed, Aug 14, 2024 at 09:29:01AM +0100, Dan Scally wrote:
> Hi Stefan, thanks for the patch. This is very good!
> 
> On 14/08/2024 08:40, Stefan Klug wrote:
> > This patch adds an initial version of the tuning guide for libcamera. The
> > process is established based on the imx8mp and will be improved when we
> > do more tuning for different ISPs with our own tooling.
> > 
> > Signed-off-by: Stefan Klug <stefan.klug at ideasonboard.com>
> > ---
> >   Documentation/conf.py                  |   4 +-
> >   Documentation/guides/tuning/tuning.rst | 291 +++++++++++++++++++++++++
> >   Documentation/index.rst                |   1 +
> >   Documentation/meson.build              |   1 +
> >   4 files changed, 295 insertions(+), 2 deletions(-)
> >   create mode 100644 Documentation/guides/tuning/tuning.rst
> > 
> > diff --git a/Documentation/conf.py b/Documentation/conf.py
> > index 7eeea7f3865b..5387942b9af5 100644
> > --- a/Documentation/conf.py
> > +++ b/Documentation/conf.py
> > @@ -21,8 +21,8 @@
> >   # -- Project information -----------------------------------------------------
> >   project = 'libcamera'
> > -copyright = '2018-2019, The libcamera documentation authors'
> > -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'
> > +copyright = '2018-2024, The libcamera documentation authors'
> > +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'
> >   # Version information is provided by the build environment, through the
> >   # sphinx command line.
> > diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst
> > new file mode 100644
> > index 000000000000..a58dda350556
> > --- /dev/null
> > +++ b/Documentation/guides/tuning/tuning.rst
> > @@ -0,0 +1,291 @@
> > +.. SPDX-License-Identifier: CC-BY-SA-4.0
> > +
> > +Sensor Tuning Guide
> > +===================
> > +
> > +To create visually good images from the raw data provided by the camera sensor,
> > +a lot of image processing takes place. This is usually done inside an ISP (Image
> > +Signal Processor). To be able to do the necessary processing, the corresponding
> > +algorithms need to be parameterized according to the hardware in use (typically
> > +sensor, light and lens). Calculating these parameters is a process called
> > +tuning. The tuning process results in a tuning file which is then used by
> > +libcamera to provide calibrated parameters to the algorithms at runtime.
> > +
> > +The processing blocks of an ISP vary from vendor to vendor and can be
> > +arbitrarily complex. Nevertheless a diagram of common blocks frequently found in
> > +an ISP design is shown below:
> > +
> > + ::
> > +
> > +  +--------+
> > +  |  Light |
> > +  +--------+
> > +      |
> > +      v
> > +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+
> > +  | Sensor |  -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |
> > +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+
> > +
> > +**Light** The light used to light the scene has a crucial influence on the
> "light used to light" is a bit funny. Perhaps "light used to illuminate"?
> > +resulting image. The human eye and brain are specialized in automatically
> > +adapting to different lights. In a camera this has to be done by an algorithm.
> > +Light is a complex topic and to correctly describe light sources you need to
> > +describe the spectrum of the light. To simplify things, lights are categorized
> > +according to their `color temperature (K)
> > +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to
> > +keep in mind as it means that calibrations may differ between light sources even
> > +though they have the same nominal color temperature.
> > +
> > +For best results the tuning images need to be taken for a complete range of
> > +light sources.
> > +
> > +**Sensor** The sensor captures the incoming light and converts a measured
> > +analogue signal to a digital representation which conveys the raw images. The
> > +light is commonly filtered by a color filter array into red, green and blue
> > +channels. As these filters are not perfect some postprocessing needs to be done
> > +to recreate the correct color.
> > +
> > +**BLC** Black level correction. Even without incoming light a sensor produces
> > +pixel values above zero. This is due to two system artifacts that impact the
> > +measured light levels on the sensor. Firstly, a deliberate pedestal value is
> > +artificially added so that the noise is evenly distributed around zero,
> > +avoiding negative values and clipping of the data.
> > +
> > +Secondly, additional underlying electrical noise can be caused by various
> > +external factors including thermal noise and electrical interferences.
> > +
> > +To get good images with real black, that black level needs to be subtracted. As
> > +that level is typically known for a sensor it is hardcoded in libcamera and does
> > +not need any calibration at the moment. If needed, that value can be manually
> > +overwritten in the tuning configuration.
> Can it? For all IPA modules?
> > +
> > +**AWB** Auto white balance. For a proper image the color channels need to be
> > +adjusted to get a correct white balance. This means that monochrome objects in
> > +the scene appear monochrome in the output image (white is white and gray is
> > +gray). Presently in the libipa implementation of libcamera, this is managed by a
> > +grey world model. No tuning is necessary for this step.
> > +
> > +**LSC** Lens shading correction. The lens in use has a big influence on the
> > +resulting image. A typical effect on the image is lens shading (also called
> > +vignetting). This means that due to the physical properties of a lens the
> > +transmission of light falls of towards the corners of the lens. To make things
> s/falls of/falls off. I would perhaps link vignetting to the Wikipedia page.
> > +even harder, this falloff can be different for different colors/wavelengths. LSC
> > +is therefore tuned for a set of light sources and for each color channel
> > +individually.
> > +
> > +**CCM** Color correction matrix. After the previous processing blocks the grays
> > +are preserved, but colors are still not correct. This is mostly due to the color
> > +temperature of the light source and the imperfections in the color filter array
> > +of the sensor. To correct for this a 'color correction matrix' is calculated.
> > +This is a 3x3 matrix, that is used to optimize the captured colors regarding the
> > +perceived color error.
> 
> 
> Perhaps "used to optimize the captured colours by correcting the perceived color error"?

To me "correcting the error" also doesn't really cut it. Would you be ok with "used to
optimize the captured colours by minimizing the perceived color error."?

> 
> > To do this a chart of known and precisely measured colors
> > +commonly called a Macbeth chart is captured. Then the matrix is calculated using
> > +linear optimization to best map each measured colour of the chart to its known
> > +value. The color error is measured in `deltaE
> > +<https://en.wikipedia.org/wiki/Color_difference>`_.
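(As an aside, the linear optimization mentioned above can be sketched in a few
lines of numpy. The patch values below are made up for illustration, and a real
fit would minimize deltaE in a perceptual space such as Lab rather than plain
RGB distance, but the structure is the same:)

```python
import numpy as np

# Hypothetical white-balanced linear RGB values measured from four chart
# patches, and their published reference values (made-up numbers).
measured = np.array([
    [0.40, 0.22, 0.18],
    [0.70, 0.55, 0.46],
    [0.25, 0.30, 0.55],
    [0.20, 0.35, 0.15],
])
reference = np.array([
    [0.45, 0.20, 0.17],
    [0.76, 0.58, 0.50],
    [0.24, 0.26, 0.58],
    [0.18, 0.38, 0.12],
])

# Solve measured @ M = reference for M in the least-squares sense; the
# 3x3 CCM applied per pixel is then the transpose of M.
M, _, _, _ = np.linalg.lstsq(measured, reference, rcond=None)
ccm = M.T

# Applying the CCM maps each measured patch closer to its reference.
corrected = measured @ ccm.T
```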
> > +
> > +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for
> > +display.
> 
> The double "display" feels a bit off. Hmmmm...maybe "Today's displays
> usually apply a gamma of 2.2 to the image they show"?
> 

done

> 
> >   For images to be perceived correctly by the human eye, they need to be
> > +encoded with the corresponding inverse gamma. See also
> > +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need
> > +tuning, but is crucial for correct visual display.
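(Side note: the plain power-law form of that encoding is a one-liner; real
pipelines usually implement it as a lookup table, often with a linear segment
near black as in sRGB. A sketch:)

```python
import numpy as np

# Encode linear values with the inverse display gamma (1/2.2), clipping
# to [0, 1] first. This is the plain power-law form only; sRGB adds a
# linear toe near black.
def gamma_encode(linear, gamma=2.2):
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

# 18% grey in linear light ends up at roughly 46% after encoding, which
# is why mid-grey appears "half bright" on a gamma 2.2 display.
mid_grey = gamma_encode(np.array([0.18]))
```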
> > +
> > +Materials needed for the tuning process
> > +---------------------------------------
> > +
> > +Precisely calibrated optical equipment is very expensive and out of the scope of
> > +this document. Still it is possible to get reasonably good calibration results
> > +at little costs. The most important devices needed are:
> s/costs/cost.

done

> > +
> > +   - A light box with the ability to produce defined light of different color
> > +     temperatures. Typical temperatures used for calibration are 2400K
> > +     (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K
> > +     (daylight). As a first test, keylights for webcam streaming can be used.
> > +     These are available with support for color temperatures ranging from 2500K
> > +     to 9000K.
> Is it worth mentioning that you're using those in combination with a DIY
> lightbox rather than just openly in a room?

I added: "9000k and can be combined with a simple white or grey card
box. It is important that the box is of a neutral grey color, so that it
doesn't influence the measurements."

> > For better results professional light boxes are needed.
> > +   - A ColorChecker chart. These are sold by Calibrite and it makes sense to
> > +     get the original one.
> I think that the reasons for getting the original one might be worth
> expanding on, as it's probably the most expensive thing to acquire on this
> list.

I changed that to: It makes sense to get the original one, as there is
no easy way to recreate this with similar quality.

> > +   - An integration sphere. This is used to create completely homogeneous light
> > +     for the lens shading calibration. We had good results with the use of a
> > +     large light panel with sufficient diffusion to get an even distribution of
> > +     light (the above mentioned keylight).
> > +   - An environment without external light sources. Ideally calibration is done
> > +     without any external lights in a fully dark room. A black curtain is one
> > +     solution, working by night is the cheap alternative.
> > +
> > +
> > +Overview over the process
> s/over/of

done

> > +-------------------------
> > +
> > +The tuning process of libcamera consists of the following steps:
> > +
> > +   1. Take images for the tuning steps with different color temperatures
> s/with/at

done

> > +   2. Configure the tuning process
> > +   3. Run the tuning script to create a corresponding tuning file
> > +
> > +
> > +Taking raw images with defined exposure/gain
> > +--------------------------------------------
> > +
> > +For all the tuning files you need to capture a raw image with a defined exposure
> > +time and gain. It is crucial that no pixels are saturated on any channel. At the
> > +same time the images should not be too dark. Strive for a exposure time, so that
> "Strive for an exposure time such that"

done

> > +the brightest pixel is roughly 90% saturated. Finding the correct exposure time
> > +is currently a manual process. We are working on solutions to aid in that
> > +process.
> Are we? cool!

In my mind it's mostly done :-)... then reality hits.

> >   After finding the correct exposure time settings, the easiest way to
> > +capture a raw image is with the ``cam`` application that gets installed with the
> > +regular libcamera install.
> > +
> > +Create a ``constant-exposure-<temperature>.yaml`` file with the following content:
> > +
> > +.. code:: yaml
> > +
> > +   frames:
> > +     - 0:
> > +         AeEnable: false
> > +         AnalogueGain: 1.0
> > +         ExposureTime: 1000
> > +
> The IPU3 IPA wouldn't handle these...one for my todo list, but perhaps we
> need to formally define the requirements an IPA would need to meet to
> support our tuning process somewhere or something...

Oh what is supported by the ipu3? Yes, a feature table somewhere would
be nice. Someday...

> > +Then capture one or more images using the following command:
> > +  ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=image_#.dng``
> > +
> > +.. Note:: Typically the brightness changes for different colour temperatures. So
> > +    this has to be readjusted for every colour temperature.
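(When capturing for several colour temperatures it is easy to script this. The
sketch below writes the constant-exposure file and builds the cam command line
from above; the per-temperature exposure values are pure placeholders and have
to be found manually for each setup:)

```python
from pathlib import Path

# Hypothetical per-temperature exposure times in microseconds -- these
# must be determined manually per sensor and light, as described above.
EXPOSURES = {2500: 1000, 4000: 800, 6500: 600}

def write_exposure_file(temperature, exposure_us, gain=1.0, directory="."):
    """Write constant-exposure-<temperature>.yaml as used by cam --script."""
    path = Path(directory) / f"constant-exposure-{temperature}.yaml"
    path.write_text(
        "frames:\n"
        "  - 0:\n"
        "      AeEnable: false\n"
        f"      AnalogueGain: {gain}\n"
        f"      ExposureTime: {exposure_us}\n"
    )
    return path

def capture_command(temperature, prefix="alsc"):
    """Build the cam invocation shown above for one colour temperature."""
    return [
        "cam", "-c", "1",
        "--script", f"constant-exposure-{temperature}.yaml",
        "-C", "-s", "role=raw",
        f"--file={prefix}_{temperature}k_#.dng",
    ]

# To actually capture, run e.g.:
#   import subprocess
#   for t, exp in EXPOSURES.items():
#       write_exposure_file(t, exp)
#       subprocess.run(capture_command(t), check=True)
```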
> > +
> > +
> > +Capture images for lens shading correction
> > +------------------------------------------
> > +
> > +To be able to correct lens shading artifacts, images of a homogeneously lit
> > +neutral gray background are needed. Ideally an integration sphere is used for
> > +that task.
> > +
> > +As a fallback images of a flat panel light can be used. If there are multiple
> > +images for the same color temperature, the tuning scripts will average the
> > +tuning results which can further improve the tuning quality.
> > +
> > +.. Note:: Currently LSC is ignored in the CCM calculation. This can lead to
> > +    slight color deviations and will be improved in the future.
> > +
> > +Images shall be taken for multiple color temperatures and named
> > +``alsc_<temperature>k_#.dng``. ``#`` can be any number if multiple images are
> > +taken for the same color temperature.
> > +
> > +.. figure:: img/setup_lens_shading.jpg
> > +    :width: 50%
> > +
> > +    Sample setup using a flat panel light. In this specific setup, the camera
> > +    has to be moved close to the light due to the wide angle lens. This has the
> > +    downside that the angle to the light might get way too steep.
> I _think_ that the effect of that would be to increase the severity of the
> vignetting and so the tuning process would increase the gains to counter it
> more aggressively than would really be needed in a natural scene; is that
> right? Maybe that's worth a note box (or warning? Or whatever the orange one
> is)

Yes, that is my expectation. But as I really don't know the actual
impact of the angle and I don't know the impact of the increase in
distance to the light source I kept it quite vague. Now that I say it,
maybe we should actually compensate for the light decay due to distance
change. See https://en.wikipedia.org/wiki/Inverse-square_law . So in the
end I don't know exactly what to put in that warning.

> > +
> > +.. figure:: img/camshark-alsc.png
> > +    :width: 100%
> > +
> > +    A calibration image for lens shading correction.
> > +
> > +- Ensure the full sensor is lit. Especially with wide angle lenses it may be
> > +  difficult to get the full field of view lit homogeneously.
> > +- Ensure that the brightest area of the image does not contain any pixels that
> > +  reach more than 95% on any of the colors (can be checked with camshark).
> 
> 
> Ooh where does camshark show that?

There is a "Info" group, showing RGB and Mean values when you hover over
the image.
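(For a scripted check, something along these lines should also work — a
sketch that assumes the raw Bayer plane has already been loaded into a numpy
array, e.g. via rawpy, and that the CFA is RGGB; adjust the offsets and white
level for the sensor in use:)

```python
import numpy as np

def channel_maxima(bayer, white_level):
    """Return the per-channel maximum as a fraction of full scale,
    assuming a 2x2 RGGB colour filter array."""
    channels = {
        "R":  bayer[0::2, 0::2],
        "Gr": bayer[0::2, 1::2],
        "Gb": bayer[1::2, 0::2],
        "B":  bayer[1::2, 1::2],
    }
    return {name: float(plane.max()) / white_level
            for name, plane in channels.items()}

def is_safely_exposed(bayer, white_level, limit=0.95):
    """True if no colour channel exceeds ~95% of full scale."""
    return all(v <= limit
               for v in channel_maxima(bayer, white_level).values())
```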

> 
> > +
> > +Take a raw dng image for multiple color temperatures:
> > +    ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=alsc_3200k_#.dng``
> > +
> > +
> > +
> > +Capture images for color calibration
> > +------------------------------------
> > +
> > +To do the color calibration, raw images of the color checker need to be taken
> s/color checker/ColorChecker chart.

done

> > +for different light sources. These need to be named
> 
> > +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.
> > +
> > +For best results the following hints should be taken into account:
> > +  - Ensure that the 18% gray patch (third from the right in bottom line) is
> > +    roughly at 18% saturation for all channels (a mean of 16-20% should be fine)
> So I think that means you need to set the color temperature on your lamp and
> then tune the intensity until the saturation is met - is that right?

Yes, either by tuning the intensity or changing the exposure time. I
added a sentence for that.
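(This check can also be scripted once the grey patch has been cropped out. A
sketch, assuming the patch is available as a demosaiced, black-level-corrected
HxWx3 float array in [0, 1]:)

```python
import numpy as np

def grey_patch_means(patch):
    """Per-channel mean saturation of the cropped 18% grey patch."""
    return patch.reshape(-1, 3).mean(axis=0)

def patch_in_range(patch, lo=0.16, hi=0.20):
    """True if all channel means fall in the suggested 16-20% window."""
    means = grey_patch_means(patch)
    return bool(np.all((means >= lo) & (means <= hi)))
```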

> > +  - Ensure the color checker is homogeneously lit (ideally from 45 degrees above)
> > +  - No stray light from other light sources is present
> > +  - The color checker is not too small and not too big (so that neither lens
> > +    shading artifacts nor lens distortions are prevalent in the area of the
> > +    color chart)
> > +
> > +If no lux meter is at hand to precisely measure the lux level, a lux meter app
> > +on a mobile phone can provide a sufficient estimation.
> > +
> > +.. figure:: img/setup_calibration2.jpg
> > +    :width: 50%
> > +
> > +
> > +Run the tuning scripts
> > +----------------------
> > +
> > +After taking the calibration images, you should have a directory with all the
> > +tuning files. It should look something like this:
> > +
> > + ::
> > +
> > +   ../tuning-data/
> > +   ├── alsc_2500k_0.dng
> > +   ├── alsc_2500k_1.dng
> > +   ├── alsc_2500k_2.dng
> > +   ├── alsc_6500k_0.dng
> > +   ├── imx335_1000l_2500k_0.dng
> > +   ├── imx335_1200l_4000k_0.dng
> > +   ├── imx335_1600l_6000k_0.dng
> > +
> > +
> > +The tuning scripts are part of the libcamera source tree. After cloning the
> > +libcamera sources the necessary steps to create a tuning file are:
> > +
> > + ::
> > +
> > +  # install the necessary python packages
> > +  cd libcamera
> > +  python -m venv venv
> > +  source ./venv/bin/activate
> > +  pip install -r utils/tuning/requirements.txt
> > +
> > +  # run the tuning script
> > +  utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml
> 
> There needs to be an explanation of the creation of config.yaml here I think.
> 

Argh. I knew someone would spot that :-). The config file and code is
something that really needs some love. But you are soo right. I added a
comment that just copies the config-example.yaml.

> 
> Really good work in my opinion.

Thanks!

Best regards,
Stefan

> 
> > +
> > +After the tuning script has run, the tuning file can be tested with any
> > +libcamera-based application like `qcam`. To quickly switch to a specific tuning
> > +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.
> > +E.g.:
> > +
> > + ::
> > +
> > +    LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1
> > +
> > +
> > +Sample images
> > +-------------
> > +
> > +
> > +.. figure:: img/image-no-blc.png
> > +    :width: 800
> > +
> > +    Image without black level correction (and completely invalid color estimation).
> > +
> > +.. figure:: img/image-no-lsc-3000.png
> > +    :width: 800
> > +
> > +    Image without lens shading correction @ 3000k. The vignetting artifacts can
> > +    be clearly seen.
> > +
> > +.. figure:: img/image-no-lsc-6000.png
> > +    :width: 800
> > +
> > +    Image without lens shading correction @ 6000k. The vignetting artifacts can
> > +    be clearly seen.
> > +
> > +.. figure:: img/image-fully-tuned-3000.png
> > +    :width: 800
> > +
> > +    Fully tuned image @ 3000k
> > +
> > +.. figure:: img/image-fully-tuned-6000.png
> > +    :width: 800
> > +
> > +    Fully tuned image @ 6000k
> > +
> > diff --git a/Documentation/index.rst b/Documentation/index.rst
> > index 5442ae75dde7..991dcf2b66fb 100644
> > --- a/Documentation/index.rst
> > +++ b/Documentation/index.rst
> > @@ -16,6 +16,7 @@
> >      Developer Guide <guides/introduction>
> >      Application Writer's Guide <guides/application-developer>
> > +   Sensor Tuning Guide <guides/tuning/tuning>
> >      Pipeline Handler Writer's Guide <guides/pipeline-handler>
> >      IPA Writer's guide <guides/ipa>
> >      Tracing guide <guides/tracing>
> > diff --git a/Documentation/meson.build b/Documentation/meson.build
> > index 1ba40fdf67ac..8bf09f31afa0 100644
> > --- a/Documentation/meson.build
> > +++ b/Documentation/meson.build
> > @@ -135,6 +135,7 @@ if sphinx.found()
> >           'guides/ipa.rst',
> >           'guides/pipeline-handler.rst',
> >           'guides/tracing.rst',
> > +        'guides/tuning/tuning.rst',
> >           'index.rst',
> >           'lens_driver_requirements.rst',
> >           'python-bindings.rst',

