[PATCH v1 1/1] Documentation: Add first version of the tuning-guide
Stefan Klug
stefan.klug at ideasonboard.com
Tue Aug 13 17:49:04 CEST 2024
Hi Kieran,
Thanks again.
On Wed, Aug 07, 2024 at 11:52:10AM +0100, Kieran Bingham wrote:
> Hi Stefan,
>
> Resuming at ~50%
>
>
> Quoting Stefan Klug (2024-08-06 15:44:41)
> > This patch adds an initial version of the tuning guide for libcamera. The
> > process was established on the imx8mp and will be improved as we do more
> > tuning for different ISPs with our own tooling.
> >
> > Signed-off-by: Stefan Klug <stefan.klug at ideasonboard.com>
> > ---
> > Documentation/conf.py | 4 +-
> > Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++
> > Documentation/index.rst | 1 +
> > Documentation/meson.build | 1 +
> > 4 files changed, 280 insertions(+), 2 deletions(-)
> > create mode 100644 Documentation/guides/tuning/tuning.rst
> >
>
> <snip>
>
> > +Overview of the process
> > +-----------------------
> > +
> > +The tuning process of libcamera consists of the following steps:
> > +
> > + 1. Capture images for the tuning steps at different color temperatures
> > + 2. Configure the tuning process
> > + 3. Run the tuning script to create a corresponding tuning file
> > +
> > +
> > +Taking raw images with defined exposure/gain
> > +--------------------------------------------
> > +
> > +For all the tuning steps you need to capture raw images with a defined exposure
> > +time and gain. It is crucial that no pixel is saturated on any channel. At the
> > +same time the images should not be too dark. Aim for an exposure time such that
> > +the brightest pixel is roughly 90% saturated. Finding the correct exposure time
> > +is currently a manual process; we are working on tooling to aid in that
> > +process. After finding the correct exposure settings, the easiest way to
> > +capture a raw image is with the ``cam`` application that is installed as part
> > +of a regular libcamera install.
> > +
> > +Create a ``constant-exposure.yaml`` file with the following content:
> > +
> > +.. code:: yaml
> > +
> > + frames:
> > + - 0:
> > + AeEnable: false
> > + ExposureTime: 1000
>
> AnalogueGain? Or should it always be expected to be set to 1 ?
>
> Even if set to 1 I'd put that in the yaml example I think so it's
> "there" and tunable/changeable.
done
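For v2 the snippet now carries the gain explicitly, something like:

    frames:
      - 0:
          AeEnable: false
          ExposureTime: 1000
          AnalogueGain: 1.0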
>
> > +
> > +Then capture one or more images using the following command:
> > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=image_#.dng``
>
> One issue I see here is that different lights with different colour
> temperatures will likely need different exposure/gain values. It's
> possibly worth stating that somewhere so there's no assumption that
> one exposure value will suit all captures.
>
> My light box for instance definitely has different brightnesses across
> the different light sources.
>
> I see you've already stated we're working on improvements here above
> though so I think we can get started with this.
>
> Should we specify the stream size on the raw stream? I expect there will
> often be multiple 'modes' which may or may not be cropped. We should
> probably state that the captures should ideally be from a mode with the
> full sensor area exposed.
We enter undefined space here :). At the moment we can't adjust the LSC
for different modes. For CCM and WB the area shouldn't matter. For LSC
it does, and the user should do the calibration for the mode they use in
their application. Shall we mention that, or just fix it as soon as we
can adjust the LSC?
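If we do mention it, pinning the sensor mode on the capture command line
would look something like this (the size is just an illustration, not a
recommendation):

    cam -c 1 --script constant-exposure.yaml -C \
        -s role=raw,width=2592,height=1944 --file=image_#.dng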
>
> > +
> > +
> > +Capture images for lens shading correction
> > +------------------------------------------
> > +
> > +To be able to correct lens shading artifacts, images of a homogeneously lit
> > +neutral gray background are needed. Ideally an integration sphere is used for
> > +that task.
> > +
> > +As a fallback, images of an area light or a white wall can be used. If there are
>
> 'of an area light' ? I can't tell if you are talking about taking
> pictures of a flat panel led 'light' there, or if you are referring to a light
> or white wall.
>
> I assume you were meaning 'a light or white' wall.
>
> I have found that really challenging when I've tried that in fact - and
> found it far easier to get captures of a large flat panel LED with
> diffuser. Might be another method to recommend or highlight.
>
> <edit, now I see the images I realise you call an 'area light' what I
> would call a flat panel light>
I was only told that a white wall works, but in my house I don't have
such a beautifully homogeneous white wall :-). As LED lights are so
cheap, I'll remove the wall and replace the area light with a flat panel
light.
I think I stuck to area light due to my interest in 3D graphics:
https://en.wikipedia.org/wiki/Computer_graphics_lighting#Area
>
>
>
> > +multiple images for the same color temperature, the tuning scripts will average
> > +the tuning results, which can further improve the tuning quality.
> > +
> > +.. Note:: Currently LSC is ignored in the CCM calculation. This can lead to
> > + slight color deviations and will be improved in the future.
> > +
> > +Images shall be taken for multiple color temperatures and named
> > +``alsc_<temperature>k_#.dng``, where ``#`` can be any number if multiple
> > +images are taken for the same color temperature.
> > +
> > +.. figure:: img/setup_lens_shading.jpg
> > + :width: 50%
> > +
> > + Sample setup using an area light. Due to the wide angle lens the camera has
> > + to be moved closely in front of the light. This is not a perfect solution as
> > + the angle to the light might get way too large.
>
> I wouldn't say 'due to the wide angle lens' as not all cameras have wide
> angle lenses, but indeed - a draw back to this method is that cameras
> with a wide angle lens may have a field of view that captures beyond the
> light.
I'd like to express that it is not necessarily good to move the camera
that close to the light. What about:
Sample setup using a flat panel light. In this specific setup, the camera has to
be moved close to the light due to the wide angle lens. This has
the downside that the angle to the light might get way too steep.
I also replaced 'large' with 'steep'. Sounds more natural to me.
>
>
> > +
> > +.. figure:: img/camshark-alsc.png
> > + :width: 100%
> > +
> > + A calibration image for lens shading correction.
>
> This shows using 'camshark' to view the image, which isn't too
> unreasonable as a viewer.
>
> I'm really looking forward to when that can be the capture and realtime
> adjustment tool too ... but I think it's fine to continue with cam as
> the capture tool for the moment, even if we are highlighting that
> previewing/determining the correct exposure/gains might be worthwhile
> doing with camshark as it has fast feedback and viewing of the scene.
>
> > +
> > +- Ensure the full sensor is lit. Especially with wide angle lenses it may be
> > + difficult to get the full field of view lit homogeneously.
> > +- Ensure no color channel is saturated (can be checked with camshark)
>
> Can you be more specific here perhaps? Ensure that the brightest area of
> the image does not contain any pixels that reach 100% on any of the
> colors.
>
> Maybe we should move the % values in camshark on the color info line to
> a saturation line to be explicit (with just a 'max' of any of the
> channels perhaps)
Yes, makes sense. Max is a bit expensive though. Maybe in the histogram
when we do the calculations anyway.
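Until then, a rough offline check on a captured DNG also works. A minimal
sketch, assuming rawpy and numpy are available (this is not part of our
tooling, and it ignores the black level for simplicity):

    #!/usr/bin/env python3
    # check-saturation.py - print the per-channel peak of a raw DNG
    import sys

    import numpy as np
    import rawpy

    with rawpy.imread(sys.argv[1]) as raw:
        img = raw.raw_image_visible.astype(np.float64)
        colors = raw.raw_colors_visible
        # one entry per CFA channel, e.g. 'RGBG'
        for ch, name in enumerate(raw.color_desc.decode()):
            peak = img[colors == ch].max() / raw.white_level
            print(f'{name}{ch}: brightest pixel at {peak:.1%} of full scale')

If every channel stays below ~90%, the exposure should be fine.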
>
> > +
> > +Take a raw DNG image for multiple color temperatures:
> > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=alsc_3200k_#.dng``
> > +
> > +
>
> I'm afraid again, I think it's important here that the
> constant-exposure.yaml should be specific to each light source!
>
I added a note and renamed the yaml to
constant-exposure-<temperature>.yaml
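With that, the capture command from above becomes e.g.:

    cam -c 1 --script constant-exposure-3200k.yaml -C -s role=raw --file=alsc_3200k_#.dng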
> I look forward to when the 'tuning tool' can analyse the image and do
> some 'automatic' AE loop on the RAW with manual controls to handle this.
>
> Maybe that's something we should already throw together in (q)cam as well...
> doesn't even have to run every frame ... but I imagine this will be
> better handled in camshark first.
Yes, you will need some tool to either detect the macbeth chart or to
let you specify where to measure....
>
>
> > +Capture images for color calibration
> > +------------------------------------
> > +
> > +To do the color calibration, raw images of the color checker need to be taken
> > +for different light sources. These need to be named
> > +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.
> > +
> > +For best results the following hints should be taken into account:
> > + - Ensure that the 18% gray patch (third from the right in the bottom row) is
> > + roughly at 18% saturation for all channels (a mean of 16-20% should be fine)
> > + - Ensure the color checker is homogeneously lit (ideally from 45 degrees above)
> > + - No stray light from other light sources is present
> > + - The color checker is not too small and not too big (so that neither lens
> > + shading artifacts nor lens distortions prevail in the area of the
> > + color chart)
> > +
> > +If no lux meter is at hand to precisely measure the lux level, a lux meter app
> > +on a mobile phone can provide a sufficient estimation.
>
> Or ... hopefully in the future - another libcamera calibrated device ;-)
>
> (I'm imagining an RPi zero or such with a camera in our tuning boxes in
> the future)
Sure... might be risky at the same time :-)
https://deepgram.com/learn/when-ai-eats-itself
>
>
> > +.. figure:: img/setup_calibration2.jpg
> > + :width: 50%
> > +
> > +
> > +Run the tuning scripts
> > +----------------------
> > +
> > +After taking the calibration images, you should have a directory with all the
> > +captured images. It should look something like this:
> > +
> > + ::
> > +
> > + ../tuning-data/
> > + ├── alsc_2500k_0.dng
> > + ├── alsc_2500k_1.dng
> > + ├── alsc_2500k_2.dng
> > + ├── alsc_6500k_0.dng
> > + ├── imx335_1000l_2500k_0.dng
> > + ├── imx335_1200l_4000k_0.dng
> > + ├── imx335_1600l_6000k_0.dng
>
> Why do the colour charts have imx335 but the lsc not ? Should we be more
> consistent?
Funny. The first sample files had that pattern and I didn't really
question it. We should enforce a sensor name and check that it's the same
on every image. I'll put that on my todo list.
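As a rough sketch of what that check could look like (nothing of this
exists yet, the pattern is made up to match the file names above):

    import re
    import sys
    from pathlib import Path

    # accept <sensor>_alsc_<temp>k_<n>.dng and <sensor>_<lux>l_<temp>k_<n>.dng
    PATTERN = re.compile(
        r'^(?P<sensor>[a-z0-9]+)_(?:alsc_|(?P<lux>\d+)l_)(?P<temp>\d+)k_\d+\.dng$')

    sensors = set()
    for f in sorted(Path(sys.argv[1]).glob('*.dng')):
        m = PATTERN.match(f.name)
        if not m:
            print(f'unexpected file name: {f.name}')
            continue
        sensors.add(m.group('sensor'))

    if len(sensors) > 1:
        print(f'inconsistent sensor names: {sorted(sensors)}')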
>
> > +
> > +
> > +The tuning scripts are part of the libcamera source tree. After cloning the
> > +libcamera sources, the necessary steps to create a tuning file are:
> > +
> > + ::
> > +
> > + # install the necessary python packages
> > + cd libcamera
> > + python -m venv venv
> > + source ./venv/bin/activate
> > + pip install -r utils/tuning/requirements.txt
> > +
> > + # run the tuning script
> > + utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml
> > +
> > +After the tuning script has run, the tuning file can be tested with any
> > +libcamera-based application like ``qcam``. To quickly switch to a specific tuning
> > +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.
> > +E.g.:
> > +
> > + ::
> > +
> > + LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1
> > +
> > +
> > +Sample images
> > +-------------
> > +
> > +
> > +.. figure:: img/image-no-blc.png
> > + :width: 800
> > +
> > + Image without black level correction (and completely invalid color estimation).
> > +
> > +.. figure:: img/image-no-lsc-3000.png
> > + :width: 800
> > +
> > + Image without lens shading correction @ 3000k. The vignetting artifacts can
> > + be clearly seen.
> > +
> > +.. figure:: img/image-no-lsc-6000.png
> > + :width: 800
> > +
> > + Image without lens shading correction @ 6000k. The vignetting artifacts can
> > + be clearly seen.
> > +
> > +.. figure:: img/image-fully-tuned-3000.png
> > + :width: 800
> > +
> > + Fully tuned image @ 3000k
> > +
> > +.. figure:: img/image-fully-tuned-6000.png
> > + :width: 800
> > +
> > + Fully tuned image @ 6000k
>
> I think we'll add more sample images and descriptions on what to look
> for to an untrained eye in the future too, so I like this section ;-)
>
> I wonder if we should highlight or 'circle' the areas of interest like
> the vignetting? (or other interesting points in future sample images).
> But I don't think we need to do this now.
>
> > +
> > diff --git a/Documentation/index.rst b/Documentation/index.rst
> > index 5442ae75dde7..991dcf2b66fb 100644
> > --- a/Documentation/index.rst
> > +++ b/Documentation/index.rst
> > @@ -16,6 +16,7 @@
> >
> > Developer Guide <guides/introduction>
> > Application Writer's Guide <guides/application-developer>
> > + Sensor Tuning Guide <guides/tuning/tuning>
> > Pipeline Handler Writer's Guide <guides/pipeline-handler>
> > IPA Writer's guide <guides/ipa>
> > Tracing guide <guides/tracing>
> > diff --git a/Documentation/meson.build b/Documentation/meson.build
> > index 30d395234952..471eabcac344 100644
> > --- a/Documentation/meson.build
> > +++ b/Documentation/meson.build
> > @@ -77,6 +77,7 @@ if sphinx.found()
> > 'guides/ipa.rst',
> > 'guides/pipeline-handler.rst',
> > 'guides/tracing.rst',
> > + 'guides/tuning/tuning.rst',
> > 'index.rst',
> > 'lens_driver_requirements.rst',
> > 'python-bindings.rst',
> > --
> > 2.43.0
>
> Oh I got there. Looking good though! Thanks!
Phew, thank you. I'll post a v2 soon.
>
> --
> Kieran