[libcamera-devel] Colour spaces
David Plowman
david.plowman at raspberrypi.com
Sat May 22 11:55:55 CEST 2021
Hi everyone
I wanted to pick up this discussion on colour spaces again, as it
remains a bit of a gap in the API currently. I was wondering, has
anyone else had any time to consider this question?
I think a lot of this comes down to what Android requires, so I've
been trying to look that up. Unfortunately I haven't really found any
resources that show me, in practice, what camera applications do.
(Can anyone enlighten me, or point me at helpful references?)
Anyway, in the absence of clear examples, here are some of the
(possibly mistaken) impressions I've formed:
1. There's something called an "android dataspace" which seems to
cover the notion of colourspaces. It even seems to feature in each
stream of the "camera stream configuration".
2. The "dataspace" appears to be a portmanteau of various smaller
bitfields representing the colour standard, transfer function,
quantisation range and so on, indeed not so very unlike V4L2.
3. Previously there was some discussion of colour transforms, forward
matrices, reference illuminants and so on. I'm still of the view that
these are things you query but don't set. In fact, I suspect they're
provided mainly for writing DNG files - not unlike what we have in
dng_writer.cpp. (Is anyone actually familiar with this?)
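To make point 2 a little more concrete, here is a rough sketch of what
such a bitfield portmanteau might look like. The field layout mimics
the way an Android dataspace packs separate standard/transfer/range
bitfields into one 32-bit value, but the shift positions and
enumerator values below are my own inventions for illustration, not
the real HAL constants:

```cpp
#include <cstdint>

namespace sketch {

/* Assumed field positions - illustrative only, not the Android values. */
constexpr uint32_t kStandardShift = 16;
constexpr uint32_t kTransferShift = 22;
constexpr uint32_t kRangeShift = 27;

enum Standard : uint32_t { StandardBT601 = 1, StandardBT709 = 2 };
enum Transfer : uint32_t { TransferSRGB = 1, TransferSMPTE170M = 2 };
enum Range : uint32_t { RangeFull = 1, RangeLimited = 2 };

constexpr uint32_t makeDataspace(Standard s, Transfer t, Range r)
{
	return (s << kStandardShift) | (t << kTransferShift) |
	       (r << kRangeShift);
}

/* Each component can be recovered again by masking, much like V4L2's
 * separate colorspace, xfer_func and quantization fields. */
constexpr uint32_t standardOf(uint32_t ds)
{
	return (ds >> kStandardShift) & 0x3f;
}

} /* namespace sketch */
```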
Any thoughts on this subject? If folks think this is broadly correct then
I'm happy trying to proceed along those lines - and otherwise, I of
course look forward to being corrected!
Thanks
David
On Tue, 12 Jan 2021 at 02:01, Laurent Pinchart
<laurent.pinchart at ideasonboard.com> wrote:
>
> Hi David,
>
> On Thu, Jan 07, 2021 at 01:46:04PM +0000, David Plowman wrote:
> > Hi everyone
> >
> > I've just found myself wondering how I would signal to libcamera that
> > I want a particular YUV colour space, such as JFIF for jpeg, BT601 for
> > SD video, and so on. I haven't spotted a way to do this... have I
> > perhaps missed something? (I notice a PixelFormatInfo class with a
> > ColourEncoding enum, but it seems limited.)
>
> You haven't missed anything, it's not there.
>
> > If there really isn't a way to do this, then I suppose all the usual
> > questions apply. Should it be a control? Or should it go in the stream
> > configuration? And what values - start with the V4L2 ones?
> >
> > Anyone have any thoughts on this?
>
> Quite a few (sorry :-)). Please keep in mind that my knowledge is
> limited regarding this topic though, so feel free to correct me when I'm
> wrong.
>
> First of all, colour space is an umbrella term that is often abused to
> mean different things, so I'd like to know which parts you're interested
> in (it may well be all of them).
>
> I'm looking at BT.709, which nicely summarizes the colour-related
> information in sections 1 and 3 (section 2 is not related to colours,
> section 4 contains related information on the quantization levels, but
> those are also present in section 3 as far as I can tell):
>
> 1. Opto-electronic conversion
>
> This specifies the opto-electronic transfer characteristics and the
> chromaticity coordinates (in CIE xyY space) of R, G, B and the D65
> reference white.
>
> 3. Signal format
>
> This specifies the transfer function (expressed as a gamma value), the
> colour encoding and the quantization.
>
>
> To obtain BT.709 from a given camera sensor, we need to take the
> sensor's colour characteristics into account to calculate the correct
> colour transformation matrix (and the tone mapping curve) to obtain the
> BT.709 primaries. This could be done inside libcamera with an API to
> specify the desired "colour space", but I don't think that's the best
> option. Exposing the colour characteristics of the sensor is needed for
> correct processing of RAW data, and with manual control of the colour
> transformation matrix and tone mapping curve, we could then achieve any
> output colour space.
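[As a rough illustration of that two-step pipeline - the CCM values
would come from the sensor's colour characteristics, so only the shape
of the computation is sketched here; the OETF is the one from BT.709
section 1, while the function names are invented:]

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<Vec3, 3>;

/* Apply a 3x3 colour correction matrix to a sensor RGB triple. The
 * matrix would be derived from the sensor characteristics to reach
 * the target (e.g. BT.709) primaries; none is hard-coded here. */
Vec3 applyCcm(const Mat3 &ccm, const Vec3 &rgb)
{
	Vec3 out{};
	for (unsigned int i = 0; i < 3; i++)
		for (unsigned int j = 0; j < 3; j++)
			out[i] += ccm[i][j] * rgb[j];
	return out;
}

/* Tone mapping step: the BT.709 opto-electronic transfer function,
 * V = 1.099 L^0.45 - 0.099 for L >= 0.018, V = 4.5 L below that. */
double bt709Oetf(double l)
{
	return l < 0.018 ? 4.5 * l : 1.099 * std::pow(l, 0.45) - 0.099;
}
```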
>
> We already expose the colour transformation matrix, so we're missing the
> tone mapping curve as a control, and the sensor colour characteristics
> as a property. The latter are exposed in Android as a combination of a
> reference illuminant and a transform matrix. It's actually multiple
> matrices, one to map CIE XYZ to the colour space of the reference sensor
> (a.k.a. golden module) and one to map the reference sensor colour space
> to the colour space of the device's sensor - I assume the latter will be
> an identity matrix when devices don't undergo per-device calibration.
> There's also a forward matrix that maps white balanced colours (using
> the reference illuminant) from the reference sensor colour space to the
> CIE XYZ D50 white point. I'm sure I'm missing a few subtle points.
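[A sketch of how those matrices chain together; the function and
variable names are invented here for illustration, loosely following
the Android/DNG model described above:]

```cpp
#include <array>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<Vec3, 3>;

Mat3 mul(const Mat3 &a, const Mat3 &b)
{
	Mat3 out{};
	for (unsigned int i = 0; i < 3; i++)
		for (unsigned int j = 0; j < 3; j++)
			for (unsigned int k = 0; k < 3; k++)
				out[i][j] += a[i][k] * b[k][j];
	return out;
}

/*
 * colorTransform: CIE XYZ under the reference illuminant to the
 * reference ("golden module") sensor space.
 * calibration: reference sensor space to this device's sensor space,
 * expected to be the identity without per-device calibration.
 */
Mat3 xyzToDeviceSensor(const Mat3 &calibration, const Mat3 &colorTransform)
{
	return mul(calibration, colorTransform);
}
```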
>
> This approach makes sense as it's very flexible, but it's also hard to
> use, and we really need a helper class to deal with this and compute the
> colour correction matrix for the sensor to produce standard preset
> colourspaces.
>
> I suppose we can deal with quantization in either the colour
> transformation matrix or the RGB to YUV matrix, and I'm not sure what's
> best, or even what hardware typically supports. Does the colour
> transformation typically have an offset vector, or is it only a
> multiplication matrix?
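[For reference, the standard full-range R'G'B' to limited-range
Y'CbCr conversion is affine rather than purely linear, which suggests
an offset vector is needed somewhere. A sketch using the BT.709 luma
coefficients and 8-bit limited-range quantization:]

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

/*
 * Full-range R'G'B' in [0, 1] to 8-bit limited-range Y'CbCr using the
 * BT.709 luma coefficients. The limited-range scaling makes the
 * conversion "matrix * rgb + offset": the offsets {16, 128, 128}
 * cannot be folded into a pure 3x3 multiplication.
 */
Vec3 rgbToYCbCr(const Vec3 &rgb)
{
	constexpr double kr = 0.2126, kb = 0.0722;
	constexpr double kg = 1.0 - kr - kb;

	double y = kr * rgb[0] + kg * rgb[1] + kb * rgb[2];
	double cb = 0.5 * (rgb[2] - y) / (1.0 - kb);
	double cr = 0.5 * (rgb[0] - y) / (1.0 - kr);

	return { 16.0 + 219.0 * y, 128.0 + 224.0 * cb, 128.0 + 224.0 * cr };
}
```

[White (1,1,1) maps to (235,128,128) and black to (16,128,128),
matching the BT.709 quantization levels.]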
>
> --
> Regards,
>
> Laurent Pinchart