[libcamera-devel] Colour spaces

Naushir Patuck naush at raspberrypi.com
Tue Jun 1 08:16:14 CEST 2021


Hi David,

Thank you for sketching this out.  Colour space handling is indeed something
that folks using Raspberry Pi have been asking about.
I have a few questions; see inline:

On Fri, 28 May 2021 at 14:56, David Plowman <david.plowman at raspberrypi.com>
wrote:

> Hi again
>
> In the interests of trying to move this discussion forward, I'd like
> to propose an idea, based on the assumption that my previous email was
> along the right lines (debatable, I know)!
>
> So what I'd like to do is create something like this:
>
> struct ColorSpace {
>         enum class Standard {
>                 JFIF,
>                 SMPTE170M,
>                 REC709,
>                 REC2020,
>                 RAW,
>                 VIDEO
>         };
>         enum class Encoding {
>                 DEFAULT,
>                 REC601,
>                 REC709,
>                 REC2020
>         };
>

Are Standard and Encoding mutually exclusive?  Perhaps I am missing the
intention, but can we not combine them into one enum?
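Something like this is what I had in mind - only a sketch, with made-up
names, where each combined preset implies its YCbCr encoding:

```cpp
#include <cassert>

/*
 * Hypothetical combined enum (names made up): each preset implies
 * its encoding, so Standard and Encoding need not be stored
 * separately in the ColorSpace struct.
 */
enum class Preset { Raw, Jfif, Smpte170m, Rec709, Rec2020 };
enum class Encoding { None, Rec601, Rec709, Rec2020 };

/* The encoding is then derived from the preset rather than set. */
Encoding encodingOf(Preset preset)
{
	switch (preset) {
	case Preset::Jfif:
	case Preset::Smpte170m:
		return Encoding::Rec601; /* JFIF and SMPTE 170M both use BT.601 */
	case Preset::Rec709:
		return Encoding::Rec709;
	case Preset::Rec2020:
		return Encoding::Rec2020;
	default:
		return Encoding::None;
	}
}
```

Of course that trades flexibility for simplicity - overriding the encoding
independently would no longer be possible.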


>         enum class TransferFunction {
>                 DEFAULT,
>                 IDENTITY,
>                 SRGB,
>                 REC709
>         };
>         enum class Range {
>                 DEFAULT,
>                 FULL,
>                 LIMITED
>         };
>

Regarding the DEFAULT item in all the enums above, should we be explicit
about what it gives us?  Possibly we could remove the DEFAULT item and make
the first item always the default, like TransferFunction::IDENTITY and
Range::FULL for the above?
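In other words, something like this (just a sketch, keeping your names but
dropping DEFAULT):

```cpp
#include <cassert>

/*
 * Sketch without DEFAULT entries: the first enumerator is the
 * default, so a default-constructed ColorSpace is already in a
 * fully specified state.
 */
enum class TransferFunction { Identity, Srgb, Rec709 };
enum class Range { Full, Limited };

struct ColorSpace {
	TransferFunction transferFunction = TransferFunction::Identity;
	Range range = Range::Full;
};
```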

Thanks,
Naush
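
P.S. Regarding the Configure step you describe below, where VIDEO gets
turned into a concrete standard based on resolution, I imagine something
roughly like this (the exact thresholds are my guesses, not from your
proposal):

```cpp
#include <cassert>

/*
 * Rough sketch of the VIDEO -> concrete standard mapping: SD gets
 * SMPTE 170M, HD gets Rec.709, and anything above 1080p gets
 * Rec.2020. The thresholds here are illustrative only.
 */
enum class Standard { Jfif, Smpte170m, Rec709, Rec2020, Raw, Video };

Standard resolveVideo(unsigned int height)
{
	if (height > 1080)
		return Standard::Rec2020;
	if (height >= 720)
		return Standard::Rec709;
	return Standard::Smpte170m;
}
```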


>
>         Standard standard;
>         Encoding encoding;
>         TransferFunction transferFunction;
>         Range range;
> };
>
> This is fairly minimal, though it contains everything that the Pi
> (indeed most applications?) needs. Of course, adding more cases is
> trivial.
>
> It should feel familiar to the V4L2-cognoscenti, and as far as I can
> tell, the Android "dataspace" fits too. You choose your "standard" and
> can leave everything else at "default", or else you can override them
> as you wish.
>
> It's easy to use. We'd put it into the stream configuration, and it
> would get filled in according to the stream role - RAW for raw
> streams, JFIF for stills and VIDEO for video recording. The call to
> Configure would change VIDEO to one of SMPTE170M, REC709 or REC2020
> according to resolution.
>
> The pipeline handler of course gets the chance to change the colour
> spaces according to its own internal constraints, and ultimately it
> will get passed down to the V4L2 setFormat method. But note that
> libcamera application code wouldn't need to change at all; the
> (usually) "right" thing will simply happen by default.
>
> Any thoughts on the matter? Is this a way forward or have I misunderstood
> how it all works?
>
> Thanks!
> David
>
> On Sat, 22 May 2021 at 10:55, David Plowman
> <david.plowman at raspberrypi.com> wrote:
> >
> > Hi everyone
> >
> > I wanted to pick up this discussion on colour spaces again, as it
> > remains a bit of a gap in the API currently. I was wondering, has
> > anyone else had any time to consider this question?
> >
> > I think a lot of this comes down to what Android requires, so I've
> > been trying to look that up. Unfortunately I haven't really found any
> > resources that show me, in practice, what camera applications do.
> > (Can anyone enlighten me, or point me at helpful references?)
> >
> > Anyway, in the absence of clear examples, here are some of the
> > (possibly mistaken) impressions I've formed:
> >
> > 1. There's something called an "android dataspace" which seems to
> > cover the notion of colourspaces. It even seems to feature in each
> > stream of the "camera stream configuration".
> >
> > 2. The "dataspace" appears to be a portmanteau of various smaller
> > bitfields, representing standards, transfer functions, quantisation
> > range and so on - indeed, not so very unlike V4L2.
> >
> > 3. Previously there was some discussion of colour transforms, forward
> > matrices, reference illuminants and so on. I'm still of the view that
> > these are things you query but don't set. In fact, I suspect they're
> > provided mainly for writing DNG files - not unlike what we have in
> > dng_writer.cpp. (Is anyone actually familiar with this?)
> >
> > Any thoughts on this subject? If folks think this is broadly correct then
> > I'm happy trying to proceed along those lines - and otherwise, I of
> > course look forward to being corrected!
> >
> > Thanks
> > David
> >
> >
> > On Tue, 12 Jan 2021 at 02:01, Laurent Pinchart
> > <laurent.pinchart at ideasonboard.com> wrote:
> > >
> > > Hi David,
> > >
> > > On Thu, Jan 07, 2021 at 01:46:04PM +0000, David Plowman wrote:
> > > > Hi everyone
> > > >
> > > > I've just found myself wondering how I would signal to libcamera that
> > > > I want a particular YUV colour space, such as JFIF for jpeg, BT601 for
> > > > SD video, and so on. I haven't spotted a way to do this... have I
> > > > perhaps missed something? (I notice a PixelFormatInfo class with a
> > > > ColourEncoding enum, but it seems limited.)
> > >
> > > You haven't missed anything, it's not there.
> > >
> > > > If there really isn't a way to do this, then I suppose all the usual
> > > > questions apply. Should it be a control? Or should it go in the stream
> > > > configuration? And what values - start with the V4L2 ones?
> > > >
> > > > Anyone have any thoughts on this?
> > >
> > > Quite a few (sorry :-)). Please keep in mind that my knowledge is
> > > limited regarding this topic though, so feel free to correct me when I'm
> > > wrong.
> > >
> > > First of all, colour space is an umbrella term that is often abused to
> > > mean different things, so I'd like to know which parts you're interested
> > > in (it may well be all of them).
> > >
> > > I'm looking at BT.709, which nicely summarizes the colour-related
> > > information in sections 1 and 3 (section 2 is not related to colours,
> > > section 4 contains related information in the quantization levels, but
> > > those are also present in section 3 as far as I can tell):
> > >
> > > 1. Opto-electronic conversion
> > >
> > > This specifies the opto-electronic transfer characteristics and the
> > > chromaticity coordinates (in CIE xyY space) of R, G, B and the D65
> > > reference white.
> > >
> > > 3. Signal format
> > >
> > > This specifies the transfer function (expressed as a gamma value), the
> > > colour encoding and the quantization.
> > >
> > >
> > > To obtain BT.709 from a given camera sensor, we need to take the
> > > sensor's colour characteristics into account to calculate the correct
> > > colour transformation matrix (and the tone mapping curve) to obtain the
> > > BT.709 primaries. This could be done inside libcamera with an API to
> > > specify the desired "colour space", but I don't think that's the best
> > > option. Exposing the colour characteristics of the sensor is needed for
> > > correct processing of RAW data, and with manual control of the colour
> > > transformation matrix and tone mapping curve, we could then achieve any
> > > output colour space.
> > >
> > > We already expose the colour transformation matrix, so we're missing the
> > > tone mapping curve as a control, and the sensor colour characteristics
> > > as a property. The latter are exposed in Android as a combination of a
> > > reference illuminant and a transform matrix. It's actually multiple
> > > matrices, one to map CIE XYZ to the colour space of the reference sensor
> > > (a.k.a. golden module) and one to map the reference sensor colour space
> > > to the colour space of the device's sensor - I assume the latter will be
> > > an identity matrix when devices don't undergo per-device calibration.
> > > There's also a forward matrix that maps white balanced colours (using
> > > the reference illuminant) from the reference sensor colour space to the
> > > CIE XYZ D50 white point. I'm sure I'm missing a few subtle points.
> > >
> > > This approach makes sense as it's very flexible, but it's also hard to
> > > use, and we really need a helper class to deal with this and compute the
> > > colour correction matrix for the sensor to produce standard preset
> > > colourspaces.
> > >
> > > I suppose we can deal with quantization in either the colour
> > > transformation matrix or the RGB to YUV matrix, and I'm not sure what's
> > > best, or even what hardware typically supports. Does the colour
> > > transformation typically have an offset vector, or is it only a
> > > multiplication matrix?
> > >
> > > --
> > > Regards,
> > >
> > > Laurent Pinchart
>