[libcamera-devel] Colour space adjustment

David Plowman david.plowman at raspberrypi.com
Thu Sep 8 16:53:09 CEST 2022


Hi Laurent

Thanks for the reply! As you suggested, I thought it might be helpful to
demonstrate some code that I think some people might write. I'm afraid I
was thinking in Python (seems to be all I do these days!) but I don't think
it makes much difference. It's not entirely unlike some of the code you might
find in Picamera2.

Anyway, here we go. Actually all the examples are somewhat similar in
nature.

The first is a function to set the same colour space on all our streams (as
we do on the Pi, though I imagine we can't be the only platform like this).
Anyway, here's what I would have used:

def set_colour_space(camera_config, colour_space):
    for stream_config in camera_config:
        stream_config.color_space = colour_space
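
For context, the sort of call site I have in mind is below (the stream roles
and setup are just an illustration, any would do):

config = camera.generate_configuration([libcamera.StreamRole.Viewfinder,
                                        libcamera.StreamRole.VideoRecording])
set_colour_space(config, libcamera.ColorSpace.Rec709())
config.validate()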

Here's what I think I need to use now:

RGB_FORMATS = [libcamera.PixelFormat(name)
               for name in ('RGB888', 'BGR888', 'XRGB8888', 'XBGR8888')]
# have I got all the RGB formats here?

def set_colour_space(camera_config, colour_space):
    for stream_config in camera_config:
        stream_config.color_space = colour_space
        if stream_config.pixel_format in RGB_FORMATS:
            # is it YCbCrEncoding.Null or YCbCrEncoding.None?
            stream_config.color_space.ycbcrEncoding = \
                libcamera.ColorSpace.YCbCrEncoding.Null
            # is there a Range.Null, or is it OK to use Range.Linear even
            # though RGB isn't using it?
            stream_config.color_space.range = \
                libcamera.ColorSpace.Range.Null

In the next example, a user just wants to set the Rec2020 colour space on
both their streams, whatever they are. Previously they could have used:

camera_config.at(0).color_space = libcamera.ColorSpace.Rec2020()
camera_config.at(1).color_space = libcamera.ColorSpace.Rec2020()

but now I think this might be "more correct":

RGB_FORMATS = [libcamera.PixelFormat(name)
               for name in ('RGB888', 'BGR888', 'XRGB8888', 'XBGR8888')]
# have I got all the RGB formats here?

RGB_REC2020 = libcamera.ColorSpace(libcamera.ColorSpace.Rec2020().primaries,
                                   libcamera.ColorSpace.Rec2020().transferFunction,
                                   libcamera.ColorSpace.YCbCrEncoding.Null,
                                   libcamera.ColorSpace.Range.Null)  # or do I want Linear?
# RGB_REC2020 is a version of Rec2020 specially for RGB streams

camera_config.at(0).color_space = (RGB_REC2020
                                   if camera_config.at(0).pixel_format in RGB_FORMATS
                                   else libcamera.ColorSpace.Rec2020())
camera_config.at(1).color_space = (RGB_REC2020
                                   if camera_config.at(1).pixel_format in RGB_FORMATS
                                   else libcamera.ColorSpace.Rec2020())

Final example. If my configuration changes, have I now got different colour
spaces? Previously:

colour_spaces = [stream_config.color_space for stream_config in camera_config]
if camera_config.validate() == libcamera.CameraConfiguration.Status.Adjusted:
    for stream_config, colour_space in zip(camera_config, colour_spaces):
        if stream_config.color_space != colour_space:
            print("Oops! Colour space has changed!")

And now I think this works:

RGB_FORMATS = [libcamera.PixelFormat(name)
               for name in ('RGB888', 'BGR888', 'XRGB8888', 'XBGR8888')]
# have I got all the RGB formats here?

colour_spaces = [stream_config.color_space for stream_config in camera_config]
if camera_config.validate() == libcamera.CameraConfiguration.Status.Adjusted:
    for stream_config, colour_space in zip(camera_config, colour_spaces):
        if stream_config.pixel_format in RGB_FORMATS:
            colour_space.ycbcrEncoding = \
                libcamera.ColorSpace.YCbCrEncoding.Null
            colour_space.range = libcamera.ColorSpace.Range.Null  # or Linear?
        if stream_config.color_space != colour_space:
            print("Oops! Colour space has changed!")

I guess my point isn't even about the extra bits of code: it's not that much,
and I expect someone could make it look nicer.
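
For instance, something along these lines would at least hide the RGB
special-casing in one place. It's only a sketch of what I mean, reusing the
RGB_FORMATS list from above and assuming the Null values are indeed what the
new validation wants for RGB:

def adjust_for_rgb(colour_space, pixel_format):
    # Return a copy of colour_space tweaked the way I believe validate()
    # now adjusts it for RGB pixel formats.
    cs = libcamera.ColorSpace(colour_space.primaries,
                              colour_space.transferFunction,
                              colour_space.ycbcrEncoding,
                              colour_space.range)
    if pixel_format in RGB_FORMATS:
        cs.ycbcrEncoding = libcamera.ColorSpace.YCbCrEncoding.Null
        cs.range = libcamera.ColorSpace.Range.Null  # or Linear?
    return cs

Then the comparisons above become one line per stream, but the point stands
that I'd rather not need any of this.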

I think my problem is that it feels like we've created an API with little
traps specially designed to catch out users who haven't read the
documentation very carefully. There are nasty little "have I done the right
thing?" questions that, to be honest, even I'm not totally sure about. And
I would have had to go and read up what I needed to do and spend time
trying to figure it all out when previously, this simply wasn't necessary.

Basically, it all seems very user-unfriendly to me!

Thanks - and looking forward to other people's views on this!

David


On Wed, 7 Sept 2022 at 01:32, Laurent Pinchart <
laurent.pinchart at ideasonboard.com> wrote:

> Hi David,
>
> On Tue, Sep 06, 2022 at 05:15:58PM +0100, David Plowman wrote:
> > Hi Laurent
> >
> > I thought I might try and take a step back first and discuss the ideas
> > before launching into the implementation details...
> >
> > Firstly, when I implemented the ColorSpace class I decided to include
> > all those other bits (transfer function, YCbCr matrix, range) in the
> > class because it seemed better to group things together. Of course a
> > colour space is strictly just the primaries, but given that certain
> > values are expected for those other things, it seemed good to include
> > them. Uncontroversial so far, I hope!!
>
> So far, so good. What's in a name, though? All this would be simpler if
> the term "color space" wasn't abused to mean the association of a color
> space and a color model. I don't think we can rename the world, we'll
> likely have to use "color space" in the sense you've defined with the
> ColorSpace class.
>
> > Next, looking at V4L2, because of its history it's in the position of
> > having several different ways to represent the same colour space. I
> > found this very painful and thought that "one representation for one
> > colour space" is the approach you would choose if you could. Maybe
> > that's more controversial, but I really liked the "one representation"
> > idea and grouping all the fields together, so that's where I landed -
> > for us, a ColorSpace includes all four things, and any standard
> > ColorSpace is encoded by just one combination of values.
>
> The way I see it, "standard color space" are "presets", names that can
> be associated with particular values of the four fields. The association
> is unidirectional though, from a name to the individual fields, as we
> can't prevent anyone (including standardization bodies or private
> organizations) from coming up with a different name that means the same
> thing. In that case, I would prefer using a single standard color space
> name in libcamera, without aliases. If ACME Corp. decided to use ACMERGB
> with a definition identical to sRGB, I would use sRGB through libcamera
> and ignore ACMERGB. It could perhaps be mentioned in the documentation,
> but that's all, we wouldn't have a ColorSpace::AcmeRGB.
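
To spell the preset idea out in the same Python-ish terms as my examples
above (I'm guessing at how the field enums are spelled in the bindings),
Rec709 would just be a name for one particular combination of the four
fields, with no reverse mapping from field values back to a name:

rec709 = libcamera.ColorSpace(libcamera.ColorSpace.Primaries.Rec709,
                              libcamera.ColorSpace.TransferFunction.Rec709,
                              libcamera.ColorSpace.YCbCrEncoding.Rec709,
                              libcamera.ColorSpace.Range.Limited)
assert rec709 == libcamera.ColorSpace.Rec709()
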
>
> > I'm also of the view - others may disagree - that a colour space is a
> > "platonic thing". It exists "out there" and doesn't  change. So this
> > led to those constant ColorSpace values, Jpeg (now Sycc), Rec709 etc.
>
> That I don't really agree with. There are standard color spaces that are
> well-defined, yes, but a combination of four ColorSpace fields that do
> not correspond to one of the standard color spaces is still a color
> space, even if it's not standard.
>
> > and the intention that most (almost all) applications don't care about
> > the individual fields, they'll just choose one of the "standard"
> > values, and which then doesn't/shouldn't change (unless your choice is
> > unsupported).
>
> I agree that most applications won't and shouldn't need to care about
> details, yes.
>
> > The consequence of all this is that it's easy to test if you have a
> > particular colour space (for example Rec709). You use "== Rec709". If
> > a pipeline handler or an application wants to check if they still have
> > Rec709, it's just "== Rec709". One colour space, one representation.
> > But it seems to have become more complicated now.
> >
> > On a slightly different topic, but still on a philosophical note, I
> > think I disagree that sYCC doesn't make sense for RGB streams. It
> > would be like saying you shouldn't ask for an RGB stream for Rec2020
> > because Rec2020 defines a YCbCr matrix. The user would need to know
> > exactly which fields of the ColorSpace to overwrite (to force it to
> > "make sense"), or we'd have to define some kind of RGB2020 which is
> > like sRGB but for Rec2020 and say "use that one instead". I don't feel
> > like we're helping our users there!
>
> We're getting to philosophical territory indeed :-) If a ColorSpace
> means the combination of a color space (as in the chromaticities of the
> color primaries) and the processing steps used to produce a given
> digital color model (as in the transfer function, Y'CbCr encoding and
> quantization used to create digital Y, Cb and Cr values), then it can
> also be deemed confusing to include in the ColorSpace parameters for
> processing steps that are not applicable to the format being captured.
>
> This being said, I agree there are pros and cons in both cases, both
> when it comes to the public API and when it comes to the internal
> implementation. I would prioritize the public API over the internals to
> decide what is best, as we can always have internal helpers to help with
> the latter. We can discuss the pros and cons further.
>
> > In fact all my applications and Picamera2 use Sycc a lot. I really
> > liked the decoupling of the ColorSpace and the output encodings, and I
> > made constant use of it! In the new scheme, the ColorSpace is no
> > longer my "platonic thing", it's a collection of values that represent
> > some processing that may change depending on what the output encoding
> > is, and the net result may be the same as what you had before but you
> > have to check it for yourself.
> >
> > So anyway, it seems to me like we've lost something, but I realise
> > it's way too late to be raising all these objections now. I also take
> > the point that there's a lot of thinking behind what I originally did
> > that was never consolidated into any form of documentation, so I can
> > only apologise for that too.
>
> There's no need to apologize, and it's not too late. Nothing is set in
> stone (but if the effort to change it is large, we may ask for help
> :-)). I am sorry not to have realized that this change would make you
> unhappy, and I don't intend to ignore your feedback.
>
> > Now, to try and answer some of the other points:
> >
> > On Mon, 5 Sept 2022 at 17:49, Laurent Pinchart wrote:
> > > On Mon, Sep 05, 2022 at 05:11:08PM +0100, David Plowman wrote:
> > > > On Mon, 5 Sept 2022 at 14:59, Laurent Pinchart wrote:
> > > > > On Mon, Sep 05, 2022 at 01:37:00PM +0100, Naushir Patuck wrote:
> > > > > > Hi,
> > > > > >
> > > > > > On a related note, we are also getting this warning on the
> > > > > > Raspberry Pi platform with commit e297673e7686:
> > > > > >
> > > > > > WARN V4L2 v4l2_subdevice.cpp:512 'imx477 10-001a': Unknown subdev
> > > > > > format 0x7002, defaulting to RGB encoding
> > > > > >
> > > > > > This is because the embedded buffer stream format (i.e. a non-image
> > > > > > data format) does not map to a libcamera PixelFormat at present. In
> > > > > > the past, we've had similar issues and decided not to use WARN level
> > > > > > to flag this. Can we do the same here please?
> > > > >
> > > > > Absolutely. I recall a discussion about this, and thought we had
> > > > > solved the problem already, but it seems I'm wrong.
> > > > >
> > > > > > Not directly related to the logging message, but I suppose the same
> > > > > > question should also be asked here - should we be overwriting the
> > > > > > colorEncoding in these circumstances?
> > > > >
> > > > > It depends what you mean by "these circumstances". For metadata
> > > > > formats colorspace doesn't make sense, so we should set the colorSpace
> > > > > field to nullopt without any warning. For unknown media bus codes in
> > > > > general, that's different, as the simple pipeline handler can propagate
> > > > > them through the pipeline in a generic way without caring what they are
> > > > > exactly.
> > > > >
> > > > > The imx477 driver seems to always report V4L2_COLORSPACE_DEFAULT for
> > > > > the colorspace field. I think setting colorSpace to nullopt in that
> > > > > case would be the right thing to do. I'll prepare a patch and will CC
> > > > > you.
> > > > >
> > > > > > On Mon, 5 Sept 2022 at 13:25, David Plowman via libcamera-devel wrote:
> > > > > >
> > > > > > > Hi everyone
> > > > > > >
> > > > > > > We've recently checked out the latest version of libcamera and are
> > > > > > > having a few colour space issues. Obviously I'm sorry to be raising
> > > > > > > this at a stage where the commits have already been applied, but
> > > > > > > maybe we can think of some kind of acceptable workaround.
> > > > >
> > > > > Or even an acceptable good solution :-) Sorry for breaking it for you
> > > > > in the first place.
> > > > >
> > > > > > > The problems come with commit 4aa71c ("libcamera: color_space:
> > > > > > > Move color space adjustment to ColorSpace class"). The specific
> > > > > > > behaviour that causes us grief is where it overwrites the YCbCr
> > > > > > > encoding matrix to "None" when the pixel format is an RGB one.
> > > > > > >
> > > > > > > In the Pi's pipeline, pixels fundamentally end up as YUV and
> > > > > > > there's an extra conversion that happens if you want RGB. So this
> > > > > > > means we do need to know that YCbCr matrix.
> > > > >
> > > > > Does that mean that the ISP will internally encode RGB to YUV and
> > > > > then decode YUV to RGB ? If so, does the YCbCr matrix matter, or is
> > > > > the only requirement that both conversions should use the same matrix ?
> > > >
> > > > The matrices certainly need to be consistent (i.e. one the inverse of
> > > > the other). But in some circumstances it matters that we have the
> > > > right matrix. On the Pi, if you ask for a "low resolution" stream then
> > > > that's always a YUV stream, so it does need to have the correct
> > > > matrix. I'd certainly prefer to use the "correct" matrix all the time,
> > > > whether or not the low resolution stream is being output, and not make
> > > > the code "more fiddly" in this respect.
> > >
> > > I understand that as meaning that if a user requests a high resolution
> > > RGB stream and a low-resolution YUV stream, the Y'CbCr encoding will be
> > > dictated by the low-resolution stream, and the ISP has to be programmed
> > > accordingly. Is it that the ISP driver expects the Y'CbCr encoding to be
> > > programmed through V4L2 on the high-resolution stream in all cases ?
> >
> > No I don't think it does, but we are left having to go and check
> > exactly what happens down in the firmware before we can move on.
> > (Colour space stuff down in the firmware is one of my least favourite
> > things on the planet.)
>
> Let's do that, I'd like to have a better understanding of what the
> firmware expects, or it will be hard for me to propose good options.
>
> On a side note, any chance of moving more of the color space handling to
> the libcamera side ? :-) If the RGB2RGB matrix (or its equivalent in
> this ISP), transfer function LUT, Y'CbCr encoding matrix and tone
> mapping LUT are programmable, it would be nice to expose them and handle
> them in the IPA module.
>
> > > If that's the core of the issue, the pipeline handler should take the
> > > libcamera color space from the YUV stream and apply it as a V4L2
> > > colorspace on the video node for the RGB stream, as it's the
> > > responsibility of the pipeline handler to translate between the
> > > libcamera API and the backend API. The conversion is done in
> > > V4L2Device::fromColorSpace(), which does not take the pixel format into
> > > account, so you have full control of the color space settings there and
> > > can apply a YUV color space with an RGB pixel format to a V4L2 video
> > > node. Would that work in your case ?
> >
> > Yes probably, but I guess the gist of this email is that it's not
> > really the implementation details that I'm struggling with.
>
> We have an agreement here, but I still want to make sure we offer helpers
> that match your needs (unless your needs are very, very exotic).
>
> > > If the issue is somewhere else, then I'm afraid my slow brain will need
> > > another explanation :-)
> > >
> > > > > > > More generally, I'm not sure how I feel about taking "standard
> > > > > > > colour spaces" and then overwriting some parts of them, so that
> > > > > > > afterwards something that is equivalent to (for example) sYCC is
> > > > > > > then no longer == Sycc(). Possibly I'm being a bit paranoid here,
> > > > > > > I'm not sure.
> > > > >
> > > > > From an application point of view, an RGB format should always have
> > > > > its YCbCrEncoding set to None, as the data isn't encoded in YCbCr. I
> > > > > don't really see what other good alternatives we would have.
> > > >
> > > > I think the alternative is not to touch fields that we think (rightly
> > > > or wrongly) "don't matter". But I feel this is probably more of a
> > > > symptom, not the root cause.
> > > >
> > > > The heart of my difficulty may be more that I don't like the colour
> > > > space changing because of the output format. They really should be
> > > > independent. Now, whether you regard the "true" colour space as being
> > > > "the whole ColorSpace object" or just the "the primaries field in the
> > > > ColorSpace object" is perhaps the crux of this question.
> > > >
> > > > I know that I really wanted to avoid having multiple representations
> > > > of the same colour space, which I had chosen to mean the entire
> > > > ColorSpace object (see comments near the top of colour_space.cpp). I
> > > > understand the history why V4L2 has "default" values for its fields,
> > > > but it leads to multiple representations of the same things - and that
> > > > just seemed really awkward to me. I'm worrying that we've now created
> > > > exactly this - whether two colour spaces are actually the same depends
> > > > on quite a complicated test involving the output formats at hand. To
> > > > try and illustrate this:
> > > >
> > > > 1. An application asks for sYCC and an RGB output. The YCbCr encoding
> > > > gets changed and flagged as "adjusted". Do I have to check that the
> > > > "adjustment" didn't actually change the "true" colour space (by which
> > > > I mean the "primaries" field, and maybe some others)? "adjusted"
> > > > doesn't seem so helpful any more!
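
Concretely, the check I think I'd now be forced into looks something like
this (reusing the names from my examples above and assuming stream 0 is the
RGB one; whether comparing just the primaries and transfer function is even
sufficient is exactly the sort of thing I'm unsure about):

requested = libcamera.ColorSpace.Sycc()
camera_config.at(0).color_space = requested
if camera_config.validate() == libcamera.CameraConfiguration.Status.Adjusted:
    got = camera_config.at(0).color_space
    if got.primaries != requested.primaries or \
       got.transferFunction != requested.transferFunction:
        print("The adjustment really did change the colour space")
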
> > >
> > > The adjusted flag covers the whole configuration, so you have to check
> > > the whole configuration indeed (and not just the color space). That's a
> > > limitation of the configuration API, maybe we'll want finer-grained
> > > adjust flags in the future, I don't know yet.
> >
> > Possibly, though I am missing the former world where I could just do
> > "== Sycc" or "== Rec709" to see if the colour space has changed. Now
> > they change even when it makes no difference, and I need more code to
> > work it out.
>
> Let's discuss the application use cases in more details. Could you
> provide one or two examples that show how the changes have made it more
> difficult or awkward for applications ? Feel free to start a separate
> discussion thread instead of replying inline here if that can help
> focussing on the application API.
>
> > > > 2. Suppose I want to check if I have the sYCC colour space. Now
> > > > checking "myColSpace == Sycc" isn't enough, even if Sycc is what I
> put
> > > > in. I also need to check "|| myColSpace == <something-else>" but only
> > > > if there are no YUV outputs. It just all seems more awkward than it
> > > > was before.
> > >
> > > Is that on the application side ? I don't think an application should
> > > use Sycc at all for RGB data, it should use Srgb instead, and can
> > > compare with == Srgb.
> >
> > I don't think I agree here - I talked about that above. It's also an
> > example of pushing annoying stuff onto applications which, having
> > written a few, I don't appreciate! :)
>
> As noted just above, I'd very much appreciate application code examples
> to show the extent of the issue.
>
> > > > On an unrelated (and more minor) note, I'm a bit uneasy about using
> > > > the colour encoding in the PixelFormat (in ColorSpace::adjust),
> > > > because we know that for Raspberry Pi, some of them are wrong (the "R"
> > > > formats most obviously). I don't think this is causing any problems,
> > > > but seeing them being used always makes me a bit anxious!
> > >
> > > I need to look at your patches for the R8 format, sorry about the delay.
> > > I'll keep the color space in mind.
> > >
> > > > > > > Anyway, I'm not quite sure what to do about this. Is this
> > > > > > > behaviour now important to other platforms? I guess we have the
> > > > > > > option to override the validateColorSpaces method for the
> > > > > > > RPiCameraConfiguration. Or add another flag
> > > > > > > "don't-touch-my-colour-spaces". But I'm not sensing great elegance
> > > > > > > here...
> > > > > > >
> > > > > > > Any suggestions?
> > > > >
It's a bit like defining our own validateColorSpaces. We would simply be
handling colour spaces differently.

> > > > > I'm not sure I see what the problem is. Could you describe what
> > > > > issues you see in the application-facing behaviour and in the
> > > > > V4L2-facing behaviour ?
> > > >
> > > > Currently we're seeing "unrecognised YCbCr encoding" warnings in the
> > > > fromColorSpace method.
> > >
> > > That was a bug, I've pushed the fix today, see commit a03ce60cf969
> > > ("libcamera: v4l2_device: Map YCbCrEncoding::None to V4L2"). I've also
> > > sent "[PATCH] libcamera: v4l2_subdevice: Silence warning for unknown
> > > metadata formats" to the list. Could you please test those ?
>
> If you have a chance to test those at some point it would be very
> appreciated, I'd like to fix the warning quickly. We can continue the
> discussion in parallel to improve the API, which will likely take a bit
> longer.
>
> > > > Probably it does the right thing "by luck", but I think it could break
> > > > if someone was recording a Rec.709 video and we asked for a simultaneous
> > > > RGB stream. It could clobber the correct colour space information in the
> > > > RGB stream, and then use that for both streams.
> > > >
> > > > I'm wondering what the benefits would be in adapting to the new
> > > > scheme? Or if we had our own version of validateColourSpaces (which
> > > > kept the old behaviour) would we be losing anything?
> > >
> > > It would give different behaviours to applications depending on the
> > > platform, I'm not thrilled by that. How the color spaces are used in the
> > > public API should be clearly documented and consistent.
> >
> > I believe the current validateColorSpaces (if I have read it right) is
> > actually broken for us, which is why I bring it up. It will overwrite
> > the YCbCr matrix of any RGB stream to none, and then it may "share"
> > this "fixed" colour space with any YUV streams that you have. So you
> > end up with no YCbCr matrix anywhere.
>
> Indeed, I see the problem. The previous implementation wasn't right
> either though. If you tried to capture a high-resolution RGB stream in,
> for instance, ColorSpace::Sycc, and a low-resolution YUV stream in, for
> instance, ColorSpace::Rec709, it would pick Sycc as it corresponds to
> the highest resolution stream (that logic of picking the color space of
> the highest resolution stream was there before and hasn't been changed).
> Both Sycc and Rec709 have the same primaries and transfer function, they
> only differ in the Y'CbCr encoding and quantization range, so what the
> user initially requested (ignoring the fact that the encoding and
> quantization are not applicable to RGB) is achievable, but we end up
> delivering Sycc for the YUV stream.
>
> Before discussing how to fix the helper, is the above scenario a correct
> use case ?
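
In the terms of my examples above, I read that scenario as roughly the
following (the sizes and formats are just for illustration):

camera_config.at(0).size = libcamera.Size(4056, 3040)  # high resolution RGB
camera_config.at(0).pixel_format = libcamera.PixelFormat('RGB888')
camera_config.at(0).color_space = libcamera.ColorSpace.Sycc()
camera_config.at(1).size = libcamera.Size(640, 480)    # low resolution YUV
camera_config.at(1).pixel_format = libcamera.PixelFormat('YUV420')
camera_config.at(1).color_space = libcamera.ColorSpace.Rec709()
camera_config.validate()
# As described above, validate() picks the colour space of the highest
# resolution stream, so the YUV stream ends up being delivered as Sycc
# rather than the requested Rec709.
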
>
> > We could either replace it in our pipeline handler, or fix it in some
> > way. Not sure what would be better there. Of course, I'd rather it
> > didn't splat my YCbCr encodings at all!
> >
> > > Let's try to decouple the two questions (how to configure the ISP
> > > through V4L2 to ensure correct color spaces in all cases, and the
> > > behaviour of the public API).
> >
> > Yes, it would be nice. As you can probably tell, implementation
> > details, though annoying, bother me less. It seems to be the loss of
> > my ideal platonic colour spaces that I am mourning. I'm sorry I only
> > woke up to all this so late.
>
> I've woken up countless times to see patches being merged and then
> panicking, it's an unpleasant feeling, so I'll just tell you it's not
> too late (but that doesn't mean we'll necessarily easily agree on how
> things should be changed :-)).
>
> --
> Regards,
>
> Laurent Pinchart
>