[libcamera-devel] [PATCH v3 4/4] pipeline: raspberrypi: Allow either strict or non-strict buffer matching

Naushir Patuck naush at raspberrypi.com
Fri Mar 5 10:02:55 CET 2021


Hi David,

On Thu, 4 Mar 2021 at 17:19, David Plowman <david.plowman at raspberrypi.com>
wrote:

> Hi Naush, Laurent
>
> On Thu, 4 Mar 2021 at 08:49, Naushir Patuck <naush at raspberrypi.com> wrote:
> >
> > Hi Laurent,
> >
> > On Thu, 4 Mar 2021 at 02:12, Laurent Pinchart <
> laurent.pinchart at ideasonboard.com> wrote:
> >>
> >> Hello,
> >>
> >> On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:
> >> > Another interesting case that we're looking at is a sensor with
> >> > on-board HDR. Because the sensor tonemaps the image before outputting
> >> > it, it means that the statistics we get from the ISP are no good for
> >> > the AGC/AEC algorithm - they're not linear with exposure/gain. Because
> >> > of this the sensor actually outputs some statistics of its own that we
> >> > can collect into the "embedded data" buffers in our usual way. From
> >> > there, the plan would be to pass them to our AGC/AEC.
> >> >
> >> > But anyway, this is another case where "strict" matching is important.
> >> > (Also, there will likely be changes in our pipeline handler and IPAs
> >> > in the short-to-medium term to support this kind of use-case more
> >> > straightforwardly!).
> >> >
> >> > On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush at raspberrypi.com>
> wrote:
> >> > > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <
> laurent.pinchart at ideasonboard.com> wrote:
> >> > >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> >> > >> > A new flag, StrictBufferMatching, is used to control the
> >> > >> > behavior of the embedded and bayer buffer matching routine.
> >> > >> >
> >> > >> > If set to true (default), we reject and drop all bayer frames
> >> > >> > that do not have a matching embedded data buffer and vice-versa.
> >> > >> > This guarantees the IPA will always have the correct frame
> >> > >> > exposure and gain values to use.
> >> > >> >
> >> > >> > If set to false, we use bayer frames that do not have a matching
> >> > >> > embedded data buffer. In this case, the IPA will use the
> >> > >> > ControlList passed to it for gain and exposure values.
> >> > >> >
> >> > >> > Additionally, allow external stream buffers to behave as if
> >> > >> > StrictBufferMatching = false since we are not allowed to drop
> >> > >> > them.
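
As a quick aside for anyone skimming the thread, the policy described in
the commit message boils down to roughly the following. This is only a
simplified sketch, not the actual patch; the types, queue and helper
names below are stand-ins for what our pipeline handler really uses.

#include <queue>

/* Stand-ins for the real libcamera types and pipeline handler state. */
struct FrameBuffer;

std::queue<FrameBuffer *> bayerQueue_;

FrameBuffer *findMatchingEmbedded(FrameBuffer *bayer);
void returnRawBuffer(FrameBuffer *bayer);
void sendToIpa(FrameBuffer *bayer, FrameBuffer *embedded);

void matchBuffers(bool strictBufferMatching)
{
	while (!bayerQueue_.empty()) {
		FrameBuffer *bayer = bayerQueue_.front();
		FrameBuffer *embedded = findMatchingEmbedded(bayer);

		if (!embedded && strictBufferMatching) {
			/* Strict: drop the bayer frame with no embedded match. */
			returnRawBuffer(bayer);
			bayerQueue_.pop();
			continue;
		}

		/*
		 * Non-strict (or an external buffer we may not drop): pass the
		 * frame on anyway; the IPA then falls back to the exposure and
		 * gain values from the ControlList it was given instead of the
		 * parsed embedded data.
		 */
		sendToIpa(bayer, embedded);
		bayerQueue_.pop();
	}
}
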
> >> > >>
> >> > >> Could you explain the use case for both options ? Assuming an API
> >> > >> for applications to set this, how would an application decide ?
> >> > >
> >> > > As an example, consider a sensor that generates some kind of
> >> > > computational metadata (e.g. face/object detection) in the sensor
> >> > > silicon and sends it out via embedded data.  In such cases, the
> >> > > image data alone is not very helpful for the application.  An
> >> > > application would need both embedded data and image data buffers
> >> > > to do its magic, and crucially they must be in sync.  This is where
> >> > > StrictBufferMatching = true would be used.  We are actually talking
> >> > > to a vendor about this exact use case.  Of course, today we do not
> >> > > have the ability to get to that embedded data buffer in the
> >> > > application, but we will in the future, so right now this hint is
> >> > > more about giving the IPA a guarantee that both buffers are in sync.
> >> > >
> >> > > For simpler use cases, e.g. high framerate video, we do not
> >> > > necessarily care about strict matching, as the embedded buffer will
> >> > > only be used for AWB/AE loops, and avoiding frame drops is more
> >> > > desirable.  These cases would use StrictBufferMatching = false.
> >> > > Perhaps a better strategy would be to make that the default...?
> >> > >
> >> > > Hope that makes sense.
> >>
> >> So I understand why strict matching is important, and that's the default
> >> in this patch. Non-strict matching relaxes the requirements, but I'm not
> >> sure what impact it will have on algorithms.
> >
> >
> > Hopefully non-strict buffer matching will not have any impact on the
> > algorithms :-)  I suppose it could possibly cause AE to go wrong in
> > extreme cases where, for whatever reason, the locally stored values
> > were incorrect.
> >
> >>
> >> Furthermore, to make this really useful, we would need a way to
> >> configure the policy at runtime, and I think it may be a hard choice for
> >> applications. I wonder if there could be a way to handle the buffer
> >> matching policy automagically ? The code here deals with the case where
> >> no matching buffer is available, which I assume would be caused by a
> >> frame drop (either on the image stream or the embedded data stream).
> >> Would it be possible to detect such situations and handle them in an
> >> automatic way ?
> >
> >
> > The pipeline handler is able to detect a buffer that cannot be matched,
> > but after that, I don't see a way for it to know how to handle this in an
> > automatic way.  It does not know the context of the buffers or the
> > application, e.g. whether there is computer vision embedded data that
> > goes with the image frame, in which case an unmatched frame should be
> > dropped.
> >
> >>
> >> I'm not sure what to propose exactly, but going in the
> >> direction of exposing to applications controls that tune heuristics in
> >> ill-defined ways worries me. We could easily end up with controls that
> >> applications would have no meaningful way to set. If you ask me if I
> >> need to avoid frame drops or avoid bad images, I'll tell you I need both
> >> :-) I could possibly make a decision if I knew the exact risk of frame
> >> drop, and exactly how "bad" images could be, but I don't think that's
> >> something we'll be able to express in an API. I'd really like to avoid
> >> going in this direction.
> >
> >
> > Yes, I do see your concerns here.  We do not want to put more onus on
> > the application side to manage pipelines, possibly in a vendor-specific
> > way.  However, I do see that more advanced users, or unique apps that
> > have a very specific purpose, may want this sort of control available.
> >
> > Some time ago, we talked about allowing applications to provide "hints"
> > to the pipeline handlers on possible operating behaviors.  I was
> > thinking that StrictBufferMatching could be one of those hints.  Other
> > possible hints might be HDR, ZSL, high framerate, or low power
> > operating modes, to give a few examples off the top of my head.
> >
> > We still need to be careful, as hints should not be enforceable, but act
> > only as a suggestion (hence "hints" :-)) that the pipeline handler may or
> > may not handle.
> >
> > I'm not sure what the right approach here is, or if there even is one...?
>
> I wonder if CamHelpers would help here? Particularly in view of our
> plans going forward to generalise the parsing for more types of
> metadata, the CamHelpers will have to know what the metadata is and
> could take a view on the risks of dropping frames or not.
>

While I think this could be part of the solution, I don't think it would be
complete.  I still feel there is a level of control that an application may
(optionally) want to exert on the pipeline handlers.  If, for example,
the application wants the pipeline handler to return the metadata buffer
(I know we cannot do this right now but...), then the IPA cannot make
the decision in isolation.  If you expand on the "hints" idea, you could
also see the application sending hints about mode choices, e.g. fast
fps preferred, or larger FoV preferred, etc. The IPA would not know or
care about any of these choices.
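
To make that a bit more concrete, here is a rough sketch of what such a
hint could look like from the application side.  This is purely
hypothetical: the control does not exist today, the numeric id is made
up, and it assumes hints could be passed as start-time controls.

#include <memory>

#include <libcamera/camera.h>
#include <libcamera/controls.h>

using namespace libcamera;

/*
 * Hypothetical hint control. In a real implementation it would come from
 * control_ids.yaml; the id here is made up for illustration only.
 */
static const Control<bool> StrictBufferMatching(0x009f0001,
						"StrictBufferMatching");

void startWithHint(std::shared_ptr<Camera> camera)
{
	/*
	 * The application states a preference: keep frames flowing even if
	 * a bayer frame has no matching embedded data buffer. A pipeline
	 * handler that does not understand the hint simply ignores it.
	 */
	ControlList hints(camera->controls());
	hints.set(StrictBufferMatching, false);

	/* Assumes hints could be passed as start-time controls. */
	camera->start(&hints);
}

The key point is that the hint only expresses a preference; the pipeline
handler (and its CamHelper/IPA) would still decide whether and how to
honour it.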

Regards,
Naush