<div dir="ltr"><div dir="ltr">Hi Laurent,</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, 4 Mar 2021 at 02:12, Laurent Pinchart <<a href="mailto:laurent.pinchart@ideasonboard.com">laurent.pinchart@ideasonboard.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hello,<br>
<br>
On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:<br>
> Another interesting case that we're looking at is a sensor with<br>
> on-board HDR. Because the sensor tonemaps the image before outputting<br>
> it, the statistics we get from the ISP are no good for<br>
> the AGC/AEC algorithm - they're not linear with exposure/gain. Because<br>
> of this the sensor actually outputs some statistics of its own that we<br>
> can collect into the "embedded data" buffers in our usual way. From<br>
> there, the plan would be to pass them to our AGC/AEC.<br>
> <br>
> But anyway, this is another case where "strict" matching is important.<br>
> (Also, there will likely be changes in our pipeline handler and IPAs<br>
> in the short-to-medium term to support this kind of use-case more<br>
> straightforwardly!).<br>
> <br>
> On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <<a href="mailto:naush@raspberrypi.com" target="_blank">naush@raspberrypi.com</a>> wrote:<br>
> > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <<a href="mailto:laurent.pinchart@ideasonboard.com" target="_blank">laurent.pinchart@ideasonboard.com</a>> wrote:<br>
> >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:<br>
> >> > A new flag, StrictBufferMatching, is used to control the behavior of<br>
> >> > the embedded and bayer buffer matching routine.<br>
> >> ><br>
> >> > If set to true (default), we reject and drop all bayer frames that do<br>
> >> > not have a matching embedded data buffer and vice-versa. This guarantees<br>
> >> > the IPA will always have the correct frame exposure and gain values to<br>
> >> > use.<br>
> >> ><br>
> >> > If set to false, we use bayer frames that do not have a matching<br>
> >> > embedded data buffer. In this case, the IPA will use the ControlList<br>
> >> > passed to it for gain and exposure values.<br>
> >> ><br>
> >> > Additionally, allow external stream buffers to behave as if<br>
> >> > StrictBufferMatching = false since we are not allowed to drop them.<br>
> >><br>
> >> Could you explain the use case for both options ? Assuming an API for<br>
> >> applications to set this, how would an application decide ?<br>
> ><br>
> > As an example, consider a sensor that generates some kind of computational<br>
> > metadata (e.g. face/object detection) in the sensor silicon and sends it out<br>
> > via embedded data. In such cases, the image data alone is not very helpful<br>
> > for the application. An application would need both embedded data and image<br>
> > data buffers to do its magic, and crucially they must be in sync. This is where<br>
> > the StrictBufferMatching = true would be used. We are actually talking to a<br>
> > vendor about this exact use case. Of course, today we do not have the ability to<br>
> > get to that embedded data buffer in the application, but we will in the future, so<br>
> > right now this hint is more for the IPA having a guarantee that both buffers are in<br>
> > sync.<br>
> ><br>
> > For simpler use cases, e.g. high framerate video, we do not necessarily care<br>
> > about strict matching, as the embedded buffer will only be used for AWB/AE loops,<br>
> > and avoiding frame drops is more desirable. These cases would use StrictBufferMatching = false.<br>
> > Perhaps a better strategy would be to make this the default...?<br>
> ><br>
> > Hope that makes sense.<br>
<br>
So I understand why strict matching is important, and that's the default<br>
in this patch. Non-strict matching relaxes the requirements, but I'm not<br>
sure what impact it will have on algorithms.<br></blockquote><div><br></div><div>Hopefully non-strict buffer matching will not have any impact on the algorithms :-)</div><div>I suppose it could possibly cause AE to go wrong in extreme cases where,</div><div>for whatever reason, the locally stored values were incorrect.</div><div> </div>
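<div>For illustration, here is a very rough sketch of the kind of fallback I have</div><div>in mind on the IPA side. The type and field names below are made up for this</div><div>example and are not the real IPA interface:</div><div><br></div>/* Hypothetical bookkeeping only - these are not actual libcamera/IPA types. */<br>
struct DeviceStatus {<br>
        double shutterSpeed;      /* exposure time in microseconds */<br>
        double analogueGain;<br>
};<br>
<br>
struct FrameContext {<br>
        bool embeddedValid;       /* true when an embedded data buffer was matched */<br>
        DeviceStatus embedded;    /* values parsed from the embedded data */<br>
        DeviceStatus programmed;  /* values we last asked the sensor to use */<br>
};<br>
<br>
static DeviceStatus effectiveStatus(const FrameContext &ctx)<br>
{<br>
        /*<br>
         * With strict matching, embeddedValid is always true. Without it, an<br>
         * unmatched frame falls back to the locally stored values, which can<br>
         * briefly lag the sensor if a frame was dropped.<br>
         */<br>
        return ctx.embeddedValid ? ctx.embedded : ctx.programmed;<br>
}<br>
<div><br></div>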
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Furthermore, to make this really useful, we would need a way to<br>
configure the policy at runtime, and I think it may be a hard choice for<br>
applications. I wonder if there could be a way to handle the buffer<br>
matching policy automagically ? The code here deals with the case where<br>
no matching buffer is available, which I assume would be caused by a<br>
frame drop (either on the image stream or the embedded data stream).<br>
Would it be possible to detect such situations and handle them in an<br>
automatic way ? </blockquote><div><br></div><div>The pipeline handler is able to detect a buffer that cannot be matched,</div><div>but after that, I don't see a way for it to know how to handle this in an</div><div>automatic way. It does not know the context of the buffers or application,</div><div>e.g. whether there is some computer vision embedded data that goes with</div><div>the image frame, in which case an unmatched frame should be dropped.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">I'm not sure what to propose exactly, but going in the<br>
direction of exposing to applications controls that tune heuristics in<br>
ill-defined ways worries me. We could easily end up with controls that<br>
applications would have no meaningful way to set. If you ask me if I<br>
need to avoid frame drops or avoid bad images, I'll tell you I need both<br>
:-) I could possibly make a decision if I knew the exact risk of frame<br>
drop, and exactly how "bad" images could be, but I don't think that's<br>
something we'll be able to express in an API. I'd really like to avoid<br>
going in this direction.<br></blockquote><div><br></div><div>Yes, I do see your concerns here. We do not want to put more onus on</div><div>the application side to manage pipelines, possibly in a vendor-specific</div><div>way. However, I do see that more advanced users, or unique apps that</div><div>have a very specific purpose, may want this sort of control available. <br></div><div><br></div><div>Some time ago, we talked about allowing applications to provide "hints"</div><div>to the pipeline handlers on possible operating behaviors. I was thinking</div><div>that StrictBufferMatching could be one of those hints. Other possible</div><div>hints might be HDR, ZSL, high framerate or low power operating modes,</div><div>to give a few examples off the top of my head. </div><div><br></div><div>We still need to be careful, as hints should not be enforceable, but act</div><div>only as a suggestion (hence "hints" :-)) that the pipeline handler may or</div><div>may not handle.</div><div><br></div>
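<div>To make that a little more concrete, here is a very rough, purely</div><div>illustrative sketch of one possible shape for such hints. The names and the</div><div>bitmask layout are invented for this example; nothing like this exists in</div><div>libcamera today:</div><div><br></div>/* Purely illustrative - not an existing or proposed libcamera API. */<br>
enum PipelineHint {<br>
        HintStrictBufferMatching = (1 << 0),<br>
        HintHighFrameRate        = (1 << 1),<br>
        HintLowPower             = (1 << 2),<br>
        HintZeroShutterLag       = (1 << 3),<br>
        HintHdr                  = (1 << 4),<br>
};<br>
<br>
class PipelineHints<br>
{<br>
public:<br>
        /* Hints are requests only; a pipeline handler may ignore them. */<br>
        void set(unsigned int hints) { hints_ |= hints; }<br>
        bool isSet(PipelineHint hint) const { return hints_ & hint; }<br>
<br>
private:<br>
        unsigned int hints_ = 0;<br>
};<br>
<br>
/*<br>
 * The buffer matching code could then ask, for example:<br>
 *<br>
 *     bool strict = hints.isSet(HintStrictBufferMatching);<br>
 */<br>
<div><br></div>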
<div>I'm not sure what the right approach here is, or if there even is one...?</div><div><br></div><div>Regards,</div><div>Naush</div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">> >> > Signed-off-by: Naushir Patuck <<a href="mailto:naush@raspberrypi.com" target="_blank">naush@raspberrypi.com</a>><br>
> >> > ---<br>
> >> > .../pipeline/raspberrypi/raspberrypi.cpp | 31 ++++++++++++++++---<br>
> >> > 1 file changed, 26 insertions(+), 5 deletions(-)<br>
> >> ><br>
> >> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> >> > index d969c77993eb..7f66d6995176 100644<br>
> >> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> >> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> >> > @@ -46,6 +46,22 @@ namespace libcamera {<br>
> >> ><br>
> >> > LOG_DEFINE_CATEGORY(RPI)<br>
> >> ><br>
> >> > +/*<br>
> >> > + * Set this to true to reject and drop all bayer frames that do not have a<br>
> >> > + * matching embedded data buffer and vice-versa. This guarantees the IPA will<br>
> >> > + * always have the correct frame exposure and gain values to use.<br>
> >> > + *<br>
> >> > + * Set this to false to use bayer frames that do not have a matching embedded<br>
> >> > + * data buffer. In this case, the IPA will use our local history for gain and<br>
> >> > + * exposure values; occasional frame drops may cause these numbers to be out of<br>
> >> > + * sync for a short period.<br>
> >> > + *<br>
> >> > + * \todo Ideally, this property should be set by the application, but we don't<br>
> >> > + * have any mechanism to pass generic properties into a pipeline handler at<br>
> >> > + * present.<br>
> >> > + */<br>
> >> > +static const bool StrictBufferMatching = true;<br>
> >> > +<br>
> >> > namespace {<br>
> >> ><br>
> >> > bool isRaw(PixelFormat &pixFmt)<br>
> >> > @@ -1724,13 +1740,13 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em<br>
> >> > embeddedBuffer = nullptr;<br>
> >> > while (!embeddedQueue_.empty()) {<br>
> >> > FrameBuffer *b = embeddedQueue_.front();<br>
> >> > - if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {<br>
> >> > + if (b->metadata().timestamp < ts) {<br>
> >> > embeddedQueue_.pop();<br>
> >> > unicam_[Unicam::Embedded].queueBuffer(b);<br>
> >> > embeddedRequeueCount++;<br>
> >> > LOG(RPI, Warning) << "Dropping unmatched input frame in stream "<br>
> >> > << unicam_[Unicam::Embedded].name();<br>
> >> > - } else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {<br>
> >> > + } else if (b->metadata().timestamp == ts) {<br>
> >> > /* We pop the item from the queue lower down. */<br>
> >> > embeddedBuffer = b;<br>
> >> > break;<br>
> >> > @@ -1744,10 +1760,15 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em<br>
> >> ><br>
> >> > LOG(RPI, Debug) << "Could not find matching embedded buffer";<br>
> >> ><br>
> >> > - if (!sensorMetadata_) {<br>
> >> > + if (unicam_[Unicam::Embedded].isExternal() ||<br>
> >> > + !StrictBufferMatching || !sensorMetadata_) {<br>
> >> > /*<br>
> >> > - * If there is no sensor metadata, simply return the<br>
> >> > - * first bayer frame in the queue.<br>
> >> > + * If any of the following is true:<br>
> >> > + * - This is an external stream buffer (so cannot be dropped).<br>
> >> > + * - We do not care about strict buffer matching.<br>
> >> > + * - There is no sensor metadata present.<br>
> >> > + * we can simply return the first bayer frame in the queue<br>
> >> > + * without a matching embedded buffer.<br>
> >> > */<br>
> >> > LOG(RPI, Debug) << "Returning bayer frame without a match";<br>
> >> > bayerQueue_.pop();<br>
<br>
-- <br>
Regards,<br>
<br>
Laurent Pinchart<br>
</blockquote></div></div>