[libcamera-devel] [PATCH 1/1] libcamera: controls: Add StartupFrame control
David Plowman
david.plowman at raspberrypi.com
Mon Jul 31 16:35:36 CEST 2023
Hi again everyone!
On Tue, 11 Jul 2023 at 15:59, Naushir Patuck <naush at raspberrypi.com> wrote:
>
> On Tue, 11 Jul 2023 at 11:31, Kieran Bingham
> <kieran.bingham at ideasonboard.com> wrote:
> >
> > Quoting Naushir Patuck via libcamera-devel (2023-07-11 11:23:32)
> > > Hi all,
> > >
> > > On a semi-related topic, we talked offline about improving the drop frame
> > > support by queuing a request buffer multiple times to avoid the need for
> > > allocating internal buffers. I've tried this out and here are my findings.
> > >
> > > Firstly, to handle queuing a single buffer multiple times, I need to increase
> > > the number of cache slots in V4L2BufferCache(). Perhaps
> > > V4L2VideoDevice::importBuffers()
> > > should be updated to not take in a count parameter and we just allocate slots
> > > for the maximum buffer count possible in V4L2 (32 I think)? There has been a
> > > long-standing \todo in the RPi code to choose an appropriate value, and the
> > > maximum number is really the only appropriate value I can think of.
> >
> > I still think allocating the maximum here in the v4l2 components is
> > appropriate as they are 'cheap' ...
>
> Agree, I think 32 is the limit according to V4L2.
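To illustrate why sizing for the maximum matters: each queuing of a buffer consumes a distinct cache slot, so queuing one buffer N times needs N slots even though there is only one underlying dmabuf. A toy sketch of that slot accounting (the SlotCache class here is a made-up stand-in, not the real V4L2BufferCache):

```cpp
#include <cassert>
#include <vector>

/* V4L2's hard limit on buffers per queue (VIDEO_MAX_FRAME in videodev2.h). */
constexpr unsigned int kMaxV4L2Buffers = 32;

/*
 * Toy model of a buffer cache: each slot maps a V4L2 buffer index to the
 * dmabuf fd last queued through it. Queuing the same fd N times consumes
 * N distinct slots, which is why the cache must be sized for the worst
 * case rather than for the number of distinct buffers.
 */
class SlotCache
{
public:
	SlotCache() : slots_(kMaxV4L2Buffers, -1) {}

	/* Claim a free slot for fd; return its V4L2 buffer index, or -1. */
	int get(int fd)
	{
		for (unsigned int i = 0; i < slots_.size(); i++) {
			if (slots_[i] == -1) {
				slots_[i] = fd;
				return static_cast<int>(i);
			}
		}
		return -1;
	}

	/* Release the slot when the buffer is dequeued. */
	void put(unsigned int index) { slots_[index] = -1; }

private:
	std::vector<int> slots_;
};
```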
>
> >
> >
> >
> > > Once I got this working, unfortunately I realised this method would never
> > > actually work correctly in the common scenario where the application configures
> > > and uses a RAW stream. In such cases we would queue the RAW buffer into Unicam
> > > N times for N dropped frames. However, this buffer is also imported into the ISP
> > > for processing and stats generation, all while it is also being filled by Unicam
> > > for the next sensor frame. This makes the stats entirely unusable.
> >
> >
> > Aha, that's a shame. I thought the restrictions were going to be on the
> > kernel side, so at least it's interesting to know that we /can/ queue
> > the same buffer multiple times (with a distinct v4l2_buffer id) and get
> > something like the expected behaviour....
> >
> > >
> > > So in the end we either have to allocate additional buffers for drop frames
> > > (like we do right now), or we implement something like this series where the
> > > application is responsible for dropping/ignoring these frames.
> >
> > Of course if the expected behaviour doesn't suit the use case ... then
> > ...
> >
> > This may all be different for pipelines with an inline ISP though ...
> > so the research is still useful.
>
> So the question is... should we continue with this series as a possible
> improvement if the pipeline handler wants to support this control?
So yes, I would indeed like to revisit this question. I still think
it's useful for the original reasons, and the new use-case I'd like to
bring into the mix now is HDR.
One HDR method combines a number of different exposures to create the
output. Now this isn't so relevant for the Pi seeing as we have no
hardware for combining images, but you could imagine it being
important to platforms more widely. The problem comes when this HDR
mode is engaged... what do we do while we wait for all those different
exposures to come through? Because until we have one of each, we can't
really produce the image that the application is expecting.
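As a toy illustration of that startup condition (all names here are made up): cycling through N exposures, no combined HDR output is possible until at least one frame of each exposure has arrived, and any frames delivered before that point would carry the "not ready yet" flag.

```cpp
#include <cassert>
#include <set>

/* Track which of the N exposures in an HDR cycle have been captured. */
class HdrCycle
{
public:
	explicit HdrCycle(int numExposures) : numExposures_(numExposures) {}

	/* Record that a frame of the given exposure index (0..N-1) arrived. */
	void addFrame(int exposure) { seen_.insert(exposure); }

	/* A combined HDR output needs one frame of every exposure. */
	bool ready() const
	{
		return static_cast<int>(seen_.size()) == numExposures_;
	}

private:
	int numExposures_;
	std::set<int> seen_;
};
```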
The idea would be that requests cycle round as normal, but come with
metadata saying "I'm not ready yet" while HDR is "starting up". Note
that this could of course happen at any time now, not just when the
camera starts (or mode switches).
I still like the idea of generic "I'm not ready" metadata because
applications won't want to understand all the different ways in which
things might not be ready. Though supplementary information that
details what we're still waiting for might be helpful. Thoughts...?
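On the application side, the intended usage would be nothing more than testing the metadata and recycling the request. A self-contained sketch of that behaviour (FrameMeta is a stand-in for a completed request's metadata carrying the proposed StartupFrame flag, not real libcamera API):

```cpp
#include <vector>

/* Stand-in for per-request metadata with the proposed StartupFrame flag. */
struct FrameMeta {
	bool startupFrame; /* true: algorithms still converging, skip frame */
	int sequence;
};

/*
 * Filter a stream of completed requests as an application would:
 * drop (i.e. requeue without consuming) frames flagged as startup
 * frames, and keep the rest.
 */
std::vector<int> usableFrames(const std::vector<FrameMeta> &completed)
{
	std::vector<int> kept;
	for (const FrameMeta &meta : completed) {
		if (meta.startupFrame)
			continue; /* not ready yet: requeue the request */
		kept.push_back(meta.sequence);
	}
	return kept;
}
```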
Thanks!
David
>
> >
> > --
> > Kieran
> >
> > >
> > > Regards,
> > > Naush
> > >
> > >
> > >
> > >
> > >
> > > On Wed, 31 May 2023 at 13:50, David Plowman via libcamera-devel
> > > <libcamera-devel at lists.libcamera.org> wrote:
> > > >
> > > > This control is passed back in a frame as metadata to indicate whether
> > > > the camera system is still in a startup phase, and the application is
> > > > advised to avoid using the frame.
> > > >
> > > > Signed-off-by: David Plowman <david.plowman at raspberrypi.com>
> > > > ---
> > > > src/libcamera/control_ids.yaml | 15 +++++++++++++++
> > > > 1 file changed, 15 insertions(+)
> > > >
> > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > > > index adea5f90..4742d907 100644
> > > > --- a/src/libcamera/control_ids.yaml
> > > > +++ b/src/libcamera/control_ids.yaml
> > > > @@ -694,6 +694,21 @@ controls:
> > > > Continuous AF is paused. No further state changes or lens movements
> > > > will occur until the AfPauseResume control is sent.
> > > >
> > > > + - StartupFrame:
> > > > + type: bool
> > > > + description: |
> > > > + The value true indicates that the camera system is still in a startup
> > > > + phase where the output images may not be reliable, or that certain of
> > > > + the control algorithms (such as AEC/AGC, AWB, and possibly others) may
> > > > + still be changing quite rapidly.
> > > > +
> > > > + Applications are advised to avoid using these frames. Mostly, they will
> > > > + occur when the camera system starts for the first time, although,
> > > > + depending on the sensor and the implementation, they could occur at
> > > > + other times.
> > > > +
> > > > + The value false indicates that this is a normal frame.
> > > > +
> > > > # ----------------------------------------------------------------------------
> > > > # Draft controls section
> > > >
> > > > --
> > > > 2.30.2
> > > >