[libcamera-devel] [RFC PATCH v3 05/16] controls: Replace AeLocked with AeState, and add AeLock

paul.elder at ideasonboard.com
Wed Aug 25 12:20:56 CEST 2021


Hi David,

Thank you for your input.

On Tue, Aug 24, 2021 at 04:04:47PM +0100, David Plowman wrote:
> Hi Laurent
> 
> Sorry for the late reply, it seemed to get erroneously discarded by my
> human-level spam filter (i.e. myself)! Apologies for that. I've also
> deleted anything with more than three "> > >" indents, as I was rather
> drowning - hope that doesn't confuse anyone!
> 
> Anyway, here are some of my thoughts on the matter. In the end I
> decided to leave them at the top here, I'm not too sure how well they
> fit into all the text below.
> 
> 1. I like the idea of being able to fix some but not all of the
> AEC/AGC "levers" (e.g. gain but not exposure). Android seems not to
> allow this? But I think it's essential.
> 
> 2. We don't like unfixing these levers by writing a magic zero. There
> needs to be another mechanism.
> 
> 3. In the RPi code, we fix specific levers by writing a fixed value.
> Yet there's no way to fix a single lever at the current value, which
> is what happens to all the levers when we disable AEC/AGC. That feels
> a bit inconsistent.
> 
> 4. I think point 1 above means that fixing all levers is a natural
> state of the AEC/AGC algorithm, so there doesn't need to be any kind
> of completely separate manual mode. My personal view is therefore that
> libcamera possibly shouldn't emulate the Android behaviour here.

After a long meditation, I've arrived at the same conclusions. I'm glad
we agree on these points.

> 
> So here's something that might work for everyone:
> 
> - Every lever has its own enable, assuming the implementation can support it.
> 
> - Disabling AEC/AGC globally is the same as disabling all the levers.
> 
> - Enabling a specific lever is obviously the new version of writing a
> zero to it!
> 
> - When you disable a lever, it gets fixed to its current value.
> 
> - Or you can disable a lever by writing a specific value, which is
> obviously then the value it gets.

And I'm working on a design that practically does all these things :)
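
Very roughly, and purely as an illustration, I'm picturing per-lever usage
along these lines. The control and enumerator names here (AnalogueGainMode
and friends) are made up on the spot and not part of any proposal yet:

/* Fix the gain lever at its current value; exposure time stays on auto. */
ControlList &ctrls = request->controls();
ctrls.set(controls::AnalogueGainMode, controls::AnalogueGainModeManual);

/* Or fix it at an explicit value instead. */
ctrls.set(controls::AnalogueGainMode, controls::AnalogueGainModeManual);
ctrls.set(controls::AnalogueGain, 2.0f);

/* Hand the lever back to the AGC, without any magic zero. */
ctrls.set(controls::AnalogueGainMode, controls::AnalogueGainModeAuto);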

> 
> So I think that might cover all the use cases we have? I don't really
> see the point of a separate "AeLock" control in libcamera itself, it
> simply seems unnecessary (unless I've missed something). Nonetheless,
> the Android view of the world can easily be implemented on top of
> this, including the "lock" mechanism.

We need AeLock to transition seamlessly from auto to manual. If you set
AE to off and, in the same request, feed in the values that AE last
computed, the requests already in flight would still have AE on, and the
values might change in the meantime. Then, once your first "AE off"
request takes effect, the change from auto to manual won't be seamless.

Instead, you set AeLock and wait for it to propagate through the
pipeline. Once it has, you can copy the frozen exposure values and send a
request with those + AE off. That way you get seamless switching from
auto to off.
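
To make that concrete, here is a rough, untested sketch of the flow I have
in mind, using the AeLock/AeState names proposed in this series (camera
setup and buffer handling follow the usual pattern and are omitted):

#include <libcamera/camera.h>
#include <libcamera/control_ids.h>
#include <libcamera/request.h>

using namespace libcamera;

/*
 * Called from the application's requestCompleted handler, after a request
 * with the proposed controls::AeLock set to true has been queued.
 */
void handleCompleted(Camera *camera, Request *request)
{
	const ControlList &metadata = request->metadata();

	/* The lock hasn't propagated yet: recycle the request and keep waiting. */
	if (metadata.get(controls::AeState) != controls::AeStateLocked) {
		request->reuse(Request::ReuseBuffers);
		camera->queueRequest(request);
		return;
	}

	/* The AE output is frozen: copy the values it settled on... */
	int32_t exposureTime = metadata.get(controls::ExposureTime);
	float gain = metadata.get(controls::AnalogueGain);

	/* ... and switch to manual control with exactly those values. */
	request->reuse(Request::ReuseBuffers);
	request->controls().set(controls::AeEnable, false);
	request->controls().set(controls::ExposureTime, exposureTime);
	request->controls().set(controls::AnalogueGain, gain);
	camera->queueRequest(request);
}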

I'm thinking of having some sort of lock for each of the "sub-levers"
(as you call them; I call them "sub-controls"). But I still need to work
out and confirm that the flow will work.


Thanks,

Paul

> I expect non-Android applications will prefer to disable levers
> without specifying a value so that you just pick up the current one.
> To me this feels like the most usual situation.
> 
> But I'm always happy to be corrected or persuaded otherwise!
> 
> Thanks
> 
> David
> 
> On Mon, 16 Aug 2021 at 03:07, Laurent Pinchart
> <laurent.pinchart at ideasonboard.com> wrote:
> >
> > Hi Naush,
> >
> > On Mon, Jul 12, 2021 at 03:50:29PM +0100, Naushir Patuck wrote:
> > > On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart wrote:
> > > > On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> ...
> > > >
> > > > Is it ? I thought it should be AeStateConverged.
> > > >
> ...
> > > >
> > > > Please see below.
> > > >
> ...
> > > >
> > > > First of all, let me clarify the goal. The libcamera native API needs to
> > > > offer the features required to implement the Android camera HAL API, but
> > > > we don't need to just duplicate it. When there are reasons to depart
> > > > (for instance to fix historical mistakes made in Android, to provide
> > > > additional features, or just to expose an API we consider more
> > > > consistent), we should do so.
> > > >
> > > > Then, a word about the process. Implementing support for an Android
> > > > control requires plumbing it through the camera stack: definition of a
> > > > libcamera control, translation between the Android and libcamera control
> > > > in the Android camera HAL, implementation of the libcamera control in
> > > > the pipeline handlers, and, when applicable, in the IPAs. Defining
> > > > libcamera controls, as any API design effort, takes time. To allow
> > > > moving forward on all the other parts, we have a fast-track process to
> > > > define draft controls that mimic the Android controls. All the draft
> > > > controls will be redesigned before freezing the libcamera public API.
> > > >
> > > > This means that AeLock and AeState should be marked as draft controls as
> > > > part of this patch.
> > > >
> > > > This being said, we don't need to define controls as draft controls if
> > > > we want to already go through a complete design process. The discussion
> > > > that stemmed from this patch series goes in that direction.
> > > >
> > > > It's useful, in this context, to understand why the Android AE (and AWB)
> > > > controls have been designed this way. Conceptually speaking, the
> > > > algorithms are considered, in Android, to be one level above manual
> > > > controls.
> > > >
> > > >  Controls                                                        Metadata
> > > >              /---------- == OFF ---------\
> > > >              |                           v
> > > > aeMode      -/             +------+     |\
> > > > aeLock                 --> |  AE  | --> |0\     +--------+
> > > > aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
> > > >                                         | |     +--------+     sensitivity
> > > > exposureTime   -----------------------> |1/
> > > > sensitivity                             |/
> > > >
> > > > (The AE controls are in the android.control class, while the sensor
> > > > manual controls are in the android.sensor class.)
> > > >
> > > > The aeMode control selects whether the AE algorithm (when != OFF) or the
> > > > manual controls (when == OFF) drive the sensor. The aeLock control tells
> > > > the AE algorithm to freeze its output (when ON) or operate normally
> > > > (when OFF). The metadata always report the values applied to the sensor,
> > > > regardless of whether they come from the AE algorithm or from manual
> > > > controls.
> > > >
> > > > One important difference, compared to the model we currently implement,
> > > > is that the Android AE algorithm is independent from the manual sensor
> > > > controls. The values of the manual controls aren't affected by the AE
> > > > algorithm, and when AE is disabled, the last set manual control values
> > > > are applied. This thus requires an aeLock control to transition
> > > > seamlessly between AE and manual mode, as the application needs to set
> > > > the manual controls to the values applied to the sensor on the last
> > > > frame that AE was on, which couldn't otherwise be done due to the
> > > > internal delays.
> > >
> > > Thank you for the detailed explanation here.  I can now better understand
> > > the Android AE state transitions and why they may be needed.
> > >
> > > > The existing libcamera AE controls are ill-defined. There's a \todo
> > > > comment to document the interactions between AeEnable and setting a
> > > > fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> > > > happens if AeEnable is set to false when ExposureTime and/or
> > > > AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> > > > and/or AnalogueGain are not 0.
> > >
> > > To provide some brief context on the Raspberry Pi AGC, we never really
> > > go into a "disabled" state at all.  This is because the AGC algorithm always
> > > provides the shutter/gain values to pass to the sensor, even in manual mode
> > > (more on this below).  We use an internal Paused state to lock the shutter/gain
> > > values so that they can never change, which essentially mimics a "disabled"
> > > state.
> > >
> > > For manual operation, we pass shutter/gain ctrl values (either or both) to the AGC,
> > > and that is what the AGC requests off the sensor.  This happens regardless of
> > > whether the AGC is in its internal paused state or not.  If the AGC is not in
> > > paused state, and we only provide one manual control value (e.g. shutter speed),
> > > then the AGC will vary the gain as it sees fit to converge to the desired
> > > brightness.  Setting a manual shutter/gain ctrl value back to 0 hands that control
> > > back to the AGC, which starts adapting it as necessary to converge to
> > > the desired brightness.
> > >
> > > This type of implementation does allow us to essentially do all that Android would
> > > expect of the AGC, but also allows us the flexibility to do Aperture/Shutter/Gain
> > > priority modes if needed.
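
As an aside, the magic-zero behaviour described above looks roughly like
this from the application side today (just a sketch, using the existing
ExposureTime control):

ControlList &ctrls = request->controls();

/* Fix the shutter; the AGC keeps adapting the gain to converge. */
ctrls.set(controls::ExposureTime, 10000);	/* microseconds */

/* Later, hand the shutter back to the AGC with the magic zero. */
ctrls.set(controls::ExposureTime, 0);
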
> >
> > If I understand correctly, when the AeEnable control is set to false,
> > the AGC algorithm is paused, which just means that it sets the fixed
> > shutter and gain to the last computed values. Those are the values that
> > will then be applied to the sensor, until either the AeEnable control is
> > set back to true, or the manual gain or exposure time control is set to
> > a different value.
> >
> > This leads us to the first interesting behaviour: if the application
> > keeps queuing requests that contain the AeEnable control set to false, as
> > well as manual exposure and/or gain values, in order to apply the manual
> > values as expected, the Agc::SetFixedShutter() and
> > Agc::SetFixedAnalogueGain() functions copy the fixed values to the
> > status_ structure, so that pausing the algorithm won't override them.
> > There's a bit of fragility here that I'm not sure I like, as it seems
> > fairly error-prone (not in the existing Raspberry Pi IPA implementation,
> > but from a general IPA module development point of view).
> >
> > Another interesting side effect is what happens if the application then
> > sets the exposure time and/or gain to 0 while keeping AeEnable to false.
> > The AGC algorithm will resume operation in that case. We can of course
> > argue that this is a corner case and that applications shouldn't do
> > this, but it's in any case something that needs to be documented.
> >
> > Looking at how to switch from auto to manual exposure seamlessly, an
> > application would need to go through the following sequence:
> >
> > - Set AeEnable to false in a request, without setting a manual gain or
> >   exposure.
> > - Wait until metadata reports AeEnable set to false (this isn't
> >   currently implemented in the Raspberry Pi IPA), to know that the
> >   setting has taken effect.
> > - Take the reported gain and exposure time from the request that
> >   reported AeEnable set to false, and use them as starting values for
> >   manual control of the AGC.
> >
> > On Android, the equivalent sequence is documented in
> > https://ideasonboard.org/android/docs.html#controls_android.control.aeLock.
> > The sequences are quite similar, the main difference being that what
> > Android calls aeLock is called AeEnable in libcamera. This causes
> > another issue: the AeEnable control needs to be reported in metadata,
> > but its value doesn't actually indicate whether the gain and exposure
> > time are controlled automatically or not. If an application sets
> > AeEnable to true but also sets manual gain and exposure time, metadata
> > will report AeEnable as true, but the exposure will actually be under manual control.
> > This will also be difficult to document in a clear way I think.
> >
> > Another point that bothers me is that treating 0 as a magic value to
> > mean automatic control prevents us from reporting a minimum value for the
> > exposure time and gain to applications. The Raspberry Pi IPA reports the
> > AnalogueGain control as having a 1.0 minimum value, but when we enforce
> > bounds checking in the libcamera core (as this really shouldn't
> > be duplicated by all IPAs), this will break.
> >
> > > > We need a formal model for how libcamera algorithms interact with manual
> > > > controls. It doesn't have to duplicate the Android model, but it has to
> > > > offer at least the same features. I believe we need to take it one step
> > > > further, as we want to support exposure priority and aperture priority.
> > > > I think we should actually talk about gain/sensitivity priority instead
> > > > of aperture priority for the existing feature, and already prepare for
> > > > devices that have a controllable lens aperture.
> > > >
> > > > Now, the hard question: how should libcamera model this ? :-) Naush,
> > > > David, your feedback from a RPi point of view would be very appreciated.
> > >
> > > It would be easy for me to say do it the Raspberry Pi way :-)
> >
> > I have no ideological opposition to that. What bothers me at the moment
> > is that the interactions between the controls related to AGC are not
> > well defined in the documentation, and based on the comments above, I
> > think it will be difficult to do so in a clear way, and avoid
> > error-prone behaviours. There's also the bounds checking issue for the
> > gain (and the exposure time, whose lower limit happens to be 0 for the
> > Raspberry Pi IPA, but that won't be the case universally) that seems
> > difficult to fix.
> >
> > I fear that documenting the behaviour of the existing controls will be
> > difficult, and will result in unclear documentation. If we want to keep
> > the existing behaviour, I'd like to see documentation that shows that my
> > fears are unfounded.
> >
> > This being said, I'm not advocating for replicating the Android
> > controls, because, as discussed previously, they don't support implementing
> > shutter or gain priority, which is an important feature. We may however
> > decide to design a different semantics for the AE controls that would
> > support all our use cases and result in simpler documentation.
> >
> > > But in reality, I don't think the Android AE model gains us an advantage over our existing
> > > model.  And having to track one less state (locked vs disabled) is always a good thing.
> > >
> > > Explicitly defining AeModes to have (amongst others) ShutterPriority, GainPriority,
> > > AperturePriority and Auto modes would be useful though.  This avoids the messiness of
> > > setting 0 to hand manual control back to the AGC?
> >
> > Once we get more than two parameters to control (adding aperture to
> > shutter and gain), I think this would become a bit messy, as there will
> > be 8 combinations of what can be controlled manually vs. automatically.
> >
> > > I'm sure David will also want to share his opinions on this, but he won't be able to
> > > respond for a few days.
> >
> > If David has time to share his opinion at some point, that would be
> > helpful.
> >
> > > > Please also let me know if the above explanation is clear or if you need
> > > > more information.
> > > >
> ...

