[libcamera-devel] Autofocus

Kate Hsuan hpa at redhat.com
Tue Nov 16 04:47:18 CET 2021


Hi Kieran,

On Mon, Nov 15, 2021 at 7:40 PM Kieran Bingham
<kieran.bingham at ideasonboard.com> wrote:
>
> Hi all,
>
> +To: Kate, Han-Lin
>
> Kate, Han-Lin,
>
> You've both been working on this area recently. I think Kate, you might
> have joined the list after this mail was sent, I'm not sure - so you
> might not have seen this.

Yes, I can see this :)

>
> Do either of you have any feedback or comments?
>

Since the thread "Surface GO 2 VCM..." and the help from Hans, the VCM
on my Surface Go 2 has started to work. I proposed an autofocus MVP to
prove that the VCM can be controlled based on the AF raw buffer. There
are still things that should be improved, such as the v4l2-subdev
control interface and the AF configuration interface; we could discuss
these to settle on an implementation architecture. For now, everything
is hardcoded in my AF class.
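
For reference, the decision logic in the MVP is essentially a contrast
maximisation sweep, roughly like the sketch below. All of the names and
the position range here are placeholders to show the shape of it, not
the real IPU3 interfaces:

#include <cstdint>

struct AfStats {
    double contrast;    /* focus measure derived from the AF raw buffer */
};

class Af
{
public:
    Af(uint32_t minPos, uint32_t maxPos, uint32_t step)
        : pos_(minPos), maxPos_(maxPos), step_(step),
          bestPos_(minPos), bestContrast_(0.0), focused_(false)
    {
    }

    /* Consume one statistics buffer and return the next VCM position. */
    uint32_t process(const AfStats &stats)
    {
        if (focused_)
            return bestPos_;

        if (stats.contrast > bestContrast_) {
            bestContrast_ = stats.contrast;
            bestPos_ = pos_;
        }

        if (pos_ + step_ <= maxPos_) {
            pos_ += step_;       /* keep sweeping */
        } else {
            pos_ = bestPos_;     /* sweep done, go back to the peak */
            focused_ = true;
        }

        return pos_;
    }

private:
    uint32_t pos_, maxPos_, step_;
    uint32_t bestPos_;
    double bestContrast_;
    bool focused_;
};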

> It looks like you are all working on the same topic at the same time, so
> I think it might be beneficial to keep in touch with each other to
> prevent implementing three different solutions in parallel.
>

Sure, I agree with you. We could do an architecture design and
requirements analysis before implementing it.

>
> Quoting David Plowman (2021-10-21 13:29:10)
> > Hi everyone
> >
> > Another subject I wanted to address was autofocus, as we're not so
> > very far away from being able to do some work on this. In the first
> > instance, I thought, it would be worth quickly reviewing what Android
> > wants/does.
> >
> > 1. Autofocus Modes
> >
> > Android seems to define basically 2 controls, an AF "mode" and a
> > "trigger". It also reports back its state. Full details are here:
> > https://source.android.com/devices/camera/camera3_3Amodes
> >
> > To summarise, the mode has values like "off" (manual control allowed),
> > "auto" (performs a sweep when triggered) and a couple of flavours of
> > CAF (continuous autofocus). The "trigger" control simply involves a
> > "start" and "cancel". The state that gets reported back seemed
> > unsurprising ("I'm doing nothing", "I'm scanning", "I've finished
> > scanning and I'm in focus" and so on).
>
> I've never looked at autofocus or focus in detail, so I'm new to this -
> but that all sounds like reasonable expectations so far indeed.
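
For reference, the application side of this on Android is very thin;
with the camera NDK it is roughly the fragment below. This is only a
sketch of the API David describes, not a complete program, and error
handling is omitted:

#include <camera/NdkCameraMetadata.h>
#include <camera/NdkCameraMetadataTags.h>
#include <camera/NdkCaptureRequest.h>

/* Select the mode and fire the trigger on a capture request. */
void startAutofocus(ACaptureRequest *request)
{
    uint8_t mode = ACAMERA_CONTROL_AF_MODE_AUTO;
    ACaptureRequest_setEntry_u8(request, ACAMERA_CONTROL_AF_MODE, 1, &mode);

    uint8_t trigger = ACAMERA_CONTROL_AF_TRIGGER_START;
    ACaptureRequest_setEntry_u8(request, ACAMERA_CONTROL_AF_TRIGGER, 1,
                                &trigger);
}

/* Read the AF state back from a capture result. */
bool isFocused(const ACameraMetadata *result)
{
    ACameraMetadata_const_entry entry;
    if (ACameraMetadata_getConstEntry(result, ACAMERA_CONTROL_AF_STATE,
                                      &entry) != ACAMERA_OK)
        return false;

    return entry.data.u8[0] == ACAMERA_CONTROL_AF_STATE_FOCUSED_LOCKED;
}
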
>
>
> > Generally I thought what Android defines here was mostly fine. It
> > does, however, seem to conflate several distinct ideas into a single
> > mode control. For example:
> >
> > * It doesn't separate out the "range" that is being searched. There's
> >   an "auto" and "auto-macro" mode, but this doesn't allow you to
> >   explicitly search, for example, the "full" range, or the "full but
> >   not extreme macro" range.
>
> Can the systems 'detect' when macro ranges should be used? My phone
> somehow seems to 'know' it's very close to an object and switches to
> macro automatically. (However, it uses a separate lens for that.)
>
> I guess if the AF algorithm decides it's gone as close as possible?
>
>
> > * Nor can the speed of the autofocus be specified separately. It has a
> >   notion of "fast" CAF and "slow" CAF, but what about a "fast" AF
> >   sweep? I think we've done those in the past.
>
> Is the 'speed' expected to be configurable? What differences are there
> between a fast focus and a slow focus? I don't think that's an option
> I've ever seen on my phones... It just 'does it'.

>
> > I also thought the "off" mode was unhelpful, as the implication seemed
> > to be that you have to set the mode to "off" to move the lens
> > yourself. This would be irritating in the following use case:
>
> I guess this sounds like it relates a lot to the developments Paul has
> been doing on handling the control states from Android, and mapping them
> appropriately to AE / AGC / AWB etc...
>
> > - Do an AF sweep and capture a picture.
> >
> > - Then go back to preview, move the lens to hyperfocal, and wait for
> >   the next AF trigger.
>
> Is an AF trigger a 'user' event, or an automatic event? Perhaps this is
> the event that happens when a user 'touches' the screen?
>
>
> > Do you have to set the mode to "off" (and wait?) before moving the
> > lens to hyperfocal, and only then go back to "auto"? It would be much
> > easier if you could simply stay in "auto" the whole time. Or perhaps
> > it does behave like this and just doesn't explain it
> > clearly. But basically I thought the whole "off" mode was pointless,
> > and possibly harmful (rather like earlier Android AEC/AGC
> > discussions!).
> >
> > Android also has an "EDOF" mode which seems to be the same as
> > "off" and would therefore seem to be even more pointless!
> >
> > 2. Lens Position
> >
> > Android provides you with a "minimum focus" lens position, and a
> > hyperfocal position. It's defined that infinity is always at lens
> > position zero. A lens can be "uncalibrated", or it can be "calibrated"
> > in which case the units of these numbers are formally dioptres (that's
> > 1/metres).
> >
> > For fixed focus lenses, the minimum focus position should also be
> > identically zero; the case that the lens might not be focused at
> > infinity does not seem to feature.
> >
> > See:
> > https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION
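
(Side note for anyone not used to the units: dioptres are simply
reciprocal metres, so the convention is just the following, with zero
reserved for infinity.)

/* Calibrated lens positions are reciprocal metres: 0 = infinity. */
constexpr double dioptresFromMetres(double metres)
{
    return 1.0 / metres;       /* e.g. 0.1 m -> 10 dioptres */
}
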
> >
> > One thing to be clear on is that for any camera module, not all the
> > lens driver chip range will be available owing to the module's
> > physical characteristics. And remember that there's no way to find out
> > from the hardware exactly what module is attached. I think we'd
> > envisage the useful lens driver range and hyperfocal position being
> > stored in the device tree.
>
> This sounds important indeed. So that's VCM constraints, with further
> constraints imposed by the physical module it is contained within.
>
> I presume if we don't adhere to these constraints, we get a 'crash' and
> the VCM physically can't move to the desired position.
>
>
> > The V4L2 control can then report its useful range (rather than the
> > range supported by the driver chip), and its default position could be
> > hyperfocal. For uncalibrated lenses these can be reported as
> > properties, with a suitable transform within libcamera so that
> > infinity comes out at zero.
>
> This sounds like properties that need to be discussed on the linux-media
> mailing list as well.
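
For the uncalibrated case the transform could be as simple as the
strawman below, with the useful range and hyperfocal position coming
from the device tree or a module database. The struct and field names
are made up, and it assumes the VCM code increases from infinity
towards macro:

#include <algorithm>
#include <cstdint>

/* Per-module limits, e.g. parsed from the device tree. */
struct LensLimits {
    uint32_t infinityCode;     /* raw VCM code focusing at infinity */
    uint32_t minFocusCode;     /* raw VCM code at closest focus */
    uint32_t hyperfocalCode;   /* default position */
};

/* Map a raw VCM code to a lens position where 0 is infinity. */
uint32_t positionFromCode(const LensLimits &l, uint32_t code)
{
    code = std::clamp(code, l.infinityCode, l.minFocusCode);
    return code - l.infinityCode;
}

/* Map a lens position back to a raw VCM code, clamped to the useful range. */
uint32_t codeFromPosition(const LensLimits &l, uint32_t pos)
{
    return std::min(l.infinityCode + pos, l.minFocusCode);
}
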
>
> > For calibrated lenses I guess there needs to be at least a dioptre
> > value for the minimum focus and hyperfocal lens positions, so that
> > these can be advertised to applications. Perhaps a module database
> > would do?
>
> I guess this is leading towards something like the CameraLens class that
> has been proposed by Han-lin?
>
>
>
> > But you might in fact need a full mapping between the two. Such a
> > thing could easily live in an "rpi.autofocus" algorithm in our JSON
> > file, and other platforms would have to make their own
> > arrangements. Or could libcamera store something common?
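
For the full mapping, a small per-module table of (dioptre, VCM code)
calibration points plus linear interpolation might be enough, wherever
the table ends up living. A sketch, with entirely made-up numbers:

#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

using CalPoint = std::pair<double, uint32_t>;   /* { dioptres, VCM code } */

/* Table must be non-empty and sorted by increasing dioptres. */
uint32_t codeFromDioptres(const std::vector<CalPoint> &table, double dioptres)
{
    if (dioptres <= table.front().first)
        return table.front().second;
    if (dioptres >= table.back().first)
        return table.back().second;

    for (std::size_t i = 1; i < table.size(); i++) {
        if (dioptres > table[i].first)
            continue;

        double t = (dioptres - table[i - 1].first) /
                   (table[i].first - table[i - 1].first);
        double code = table[i - 1].second +
                      t * (static_cast<double>(table[i].second) -
                           static_cast<double>(table[i - 1].second));
        return static_cast<uint32_t>(code + 0.5);
    }

    return table.back().second;
}

/* Hypothetical module: 0 dpt (infinity) up to 10 dpt (10 cm). */
static const std::vector<CalPoint> calibration = {
    { 0.0, 120 }, { 2.0, 220 }, { 5.0, 400 }, { 10.0, 650 },
};
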
> >
> > 3. So what should we do?
> >
> > I'd suggest the following controls:
> >
> > "AF mode" - just "auto" or "continuous"
> >
> > "AF range" - "macro", "full" or "normal" (= "full" without the macro
> > end)
> >
> > "AF speed" - "fast" or "slow"
> >
> > "AF trigger" - "start" or "cancel"
> >
> > There needs to be an "AF state" reported back in metadata, same idea
> > as the Android one. I would say we need "scanning", "focused",
> > "failed" and "reset" (when the camera has just started, or a scan was
> > cancelled, or the lens moved manually).
> >
> > "lens position" - a value, with 0 being infinity, etc.
> > Setting a value in "auto" mode would cancel any sweep that is in
> > progress. In CAF mode, I think we'd ignore it.
>
> So the way to define a 'manual' position would be to set
>
>  AFMode=auto, LensPosition=$DESIRED_POSITION ?
>
> Why would the manual position be ignored in CAF mode?
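
Just to make the proposal concrete, from the IPA side the controls
could end up looking something like this. Purely illustrative, not an
actual libcamera definition:

enum class AfMode { Auto, Continuous };
enum class AfRange { Normal, Full, Macro };  /* Normal = Full minus the macro end */
enum class AfSpeed { Fast, Slow };
enum class AfTrigger { Start, Cancel };
enum class AfState { Reset, Scanning, Focused, Failed };

/*
 * LensPosition would be a numeric control: 0 = infinity, increasing
 * towards the minimum focus position, with hyperfocal as the default.
 * Setting it in Auto mode cancels any sweep in progress.
 */
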
>
> > And we want a "lens position" property giving the infinity and
> > minimum focus positions as the range, and hyperfocal as the default.
> >
> > At some point someone will want a "is the lens calibrated?" property
> > and we'll have to worry about full calibrations. But that could be a
> > "phase 2".
>
> That's an interesting idea. Being able to report whether we are calibrated
> or not sounds like it applies equally to AE/AGC/AWB?
>
> Would that be a separate property to report back if the whole system is
> determined to be 'tuned' or not?
>
> >
> > As always, I'd love to hear other people's thoughts!
>
> I look forward to seeing this develop!
>
> --
> Kieran
>
>
> > Thanks and best regards
> > David
>

Thank you.

--
BR,
Kate


