[libcamera-devel] Optic handling questions

Matthias Fend matthias.fend at emfend.at
Fri Dec 16 11:39:16 CET 2022


Hello Kieran,

Am 15.12.2022 um 17:30 schrieb Kieran Bingham:
> Hi Matthias,
> 
> Quoting Matthias Fend via libcamera-devel (2022-12-15 13:43:47)
>> Hi all,
>>
>> I have some questions about handling the optic controls. It looks like
>> this has been considered, but apart from the use of a simple focus lens
>> in IPU3, it doesn't seem to be used extensively yet.
> 
> I think there's already at least a couple of attempts on the list to get
> AF in for both RPi and RKISP1.
> 
> ipa: raspberrypi: Introduce auto-focus (auto-mode) 2022-12-01
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3653
> 
> libcamera: rkisp1: Add lens control 2022-08-09
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3398
> 
> ipa: rkisp1: Add autofocus algorithm 2022-07-13
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3276
> 
> ipa: raspberrypi: Introduce an autofocus algorithm 2022-06-13
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3174
> 
> Introduce camera lens properties 2022-06-09
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3166
> 
> Enabling AF algorithm to get the VCM attributes from the device driver 2022-04-26
> - https://patchwork.libcamera.org/project/libcamera/list/?series=3069
> 
> So it seems like there's lots of 'unfinished' developments that need
> some attention in some form, or rebasing, or reviewing ...
> 
> (There may be more, this was from me scanning through patchwork).

Thank you for gathering all these references!
I had looked at a few of them before, and their concept of optics 
handling does not differ much from what has already been merged.

> 
>> As I understand it, the current control flow looks something like this:
>> - IPA sets v4l2 control (e.g. V4L2_CID_FOCUS_ABSOLUTE)
> 
> IPAs shouldn't really be dealing in V4L2 controls. We're trying to make
> IPAs work in libcamera controls. (That may not yet be the case though,
> so they may still be using v4l2 controls currently).
> 
> 
>> - Control is passed to the pipeline handler
>> - Pipeline handler reads values from known v4l2 controls (e.g.
>> V4L2_CID_FOCUS_ABSOLUTE)
>> - Pipeline handler calls dedicated CameraLens methods (e.g.
>> setFocusPosition)
>> - CameraLens then actually converts to v4l2 control (e.g.
>> V4L2_CID_FOCUS_ABSOLUTE) and applies it to sub device
> 
> Yes, ultimately - the CameraLens class should be responsible for taking
> any normalised libcamera control and mapping it to any device specific
> value.
> 
> 
>> At first glance, it looks like the controls are being unnecessarily
>> converted back and forth once, and I'm wondering if the pipeline handler
>> really needs to be included in this way here.
> 
> I think the CameraLens class should be responsible for most of the
> abstractions, as it should be responsible for the underlying v4l-subdev
> for the VCM.
> 
> 
>> Currently that would mean that every pipeline handler needs to know how
>> to convert all v4l2 lens-specific controls (like V4L2_CID_ZOOM_ABSOLUTE,
>> V4L2_CID_IRIS_ABSOLUTE, ...) into CameraLens methods.
>> Since the IPAs are already working with v4l2 controls and need to know
>> all the lens details, wouldn't it be easier if those controls were
>> passed directly to the subdevice?
> 
> IPAs should be abstracted from V4L2. It's the responsibility of the
> pipeline handler and the helper classes (CameraLens / CameraSensor) to
> perform any conversion from libcamera types to V4L2 types.
> 
> 
>> The pipeline handler's involvement would then be reduced to finding the
>> appropriate subdevice and then transparently passing the v4l2 controls
>> from the IPA to the device.
>>
>> But maybe I'm missing something crucial here.
>> I would be happy about any hints and I would also be interested to know
>> if there are already plans in which direction this topic should develop.
> 
> 
> There's lots of prior work - but nothing has had sufficient momentum to
> get completed and merged, it seems.
> 
> There was some work that was blocked due to us reworking the internal
> handling of the algorithms, which is now done, so some of the above
> series need to be rebased or reworked on top of that.
> 
> There's also a recent development to change from hyperfocal-based units
> to dioptre-based units for focus points - so it's worth checking on that too.

To make sure I got it right, let me summarize what the final goal looks 
like:
- IPAs should use libcamera-specific controls to control the optics.
- The controls should use normalized values (e.g. 0.0-1.0) or real 
units (dB, mm).
- The pipeline handler uses a (generic) lens helper to convert these 
controls to v4l2 controls.

Apart from replacing the v4l2 controls in the IPA with libcamera 
controls and using a common lens helper in all pipelines, the flow stays 
as it is.
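If that understanding is correct, the conversion step could look 
roughly like the sketch below. The class, its methods, and the 0-1023 
VCM range are assumptions for illustration only, not the real libcamera 
API:

```cpp
#include <algorithm>
#include <cstdint>

/*
 * Illustrative sketch of the conversion step: the IPA emits a
 * normalised focus value in [0.0, 1.0] and a lens helper maps it to
 * the driver's native range before the V4L2 control (e.g.
 * V4L2_CID_FOCUS_ABSOLUTE) is written. Hypothetical class and range.
 */
class LensHelper
{
public:
	LensHelper(int32_t min, int32_t max) : min_(min), max_(max) {}

	/* Map a normalised [0, 1] position to the VCM's native steps. */
	int32_t toNative(double normalised) const
	{
		normalised = std::clamp(normalised, 0.0, 1.0);
		return min_ + static_cast<int32_t>(normalised * (max_ - min_) + 0.5);
	}

private:
	int32_t min_;
	int32_t max_;
};
```

With a hypothetical 0-1023 step range, a normalised 0.5 from the IPA 
would end up as step 512 at the subdevice, and out-of-range inputs are 
clamped rather than rejected.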

> 
> What are your targets/goals?

I think for simple use cases like VCM lenses (which is probably the 
most common setup) this is a good idea.
However, for more advanced optics with multiple lens groups, there are 
scenarios where:
1. Some of the IPA algorithms require detailed knowledge of the optics.
2. The IPA must position the lenses accurately, in absolute terms and 
in their native units (e.g. steps for stepper motors).
3. A custom lens driver exposes many lens-specific v4l2 controls to 
cover its numerous individual capabilities.


Regarding 1:
Examples could be the lens's light transmission as a function of a lens 
group's position, which the AGC needs to take into account, or a focus 
range that is limited by the position of other groups, which would be 
of interest to the AF algorithm.
Things like that are really hard to abstract in a generic lens helper.

Regarding 2:
This may be the case when the IPA needs to move lens groups in relation 
to other lens groups, or for calibration purposes.
One could perhaps achieve this by first converting the desired absolute 
position to the normalized range and then hoping that the chain 
eventually converts it back to the original value when the v4l2 control 
is written.
But that seems like a rather unreliable workaround.
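To make the round-trip concern concrete, here is a small sketch. The 
1023-step range and the 8-bit intermediate representation are purely 
assumptions for illustration; the point is only that if the normalised 
value passes through any coarser representation on its way to the 
driver, the exact step the IPA asked for is not guaranteed to come back 
out:

```cpp
#include <cmath>
#include <cstdint>

/* Hypothetical stepper range, for illustration only. */
static constexpr int32_t kMaxStep = 1023;

/*
 * Simulate the chain: native step -> normalised [0, 1] -> some coarser
 * intermediate representation (assumed 8-bit here) -> back to native
 * steps when the v4l2 control is finally written.
 */
int32_t roundTrip(int32_t step)
{
	double normalised = static_cast<double>(step) / kMaxStep;

	/* Assumed intermediate quantisation somewhere in the chain. */
	int32_t q8 = static_cast<int32_t>(std::lround(normalised * 255.0));
	normalised = static_cast<double>(q8) / 255.0;

	/* Conversion back to native steps at the subdevice. */
	return static_cast<int32_t>(std::lround(normalised * kMaxStep));
}
```

Under these assumptions, step 701 comes back as 702 after the round 
trip, which is exactly the kind of off-by-a-step error that matters 
when lens groups must track each other precisely.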

Regarding 3:
Since only a limited selection of optics-specific v4l2 controls is 
available, a driver for more complex optics may use the existing 
controls somewhat creatively, or may even have to define additional 
custom v4l2 controls of its own.
This would probably only occur with a combination of an out-of-tree 
(OOT) kernel driver and an OOT IPA module.
The libcamera lens helper will probably never map all v4l2 controls, 
and even if it did, the abstraction most likely would not always match 
the driver's behavior.

If there were a way to pass all v4l2 controls unaltered from the IPA to 
the subdevice, the cases mentioned above would also be possible.
Perhaps this could be implemented as an additional or alternative path 
alongside the normalized, abstracted lens controller.
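A minimal sketch of that pass-through path might look as follows. The 
MockSubdev type and forwardLensControls() are hypothetical names, and 
the control ids in the usage example are only meant to stand for 
arbitrary lens controls:

```cpp
#include <cstdint>
#include <map>

/* A raw (id, value) control list, as the IPA would hand it over. */
using ControlList = std::map<uint32_t, int32_t>;

/* Stand-in for the lens v4l2 subdevice, for illustration only. */
struct MockSubdev {
	ControlList applied;

	void setControls(const ControlList &controls)
	{
		for (const auto &[id, value] : controls)
			applied[id] = value;
	}
};

/*
 * The pipeline handler's only job in this model is to locate the
 * correct subdevice; the controls themselves are forwarded without any
 * conversion or interpretation.
 */
void forwardLensControls(MockSubdev &lensSubdev, const ControlList &fromIpa)
{
	lensSubdev.setControls(fromIpa);
}
```

Usage would be as simple as building the list in the IPA and forwarding 
it, e.g. with 0x009a090a (V4L2_CID_FOCUS_ABSOLUTE) or a custom OOT 
control id as keys; the subdevice then receives exactly what the IPA 
requested.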

Abstracting the optics is certainly a good idea and will work well for 
most applications. I have tried to point out a few things that cannot 
be realized with it, or only with great difficulty.
Since I estimate that more than 95% of libcamera deployments use 
cameras without moving lenses, or with a simple VCM driver, I also 
understand if the other cases are not considered.
Nevertheless, I hope I was able to bring up a few interesting aspects.

~Matthias

> 
> --
> Kieran

