[libcamera-devel] Software gathering of image statistics for libcamera on Librem 5

Jacopo Mondi jacopo at jmondi.org
Fri Dec 9 12:48:13 CET 2022


On Fri, Dec 09, 2022 at 11:55:38AM +0100, Pavel Machek via libcamera-devel wrote:
> Hi!
>
> > > I'm playing with camera on Librem 5. So far I added af/ae/awb support
> > > to millipixels done completely in software, but I guess libcamera
> > > would be more suitable place for it. If you have some hints/example
> > > where to look at, it would be nice.
> >
> > The area you are looking at is essentially doing the role of a software
> > based ISP processor. It would be a component that a pipeline handler
> > could choose to instantiate, to support any algorithms that use the data
> > you produce. It can be difficult to produce 'generic' data that any
> > IPA
>
> I don't need full ISP. I just need to gather enough statistics.
>
> > But how that would fit in the libcamera design would need further
> > consideration ... falling back to software ISP ... should really be
> > avoided ... but perhaps there's benefits for early bring up - or simply
> > being able to manage a simple way of getting AWB/AE for raw streams,
> > without an ISP (as we have on the RKISP1), so I think it has some
> > value.
>
> ..to do AWB/AE/AF. If I wanted to, I'd only need to access 200 pixels
> or so.
>
> > Ideally - we shouldn't rely on the CPU where possible though, and that's
> > why several people have been looking at a GPU/OpenGL based option. I
> > could imagine a software fallback would be a part of that pairing of
> > SoftwareISP component.
>
> Due to the small amount of computation, this is not really a job for the GPU.
>
> Thanks for rkisp pointer. I'm quite confused by that code, not knowing
> much about rkisp. Does it have some special support in hardware, like
> providing one stream of frames for application and second for
> statistics gathering?

Not really through frame streams, but through parameter buffers. ISPs
do not produce frames as such (well, they do, but not from a user's
point of view: the frames are still captured through a video node);
rather, they are subdevices with at least, in the simplest use case:

1) one video input stream (from the camera sensor/CSI receiver)
2) one output video stream (to DMA engines)
3) one parameters input
4) one statistics output

Through 1) you queue RAW buffers, which are crunched to produce the
statistics available from 4), while transformations are applied to the
frames according to the parameters set through 3). The transformed
frames are written to memory and captured from 2).
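
Very roughly, the per-frame flow looks like the sketch below. None of
these names exist anywhere; queueBuffer()/dequeueBuffer() are just
stand-ins for VIDIOC_QBUF/VIDIOC_DQBUF on the respective video nodes,
and computeParams() for the IPA algorithm:

/* Illustrative only: the per-frame loop around an ISP exposed as four
 * video nodes. Types and functions are placeholders, not libcamera or
 * kernel API. */
struct Node {};
struct Buffer {};

void queueBuffer(Node &, Buffer &) { /* VIDIOC_QBUF */ }
void dequeueBuffer(Node &, Buffer &) { /* VIDIOC_DQBUF */ }
Buffer computeParams(const Buffer &) { return {}; /* IPA algorithm */ }

int main()
{
        Node rawInput, capture, params, stats;  /* nodes 1) 2) 3) 4) */
        Buffer rawFrame, processedFrame, ispParams, statsBuf;

        for (;;) {
                queueBuffer(rawInput, rawFrame);   /* 1) RAW frame in */
                queueBuffer(params, ispParams);    /* 3) parameters for it */

                dequeueBuffer(capture, processedFrame); /* 2) transformed frame */
                dequeueBuffer(stats, statsBuf);         /* 4) its statistics */

                /* Closed loop: compute the next frame's parameters. */
                ispParams = computeParams(statsBuf);
        }
}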

ISPs do indeed perform transformations and corrections on the images
(debayering, AWB, etc.), but more importantly they produce statistics.

The IPA module you're looking at handles the closed control loop that,
upon receiving statistics from the ISP, computes the parameters for the
next frame using those statistics and control values from the sensor.

The statistics and parameters formats are HW specific; that's why you
have one IPA module per ISP. Similarly, the control values for sensors
are sensor specific; that's why you have the CameraSensorHelpers in
src/ipa/libipa/ to do the translation to a generic format IPAs can use.

The format of statistics/parameters is not the only reason why you
need per-ISP modules: each ISP also needs to be operated differently,
in the order in which it applies transformations to the images, in what
operations are available, etc.
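
To make the loop a bit more concrete, here is a minimal AGC sketch.
It's purely illustrative (the names, the 0.18 target and the limits
are mine, not libcamera code); in a real IPA module the resulting
analogue gain would then be converted to the sensor's register value,
e.g. with CameraSensorHelper::gainCode() from src/ipa/libipa/:

/* Illustrative AGC step, not libcamera code: derive new exposure and
 * analogue gain from a measured brightness in [0, 1]. */
#include <algorithm>
#include <cstdint>

struct SensorControls {
        uint32_t exposureLines;   /* to apply via CID_EXPOSURE */
        double analogueGain;      /* to convert to a sensor gain code */
};

SensorControls updateAgc(double measuredBrightness, SensorControls current)
{
        const double target = 0.18;        /* mid-grey target, an assumption */
        const double maxExposure = 1000.0; /* hypothetical sensor limit */
        const double maxGain = 16.0;       /* hypothetical sensor limit */

        double error = target / std::max(measuredBrightness, 1e-6);

        /* Prefer a longer exposure; spill into gain when it saturates. */
        double exposure = current.exposureLines * error;
        double gain = current.analogueGain;
        if (exposure > maxExposure) {
                gain *= exposure / maxExposure;
                exposure = maxExposure;
        }

        return { static_cast<uint32_t>(exposure), std::min(gain, maxGain) };
}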

>
> For Librem 5 (and possibly PinePhone and similar), I'd like to snoop
> on frames provided to the application, and set up exposure
> accordingly. But I don't see any code in libcamera accessing image
> data.

We don't. All statistics are computed by the ISP, as we never really
considered a software-based implementation to be usable in practice.
You keep getting mentions of the GPU because it's not only about
statistics but also about performing computationally intensive
transformations on the image buffers, such as debayering raw frames
into formats consumable by users.

Are you working with YUV or RAW sensors?

>
> Closest I could get is Image class (src/apps/common/image.h) used by
> file sinks in cam application. Would it be acceptable to move that
> into libcamera and use it to peek at the pixels?

Here I assume you're using the simple pipeline handler. If you want a
component that implements AEGC, it needs to receive the frame data as
input and generate as output the sensor's control values for
CID_EXPOSURE and CID_ANALOGUE_GAIN. I would model that as a soft-IPA
module that receives the dmabuf of a just-completed frame, maps it,
computes statistics and uses them to generate control values for the
sensor.
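
As a rough sketch of the "map and measure" step (nothing below is
existing libcamera API; it assumes an 8-bit Y plane and samples only a
sparse subset of pixels, in the spirit of your 200-pixel estimate):

/* Illustrative only: map a dmabuf and compute a normalized brightness
 * from a sparse sample of an 8-bit luma plane. */
#include <cstddef>
#include <cstdint>
#include <sys/mman.h>

double measureBrightness(int dmabufFd, size_t length, size_t sampleStep)
{
        void *mem = mmap(nullptr, length, PROT_READ, MAP_SHARED, dmabufFd, 0);
        if (mem == MAP_FAILED)
                return -1.0;

        const uint8_t *data = static_cast<const uint8_t *>(mem);
        uint64_t sum = 0;
        unsigned int count = 0;

        /* Sample every sampleStep-th byte instead of the whole frame. */
        if (sampleStep == 0)
                sampleStep = 1;
        for (size_t off = 0; off < length; off += sampleStep) {
                sum += data[off];
                count++;
        }

        munmap(mem, length);
        return count ? static_cast<double>(sum) / (count * 255.0) : -1.0;
}

The result could then feed something like the AGC step sketched above,
and the computed values would be applied to the sensor subdevice.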

However, if you use RAW frames, either you do SW debayering, which is
rather intensive, or you can only produce RAW frames. If you use a YUV
sensor you get to compute statistics on a frame already debayered and
processed by the sensor (which might have an AEGC module on board), and
the result you get might be suboptimal (in that case you should rather
try to use the sensor's on-board ISP and tune its AGC/AWB routines).

Also, the requirement that for each frame you have to compute stats and
generate values will stall the pipeline and reduce the frame rate.
There are several hacks you could attempt, like computing stats only
every x frames, but it really sounds like a proof of concept at best.
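
The "every x frames" hack would be trivial, something along these lines
(again, purely illustrative):

/* Illustrative only: run the statistics pass on a subset of frames to
 * limit the per-frame CPU cost. */
#include <cstdint>

bool shouldComputeStats(uint32_t frameNumber, uint32_t interval)
{
        return interval != 0 && frameNumber % interval == 0;
}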

All of this is for AEGC, which produces values to be applied on the
-sensor-. What about AWB? Are you doing software AWB too? How are you
realizing it, by changing which parameters?

I would actually suggest considering the GPU, but seeing a pure
software implementation running would be nice for comparison. However,
I would be hesitant to have anything like that upstream, for the
reasons mentioned above.

Thanks
  j

>
> Best regards,
> 								Pavel
> --
> People of Russia, stop Putin before his war on Ukraine escalates.

