[libcamera-devel] Software gathering of image statistics for libcamera on Librem 5

Pavel Machek pavel at ucw.cz
Fri Dec 9 14:29:44 CET 2022


Hi!

> > Due to the small amount of computation, this is not really a job for the GPU.
> >
> > Thanks for rkisp pointer. I'm quite confused by that code, not knowing
> > much about rkisp. Does it have some special support in hardware, like
> > providing one stream of frames for application and second for
> > statistics gathering?
> 
> Not really through frame streams but through parameter buffers. ISPs
> do -not- produce frames (well, they do, but not from a user point of
> view; those are still captured through a video node). They are rather
> subdevices with at least (in the simplest use case)
> 
> 1) one video input stream (from camera sensor/CSI receiver)
> 2) one output video stream (to DMA engines)
> 3) one parameters input
> 4) one statistics output
> 
> Through 1) you queue RAW buffers, which will be crunched to produce
> statistics available from 4), with transformations applied to the
> frames according to the parameters set through 3). The transformed
> frames are written to memory and captured from 2).
> 
> ISPs do indeed perform transformations and corrections on the images
> (debayer, AWB etc.) but, more importantly, produce statistics.

Ok, thanks for the description.

> The IPA module you're looking at handles the closed control loop:
> upon receiving statistics from the ISP, it computes the parameters
> for the next frame using the statistics and control values from the
> sensor.

IPA == "image processing algorithms", right?

> > For Librem 5 (and possibly PinePhone and similar), I'd like to snoop
> > on frames provided to the application, and set up exposure
> > accordingly. But I don't see any code in libcamera accessing image
> > data.
> 
> We don't. All statistics are computed by the ISP, as we never really
> considered a software-based implementation to be actually usable in
> practice and you keep getting back mentions of GPU because it's not
> only about statistics but also about performing computationally
> intensive transformations on the image buffers, such as debayering
> raw frames into formats consumable by users.
> 
> Are you working with YUV or RAW sensors ?

RAW sensors are the important ones.

> > Closest I could get is Image class (src/apps/common/image.h) used by
> > file sinks in cam application. Would it be acceptable to move that
> > into libcamera and use it to peek at the pixels?
> 
> Here I assume you're using the simple pipeline handler. If you want
> a component that implements AEGC, you need to receive the frame data
> as input and generate the sensor's control values for CID_EXPOSURE
> and CID_ANALOGUE_GAIN as output. I would model that as a soft-IPA
> module that receives the dmabuf of a just-completed frame, maps it,
> computes statistics and uses them to generate control values for the
> sensor.

AEGC == "auto exposure gain contrast"?

Anyway, I'm a bit worried about mapping the frames once in libcamera
and a second time in the application. Syscalls are probably fast
enough, but it is kind of ugly.
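
For concreteness, here is a rough sketch of what the statistics and
exposure step of such a soft-IPA could look like on a mapped 8-bit RAW
buffer. All names are hypothetical; none of this is actual libcamera
API:

```c
#include <stdint.h>
#include <stddef.h>

/* Mean pixel level of a mapped 8-bit RAW buffer, sampling only every
 * `step`-th byte to keep the CPU cost low. */
static unsigned mean_level(const uint8_t *buf, size_t len, size_t step)
{
	uint64_t sum = 0;
	size_t n = 0;

	for (size_t i = 0; i < len; i += step) {
		sum += buf[i];
		n++;
	}
	return n ? (unsigned)(sum / n) : 0;
}

/* Proportional update: scale the current exposure by target/measured,
 * clamped to the sensor's limits. A real loop would also split the
 * correction between exposure time and analogue gain. */
static int next_exposure(int cur, unsigned measured, unsigned target,
			 int min, int max)
{
	int64_t next;

	if (measured == 0)
		return max;
	next = (int64_t)cur * target / measured;
	if (next < min)
		next = min;
	if (next > max)
		next = max;
	return (int)next;
}
```

The resulting value would then be applied to the sensor subdevice,
e.g. as V4L2_CID_EXPOSURE (and similarly for analogue gain).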

> However, if you use RAW frames, either you do SW debayering which is
> rather intensive, or you can only produce RAW frames. If you use YUV

SW debayering for preview is not a problem. You need to scale the
image down to fit the screen at the same time, so... you can do the
simple thing (R, (G1+G2)/2, B).
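
As a sketch of that simple thing, assuming an 8-bit RGGB mosaic (the
actual Bayer layout depends on the sensor):

```c
#include <stdint.h>

/* Naive preview debayer: collapse each 2x2 RGGB cell into one RGB
 * pixel, halving the resolution in each direction. Purely
 * illustrative; no attempt at proper interpolation. */
static void debayer_half(const uint8_t *raw, unsigned w, unsigned h,
			 uint8_t *rgb /* (w/2)*(h/2)*3 bytes */)
{
	for (unsigned y = 0; y + 1 < h; y += 2) {
		for (unsigned x = 0; x + 1 < w; x += 2) {
			const uint8_t r  = raw[y * w + x];
			const uint8_t g1 = raw[y * w + x + 1];
			const uint8_t g2 = raw[(y + 1) * w + x];
			const uint8_t b  = raw[(y + 1) * w + x + 1];
			uint8_t *out = &rgb[((y / 2) * (w / 2) + x / 2) * 3];

			out[0] = r;
			out[1] = (uint8_t)((g1 + g2) / 2);
			out[2] = b;
		}
	}
}
```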

> Also the requirement that for each frame you have to compute stats and
> generate values will stall the pipeline and reduce frame rate. There
> are several hacks you could attempt, like computing stats on every x
> frames, but it really sounds like a proof of concept at best.

Umm, a better hack would be to compute stats from only part of the
buffer (taking only every n-th pixel into account). That needs to be
done anyway when the photographer asks for "spot metering". I'm
confident good-enough statistics can be computed in 10% of a CPU...
and we have 4 CPUs available.
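
Something like this (illustrative only) would do both the subsampling
and the "spot" window:

```c
#include <stdint.h>

/* Windowed, subsampled metering: average only the pixels inside a
 * rectangle (the "spot"), visiting every n-th pixel in each
 * direction. With n == 4, only ~6% of the pixels are touched. */
static unsigned spot_mean(const uint8_t *buf, unsigned w,
			  unsigned x0, unsigned y0,
			  unsigned x1, unsigned y1, unsigned n)
{
	uint64_t sum = 0;
	unsigned count = 0;

	for (unsigned y = y0; y < y1; y += n)
		for (unsigned x = x0; x < x1; x += n) {
			sum += buf[y * w + x];
			count++;
		}
	return count ? (unsigned)(sum / count) : 0;
}
```

Full-frame metering is then just the degenerate case where the window
covers the whole buffer.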

> All of this for AEGC, which produces values to be applied on the
> -sensor-. What about AWB ? Are you doing software AWB too ? How
> are you realizing it, by changing what parameters ?

Yes, millipixels is doing software AWB. I'd not call it important (and
I don't know the details), but it is "fast enough". The code is here:

https://source.puri.sm/Librem5/millipixels/-/blob/master/quickpreview.c
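
For reference, the classic gray-world approach looks roughly like
this. I'm not claiming this is exactly what millipixels does:

```c
#include <stdint.h>

/* Generic gray-world AWB sketch: assume the scene averages to gray
 * and scale R and B so their channel sums match the green sum.
 * Gains are in 8.8 fixed point (256 == 1.0). */
struct awb_gains {
	uint16_t r;
	uint16_t b;
};

static struct awb_gains gray_world(uint64_t sum_r, uint64_t sum_g,
				   uint64_t sum_b)
{
	struct awb_gains g = { 256, 256 };

	if (sum_r)
		g.r = (uint16_t)(sum_g * 256 / sum_r);
	if (sum_b)
		g.b = (uint16_t)(sum_g * 256 / sum_b);
	return g;
}
```

The gains would be applied per channel during the preview debayer (or,
on hardware that has one, via the ISP's white-balance block).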

> I actually suggest to consider the GPU, but seeing a pure software
> implementation running would be nice for a comparison. However, I
> would be hesitant in having anything like that upstream for the above
> mentioned reasons.

Well, there's no other good place to put camera support :-(. We should
not reimplement AE/AF support in each application, and it would not be
acceptable in the kernel.

Best regards,
								Pavel

-- 
People of Russia, stop Putin before his war on Ukraine escalates.
