[libcamera-devel] [RFC PATCH 0/2] libcamera: converter: introduce software converter for debayering and AWB

Mattijs Korpershoek mkorpershoek at baylibre.com
Wed Sep 6 11:35:00 CEST 2023


Hi Jacopo,

On Tue, Sep 05, 2023 at 17:10, Jacopo Mondi <jacopo.mondi at ideasonboard.com> wrote:

> Hi Mattijs
>
> On Tue, Sep 05, 2023 at 04:01:41PM +0200, Mattijs Korpershoek wrote:
>> On Wed, Aug 30, 2023 at 17:01, Mattijs Korpershoek <mkorpershoek at baylibre.com> wrote:
>>
>> > Hi Jacopo,
>> >
>> > On Wed, Aug 30, 2023 at 16:38, Jacopo Mondi <jacopo.mondi at ideasonboard.com> wrote:
>> >
>> >> Hi Mattijs,
>> >>
>> >> On Wed, Aug 30, 2023 at 01:30:23PM +0200, Mattijs Korpershoek via libcamera-devel wrote:
>> >>> Hi Andrey,
>> >>>
>> >>> Thank you for this work.
>> >>>
>> >>> I'm very interested by this series, mostly because I would like to add
>> >>> YUYV -> NV12 pixel format conversion in this same class (to simplify
>> >>> some Android use cases I'm working on).
>> >>
>> >> Have you considered using libYUV for such task ? It also feels this
>> >> can be contained in the Android HAL layer where we already use libYUV ?
>> >
>> > Yes, using libYUV is what I plan to do for this.
>> >
>> > I will look into doing this only in the Android HAL layer but it feels
>> > easier to plumb this conversion as a Converter class.
>>
>> I've given this some more thought.
>>
>> If I implemented this as a Converter class but kept it in the
>> Android HAL layer, the simple pipeline wouldn't be able to use it (because we
>> don't want libcamera.so to depend on libcamera-hal.so).
>>
>> Some options I see:
>>
>> 1. keep converter_sw as part of libcamera.so and improve it with libyuv conversion
>>    -> adds a libyuv dependency, which could be made optional
>>
>> 2. add an Android-specific pipeline in libcamera-hal.so which would be
>>    based on the simple pipeline.
>>
>> Option 1 sounds more maintainable in the long run to me.
>>
>> Andrey, Bryan, Srinivas, I know this discussion is a bit out of scope for
>> the initial patch. If this generates too much noise for you, let me know
>> whether I should create another thread or drop you from this one.
>>
>
> In my head these are two rather distinct things. The softISP
> implementation basically provides an IPA module, which generates
> and consumes statistics to calculate the parameters used to tune the
> (software) image processing. It is plugged into the simple pipeline
> handler as a Converter and, when it comes to format conversion, will
> likely do RAW->YUV through debayering and color space conversion.

Ok

>
> Your use case seems very similar to what is already implemented as the
> PostProcessor class in the Android HAL, which so far only duplicates
> YUV streams but could be expanded to support format conversion between
> YUV formats. A long time has passed since the last time I worked on the
> HAL, and I might of course be missing something.

Yes. Last time I tried, I could not figure out how to allocate the
additional internal buffers needed for the conversion, but I will give it
another try. As a media novice, I have a hard time following everything
in the Android HAL layer.


>
> IOW I don't see the softISP dealing with YUV-to-YUV format conversion.

Yes, I agree on that part. I was thinking of using the softISP as a
"reference implementation" for another Converter-based class that does
my YUYV->NV12 conversion.
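
To make the YUYV->NV12 idea concrete, the conversion core itself is small
when delegated to libyuv; the Converter plumbing around it is the real work.
A rough sketch of that core (my own illustration, not existing libcamera
code; recent libyuv versions provide YUY2ToNV12(), and older ones can chain
YUY2ToI420() and I420ToNV12() instead; function and parameter names here
are made up):

#include <cstdint>

#include <libyuv.h>

/* Convert one packed YUYV frame to NV12 (full-size Y plane followed by a
 * half-height interleaved UV plane). Assumes even width and height. */
bool convertYuyvToNv12(const uint8_t *src, int srcStride,
                       uint8_t *dst, int dstStride, int width, int height)
{
	uint8_t *dstY = dst;
	uint8_t *dstUV = dst + dstStride * height;

	return libyuv::YUY2ToNV12(src, srcStride,
				  dstY, dstStride,
				  dstUV, dstStride,
				  width, height) == 0;
}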

>
> If you really want to explore the Converter class and add a plugin to
> the Simple pipeline that produces NV12 when your system can only do
> YUYV, I feel like this would be a different effort from this one?

I agree that if I go with the Converter class option, it makes more
sense to make a separate plugin/class.

I'll do some more work on this and I'll open another thread later on.

Thanks for your help!

>
>> >
>> >>
>> >> Thanks
>> >>    j
>> >>
>> >>>
>> >>> I'm not familiar enough with the libcamera codebase to judge how well
>> >>> this is implemented, but I'm looking forward to seeing comments on this patchset.
>> >>>
>> >>> On Sun, Aug 06, 2023 at 21:01, Andrey Konovalov via libcamera-devel <libcamera-devel at lists.libcamera.org> wrote:
>> >>>
>> >>> > Here is a draft implementation of Bayer demosaicing which follows the
>> >>> > Converter interface and runs on the CPU.
>> >>> >
>> >>> > It also includes a naive version of Grey World AWB, just for demo purposes
>> >>> > (to make the final image look a bit nicer). Otherwise, a Converter isn't
>> >>> > the right place for AWB - only the statistics should be gathered here,
>> >>> > and the rest belongs in an IPA module.
>> >>> >
>> >>> > Currently this software converter supports a single output only, but adding
>> >>> > a second stream for statistics is under consideration.
>> >>> >
>> >>> > As libcamera::Converter currently assumes a media device underneath the
>> >>> > converter, I wasn't able to avoid hacking the simple pipeline handler to make
>> >>> > it work with the software converter. For the same reason, ConverterFactory
>> >>> > is not used for now.
>> >>> >
>> >>> > Only the RAW10P format from the sensor is currently supported, but adding more
>> >>> > Bayer formats wouldn't be a problem. Out of the 10 bits, only the 8 most
>> >>> > significant ones are used to lessen the load on the CPU. Simple bilinear
>> >>> > interpolation is used for the same reason.
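
For reference, dropping RAW10P down to its 8 most significant bits is cheap
because of how the format is laid out: every 5 bytes carry 4 pixels, with
bytes 0-3 holding the 8 MSBs and byte 4 holding the packed 2-bit LSBs. A
minimal sketch of that step (illustrative only, not the code from this
series; function and parameter names are made up, and the width is assumed
to be a multiple of 4):

#include <cstdint>

/* Unpack one line of 10-bit packed raw Bayer (RAW10P) to 8 bits per pixel
 * by keeping only the 8 most significant bits of each sample. */
void unpackRaw10PLineTo8(const uint8_t *src, uint8_t *dst, unsigned int width)
{
	for (unsigned int x = 0; x < width; x += 4) {
		dst[x + 0] = src[0];
		dst[x + 1] = src[1];
		dst[x + 2] = src[2];
		dst[x + 3] = src[3];
		src += 5;	/* skip the byte holding the four 2-bit LSBs */
	}
}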
>> >>> >
>> >>> > AWB simplifications:
>> >>> > - a naive implementation of the "Grey World" algorithm: all pixels are used
>> >>> >   (the brightest and darkest pixels are not excluded from the calculations)
>> >>> > - to lessen the load on the CPU, it works on raw Bayer data and takes red
>> >>> >   values from red pixels, blue values from blue pixels, and green values from
>> >>> >   green pixels only.
>> >>> > - to lessen the load on the CPU, the red/green/blue gains calculated from the
>> >>> >   current frame are applied to the next frame. These gains are applied purely
>> >>> >   in software (no V4L2 controls are set).
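
For readers unfamiliar with grey-world AWB, the gain computation amounts to
scaling red and blue so that their per-channel averages match the green
average. A minimal sketch of that calculation over 8-bit Bayer data
(illustrative only, not the code from this series; an RGGB pixel layout,
even frame dimensions, and the made-up names below are assumptions):

#include <cstdint>

struct AwbGains {
	float red;
	float blue;
};

/* Grey-world gains computed over a whole frame; the caller would apply
 * them to the next frame, matching the scheme described above. */
AwbGains computeGreyWorldGains(const uint8_t *bayer, unsigned int width,
                               unsigned int height, unsigned int stride)
{
	uint64_t sumR = 0, sumG = 0, sumB = 0;

	for (unsigned int y = 0; y < height; y += 2) {
		const uint8_t *row0 = bayer + y * stride;
		const uint8_t *row1 = row0 + stride;

		for (unsigned int x = 0; x < width; x += 2) {
			sumR += row0[x];	/* R  */
			sumG += row0[x + 1];	/* Gr */
			sumG += row1[x];	/* Gb */
			sumB += row1[x + 1];	/* B  */
		}
	}

	/* There are twice as many green samples as red or blue ones, so
	 * halve the green sum before taking the ratios. */
	float refG = sumG / 2.0f;
	return { sumR ? refG / sumR : 1.0f,
		 sumB ? refG / sumB : 1.0f };
}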
>> >>> >
>> >>> > No performance analysis or tuning has been done yet. On the RB5 board this
>> >>> > software converter gives:
>> >>> >   ~ 5 fps at 3278x2462 (camera sensor runs at 15 fps)
>> >>> >   ~ 18..19 fps at 1918x1078 (out of 30 fps)
>> >>> >   ~ 18..19 fps at 1638x1230 (out of 30 fps)
>> >>> >   ~ 30 fps at 638x478 (out of 30 fps)
>> >>> > (The resolutions above are the output ones; the demosaic filter drops 1 pixel
>> >>> > from each side of the frame, so that 3280x2464 from the camera sensor becomes
>> >>> > 3278x2462, etc.)
>> >>>
>> >>> Could you detail a bit how this was tested? Did you validate this using
>> >>> "cam"?
>> >>>
>> >>> Also, is there a reason for testing this with real hardware?
>> >>>
>> >>> Can't this be validated/developed on top of the Virtual Media Controller
>> >>> Driver (vimc)?
>> >>>
>> >>> https://docs.kernel.org/admin-guide/media/vimc.html
>> >>>
>> >>> >
>> >>> > Andrey Konovalov (2):
>> >>> >   libcamera: converter: add software converter
>> >>> >   [DNI] libcamera: pipeline: simple: a hack to use software converter
>> >>> >     with qcom-camss
>> >>> >
>> >>> >  .../internal/converter/converter_softw.h      |  90 ++++
>> >>> >  src/libcamera/converter/converter_softw.cpp   | 430 ++++++++++++++++++
>> >>> >  src/libcamera/converter/meson.build           |   3 +-
>> >>> >  src/libcamera/pipeline/simple/simple.cpp      |  22 +-
>> >>> >  4 files changed, 542 insertions(+), 3 deletions(-)
>> >>> >  create mode 100644 include/libcamera/internal/converter/converter_softw.h
>> >>> >  create mode 100644 src/libcamera/converter/converter_softw.cpp
>> >>> >
>> >>> > --
>> >>> > 2.34.1

