[libcamera-devel] [PATCH] android: camera_device: Report supported JPEG sizes

Jacopo Mondi jacopo at jmondi.org
Thu Aug 6 15:23:06 CEST 2020


Hi Tomasz,

On Thu, Aug 06, 2020 at 01:04:47PM +0200, Tomasz Figa wrote:
> Hi Jacopo,
>
> On Wed, Aug 5, 2020 at 11:30 PM Jacopo Mondi <jacopo at jmondi.org> wrote:
> >
> > Hi Laurent,
> >    +Tomasz, Hiro and Han-Lin as there's a cros question at the end :)
> >    +Kieran for the Compressor interface part
> >
> > On Wed, Aug 05, 2020 at 07:53:41PM +0300, Laurent Pinchart wrote:
> > > Hi Jacopo,
> > >
> > > On Wed, Aug 05, 2020 at 05:37:45PM +0200, Jacopo Mondi wrote:
> > > > When producing the list of image resolutions to claim as supported by the
> > > > camera HAL, the JPEG stream was assumed to be 'always valid' as, at the
> > > > time, there was no JPEG support in place at all.
> > > >
> > > > With the introduction of support for JPEG compression, reporting
> > > > non-valid sizes as supported obviously causes trouble.
> > > >
> > > > In order to avoid reporting non-supported resolutions as supported,
> > > > produce the list of available JPEG sizes by using the ones supported
> > > > by the YCbCr_420_888 format, from which the JPEG stream is encoded.
> > > >
> > > > Signed-off-by: Jacopo Mondi <jacopo at jmondi.org>
> > > > ---
> > > >
> > > > Patch to be applied on top of Kieran's JPEG work.
> > > > ---
> > > >  src/android/camera_device.cpp | 38 +++++++++++++++++++++++++----------
> > > >  1 file changed, 27 insertions(+), 11 deletions(-)
> > > >
> > > > diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
> > > > index ec8ca934842a..6a9a038a2b53 100644
> > > > --- a/src/android/camera_device.cpp
> > > > +++ b/src/android/camera_device.cpp
> > > > @@ -398,27 +398,43 @@ int CameraDevice::initializeStreamConfigurations()
> > > >              */
> > > >             formatsMap_[androidFormat] = mappedFormat;
> > > >
> > > > +           /*
> > > > +            * Stop here for JPEG streams: the supported JPEG sizes will
> > > > +            * be computed later using the non-blob stream sizes recorded here.
> > > > +            */
> > > > +           if (androidFormat == HAL_PIXEL_FORMAT_BLOB)
> > > > +                   continue;
> > > > +
> > > >             for (const Size &res : cameraResolutions) {
> > > >                     cfg.pixelFormat = mappedFormat;
> > > >                     cfg.size = res;
> > > >
> > > > -                   CameraConfiguration::Status status = cameraConfig->validate();
> > > > -                   /*
> > > > -                    * Unconditionally report we can produce JPEG.
> > > > -                    *
> > > > -                    * \todo The JPEG stream will be implemented as an
> > > > -                    * HAL-only stream, but some cameras can produce it
> > > > -                    * directly. As of now, claim support for JPEG without
> > > > -                    * inspecting where the JPEG stream is produced.
> > > > -                    */
> > > > -                   if (androidFormat != HAL_PIXEL_FORMAT_BLOB &&
> > > > -                       status != CameraConfiguration::Valid)
> > > > +                   if (cameraConfig->validate() != CameraConfiguration::Valid)
> > > >                             continue;
> > > >
> > > >                     streamConfigurations_.push_back({ res, androidFormat });
> > > >             }
> > > >     }
> > > >
> > > > +   /*
> > > > +    * Insert the JPEG sizes by using the ones recorded for YUV streams
> > > > +    * from which JPEG is produced.
> > > > +    */
> > > > +   std::vector<Camera3StreamConfiguration> jpegConfigurations;
> > > > +   jpegConfigurations.reserve(cameraResolutions.size());
> > > > +
> > > > +   for (const auto &config : streamConfigurations_) {
> > > > +           /* \todo JPEG can be produced from other formats too! */
> > >
> > > Another todo item, the android.scaler.availableStreamConfigurations
> > > documentation lists required resolutions (see
> > > https://android.googlesource.com/platform/system/media/+/refs/heads/master/camera/docs/docs.html
> > > which very annoyingly googlesource.com can't display as html in a web browser).
> > >
> > > JPEG  android.sensor.info.activeArraySize     Any
> > > JPEG  1920x1080 (1080p)                       Any     if 1080p <= activeArraySize
> > > JPEG  1280x720 (720)                          Any     if 720p <= activeArraySize
> > > JPEG  640x480 (480p)                          Any     if 480p <= activeArraySize
> > > JPEG  320x240 (240p)                          Any     if 240p <= activeArraySize
> > >
> >
> > Support for those is already mandatory, if I'm not mistaken.
> >
> > We feed cameraResolutions to all the non-JPEG formats here:
> >
> >                 for (const Size &res : cameraResolutions) {
> >                         cfg.pixelFormat = mappedFormat;
> >                         cfg.size = res;
> >
> >                         if (cameraConfig->validate() != CameraConfiguration::Valid)
> >                                 continue;
> >
> >                         streamConfigurations_.push_back({ res, androidFormat });
> >                 }
> >
> > Then we use the sizes associated with the mandatory
> > HAL_PIXEL_FORMAT_YCbCr_420_888 format to add entries for JPEG. Have I
> > missed something in this part?
> >
> > Looking at the code now, we could create an initJpeg() function, as
> > the JPEG streams are basically just skipped now and initialized at
> > the end of the function.
> >
> > Maybe it's really time to break this file apart; it's already quite
> > big and dense. What if we delegated the JPEG initialization part to
> > the Compressor interface (looking at it now, maybe we should have
> > gone for Encoder as a name)?
> >
> > As of now it would basically require two more methods: a way to get
> > a metadata pack with the values of
> > ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES and ANDROID_JPEG_MAX_SIZE
> > initialized, and a way to get a vector of supported sizes given the
> > vector of sizes collected so far.
> >
> > Then we'll probably need a factory to create the right Compressor, but
> > that's for later, and we should probably first discuss how to decide
> > "which" encoder to initialize before getting to the "how".
> >
> > For example, how does it work today in CrOS, and what are the
> > requirements going forward?
>
> If you mean JPEG resolution advertisement, I think it depends on the
> camera hardware.
>
> If it's a UVC camera, it often has higher resolution capture
> capabilities with MJPEG than with YUV. At the same time, Android
> requires YUV capture at "RECORD" resolutions, which are typically
> higher (e.g. 1920x1080) than what a UVC camera can output at
> reasonable framerate without resorting to MJPEG, so we have to
> advertise any resolution supported by MJPEG as supported for YUV
> streams and decode the MJPEG frames in the HAL to provide YUV frames
> to the user.
>
> For RAW cameras, the JPEG streams are usually provided from YUV
> streams anyway, so I believe reporting all the resolutions supported
> by YUV as supported by JPEG should be correct.

Thanks, that's useful to know.
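
For reference, the approach the patch takes can be sketched standalone.
This is a simplified model: Size, StreamConfiguration and the format
constants below are stand-ins for the real libcamera and Android HAL
types, not the actual API.

```cpp
#include <vector>

/* Stand-ins for the Android HAL pixel format constants. */
constexpr int kFormatYCbCr420 = 0x23; /* HAL_PIXEL_FORMAT_YCbCr_420_888 */
constexpr int kFormatBlob = 0x21;     /* HAL_PIXEL_FORMAT_BLOB (JPEG) */

struct Size {
	unsigned int width;
	unsigned int height;

	bool operator==(const Size &other) const
	{
		return width == other.width && height == other.height;
	}
};

struct StreamConfiguration {
	Size size;
	int format;
};

/*
 * Mirror the patch's approach: JPEG (BLOB) sizes are not probed
 * directly, but inherited from the sizes already validated for the
 * YCbCr_420_888 format, from which the JPEG stream is encoded.
 */
std::vector<StreamConfiguration>
deriveJpegConfigurations(const std::vector<StreamConfiguration> &configs)
{
	std::vector<StreamConfiguration> jpegConfigs;

	for (const StreamConfiguration &cfg : configs) {
		/* \todo JPEG could be produced from other formats too. */
		if (cfg.format != kFormatYCbCr420)
			continue;

		jpegConfigs.push_back({ cfg.size, kFormatBlob });
	}

	return jpegConfigs;
}
```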

>
> If you mean JPEG encoder selection, we have the encoding abstracted
> using a generic userspace interface exposed from Chrome (called JEA -
> JpegEncodeAccelerator), which automatically uses whatever is available
> in the system. Currently it provides three implementations: V4L2, VAAPI
> and software.

Yep, I meant this part. My first thought was a factory in our camera
HAL, but the criteria for deciding which encoder to instantiate should
be defined first. Otherwise the alternative is for downstream to
maintain a version "which does the right thing", which I think we want
to avoid.
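
To make the factory idea concrete, here is a minimal sketch. The
Encoder class names and the haveHardwareCodec selection criterion are
hypothetical, not the actual libcamera interface; the real criteria are
exactly what would need to be discussed first.

```cpp
#include <memory>
#include <string>

/* Minimal stand-in for the Compressor/Encoder interface under discussion. */
class Encoder {
public:
	virtual ~Encoder() = default;
	virtual std::string name() const = 0;
};

/* Software fallback implementation. */
class EncoderLibJpeg : public Encoder {
public:
	std::string name() const override { return "libjpeg"; }
};

/* Hardware-backed implementation. */
class EncoderV4L2Jpeg : public Encoder {
public:
	std::string name() const override { return "v4l2-jpeg"; }
};

/*
 * Hypothetical factory: prefer a hardware encoder when one is
 * available, fall back to the software implementation otherwise.
 */
std::unique_ptr<Encoder> createEncoder(bool haveHardwareCodec)
{
	if (haveHardwareCodec)
		return std::make_unique<EncoderV4L2Jpeg>();

	return std::make_unique<EncoderLibJpeg>();
}
```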

Thanks
  j

>
> Best regards,
> Tomasz

