[libcamera-devel] [PATCH] android: yuv: Fix wrong access of source buffer

Hirokazu Honda hiroh at chromium.org
Thu Aug 5 17:21:41 CEST 2021


Hi Laurent,

On Fri, Aug 6, 2021 at 12:02 AM Laurent Pinchart
<laurent.pinchart at ideasonboard.com> wrote:
>
> Hi Hiro,
>
> Thank you for the patch.
>
> On Thu, Aug 05, 2021 at 10:28:05PM +0900, Hirokazu Honda wrote:
> > The file descriptors of the FrameBuffer planes given to
> > PostProcessor::process() all point to the same buffer. To access the
> > beginning of the second or a later plane, an offset therefore has to
> > be added to the start of the buffer.
> > Fix the wrong access to the second plane of NV12 in
> > PostProcessorYuv.
>
> This is an issue that comes from V4L2, where NV12, a multi-planar
> format, can be stored with the two planes either contiguously in the
> same dmabuf, or in separate dmabufs. We have two V4L2 pixel formats
> for those (for NV12, that's V4L2_PIX_FMT_NV12 and V4L2_PIX_FMT_NV12M),
> and two APIs: the V4L2 single-planar API, which only supports the
> single-dmabuf format, and the V4L2 multi-planar API, which supports
> either format.
>
> This patch will work with V4L2_PIX_FMT_NV12 but not with
> V4L2_PIX_FMT_NV12M. Let's not force applications to deal with this
> mess; I don't want to expose it through the libcamera API. We should
> instead handle it in the MappedBuffer class, so that maps()[1] returns
> the correct value.
>
> In order to support both NV12 and NV12M, we'll have to add an offset
> field to FrameBuffer::Plane, but in the short term, as we only support
> the single-dmabuf formats in our pipeline handlers, I think we can
> hardcode the calculation below in the MappedBuffer class (I haven't
> tried it though, so there may be something I'm missing).
>

Yeah, that is what I would like to discuss through this patch.
I agree with handling this in MappedBuffer.
The current libcamera implementation doesn't handle multi-planar formats;
I filed https://bugs.libcamera.org/show_bug.cgi?id=9 about it previously.
Shall I also add strides() to MappedBuffer or FrameBuffer?
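
For reference, here is a rough, untested sketch of the kind of offset
handling I have in mind for MappedBuffer, assuming all planes share a
single contiguous dmabuf (the helper name and parameters below are
hypothetical, not existing libcamera API):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    /*
     * Compute per-plane start addresses within a single mapping when
     * all FrameBuffer planes are stored contiguously in one dmabuf
     * (e.g. V4L2_PIX_FMT_NV12).
     */
    std::vector<uint8_t *>
    contiguousPlaneAddresses(uint8_t *base,
                             const std::vector<unsigned int> &strides,
                             const std::vector<unsigned int> &heights)
    {
            std::vector<uint8_t *> planes;
            size_t offset = 0;

            for (size_t i = 0; i < strides.size(); ++i) {
                    planes.push_back(base + offset);
                    offset += static_cast<size_t>(strides[i]) * heights[i];
            }

            return planes;
    }

For NV12 the inputs would be { stride, stride } for the strides and
{ height, height / 2 } for the plane heights, which is the same
calculation as in the patch below.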

Regards,
-Hiro
> > Signed-off-by: Hirokazu Honda <hiroh at chromium.org>
> > ---
> >  src/android/yuv/post_processor_yuv.cpp | 7 +++++--
> >  1 file changed, 5 insertions(+), 2 deletions(-)
> >
> > diff --git a/src/android/yuv/post_processor_yuv.cpp b/src/android/yuv/post_processor_yuv.cpp
> > index 772e805b..3b801e96 100644
> > --- a/src/android/yuv/post_processor_yuv.cpp
> > +++ b/src/android/yuv/post_processor_yuv.cpp
> > @@ -62,9 +62,12 @@ int PostProcessorYuv::process(const FrameBuffer &source,
> >               return -EINVAL;
> >       }
> >
> > -     int ret = libyuv::NV12Scale(sourceMapped.maps()[0].data(),
> > +     const uint8_t *sourceY = sourceMapped.maps()[0].data();
> > +     const uint8_t *sourceUV = sourceY + sourceStride_[0] * sourceSize_.height;
> > +
> > +     int ret = libyuv::NV12Scale(sourceY,
> >                                   sourceStride_[0],
> > -                                 sourceMapped.maps()[1].data(),
> > +                                 sourceUV,
> >                                   sourceStride_[1],
> >                                   sourceSize_.width, sourceSize_.height,
> >                                   destination->plane(0).data(),
>
> --
> Regards,
>
> Laurent Pinchart

