[libcamera-devel] [PATCH v2 00/13] libcamera: ipu3: Refactoring of ImgU

Laurent Pinchart laurent.pinchart at ideasonboard.com
Mon Jun 29 14:08:30 CEST 2020


Hello,

On Mon, Jun 29, 2020 at 12:29:40PM +0200, Tomasz Figa wrote:
> On Mon, Jun 29, 2020 at 12:22 PM Niklas Söderlund wrote:
> > On 2020-06-29 11:55:47 +0200, Tomasz Figa wrote:
> > > On Sun, Jun 28, 2020 at 2:15 AM Niklas Söderlund wrote:
> > > >
> > > > Hi,
> > > >
> > > > This series does some cleanup of the IPU3 pipeline. It removes dead
> > > > or almost dead code, replacing it with something more coherent with
> > > > what is done elsewhere in the pipeline. It also breaks the ImgU code
> > > > out into separate .cpp and .h files.
> > > >
> > > > It also reworks how the streams are identified and handled inside
> > > > the pipeline. When the pipeline was first developed only the output
> > > > and viewfinder streams were available; now we have a RAW stream in
> > > > addition to them. Some of the abstraction around streams and the
> > > > hardware resources they are backed by made sense in that earlier
> > > > context, but not so much once RAW is added to the mix. The biggest
> > > > difference in this rework is that ImgUDevice gets explicit functions
> > > > to configure each of its sinks, instead of a single generic one whose
> > > > behaviour depends on which pointer is passed to it. This makes code
> > > > where RAW and ImgU streams are mixed much nicer to read, IMHO.
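
The gist of that change, as I read it, is roughly the following sketch.
The class, function and type names below are purely illustrative, not the
actual contents of the patches:

  /*
   * Illustrative sketch only: one explicit configuration entry point per
   * ImgU sink, so a call site reads "configure the viewfinder" rather
   * than "configure whatever this pointer happens to point at".
   */
  #include <cstdio>

  struct SinkConfig {
          unsigned int width;
          unsigned int height;
  };

  class ImgUDevice
  {
  public:
          int configureOutput(const SinkConfig &cfg)
          {
                  return applyFormat("output", cfg);
          }

          int configureViewfinder(const SinkConfig &cfg)
          {
                  return applyFormat("viewfinder", cfg);
          }

  private:
          int applyFormat(const char *sink, const SinkConfig &cfg)
          {
                  /* The real code would set the format on the video node
                   * backing the sink; this stub only prints it. */
                  printf("%s: %ux%u\n", sink, cfg.width, cfg.height);
                  return 0;
          }
  };

  int main()
  {
          ImgUDevice imgu;
          imgu.configureOutput({ 1920, 1080 });
          imgu.configureViewfinder({ 640, 480 });
          return 0;
  }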
> > >
> > > CIO2 is quite a generic capture interface, which could be handled by
> > > some platform-independent code. Have we considered splitting the
> > > pipeline into a generic capture part and platform-specific processing
> > > part? I think it could be useful on its own, as the capture part of
> > > all the SoCs could be handled by generic code, but I also suspect that
> > > the ability to split the pipeline into multiple smaller pipelines and
> > > then join them together could be very useful when dealing with many
> > > SoC generations, since vendors quite often replace some parts of the
> > > pipeline with new hardware, while keeping the others unchanged.
> > >
> > > What do you think?
> >
> > I think it is a good idea, and something that has been lurking in the
> > back of my head for some time. I'm even entertaining the thought of
> > whether we could leverage the simple pipeline handler here somehow. I
> > still have to dig into the code to see what is possible and what is
> > not, and what is a good idea and what is not :-)

Likewise, I've been thinking we should handle it in a more generic way.
I thought there would be a chance to share code with the RPi pipeline
handler as it also handles a CSI-2 receiver and a mem-to-mem ISP, but
the driver for the CSI-2 receiver is video-node-centric there, while
CIO2 is MC-centric. I've thus postponed working on this until we get a
second pipeline handler that could share code with IPU3.
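
To make the idea a bit more concrete, the kind of platform-independent
capture helper I have in mind could look something like the sketch below.
The interface and names are hypothetical; a real design would have to
accommodate both MC-centric and video-node-centric receivers:

  /*
   * Hypothetical interface for a shared CSI-2 capture helper.  Both an
   * MC-centric receiver (CIO2) and a video-node-centric one (as on the
   * RPi) would implement it, so the platform-independent part of a
   * pipeline handler could drive either.  The stub types stand in for
   * the real libcamera classes.
   */
  #include <memory>
  #include <vector>

  struct FrameBuffer {};
  struct StreamConfiguration {};

  class CsiReceiver
  {
  public:
          virtual ~CsiReceiver() = default;

          virtual int configure(const StreamConfiguration &cfg) = 0;
          virtual int exportBuffers(unsigned int count,
                                    std::vector<std::unique_ptr<FrameBuffer>> *buffers) = 0;
          virtual int queueBuffer(FrameBuffer *buffer) = 0;
          virtual int start() = 0;
          virtual int stop() = 0;
  };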

> Cool. Please let me know if I could be useful in brainstorming various
> ideas.
> 
> > Right now though, I think the most important area where we can help
> > pipeline handlers is around the Request object and how internal buffers
> > (RAW, Statistics and Parameters) are attached to it when, for example,
> > the RAW buffer could come either from the application or from an
> > internal pool. This will be a common problem for all pipelines and it
> > would be nice if the core could help out here.
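
For illustration, the kind of decision every pipeline handler currently
has to make on its own looks roughly like the sketch below. The pool and
helper function are made up for the example and are not existing
libcamera API:

  #include <deque>

  struct FrameBuffer {};

  /* Made-up minimal pool of internally allocated buffers. */
  class BufferPool
  {
  public:
          void add(FrameBuffer *buffer) { free_.push_back(buffer); }

          FrameBuffer *acquire()
          {
                  if (free_.empty())
                          return nullptr;

                  FrameBuffer *buffer = free_.front();
                  free_.pop_front();
                  return buffer;
          }

          void release(FrameBuffer *buffer) { free_.push_back(buffer); }

  private:
          std::deque<FrameBuffer *> free_;
  };

  /*
   * When queuing a request: use the RAW buffer the application attached
   * to the request if there is one, otherwise fall back to the internal
   * pool and remember to recycle the buffer when the request completes.
   */
  FrameBuffer *selectRawBuffer(FrameBuffer *fromRequest, BufferPool &pool,
                               bool *internal)
  {
          if (fromRequest) {
                  *internal = false;
                  return fromRequest;
          }

          *internal = true;
          return pool.acquire();
  }
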
> 
> Agreed. It was more of just a long term note. I agree that this series
> makes for a good overall improvement on its own.
> 
> > > > Lastly, the assumption that buffers must be allocated on video nodes
> > > > that are not involved in the capture session is being challenged.
> > > > This was true a year ago but no longer seems to be the case. Changing
> > > > this simplifies the driver enormously and saves memory that would
> > > > otherwise be wasted. I have really tried to force the end result to
> > > > fail by resetting the hardware between tests, so that no video node
> > > > configuration from a previous session saves the day.
> > > >
> > > > For both sensors I have reset the hardware and then successfully
> > > > tested the following capture combinations. After the first capture
> > > > session for each entry in the list, done right after a power cycle,
> > > > all the other captures were tried in a semi-random order and always
> > > > succeeded.
> > > >
> > > >   cam -c 1 -s role=viewfinder -C
> > > >   cam -c 1 -s role=still -C
> > > >   cam -c 1 -s role=still -s role=viewfinder -C
> > > >   cam -c 1 -s role=still -s role=viewfinder -s role=stillraw -C
> > > >   cam -c 1 -s role=stillraw -C
> > > >
> > > > Niklas Söderlund (13):
> > > >   libcamera: ipu3: Remove unused name_ field from IPU3Stream
> > > >   libcamera: ipu3: Import instead of allocate statistic buffers
> > > >   libcamera: ipu3: Always import buffers for ImgU sinks
> > > >   libcamera: ipu3: Remove usage of IPU3CameraData from ImgUDevice
> > > >   libcamera: ipu3: imgu: Move the ImgUDevice class to separate files
> > > >   libcamera: ipu3: imgu: Do not cache index
> > > >   libcamera: ipu3: imgu: Mark things that are internal as private
> > > >   libcamera: ipu3: imgu: Use specific functions to configure each sink
> > > >   libcamera: ipu3: Do not duplicate data in IPU3Stream
> > > >   libcamera: ipu3: Remove the active flag from IPU3Stream
> > > >   libcamera: ipu3: Remove IPU3Stream
> > > >   libcamera: ipu3: imgu: Remove ImgUOutput
> > > >   libcamera: ipu3: imgu: Use unique_ptr for video and subdevices
> > > >
> > > >  src/libcamera/pipeline/ipu3/imgu.cpp    | 353 +++++++++++++++
> > > >  src/libcamera/pipeline/ipu3/imgu.h      |  87 ++++
> > > >  src/libcamera/pipeline/ipu3/ipu3.cpp    | 548 +++---------------------
> > > >  src/libcamera/pipeline/ipu3/meson.build |   1 +
> > > >  4 files changed, 505 insertions(+), 484 deletions(-)
> > > >  create mode 100644 src/libcamera/pipeline/ipu3/imgu.cpp
> > > >  create mode 100644 src/libcamera/pipeline/ipu3/imgu.h
> > > >

-- 
Regards,

Laurent Pinchart

