[libcamera-devel] Debayering

paul.elder at ideasonboard.com
Fri Nov 12 10:25:55 CET 2021


Hello,

On Fri, Nov 12, 2021 at 02:09:43AM +0200, Laurent Pinchart wrote:
> Hello,
> 
> On Thu, Nov 11, 2021 at 11:56:07AM +0000, Kieran Bingham wrote:
> > Hi Jacopo, Dorota,
> > 
> > + Added Siyuan and Paul to Cc directly
> > 
> > Quoting Jacopo Mondi (2021-11-10 20:32:29)
> > > Hi Dorota,
> > > 
> > > On Wed, Nov 10, 2021 at 02:01:23PM +0100, Dorota Czaplejewicz wrote:
> > > > Hi all,
> > > >
> > > > I'm making sure that Librem 5 cameras can be easily used via
> > > > libcamera. One of the steps that are needed for the "easily" part is
> > > > having a built-in debayering mechanism.
> > > >
> > > > Before I jump head-in, I wanted to ask for some guidance about where
> > > > to place this support, and how to expose it to the consumer.
> > > >
> > > > For the simplest implementation, supported cameras could add the RGB
> > > > format to the list of supported formats, and transparently convert
> > > > the bayer data to RGB if configured for RGB. That opens up a
> > > > question about the opt-in mechanism, but is otherwise simple enough.
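
To make the "transparently convert" step concrete, below is a minimal
sketch of such a conversion for 8-bit RGGB data. This is an illustration
only, with hypothetical names; it is not code from libcamera or from the
patches discussed later in this thread.

#include <cstddef>
#include <cstdint>

/*
 * Minimal nearest-neighbour debayer for 8-bit RGGB data, written for
 * clarity rather than speed. Each 2x2 Bayer quad
 *
 *     R G
 *     G B
 *
 * is collapsed into one RGB colour that is replicated over the quad.
 * A real implementation would interpolate (at least bilinearly) and
 * handle 10/12-bit and packed raw formats.
 */
void debayerRGGB8(const uint8_t *raw, uint8_t *rgb,
                  size_t width, size_t height, size_t stride)
{
    for (size_t y = 0; y < height; y += 2) {
        for (size_t x = 0; x < width; x += 2) {
            const uint8_t *quad = raw + y * stride + x;
            uint8_t r = quad[0];
            uint8_t g = (quad[1] + quad[stride]) / 2;
            uint8_t b = quad[stride + 1];

            /* Replicate the colour over the 2x2 block. */
            for (size_t dy = 0; dy < 2; dy++) {
                for (size_t dx = 0; dx < 2; dx++) {
                    uint8_t *out = rgb + ((y + dy) * width + (x + dx)) * 3;
                    out[0] = r;
                    out[1] = g;
                    out[2] = b;
                }
            }
        }
    }
}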
> > 
> > I think this should be 'opt in' only by the platform (not the 'camera'
> > or 'sensor'). To me that means it's a pipeline handler decision as to
> > whether it exposes a software/GPU ISP. (A CPU ISP should only be
> > considered a shortcut to GPU acceleration later, I believe.)
> > 
> > The Camera would only expose the outputs supported by the capabilities
> > of the (Soft)ISP, just as on other platforms the Camera only exposes the
> > features supported by the HW ISP.
> > 
> > This can be handled either by the SimplePipelineInfo if we just use the
> > simple pipeline handler, or, if the platform is complex enough to
> > necessitate a SoftISP, by a dedicated pipeline handler (where the
> > SoftISP is just an internal re-usable component).
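
As a rough illustration of the opt-in being a pipeline handler decision,
the format enumeration could look like the sketch below. All names here
are hypothetical stand-ins, not existing libcamera API:

#include <vector>

/* Stand-in types; in libcamera these would be PixelFormat and a real
 * software ISP class. All names here are hypothetical. */
struct PixelFormat {
    unsigned int fourcc;
};

struct SoftIsp {
    /* Formats the soft ISP can produce from the sensor's raw output. */
    std::vector<PixelFormat> outputFormats() const
    {
        return { PixelFormat{ 0 /* placeholder fourcc */ } };
    }
};

/*
 * The pipeline handler advertises processed formats only when the
 * platform instantiated a soft ISP, so the opt-in is a pipeline
 * handler decision and stays invisible to the application.
 */
std::vector<PixelFormat> supportedFormats(const std::vector<PixelFormat> &sensorFormats,
                                          const SoftIsp *softIsp)
{
    std::vector<PixelFormat> formats = sensorFormats; /* raw Bayer */

    if (softIsp) {
        for (const PixelFormat &fmt : softIsp->outputFormats())
            formats.push_back(fmt);
    }

    return formats;
}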
> 
> Kieran is reading my mind :-)
> 
> > > > But it comes with a downside: the raw bayer frames are lost, and the
> > > > setting cannot be changed at runtime, even though it's a pure
> > > > software operation. To overcome that, an API other than
> > > > configuration would have to be used.
> > > >
> > > > I intend to start by adding a CPU-based debayering implementation
> > > > for the sake of building it up quickly. Would a simple
> > > > implementation like the one I outlined above be suitable for
> > > > upstreaming?
> > 
> > That would depend on how it's architected. I don't think we'd like
> > internal 'quick hacks' to be merged.
> > 
> > > For the record, this seems potentially similar to other 'plugins' like
> > > the software JPEG encoder we have in the HAL. It would be lovely to
> > > find a proper place to put them :)
> > 
> > It is certainly similar, but I think there's a key difference: this is
> > more of a required piece of functionality for a (complex) camera
> > pipeline.
> > 
> > Otherwise we may as well tell applications to just read the raw output
> > from the CSI2 receiver themselves ...
> > 
> > > I used to think of software post-processors like these as more like
> > > external plugins to libcamera, something applications could easily pipe
> > > their streams through instead of receiving processed data from an
> > > additional software-generated stream from the Camera, but I would be
> > > more than happy to have my mind changed.
> > > 
> > > If I got your proposal right, the sw-debayer would be an opt-in
> > > mechanism for pipeline handlers, which, in case they cannot perform
> > > debayering through an ISP, would create an additional 'virtual' stream
> > > that applications can select transparently from the Camera. The ideal
> > > target would be the simple pipeline handler, as it is meant to support
> > > platforms without an ISP.
> > 
> > Agreed, the Simple Pipeline Handler could probably wrap most of the
> > work generically with a SoftISP/SoftIPA.
> > 
> > You call it a 'virtual stream' though. Are the outputs of the IPU3 IMGU
> > 'virtual'? Are these different?
> > 
> > > The first concern I have with such a design is the processing time
> > > software debayering could take: unlike JPEG, which is triggered only
> > > for still capture requests, the debayering would need to be applied to
> > > every frame. Is this correct?
> > 
> > Yes, I can't imagine this not being for every frame.
> > 
> > Note that even JPEG could run for every frame, to support the MJPEG
> > streams found on webcams ...
> > (but I'm not saying MJPEG should be part of this).
> > 
> > > I cannot quantify how long it could take (and I guess it also depends
> > > on the platform capabilities and the system load) but it doesn't seem
> > > impossible that it could create stalls in the pipeline, requiring the
> > > pipeline handler to drop frames and reduce the frame rate. Do you have
> > > any idea about the possible numbers here?
> > 
> > It would certainly be lower performance if the implementation uses the
> > CPU. This could later be abstracted to use the GPU when available too,
> > internally in the SoftISP component.
> > 
> > Any low-performance CPU option IMO would just be a stepping stone to
> > getting to a hardware accelerated (GPU) version.
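
Rather than guessing at the numbers Jacopo asks about, a trivial harness
around the debayerRGGB8() sketch above would measure the per-frame CPU
cost on a given platform (again purely illustrative):

#include <chrono>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

/* Measures the per-frame cost of the debayerRGGB8() sketch above on
 * this machine, instead of guessing at numbers. */
int main()
{
    constexpr size_t width = 1920, height = 1080, frames = 100;
    std::vector<uint8_t> raw(width * height, 0);
    std::vector<uint8_t> rgb(width * height * 3);

    auto start = std::chrono::steady_clock::now();
    for (size_t i = 0; i < frames; i++)
        debayerRGGB8(raw.data(), rgb.data(), width, height, width);
    auto end = std::chrono::steady_clock::now();

    std::chrono::duration<double, std::milli> total = end - start;
    std::printf("%.2f ms/frame over %zu frames at %zux%zu\n",
                total.count() / frames, frames, width, height);
    return 0;
}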
> > 
> > > I'm also a bit skeptical that implementing opt-in at the pipeline
> > > handler level is the right decision. As I understood it, such an opt-in
> > > mechanism should be driven by applications rather than be
> > > platform-specific, even if I understand that for the simple pipeline
> > > handler this looks like something desirable on all platforms.
> > 
> > I disagree here. I don't think this should be driven by applications.
> > 
> > Here's what we currently have...
> > 
> > ┌─────────────────────────────────────────────┐  ┌────────────────────┐
> > │  libcamera                                  │  │                    │
> > │ ┌─────────────────────────────────────────┐ │  │ Camera Application │
> > │ │      Simple Pipeline Handler            │ │  │                    │
> > │ │   ┌────────┐    ┌───────────┐           │ │  │                    │
> > │ │   │        │    │ CSI2      │           │ │  │                    │
> > │ │   │ Sensor ├───►│  Receiver ├───────────┼─┼──┼─►                  │
> > │ │   │        │    │           │           │ │  │                    │
> > │ │   └────────┘    └───────────┘           │ │  │                    │
> > │ │                                         │ │  │                    │
> > │ └─────────────────────────────────────────┘ │  │                    │
> > │                                             │  │                    │
> > └─────────────────────────────────────────────┘  └────────────────────┘
> > 
> > Although it has an 'optional' SimpleConverter, so it's also this:
> > 
> > ┌─────────────────────────────────────────────┐  ┌────────────────────┐
> > │  libcamera                                  │  │                    │
> > │ ┌─────────────────────────────────────────┐ │  │ Camera Application │
> > │ │      Simple Pipeline Handler            │ │  │                    │
> > │ │   ┌────────┐  ┌───────────┐  ┌────────┐ │ │  │                    │
> > │ │   │        │  │ CSI2      │  │Simple  │ │ │  │                    │
> > │ │   │ Sensor ├─►│  Receiver ├─►│        ├─┼─┼──┼──►                 │
> > │ │   │        │  │           │  │Convert │ │ │  │                    │
> > │ │   └────────┘  └───────────┘  └────────┘ │ │  │                    │
> > │ │                                         │ │  │                    │
> > │ └─────────────────────────────────────────┘ │  │                    │
> > │                                             │  │                    │
> > └─────────────────────────────────────────────┘  └────────────────────┘
> > 
> > And for platforms that have a raw sensor if this isn't handled by the
> > pipeline handler we would end up with this:
> > 
> > ┌─────────────────────────────────────────────┐  ┌────────────────────┐
> > │  libcamera                                  │  │                    │
> > │ ┌─────────────────────────────────────────┐ │  │ Camera Application │
> > │ │                                         │ │  │                    │
> > │ │      Simple Pipeline Handler            │ │  │                    │
> > │ │   ┌────────┐    ┌───────────┐           │ │  │   ┌──────────┐     │
> > │ │   │        │    │ CSI2      │           │ │  │   │ 3a       │     │
> > │ │   │ Sensor ├───►│  Receiver ├───────────┼─┼──┼─► │ debayer  │     │
> > │ │   │        │    │           │           │ │  │   │          │     │
> > │ │   └───▲────┘    └───────────┘           │ │  │   └─────┬────┘     │
> > │ │       │                                 │ │  │         │          │
> > │ │       └─────────────────────────────────┼─┼──┼─────────┘          │
> > │ │                                         │ │  │                    │
> > │ └─────────────────────────────────────────┘ │  │                    │
> > │                                             │  │                    │
> > └─────────────────────────────────────────────┘  └────────────────────┘
> > 
> > (even if maybe the 3a/debayer is some other library?)
> > 
> > Does that match what you have suggested? (I'm not 100% sure)
> > 
> > If you consider an application like Cheese (which, granted, uses our
> > gstreamer element), it doesn't know how to debayer, and it doesn't know
> > how to feed back to the sensor that "... oh ... it's a bit too bright,
> > can you lower the gain..."
> > 
> > I don't believe a layer adding that functionality should live between
> > libcamera and the application. That is functionality that an application
> > should /expect/ libcamera to provide.
> > 
> > For any platform with no ISP - but a CSI2 receiver and a raw/bayer
> > sensor connected ... we want to do some processing. That means stats to
> > get the autoexposure/gain, and debayering. It doesn't all have to be at
> > once ... gain/exposure could be manual to start with for instance...
> > 
> > But I really believe it needs to be this, with the SoftISP block being
> > similar in nature to the SimpleConverter: a block that can be optionally
> > used if the platform 'requires' it:
> > 
> > 
> > ┌─────────────────────────────────────────────┐  ┌────────────────────┐
> > │  libcamera                                  │  │                    │
> > │                                             │  │ Camera Application │
> > │ ┌─────────────────────────────────────────┐ │  │                    │
> > │ │    Simple-ish Pipeline Handler          │ │  │                    │
> > │ │ ┌────────┐    ┌───────────┐  ┌───────┐  │ │  │                    │
> > │ │ │        │    │ CSI2      │  │ CPU   │  │ │  │                    │
> > │ │ │ Sensor ├───►│  Receiver ├─►┤  ISP  ├──┼─┼──┼─►                  │
> > │ │ │        │    │           │  │ GPU   │  │ │  │                    │
> > │ │ └────▲───┘    └───────────┘  └────┬──┘  │ │  │                    │
> > │ │      │                            │     │ │  │                    │
> > │ └──────┼────────┬───────────┬───────┼─────┘ │  │                    │
> > │        │        ├───────────┤       │       │  │                    │
> > │        └────────┤ SimpleIPA │◄──────┘       │  │                    │
> > │                 └───────────┘               │  │                    │
> > │                                             │  │                    │
> > └─────────────────────────────────────────────┘  └────────────────────┘
> > 
> > Of course this can still pass through the raw stream directly to the
> > application if that's what it really wants ... (just as we handle raw
> > streams in RPi or .. IPU3?* )
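
The SimpleIPA box in that last diagram is where the "too bright, lower
the gain" feedback would live. As a very rough sketch (hypothetical
names, not an existing libcamera interface), even a mean-brightness AE
loop would be a workable starting point:

#include <algorithm>
#include <cstddef>
#include <cstdint>

/*
 * Hypothetical sketch of the SimpleIPA feedback path in the diagram:
 * compute the mean luma of a frame and nudge the sensor's analogue
 * gain towards a mid-grey target. A real IPA would work on (hardware
 * or soft ISP) statistics instead of walking the whole frame, and
 * would control exposure time as well as gain.
 */
double meanLuma(const uint8_t *frame, size_t size)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < size; i++)
        sum += frame[i];
    return static_cast<double>(sum) / size;
}

double adjustGain(double gain, double luma)
{
    constexpr double target = 128.0; /* mid-grey for 8-bit data */
    constexpr double step = 0.1;     /* damping, to avoid oscillation */

    gain *= 1.0 + step * (target - luma) / target;
    return std::clamp(gain, 1.0, 16.0); /* sensor-dependent limits */
}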
> 
> That's how I was envisioning this. It should be up to the pipeline

That's what I was imagining too.

> handler how to build a pipeline from the building blocks available on
> the platform. So far we only support dedicated hardware devices (ISPs
> for IPU3, RkISP1 and RPi, and a V4L2 memory-to-memory format converter
> and scaler for the simple pipeline handler). Some platforms that lack an
> ISP but have a powerful enough GPU should be able to use a GPU-based
> software ISP implementation, and a CPU-based reference implementation
> could also be useful for development. Further in the future, I could
> also imagine a platform with a hardware ISP using the GPU for additional
> processing, before and/or after ISP processing.
> 
> Let's also note that some platforms (I'm thinking about some i.MX SoCs)
> lack an ISP but can compute bayer statistics in hardware. The design of
> the soft ISP should take this into account, to make software statistics
> calculation optional when hardware statistics are available.

Siyuan's current interface design has statistics and image processing
as separate components, so this should be supported out of the box.
We'll have to solidify the requirements of the interface, though, to
mandate that soft ISP implementations keep those two separate, without
hidden dependencies under the hood.
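
As an illustration of that separation, the two components could be
independent interfaces, so that a pipeline handler can drop the software
statistics stage when the hardware computes statistics itself. The
sketch below is hypothetical and does not reflect the actual patches on
the list:

#include <cstddef>
#include <cstdint>

/* Hypothetical interfaces only, to illustrate the separation being
 * discussed; they are not the interface from the patches. */
struct Stats {
    uint64_t histogram[256]; /* e.g. a luma histogram for AE */
};

/* Computes statistics from raw Bayer data. A platform with hardware
 * statistics (e.g. some i.MX SoCs) would skip this component entirely
 * and feed the IPA from the hardware block instead. */
class StatsEngine
{
public:
    virtual ~StatsEngine() = default;
    virtual Stats compute(const uint8_t *raw, size_t size) = 0;
};

/* Converts raw Bayer data to a processed format. Must not depend on
 * the StatsEngine, so either component can be replaced on its own. */
class Debayerer
{
public:
    virtual ~Debayerer() = default;
    virtual void process(const uint8_t *raw, uint8_t *out,
                         size_t width, size_t height) = 0;
};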

> 
> >   *side note - do raw streams work on IPU3 currently?

afaik they do, but only the IPU3 formats (and maybe the packed 8, 10,
and 12-bit formats?).


Paul

> > 
> > I'm wary about how/who/where the required additional buffers get
> > allocated, though.
> > 
> > > There are other possible developments which might be worth a
> > > comparison with.
> > > 
> > > There is an implementation of a SW ISP on the list from Siyuan. My
> > > understanding is that it should target format conversion too and not
> > > just statistics handling. I've cc-ed Paul and Siyuan in case they want
> > > to chime in.
> > 
> > Added to cc.
> > 
> > > Laurent is also working on an API to perform stream reprocessing. The
> > > usual use case is to have one RAW frame, as captured from the camera,
> > > 'reprocessed' through the capture pipeline. For example, on Android one
> > > of the most common reprocessing operations is through a JPEG encoder to
> > > implement ZSL, so I'm not sure if it applies 100%, but it might be
> > > worth exploring whether, instead of an additional virtual 'output'
> > > stream from the Camera, we could represent a debayer as an 'input'
> > > stream to the Camera.
> > 
> > I'm sure Laurent can comment here, and I'm happy to be proven wrong but
> > I'm not sure reprocessing can fit here as it's not - reprocessing - it's
> > just ... processing...
> 
> A reprocessing API could fit here in the sense that an application could
> capture raw frames and send them back to the pipeline handler for
> processing by the soft ISP, the same way it could on a platform with a
> hardware ISP, but the base use case is capturing processed frames
> without a reprocessing API. That's what I think we need to implement
> first.
> 
> > > Can I ask how you have envisioned your capture pipeline? I
> > > assume you have a raw sensor connected to a pipeline handler that has
> > > no ISP. What component would be driving the adjustment of the sensor
> > > parameters in such an architecture?
> > 
> > I believe that's the core design and architecture of libcamera, and
> > that's why something like the last diagram above should be considered
> > as the starting point when designing this.

