[libcamera-devel] [PATCH] libcamera: camera: Document the camera and pipeline model

Laurent Pinchart laurent.pinchart at ideasonboard.com
Tue Nov 24 09:52:42 CET 2020


Hi Kieran,

On Mon, Nov 09, 2020 at 02:44:35PM +0000, Kieran Bingham wrote:
> On 03/11/2020 02:05, Laurent Pinchart wrote:
> > Introduce a pipeline model that lists the operations applied by the
> > camera pipeline. This is a first step towards defining explicitly how
> > the camrera processes images, and how the libcamera controls affect the
> 
> s/camrera/camera/
> 
> > processing.
> > 
> > The initial list of operations is meant to be expanded, and possibly
> > refactored (a block diagram should also be considered to make this
> > easier to read). How the controls affect the pipeline is largely missing
> > at this stage, with only a short explanation of the digital zoom to show
> > how this is meant to be documented. More documentation will be added
> > over time.
> > 
> > Signed-off-by: Laurent Pinchart <laurent.pinchart at ideasonboard.com>
> > ---
> >  src/libcamera/camera.cpp | 92 +++++++++++++++++++++++++++++++++-------
> >  1 file changed, 76 insertions(+), 16 deletions(-)
> > 
> > diff --git a/src/libcamera/camera.cpp b/src/libcamera/camera.cpp
> > index dffbd6bd5a10..eff999ec322a 100644
> > --- a/src/libcamera/camera.cpp
> > +++ b/src/libcamera/camera.cpp
> > @@ -23,22 +23,82 @@
> >   * \file camera.h
> >   * \brief Camera device handling
> >   *
> > - * At the core of libcamera is the camera device, combining one image source
> > - * with processing hardware able to provide one or multiple image streams. The
> > - * Camera class represents a camera device.
> > - *
> > - * A camera device contains a single image source, and separate camera device
> > - * instances relate to different image sources. For instance, a phone containing
> > - * front and back image sensors will be modelled with two camera devices, one
> > - * for each sensor. When multiple streams can be produced from the same image
> > - * source, all those streams are guaranteed to be part of the same camera
> > - * device.
> > - *
> > - * While not sharing image sources, separate camera devices can share other
> > - * system resources, such as an ISP. For this reason camera device instances may
> > - * not be fully independent, in which case usage restrictions may apply. For
> > - * instance, a phone with a front and a back camera device may not allow usage
> > - * of the two devices simultaneously.
> > + * \page camera-model Camera Model
> > + *
> > + * libcamera acts as a middleware between applications and camera hardware. It
> > + * provides a solution to an unsolvable problem: reconciling applications,
> 
> Are we really solving an unsolvable problem? Doesn't that by definition
> mean it's solvable ? :-)

It's magiiiiic ;-)

> > + * which need to run on different systems without dealing with device-specific
> > + * details, and camera hardware, which exhibits a wide variety of features,
> > + * limitations and architecture variations. In order to do so, it creates an
> > + * abstract camera model that hides the camera hardware from applications. The
> > + * model is designed to strike the right balance between genericity, to please
> > + * generic applications, and flexibililty, to expose even the most specific
> 
> s/flexibililty/flexibility/
> 
> > + * hardware features to the most demanding applications.
> > + *
> > + * In libcamera, a Camera is defined as a device that can capture frames
> > + * continuously from a camera sensor and store them in memory. If supported by
> > + * the device and desired by the application, the camera may store each
> > + * captured frame in multiple copies, possibly in different formats and sizes.
> > + * Each of these memory outputs of the camera is called a Stream.
> > + *
> > + * A camera contains a single image source, and separate camera instances
> > + * relate to different image sources. For instance, a phone containing front
> > + * and back image sensors will be modelled with two cameras, one for each
> > + * sensor. When multiple streams can be produced from the same image source,
> > + * all those streams are guaranteed to be part of the same camera.
> > + *
> > + * While not sharing image sources, separate cameras can share other system
> > + * resources, such as ISPs. For this reason camera instances may not be fully
> > + * independent, in which case usage restrictions may apply. For instance, a
> > + * phone with a front and a back camera may not allow usage of the two cameras
> > + * simultaneously.
> > + *
> > + * The camera model defines an implicit pipeline, whose input is the camera
> > + * sensor, and whose outputs are the streams. Along the pipeline, the frames
> > + * produced by the camera sensor are transformed by the camera into a format
> > + * suitable for applications, with image processing that improves the quality
> > + * of the captured frames. The camera exposes a set of controls that
> > + * application may use to manually control the processing steps. This
> 
> s/application/applications/
> 
> > + * high-level camera model is the minimum baseline that all cameras must
> > + * conform to.
> > + *
> > + * \section camera-pipeline-model Pipeline Model
> > + *
> > + * Camera hardware differs in the supported image processing operations and the
> > + * order in which they are applied. The libcamera pipelines abstract the
> > + * hardware differences and expose a logical view of the processing operations
> > + * with a fixed order. This offers low-level control of those operations to
> > + * applications, while keeping application code generic.
> > + *
> > + * Starting from the camera sensor, a pipeline applies the following
> > + * operations, in that order.
> > + *
> > + * - Pixel exposure
> > + * - Analog to digital conversion and readout
> > + * - Black level subtraction
> > + * - Defective pixel correction
> > + * - Lens shading correction
> > + * - Spatial noise filtering
> > + * - Per-channel gains (white balance)
> > + * - Demosaicing (color filter array interpolation)
> > + * - Color correction matrix (typically RGB to RGB)
> > + * - Gamma correction
> > + * - Color space transformation (typically RGB to YUV)
> > + * - Cropping
> > + * - Scaling
> 
> Do we really define that they happen in that order? Are there
> implications if pipelines "don't"
> 
> > + *
> > + * Not all cameras implement all operations, and they are not necessarily
> > + * implemented in the above order at the hardware level. The libcamera pipeline
> > + * handlers translate the pipeline model to the real hardware configuration.
> 
> er ... ok ... I'm not (yet) quite sure of the relevance of stating the
> order of operations if the pipelines can still do what they like ;-)

The pipeline model defines a logical view of the pipeline, which
requires operations to be applied in the above order from the point of
view of an external observer. The hardware implementation may differ,
but the observable effect must match the pipeline model.

> > + *
> > + * \subsection digital-zoom Digital Zoom
> > + *
> > + * Digital zoom is implemented as a combination of the cropping and scaling
> > + * stages of the pipeline. Cropping is controlled explicitly through the
> > + * controls::ScalerCrop control, while scalling is controlled implicitly based
> 
> s/scalling/scaling/
> 
> > + * on the crop rectangle and the output stream size. The crop rectangle is
> > + * expressed relatively to the full pixel array size and indicates how the field
> > + * of view is affected by the pipeline.
> >   */
> 
> With the small fixups above:
> 
> Reviewed-by: Kieran Bingham <kieran.bingham at ideasonboard.com>
> 
> >  
> >  namespace libcamera {

-- 
Regards,

Laurent Pinchart

