[libcamera-devel] [PATCH 2/3] Documentation: fix typos

Jacopo Mondi jacopo at jmondi.org
Wed May 4 09:11:42 CEST 2022


Hi Quentin

On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
> From: Quentin Schulz <quentin.schulz at theobroma-systems.com>
>
> A few typos made it to the docs, so let's fix them.
>
> Cc: Quentin Schulz <foss+libcamera at 0leil.net>
> Signed-off-by: Quentin Schulz <quentin.schulz at theobroma-systems.com>
> ---
>  Documentation/guides/introduction.rst     |  2 +-
>  Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
>  2 files changed, 15 insertions(+), 15 deletions(-)
>
> diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
> index d080679f..07f66881 100644
> --- a/Documentation/guides/introduction.rst
> +++ b/Documentation/guides/introduction.rst
> @@ -221,7 +221,7 @@ Camera Device
>    of producing one or more image streams, and provides the API to interact with
>    the underlying device.
>
> -  If a system has multiple instances of the same hardware attached, each has it's
> +  If a system has multiple instances of the same hardware attached, each has its

ack

>    own instance of the camera class.
>
>    The API exposes full control of the device to upper layers of libcamera through
> diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
> index a7208f57..989b0163 100644
> --- a/Documentation/guides/pipeline-handler.rst
> +++ b/Documentation/guides/pipeline-handler.rst
> @@ -75,7 +75,7 @@ Prerequisite knowledge: libcamera architecture
>  ----------------------------------------------
>
>  A pipeline handler makes use of the following libcamera classes to realize the
> -functionalities descibed above. Below is a brief overview of each of those:
> +functionalities described above. Below is a brief overview of each of those:

ack

>
>  .. TODO: (All) Convert to sphinx refs
>  .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
> @@ -405,7 +405,7 @@ Creating camera devices
>  If the pipeline handler successfully matches with the system it is running on,
>  it can proceed to initialization, by creating all the required instances of the
>  ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
> -classes. If the Pipeline handler supports an ISP, it can then also Initialise
> +classes. If the Pipeline handler supports an ISP, it can then also initialise
>  the IPA module before proceeding to the creation of the Camera devices.
>
>  An image ``Stream`` represents a sequence of images and data of known size and
> @@ -687,8 +687,8 @@ and validated to adjust it to a supported configuration. This may involve
>  adjusting the formats or image sizes or alignments for example to match the
>  capabilities of the device.
>
> -Applications may choose to repeat validation stages, adjusting paramters until a
> -set of validated StreamConfigurations are returned that is acceptable for the
> +Applications may choose to repeat validation stages, adjusting parameters until
> +a set of validated StreamConfigurations are returned that is acceptable for the

ack

>  applications needs. When the pipeline handler receives a valid camera
>  configuration it can use the image stream configurations to apply settings to
>  the hardware devices.
> @@ -765,11 +765,11 @@ example (with only one stream), the pipeline handler always returns the same
>  configuration, inferred from the underlying V4L2VideoDevice.
>
>  How it does this is shown below, but examination of the more full-featured
> -pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
> +pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more

ack

>  complex examples.
>
>  To generate a ``StreamConfiguration``, you need a list of pixel formats and
> -frame sizes which supported outputs of the stream. You can fetch a map of the
> +frame sizes supported by outputs of the stream. You can fetch a map of the

I think what was meant is

To generate a ``StreamConfiguration``, you need a list of pixel formats and
frame sizes which are supported as outputs of the stream. You can fetch a map of the


>  ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
>  but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
>  type to pass to applications. We do this here using ``std::transform`` to
> @@ -811,9 +811,9 @@ Continue adding the following code to support this:
>     StreamConfiguration cfg(formats);
>
>  As well as a list of supported StreamFormats, the StreamConfiguration is also
> -expected to provide an initialised default configuration. This may be arbitrary,
> -but depending on use case you may which to select an output that matches the
> -Sensor output, or prefer a pixelformat which might provide higher performance on
> +expected to provide an initialised default configuration. This may be arbitrary,

ack

> +but depending on use case you may wish to select an output that matches the

'wish' or simply 'want'

> +Sensor output, or prefer a pixelFormat which might provide higher performance on

I would rather use ``PixelFormat`` as the term indicates a libcamera
type

>  the hardware. The bufferCount represents the number of buffers required to
>  support functional continuous processing on this stream.
>
> @@ -826,7 +826,7 @@ support functional continuous processing on this stream.
>  Finally add each ``StreamConfiguration`` generated to the
>  ``CameraConfiguration``, and ensure that it has been validated before returning
>  it to the application. With only a single supported stream, this code adds only
> -a single StreamConfiguration however a StreamConfiguration should be added for
> +a single StreamConfiguration. However a StreamConfiguration should be added for
>  each supported role in a device that can handle more streams.

ack

>
>  Add the following code to complete the implementation of
> @@ -841,7 +841,7 @@ Add the following code to complete the implementation of
>     return config;
>
>  To validate a camera configuration, a pipeline handler must implement the
> -`CameraConfiguration::validate()`_ function in it's derived class to inspect all
> +`CameraConfiguration::validate()`_ function in its derived class to inspect all

ack

>  the stream configuration associated to it, make any adjustments required to make
>  the configuration valid, and return the validation status.
>
> @@ -1372,9 +1372,9 @@ classes documentation.
>  .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
>
>  In order to notify applications about the availability of new frames and data,
> -the ``Camera`` device exposes two ``Signals`` which applications can connect to
> -be notified of frame completion events. The ``bufferComplete`` signal serves to
> -report to applications the completion event of a single ``Stream`` part of a
> +the ``Camera`` device exposes two ``Signals`` to which applications can connect

not sure the 'to' is required. Native speakers?

However, these are all minor comments; if you agree, they can be
addressed when applying.

Reviewed-by: Jacopo Mondi <jacopo at jmondi.org>

Thanks
  j

> +to be notified of frame completion events. The ``bufferComplete`` signal serves
> +to report to applications the completion event of a single ``Stream`` part of a
>  ``Request``, while the ``requestComplete`` signal notifies the completion of all
>  the ``Streams`` and data submitted as part of a request. This mechanism allows
>  implementation of partial request completion, which allows an application to
> --
> 2.35.1
>