[libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only enabled embedded stream when available

Naushir Patuck naush at raspberrypi.com
Mon Feb 22 14:40:38 CET 2021


Hi Laurent,

Thank you for your review feedback.

On Sun, 21 Feb 2021 at 23:08, Laurent Pinchart <laurent.pinchart at ideasonboard.com> wrote:

> Hi Naush,
>
> Thank you for the patch.
>
> s/enabled/enable/ in the subject ?
>
> On Thu, Feb 18, 2021 at 05:01:25PM +0000, Naushir Patuck wrote:
> > The pipeline handler would enable and use the Unicam embedded data stream
> > even if the sensor did not support it. This was to allow a means to
> > pass exposure and gain values for the frame to the IPA in a synchronised
> > way.
> >
> > The recent changes to get the pipeline handler to pass a ControlList
> > with exposure and gain values means this is no longer required. Disable
> > the use of the embedded data stream when a sensor does not support it.
>
> Nice :-)
>
> > This change also removes the mappedEmbeddedBuffers_ map as it is no
> > longer used.
> >
> > Signed-off-by: Naushir Patuck <naush at raspberrypi.com>
> > ---
> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 112 +++++++++++-------
> >  1 file changed, 70 insertions(+), 42 deletions(-)
> >
> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > index b43d86166c63..d969c77993eb 100644
> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > @@ -177,12 +177,6 @@ public:
> >       /* Stores the ids of the buffers mapped in the IPA. */
> >       std::unordered_set<unsigned int> ipaBuffers_;
> >
> > -     /*
> > -      * Map of (internal) mmaped embedded data buffers, to avoid having to
> > -      * map/unmap on every frame.
> > -      */
> > -     std::map<unsigned int, MappedFrameBuffer> mappedEmbeddedBuffers_;
> > -
> >       /* DMAHEAP allocation helper. */
> >       RPi::DmaHeap dmaHeap_;
> >       FileDescriptor lsTable_;
> > @@ -636,14 +630,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
> >
> >               if (isRaw(cfg.pixelFormat)) {
> >                       cfg.setStream(&data->unicam_[Unicam::Image]);
> > -                     /*
> > -                      * We must set both Unicam streams as external, even
> > -                      * though the application may only request RAW frames.
> > -                      * This is because we match timestamps on both streams
> > -                      * to synchronise buffers.
> > -                      */
> >                       data->unicam_[Unicam::Image].setExternal(true);
> > -                     data->unicam_[Unicam::Embedded].setExternal(true);
> >                       continue;
> >               }
> >
> > @@ -715,17 +702,6 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
> >               return ret;
> >       }
> >
> > -     /* Unicam embedded data output format. */
> > -     format = {};
> > -     format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> > -     LOG(RPI, Debug) << "Setting embedded data format.";
> > -     ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
> > -     if (ret) {
> > -             LOG(RPI, Error) << "Failed to set format on Unicam embedded: "
> > -                             << format.toString();
> > -             return ret;
> > -     }
> > -
> >       /* Figure out the smallest selection the ISP will allow. */
> >       Rectangle testCrop(0, 0, 1, 1);
> >       data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);
> > @@ -742,6 +718,41 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
> >       if (ret)
> >               LOG(RPI, Error) << "Failed to configure the IPA: " << ret;
> >
> > +     /*
> > +      * The IPA will set data->sensorMetadata_ to true if embedded data is
> > +      * supported on this sensor. If so, open the Unicam embedded data
> > +      * node and configure the output format.
> > +      */
> > +     if (data->sensorMetadata_) {
> > +             format = {};
> > +             format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> > +             LOG(RPI, Debug) << "Setting embedded data format.";
> > +             data->unicam_[Unicam::Embedded].dev()->open();
>
> The device is opened here, but never closed. Should that be fixed ? What
> are the drawbacks in opening the device in match() as done today, even
> if not used ?
>

We rely on the V4L2VideoDevice destructor to close down all the device
nodes.  I take it this is not the way to do it? :-)  If not, where would
you advise putting this call?  Since we call open() in match(), it does
not seem right to close the node in stop().  I did have a quick scan of
the rkisp1 and ipu3 pipelines, and they do not seem to call close() on
devices either, so no hints there.
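
One option I can think of, purely as a sketch (RPiCameraData does not
have an explicit destructor today, so that part is an assumption), would
be to close the nodes on teardown:

    /* Hypothetical: close all stream device nodes explicitly. */
    RPiCameraData::~RPiCameraData()
    {
            for (auto const stream : streams_) {
                    if (stream->dev()->isOpen())
                            stream->dev()->close();
            }
    }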

The reason why we don't open the embedded data node in match() like we
used to is because of the kernel driver.  In order to synchronise buffers
between the embedded data and image data nodes, the driver requires us to
only open the embedded data node if it is actually used, i.e. we cannot
keep it open and simply not queue buffers to it to "disable" the stream.
Of course, this could change in the driver, but it does add more
complexity to the buffer sync logic.  Perhaps this should be noted as a
\todo to fix in the future.
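
For example, something along these lines (the wording is just a sketch):

    /*
     * \todo The kernel driver requires the embedded data node to be
     * closed when unused, so that it can synchronise buffers between
     * the image and embedded data nodes. If the driver ever allows an
     * open but inactive node, go back to opening the node in match().
     */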


>
> > +             ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
> > +             if (ret) {
> > +                     LOG(RPI, Error) << "Failed to set format on Unicam embedded: "
> > +                                     << format.toString();
> > +                     return ret;
> > +             }
> > +
> > +             /*
> > +              * If a RAW/Bayer stream has been requested by the application,
> > +              * we must set both Unicam streams as external, even though the
> > +              * application may only request RAW frames. This is because we
> > +              * match timestamps on both streams to synchronise buffers.
> > +              */
> > +             if (rawStream)
> > +                     data->unicam_[Unicam::Embedded].setExternal(true);
> > +     } else {
> > +             /*
> > +              * No embedded data present, so we do not want to iterate over
> > +              * the embedded data stream when starting and stopping.
> > +              */
> > +             data->streams_.erase(std::remove(data->streams_.begin(), data->streams_.end(),
> > +                                              &data->unicam_[Unicam::Embedded]),
> > +                                  data->streams_.end());
>
> Hmmmm... Given that only one sensor, and thus one camera, is supported
> by this pipeline handler, this should work, but conceptually it's not
> very nice. Isn't this a decision that should be made at match() time,
> not configure() time ? It would be right to do this here if the code was
> modelled to support multiple sensors, but in that case we would need to
> close() the embedded data video node in appropriate locations, and also
> add the embedded stream back to data->streams_ when metadata is present.
>

Fully agree.  I did want this to be handled in match(), but as you said,
we do not have the information about sensor metadata support there.


>
> One option to move this to match() time would be for the IPA to return
> if the sensor supports metadata from init() instead of configure(). That
> would require passing the sensor name to the IPA in init() too.
>

> It's the mix n' match that bothers me I think. I don't mind if we decide
> to model the pipeline handler and IPA with the assumption that there can
> only be a single camera, with a hardcoded pipeline known from the very
> beginning, or in a more dynamic way that could allow runtime switching
> between sensors, but it would be nice if the architecture of the
> pipeline handler and IPA was consistent relative to the model we pick.
> Does this make sense ?
>

Yes, I do agree with this as well.  I am happy for ipa->init() to pass
back the required parameters so that match() can open the embedded data
node if required.  I presume we have all the tools needed to do this with
the IPA interfaces change to use mojom definitions?  If so, I can update
the signature of ipa->init() to pass in the sensor name and return the
"SensorConfig" parameters.
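
For example, the raspberrypi.mojom change could look something like this
(the struct name and fields below are only a first guess at what we'd
need):

    struct SensorConfig {
            bool sensorMetadata;
            uint32 gainDelay;
            uint32 exposureDelay;
    };

    interface IPARPiInterface {
            init(libcamera.IPASettings settings, string sensorName)
                    => (int32 ret, SensorConfig sensorConfig);
            ...
    };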

Regards,
Naush



>
> > +     }
> > +
> >       /*
> >        * Update the ScalerCropMaximum to the correct value for this camera mode.
> >        * For us, it's the same as the "analogue crop".
> > @@ -949,10 +960,16 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
> >       for (auto &stream : data->isp_)
> >               data->streams_.push_back(&stream);
> >
> > -     /* Open all Unicam and ISP streams. */
> > +     /*
> > +      * Open all Unicam and ISP streams. The exception is the embedded data
> > +      * stream, which only gets opened if the IPA reports that the sensor
> > +      * supports embedded data. This happens in RPiCameraData::configureIPA().
> > +      */
> >       for (auto const stream : data->streams_) {
> > -             if (stream->dev()->open())
> > -                     return false;
> > +             if (stream != &data->unicam_[Unicam::Embedded]) {
> > +                     if (stream->dev()->open())
> > +                             return false;
> > +             }
> >       }
> >
> >       /* Wire up all the buffer connections. */
> > @@ -1109,19 +1126,13 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera)
> >                       return ret;
> >       }
> >
> > -     if (!data->sensorMetadata_) {
> > -             for (auto const &it : data->unicam_[Unicam::Embedded].getBuffers()) {
> > -                     MappedFrameBuffer fb(it.second, PROT_READ | PROT_WRITE);
> > -                     data->mappedEmbeddedBuffers_.emplace(it.first, std::move(fb));
> > -             }
> > -     }
> > -
> >       /*
> >        * Pass the stats and embedded data buffers to the IPA. No other
> >        * buffers need to be passed.
> >        */
> >       mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), ipa::RPi::MaskStats);
> > -     mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);
> > +     if (data->sensorMetadata_)
> > +             mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);
>
> Maybe a bit of line wrap ? :-)
>
> >
> >       return 0;
> >  }
> > @@ -1154,7 +1165,6 @@ void PipelineHandlerRPi::freeBuffers(Camera *camera)
> >       std::vector<unsigned int> ipaBuffers(data->ipaBuffers_.begin(), data->ipaBuffers_.end());
> >       data->ipa_->unmapBuffers(ipaBuffers);
> >       data->ipaBuffers_.clear();
> > -     data->mappedEmbeddedBuffers_.clear();
> >
> >       for (auto const stream : data->streams_)
> >               stream->releaseBuffers();
> > @@ -1652,7 +1662,7 @@ void RPiCameraData::tryRunPipeline()
> >
> >       /* If any of our request or buffer queues are empty, we cannot proceed. */
> >       if (state_ != State::Idle || requestQueue_.empty() ||
> > -         bayerQueue_.empty() || embeddedQueue_.empty())
> > +         bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))
> >               return;
> >
> >       if (!findMatchingBuffers(bayerFrame, embeddedBuffer))
> > @@ -1675,17 +1685,24 @@ void RPiCameraData::tryRunPipeline()
> >       state_ = State::Busy;
> >
> >       unsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);
> > -     unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);
> >
> >       LOG(RPI, Debug) << "Signalling signalIspPrepare:"
> > -                     << " Bayer buffer id: " << bayerId
> > -                     << " Embedded buffer id: " << embeddedId;
> > +                     << " Bayer buffer id: " << bayerId;
> >
> >       ipa::RPi::ISPConfig ispPrepare;
> > -     ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;
> >       ispPrepare.bayerBufferId = ipa::RPi::MaskBayerData | bayerId;
> > -     ispPrepare.embeddedBufferPresent = true;
> >       ispPrepare.controls = std::move(bayerFrame.controls);
> > +
> > +     if (embeddedBuffer) {
> > +             unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);
> > +
> > +             ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;
> > +             ispPrepare.embeddedBufferPresent = true;
> > +
> > +             LOG(RPI, Debug) << "Signalling signalIspPrepare:"
> > +                             << " Embedded buffer id: " << embeddedId;
> > +     }
> > +
> >       ipa_->signalIspPrepare(ispPrepare);
> >  }
> >
> > @@ -1727,6 +1744,17 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> >
> >                       LOG(RPI, Debug) << "Could not find matching embedded buffer";
> >
> > +                     if (!sensorMetadata_) {
> > +                             /*
> > +                              * If there is no sensor metadata, simply return the
> > +                              * first bayer frame in the queue.
> > +                              */
> > +                             LOG(RPI, Debug) << "Returning bayer frame without a match";
> > +                             bayerQueue_.pop();
> > +                             embeddedBuffer = nullptr;
> > +                             return true;
> > +                     }
> > +
> >                       if (!embeddedQueue_.empty()) {
> >                               /*
> >                                * Not found a matching embedded buffer for the bayer buffer in
>
> --
> Regards,
>
> Laurent Pinchart
>