<div dir="ltr"><div dir="ltr">Hi Laurent,<div><br></div><div>Thank you for your review feedback.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, 21 Feb 2021 at 23:08, Laurent Pinchart <<a href="mailto:laurent.pinchart@ideasonboard.com">laurent.pinchart@ideasonboard.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi Naush,<br>
<br>
Thank you for the patch.<br>
<br>
s/enabled/enable/ in the subject ?<br>
<br>
On Thu, Feb 18, 2021 at 05:01:25PM +0000, Naushir Patuck wrote:<br>
> The pipeline handler would enable and use the Unicam embedded data stream<br>
> even if the sensor did not support it. This was to allow a means to<br>
> pass exposure and gain values for the frame to the IPA in a synchronised<br>
> way.<br>
> <br>
> The recent changes to get the pipeline handler to pass a ControlList<br>
> with exposure and gain values means this is no longer required. Disable<br>
> the use of the embedded data stream when a sensor does not support it.<br>
<br>
Nice :-)<br>
<br>
> This change also removes the mappedEmbeddedBuffers_ map as it is no<br>
> longer used.<br>
> <br>
> Signed-off-by: Naushir Patuck <<a href="mailto:naush@raspberrypi.com" target="_blank">naush@raspberrypi.com</a>><br>
> ---<br>
> .../pipeline/raspberrypi/raspberrypi.cpp | 112 +++++++++++-------<br>
> 1 file changed, 70 insertions(+), 42 deletions(-)<br>
> <br>
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> index b43d86166c63..d969c77993eb 100644<br>
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp<br>
> @@ -177,12 +177,6 @@ public:<br>
> /* Stores the ids of the buffers mapped in the IPA. */<br>
> std::unordered_set<unsigned int> ipaBuffers_;<br>
> <br>
> - /*<br>
> - * Map of (internal) mmaped embedded data buffers, to avoid having to<br>
> - * map/unmap on every frame.<br>
> - */<br>
> - std::map<unsigned int, MappedFrameBuffer> mappedEmbeddedBuffers_;<br>
> -<br>
> /* DMAHEAP allocation helper. */<br>
> RPi::DmaHeap dmaHeap_;<br>
> FileDescriptor lsTable_;<br>
> @@ -636,14 +630,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)<br>
> <br>
> if (isRaw(cfg.pixelFormat)) {<br>
> cfg.setStream(&data->unicam_[Unicam::Image]);<br>
> - /*<br>
> - * We must set both Unicam streams as external, even<br>
> - * though the application may only request RAW frames.<br>
> - * This is because we match timestamps on both streams<br>
> - * to synchronise buffers.<br>
> - */<br>
> data->unicam_[Unicam::Image].setExternal(true);<br>
> - data->unicam_[Unicam::Embedded].setExternal(true);<br>
> continue;<br>
> }<br>
> <br>
> @@ -715,17 +702,6 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)<br>
> return ret;<br>
> }<br>
> <br>
> - /* Unicam embedded data output format. */<br>
> - format = {};<br>
> - format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);<br>
> - LOG(RPI, Debug) << "Setting embedded data format.";<br>
> - ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);<br>
> - if (ret) {<br>
> - LOG(RPI, Error) << "Failed to set format on Unicam embedded: "<br>
> - << format.toString();<br>
> - return ret;<br>
> - }<br>
> -<br>
> /* Figure out the smallest selection the ISP will allow. */<br>
> Rectangle testCrop(0, 0, 1, 1);<br>
> data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);<br>
> @@ -742,6 +718,41 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)<br>
> if (ret)<br>
> LOG(RPI, Error) << "Failed to configure the IPA: " << ret;<br>
> <br>
> + /*<br>
> + * The IPA will set data->sensorMetadata_ to true if embedded data is<br>
> + * supported on this sensor. If so, open the Unicam embedded data<br>
> + * node and configure the output format.<br>
> + */<br>
> + if (data->sensorMetadata_) {<br>
> + format = {};<br>
> + format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);<br>
> + LOG(RPI, Debug) << "Setting embedded data format.";<br>
> + data->unicam_[Unicam::Embedded].dev()->open();<br>
<br>
The device is opened here, but never closed. Should that be fixed ? What<br>
are the drawbacks in opening the device in match() as done today, even<br>
if not used ?<br></blockquote><div><br></div><div>We rely on the V4L2VideoDevice destructor to close down all the device nodes.</div><div>I take it this is not the way to do it? :-) If not, where would you advise putting this</div><div>call? Since we call open() in match(), it does not seem right to close the node</div><div>in stop(). I did have a quick scan of the rkisp1 and ipu3 pipelines, and they do not</div><div>seem to call close() on devices either, so no hints there.</div><div><br></div><div>The reason we no longer open the embedded data node in match() is the kernel</div><div>driver. In order to synchronise buffers for the embedded data and image data</div><div>nodes in the driver, we must only open the embedded data node if it is used,</div><div>i.e. we cannot keep it open and simply not queue buffers to it to "disable" the</div><div>stream. Of course, this could change in the driver, but it would add more</div><div>complexity to the buffer sync logic. Perhaps this should be noted as a \todo to</div><div>fix in the future.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<br>
> + ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);<br>
> + if (ret) {<br>
> + LOG(RPI, Error) << "Failed to set format on Unicam embedded: "<br>
> + << format.toString();<br>
> + return ret;<br>
> + }<br>
> +<br>
> + /*<br>
> + * If a RAW/Bayer stream has been requested by the application,<br>
> + * we must set both Unicam streams as external, even though the<br>
> + * application may only request RAW frames. This is because we<br>
> + * match timestamps on both streams to synchronise buffers.<br>
> + */<br>
> + if (rawStream)<br>
> + data->unicam_[Unicam::Embedded].setExternal(true);<br>
> + } else {<br>
> + /*<br>
> + * No embedded data present, so we do not want to iterate over<br>
> + * the embedded data stream when starting and stopping.<br>
> + */<br>
> + data->streams_.erase(std::remove(data->streams_.begin(), data->streams_.end(),<br>
> + &data->unicam_[Unicam::Embedded]),<br>
> + data->streams_.end());<br>
<br>
Hmmmm... Given that only one sensor, and thus one camera, is supported<br>
by this pipeline handler, this should work, but conceptually it's not<br>
very nice. Isn't this a decision that should be made at match() time,<br>
not configure() time ? It would be right to do this here if the code was<br>
modelled to support multiple sensors, but in that case we would need to<br>
close() the embedded data video node in appropriate locations, and also<br>
add the embedded stream back to data->streams_ when metadata is present.<br></blockquote><div><br></div><div>Fully agree. I did want this to be handled in match(), but as you said, we do not</div><div>have the information about sensor metadata support there.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<br>
One option to move this to match() time would be for the IPA to return<br>
if the sensor supports metadata from init() instead of configure(). That<br>
would require passing the sensor name to the IPA in init() too.<br></blockquote><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
It's the mix n' match that bothers me I think. I don't mind if we decide<br>
to model the pipeline handler and IPA with the assumption that there can<br>
only be a single camera, with a hardcoded pipeline known from the very<br>
beginning, or in a more dynamic way that could allow runtime switching<br>
between sensors, but it would be nice if the architecture of the<br>
pipeline handler and IPA was consistent relatively to the model we pick.<br>
Does this make sense ?<br></blockquote><div><br></div><div>Yes, I do agree with this as well. I am happy for ipa->init() to pass back the required</div><div>parameters so match() can be used to open the embedded data node if required.</div><div>I presume we have all the tools needed to do this with the IPA interfaces change to</div><div>use mojom definitions? If so, I can update the signature of ipa->init() to pass in the</div><div>sensor name and return the "SensorConfig" parameters.</div><div><br></div><div>Regards,</div><div>Naush</div><div><br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<br>
> + }<br>
> +<br>
> /*<br>
> * Update the ScalerCropMaximum to the correct value for this camera mode.<br>
> * For us, it's the same as the "analogue crop".<br>
> @@ -949,10 +960,16 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)<br>
> for (auto &stream : data->isp_)<br>
> data->streams_.push_back(&stream);<br>
> <br>
> - /* Open all Unicam and ISP streams. */<br>
> + /*<br>
> + * Open all Unicam and ISP streams. The exception is the embedded data<br>
> + * stream, which only gets opened if the IPA reports that the sensor<br>
> + * supports embedded data. This happens in RPiCameraData::configureIPA().<br>
> + */<br>
> for (auto const stream : data->streams_) {<br>
> - if (stream->dev()->open())<br>
> - return false;<br>
> + if (stream != &data->unicam_[Unicam::Embedded]) {<br>
> + if (stream->dev()->open())<br>
> + return false;<br>
> + }<br>
> }<br>
> <br>
> /* Wire up all the buffer connections. */<br>
> @@ -1109,19 +1126,13 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera)<br>
> return ret;<br>
> }<br>
> <br>
> - if (!data->sensorMetadata_) {<br>
> - for (auto const &it : data->unicam_[Unicam::Embedded].getBuffers()) {<br>
> - MappedFrameBuffer fb(it.second, PROT_READ | PROT_WRITE);<br>
> - data->mappedEmbeddedBuffers_.emplace(it.first, std::move(fb));<br>
> - }<br>
> - }<br>
> -<br>
> /*<br>
> * Pass the stats and embedded data buffers to the IPA. No other<br>
> * buffers need to be passed.<br>
> */<br>
> mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), ipa::RPi::MaskStats);<br>
> - mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);<br>
> + if (data->sensorMetadata_)<br>
> + mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);<br>
<br>
Maybe a bit of line wrap ? :-)<br>
<br>
> <br>
> return 0;<br>
> }<br>
> @@ -1154,7 +1165,6 @@ void PipelineHandlerRPi::freeBuffers(Camera *camera)<br>
> std::vector<unsigned int> ipaBuffers(data->ipaBuffers_.begin(), data->ipaBuffers_.end());<br>
> data->ipa_->unmapBuffers(ipaBuffers);<br>
> data->ipaBuffers_.clear();<br>
> - data->mappedEmbeddedBuffers_.clear();<br>
> <br>
> for (auto const stream : data->streams_)<br>
> stream->releaseBuffers();<br>
> @@ -1652,7 +1662,7 @@ void RPiCameraData::tryRunPipeline()<br>
> <br>
> /* If any of our request or buffer queues are empty, we cannot proceed. */<br>
> if (state_ != State::Idle || requestQueue_.empty() ||<br>
> - bayerQueue_.empty() || embeddedQueue_.empty())<br>
> + bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))<br>
> return;<br>
> <br>
> if (!findMatchingBuffers(bayerFrame, embeddedBuffer))<br>
> @@ -1675,17 +1685,24 @@ void RPiCameraData::tryRunPipeline()<br>
> state_ = State::Busy;<br>
> <br>
> unsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);<br>
> - unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);<br>
> <br>
> LOG(RPI, Debug) << "Signalling signalIspPrepare:"<br>
> - << " Bayer buffer id: " << bayerId<br>
> - << " Embedded buffer id: " << embeddedId;<br>
> + << " Bayer buffer id: " << bayerId;<br>
> <br>
> ipa::RPi::ISPConfig ispPrepare;<br>
> - ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;<br>
> ispPrepare.bayerBufferId = ipa::RPi::MaskBayerData | bayerId;<br>
> - ispPrepare.embeddedBufferPresent = true;<br>
> ispPrepare.controls = std::move(bayerFrame.controls);<br>
> +<br>
> + if (embeddedBuffer) {<br>
> + unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);<br>
> +<br>
> + ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;<br>
> + ispPrepare.embeddedBufferPresent = true;<br>
> +<br>
> + LOG(RPI, Debug) << "Signalling signalIspPrepare:"<br>
> + << " Bayer buffer id: " << embeddedId;<br>
> + }<br>
> +<br>
> ipa_->signalIspPrepare(ispPrepare);<br>
> }<br>
> <br>
> @@ -1727,6 +1744,17 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em<br>
> <br>
> LOG(RPI, Debug) << "Could not find matching embedded buffer";<br>
> <br>
> + if (!sensorMetadata_) {<br>
> + /*<br>
> + * If there is no sensor metadata, simply return the<br>
> + * first bayer frame in the queue.<br>
> + */<br>
> + LOG(RPI, Debug) << "Returning bayer frame without a match";<br>
> + bayerQueue_.pop();<br>
> + embeddedBuffer = nullptr;<br>
> + return true;<br>
> + }<br>
> +<br>
> if (!embeddedQueue_.empty()) {<br>
> /*<br>
> * Not found a matching embedded buffer for the bayer buffer in<br>
<br>
-- <br>
Regards,<br>
<br>
Laurent Pinchart<br>
</blockquote></div></div>