<div dir="ltr"><div dir="ltr">Hi all,</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, 8 Dec 2021 at 21:13, Laurent Pinchart <<a href="mailto:laurent.pinchart@ideasonboard.com">laurent.pinchart@ideasonboard.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi Kieran,<br>
<br>
On Wed, Dec 08, 2021 at 08:59:29PM +0000, Kieran Bingham wrote:<br>
> Quoting Laurent Pinchart (2021-12-08 18:57:10)<br>
> > On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:<br>
> > > On 12/7/21 6:51 PM, Umang Jain wrote:<br>
> > > > On 12/7/21 5:39 AM, Laurent Pinchart wrote:<br>
> > > >> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:<br>
> > > >>> The SensorTimestamp is defined to be the time of capture of the image.<br>
> > > >>> While all streams should have the same timestamp, this is not always<br>
> > > >>> defined or guaranteed as ISP drivers may not forward sequence numbers<br>
> > > >>> and timestamps from their input buffer.<br>
> > > >><br>
> > > >> That should then be solved by the pipeline handler, which should store<br>
> > > >> the correct timestamp in the buffer metadata.<br>
> > > >><br>
> > > >>> Use the Request metadata to get the SensorTimestamp which must be<br>
> > > >>> set by the pipeline handlers according to the data from the capture<br>
> > > >>> device.<br>
> > > >>><br>
> > > >>> Signed-off-by: Kieran Bingham <<a href="mailto:kieran.bingham@ideasonboard.com" target="_blank">kieran.bingham@ideasonboard.com</a>><br>
> > > >>> ---<br>
> > > >>> src/cam/camera_session.cpp | 7 ++++---<br>
> > > >>> 1 file changed, 4 insertions(+), 3 deletions(-)<br>
> > > >>><br>
> > > >>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp<br>
> > > >>> index 1bf460fa3fb7..50170723c30f 100644<br>
> > > >>> --- a/src/cam/camera_session.cpp<br>
> > > >>> +++ b/src/cam/camera_session.cpp<br>
> > > >>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)<br>
> > > >>> const Request::BufferMap &buffers = request->buffers();<br>
> > > >>> /*<br>
> > > >>> - * Compute the frame rate. The timestamp is arbitrarily retrieved from<br>
> > > >>> - * the first buffer, as all buffers should have matching timestamps.<br>
> > > >>> + * Compute the frame rate. The timestamp is retrieved from the<br>
> > > >>> + * SensorTimestamp property, though all streams should have the<br>
> > > >>> + * same timestamp.<br>
> > > >>> */<br>
> > > >>> - uint64_t ts = buffers.begin()->second->metadata().timestamp;<br>
> > > >>> + uint64_t ts = request->metadata().get(controls::SensorTimestamp);<br>
> > > >><br>
> > > >> This seems reasonable. Why do we have timestamps in the buffer metadata<br>
> > > >> ? :-)<br>
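The redundancy being questioned above can be sketched with mock stand-ins (these types and the string-keyed metadata map are illustrations only, not the real libcamera FrameBuffer/Request/ControlList API):

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <vector>

/* Mock analogue of FrameBuffer with its per-buffer timestamp. */
struct FrameBuffer {
	uint64_t timestamp = 0; /* stands in for FrameBuffer::metadata_.timestamp */
};

/* Mock analogue of Request with a request-level metadata list. */
struct Request {
	std::vector<FrameBuffer> buffers;
	std::map<std::string, int64_t> metadata; /* stands in for the ControlList */
};

/*
 * The duplication under discussion: on completion, the pipeline handler
 * writes the same capture time into every buffer and once again at the
 * request level (the controls::SensorTimestamp analogue).
 */
void completeRequest(Request &req, uint64_t captureNs)
{
	for (FrameBuffer &b : req.buffers)
		b.timestamp = captureNs;
	req.metadata["SensorTimestamp"] = static_cast<int64_t>(captureNs);
}
```

Since all buffers of a request are defined to carry the same capture time, the per-buffer copies add no information over the single request-level value.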
> > > <br>
> > > There's a strong chance I misunderstood the question; I later realized<br>
> > > this is a cam-related patch.<br>
> > > <br>
> > > To me, the question translated to:<br>
> > > <br>
> > > why do we have timestamps in the FrameBuffer.metadata_.timestamp ?<br>
> > <br>
> > To clarify, I meant to ask why we have both controls::SensorTimestamp<br>
> > *and* timestamps in individual buffers<br>
> > (FrameBuffer::metadata_.timestamp). I suppose it's historical, the<br>
> > timestamp member comes from<br>
> > <br>
> > commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2<br>
> > Author: Niklas Söderlund <<a href="mailto:niklas.soderlund@ragnatech.se" target="_blank">niklas.soderlund@ragnatech.se</a>><br>
> > Date: Thu Jan 24 23:34:51 2019 +0100<br>
> > <br>
> > libcamera: v4l2_device: Update dequeued buffer information<br>
> > <br>
> > while controls::SensorTimestamp has been introduced in<br>
> > <br>
> > commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d<br>
> > Author: Jacopo Mondi <<a href="mailto:jacopo@jmondi.org" target="_blank">jacopo@jmondi.org</a>><br>
> > Date: Thu Jul 30 15:18:50 2020 +0200<br>
> > <br>
> > libcamera: control_ids: Define draft controls<br>
> > <br>
> > > So ignore the discussion (if you want) :-P<br>
> > > <br>
> > > > Because there is no buffer assigned at the point in time where we<br>
> > > > want to capture the timestamp.<br>
> > > ><br>
> > > > The exact location of the timestamp is a \todo<br>
> > > ><br>
> > > > * \todo The sensor timestamp should be better estimated <br>
> > > > by connecting<br>
> > > > * to the V4L2Device::frameStart signal.<br>
> > > ><br>
> > > > in all the pipeline-handlers as of now.<br>
> > > ><br>
> > > > We *want* to capture at the frameStart signal, which is emitted in<br>
> > > > response to VIDIOC_DQEVENT, but we can't, as no buffer is assigned (yet).<br>
> > > ><br>
> > > > We probably need a container that tracks timestamps and sequence<br>
> > > > numbers until a buffer is assigned down the line, and then set the<br>
> > > > buffer metadata by reading the sequence number and timestamp from<br>
> > > > that container.<br>
> > <br>
> > Should we just drop FrameBuffer::metadata_.timestamp in favour of<br>
> > controls::SensorTimestamp ? What would be the drawbacks, if any ?<br>
> <br>
> Drawbacks for using the metadata control:<br>
> - Have to parse the metadata control list to get timestamps<br>
> (/sequence numbers)<br>
<br>
The request metadata is also not available before the request fully<br>
completes, while buffers will complete individually before. This is<br>
something we may want to address in a generic way though, I suspect<br>
there will be use cases for providing request metadata early.<br>
<br>
> Pros for the metadata control:<br>
> - Only one place for it, all buffers in the same request are defined as<br>
> having the same value (sequence and timestamp).<br>
<br>
That's the part I like :-)<br>
<br>
> Drawbacks for per-buffer metadata:<br>
> - Must be explicitly assigned to keep them in sync, even though they<br>
> may be the output of multiple video devices.<br>
> <br>
> Pros for per-buffer metadata:<br>
> - Quicker to retrieve the values? (defined structure?)<br>
<br>
Possibly a bit quicker, but that seems marginal to me.<br>
<br>
> Any more on either side?<br>
> <br>
> <br>
> Personally, I'm currently thinking these are better defined by the<br>
> metadata() control list. After all, that's the purpose of the list,<br>
> right?<br>
<br>
I think so. I see very little value today in providing buffer completion<br>
timestamps that would be different than the sensor timestamp (this could<br>
possibly help measuring the processing time, but if that's the goal, it<br>
could also be done by capturing the system timestamp in the buffer<br>
completion handler on the application side, which would provide<br>
statistics about the full processing time, including both the hardware<br>
processing and the software processing).<br></blockquote><div><br></div><div>The above reason was exactly my rationale for keeping both :-)</div><div>However, it's not something we rely on, or even use really, so</div><div>I am happy to get rid of the buffer timestamps.</div><div><br></div><div>Naush</div><div><br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
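That application-side measurement could look roughly like this (a sketch only: plain integer nanosecond timestamps on one monotonic clock are assumed, and the completion-handler wiring is omitted):

```cpp
#include <cassert>
#include <cstdint>

/*
 * Estimate end-to-end processing time by subtracting the sensor capture
 * timestamp from a system timestamp taken in the application's buffer
 * completion handler. Both values are assumed to be nanoseconds on the
 * same monotonic clock, as V4L2 capture timestamps typically are.
 */
uint64_t processingLatencyNs(uint64_t sensorTimestampNs,
			     uint64_t completionTimestampNs)
{
	return completionTimestampNs - sensorTimestampNs;
}
```

In a real handler the completion timestamp would come from something like std::chrono::steady_clock, so the resulting figure covers both the hardware and the software processing, as noted above.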
<br>
Let's wait for more feedback, but if there's a general agreement that<br>
this is the direction we want to take, we could remove<br>
FrameBuffer::metadata_.timestamp independently of addressing the sensor<br>
sequence issue.<br>
<br>
> > > >>> double fps = ts - last_;<br>
> > > >>> fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;<br>
> > > >>> last_ = ts;<br>
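The frame-rate computation from the patch context above can be written as a standalone sketch (the function name and the by-reference `last` parameter are illustrative, not the cam code's actual structure):

```cpp
#include <cassert>
#include <cstdint>

/*
 * Derive frames per second from the delta between the current and previous
 * nanosecond timestamps. last == 0 means "no previous frame yet", in which
 * case (or on a zero delta) 0.0 is returned, matching the patch's logic.
 */
double computeFps(uint64_t ts, uint64_t &last)
{
	double delta = static_cast<double>(ts) - static_cast<double>(last);
	double fps = (last != 0 && delta != 0.0) ? 1000000000.0 / delta : 0.0;
	last = ts;
	return fps;
}
```

With the SensorTimestamp sourced from request metadata, this computation no longer depends on which buffer happens to come first in the BufferMap.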
<br>
-- <br>
Regards,<br>
<br>
Laurent Pinchart<br>
</blockquote></div></div>