[libcamera-devel] [PATCH 3/8] cam: Use SensorTimestamp rather than buffer metadata

Umang Jain umang.jain at ideasonboard.com
Tue Dec 7 14:37:46 CET 2021


On 12/7/21 6:51 PM, Umang Jain wrote:
> Hi Laurent,
>
> On 12/7/21 5:39 AM, Laurent Pinchart wrote:
>> Hi Kieran,
>>
>> Thank you for the patch.
>>
>> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
>>> The SensorTimestamp is defined to be the time of capture of the image.
>>> While all streams should have the same timestamp, this is not always
>>> defined or guaranteed as ISP drivers may not forward sequence numbers
>>> and timestamps from their input buffer.
>> That should then be solved by the pipeline handler, which should store
>> the correct timestamp in the buffer metadata.
>>
>>> Use the Request metadata to get the SensorTimestamp which must be
>>> set by the pipeline handlers according to the data from the capture
>>> device.
>>>
>>> Signed-off-by: Kieran Bingham <kieran.bingham at ideasonboard.com>
>>> ---
>>>   src/cam/camera_session.cpp | 7 ++++---
>>>   1 file changed, 4 insertions(+), 3 deletions(-)
>>>
>>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
>>> index 1bf460fa3fb7..50170723c30f 100644
>>> --- a/src/cam/camera_session.cpp
>>> +++ b/src/cam/camera_session.cpp
>>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)
>>>       const Request::BufferMap &buffers = request->buffers();
>>>
>>>       /*
>>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
>>> -     * the first buffer, as all buffers should have matching timestamps.
>>> +     * Compute the frame rate. The timestamp is retrieved from the
>>> +     * SensorTimestamp property, though all streams should have the
>>> +     * same timestamp.
>>>        */
>>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
>>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
>> This seems reasonable. Why do we have timestamps in the buffer
>> metadata? :-)


Strong chance I have misunderstood the question; I only later realized
this is a cam-related patch.

To me, the question translated to:

     why do we have timestamps in FrameBuffer.metadata_.timestamp ?

So ignore the discussion below (if you want) :-P
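For context, here is a small, self-contained sketch of what the hunk
above amounts to on the cam side. The FpsCounter name is made up and not
part of cam; the metadata lookup follows the form used in the patch, and
newer libcamera versions return a std::optional from ControlList::get(),
so the exact expression may need adjusting.

    #include <cstdint>

    #include <libcamera/control_ids.h>
    #include <libcamera/request.h>

    using namespace libcamera;

    /* Hypothetical helper mirroring CameraSession::processRequest(). */
    class FpsCounter
    {
    public:
            double update(Request *request)
            {
                    /*
                     * SensorTimestamp carries the time of capture of the
                     * image, in nanoseconds, as set by the pipeline handler.
                     */
                    uint64_t ts = request->metadata().get(controls::SensorTimestamp);

                    double fps = ts - last_;
                    fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
                    last_ = ts;

                    return fps;
            }

    private:
            uint64_t last_ = 0;
    };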

>
>
> Because there is no buffer assigned at the point in time where we want
> to capture the timestamp.
>
> The exact location of the timestamp is a \todo
>
>              * \todo The sensor timestamp should be better estimated by connecting
>              * to the V4L2Device::frameStart signal.
>
> in all the pipeline-handlers as of now.
>
> We *want* to capture it at the frameStart signal, which is emitted in
> response to VIDIOC_DQEVENT, but we can't, as no buffer is assigned (yet).
>
> We probably need a container that keeps track of timestamps and
> sequence numbers until a buffer is assigned down the line, and then set
> the buffer metadata by reading the sequence number and timestamp from
> that container (a rough sketch of such a container is appended at the
> end of this mail).
>
>
>>
>>>       double fps = ts - last_;
>>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>>>       last_ = ts;
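
Purely for illustration, below is a rough and entirely hypothetical
sketch of the container idea from the quoted discussion: record the
sequence number and timestamp when the frame-start event arrives, then
look the pair up again once the matching buffer completes, so its
metadata (and the SensorTimestamp control) can be filled from the event
rather than from the dequeue time. None of these names exist in
libcamera; this is only the shape such a helper could take.

    #include <cstdint>
    #include <deque>
    #include <optional>

    /* Hypothetical record of one frame-start event. */
    struct FrameStartInfo {
            unsigned int sequence;
            uint64_t timestamp;     /* nanoseconds, from the frame-start event */
    };

    /* Hypothetical FIFO bridging frame-start events and buffer completion. */
    class FrameStartQueue
    {
    public:
            /* Called from the frame-start (VIDIOC_DQEVENT) handler. */
            void push(unsigned int sequence, uint64_t timestamp)
            {
                    queue_.push_back({ sequence, timestamp });
            }

            /* Called when the buffer for \a sequence is dequeued. */
            std::optional<FrameStartInfo> pop(unsigned int sequence)
            {
                    while (!queue_.empty()) {
                            FrameStartInfo info = queue_.front();
                            if (info.sequence > sequence)
                                    break;

                            /* Drop stale entries, return an exact match. */
                            queue_.pop_front();
                            if (info.sequence == sequence)
                                    return info;
                    }

                    return std::nullopt;
            }

    private:
            std::deque<FrameStartInfo> queue_;
    };

A pipeline handler would push() from its frame-start handler and pop()
in the buffer-ready slot before completing the request; how that maps
onto the existing pipeline handlers is left open here.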

