[PATCH v2 8/9] android: Add JpegExifMetadata to store tags setting into Exif
Cheng-Hao Yang
chenghaoyang at chromium.org
Thu Jan 2 14:11:57 CET 2025
Hi Jacopo,
On Fri, Dec 13, 2024 at 9:59 AM Cheng-Hao Yang
<chenghaoyang at chromium.org> wrote:
>
> Hi Jacopo,
>
> On Fri, Dec 13, 2024 at 4:26 PM Jacopo Mondi
> <jacopo.mondi at ideasonboard.com> wrote:
> >
> > Hi Harvey
> >
> > On Thu, Dec 12, 2024 at 05:00:23PM +0800, Cheng-Hao Yang wrote:
> > > Hi Jacopo,
> > >
> > > On Wed, Dec 4, 2024 at 11:35 PM Cheng-Hao Yang
> > > <chenghaoyang at chromium.org> wrote:
> > > >
> > > > Hi Jacopo,
> > > >
> > > > On Wed, Dec 4, 2024 at 11:16 PM Jacopo Mondi
> > > > <jacopo.mondi at ideasonboard.com> wrote:
> > > > >
> > > > > Hi Harvey
> > > > >
> > > > > On Wed, Dec 04, 2024 at 10:31:16PM +0800, Cheng-Hao Yang wrote:
> > > > > > Hi Jacopo,
> > > > > >
> > > > > > On Wed, Dec 4, 2024 at 6:48 PM Jacopo Mondi
> > > > > > <jacopo.mondi at ideasonboard.com> wrote:
> > > > > > >
> > > > > > > Hi Harvey
> > > > > > >
> > > > > > > On Wed, Dec 04, 2024 at 06:29:53PM +0800, Cheng-Hao Yang wrote:
> > > > > > > > Hi Jacopo,
> > > > > > > >
> > > > > > > > On Wed, Dec 4, 2024 at 5:42 PM Jacopo Mondi
> > > > > > > > <jacopo.mondi at ideasonboard.com> wrote:
> > > > > > > > >
> > > > > > > > > Hi Harvey
> > > > > > > > >
> > > > > > > > > On Tue, Dec 03, 2024 at 10:33:22PM +0800, Cheng-Hao Yang wrote:
> > > > > > > > > > Hi Jacopo,
> > > > > > > > > >
> > > > > > > > > > On Tue, Dec 3, 2024 at 1:04 AM Jacopo Mondi
> > > > > > > > > > <jacopo.mondi at ideasonboard.com> wrote:
> > > > > > > > > > >
> > > > > > > > > > > Hi Harvey
> > > > > > > > > > >
> > > > > > > > > > > On Fri, Nov 29, 2024 at 05:17:30PM +0800, Cheng-Hao Yang wrote:
> > > > > > > > > > > > Hi Jacopo,
> > > > > > > > > > > >
> > > > > > > > > > > > On Fri, Nov 29, 2024 at 5:05 PM Cheng-Hao Yang
> > > > > > > > > > > > <chenghaoyang at chromium.org> wrote:
> > > > > > > > > > > > >
> > > > > > > > > > > > > Hi Jacopo,
> > > > > > > > > > > > >
> > > > > > > > > > > > > On Thu, Nov 28, 2024 at 11:35 PM Jacopo Mondi
> > > > > > > > > > > > > <jacopo.mondi at ideasonboard.com> wrote:
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > Hi Harvey
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > On Wed, Nov 27, 2024 at 09:25:58AM +0000, Harvey Yang wrote:
> > > > > > > > > > > > > > > From: Han-Lin Chen <hanlinchen at chromium.org>
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > With partial results, some metadata that needs to be added into the
> > > > > > > > > > > > > > > Exif tags may be sent back to the framework before Jpeg
> > > > > > > > > > > > > > > post-processing runs. Add a JpegExifMetadata type associated with
> > > > > > > > > > > > > > > StreamBuffer to store the values, so Jpeg post-processing doesn't
> > > > > > > > > > > > > > > need to reference the current metadata.
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > Signed-off-by: Han-Lin Chen <hanlinchen at chromium.org>
> > > > > > > > > > > > > > > Co-developed-by: Harvey Yang <chenghaoyang at chromium.org>
> > > > > > > > > > > > > > > Signed-off-by: Harvey Yang <chenghaoyang at chromium.org>
> > > > > > > > > > > > > > > ---
> > > > > > > > > > > > > > > src/android/camera_device.cpp | 26 ++++++++++++++++++++++++
> > > > > > > > > > > > > > > src/android/camera_device.h | 2 ++
> > > > > > > > > > > > > > > src/android/camera_request.h | 6 ++++++
> > > > > > > > > > > > > > > src/android/camera_stream.h | 4 ++++
> > > > > > > > > > > > > > > src/android/jpeg/post_processor_jpeg.cpp | 12 ++++++-----
> > > > > > > > > > > > > > > 5 files changed, 45 insertions(+), 5 deletions(-)
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
> > > > > > > > > > > > > > > index 9fd851bc8..e085e18b2 100644
> > > > > > > > > > > > > > > --- a/src/android/camera_device.cpp
> > > > > > > > > > > > > > > +++ b/src/android/camera_device.cpp
> > > > > > > > > > > > > > > @@ -1250,6 +1250,10 @@ void CameraDevice::requestComplete(Request *request)
> > > > > > > > > > > > > > > CameraStream *stream = iter->first;
> > > > > > > > > > > > > > > StreamBuffer *buffer = iter->second;
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > + if (stream->isJpegStream()) {
> > > > > > > > > > > > > > > + generateJpegExifMetadata(descriptor, buffer);
> > > > > > > > > > > > > > > + }
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > no {} for single line statements
> > > > > > > > > > > >
> > > > > > > > > > > > Done
> > > > > > > > > > > >
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > > FrameBuffer *src = request->findBuffer(stream->stream());
> > > > > > > > > > > > > > > if (!src) {
> > > > > > > > > > > > > > > LOG(HAL, Error) << "Failed to find a source stream buffer";
> > > > > > > > > > > > > > > @@ -1443,6 +1447,28 @@ void CameraDevice::notifyError(uint32_t frameNumber, camera3_stream_t *stream,
> > > > > > > > > > > > > > > callbacks_->notify(callbacks_, ¬ify);
> > > > > > > > > > > > > > > }
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > +/*
> > > > > > > > > > > > > > > + * Set jpeg metadata used to generate EXIF in the JPEG post processing.
> > > > > > > > > > > > > > > + */
> > > > > > > > > > > > > > > +void CameraDevice::generateJpegExifMetadata(Camera3RequestDescriptor *request,
> > > > > > > > > > > > > > > + StreamBuffer *buffer) const
> > > > > > > > > > > > > > > +{
> > > > > > > > > > > > > > > + const ControlList &metadata = request->request_->metadata();
> > > > > > > > > > > > > > > + auto &jpegExifMetadata = buffer->jpegExifMetadata;
> > > > > > > > > > > > > > > + jpegExifMetadata.emplace(StreamBuffer::JpegExifMetadata());
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > > > + const int64_t exposureTime = metadata.get(controls::ExposureTime).value_or(0);
> > > > > > > > > > > > > > > + jpegExifMetadata->sensorExposureTime = exposureTime;
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > > > + /*
> > > > > > > > > > > > > > > + * todo: Android Sensitivity should only include analog gain X digital
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > \todo
> > > > > > > > > > > >
> > > > > > > > > > > > done
> > > > > > > > > > > >
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > > + * gain from sensor. Digital gain on ISP shouldn't be included.
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > mmm, I guess how the gain is split between analogue and digital on the
> > > > > > > > > > > > > > sensor is up to the IPA implementation, and currently I only see vc4
> > > > > > > > > > > > > > handling it and it sets it on the ISP.
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > I wonder if you couldn't simply use AnalogueGain here
> > > > > > > > > > > >
> > > > > > > > > > > > I think the comment here assumes the further changes that use
> > > > > > > > > > > > AnalogueGain directly here, which might not be needed in this patch...
> > > > > > > > > > > > Removed.
> > > > > > > > > > >
> > > > > > > > > > > I'm not sure I get this comment in full. I was suggesting to use
> > > > > > > > > > > AnalogueGain directly here, but if it isn't required, I'm fine with
> > > > > > > > > > > keeping a \todo
> > > > > > > > > >
> > > > > > > > > > Yeah I know, though I think using AnalogueGain or even a \todo is
> > > > > > > > > > irrelevant in this patch. We may add a separate one to use
> > > > > > > > > > AnalogueGain directly :)
> > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > >
> > > > > > > > > > > > I'll remember (hopefully) when we use AnalogueGain here in the
> > > > > > > > > > > > following patches.
> > > > > > > > > > > >
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > > + * Calculate sensitivity accordingly when we can differentiate
> > > > > > > > > > > > > > > + * the source of digital gains.
> > > > > > > > > > > > > > > + */
> > > > > > > > > > > > > > > + jpegExifMetadata->sensorSensitivityISO = 100;
> > > > > > > > > > > > > > > +}
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > > > /*
> > > > > > > > > > > > > > > * Produce a set of fixed result metadata.
> > > > > > > > > > > > > > > */
> > > > > > > > > > > > > > > diff --git a/src/android/camera_device.h b/src/android/camera_device.h
> > > > > > > > > > > > > > > index 815a695d1..3c46ff918 100644
> > > > > > > > > > > > > > > --- a/src/android/camera_device.h
> > > > > > > > > > > > > > > +++ b/src/android/camera_device.h
> > > > > > > > > > > > > > > @@ -102,6 +102,8 @@ private:
> > > > > > > > > > > > > > > void sendCaptureResult(Camera3RequestDescriptor *request) const;
> > > > > > > > > > > > > > > void setBufferStatus(StreamBuffer &buffer,
> > > > > > > > > > > > > > > StreamBuffer::Status status);
> > > > > > > > > > > > > > > + void generateJpegExifMetadata(Camera3RequestDescriptor *request,
> > > > > > > > > > > > > > > + StreamBuffer *buffer) const;
> > > > > > > > > > > > > > > std::unique_ptr<CameraMetadata> getResultMetadata(
> > > > > > > > > > > > > > > const Camera3RequestDescriptor &descriptor) const;
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > diff --git a/src/android/camera_request.h b/src/android/camera_request.h
> > > > > > > > > > > > > > > index bd75d4595..bd87b36fd 100644
> > > > > > > > > > > > > > > --- a/src/android/camera_request.h
> > > > > > > > > > > > > > > +++ b/src/android/camera_request.h
> > > > > > > > > > > > > > > @@ -44,6 +44,11 @@ public:
> > > > > > > > > > > > > > > StreamBuffer(StreamBuffer &&);
> > > > > > > > > > > > > > > StreamBuffer &operator=(StreamBuffer &&);
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > + struct JpegExifMetadata {
> > > > > > > > > > > > > > > + int64_t sensorExposureTime;
> > > > > > > > > > > > > > > + int32_t sensorSensitivityISO;
> > > > > > > > > > > > > > > + };
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > > > CameraStream *stream;
> > > > > > > > > > > > > > > buffer_handle_t *camera3Buffer;
> > > > > > > > > > > > > > > std::unique_ptr<HALFrameBuffer> frameBuffer;
> > > > > > > > > > > > > > > @@ -51,6 +56,7 @@ public:
> > > > > > > > > > > > > > > Status status = Status::Success;
> > > > > > > > > > > > > > > const libcamera::FrameBuffer *srcBuffer = nullptr;
> > > > > > > > > > > > > > > std::unique_ptr<CameraBuffer> dstBuffer;
> > > > > > > > > > > > > > > + std::optional<JpegExifMetadata> jpegExifMetadata;
> > > > > > > > > > > > > > > Camera3RequestDescriptor *request;
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > private:
> > > > > > > > > > > > > > > diff --git a/src/android/camera_stream.h b/src/android/camera_stream.h
> > > > > > > > > > > > > > > index 30f64f690..47cd7ab85 100644
> > > > > > > > > > > > > > > --- a/src/android/camera_stream.h
> > > > > > > > > > > > > > > +++ b/src/android/camera_stream.h
> > > > > > > > > > > > > > > @@ -125,6 +125,10 @@ public:
> > > > > > > > > > > > > > > const libcamera::StreamConfiguration &configuration() const;
> > > > > > > > > > > > > > > libcamera::Stream *stream() const;
> > > > > > > > > > > > > > > CameraStream *sourceStream() const { return sourceStream_; }
> > > > > > > > > > > > > > > + bool isJpegStream() const
> > > > > > > > > > > > > > > + {
> > > > > > > > > > > > > > > + return camera3Stream_->format == HAL_PIXEL_FORMAT_BLOB;
> > > > > > > > > > > > > > > + }
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > int configure();
> > > > > > > > > > > > > > > int process(StreamBuffer *streamBuffer);
> > > > > > > > > > > > > > > diff --git a/src/android/jpeg/post_processor_jpeg.cpp b/src/android/jpeg/post_processor_jpeg.cpp
> > > > > > > > > > > > > > > index f5a90785d..48782b574 100644
> > > > > > > > > > > > > > > --- a/src/android/jpeg/post_processor_jpeg.cpp
> > > > > > > > > > > > > > > +++ b/src/android/jpeg/post_processor_jpeg.cpp
> > > > > > > > > > > > > > > @@ -112,8 +112,11 @@ void PostProcessorJpeg::process(StreamBuffer *streamBuffer)
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > const FrameBuffer &source = *streamBuffer->srcBuffer;
> > > > > > > > > > > > > > > CameraBuffer *destination = streamBuffer->dstBuffer.get();
> > > > > > > > > > > > > > > + const std::optional<StreamBuffer::JpegExifMetadata> &jpegExifMetadata =
> > > > > > > > > > > > > > > + streamBuffer->jpegExifMetadata;
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > ASSERT(destination->numPlanes() == 1);
> > > > > > > > > > > > > > > + ASSERT(jpegExifMetadata.has_value());
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > This means it's not optional, isn't it ?
> > > > > > > > > > > >
> > > > > > > > > > > > Means it's not std::nullopt. Any suggestions?
> > > > > > > > > > >
> > > > > > > > > > > yeah, what I meant was "if you ASSERT()" you expect it to be always
> > > > > > > > > > > populated, so std::optional<> doesn't bring any value. But you
> > > > > > > > > > > probably want to make sure that jpegExifMetadata has been populated
> > > > > > > > > > > when this function is called, so feel free to keep std::optional<>
> > > > > > > > > >
> > > > > > > > > > Sure :)
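[A minimal sketch of the std::optional<> handling being discussed; exposureTimeNsecs() is a hypothetical helper, not code from the patch, showing a graceful fallback instead of an ASSERT(), plus the usec-to-nsec conversion the patch performs for the EXIF tag:]

```cpp
#include <cassert>
#include <cstdint>
#include <optional>

/* Stand-in for StreamBuffer::JpegExifMetadata from the patch. */
struct JpegExifMetadata {
	int64_t sensorExposureTime; /* usecs, as reported by ExposureTime */
	int32_t sensorSensitivityISO;
};

/*
 * Hypothetical helper: instead of ASSERT()ing that the optional is
 * populated, fall back to 0 when the metadata hasn't arrived yet.
 * Per the patch's comment, EXIF wants the exposure time in nsecs,
 * hence the * 1000 on the usec value.
 */
int64_t exposureTimeNsecs(const std::optional<JpegExifMetadata> &meta)
{
	return meta ? meta->sensorExposureTime * 1000 : 0;
}
```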
> > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > >
> > > > > > > > > > > > BR,
> > > > > > > > > > > > Harvey
> > > > > > > > > > > >
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > const CameraMetadata &requestMetadata = streamBuffer->request->settings_;
> > > > > > > > > > > > > > > CameraMetadata *resultMetadata = streamBuffer->request->resultMetadata_.get();
> > > > > > > > > > > > > > > @@ -139,15 +142,14 @@ void PostProcessorJpeg::process(StreamBuffer *streamBuffer)
> > > > > > > > > > > > > > > */
> > > > > > > > > > > > > > > exif.setTimestamp(std::time(nullptr), 0ms);
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > - ret = resultMetadata->getEntry(ANDROID_SENSOR_EXPOSURE_TIME, &entry);
> > > > > > > > > > > > > > > - exif.setExposureTime(ret ? *entry.data.i64 : 0);
> > > > > > > > > > > > > > > + /* Exif requires nsec for exposure time */
> > > > > > > > > > > > > > > + exif.setExposureTime(jpegExifMetadata->sensorExposureTime * 1000);
> > > > > > > > > > > > > > > + exif.setISO(jpegExifMetadata->sensorSensitivityISO);
> > > > > > > > > > > > > > > +
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > StreamBuffer has a pointer to the Camera3RequestDescriptor it belongs
> > > > > > > > > > > > > > to. From there you could get the Request metadata as you currently do
> > > > > > > > > > > > > > in CameraDevice::generateJpegExifMetadata().
> > > > > > > > > > > > >
> > > > > > > > > > > > > That's a very good question, though I wonder how we can handle the
> > > > > > > > > > > > > threading issue properly.
> > > > > > > > > > > > >
> > > > > > > > > > > > > The current implementation of `Request::metadata()` [1] doesn't seem
> > > > > > > > > > > > > to consider race conditions, and our goal is to support partial results,
> > > > > > > > > > > > > which means that the post processor thread might try to access metadata
> > > > > > > > > > > > > when the request is still being processed in the pipeline handler, which
> > > > > > > > > > > > > might set further metadata tags.
> > > > > > > > > > >
> > > > > > > > > > > Is this an issue about threading or about the order in which metadata
> > > > > > > > > > > and buffer completes ?
> > > > > > > > > >
> > > > > > > > > > What I meant is the threading issue, though the order is also an issue, true.
> > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > I see in the next patch you call generateJpegExifMetadata() on a
> > > > > > > > > > > ((Mapped|Internal) && Jpeg) bufferComplete event.
> > > > > > > > > > >
> > > > > > > > > > > https://gitlab.freedesktop.org/camera/libcamera/-/commit/b731fe488badef2861da914913290e16afb716c8#88faf21e943da09c94a5b31cd420d91c35371290_1193_1227
> > > > > > > > > > >
> > > > > > > > > > > And immediately after, if such a buffer has been completed, you call
> > > > > > > > > > > process() on it
> > > > > > > > > > > https://gitlab.freedesktop.org/camera/libcamera/-/commit/b731fe488badef2861da914913290e16afb716c8#88faf21e943da09c94a5b31cd420d91c35371290_1193_1251
> > > > > > > > > > >
> > > > > > > > > > > Is this right ?
> > > > > > > > > > >
> > > > > > > > > > > So when the buffer completes, you inspect the so-far-completed
> > > > > > > > > > > metadata and extract ExposureTime (and AnalogueGain eventually).
> > > > > > > > > > >
> > > > > > > > > > > 1) Why you don't do that at metadataAvailable time ?
> > > > > > > > > >
> > > > > > > > > > Makes a lot of sense. Will try to apply in the next patch.
> > > > > > > > > >
> > > > > > > > > > > 2) What does guarantee that the pipeline has populated ExposureTime at
> > > > > > > > > > > the time the buffer to process has completed ?
> > > > > > > > > >
> > > > > > > > > > That's a very good question. I'd like your input: Do you think we
> > > > > > > > > > should pause the jpeg process until all necessary metadata tags are
> > > > > > > > > > available?
> > > > > > > > > >
> > > > > > > > >
> > > > > > > > > Good question..
> > > > > > > > >
> > > > > > > > > As I see in PostProcessorJpeg::process(StreamBuffer *streamBuffer) the
> > > > > > > > > jpegExifMetadata metadata are used to populate the EXIF data. In the
> > > > > > > > > same function the jpeg post-processor is run, so at the moment EXIF
> > > > > > > > > metadata creation and post-processing happen at the same time and are
> > > > > > > > > in fact run at requestComplete() time.
> > > > > > > > >
> > > > > > > > > Let me get the requirements straight here: I presume the idea is to
> > > > > > > > > produce the jpeg frame as soon as:
> > > > > > > > >
> > > > > > > > > 1) The source buffer is ready
> > > > > > > > > 2) All the required metadata to populate the EXIF tags (collected by
> > > > > > > > > your generateJpegExifMetadata()) are available
> > > > > > > > >
> > > > > > > > > without waiting for the whole request to complete ?
> > > > > > > > >
> > > > > > > > > In your last patch you introduce support for handling
> > > > > > > > > Camera::bufferCompleted and the newly introduced
> > > > > > > > > Camera::metadataAvailable signals but the post-processing still
> > > > > > > > > happens when the buffer is completed without verifying that the
> > > > > > > > > required metadata to populate the EXIF tags are available.
> > > > > > > > >
> > > > > > > > > Does this match your understanding ?
> > > > > > > >
> > > > > > > > Yes, it's my understanding as well. Thanks for sorting the logic out.
> > > > > > > >
> > > > > > > > Han-lin raises a question though: The ExposureTime (and the upcoming
> > > > > > > > AnalogueGain) metadata should already be ready when the jpeg buffer is
> > > > > > > > completed, so perhaps we could expect them to be notified with signal
> > > > > > >
> > > > > > > Yes, "should", however there's nothing that forces a pipeline handler
> > > > > > > to guarantee that they are. For example, a pipeline handler (especially
> > > > > > > an existing one not ported to use the metadataAvailable signal) might not
> > > > > > > support early metadata completion but could signal buffers being
> > > > > > > completed. Unless we make it mandatory for pipelines to signal those
> > > > > > > metadata before buffers and implement compliance tests to validate
> > > > > > > that, I don't think we can assume anything about ordering.
> > > > > > >
> > > > > > > > metadataAvailable earlier than bufferCompleted is called? He
> > > > > > > > suggests adding a WARNING/ERROR log and a \todo, although I feel that
> > > > > > > > you wouldn't like it(?
> > > > > > > >
> > > > > > >
> > > > > > > I don't think signalling metadata at requestComplete time -after-
> > > > > > > bufferCompleted is a warning/error condition. I'll check with others
> > > > > > > what they think about this.
> > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > > >
> > > > > > > > > One possible way to handle this is to
> > > > > > > > > 1) at bufferCompleted time accumulate a list of completed FrameBuffers
> > > > > > > > > to process (and signal them with process_capture_result). Check if the
> > > > > > > > > Request::metadata() where all the partial metadata are accumulated
> > > > > > > > > contains the necessary tags for EXIF; if it does, call process(),
> > > > > > > > > otherwise skip it
> > > > > > > > > 2) at metadataAvailable time check if the EXIF tags are there; if they
> > > > > > > > > are, walk the list of completed buffers to process and run post-processing
> > > > > > > >
> > > > > > > > Yes, basically this is what I have in my mind as well, thanks.
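[The two-step outline above could be sketched roughly as follows; all names here are hypothetical and not from the series. JPEG buffers that complete before the EXIF-relevant metadata arrive are queued, then flushed once the metadata are in:]

```cpp
#include <cassert>
#include <vector>

/* Hypothetical scheduler sketch; an int stands in for a StreamBuffer *. */
class JpegScheduler
{
public:
	/* 1) bufferCompleted: process now if EXIF tags arrived, else queue. */
	void bufferCompleted(int bufferId)
	{
		if (exifReady_)
			process(bufferId);
		else
			pending_.push_back(bufferId);
	}

	/* 2) metadataAvailable: once the EXIF tags are in, flush the queue. */
	void metadataAvailable(bool hasExifTags)
	{
		if (!hasExifTags || exifReady_)
			return;
		exifReady_ = true;
		for (int id : pending_)
			process(id);
		pending_.clear();
	}

	std::vector<int> processed;

private:
	void process(int bufferId) { processed.push_back(bufferId); }

	std::vector<int> pending_;
	bool exifReady_ = false;
};
```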
> > > > > > > >
> > > > > > > > >
> > > > > > > > > I'm sure there will be challenges, but what do you think of this as a
> > > > > > > > > general outline ?
> > > > > > > > >
> > > > > > > > > Now, how to get there... I really think your last patch should be
> > > > > > > > > broken out to pieces. It's too much stuff to digest in one go.
> > > > > > > > >
> > > > > > > > > There are patches in this series that can be fast-tracked,
> > > > > > > > > specifically the ones that make it possible to correctly handle
> > > > > > > > > multiple Mapped streams on one Direct.
> > > > > > > > >
> > > > > > > > > I would take from this series:
> > > > > > > > > 2/9
> > > > > > > > > 3/9 + 4/9 squashed together as suggested in the review
> > > > > > > >
> > > > > > > > Sure, I can squash them if you think that makes more sense. (I spent
> > > > > > > > quite some time breaking this into two though in the beginning haha...
> > > > > > > >
> > > > > > >
> > > > > > > Sorry about that, but the patch is really about supporting multiple
> > > > > > > mapped streams, isn't it ?
> > > > > >
> > > > > > Yeah, and I still think that the 3rd patch doesn't contribute to
> > > > > > supporting multiple mapped streams...
> > > > >
> > > > > You mean 3/9, right ?
> > > > >
> > > > > Feel free to keep it separate if you prefer, as long as 4/9 is clearly
> > > > > about "supporting multiple mapped streams"
> > > >
> > > > Ah okay, thanks :)
> > > >
> > > > >
> > > > > > Haven't uploaded as a patch, but you can take a look at the new commit message:
> > > > > > https://gitlab.freedesktop.org/chenghaoyang/libcamera/-/commit/0c5b916a26878776d29982f8ab43c1155daa1f07
> > > > >
> > > > > Careful, this patch contains a bunch of unrelated refactorings at the
> > > > > end
> > > >
> > > > Yeah, my linter on nvim insists on cleaning up the whole file. I'll
> > > > remove it when uploading :)
> > > >
> > > > >
> > > > > >
> > > > > > No worries, I can still squash it.
> > > > > >
> > > > > > >
> > > > > > > > > 5/9
> > > > > > > > > 6/9
> > > > > > > >
> > > > > > > > I think I'll split this into two: one to use CAMERA3_MSG_ERROR_RESULT,
> > > > > > > > and the other one to report CAMERA3_MSG_ERROR_REQUEST out of order.
> > > > > > > >
> > > > > > >
> > > > > > > Thanks
> > > > > >
> > > > > > We still have some ongoing discussion in the patch. Please help take a
> > > > > > look when you have time. Thanks!
> > > > >
> > > > > On 5/9 and 6/9 ?
> > > > >
> > > > > 5/9 I sent my R-b tag
> > > > > 6/9 I suggested a new commit message but the rest is ok with me
> > > >
> > > > Yeah on 6/9 I assumed you would have questions about introducing
> > > > CAMERA3_MSG_ERROR_RESULT. If not, I'm happy to keep it as is.
> > > >
> > > > >
> > > > > >
> > > > > > >
> > > > > > > > > 7/9 (pending review, I asked Laurent to give there a look)
> > > > > > > > >
> > > > > > > > > Then, prepare a series to fast-track JPEG processing and support
> > > > > > > > > partial results. I would defer 1/9 to that series. Then introduce
> > > > > > > >
> > > > > > > > Sure :)
> > > > > > > >
> > > > > > > > > support for handling bufferCompleted() (without handling
> > > > > > > > > metadataAvailable yet, this will validate that the HAL works with
> > > > > > > > > platforms that do not deliver early metadata). Then handle
> > > > > > > > > metadataAvailable on top.
> > > > > > > >
> > > > > > > > I suggest supporting the metadataAvailable signal first, because:
> > > > > > > >
> > > > > > > > - It actually involves fewer changes, as a set of metadata wouldn't be
> > > > > > > > blocked by post processing, and we could immediately notify the
> > > > > > > > application with a partial result.
> > > > >
> > > > > ack
> > > > >
> > > > > > > > - metadataAvailable and bufferCompleted are actually both new to
> > > > > > > > Android adapter.
> > > > > > > > - I've actually done this split in this order, but haven't sent them
> > > > >
> > > > > fine with this ordering
> > > > >
> > > > > > > > as patches yet. You can check on gitlab first:
> > > > > > > > https://gitlab.freedesktop.org/chenghaoyang/libcamera/-/commit/c25f483b7c4b2e6b2d1fc2eb5cf2851db874ab11
> > > > > > > > https://gitlab.freedesktop.org/chenghaoyang/libcamera/-/commit/97b8c426656755b4a6e21cc8d8397ccd92395481
> > > > > > > >
> > > > > > >
> > > > > > > Sure, as long as we test with CTS with both pipelines supporting
> > > > > > > metadataAvailable and pipelines not supporting metadataAvailable
> > > > > >
> > > > > > Before the whole series get merged, I'll test it on mtkisp7, which
> > > > > > will support the new signal. We also have ipu3 on soraka-libcamera
> > > > > > that doesn't support it yet.
> > > > > >
> > > > >
> > > > > That's great, thanks
> > > > >
> > > > > > >
> > > > > > > > >
> > > > > > > > > Again, there will certainly be challenges, and I'm not even sure the
> > > > > > > > > breakdown of the second part is possible, but indeed I would separate
> > > > > > > > > 1/9 and 8/9 9/9 from the rest of the patches and fast-track the
> > > > > > > > > others. What do you think ?
> > > > > > > >
> > > > > > > > Sure, just one question: Do you think we should merge this Jpeg
> > > > > > > > metadata patch into the other two upcoming patches
> > > > > > >
> > > > > > > Please note supporting metadataAvailable and bufferComplete in the HAL
> > > > > > > might require more than just two patches :)
> > > > > >
> > > > > > I really hope I can split them into pieces, while I'm running out of
> > > > > > ideas... If you can take a brief look and let me know how you expect
> > > > > > them to be split, I'd appreciate that a lot!
> > > > > >
> > > > > > Some random ideas:
> > > > > > On `android: Support partial results with metadataAvailable`:
> > > > > >
> > > > > > - Add a patch that only adds `kMaxMetadataPackIndex`, and use it in
> > > > > > the only result.
> > > > > > - Add `Camera3ResultDescriptor` and
> > > > > > `Camera3RequestDescriptor::finalResult_` (with resultMetadata_
> > > > > > migrated), while I'm not sure if it makes sense...
> > > > > >
> > > > >
> > > > > see below
> > > > >
> > > > > > >
> > > > > > > > (metadataAvailable & bufferCompleted)? It's indeed a bit difficult to
> > > > > > > > implement it without Camera3ResultDescriptor and instances available.
> > > > > > >
> > > > > > > Ideally, after having plumbed in support for metadataAvailable and
> > > > > > > bufferComplete we should be in a state where "processing Mapped
> > > > > > > streams as soon as they're available" should become quite natural to
> > > > > > > be done on top.
> > > > > > >
> > > > > > > I presume you will need Camera3ResultDescriptor already when
> > > > > > > handling (metadataAvailable & bufferCompleted) or am I wrong ?
> > > > > >
> > > > > > Yes, what I meant is that I find it difficult to keep this patch
> > > > > > separated from the upcoming patches, unless we implement it on top of
> > > > > > them. In this way though, getting the correct result metadata when
> > > > >
> > > > > Indeed I think it should happen after support for partial metadata and
> > > > > bufferCompleted have been developed
> > > > >
> > > > > > post-processing with jpeg will be unsupported in the middle. Is that
> > > > >
> > > > > Why do you think so ? As long as process() happens at requestCompleted
> > > > > time, all the metadata info you need will be available in
> > > > > Request::metadata
> > > > >
> > > > > > acceptable?
> > > > >
> > > > > I still fail to see why doing this in lock-step would bring issues,
> > > > > but I admit I only glanced through
> > > > > 8 files changed, 621 insertions(+), 277 deletions(-)
> > > > >
> > > > > Could you please tell me in which of the following steps you'll have
> > > > > issues:
> > > > >
> > > > > 1) Add support for metadataAvailable signal
> > > > >
> > > > > Support the new signal and allow partial metadata completion. The
> > > > > HAL can now call process_capture_result multiple times for the
> > > > > same request.
> > > > >
> > > > > Post-processing is not changed as it happens at requestCompleted
> > > > > time, when the buffers are available and the required metadata to
> > > > > populate EXIF will be available in Request::metadata() as it used
> > > > > to be
> > > > >
> > > > > 2) Add support for bufferCompleted
> > > > >
> > > > > Support earlier buffer completion. The HAL should already be
> > > > > instrumented to send partial results for the Direct buffer that has
> > > > > just completed.
> > >
> > > I assume this includes sending the direct buffers back to the
> > > application when receiving bufferCompleted signals, while delaying
> > > post-processing to requestCompleted.
> > >
> >
> > What makes you think we have to delay post-processing to
> > requestComplete ?
> >
> > My first thought was that as soon as a Direct buffer is ready, if
> > any stream is Mapped on it, we can schedule post-processing.
>
> I assumed that's in the third step (patch). I don't know how to start
> post-processing without the functionality to check whether the required
> metadata to populate the EXIF tags are available (for JPEG).
>
> >
> > > When I tried to implement this, I encountered an issue that I don't
> > > know how to fix: When we send a partial result containing a direct
> > > buffer, the buffer's lifetime cannot be guaranteed. Therefore, if a
> >
> > Yeah, you're now giving it back to the Android framework which is
> > probably free to re-use it as it likes.
> >
> > > Mapped stream/buffer depends on the direct buffer in post-processing,
> > > the post-processing might fail due to the lack of the source buffer.
> > > This basically stops us from sending buffers back to the application
> > > earlier than when all post-processing is done.
> >
> > If, for some reason I'm now missing, we have to delay post-processing
> > to requestCompleted, then Direct buffers on which another Stream is
> > Mapped on have to be delayed to when post-processing is done, as we do
> > today. Only Direct buffers with no mapped Stream can be signalled
> > earlier.
> >
> > As said, anticipating post-processing to when both the source buffer
> > and the required metadata are available might be desirable, but I
> > guess I'm missing something here.
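[The rule in the paragraphs above could be captured in a tiny predicate. This is a hypothetical helper to illustrate the point, not proposed code: a Direct buffer may be signalled to the framework early only when no Mapped stream still needs it as a post-processing source.]

```cpp
#include <cassert>

/*
 * Hypothetical predicate for the rule discussed above: a Direct buffer
 * can be returned to the framework as soon as it completes only if no
 * Mapped stream still needs it as a post-processing source; otherwise
 * it has to wait until post-processing is done.
 */
bool canSignalEarly(unsigned int mappedStreamsOnBuffer, bool postProcessingDone)
{
	return mappedStreamsOnBuffer == 0 || postProcessingDone;
}
```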
>
> Exactly, checking the required metadata is mandatory to run
> post-processing earlier with partial results.
> In our previous plan, it happens after the patch to introduce the
> bufferCompleted signal.
Friendly ping: any update?
BR,
Harvey
>
> BR,
> Harvey
>
> >
> > >
> > > Therefore, unless this patch only sets buffers' status and releases
> > > direct buffers' fences, it's very hard to make it standalone.
> > >
> > > WDYT?
> > >
> > > BR,
> > > Harvey
> > >
> > >
> > > >
> > > > Yes, if we keep the processing (or at least the jpeg one) in
> > > > requestComplete, then we won't drop the feature.
> > > >
> > > > >
> > > > > 3) Schedule post-processing as soon as the source buffer is available
> > > > >
> > > > > Track if for each request the required metadata to populate the EXIF
> > > > > tags are available (for JPEG) and schedule post-processing as soon
> > > > > as the source buffer (and metadata for JPEG) are available.
> > > > >
> > > > > This indeed might require multiple patches to implement tracking of
> > > > > the metadata and buffer status
> > > > >
> > > > > All steps should be validated with a pipeline that supports
> > > > > metadataAvailable and with a pipeline that doesn't.
> > > > >
> > > > > As far as I can tell, all of this currently happens in a single patch (9/9
> > > > > and partially in 8/9). I might be surely missing details and
> > > > > implementation issues, but logically the above seems to me a
> > > > > reasonable break-down of 8/9 and 9/9 ?
> > > >
> > > > Yes, I'll try to implement this. Thanks!
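[Step 3's per-request tracking could start from something as small as this sketch, under the assumption that ExposureTime and AnalogueGain are the EXIF-relevant tags; the class and tag names are illustrative only:]

```cpp
#include <cassert>
#include <set>
#include <string>

/*
 * Hypothetical per-request tracker: record which EXIF-relevant tags
 * have arrived in partial results and report when JPEG post-processing
 * may be scheduled.
 */
class ExifReadiness
{
public:
	void metadataArrived(const std::string &tag) { seen_.insert(tag); }

	bool ready() const
	{
		for (const char *tag : { "ExposureTime", "AnalogueGain" })
			if (!seen_.count(tag))
				return false;
		return true;
	}

private:
	std::set<std::string> seen_;
};
```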
> > > >
> > > > BR,
> > > > Harvey
> > > >
> > > > >
> > > > > >
> > > > > > BR,
> > > > > > Harvey
> > > > > >
> > > > > > >
> > > > > > > >
> > > > > > > > BR,
> > > > > > > > Harvey
> > > > > > > >
> > > > > > > > >
> > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > > >
> > > > > > > > > > > > > WDYT?
> > > > > > > > > > > > >
> > > > > > > > > > > > > [1]: https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/request.cpp#n530
> > > > > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > I might have missed why this is related :)
> > > > > > > > > >
> > > > > > > > > > Okay, I think I misunderstood the \todo, but the point is that it
> > > > > > > > > > doesn't handle the threading issue. We should avoid reading it in the
> > > > > > > > > > post processor thread, especially with partial results, as the Request
> > > > > > > > > > might not have been completed.
> > > > > > > > > >
> > > > > > > > > > BR,
> > > > > > > > > > Harvey
> > > > > > > > > >
> > > > > > > > > > >
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > What is the advantage of caching the jpegExifMetadata at
> > > > > > > > > > > > > > CameraDevice::requestComplete() time ?
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > Thanks
> > > > > > > > > > > > > > j
> > > > > > > > > > > > > >
> > > > > > > > > > > > > > > ret = requestMetadata.getEntry(ANDROID_LENS_APERTURE, &entry);
> > > > > > > > > > > > > > > if (ret)
> > > > > > > > > > > > > > > exif.setAperture(*entry.data.f);
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > - ret = resultMetadata->getEntry(ANDROID_SENSOR_SENSITIVITY, &entry);
> > > > > > > > > > > > > > > - exif.setISO(ret ? *entry.data.i32 : 100);
> > > > > > > > > > > > > > > -
> > > > > > > > > > > > > > > exif.setFlash(Exif::Flash::FlashNotPresent);
> > > > > > > > > > > > > > > exif.setWhiteBalance(Exif::WhiteBalance::Auto);
> > > > > > > > > > > > > > >
> > > > > > > > > > > > > > > --
> > > > > > > > > > > > > > > 2.47.0.338.g60cca15819-goog
> > > > > > > > > > > > > > >