[libcamera-devel] [PATCH v3 10/10] android: camera_stream: Run post processor in a thread

Umang Jain umang.jain at ideasonboard.com
Wed Sep 22 13:44:55 CEST 2021


Hello Laurent,

I'll defer a detailed reply to this review, as some of the conversation 
has since moved to IRC and evolved a bit. I'll try to implement the 
suggestions, see what works and how it shapes up overall, and write up 
my thoughts here (from an implementation point of view) as Part III. I 
hope that's ok! I've also added a few rough sketches inline below, just 
to check my understanding.

On 9/22/21 5:28 AM, Laurent Pinchart wrote:
> Hi Umang,
>
> Thank you for the patch.
>
> On Mon, Sep 20, 2021 at 11:07:52PM +0530, Umang Jain wrote:
>> This commit makes the post processor run in a separate thread.
>> To enable the move to a separate thread, he post processor
> s/he post/the post/
>
>> needs to be inherited from libcamera::Object and use the
> s/to be inherited/to inherit/
>
>> Object::moveToThread() API to execute the move.
>>
>> A user defined destructor is introduced to stop the thread for cleanup.
> s/user defined/user-defined/
>
>> Since CameraStream is move-constructible and the compiler implicitly
>> generates its constructor, defining a destructor will prevent the
>> compiler from adding an implicitly-declared move constructor. Therefore,
>> we need to explicitly add a move constructor as well.
>>
>> Also, the destination buffer for post-processing needs to be alive and
>> stored as part of the overall context, hence a CameraBuffer unique-pointer
>> member is introduced in the Camera3RequestDescriptor struct.
>>
>> ---
>>   /* \todo Use ConnectionTypeQueued instead of ConnectionTypeBlocking */
> Indeed, that defeats the point of a thread a little bit :-) What's
> blocking the switch to the queued connection type ?
>
>> ---
>>
>> Signed-off-by: Umang Jain <umang.jain at ideasonboard.com>
>> ---
>>   src/android/camera_device.h   |  2 ++
>>   src/android/camera_stream.cpp | 29 +++++++++++++++++++++--------
>>   src/android/camera_stream.h   |  5 +++++
>>   src/android/post_processor.h  |  3 ++-
>>   4 files changed, 30 insertions(+), 9 deletions(-)
>>
>> diff --git a/src/android/camera_device.h b/src/android/camera_device.h
>> index 0bd570a1..e3f3fe7c 100644
>> --- a/src/android/camera_device.h
>> +++ b/src/android/camera_device.h
>> @@ -27,6 +27,7 @@
>>   #include <libcamera/request.h>
>>   #include <libcamera/stream.h>
>>   
>> +#include "camera_buffer.h"
>>   #include "camera_capabilities.h"
>>   #include "camera_metadata.h"
>>   #include "camera_stream.h"
>> @@ -55,6 +56,7 @@ struct Camera3RequestDescriptor {
>>   	CameraMetadata settings_;
>>   	std::unique_ptr<CaptureRequest> request_;
>>   	std::unique_ptr<CameraMetadata> resultMetadata_;
>> +	std::unique_ptr<CameraBuffer> destBuffer_;
>>   
>>   	camera3_capture_result_t captureResult_ = {};
>>   	libcamera::FrameBuffer *internalBuffer_;
>> diff --git a/src/android/camera_stream.cpp b/src/android/camera_stream.cpp
>> index c18c7041..6578ce09 100644
>> --- a/src/android/camera_stream.cpp
>> +++ b/src/android/camera_stream.cpp
>> @@ -55,6 +55,9 @@ CameraStream::CameraStream(CameraDevice *const cameraDevice,
>>   		 * is what we instantiate here.
>>   		 */
>>   		postProcessor_ = std::make_unique<PostProcessorJpeg>(cameraDevice_);
>> +		thread_ = std::make_unique<libcamera::Thread>();
> I don't think you need to allocate the thread dynamically, you can have
> a
>
> 	libcamera::Thread thread_;
>
> in CameraStream.
>
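If I understand the suggestion correctly, with a plain libcamera::Thread
member in CameraStream the constructor hunk would simply become
(untested):

	postProcessor_ = std::make_unique<PostProcessorJpeg>(cameraDevice_);
	postProcessor_->moveToThread(&thread_);
	thread_.start();
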
>> +		postProcessor_->moveToThread(thread_.get());
>> +		thread_->start();
>>   	}
>>   
>>   	if (type == Type::Internal) {
>> @@ -63,6 +66,14 @@ CameraStream::CameraStream(CameraDevice *const cameraDevice,
>>   	}
>>   }
> Please add
>
> CameraStream::CameraStream(CameraStream &&other) = default;
>
> here and drop the ' = default' in the header file, we don't need to
> inline the move-constructor.
>
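Just to make sure I read this right: the declaration in camera_stream.h
becomes

	CameraStream(CameraStream &&other);

and camera_stream.cpp gets

	CameraStream::CameraStream(CameraStream &&other) = default;
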
>> +CameraStream::~CameraStream()
>> +{
>> +	if (thread_) {
>> +		thread_->exit();
>> +		thread_->wait();
>> +	}
> This makes me realize that we're missing support for flushing
> the requests queued to the post-processor. That will require some extra
> work, and I'm tempted to say we should get this series merged with
> ConnectionTypeBlocking first as an intermediate step. I'm fairly
> confident this is going in the right direction and that implementing
> flush support won't require a complete redesign, but there's still this
> little voice that tells me the risk is not zero. Feel free to agree or
> disagree :-)
>
> One thing that will likely need to be reworked more extensively is this
> patch though. The most naive implementation would be to add start() and
> stop() functions to CameraStream. start() will start the thread, stop()
> will stop it (calling exit and wait). You'll have to create a
> custom class inheriting from Thread (really sorry for going back and
> forth on this topic :-S) to reimplement run() and call
>
> 	dispatchMessages(Message::Type::InvokeMessage);
>
> after calling exec(). Otherwise you'll have queued process() calls that
> will be left dangling. Performance won't be great, as *all* queued
> requests will be processed, even the ones whose processing hasn't
> started yet when stop() is called.
>
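To check my understanding of this naive option, a rough and untested
sketch (the class name is just a placeholder):

	#include <libcamera/base/message.h>
	#include <libcamera/base/thread.h>

	class PostProcessorThread : public libcamera::Thread
	{
	protected:
		void run() override
		{
			/* Run the event loop until exit() is called from stop(). */
			exec();

			/*
			 * Dispatch the invokeMethod() calls that were still
			 * queued when the event loop exited, so that no
			 * process() call is left dangling.
			 */
			dispatchMessages(libcamera::Message::Type::InvokeMessage);
		}
	};
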
> To make it better, we should complete the ongoing processing, and then
> complete the other requests that have been queued but haven't been
> processed yet. This will require keeping a queue of pending requests in
> the camera stream. Requests will be added to the queue before they get
> passed to the post-processor, and removed when the post-processor
> signals completion. You'll be able to remove the dispatchMessages() call
> (and won't have to reimplement Thread::run(), so no need to subclass it
> after all \o/), and after the thread stops (after wait() returns),
> you'll have to go through the queue and signal completion of all
> requests still stored there. This shouldn't be too complex, I'd aim for
> this option directly as you won't need to subclass Thread.
>
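That sounds good to me. A rough sketch of how I picture it, with
pendingRequests_ as a hypothetical new member of CameraStream (say, a
std::deque<Camera3RequestDescriptor *>) and locking omitted for brevity:

	/* CameraStream::process(), before handing off to the post processor: */
	pendingRequests_.push_back(request);

	postProcessor_->invokeMethod(&PostProcessor::process,
				     ConnectionTypeQueued, source,
				     request->destBuffer_.get(), request);

	/* The post processor completion handler erases the request again. */

	/* CameraStream::stop(): */
	thread_.exit();
	thread_.wait();

	/* Complete the requests still pending when the thread stopped. */
	for (Camera3RequestDescriptor *request : pendingRequests_) {
		request->status_ = Camera3RequestDescriptor::ProcessStatus::Error;
		handleProcessComplete(request);
	}
	pendingRequests_.clear();
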
> Ideally, to avoid delays when starting the camera and when flushing it,
> we should keep the thread running at all times (when no work will be
> queued to it, it will not consume any CPU time). There's no way to
> instruct a running thread to finish the ongoing processing and drop the
> pending invokeMethod() calls, so you'll need to subclass Thread (did I
> mention going back and forth ? :-)) and reimplement run(). This time you
> won't be able to rely on exec() and invokeMethod(), so you'll have to
> implement a queue of requests in the thread subclass, a custom loop in
> run() that will process the requests from the queue, and add a condition
> variable to signal to the run() function that a new request has been
> added to the queue, or that a flush() has been requested. That way
> you'll be able to get rid of start() and stop(), and only add flush().
>
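And for the last variant, the way I currently picture it, as a very
rough and untested sketch: the class and member names are placeholders,
completeWithError() stands in for whatever error-completion path we end
up with, stopping_ would only be set on teardown, and waiting for the
ongoing process() call during flush() is glossed over:

	#include <condition_variable>
	#include <deque>
	#include <mutex>

	#include <libcamera/base/thread.h>

	#include <libcamera/framebuffer.h>

	#include "camera_buffer.h"
	#include "post_processor.h"

	class PostProcessorWorker : public libcamera::Thread
	{
	public:
		PostProcessorWorker(PostProcessor *postProcessor)
			: postProcessor_(postProcessor)
		{
		}

		void queueRequest(const libcamera::FrameBuffer *source,
				  CameraBuffer *dest,
				  Camera3RequestDescriptor *request)
		{
			std::lock_guard<std::mutex> lock(mutex_);
			requests_.push_back({ source, dest, request });
			cv_.notify_one();
		}

		void flush()
		{
			std::unique_lock<std::mutex> lock(mutex_);

			/* Drop the requests whose processing hasn't started. */
			std::deque<Entry> dropped = std::move(requests_);
			requests_.clear();
			lock.unlock();

			for (const Entry &entry : dropped)
				completeWithError(entry.request);
		}

	protected:
		void run() override
		{
			std::unique_lock<std::mutex> lock(mutex_);

			while (!stopping_) {
				cv_.wait(lock, [&] {
					return stopping_ || !requests_.empty();
				});
				if (stopping_)
					break;

				Entry entry = requests_.front();
				requests_.pop_front();

				lock.unlock();
				postProcessor_->process(entry.source, entry.dest,
							entry.request);
				lock.lock();
			}
		}

	private:
		struct Entry {
			const libcamera::FrameBuffer *source;
			CameraBuffer *dest;
			Camera3RequestDescriptor *request;
		};

		void completeWithError([[maybe_unused]] Camera3RequestDescriptor *request)
		{
			/* Placeholder: signal error completion to CameraStream. */
		}

		PostProcessor *postProcessor_;
		std::mutex mutex_;
		std::condition_variable cv_;
		std::deque<Entry> requests_;
		bool stopping_ = false;	/* set on teardown, not shown here */
	};
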
>> +}
>> +
>>   const StreamConfiguration &CameraStream::configuration() const
>>   {
>>   	return config_->at(index_);
>> @@ -111,21 +122,23 @@ void CameraStream::process(const FrameBuffer *source,
>>   		return;
>>   	}
>>   
>> -	/*
>> -	 * \todo Buffer mapping and processing should be moved to a
>> -	 * separate thread.
>> -	 */
>>   	const StreamConfiguration &output = configuration();
>> -	CameraBuffer dest(camera3Dest, formats::MJPEG, output.size,
>> -			  PROT_READ | PROT_WRITE);
>> -	if (!dest.isValid()) {
>> +	std::unique_ptr<CameraBuffer> dest =
>> +		std::make_unique<CameraBuffer>(camera3Dest, formats::MJPEG,
>> +					       output.size, PROT_READ | PROT_WRITE);
>> +
>> +	if (!dest->isValid()) {
>>   		LOG(HAL, Error) << "Failed to map android blob buffer";
>>   		request->status_ = Camera3RequestDescriptor::ProcessStatus::Error;
>>   		handleProcessComplete(request);
>>   		return;
>>   	}
> Blank line.
>
>> +	request->destBuffer_ = std::move(dest);
>>   
>> -	postProcessor_->process(source, &dest, request);
>> +	/* \todo Use ConnectionTypeQueued instead of ConnectionTypeBlocking */
>> +	postProcessor_->invokeMethod(&PostProcessor::process,
>> +				     ConnectionTypeBlocking, source,
>> +				     request->destBuffer_.get(), request);
>>   }
>>   
>>   void CameraStream::handleProcessComplete(Camera3RequestDescriptor *request)
>> diff --git a/src/android/camera_stream.h b/src/android/camera_stream.h
>> index d4ec5c25..42db72d8 100644
>> --- a/src/android/camera_stream.h
>> +++ b/src/android/camera_stream.h
>> @@ -13,6 +13,8 @@
>>   
>>   #include <hardware/camera3.h>
>>   
>> +#include <libcamera/base/thread.h>
>> +
>>   #include <libcamera/camera.h>
>>   #include <libcamera/framebuffer.h>
>>   #include <libcamera/framebuffer_allocator.h>
>> @@ -113,6 +115,8 @@ public:
>>   	CameraStream(CameraDevice *const cameraDevice,
>>   		     libcamera::CameraConfiguration *config, Type type,
>>   		     camera3_stream_t *camera3Stream, unsigned int index);
>> +	CameraStream(CameraStream &&other) = default;
>> +	~CameraStream();
>>   
>>   	Type type() const { return type_; }
>>   	const camera3_stream_t &camera3Stream() const { return *camera3Stream_; }
>> @@ -143,6 +147,7 @@ private:
>>   	 */
>>   	std::unique_ptr<std::mutex> mutex_;
>>   	std::unique_ptr<PostProcessor> postProcessor_;
>> +	std::unique_ptr<libcamera::Thread> thread_;
>>   };
>>   
>>   #endif /* __ANDROID_CAMERA_STREAM__ */
>> diff --git a/src/android/post_processor.h b/src/android/post_processor.h
>> index 48ddd8ac..fdfd52d3 100644
>> --- a/src/android/post_processor.h
>> +++ b/src/android/post_processor.h
>> @@ -7,6 +7,7 @@
>>   #ifndef __ANDROID_POST_PROCESSOR_H__
>>   #define __ANDROID_POST_PROCESSOR_H__
>>   
>> +#include <libcamera/base/object.h>
>>   #include <libcamera/base/signal.h>
>>   
>>   #include <libcamera/framebuffer.h>
>> @@ -18,7 +19,7 @@ class CameraMetadata;
>>   
>>   struct Camera3RequestDescriptor;
>>   
>> -class PostProcessor
>> +class PostProcessor : public libcamera::Object
>>   {
>>   public:
>>   	virtual ~PostProcessor() = default;

