[libcamera-devel] [PATCH] ipa: raspberrypi: Rate-limit the controller algorithms

David Plowman david.plowman at raspberrypi.com
Thu Apr 15 16:25:58 CEST 2021


Hi Naush

Thanks for this patch, I think it's a good idea!

On Thu, 15 Apr 2021 at 11:18, Naushir Patuck <naush at raspberrypi.com> wrote:
>
> The controller algorithms currently run on every frame provided to the
> IPA by the pipeline handler. This may be undesirable for very fast fps
> operating modes where it could significantly increase the computation
> cycles (per unit time) without providing any significant changes to the
> IQ parameters. The added latencies could also cause dropped frames.
>
> Pass the FrameBuffer timestamp to the IPA through the controls. This
> timestamp will be used to rate-limit the controller algorithms to run
> with a minimum inter-frame time given by a compile time constant,
> currently set to 16.66ms.
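
Just to make sure I've followed the arithmetic, the decision seems to boil
down to something like this (a standalone sketch with my own names, not the
patch itself; buffer timestamps are in nanoseconds and the threshold in
microseconds, as in the diff below):

#include <cstdint>
#include <cstdio>

/* Same value as the patch: 1e6 / 60.0 us, i.e. roughly 16.66 ms. */
constexpr double controllerMinFrameDurationUs = 1e6 / 60.0;

/*
 * Run the algorithms only if at least controllerMinFrameDurationUs has
 * elapsed since the last run. Timestamps are in ns, hence the * 1e3.
 */
bool runControllerForFrame(uint64_t frameTimestampNs, uint64_t &lastRunTimestampNs)
{
        if (lastRunTimestampNs &&
            frameTimestampNs - lastRunTimestampNs < controllerMinFrameDurationUs * 1e3)
                return false; /* Rate-limited: skip process() for this frame. */

        lastRunTimestampNs = frameTimestampNs;
        return true;
}

int main()
{
        uint64_t lastRun = 0;
        runControllerForFrame(1000000000ULL, lastRun);            /* first frame always runs */
        bool ran = runControllerForFrame(1008333333ULL, lastRun); /* 8.33 ms later */
        std::printf("8.33 ms after the last run: %s\n",
                    ran ? "ran" : "rate-limited");
        return 0;
}
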
>
> Signed-off-by: Naushir Patuck <naush at raspberrypi.com>
> ---
>  src/ipa/raspberrypi/raspberrypi.cpp           | 39 ++++++++++++++++++-
>  .../pipeline/raspberrypi/raspberrypi.cpp      |  5 +++
>  2 files changed, 42 insertions(+), 2 deletions(-)
>
> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> index dad6395f0823..aa071473d6e7 100644
> --- a/src/ipa/raspberrypi/raspberrypi.cpp
> +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> @@ -61,6 +61,14 @@ constexpr unsigned int DefaultExposureTime = 20000;
>  constexpr double defaultMinFrameDuration = 1e6 / 30.0;
>  constexpr double defaultMaxFrameDuration = 1e6 / 0.01;
>
> +/*
> + * Determine the minimum allowable inter-frame duration (in us) to run the
> + * controller algorithms. If the pipeline handler provides frames at a rate
> + * higher than this, we rate-limit the controller prepare() and process() calls

It wasn't clear to me from this comment whether prepare() is being
rate-limited too. More on that below...

> + * to lower than or equal to this rate.
> + */
> +constexpr double controllerMinFrameDuration = 1e6 / 60.0;
> +
>  LOG_DEFINE_CATEGORY(IPARPI)
>
>  class IPARPi : public ipa::RPi::IPARPiInterface
> @@ -68,7 +76,7 @@ class IPARPi : public ipa::RPi::IPARPiInterface
>  public:
>         IPARPi()
>                 : controller_(), frameCount_(0), checkCount_(0), mistrustCount_(0),
> -                 lsTable_(nullptr), firstStart_(true)
> +                 lastRunTimestamp_(0), lsTable_(nullptr), firstStart_(true)
>         {
>         }
>
> @@ -145,6 +153,12 @@ private:
>         /* How many frames we should avoid running control algos on. */
>         unsigned int mistrustCount_;
>
> +       /* Frame timestamp for the last run of the controller. */
> +       uint64_t lastRunTimestamp_;
> +
> +       /* Do we run a Controller::process() for this frame? */
> +       bool processPending_;
> +
>         /* LS table allocation passed in from the pipeline handler. */
>         FileDescriptor lsTableHandle_;
>         void *lsTable_;
> @@ -406,7 +420,7 @@ void IPARPi::signalStatReady(uint32_t bufferId)
>  {
>         if (++checkCount_ != frameCount_) /* assert here? */
>                 LOG(IPARPI, Error) << "WARNING: Prepare/Process mismatch!!!";
> -       if (frameCount_ > mistrustCount_)
> +       if (processPending_ && frameCount_ > mistrustCount_)
>                 processStats(bufferId);

Although it's not directly related to this bit of code, I wonder whether
there's an interaction between skipping the algorithms and the frame
dropping we do at startup (while waiting for those algorithms to
converge). That is, do those "drop frames" now elapse faster than the
controller can run, leaving the algorithms fewer iterations in which to
converge?
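
To put rough numbers on the concern (all made up: a 90 fps sensor mode and
six startup drop frames), a quick simulation of the patch's rate-limit
check suggests only half of those drop frames would actually run the
algorithms:

#include <cstdint>
#include <cstdio>

int main()
{
        constexpr double minFrameDurationUs = 1e6 / 60.0; /* as in the patch */
        constexpr double sensorFps = 90.0;                /* hypothetical fast mode */
        constexpr int dropFrames = 6;                     /* hypothetical startup drop count */

        uint64_t lastRunTimestamp = 0;
        int runs = 0;

        for (int frame = 0; frame < dropFrames; frame++) {
                /* Fake nanosecond timestamps from an arbitrary base. */
                uint64_t timestamp = 1000000000ULL +
                                     static_cast<uint64_t>(frame * 1e9 / sensorFps);

                /* The same check as prepareISP() in the patch. */
                if (lastRunTimestamp &&
                    timestamp - lastRunTimestamp < minFrameDurationUs * 1e3)
                        continue; /* rate-limited: no process() on this frame */

                lastRunTimestamp = timestamp;
                runs++;
        }

        std::printf("%d of %d drop frames would run the algorithms\n",
                    runs, dropFrames);
        return 0;
}

Of course the real numbers depend on the mode and the drop count, so this
is only illustrative.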

>
>         reportMetadata();
> @@ -894,6 +908,7 @@ void IPARPi::returnEmbeddedBuffer(unsigned int bufferId)
>
>  void IPARPi::prepareISP(const ipa::RPi::ISPConfig &data)
>  {
> +       int64_t frameTimestamp = data.controls.get(controls::draft::SensorTimestamp);
>         struct DeviceStatus deviceStatus = {};
>         bool success = false;
>
> @@ -919,6 +934,26 @@ void IPARPi::prepareISP(const ipa::RPi::ISPConfig &data)
>                 fillDeviceStatus(exposureLines, gainCode, deviceStatus);
>         }
>
> +       if (lastRunTimestamp_ &&
> +           frameTimestamp - lastRunTimestamp_ < controllerMinFrameDuration * 1e3) {
> +               /*
> +                * Ensure we update the controller metadata with the new frame's
> +                * exposure/gain values so that the correct values are returned
> +                * out in libcamera metadata later on. All other metadata values
> +                * must remain the same as the last frame.
> +                */
> +               rpiMetadata_.Set("device.status", deviceStatus);
> +               processPending_ = false;
> +               LOG(IPARPI, Debug) << "Rate-limiting the controller! inter-frame duration: "
> +                                  << frameTimestamp - lastRunTimestamp_
> +                                  << ", min duration "
> +                                  << controllerMinFrameDuration * 1e3;
> +               return;
> +       }
> +
> +       lastRunTimestamp_ = frameTimestamp;
> +       processPending_ = true;
> +
>         ControlList ctrls(ispCtrls_);
>
>         rpiMetadata_.Clear();

So from here I'm inferring that we don't skip prepare(). Mostly I think
that's fine, as prepare() does much less work; ALSC is probably the most
expensive part, as it interpolates the lens shading tables.

One little corner we might want to watch out for is something like the
AE/AGC lock count. Agc::Prepare() counts how long things have appeared
to be "steady", and if Process() runs less often than Prepare(), it
seems possible that Prepare() may see things as steadier than they
really are. Maybe we're OK, but it's worth checking.
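
The kind of thing I mean, as a toy (emphatically not the real Agc code,
and using a simple count of "frames that looked steady" rather than the
real lock logic, just to show the direction of the effect):

#include <cmath>
#include <cstdio>

/*
 * Caricature: prepare() runs for every frame and judges whether the
 * delivered exposure looks "steady"; process() only runs on the frames
 * that aren't rate-limited and is what actually moves the request while
 * the algorithm is still converging.
 */
struct ToyAgc {
        double requested = 10000.0; /* us, made-up starting point */
        unsigned int steadyFrames = 0;

        void prepare(double delivered)
        {
                /* Looks steady if the sensor delivered what we're asking for. */
                if (std::abs(delivered - requested) < 0.01 * requested)
                        steadyFrames++;
        }

        void process()
        {
                /* Pretend the scene still wants 5% more exposure every update. */
                requested *= 1.05;
        }
};

int main()
{
        const int schedules[] = { 1, 2 }; /* process() every frame vs. every other frame */

        for (int everyN : schedules) {
                ToyAgc agc;
                double delivered = agc.requested;

                for (int frame = 0; frame < 8; frame++) {
                        agc.prepare(delivered);
                        delivered = agc.requested; /* one-frame delivery lag */
                        if (frame % everyN == 0)
                                agc.process();
                }

                std::printf("process() every %d frame(s): %u of 8 frames looked steady\n",
                            everyN, agc.steadyFrames);
        }
        return 0;
}

With process() rate-limited to every other frame, half the frames look
"steady" even though the algorithm is still actively ramping the exposure,
which is the sort of over-counting I'd want to rule out.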

Also, I have a feeling we may get conflicts here with the
CamHelper::Prepare/Process patch. Now it's a race to see who gets in
first! :)

Thanks!

David

> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index 2a917455500f..9cf9c8c6cebd 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -1414,6 +1414,11 @@ void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)
>                  * DelayedControl and queue them along with the frame buffer.
>                  */
>                 ControlList ctrl = delayedCtrls_->get(buffer->metadata().sequence);
> +               /*
> +                * Add the frame timestamp to the ControlList for the IPA to use
> +                * as it does not receive the FrameBuffer object.
> +                */
> +               ctrl.set(controls::draft::SensorTimestamp, buffer->metadata().timestamp);
>                 bayerQueue_.push({ buffer, std::move(ctrl) });
>         } else {
>                 embeddedQueue_.push(buffer);
> --
> 2.25.1
>

