[libcamera-devel] [PATCH v2 2/3] libcamera: raspberrypi: Add control of sensor vblanking

Laurent Pinchart laurent.pinchart at ideasonboard.com
Mon May 25 01:16:08 CEST 2020


Hi Naush,

Thank you for the patch.

On Wed, May 13, 2020 at 10:11:19AM +0100, Naushir Patuck wrote:
> Add support for setting V4L2_CID_VBLANK appropriately when setting
> V4L2_CID_EXPOSURE. This will allow adaptive framerates during
> viewfinder use cases (e.g. when the exposure time goes above 33ms, we
> can reduce the framerate to lower than 30fps).
> 
> The minimum and maximum frame durations are provided via libcamera
> controls, and will prioritise exposure time limits over any AGC request.

General question about the AGC and frame duration selection behaviour.
When the scene illumination drops and the application allows the frame
rate to be lowered, is it best to adapt the frame duration continuously,
or jump from, let's say, 30fps to 15fps and stay at 15fps ? I could
imagine adaptive frame durations being annoying to handle for codecs
or displays, but I probably lack a good view of the details.

> V4L2_CID_VBLANK is controlled through the staggered writer, just like
> the exposure and gain controls.
> 
> Signed-off-by: Naushir Patuck <naush at raspberrypi.com>
> ---
>  include/ipa/raspberrypi.h                     |  1 +
>  src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
>  src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
>  src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
>  src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
>  src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
>  src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
>  .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
>  8 files changed, 124 insertions(+), 13 deletions(-)
> 
> diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
> index c109469e..74954db2 100644
> --- a/include/ipa/raspberrypi.h
> +++ b/include/ipa/raspberrypi.h
> @@ -51,6 +51,7 @@ static const ControlInfoMap RPiControls = {
>  	{ &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
>  	{ &controls::Contrast, ControlInfo(0.0f, 32.0f) },
>  	{ &controls::Saturation, ControlInfo(0.0f, 32.0f) },
> +	{ &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
>  };
>  
>  } /* namespace libcamera */
> diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
> index 7f05d2c6..06732241 100644
> --- a/src/ipa/raspberrypi/cam_helper.cpp
> +++ b/src/ipa/raspberrypi/cam_helper.cpp
> @@ -34,8 +34,10 @@ CamHelper *CamHelper::Create(std::string const &cam_name)
>  	return nullptr;
>  }
>  
> -CamHelper::CamHelper(MdParser *parser)
> -	: parser_(parser), initialized_(false)
> +CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
> +		     unsigned int frameIntegrationDiff)
> +	: parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
> +	  frameIntegrationDiff_(frameIntegrationDiff)
>  {
>  }
>  
> @@ -56,6 +58,38 @@ double CamHelper::Exposure(uint32_t exposure_lines) const
>  	return exposure_lines * mode_.line_length / 1000.0;
>  }
>  
> +uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
> +				 double maxFrameDuration) const
> +{
> +	uint32_t frameLengthMin, frameLengthMax, vblank;
> +	uint32_t exposureLines = ExposureLines(exposure);
> +
> +	assert(initialized_);
> +	/*
> +	 * Clip frame length by the frame duration range and the maximum allowable
> +	 * value in the sensor, given by maxFrameLength_.
> +	 */
> +	frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
> +					    maxFrameLength_);
> +	frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
> +					    maxFrameLength_);

Not something to be addressed as part of this series, but as we divide
and multiply by 1000 in various places, should we specify all duration
controls in nanoseconds instead of microseconds ?

> +	/*
> +	 * Limit the exposure to the maximum frame duration requested, and
> +	 * re-calculate if it has been clipped.
> +	 */
> +	exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
> +				 exposureLines);
> +	exposure = Exposure(exposureLines);
> +
> +	/* Limit the vblank to the range allowed by the frame length limits. */
> +	vblank = std::max<uint32_t>(mode_.height, exposureLines) -
> +		 mode_.height + frameIntegrationDiff_;

If the exposure time is shorter than the frame height, do we need the
frameIntegrationDiff_ margin ? Assuming a sensor that would have a
minimum vblank of 0 and a frameIntegrationDiff_ of 40, if height was
1080 and exposureLines 500, there should be no need to set vblank to 40,
right ?

> +	vblank = std::max(vblank, frameLengthMin - mode_.height);
> +	vblank = std::min(vblank, frameLengthMax - mode_.height);

You can use

	vblank = utils::clamp(vblank, frameLengthMin - mode_.height,
			      frameLengthMax - mode_.height);

> +
> +	return vblank;
> +}
> +
>  void CamHelper::SetCameraMode(const CameraMode &mode)
>  {
>  	mode_ = mode;
> diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> index 0c8aa29a..fdcdbddb 100644
> --- a/src/ipa/raspberrypi/cam_helper.hpp
> +++ b/src/ipa/raspberrypi/cam_helper.hpp
> @@ -68,12 +68,15 @@ class CamHelper
>  {
>  public:
>  	static CamHelper *Create(std::string const &cam_name);
> -	CamHelper(MdParser *parser);
> +	CamHelper(MdParser *parser, unsigned int maxFrameLength,
> +		  unsigned int frameIntegrationDiff);
>  	virtual ~CamHelper();
>  	void SetCameraMode(const CameraMode &mode);
>  	MdParser &Parser() const { return *parser_; }
>  	uint32_t ExposureLines(double exposure_us) const;
>  	double Exposure(uint32_t exposure_lines) const; // in us
> +	virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> +				      double maxFrameDuration) const;

Does this function need to be virtual ?

>  	virtual uint32_t GainCode(double gain) const = 0;
>  	virtual double Gain(uint32_t gain_code) const = 0;
>  	virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
> @@ -83,10 +86,20 @@ public:
>  	virtual unsigned int MistrustFramesStartup() const;
>  	virtual unsigned int MistrustFramesModeSwitch() const;
>  	virtual CamTransform GetOrientation() const;
> +
>  protected:
>  	MdParser *parser_;
>  	CameraMode mode_;
> +
> +private:
>  	bool initialized_;
> +	/* Largest possible frame length, in units of lines. */
> +	unsigned int maxFrameLength_;
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	unsigned int frameIntegrationDiff_;
>  };
>  
>  // This is for registering camera helpers with the system, so that the
> diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> index 35c6597c..fd95a31a 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> @@ -50,13 +50,22 @@ public:
>  	unsigned int MistrustFramesModeSwitch() const override;
>  	bool SensorEmbeddedDataPresent() const override;
>  	CamTransform GetOrientation() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 4;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffff;
>  };
>  
>  CamHelperImx219::CamHelperImx219()
>  #if ENABLE_EMBEDDED_DATA
>  	: CamHelper(new MdParserImx219())

Shouldn't this be updated too ?

>  #else
> -	: CamHelper(new MdParserRPi())
> +	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
>  #endif
>  {
>  }
> diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> index 69544456..4a1cab76 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> @@ -39,10 +39,19 @@ public:
>  	double Gain(uint32_t gain_code) const override;
>  	bool SensorEmbeddedDataPresent() const override;
>  	CamTransform GetOrientation() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 22;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffdc;
>  };
>  
>  CamHelperImx477::CamHelperImx477()
> -	: CamHelper(new MdParserImx477())
> +	: CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
>  {
>  }
>  
> diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> index 3dbcb164..d814fa90 100644
> --- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> @@ -22,6 +22,15 @@ public:
>  	unsigned int HideFramesModeSwitch() const override;
>  	unsigned int MistrustFramesStartup() const override;
>  	unsigned int MistrustFramesModeSwitch() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 4;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffff;
>  };
>  
>  /*
> @@ -30,7 +39,7 @@ public:
>   */
>  
>  CamHelperOv5647::CamHelperOv5647()
> -	: CamHelper(new MdParserRPi())
> +	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
>  {
>  }
>  
> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> index 3bcc0815..1af5e74a 100644
> --- a/src/ipa/raspberrypi/raspberrypi.cpp
> +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> @@ -53,6 +53,8 @@ namespace libcamera {
>  /* Configure the sensor with these values initially. */
>  #define DEFAULT_ANALOGUE_GAIN 1.0
>  #define DEFAULT_EXPOSURE_TIME 20000
> +#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
> +#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
>  
>  LOG_DEFINE_CATEGORY(IPARPI)
>  
> @@ -136,6 +138,9 @@ private:
>  	/* LS table allocation passed in from the pipeline handler. */
>  	uint32_t lsTableHandle_;
>  	void *lsTable_;
> +
> +	/* Frame duration (1/fps) given in microseconds. */
> +	double minFrameDuration_, maxFrameDuration_;

Shouldn't these be float, or uint32_t depending on the outcome of the
discussion for patch 1/3 ?

>  };
>  
>  int IPARPi::init(const IPASettings &settings)
> @@ -252,13 +257,20 @@ void IPARPi::configure(const CameraSensorInfo &sensorInfo,
>  		controller_.Initialise();
>  		controllerInit_ = true;
>  
> -		/* Calculate initial values for gain and exposure. */
> +		/* Calculate initial values for gain, vblank, and exposure */
> +		minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
> +		maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
> +
> +		double exposure = DEFAULT_EXPOSURE_TIME;
> +		int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
> +						       maxFrameDuration_);
> +		int32_t exposure_lines = helper_->ExposureLines(exposure);
>  		int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
> -		int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
>  
>  		ControlList ctrls(unicam_ctrls_);
> -		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> +		ctrls.set(V4L2_CID_VBLANK, vblank);
>  		ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> +		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
>  
>  		IPAOperationData op;
>  		op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> @@ -630,6 +642,17 @@ void IPARPi::queueRequest(const ControlList &controls)
>  			break;
>  		}
>  
> +		case controls::FRAME_DURATIONS: {
> +			auto frameDurations = ctrl.second.get<Span<const float>>();
> +
> +			/* This will be applied once AGC recalculations occur. */
> +			minFrameDuration_ = frameDurations[0];
> +			maxFrameDuration_ = frameDurations[1];

Should the values be adjusted based on the sensor capabilities ?

> +			libcameraMetadata_.set(controls::FrameDurations,
> +					       { frameDurations[0], frameDurations[1] });

Hmmm... From an application point of view, wouldn't it be more useful to
know what frame duration was actually used for the current frame ? You
don't have to implement this as part of this series, but I could imagine
a FrameDuration metadata control being useful for that purpose.
FrameDurations would be set by applications, and FrameDuration would be
reported through metadata. (On a side note, would FrameDurationLimits be
a better name than FrameDuration then ?)

> +			break;
> +		}
> +
>  		default:
>  			LOG(IPARPI, Warning)
>  				<< "Ctrl " << controls::controls.at(ctrl.first)->name()
> @@ -795,7 +818,12 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
>  	op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
>  
>  	int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
> -	int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
> +
> +	/* GetVBlanking might clip exposure time to the fps limits. */
> +	double exposure = agcStatus->shutter_time;
> +	int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
> +						  maxFrameDuration_);

I had completely missed, when reviewing the GetVBlanking()
implementation above, that the first parameter was a reference used to
return a value. We tend in libcamera to use const references for input
parameters, and pointers for output or in/out parameters to make that
clearer. It's not ideal though, as a pointer can be null, and using a
reference when a parameter must not be null is nicer. No need to change
the code here, but a comment on top of GetVBlanking() would be useful to
point out that the exposure time can be adjusted.

> +	int32_t exposure_lines = helper_->ExposureLines(exposure);
>  
>  	if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
>  		LOG(IPARPI, Error) << "Can't find analogue gain control";
> @@ -807,14 +835,20 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
>  		return;
>  	}
>  
> -	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
> -			   << " (Shutter lines: " << exposure_lines << ") Gain: "
> +	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
> +			   << " (Shutter lines: " << exposure_lines << ", AGC requested "
> +			   << agcStatus->shutter_time << ") Gain: "
>  			   << agcStatus->analogue_gain << " (Gain Code: "
>  			   << gain_code << ")";

Could the AGC algorithm get confused if it requests a certain exposure
and a smaller value is set in the sensor ? Will it keep trying to push
the exposure up to increase the luminance of the scene without noticing
it's pointless ?

>  
>  	ControlList ctrls(unicam_ctrls_);
> -	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> +	/*
> +	 * VBLANK must be set before EXPOSURE as the former will adjust the
> +	 * limits of the latter control.
> +	 */

This will be nasty on the V4L2 side. We write multiple controls with a
single VIDIOC_S_EXT_CTRLS call, and the V4L2 control framework will
first clamp all control values to the limits before setting any control.
The limits for the exposure time will thus not be adjusted before its
value is clamped :-( I'm not sure how to best solve this. Ideally the
problem should be fixed in the control framework, but I doubt that will
happen. We need to at least ask Hans to see what options we have (or
rather, what options we don't have). One easy hack may be to report a
large range for the exposure time limits in the camera sensor drivers,
and adjust the value in .s_ctrl() instead of relying on the control
framework to do that for us.

> +	ctrls.set(V4L2_CID_VBLANK, vblanking);
>  	ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> +	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
>  	op.controls.push_back(ctrls);
>  	queueFrameAction.emit(0, op);
>  }
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index 21a1d7f7..948290e2 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -1159,7 +1159,9 @@ void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
>  		if (!staggeredCtrl_) {
>  			staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
>  					    { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
> +					      { V4L2_CID_VBLANK, action.data[1] },
>  					      { V4L2_CID_EXPOSURE, action.data[1] } });

So the initial value of the exposure time and the vertical blanking are
always identical ?

> +
>  			sensorMetadata_ = action.data[2];
>  		}
>  

-- 
Regards,

Laurent Pinchart

