[libcamera-devel] [PATCH RFC] rkisp1: adjust vblank to target framerate

Laurent Pinchart laurent.pinchart at ideasonboard.com
Wed Jun 7 19:30:24 CEST 2023


Hello,

On Wed, Jun 07, 2023 at 10:45:25AM +0200, Jacopo Mondi via libcamera-devel wrote:
> On Tue, Jun 06, 2023 at 07:27:12PM +0200, Benjamin Bara via libcamera-devel wrote:
> > On Wed, 24 May 2023 at 11:04, Jacopo Mondi wrote:
> > >    I wonder if we shouldn't instead remove the call to setControls(0)
> > >    in IPA::start() and return a list of v4l2 controls from
> > >    IPA::start() as raspberrypi does so that the new VBLANK EXPOSURE
> > >    and GAIN are computed all on the IPA side by re-using
> > >    updateControls() which re-computes the limits for the
> > >    Camera::controls as well.
> > >
> > >     If I'm not mistaken, with your current implementation the
> > >     Camera::controls limits are not updated after start(), right ?
> >
> > Exactly, they aren't.
> > As I am fairly new to libcamera and so far only used libcamera in
> > combination with gst-launch: Is it possible to change the frame duration
> > after start() is called? Because IMHO, vblank is static, as long as the
> 
> Frame duration is certainly controllable during streaming yes.
> 
> controls::FrameDurationLimits is a regular control, and like all other
> controls, it is meant to be set by the application to change the
> streaming behaviour. It of course needs to be handled in the pipeline
> and the IPA (something which is still missing in RkISP1, for example).
> 
> Now, if I read it right, the gst element only allows you to control
> the frame duration at startup. I'm not sure whether this is a
> shortcoming of the current implementation or whether something in gst
> prevents setting those controls while the "pipeline is rolling", but
> when it comes to libcamera itself, the intended behaviour is for
> applications to be able to set controls during streaming.
> 
> If you want to experiment with that, I would start by hacking the
> 'cam' test application to set FrameDurationLimits = {x, x} in a
> Request after streaming has started.

No need to hack the application, you can use a capture script ;-) See
`cam -h` and src/apps/cam/capture-script.yaml.
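For illustration, a capture script along these lines should exercise the
control per-frame (the frame number and the 33333 µs value are made up
here; check the sample script for the exact syntax of array controls):

```yaml
# Hypothetical capture script: pin the frame duration to ~30 fps
# (33333 us) from frame 0 onwards. Values are illustrative only.
frames:
  - 0:
      FrameDurationLimits: [ 33333, 33333 ]
```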

> From there the control is
> received by the pipeline handler and then handed to the IPA. The IPA
> should set the AGC limits (we even have a todo):
> 
>         /* \todo Honour the FrameDurationLimits control instead of hardcoding a limit */
> 
> and compute a new vblank value to be sent back to the pipeline (in the
> same way as it now passes back V4L2_CID_EXPOSURE and
> V4L2_CID_ANALOGUE_GAIN).
> 
> It's quite some work, but there shouldn't be anything too complex. If
> you're interested in giving this a go I would be glad to help.
> 
> > frame duration is static. Obviously, if the frame duration limit is
> > dynamic after start() is called, then it would make sense to also have
> > vblank recalculated afterwards. Under my assumption of a static frame
> > duration, I guess it would even make sense to put it "before" or outside
> > of the IPA-specific ph::start(), as it is just related to the camera
> > sensor, and independent of the IPA - but I guess start() is the first
> > call to libcamera where the frame durations are actually known.
> >
> > >     The only drawback I see with my proposal is that the
> > >     re-computation of the EXPOSURE v4l2 control limits has to be done
> > >     manually in the IPA instead of relying on it being done by the
> > >     driver when setting a new VBLANK as per your current
> > >     implementation.
> >
> > Yes, I think so too. This needs to be implemented per-sensor in the
> > helpers, I guess. I skimmed a little through the camera sensor
> > drivers, and it looks like most of them adapt the V4L2 exposure
> > limits as soon as vblank is changed (except e.g. imx258 or imx415),
> > so at least it seems to be quite reliable.
> >
> > So IMHO, for the "given frame duration limit" case, it boils down to
> > the question of whether the limits can change after calling start().
> > For other use cases, like reducing the frame rate to increase
> > exposure when in saturation (or similar), your suggestion might fit
> > better. What do you think?

-- 
Regards,

Laurent Pinchart

