[libcamera-devel] How to detect camera after power on the raspberryPi.

Dave Stevenson dave.stevenson at raspberrypi.com
Tue Oct 20 14:25:03 CEST 2020


Hi Kieran

On Tue, 20 Oct 2020 at 11:50, Kieran Bingham
<kieran.bingham at ideasonboard.com> wrote:
>
> Hi Dave,
>
> On 20/10/2020 11:32, Dave Stevenson wrote:
> > Hi Kieran
> >
> > On Tue, 20 Oct 2020 at 10:36, Kieran Bingham
> > <kieran.bingham at ideasonboard.com> wrote:
> >>
> >> Hi Dave,
> >>
> >> On 19/10/2020 16:53, Dave Stevenson wrote:
> >>> Hi Laurent
> >>>
> >>> On Mon, 19 Oct 2020 at 13:50, Laurent Pinchart
> >>> <laurent.pinchart at ideasonboard.com> wrote:
> >>>>
> >>>> Hi Dave,
> >>>>
> >>>> On Mon, Oct 19, 2020 at 11:10:55AM +0100, Dave Stevenson wrote:
> >>>>> On Mon, 19 Oct 2020 at 10:38, <tetsuya.nomura at soho-enterprise.com> wrote:
> >>>>>>
> >>>>>> Dear Sirs.
> >>>>>>
> >>>>>> May I ask for your help with initialization of the Raspberry Pi and the camera?
> >>>>>>
> >>>>>> I'm now evaluating a camera extension board based on FPD-Link III technology to extend the camera cable to 2-3 m from the Raspberry Pi.
> >>>>>> https://soho-enterprise.com/2020/10/11/post-866/
> >>>>>>
> >>>>>> This board can pass I2C communication through to the IMX219 after the
> >>>>>> transceiver chip has been configured. However, after configuring it and
> >>>>>> running qcam, the qcam program does not seem to detect the IMX219,
> >>>>>> and I cannot start the camera even if I specify the image sensor name.
> >>>>>
> >>>>> The Pi kernel driver for the imx219 probes the sensor as the module
> >>>>> loads, which is only a few seconds into boot if you've added
> >>>>> "dtoverlay=imx219" to config.txt. If there is no response, or an
> >>>>> incorrect response, from the sensor, it won't create the /dev/video0
> >>>>> device nodes. If your transceivers need some form of configuration
> >>>>> then I suspect this has not happened by that point.
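For reference, this is the config.txt mechanism being described (a minimal fragment; whether the device node appears still depends on the sensor answering on I2C at probe time):

```ini
# /boot/config.txt -- load the imx219 overlay so the sensor driver
# probes a few seconds into boot; /dev/video0 only appears if the
# sensor responds correctly on I2C at that point.
dtoverlay=imx219
```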
> >>>>>
> >>>>> The correct approach is probably to have a driver for your transceiver
> >>>>> in the kernel driver chain, i.e. imx219 -> transceiver -> Unicam.
> >>>>> However, that opens a big can of worms over configuration, as it
> >>>>> forces the use of the Media Controller API to configure each link
> >>>>> independently.
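A hedged sketch of what that per-link configuration looks like from userspace with media-ctl (the entity names and I2C addresses here are invented for illustration; the real ones come from `media-ctl -p`):

```shell
# Hypothetical entity names -- adjust to what media-ctl -p reports.
# Enable the links sensor -> transceiver -> Unicam, then set a matching
# format on each pad along the chain.
media-ctl -d /dev/media0 -l "'imx219 10-0010':0 -> 'fpdlink-rx 10-0030':0 [1]"
media-ctl -d /dev/media0 -l "'fpdlink-rx 10-0030':1 -> 'unicam-image':0 [1]"
media-ctl -d /dev/media0 -V "'imx219 10-0010':0 [fmt:SRGGB10_1X10/1920x1080]"
media-ctl -d /dev/media0 -V "'fpdlink-rx 10-0030':1 [fmt:SRGGB10_1X10/1920x1080]"
```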
> >>>>
> >>>> A very open question: assuming we can retain control through the video
> >>>> node as implemented today for the existing use cases, and implement
> >>>> MC-based pipeline control in parallel in the driver (possibly selecting
> >>>> one of the options through a module parameter) for use cases that would
> >>>> be tricky to support otherwise, would you be OK with that ?
> >>>
> >>> I'm totally OK with it!
> >>> I've had a couple of other threads in the last few weeks that kicked
> >>> me to look into MC and what is required.
> >>>
> >>> The first was trying to support Analog Devices ADV7482 which Kieran
> >>> will know and "love". That showed up a couple of oddities around
> >>
> >> Ahem, cough cough. Indeed :-)
> >>
> >> Is there a board available that can connect this to the RPi?
> >> If so I'd be interested in buying one.
> >
> > Not that I'm aware of.
> > The forum contributor's company had brought up a board with ADV7282-M
> > (analogue video to CSI2) on. Based on that they ploughed ahead in
> > building a prototype Compute Module carrier board for ADV7482 (and DPI
> > to VGA as done on the VGA666 board). That's the point that he posted
> > on the forums as things didn't connect up nice and easily.
> > He has been kind enough to ship one to me from Chile(!), but I haven't
> > had time to power it up as yet. Looking at the PCB it appears to only
> > route 2 CSI-2 data lanes which is a bit of a shame - oh well.
>
> Oh indeed, would have been nice to route all 4 lanes for a
> compute-module. Perhaps it's not too late for their next board revision?
>
> Well if you hit any blocker, or need anything let us know.

Just FYI the forum threads are
ADV7482: https://www.raspberrypi.org/forums/viewtopic.php?f=98&t=285492
TC358748: https://www.raspberrypi.org/forums/viewtopic.php?f=98&t=287424

I got as far as creating the full MC graph for all the entities
with a hacked driver that did no I2C comms. Now that I have the
hardware I'll see how far I can get with it.

> >>> linking to async_notifier and incorrect behaviour with our use of pad
> >>> indexes. Those two bits are easy to rectify, although having made the
> >>> async_notifier stuff match other platforms it no longer probed the
> >>> simple modules correctly :-( There must be a solution there somewhere;
> >>> it's just a matter of finding the correct runes.
> >>
> >> The ADV748x requires endpoint matching, and it complicates things indeed.
> >>
> >> For a long time, I think the only receiver that supported this driver
> >> was the Renesas RCar-CSI2/VIN.
> >>
> >> There is a series here:
> >>
> >> https://lore.kernel.org/linux-media/20200701062140.12953-1-laurent.pinchart+renesas@ideasonboard.com/
> >>
> >> which should finally ease things and make endpoint matching compatible
> >> with all receivers; I believe it has landed in v5.9.
> >
> > I thought I'd been running on our 5.9 branch, but looking at the
> > branch I'd pushed for their reference[1] it looks to be 5.4.
> > I'll try rebasing and see what I get.
> >
> > [1] https://github.com/6by9/linux/tree/adv748x
> >
> >>> The second use case was Toshiba's TC358746/TC358748 parallel-to-CSI-2
> >>> bridges, where, similar to this case, the chain is sensor ->
> >>> TC35874[6|8] -> Unicam.
> >>>
> >>> With regard to switching behaviours, it's relatively simple to look at
> >>> the immediately upstream node and see if it has any sink pads. If yes,
> >>> we go MC; otherwise, video-device. Doing so dynamically at runtime is
> >>> nicer than a module parameter.
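A minimal sketch of that runtime check, using simplified stand-in types rather than the kernel's real struct media_entity / media_pad (the in-kernel version would walk the actual upstream entity's pad array):

```c
#include <stdbool.h>

/* Simplified stand-ins for the kernel's media_pad / media_entity --
 * illustrative only, not the real structures or flag values. */
#define PAD_FL_SINK   (1u << 0)
#define PAD_FL_SOURCE (1u << 1)

struct pad {
	unsigned long flags;
};

struct entity {
	unsigned int num_pads;
	const struct pad *pads;
};

/* If the immediately upstream entity has any sink pads, it is a bridge
 * (e.g. a TC35874x or an FPD-Link deserialiser), so switch to MC mode;
 * otherwise it is a plain sensor and the video-device model is kept. */
static bool use_media_controller(const struct entity *upstream)
{
	for (unsigned int i = 0; i < upstream->num_pads; i++)
		if (upstream->pads[i].flags & PAD_FL_SINK)
			return true;
	return false;
}
```

A sensor exposing only a source pad keeps the video-device model, while any entity with a sink pad flips the driver into MC mode.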
> >>
> >> +Niklas,
> >>
> >> I sort of agree here, as it should be easier for end users. Niklas may
> >> have different opinions as he has felt pain of supporting both MC and
> >> non-MC interfaces in the RCar-VIN.
> >
> > "Felt" as in past tense, and you've dropped non-MC?
> > I'll have a look at the RCar-VIN driver to see if I can work through
> > the differences.
> > ...
> > It appears MC is enabled dependent on the platform - that gives me
> > something to follow.
> > I'm a little confused though, as it still seems to call
> > v4l2_subdev_call(sd, pad, set_fmt, pad_cfg, &format);
> > from rvin_s_fmt_vid_cap to set the format on the upstream subdevice's
> > sink pad. Unless I'm missing something, I thought MC didn't allow
> > propagating formats across links.
> >
> >>> That's the easy bit. The harder bit is working out which bits of
> >>> functionality then need dropping when in MC mode. At the
> >>> moment I'm not totally clear on the correct approach for configuring
> >>> the actual Unicam hardware - S_FMT on /dev/videoN, or the MC API setting
> >>> the sink pad format of Unicam. The mishmash of MEDIA_BUS_FMT_xxx vs
> >>> V4L2_PIX_FMT_xxx formats surfaces again: how do you avoid format
> >>> mismatches (and unpacking to 16-bit), and which do you use to configure
> >>> CSI data type filtering?
> >>> I'm assuming all the subdev API calls for EDIDs, timings (DV and
> >>> standards), parm, and enumerations disappear with MC too, as you open
> >>> the subdev node to configure all of that.
> >>
> >> Indeed, those can then be operated on the subdev node.
> >
> > OK, so lots of disable_ioctl calls but those are easy enough and
> > already half in place for when the subdev doesn't support them.
> >
> > So just the step of understanding what configures the hardware
> > format/width/height/stride/buffer size.
>
> Userspace ? I think that's where all the format propagation comes in.

So MEDIA_IOC_SETUP_LINK can set the link from sensor to CSI2 receiver
to 640x480 YUYV, and VIDIOC_S_FMT sets the video node to 1280x720
BGGR, and there's no validation between the two? Or do the two get
reconciled at link_validate when the pipeline is enabled?
Format on the sensor end is set via VIDIOC_SUBDEV_S_FMT on the
v4l2-subdevN node with a media bus code, so we still need a
mapping table between V4L2_PIX_FMT_xxx and MEDIA_BUS_FMT_xxx if we
want to validate.
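As a concrete sketch of the two-sided configuration being described (device paths are hypothetical, and the exact v4l2-ctl option spelling should be checked against your v4l-utils version):

```shell
# Hypothetical device nodes. Set the media bus format on the subdev
# pad, then the pixel format on the video node -- nothing forces the
# two to agree until the pipeline is validated.
# (code= takes a MEDIA_BUS_FMT_* value from linux/media-bus-format.h)
v4l2-ctl -d /dev/v4l-subdev0 --set-subdev-fmt pad=0,width=640,height=480,code=0x2008
v4l2-ctl -d /dev/video0 --set-fmt-video width=640,height=480,pixelformat=YUYV
```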

I guess the CSI2 receiver always has to be programmed based on the
S_FMT info as that defines the geometry of the buffers that are being
written to. In which case the CSI2 data type falls out from the
V4L2_PIX_FMT_xxx as well. That's all easy, but how do you determine
the correct set of formats to advertise in VIDIOC_ENUM_FMT, and to
validate the pixel format in [TRY|S]_FMT against?

This all feels like it should be in the beginner's guide to MC. Is
there such a thing and I've just missed it?

> I'm very much aware that the TC358746/TC358748 is not a camera sensor,
> but libcamera has so much of this 'done' that I do wonder how non-camera
> (media-controller) type devices should be used in the future. Perhaps
> there are some core parts of libcamera that should be factored out to
> facilitate non-camera type devices that use the same pipelines, or
> perhaps libcamera would grow to support them. It feels like a bit of a
> shame that they are out of scope for us at the moment.
>
> Same with the ADV748x HDMI input ... that's not a camera, but libcamera
> would make it much easier to configure/capture from - though
> technically, the 8 analogue inputs could be ... ;-)

Does libcamera become the de facto way of configuring MC devices? I
guess that's a question open to debate.

It'd be good if it can handle things like the TC35874[6|8] and FPD-Link
bridge chips/transceivers, as there is a sensor at the end of them.
Predominantly the pipeline handlers know how their devices work. Unless
the CameraSensor class can integrate any extra "bridge" nodes between
sensor and pipeline handler, it feels like it's going to be a
tricky job within each and every pipeline handler.

HDMI & analogue video bridges are a different beast. There's enough
messing around with EDIDs, video standards, detecting format changes,
etc, that you almost have to use a custom app, at which point you do
the right thing for your chosen bridge chip.

  Dave
