[libcamera-devel] How to detect camera after power on the raspberryPi.

Kieran Bingham kieran.bingham at ideasonboard.com
Tue Oct 20 12:50:18 CEST 2020


Hi Dave,

On 20/10/2020 11:32, Dave Stevenson wrote:
> Hi Kieran
> 
> On Tue, 20 Oct 2020 at 10:36, Kieran Bingham
> <kieran.bingham at ideasonboard.com> wrote:
>>
>> Hi Dave,
>>
>> On 19/10/2020 16:53, Dave Stevenson wrote:
>>> Hi Laurent
>>>
>>> On Mon, 19 Oct 2020 at 13:50, Laurent Pinchart
>>> <laurent.pinchart at ideasonboard.com> wrote:
>>>>
>>>> Hi Dave,
>>>>
>>>> On Mon, Oct 19, 2020 at 11:10:55AM +0100, Dave Stevenson wrote:
>>>>> On Mon, 19 Oct 2020 at 10:38, <tetsuya.nomura at soho-enterprise.com> wrote:
>>>>>>
>>>>>> Dear Sirs.
>>>>>>
>>>>>> Could I ask for your help with the initialization of the Raspberry Pi and the camera?
>>>>>>
>>>>>> I'm evaluating an FPD-Link III camera extension board that extends the camera cable to 2~3 m from the Raspberry Pi.
>>>>>> https://soho-enterprise.com/2020/10/11/post-866/
>>>>>>
>>>>>> The board can pass I2C communication through to the IMX219 once the transceiver chip has been configured.
>>>>>> However, after configuring it and running qcam, qcam does not seem to detect the IMX219,
>>>>>> and I cannot start the camera even if I specify the image sensor name.
>>>>>
>>>>> The Pi kernel driver for imx219 probes the sensor driver as the module
>>>>> loads, which is only a few seconds into boot if you've added
>>>>> "dtoverlay=imx219" into config.txt. No or incorrect response from the
>>>>> sensor and it won't create the /dev/video0 device nodes. If your
>>>>> transceivers need some form of configuration then I suspect this has
>>>>> not happened by this point.
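
To expand on that for Nomura's benefit: the probe-time identity check is
roughly this shape (a from-memory sketch, not the exact imx219.c code; the
helper name is made up). If the transceivers aren't passing I2C through yet,
this is where probe gives up and no device nodes get created.

#include <linux/device.h>
#include <linux/regmap.h>

/* Read the chip ID over I2C and fail probe if the sensor doesn't answer. */
static int sensor_identify(struct device *dev, struct regmap *map)
{
	unsigned int val;
	int ret;

	ret = regmap_read(map, 0x0000, &val);	/* IMX219 chip ID register */
	if (ret) {
		dev_err(dev, "sensor did not respond on I2C\n");
		return ret;
	}

	if (val != 0x0219) {			/* expected IMX219 chip ID */
		dev_err(dev, "unexpected chip ID 0x%04x\n", val);
		return -ENODEV;
	}

	return 0;
}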
>>>>>
>>>>> The correct approach is probably to have a driver for your transceiver
>>>>> in the kernel driver chain, i.e. imx219 -> transceiver -> Unicam.
>>>>> However, that opens a big can of worms over configuration, as it forces
>>>>> the use of the Media Controller API to configure each link
>>>>> independently.
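
For illustration, a transceiver driver in that chain would be roughly this
shape (a bare-bones sketch only; all the names are hypothetical, and the
FPD-Link register setup and the notifier for its own upstream sensor are
left out).

#include <linux/i2c.h>
#include <media/v4l2-device.h>
#include <media/v4l2-async.h>

struct xcvr {
	struct v4l2_subdev sd;
	struct media_pad pads[2];	/* 0: sink (from imx219), 1: source (to Unicam) */
};

static const struct v4l2_subdev_ops xcvr_subdev_ops = {
	/* pad ops (get_fmt/set_fmt/...) and video ops (s_stream) go here */
};

/* Would be wired up as the i2c_driver's probe callback. */
static int xcvr_probe(struct i2c_client *client)
{
	struct xcvr *x;
	int ret;

	x = devm_kzalloc(&client->dev, sizeof(*x), GFP_KERNEL);
	if (!x)
		return -ENOMEM;

	v4l2_i2c_subdev_init(&x->sd, client, &xcvr_subdev_ops);
	x->sd.entity.function = MEDIA_ENT_F_VID_IF_BRIDGE;

	x->pads[0].flags = MEDIA_PAD_FL_SINK;
	x->pads[1].flags = MEDIA_PAD_FL_SOURCE;
	ret = media_entity_pads_init(&x->sd.entity, 2, x->pads);
	if (ret)
		return ret;

	/* FPD-Link register setup, plus an async notifier for the sensor
	 * on the far side of the link, would go here. */

	return v4l2_async_register_subdev(&x->sd);
}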
>>>>
>>>> A very open question: assuming we can retain control through the video
>>>> node as implemented today for the existing use cases, and implement
>>>> MC-based pipeline control in parallel in the driver (possibly selecting
>>>> one of the options through a module parameter) for use cases that would
>>>> be tricky to support otherwise, would you be OK with that ?
>>>
>>> I'm totally OK with it!
>>> I've had a couple of other threads in the last few weeks that kicked
>>> me to look into MC and what is required.
>>>
>>> The first was trying to support Analog Devices ADV7482 which Kieran
>>> will know and "love". That showed up a couple of oddities around
>>
>> Ahem, cough cough. Indeed :-)
>>
>> Is there a board available that can connect this to the RPi?
>> If so I'd be interested in buying one.
> 
> Not that I'm aware of.
> The forum contributor's company had brought up a board with ADV7282-M
> (analogue video to CSI2) on. Based on that they ploughed ahead in
> building a prototype Compute Module carrier board for ADV7482 (and DPI
> to VGA as done on the VGA666 board). That's the point at which he posted
> on the forums, as things didn't connect up nicely and easily.
> He has been kind enough to ship one to me from Chile(!), but I haven't
> had time to power it up as yet. Looking at the PCB, it appears to only
> route 2 CSI-2 data lanes, which is a bit of a shame - oh well.

Oh indeed, it would have been nice to route all 4 lanes for a Compute
Module board. Perhaps it's not too late for their next board revision?

Well, if you hit any blockers or need anything, let us know.


> 
>>> linking to async_notifier and incorrect behaviour with our use of pad
>>> indexes. Those two bits are easy to rectify, although having made the
>>> async_notifier stuff match other platforms, it no longer probed the
>>> simple modules correctly :-( There must be a solution there somewhere;
>>> it's just a matter of finding the correct runes.
>>
>> The ADV748x requires endpoint matching, and it complicates things indeed.
>>
>> For a long time, I think the only receiver that supported this driver
>> was the Renesas RCar-CSI2/VIN.
>>
>> There is a series here:
>>
>> https://lore.kernel.org/linux-media/20200701062140.12953-1-laurent.pinchart+renesas@ideasonboard.com/
>>
>> which should finally ease things and make endpoint matching compatible
>> with all receivers; I believe it has landed in v5.9.
> 
> I thought I'd been running on our 5.9 branch, but looking at the
> branch I'd pushed for their reference[1] it looks to be 5.4.
> I'll try rebasing and see what I get.
> 
> [1] https://github.com/6by9/linux/tree/adv748x
> 
>>> The second use case was Toshiba's TC358746/TC358748 parallel-to-CSI
>>> bridges, where, similar to this case, it's chaining sensor ->
>>> TC35874[6|8] -> Unicam.
>>>
>>> With regard to switching behaviours, it's relatively simple to look at
>>> the immediately upstream node and see if it has any sink pads. If yes,
>>> we go MC; otherwise, video-device. Doing so dynamically at runtime is
>>> nicer than a module parameter.
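
For reference, the check Dave describes is only a few lines - a rough,
untested sketch (the helper name is made up):

#include <media/v4l2-subdev.h>

/* Once the directly-attached subdev has bound, see whether it has any
 * sink pads of its own: if so, there's a chain behind it and we'd go
 * MC-centric, otherwise keep the video-node-centric behaviour. */
static bool upstream_has_sink_pads(struct v4l2_subdev *sd)
{
	unsigned int i;

	for (i = 0; i < sd->entity.num_pads; i++) {
		if (sd->entity.pads[i].flags & MEDIA_PAD_FL_SINK)
			return true;
	}

	return false;
}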
>>
>> +Niklas,
>>
>> I sort of agree here, as it should be easier for end users. Niklas may
>> have different opinions, as he has felt the pain of supporting both MC
>> and non-MC interfaces in the RCar-VIN.
> 
> Felt as in past tense and you've dropped non-MC?
> I'll have a look at the RCar-VIN driver to see if I can work through
> the differences.
> ...
> It appears MC is enabled depending on the platform - that gives me
> something to follow.
> I'm a little confused, though, as it still seems to call
> v4l2_subdev_call(sd, pad, set_fmt, pad_cfg, &format);
> from rvin_s_fmt_vid_cap to set the format on the upstream subdevice
> sink pad. Unless I'm missing something, I thought it wasn't
> allowed in MC to propagate formats across links.
> 
>>> That's the easy bit. The harder bit is working out which bits of
>>> functionality then need to be dropped when in MC mode. At the
>>> moment I'm not totally clear on the correct approach for configuring
>>> the actual Unicam hardware - S_FMT on /dev/videoN, or the MC API setting
>>> the sink pad format of Unicam. The mishmash of MEDIA_BUS_FMT_xxx vs
>>> V4L2_PIX_FMT_xxx formats surfaces again: how do you avoid format
>>> mismatches (and unpacking to 16-bit), and which do you use to configure
>>> CSI data type filtering?
>>> I'm assuming all the subdev API calls for EDIDs, timings (DV and
>>> standards), parm, and enumerations disappear with MC too, as you open
>>> the subdev node to configure that lot.
>>
>> Indeed, those can then be operated on the subdev node.
> 
> OK, so lots of disable_ioctl calls, but those are easy enough and
> already half in place for when the subdev doesn't support them.
> 
> So the remaining step is understanding what configures the hardware
> format/width/height/stride/buffer size.

Userspace? I think that's where all the format propagation comes in.
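
To expand on that a little: with an MC-centric Unicam I'd expect userspace
to do something like the following (a rough sketch - the device node paths,
pad number and 1080p Bayer mode are made up, but the ioctls are the
standard ones). This is roughly what libcamera does under the hood anyway.

#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <linux/v4l2-subdev.h>

/* Hypothetical device nodes, for illustration only. */
#define SUBDEV_NODE  "/dev/v4l-subdev0"   /* e.g. the Unicam sink pad's subdev */
#define VIDEO_NODE   "/dev/video0"        /* the Unicam video node */

int configure_pipeline(void)
{
	int sd = open(SUBDEV_NODE, O_RDWR);
	int vd = open(VIDEO_NODE, O_RDWR);
	struct v4l2_subdev_format sfmt = {
		.which = V4L2_SUBDEV_FORMAT_ACTIVE,
		.pad = 0,
	};
	struct v4l2_format fmt = {
		.type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
	};

	if (sd < 0 || vd < 0)
		return -1;

	/* Media bus format on the subdev pad... */
	sfmt.format.width = 1920;
	sfmt.format.height = 1080;
	sfmt.format.code = MEDIA_BUS_FMT_SRGGB10_1X10;
	sfmt.format.field = V4L2_FIELD_NONE;
	if (ioctl(sd, VIDIOC_SUBDEV_S_FMT, &sfmt) < 0)
		return -1;

	/* ...and the matching pixel format on the video node. The driver
	 * fills in bytesperline/sizeimage, which is where the stride and
	 * buffer size come from. */
	fmt.fmt.pix.width = sfmt.format.width;
	fmt.fmt.pix.height = sfmt.format.height;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10P;
	fmt.fmt.pix.field = V4L2_FIELD_NONE;
	if (ioctl(vd, VIDIOC_S_FMT, &fmt) < 0)
		return -1;

	return 0;
}

The mbus code on the pad and the pixelformat on the video node have to
correspond (SRGGB10_1X10 <-> SRGGB10P here), which is the mismatch you
mention above; libcamera effectively keeps a table for that mapping.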

I'm very much aware that the TC358746/TC358748 is not a camera sensor,
but libcamera has so much of this 'done' that I do wonder how non-camera
(media-controller) type devices should be used in the future. Perhaps
there are some core parts of libcamera that should be factored out to
facilitate non-camera type devices that use the same pipelines, or
perhaps libcamera would grow to support them. It feels like a bit of a
shame that they are out of scope for us at the moment.

Same with the ADV748x HDMI input ... that's not a camera, but libcamera
would make it much easier to configure/capture from - though
technically, the 8 analogue inputs could be ... ;-)

--
Kieran


>   Dave
> 
>>>
>>> Thanks
>>>   Dave
>>>
>>>>> One workaround would be to remove "dtoverlay=imx219" from config.txt
>>>>> and use dynamic device tree overlay loading to load the relevant
>>>>> modules later in boot. "sudo dtoverlay imx219" should do that for you.
>>>>> The niggle is that when done from config.txt the firmware fixes up the
>>>>> camera shutdown GPIO to match the platform (it moves GPIOs between
>>>>> different variants of the Pi). If loading dynamically then this can't
>>>>> happen, and you'll need to fix up the regulator shutdown line manually
>>>>> [1] (assuming that it is actually controllable at the other end of the
>>>>> FPD Link)
>>>>>
>>>>> [1] (https://github.com/raspberrypi/linux/blob/rpi-5.4.y/arch/arm/boot/dts/overlays/imx219-overlay.dts#L77)
>>>>>
>>>>>> I can start the IMX219, capture the RAW image with our own program, and view it.
>>>>>> So if qcam can send the necessary commands to the IMX219 when it starts,
>>>>>> I think I should be able to see the image through qcam as well.
>>>>>>
>>>>>> It would be great if you could give us some advice on how to solve this.
>>>>>> I appreciate your kind help in advance.
>>>>>>
>>>>>> Best Regards,
>>>>>>
>>>>>> NOMURA
>>>>
>>>> --
>>>> Regards,
>>>>
>>>> Laurent Pinchart
>>
>> --
>> Regards
>> --
>> Kieran

-- 
Regards
--
Kieran

