[libcamera-devel] use of v4l2-compat.so

George Kiagiadakis george.kiagiadakis at collabora.com
Mon Jun 22 14:31:59 CEST 2020


On 19/06/2020 19:39, Nicolas Dufresne wrote:
> (Adding George and Raghavendra in CC)
> 
> Le vendredi 19 juin 2020 à 18:44 +0300, Laurent Pinchart a écrit :
>> Hi Nicolas,
>>
>> On Fri, Jun 19, 2020 at 11:31:43AM -0400, Nicolas Dufresne wrote:
>>> Le mardi 16 juin 2020 à 23:38 +0530, naidu nama a écrit :
>>>> Hi team,
>>>> By default GStreamer is not using v4l2-compat.so, so what is the
>>>> use of the v4l2 folder in libcamera? The goal is to bypass the
>>>> V4L2 Linux API and use the libcamera API, so what is this v4l2
>>>> folder for?
>>>
>>> It is what it is: a work-in-progress LD_PRELOAD wrapper for "legacy"
>>> applications. Such a wrap would be achieved in a launch script (or
>>> XDG desktop file). We suspect some apps will take a long time to
>>> migrate, and we would like to be able to use cameras on the IPU3
>>> laptops and ARM devices before they are ported.
>>>
>>> As for browsers and sandboxed apps, it's all clear. In the Flatpak
>>> community, they really want to stop giving /dev access to apps
>>> unless strictly needed, especially for V4L2, as it's then impossible
>>> to reclaim the access. But there is a WIP libcamera backend in
>>> PipeWire already, so that is one way; it does not really offer the
>>> features of libcamera, though. That all needs to be designed. I do
>>> expect mixed support in the near future (some browsers may use
>>> libcamera, some will use PipeWire, as they will have to decide
>>> between sandboxing and features).
>>
>> For pipewire-based stacks, we will need a camera backend for pipewire,
>> as well as a frontend API for applications. Provided we use libcamera as
>> a backend, do you think it would make sense to offer the same API for
>> the frontend ? It would require a libcamera API implementation on top of
>> pipewire, and would allow applications to use the same API, regardless
>> of whether pipewire is part of the camera stack or not.
> 
> I believe so. For PulseAudio and JACK, there are session managers that
> implement the API level, and then they share some nodes. The difference
> is that the nodes would be implemented with libcamera underneath. It's
> just a bit annoying to keep in sync, so I'll let George explain what is
> possible to avoid duplication (e.g. making an in-libcamera layer
> instead of a full copy of the API).
> 

I also think that a libcamera frontend API on top of PipeWire would
probably make sense in order to support applications that want to run
both on top of the device directly and on top of PipeWire.

I am not exactly sure what you mean, Nicolas, about the nodes, but let
me give an overview of how PipeWire operates to help clarify things, if
necessary.

PipeWire is essentially a media stream exchange framework. It allows
transporting fd-based buffers (memfd, dmabuf) from one place in the
system to another. It works in combination with a session manager, which
creates all the links between devices and applications, making important
decisions about these links and enforcing policy.

In simple use cases, applications are meant to request playback or
capture streams with a hint as to what kind of media they want to
stream, e.g.:

* capture audio for a "call"
* playback audio with "voice notification" content
* capture video for a "screencast"
* capture video for "camera local preview"
* etc...

Then, the session manager is meant to link this stream with the right
application or device, also configuring the latter appropriately.

In more advanced use cases, there are more possibilities, from hinting
the session manager to link to a specific "target", to creating links
directly from the application (e.g. JACK apps do that); all subject to
the session manager allowing these in the first place.

On the media level, once a stream is open, the application's node is
configured with a specific format and a pool of buffers (all subject to
negotiation). When the session manager links the stream, buffers begin
flowing from one node to the other, with both "push" and "pull"
scheduling modes available.

To achieve compatibility with PulseAudio and JACK applications, without
recompiling them, there are compatibility libraries (full-blown
libraries, not meant to be used with LD_PRELOAD) that implement the full
API of PulseAudio and JACK respectively. When an application links at
runtime against these, it transparently uses PipeWire underneath.

With ALSA compatibility, it is a bit different. PipeWire ships a
libasound plugin that allows presenting a "pipewire" device to ALSA
applications.
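For concreteness, this is roughly what the ALSA side looks like from
the user's perspective, assuming the pipewire-alsa plugin is installed
(a hedged example; exact packaging varies by distribution):

```
# ~/.asoundrc fragment: route default ALSA I/O through the
# "pipewire" pcm type provided by the pipewire-alsa plugin.
pcm.!default {
    type pipewire
}
ctl.!default {
    type pipewire
}
```

Unmodified ALSA applications then open the "default" device as usual
and end up streaming through PipeWire.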

For libcamera, the way I see it, it would make sense to go with a
design similar to the ALSA one. libcamera could have backends for its
frontend API, either through plugins or hardcoded in the library. In one
backend it would directly access the device, while in the other one it
would go through PipeWire. This would avoid duplicating the API.

Please don't hesitate to ask me if you have further questions related to
PipeWire.

Best regards,
George
