[libcamera-devel] [PATCH] py: cam: Network renderer

Tomi Valkeinen tomi.valkeinen at ideasonboard.com
Mon Mar 20 10:29:55 CET 2023


On 20/03/2023 01:33, Laurent Pinchart wrote:
> Hi Tomi,
> 
> On Sun, Mar 19, 2023 at 05:49:46PM +0200, Tomi Valkeinen wrote:
>> On 19/03/2023 15:05, Laurent Pinchart wrote:
>>> On Sun, Mar 19, 2023 at 01:30:13PM +0200, Tomi Valkeinen via libcamera-devel wrote:
>>>> Here's something I have found useful a few times.
>>>>
>>>> This adds a "tx" renderer to cam.py, which sends the frames over the
>>>> network to a receiver.
>>>>
>>>> It also adds a "cam-rx" tool (non-libcamera based) which receives the
>>>> frames and uses PyQt to show them on the screen, usually run on a PC.
>>>>
>>>> This is obviously not super efficient, but on the PC side that doesn't
>>>> matter. On the TX side, at least the RPi4 seemed to work without
>>>> noticeable lag, but on my old 32-bit TI DRA76, when sending three
>>>> camera streams, the performance dropped to ~5 fps. Still, I find that
>>>> more than enough for most development work.
>>>>
>>>> This could be extended to also transmit the metadata.
>>>
>>> What's the advantage of this approach compared to using GStreamer for
>>> network streaming ? It feels to me that we're reinventing the wheel a
>>> bit here.
>>
>> Well, these may not matter to other people, but for me:
>>
>> - This doesn't need gstreamer
> 
> That's an argument I can't disagree with :-)
> 
>> - This works, whereas I have had a lot of trouble getting gstreamer
>> working. I did manage to get a few formats rendering locally, but I
>> couldn't get anything working over a TCP sink.
> 
> Then we need to improve GStreamer support, fixing bugs if any, and
> providing documentation with sample pipelines for both the TX and RX
> sides.

I can't disagree with that, but I still feel the gstreamer case and what 
I'm doing here are not really comparable.
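
To give an idea of the scale I mean: the whole "send raw frames over a
socket" idea can be sketched in a couple of dozen lines. This is only an
illustration, not the wire format the patch actually uses; the header
layout and names here are made up.

```python
import socket
import struct

# Hypothetical minimal wire format (not the patch's actual protocol):
# each raw frame is prefixed by a fixed-size binary header.
HDR = struct.Struct("<4sIII")  # magic, width, height, payload length

def send_frame(sock, width, height, data):
    """Transmit one raw frame with a fixed-size header."""
    sock.sendall(HDR.pack(b"FRAM", width, height, len(data)))
    sock.sendall(data)

def recv_exact(sock, n):
    """Read exactly n bytes, or raise if the peer closes early."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def recv_frame(sock):
    """Receive one frame sent by send_frame()."""
    magic, width, height, length = HDR.unpack(recv_exact(sock, HDR.size))
    if magic != b"FRAM":
        raise ValueError("bad frame header")
    return width, height, recv_exact(sock, length)
```

Extending a header like this with a metadata blob would be equally
trivial, which is the part I don't think a stock gstreamer pipeline
gives you.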

>> - The code is short and trivial, and I have the same TX code working on
>> my v4l2 python test app.
>> - With this, I have a trivial way to get the raw frames and metadata (to
>> be implemented =) on my PC and process and study them with python & numpy.
>> - This could be used for the "py: cam.py: Provide live graph of request
>> metadata"
>>
>> I should have emphasized that this is a development/testing helper, not
>> "streaming support".
>>
>> Also, doesn't your point apply to any rendering done by cam?
> 
> This leads to the real question: where do we draw the line ? A trivial

I was referring (also) to cam, not just cam.py. cam has at least KMS and 
SDL renderers, seems to support MJPEG, and can save frames to disk. 
Can't gstreamer do all of those?

> network streaming implementation is, well, trivial, but it will fail in
> various ways in various cases. I don't want to end up with a custom
> implementation of RTSP in cam.py, so where will we stop ?

Me neither, and what would be the point? If you're thinking of RTSP, 
you're already at quite a high level, and I'd assume gstreamer handles 
that just fine.

The features I've implemented in cam.py have been quite low level, or as 
low level as is needed to exercise some particular "core" feature. KMS 
rendering is quite obviously in that category. Rendering with Qt and GL 
is also, I think, "core", even if you need a bunch of dependencies to 
get there.

What I wanted here is a way to get (more or less) real-time data from 
the device to my PC for analysis and processing, without any extra 
processing done on the device. Maybe gstreamer can accomplish the same, 
although I'm guessing it can't handle the metadata. But even if it 
could, and even if I did get gstreamer working, considering the small 
amount of code in this patch for the network TX and RX, and the added 
complexity of gstreamer, at least for me and my uses the choice is clear.
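
As an illustration of the RX-side processing I have in mind, something
like the following hypothetical helper (assuming a tightly packed
8-bit-per-pixel format with no line padding; real pixel formats would
need per-format handling):

```python
import numpy as np

# Hypothetical example: wrap a received raw frame in a numpy array for
# analysis. Assumes a tightly packed 8-bit-per-pixel format with no
# line padding; real pixel formats need per-format handling.
def frame_to_array(data, width, height):
    return np.frombuffer(data, dtype=np.uint8).reshape(height, width)

# e.g. compute a quick per-frame statistic such as mean brightness
frame = frame_to_array(bytes(range(256)) * 4, 32, 32)
print(frame.mean())  # → 127.5
```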

> I would also argue that it would be good to keep the feature set of cam
> and cam.py as close as possible to each other.

Well, I thought so too, until I didn't =). Why would it be good? The 
only reason I can come up with is that cam.py serves as a Python 
example, as one can compare it to cam. But I don't think that's a very 
good argument, as proper, dedicated examples are better examples. Even 
more importantly, if we stick to cam's features, we will miss all the 
features that Python could offer us easily but would be more laborious 
in C++.

Then again, if cam is supposed to exercise the libcamera API as fully as 
possible (but is it?), doing the same in cam.py makes sense, just to see 
that we have the important things implemented and that they are usable 
in practice.

But perhaps this discussion should be more about what we consider part 
of libcamera, and what should live in separate repositories. I should 
probably set up my own repo for all kinds of "stuff" I use, which makes 
my life easier and might be helpful for others, but should never be 
considered a supported part of libcamera. This patch is probably more on 
the "stuff" side. But then we come back to the question of "what should 
cam.py support", as I think some existing features could just as well be 
moved to a "stuff" repo.

I do think that small non-production-quality pieces of Py code are 
valuable. They show how certain things can be done, and serve as 
examples. And I think the best examples are pieces of code that are not 
pure examples, but actually serve some kind of real-life use case.

We probably should also consider moving the python bindings to be 
outside the libcamera repo, so that it would be easy to package them in 
a form that can be installed from PyPI. But that probably has to wait 
until we have a stable API, which is kind of unfortunate as people seem 
to expect to find the Python support on PyPI.

Also, I didn't mean this patch to be a complex issue. I just thought I'd 
share it as I found it helpful =).

  Tomi
