[libcamera-devel] gstreamer libcamerasrc to v4l2h264enc hardware encoder

Kieran Bingham kieran.bingham at ideasonboard.com
Fri Jun 12 17:41:12 CEST 2020


Hi Xishan,

On 12/06/2020 16:26, Xishan Sun wrote:
> I am trying to link libcamerasrc with v4l2h264enc to use the hardware
> encoder on a Raspberry Pi.
> 
> 1) If I feed directly into v4l2h264enc with something like:
> 
> gst-launch-1.0 libcamerasrc ! video/x-raw,width=1600,height=1200 !
> v4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96 ! fakesink
> 
> gstreamer complains that the format cannot be transformed. Debug info:
> 
> ...
> 0:00:00.671398274 22045  0x12d5400 WARN           basetransform
> gstbasetransform.c:1355:gst_base_transform_setcaps:<capsfilter0>
> transform could not transform video/x-raw, format=(string)NV21,
> width=(int)1600, height=(int)1200 in anything we support
> 0:00:00.671467254 22045  0x12d5400 WARN            libcamerasrc
> gstlibcamerasrc.cpp:321:gst_libcamera_src_task_run:<libcamerasrc0>
> error: Internal data stream error.
> 0:00:00.671491235 22045  0x12d5400 WARN            libcamerasrc
> gstlibcamerasrc.cpp:321:gst_libcamera_src_task_run:<libcamerasrc0>
> error: streaming stopped, reason not-negotiated (-4)
> ...

Could you run this with the following environment variable set, please?


LIBCAMERA_LOG_LEVELS=*:0
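
For example, reusing your first pipeline and redirecting stderr so the
log ends up in a file (the filename here is just a suggestion):

  LIBCAMERA_LOG_LEVELS=*:0 gst-launch-1.0 libcamerasrc ! \
      video/x-raw,width=1600,height=1200 ! v4l2h264enc ! h264parse ! \
      rtph264pay name=pay0 pt=96 ! fakesink 2> libcamera-debug.log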


> 
> It is strange, because
> gst-inspect-1.0 v4l2h264enc
> shows me that v4l2h264enc accepts the NV21 and NV12 formats:
> 
> ...
> Pad Templates:
>   SINK template: 'sink'
>     Availability: Always
>     Capabilities:
>       video/x-raw
>                  format: { (string)I420, (string)YV12, (string)NV12,
> (string)NV21, (string)RGB16, (string)RGB, (string)BGR, (string)BGRx,
> (string)BGRA, (string)YUY2, (string)YVYU, (string)UYVY }
>                   width: [ 1, 32768 ]
>                  height: [ 1, 32768 ]
>               framerate: [ 0/1, 2147483647/1 ]
> ...
> 
>  2) If I add videoconvert into the pipeline as:
>   gst-launch-1.0 libcamerasrc ! video/x-raw,width=1600,height=1200 !
> videoconvert ! v4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96 !
> fakesink

Hrm ... that's odd - I wonder what that's actually setting then.
Again - some full logs with the libcamera debug enabled might tell us more.
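
In the meantime, running the working pipeline with gst-launch's -v flag
would print the caps negotiated on every pad, which should tell us what
videoconvert is actually producing for v4l2h264enc. Something like:

  gst-launch-1.0 -v libcamerasrc ! video/x-raw,width=1600,height=1200 ! \
      videoconvert ! v4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96 ! \
      fakesink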



> then everything works just fine at the beginning. However, after I insert
> the pipeline into an RTSP server and run it long-term (about 4-5 hours), I
> see a segmentation fault like:
> ...
> Thread 13 "pool" received signal SIGSEGV, Segmentation fault.
> [Switching to Thread 0xb4ce23a0 (LWP 6879)]
> std::_Rb_tree<libcamera::Stream*, std::pair<libcamera::Stream* const,
> libcamera::FrameBuffer*>, std::_Select1st<std::pair<libcamera::Stream*
> const, libcamera::FrameBuffer*> >, std::less<libcamera::Stream*>,
> std::allocator<std::pair<libcamera::Stream* const,
> libcamera::FrameBuffer*> > >::_M_lower_bound (this=0xb4316690,
> __x=0x517060b, __y=0xb4316694, __k=@0xb4ce1728: 0xb4319060) at
> /usr/include/c++/8/bits/stl_tree.h:1904
> 1904 if (!_M_impl._M_key_compare(_S_key(__x), __k))

Ouch - that's not very easy to interpret at all ...

I wonder if we can get a core dump of that (as it only happens after 4-5
hours, we want to minimise how often it has to be reproduced).

That should get generated if you set:
 ulimit -c unlimited
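
Where the core file actually ends up depends on your system's
kernel.core_pattern setting; you can check it with:

  cat /proc/sys/kernel/core_pattern

(If that points at systemd-coredump, coredumpctl should be able to
retrieve the dump for you instead.)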

But now I realise, of course, that this is going to be a gstreamer core
dump, so it might be tricky for us to parse without the matching build
objects of your particular gstreamer and libcamera install.

Still - it's worth trying to capture, given that it only occurs after 4-5 hours.


Oh - hang on - are you already in GDB when this happens?
>> Thread 13 "pool" received signal SIGSEGV, Segmentation fault.
>> [Switching to Thread 0xb4ce23a0 (LWP 6879)]

Is there anything you can explore to see what the fault was?

What was the backtrace (bt), for example?
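
From the GDB prompt, something along these lines would be a good start -
bt full gives the faulting thread's backtrace with local variables,
info threads shows which threads exist and which one crashed, and
thread apply all bt dumps a backtrace for every thread:

  (gdb) bt full
  (gdb) info threads
  (gdb) thread apply all bt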





> ...
> 
> Question: 
> 1) Can we have libcamerasrc output directly to v4l2h264enc without
> videoconvert? If we could set the "output-io-mode" of libcamerasrc to
> "dmabuf", that would be even better.

I'm not sure what that means - are those gstreamer terms? libcamera does
support dmabuf ... (it's pretty much the only thing supported, I think).

> 2) Is the pipeline with videoconvert the best solution right now? I tried
> v4l2convert and it didn't work. I think libcamerasrc is already using
> the device.

I suspect that v4l2convert tried to use the same M2M device that the
Raspberry Pi uses for the IPA ...
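
You could check which video devices exist and which process is holding
them with something like the following (assuming v4l-utils and psmisc
are installed; the device list will of course look different on your
system):

  v4l2-ctl --list-devices
  fuser -v /dev/video*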

--
Kieran


> 
> Thanks,
> 
> -- 
> Xishan Sun
> 

-- 
Regards
--
Kieran

