User avatar
HermannSW
Posts: 4830
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Minimal gstreamer raspivid pipeline?

Fri May 05, 2017 4:34 pm

Hi,

I can UDP-stream videotestsrc from the Pi and display it on my Linux laptop.
But I am not able to stream and view video generated with raspivid :-(
I searched this forum and googled, and tried dozens of different "solutions", none of which worked.

Maybe it's because I have gstreamer 0.10 installed on the Pi, since that is what was already installed on my Linux laptop. But streaming should be possible with gstreamer 0.10 as well.

The minimal working gstreamer pipeline that just displays my laptop webcam on my laptop is this:

Code:

$ gst-launch-0.10 -v v4l2src device=/dev/video0 ! autovideosink
What is the "raspivid gstreamer" equivalent of this simple and complete pipeline?
I just want to display the video stream generated by "raspivid ... -o -" with gst-launch-0.10 on the Pi (with fdsrc?).
I think once I have this working, adding the udpsink/udpsrc stuff will not be that hard.

Hermann.
https://github.com/Hermann-SW/memrun
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/en/Raspberry_camera.html

HermannSW

Re: Minimal gstreamer raspivid pipeline?

Sat May 06, 2017 12:42 am

OK, I added a USB webcam to the Pi Zero W in addition to the Raspberry camera.

I did "ssh -X ..." into the Zero W, and when executing this gstreamer pipeline the video stream shows up on my laptop:

Code:

pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 ! autovideosink
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw-yuv, format=(fourcc)YUY2, 
What is the equivalent of this minimal gstreamer pipeline for the Raspberry camera, streaming via raspivid and gstreamer?

This does not work; how can it be made to work?

Code:

pi@raspberrypi03:~ $ raspivid -w 640 -h 480 -fps 30 -t 10000 -o - | gst-launch-0.10 -v fdsrc ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2625): gst_base_src_loop (): /GstPipeline:pipeline0/GstFdSrc:fdsrc0:
streaming task paused, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
pi@raspberrypi03:~ $ 
Hermann.

HermannSW

Re: Minimal gstreamer raspivid pipeline?

Sat May 06, 2017 1:05 am

Not sure whether this is the right direction, but now I see a video window (for 7.58s, as reported by gstreamer).
However, the window shows only garbage, and it does not change the whole time:

Code:

pi@raspberrypi03:~ $ time ( raspivid -w 640 -h 480 -fps 30 -t 10000 -o - | gst-launch-0.10 -v fdsrc ! videoparse width=640 height=480 framerate=30/1 ! autovideosink )
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoParse:videoparse0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 7585023885 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = NULL
/GstPipeline:pipeline0/GstVideoParse:videoparse0.GstPad:src: caps = NULL
Setting pipeline to NULL ...
Freeing pipeline ...

real	0m10.374s
user	0m0.410s
sys	0m0.270s
pi@raspberrypi03:~ $ 
Hermann.

Screenshot of the video window on my laptop (X11 forwarding because of "ssh -X ...").

The Pi gstreamer videotestsrc shows up nicely on my laptop (with the lower-right part changing):

Code:

pi@raspberrypi03:~ $ gst-launch-0.10 videotestsrc ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 12150
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Minimal gstreamer raspivid pipeline?

Sat May 06, 2017 6:13 am

raspivid produces an encoded H264 stream, but you've not told gstreamer this. Your webcam produces YUYV or MJPEG, and gstreamer knows this because the data has gone through the V4L2 API.

Several options:
You could use raspividyuv to get I420 out, and tell gstreamer about that format.
Use "sudo modprobe bcm2835-v4l2" to load the V4L2 driver for the Pi camera, and then use v4l2src as before.
Use raspivid but tell gstreamer the stream is H264. You may also need h264parse and to enable inline headers in raspivid to convince it to work - gst seems to be a bit fussy.
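Untested sketches of the three options, assuming gstreamer 1.0 element names (videoparse may be rawvideoparse on newer versions, and avdec_h264 requires the gst-libav plugins; resolutions and framerates are just examples):

```shell
# Option 1: raspividyuv emits raw I420 frames; tell gstreamer the geometry.
raspividyuv -w 640 -h 480 -fps 30 -t 10000 -o - | \
  gst-launch-1.0 fdsrc ! videoparse width=640 height=480 framerate=30/1 ! \
  videoconvert ! autovideosink

# Option 2: load the V4L2 driver for the Pi camera, then use v4l2src
# exactly as with the USB webcam.
sudo modprobe bcm2835-v4l2
gst-launch-1.0 v4l2src device=/dev/video0 ! autovideosink

# Option 3: raspivid emits H264; -ih enables inline SPS/PPS headers,
# and h264parse prepares the byte stream for the decoder.
raspivid -w 640 -h 480 -fps 30 -t 10000 -ih -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```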
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

HermannSW

Re: Minimal gstreamer raspivid pipeline?

Sat May 13, 2017 8:23 pm

Thanks for listing these options.

I tried to tell gstreamer about the H264 format, and was partially successful -- I was able to play a raspivid video file with gstreamer. But I was not able to make it work with "raspivid ... -o - | gst-launch-1.0 -v fdsrc ...".

Then I tried "sudo modprobe bcm2835-v4l2" -- it was so simple, it worked immediately!

Next I set up a scenario with streaming video. I needed something that moves, and took a small solar toy for that. I used my own "sun" to power the solar toy. I had removed some plastic on the toy's back in order to see how the small solar panel makes the toy move (LC circuit plus magnet).

This YouTube video shows the video streamed from the Pi Zero W on the laptop, as well as the Pi and laptop command lines:
https://www.youtube.com/watch?v=re1PC5a ... e=youtu.be

This command starts streaming 640x480 @30fps video via UDP on the Pi:

Code:

pi@raspberrypi03:~ $ cat stream.txt 
pi@raspberrypi03:~ $ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! jpegenc !  rtpjpegpay !  udpsink host=192.168.178.139 port=5200
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
This command displays the video stream from port 5200 on the laptop:

Code:

$ gst-launch-1.0 udpsrc port=5200 !  application/x-rtp, encoding-name=JPEG,payload=26 !  rtpjpegdepay !  jpegdec ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
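Following 6by9's H264 suggestion, the JPEG sender/receiver pair could be replaced by an H264 variant, which should use less bandwidth. An untested sketch, with the same host and port as above (assuming h264parse/rtph264pay on the Pi and avdec_h264 on the laptop are available):

```shell
# Pi side: raspivid H264 with inline headers (-ih), RTP-payloaded over UDP.
raspivid -w 640 -h 480 -fps 30 -t 0 -ih -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.178.139 port=5200

# Laptop side: depayload, parse and decode the H264 stream.
gst-launch-1.0 udpsrc port=5200 \
  caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```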
This is what the camera shows in darkness, after powering off my self-made "sun", thanks to a 3W infrared LED mounted next to the NoIR camera.

Hermann.

P.S.:
(Moving) "Caterpillar robot Raspberry Pi Zero W NoIR camera streams video to laptop (640x480 @30fps)"
https://www.youtube.com/watch?v=nWJXXmk ... e=youtu.be

"Caterpillar robot in darkness, navigated by infrared streaming video on laptop (640x480 @30fps)"
https://www.youtube.com/watch?v=uo94frZ ... e=youtu.be
