Took quite some time to get the timestamps right (raspiraw timestamps are in [us], gstreamer timestamps are in [ns]).
I basically took appsrc-launch3.cpp and spread its different parts over raspiraw.c.

Now raspiraw sends 640x480 raw10 frames (384000 bytes, black/white blinking as in appsrc-launch3) into the gstreamer pipeline. And that pipeline measures 60fps on average, which is the minimal framerate for mode 7 I started raspiraw with!
(Edit: In order to build raspiraw/fast I had to append "-lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 -lgstapp-1.0" to "userland-rawcam/build/raspberry/release/host_applications/linux/apps/raspicam/CMakeFiles/raspiraw.dir/link.txt" and "-I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/arm-linux-gnueabihf/glib-2.0/include" to "userland-rawcam/build/raspberry/release/host_applications/linux/apps/raspicam/CMakeFiles/raspiraw.dir/flags.make".)
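Instead of editing the generated CMake files by hand, the same flags can normally be obtained from pkg-config (assuming the gstreamer-1.0 dev packages are installed; gstreamer-app-1.0 pulls in gstreamer, gobject and glib as dependencies):

```shell
# include flags (-I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 ...)
pkg-config --cflags gstreamer-app-1.0
# linker flags (-lgstapp-1.0 -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0)
pkg-config --libs gstreamer-app-1.0
```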
raspiraw gets started with a 5000ms timeout (-t), mode 7 (640x480, 60-90fps) and saverate 1 (every frame received from the camera triggers creation of a new black/white frame of the same size, which gets pushed into the gstreamer pipeline). raspiraw ends after 5.6s. The stdout output from the gstreamer callback and fpsdisplaysink gets "tee"d into file "out"; grepping for "dropped" shows the framerates, and the tail of "out" shows that the callback was triggered 302 times (5s x 60fps):
Code: Select all
pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 5000 -sr 1 -o foobar | tee out
...
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154870253, flags 0004
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154870253, flags 0084
read_data(183,3028046000,5000000000)
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154886890, flags 0004
mmal: Buffer 0x72f6f0 returned, filled 384000, timestamp 14154886890, flags 0084
read_data(184,3044683000,5000000000)
dropped: 0, current: 67.65, average: 60.38
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154903528, flags 0004
read_data(185,3061321000,5000000000)
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154903528, flags 0084
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154920166, flags 0004
mmal: read_data(186,3077959000,5000000000)
Buffer 0x72f6f0 returned, filled 384000, timestamp 14154920166, flags 0084
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154936803, flags 0004
mmal: read_data(187,3094596000,5000000000)
Buffer 0x72f340 returned, filled 384000, timestamp 14154936803, flags 0084
...
mmal: mmal_port_disconnect: vc.ril.video_render:in:0(0x72d8b0) is not connected
mmal: mmal_component_destroy_internal: vc.ril.video_render 2
mmal: mmal_port_free: vc.ril.video_render:in:0 at 0x72d8b0
mmal: mmal_port_free: vc.ril.video_render:ctr:0 at 0x72d590
real 0m5.662s
user 0m2.440s
sys 0m0.700s
pi@raspberrypi02:~/userland-rawcam $ grep dropped out
dropped: 0, current: 63.07, average: 63.07
dropped: 0, current: 29.39, average: 46.18
dropped: 0, current: 56.73, average: 49.71
dropped: 0, current: 73.32, average: 55.57
dropped: 0, current: 72.40, average: 58.95
dropped: 0, current: 67.65, average: 60.38
dropped: 0, current: 60.14, average: 60.35
dropped: 0, current: 60.12, average: 60.32
dropped: 0, current: 60.05, average: 60.29
pi@raspberrypi02:~/userland-rawcam $ tail -3 out
read_data(300,4974646000,5000000000)
read_data(301,4991284000,5000000000)
read_data(302,5007922000,5000000000)
pi@raspberrypi02:~/userland-rawcam $
This shows the gstreamer pipeline used, how camera streaming gets started and how the gstreamer main loop gets entered:
Code: Select all
...
pipeline = gst_parse_launch("appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'", NULL);
...
g_idle_add ((GSourceFunc) read_data, appsrc);
/* play */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
start_camera_streaming(sensor, sensor_mode);
g_main_loop_run (gloop);
/* clean up */
gst_element_set_state (pipeline, GST_STATE_NULL);
...
This shows the gstreamer callback, which gets called every 4ms and pushes a frame, if one is available, into the gstreamer pipeline.
It also shuts down the gstreamer main loop in case of problems and, more importantly, when the timeout is reached:
Code: Select all
gboolean
read_data (gpointer user_data)
{
    if (gbuffer) { /* frame available to push */
        GstAppSrc *appsrc = user_data;
        GstClockTime pts = GST_BUFFER_PTS (gbuffer); /* read before push: push transfers ownership */
        GstFlowReturn ret = gst_app_src_push_buffer (appsrc, gbuffer);
        gbuffer = NULL; /* appsrc owns the buffer now, ready for a new frame */
        ++gcnt;
        g_print ("read_data(%d,%llu,%llu)\n", gcnt, pts, ggtimeout);
        if ((ret != GST_FLOW_OK) || (ggtimeout && (pts > ggtimeout))) {
            /* something wrong or timeout reached, stop pushing */
            g_main_loop_quit (gloop);
        }
    }
    vcos_sleep(4); // TODO 250fps max?
    return G_SOURCE_CONTINUE;
}
This is the current code that replaced the "SD card write code block" of raspiraw:
Code: Select all
if (!gbuffer) { /* no frame left to send, take a new one */
    GstBuffer *buff;
    /* 640 pixels raw10 = 640*10/8 = 800 bytes per line */
    guint size = 800 * 480; /* 384000 bytes, matches the mmal buffers */
    buff = gst_buffer_new_allocate (NULL, size, NULL);
    gst_buffer_memset (buff, 0, white ? 0xff : 0x0, size);
    white = !white;
    if (!ggtimeout) {
        gbase = buffer->pts; /* first camera timestamp, in us */
        ggtimeout = gst_util_uint64_scale_int (cfg->timeout, GST_MSECOND, 1); /* ms -> ns */
    }
    GST_BUFFER_PTS (buff) = (buffer->pts - gbase) * 1000; /* us -> ns */
    GST_BUFFER_DURATION (buff) = 16666666; // TODO value? 1/60 s in [ns]
    gbuffer = buff; /* new frame ready for push */
}
None of this is very nice yet, and most importantly, copying the raw Bayer frame captured from the Raspberry camera (in buffer->data) into the gst_buffer created for pushing into the pipeline is still missing.
That will be done tomorrow evening; now it's far too late here, I have to stop and get some sleep.
Although it took some time, it was not that difficult to make raspiraw push its stream into a gstreamer pipeline, and it already partially works ...
Code: Select all
$ diff raspiraw.c.orig raspiraw.c | wc --lines
136
$