Monday, February 9, 2009

Snapshots from a live video source in gstreamer

EDIT: A helpful reader pointed out that using videorate in the pipeline should do the trick, something like:
gst-launch -v v4l2src ! tee name=t ! queue ! xvimagesink \
    t. ! queue ! videorate ! video/x-raw-yuv, width=640, height=480, framerate=1/1 ! \
    jpegenc ! filesink location=test.jpeg
I'm fairly sure I tried this before with no success, but it works fine now. There have been some major bugfixes to videorate since I wrote this. In any case, I'll leave the rest of the article up in the hope that it is still useful.

As part of the propulse[ART] project, we wanted to add a "preview" feature to the existing audio/video engine. This involves writing a frame of video every second to a jpeg file, while displaying the video at a full 30 fps in a window. I was surprised to discover that this feature was not already implemented in gstreamer for live sources (to the best of my knowledge).
This should probably be implemented as a real gstreamer element, but for our purposes a relatively straightforward hack prototyped in streamshot.cpp was sufficient. The video pipeline consists of a video4linux source, a capsfilter to enforce some properties on the video, an ffmpegcolorspace element, and an xvimagesink. We attach a callback to the source pad of the video4linux source that is called every time the source has a new buffer of data (i.e. a frame). Since timing is not a big concern, the cb_have_data callback only writes a file once per second, by checking a boolean flag that a separate timeout callback periodically sets to true. cb_have_data makes a copy of the buffer, swaps its red bytes with its blue bytes (swapping the red_mask and blue_mask in the caps did not work for some reason), and writes the modified buffer to a jpeg, line by line. The jpeg-writing code was based on Andrew White's Xlib screenshot example. Thanks also to Koya Charles for help debugging this example, as well as the byte swapping.
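The byte-swapping step can be sketched in isolation like this (a minimal standalone sketch, not the actual streamshot.cpp code; it assumes packed 24-bit RGB data, and the function name is mine):

```cpp
#include <cstdint>
#include <cstddef>
#include <utility>

// Swap the red and blue bytes of a packed 24-bit RGB buffer in place,
// turning RGB data into BGR (or vice versa). This mirrors the workaround
// described above: since swapping red_mask and blue_mask in the caps did
// not work, each frame copy is fixed up byte-by-byte before JPEG encoding.
void swap_red_blue(std::uint8_t *data, std::size_t num_pixels)
{
    for (std::size_t i = 0; i < num_pixels; ++i)
        std::swap(data[3 * i], data[3 * i + 2]);  // R <-> B, G untouched
}
```

In the real callback this runs on a copy of the buffer, so the frame handed on to xvimagesink is left untouched.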
The final version of the preview feature will probably run entirely in a separate thread and will not involve writing files; instead the frames will be streamed, as they will be part of our web interface.


Update: To make up for this entry's lack of flashiness, enjoy this code_swarm video of the propulse[ART] software's (codename miville) development so far (code_swarm instructions courtesy of amix.dk):

5 comments:

Passport Please said...

It's nice to see my short code snips are useful as more than a quick reference for myself :)

Enjoy,
Andrew

Codebrainz said...

Unless I'm missing something here, this should be fairly trivial to accomplish fully within the Gstreamer pipeline. Make the v4l2src tee off into two parts: one with what you have for the live preview, and one with a capsfilter that has framerate=1/1, which feeds a jpegenc which goes to a multifilesink.

If you wanted to do your own file-writing logic, you could replace the multifilesink with an appsink and catch buffer events that contain fully encoded JPEG data already, which you would write out to a file in your code directly.

If I'm not mistaken, using a tee where each part starts with a queue would also put each part in its own thread.

Tristan Matthews said...

Codebrainz:
I'd tried something like this before to no avail:

gst-launch -v videotestsrc ! tee name=t t. ! queue ! video/x-raw-yuv, width=640, height=480, framerate=1/1 ! jpegenc ! fakesink t. ! queue ! video/x-raw-yuv, width=640, height=480, framerate=30/1 ! xvimagesink sync=false

You get:
WARNING: erroneous pipeline: could not link queue1 to xvimagesink0

The problem seems to be with having the two different framerates. Having the same framerate in both caps does work. If you have any idea how to avoid this issue please post here.

Best,
Tristan

Anonymous said...

There is a 'videorate' item you can put in the stream to harmonize your framerates. It can drop sufficient frames to make 1 fps for the jpeg output.

tristanswork said...

@Anonymous: Indeed it does. I have not looked at this post in quite a while and we didn't end up using this feature for our software. I'm not sure if I tried using videorate before. I was pretty sure I had, to no avail, but clearly this is no longer the case. Either that or I just have a better handle on caps syntax ;)