
SRT in GStreamer


Olivier Crête
February 16, 2018


Transmitting low-delay, high-quality video over the Internet is hard. The trade-off is normally between video quality and transmission delay (or latency). Internet video has so far been segregated into two segments: video streaming and video calls. On one side, streaming video has taken over the world of video distribution using segmented streaming technologies such as HLS and DASH, allowing services like Netflix to flourish. On the other side, there are VoIP systems, which generally target a relatively low bitrate using low-latency technologies such as RTP and WebRTC, and which don't deliver a broadcast-grade result. SRT bridges that gap by allowing the transfer of broadcast-grade video at low latencies.

The SRT protocol achieves this goal using two techniques. First, if a packet is lost, SRT retransmits it, but only for a certain amount of time determined by the configured latency; this means the latency is bounded by the application. Second, it estimates the available bandwidth using algorithms from UDT. This lets it avoid sending at a rate that exceeds the link's capacity, and it also makes this information available to the application (i.e. the encoder), which can then adjust its encoding bitrate so as not to exceed the available bandwidth, ensuring the best possible quality. Combining these techniques, we can achieve broadcast-grade video over the Internet as long as the bandwidth is sufficient.
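The latency bound described above maps onto a property of the SRT elements. The sketch below builds a sender command with an explicit retransmission window; the `latency` property (in milliseconds) on srtserversink is an assumption here, so verify it with `gst-inspect-1.0` on your installation. The command is assembled as a string so the sketch runs without GStreamer present:

```shell
# How long SRT may keep retransmitting a lost packet before giving up;
# this is also the upper bound on the delay the protocol adds.
LATENCY_MS=200

# Assemble the sender command as a string so the knob is easy to see.
# srtserversink and its "latency" property are assumptions, matching the
# pre-1.16 element names used in this post.
SEND_CMD="gst-launch-1.0 videotestsrc ! x264enc tune=zerolatency \
 ! mpegtsmux ! srtserversink uri=srt://:8888/ latency=${LATENCY_MS}"

echo "$SEND_CMD"
```

A larger value gives SRT more chances to recover a lost packet; a smaller one keeps the end-to-end delay tight at the cost of resilience. Both ends should be configured with a comparable latency.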

At Collabora, we're very excited by the possibilities created by SRT, so we decided to integrate it into GStreamer, the most versatile multimedia framework out there. SRT is a connection-oriented protocol, so it connects two peers. It supports two different modes: one in which there is a caller and a listener (so it works like TCP), and one called "rendez-vous mode" in which both sides call each other, making it friendly to firewalls. An SRT connection can also act in one of two roles, either as a receiver or as a sender, or in GStreamer-speak as a source or a sink. In GStreamer, we chose to create 4 different elements: srtserversink, srtclientsink, srtserversrc, and srtclientsrc. We decided on the client/server naming instead of caller/listener because we think it's easier to understand, and it matches the naming of our TCP-based elements. We also chose to implement rendez-vous mode inside the client elements, as the codepaths are the same after initialization.
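Since rendez-vous mode lives inside the client elements, selecting it is a matter of connection parameters rather than a separate element. Here is a hypothetical sketch of a rendez-vous receiver using the merged srtsrc element from the 1.16 update; the `mode=rendezvous` URI query parameter and its spelling are assumptions (a `mode=caller` parameter appears in the comments below), so check `gst-inspect-1.0 srtsrc` on your installation:

```shell
# Hypothetical rendez-vous receiver: both peers dial each other on the
# same port, which makes the connection friendlier to firewalls/NAT.
PEER=203.0.113.7   # placeholder address of the other peer

RECV_CMD="gst-launch-1.0 srtsrc uri=srt://${PEER}:8888?mode=rendezvous \
 ! decodebin ! autovideosink"

echo "$RECV_CMD"
```

The other peer would run the mirror-image sender pipeline with the same mode, pointing at this machine's address.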

A typical example would be an encoder which is also a server, with a pipeline like:

gst-launch-1.0 v4l2src ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtserversink uri=srt://:8888/

And a receiver on the other side would receive it with a pipeline like this one:

gst-launch-1.0 srtclientsrc uri=srt:// ! decodebin ! autovideosink

Using tools like gst-launch, it's very easy to prototype SRT and its integration into real-world pipelines that can be used in real applications. Our team at Collabora would love to help you integrate SRT into your platform, using GStreamer, FFmpeg, VLC or your own multimedia framework. Contact us today to see how we can help!

Update (Jan 2019):

In GStreamer 1.16, we've decided to merge the clientsrc and serversrc SRT elements into a single source element, and likewise for the sinks. So the example pipelines in 1.16 are:

gst-launch-1.0 -v videotestsrc ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:8888/


gst-launch-1.0 srtsrc uri=srt:// ! decodebin ! autovideosink

Comments (10)

  1. nh2:
    Jan 22, 2019 at 03:02 AM

    gst-launch-1.0 srtclientsrc srt:// ! decodebin ! autovideosink doesn't seem to work (any more?).

    Should probably be gst-launch-1.0 srtclientsrc uri=srt:// ! decodebin ! autovideosink

    See also https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/874 for various problems.


    1. Olivier Crête:
      Jan 22, 2019 at 03:00 PM

      Yes, the git master has known issues, they will be fixed before the next release.


      1. nh2:
        Jan 22, 2019 at 04:58 PM

The incompatibility of the commands in the post was also present in the 1.14 release, not only in master. But your reply below may already have captured that.


    2. Olivier Crête:
      Jan 22, 2019 at 04:37 PM

      I've just noticed the missing uri=, we've updated the blog post. Also in 1.16, we're merging the client & server src and sinks. I'll be updating the blog post to cover that.


  2. Manuel:
    May 20, 2019 at 07:02 AM

    I'm using...

    gst-launch-1.0 -v srtsrc uri="srt://server_ip:10006?mode=caller&pksize=1316&latency=500&blocksize=1316" ! tsdemux ! opusdec ! audioconvert ! wasapisink sync=false low-latency=true

    When I simulate packet loss I get this error:

    ERROR: from element /GstPipeline:pipeline0/GstSRTSrc:srtsrc0: Internal data stream error.
    Additional debug info:
    ../libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstSRTSrc:srtsrc0:
    streaming stopped, reason error (-5)

The reception of the stream stops.
When I perform the same simulation using srt-live-transmit to send UDP on the same machine and receive with GStreamer's udpsrc, the reception does not stop.
Is this a bug in the plugin, or am I getting the srtsrc parameters wrong?


    1. Olivier Crête:
      May 21, 2019 at 04:13 PM

      This does sound like a bug in the GStreamer plugin. Could you please file a bug at https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/? Thank you!


      1. Niklas Hambuechen:
        May 21, 2019 at 05:35 PM

        When you've filed a bug, please link it here, as I'm also interested in that topic.

I, too, have observed that in some situations the GStreamer pipeline just stops, especially when the stream's bitrate exceeds the capacity of the connection.


  3. Anonymous:
    Aug 27, 2019 at 01:17 PM

How can I replace "videotestsrc" with a real ".mp4" (h264) file? I didn't manage to do it, and it would certainly be of interest.
Thank you.


    1. Olivier Crête:
      Aug 27, 2019 at 06:10 PM

      You probably want something like: filesrc location=myfile.mp4 ! qtdemux ! h264parse ! mpegtsmux alignment=7 ! srtsink


      1. Anonymous:
        Aug 28, 2019 at 10:19 AM

Thanks for your help!
My stream now reaches the client.
Well, it freezes after a few seconds and there is no sound, but it's a start ;D

