May 06, 2019
The much-anticipated GStreamer 1.16 release is now live. Many new features were added by the community during the year-long development cycle, and we would like to highlight some of our team's contributions that we're especially proud of.
This release continues Collabora's long-standing focus on giving embedded developers the tools they need to gain deeper insights into their pipelines and extract maximum performance from their hardware.
To help developers achieve the lowest possible latency, Nicolas and Julian added latency tracing for each element in a GStreamer pipeline. The new tracing API can be easily added as follows:
GST_DEBUG="GST_TRACER:7" \
GST_TRACERS="latency(flags=pipeline+element+reported)" \
gst-launch-1.0 alsasrc num-buffers=20 ! flacenc ! identity ! fakesink
This offers a big improvement over the old per-pipeline latency measurements when it comes to pinpointing latency bottlenecks. You can see a detailed description of the new capabilities here. In addition, gst-stats has been improved to calculate certain latency metrics.
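As a sketch of how the two pieces fit together, the tracer output can be written to a log file and then summarized with gst-stats (the pipeline below is illustrative; the exact statistics printed depend on which tracers were enabled):

```shell
# Run a pipeline with the latency tracer, logging to a file.
GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=trace.log \
GST_TRACERS="latency(flags=pipeline+element)" \
gst-launch-1.0 audiotestsrc num-buffers=100 ! fakesink

# Post-process the log to get aggregated latency statistics.
gst-stats-1.0 trace.log
```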
Not content with just measuring latency, Nicolas also worked on eliminating unnecessary latency in the RTP H.264 payloader and depayloader by adding support for the MARKER bit and the alignment= flag. Taken together, these make it possible to stream H.264 over RTP without artificially adding latency.
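On the receiving side, the benefit can be seen by requesting NAL-aligned output from the depayloader, so each NAL unit is pushed downstream as soon as it arrives rather than being held until a full frame is assembled. A minimal receiver sketch (element availability depends on your installed plugins, and the caps shown assume dynamic payload type 96):

```shell
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! "video/x-h264,alignment=nal" \
  ! h264parse ! avdec_h264 ! autovideosink
```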
Zeeshan added a new interlacing mode that allows different fields to be put in separate buffers, also reducing latency. Interlacing is how old-fashioned analog video was carried, and the new alternate mode matches exactly how analog video works, with each field arriving separately in time. Finally, Guillaume spent quite a bit of time improving
gst-validate to measure latency, and also to count dropped buffers and buffer frequency.
Storing audio in non-interleaved buffers can dramatically improve performance through more efficient interoperability with external APIs such as FFmpeg and Google's WebRTC DSP code, and also by avoiding memory copies for certain operations such as combining single-channel streams into one multi-channel stream. To achieve this performance boost, George has added full non-interleaved audio support to GStreamer. See here for details on the new non-interleaved APIs.
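One of the operations mentioned above, combining single-channel streams into a multi-channel stream, can be sketched with the interleave element; with non-interleaved buffer support, this kind of operation can avoid extra memory copies. The pipeline below is illustrative only (two synthetic mono tones merged into one stereo stream):

```shell
gst-launch-1.0 interleave name=i \
    audiotestsrc freq=440 ! audioconvert ! "audio/x-raw,channels=1" ! i. \
    audiotestsrc freq=880 ! audioconvert ! "audio/x-raw,channels=1" ! i. \
    i. ! audioconvert ! autoaudiosink
```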
Guillaume added support on udpsrc for forcing the kernel socket buffer size to what the user requested, along with a warning when the requested size could not be honored. This configuration is used for high-bitrate streaming, where the default buffering is insufficient and userspace cannot read fast enough, and it allows 4K workloads to be handled more smoothly.
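For example, a receiver could request a 4 MiB kernel receive buffer via the buffer-size property (values shown are illustrative; the kernel may clamp the request depending on the net.core.rmem_max sysctl, which is when the new warning fires):

```shell
gst-launch-1.0 udpsrc port=5000 buffer-size=4194304 \
    caps="application/x-rtp" ! fakesink
```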
For the OpenMAX IL compatibility layer, GStreamer OMX, Guillaume added many improvements, including support for
NV16 pixel format as input to video encoders, better debugging using the new
OMX_API_TRACE debug category, and various Zynq UltraScale+ MPSoC specific improvements.
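The new debug category is enabled like any other GStreamer debug category. A sketch, assuming a platform with gst-omx installed (element names vary by platform; omxh264enc is one common example):

```shell
GST_DEBUG="OMX_API_TRACE:7" \
gst-launch-1.0 videotestsrc num-buffers=30 ! omxh264enc ! fakesink
```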
Nicolas increased GStreamer V4L2 robustness by adding validation to capture buffer imports, a big step on the road to safe capture buffer imports. And Ezequiel added JPEG encoding support, and also support for FWHT, a fake codec implemented in the Linux kernel that's used to test and validate the V4L framework without the need for special hardware.
Nicolas also improved the VAAPI H264 decoder, with more robust handling of broken and partially-supported streams.
During the 1.16 cycle, GStreamer development switched to GitLab, and as part of this project, we're in the process of replacing Jenkins with GitLab-CI. Assisting this effort, Nicolas added Cerbero builds to the CI system. This is the build system used to produce GStreamer builds for platforms other than Linux.
One of our most "visible" improvements came from Zeeshan, who added colors to the output of
gst-inspect-1.0. And last but not least, Olivier spent a great deal of time improving the documentation, making over 75 commits.
In addition to these enhancements, our team made numerous bug-fixes and improvements to the code. If you have any questions about these or other new features, please don't hesitate to contact us.