February 08, 2021
Communication networks are...capricious. Even more so for autonomous vehicles on the ground or in the air: always on the move, with a Wi-Fi hotspot or a cellular tower as their only link to home base. Connectivity can change from great to terrible around any corner, and if we happen to be streaming a live feed from our drone's camera, the consequences for video quality can be disastrous.
|Woowa food delivery bot, powered by Hwangsaeul, en route to customers.|
When flying above a cityscape, we can't demand that our wireless provider build more base stations, or ask passersby to kindly refrain from watching cat videos on their phones so our stream can get through. We can't fix the network, but maybe we can adapt to it.
Hwangsaeul, or H8L, is a remote surveillance streaming solution. It uses libsrt's ability to collect statistics from open SRT sockets and, by continuously analyzing the available data, tries to detect potential connectivity issues. When a problem is diagnosed, customizable logic can decide to change some parameters of the video encoding process, sacrificing quality to preserve smooth, steady streaming at a lower bitrate.
|Stream adaptor implements a continuous feedback loop from the network socket to the video encoder.|
Since the strategy for which tradeoffs to make when scaling the stream down under limited connectivity may differ from use case to use case, Hwangsaeul provides an API for writing your own decision-making plugins. The API gives access to the complete set of SRT statistics, letting you analyze them and choose how to tweak the encoder settings.
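To give a feel for what such a decision plugin does, here is a minimal sketch of the idea in Python. The real plugin API is C/GLib, so every name below is hypothetical; only the statistics fields are modeled on what libsrt actually reports per socket (`SRT_TRACEBSTATS` exposes, among others, `msRTT`, `mbpsBandwidth`, and `pktSndLoss`):

```python
from dataclasses import dataclass

# Hypothetical subset of per-socket SRT statistics, with field names
# modeled on libsrt's SRT_TRACEBSTATS structure.
@dataclass
class SrtStats:
    ms_rtt: float          # round-trip time, milliseconds
    mbps_bandwidth: float  # SRT's estimate of the link bandwidth, Mbit/s
    pkt_snd_loss: int      # packets lost since the previous sample

# Hypothetical encoder settings the plugin is allowed to adjust.
@dataclass
class EncoderSettings:
    bitrate: int    # target bitrate, bits per second
    quantizer: int  # quantizer value

def decide(stats: SrtStats, current: EncoderSettings,
           baseline: EncoderSettings) -> EncoderSettings:
    """Keep the video bitrate within a safety margin of the estimated
    link bandwidth, but never above the user-chosen baseline."""
    estimate_bps = stats.mbps_bandwidth * 1_000_000
    target = min(baseline.bitrate, int(estimate_bps * 0.8))
    return EncoderSettings(bitrate=target, quantizer=current.quantizer)
```

A plugin for a different use case could instead raise the quantizer, drop the resolution first, or reduce the frame rate; the point of the API is that this tradeoff is yours to define.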
Hwangsaeul comes with an application that demonstrates adaptive streaming using a simple decision plugin that tries to adjust the video bitrate to the socket bandwidth estimate calculated by the SRT library. You can compile it using Hwangsaeul's fork of gst-build:
git clone https://github.com/hwangsaeul/gst-build.git -b h8l/adaptive-demo
cd gst-build
meson build
ninja -C build
This builds GStreamer's master branch and H8L's Gaeguli module, which contains the demo. Because we are using our own build of GStreamer, we need to enter the development environment that replaces the GStreamer binaries that may be preinstalled on your machine with the versions just compiled.
ninja -C build devenv
Switch to the directory with the demo:
Before we launch it, we have to do one last manual step. In order to control network bandwidth, the demo uses the Linux traffic control facilities through the gaeguli-tc-helper binary. Changing the bandwidth is a privileged operation, for which the helper executable needs to be granted the cap_net_admin capability by root:
sudo setcap cap_net_admin+ep gaeguli-tc-helper
If you forget to run the previous command, the helper won't be able to access your network interface and will print the following error when you try to enable traffic control:
rtnl_qdisc_add failed: Operation not permitted
The demo executable requires the name of the camera device you want to use as the video source:
$ ./gaeguli-adaptor-demo -d /dev/video4
Control panel URI: http://192.168.47.155:8080/
Opening the address in a browser reveals the streaming control panel:
|Adaptive streaming demo control panel and a video stream in Totem.|
After choosing a codec and starting the stream, an SRT URI will appear that can be opened in a compatible player such as Totem or gst-play. The sliders control the baseline bitrate and quantizer settings of your video, which apply when the network connection is not degraded.
Enabling traffic control will throttle the bandwidth of the local loopback interface, by default to 1024 kbps. As packet loss begins to appear, the stream will start to break up, and the "SRT measured bandwidth" value should reflect the limited bandwidth. Soon, the stream adaptor, continuously monitoring the bandwidth estimate, should detect the issue and request that the encoder decrease its bitrate accordingly. This will show in the "Actual bitrate" value and in a video stream that, although of lower quality, is continuous and free of errors.
Disengaging traffic control, or raising the bandwidth limit high enough, should result in a steady return of the stream bitrate to the baseline level.
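This drop-fast, recover-gradually behavior can be modeled with a toy AIMD-style controller. To be clear, this is not Gaeguli's actual algorithm, just a hand-rolled Python illustration of why the bitrate falls quickly when loss appears and climbs back steadily once the link is clean:

```python
def adapt(bitrate: int, baseline: int, packet_loss: bool) -> int:
    """Halve the bitrate when loss is observed; otherwise step back up
    toward the baseline in increments of 10% of the baseline."""
    if packet_loss:
        return max(bitrate // 2, 100_000)  # floor keeps the stream alive
    return min(baseline, bitrate + baseline // 10)

baseline = 4_000_000
rate = baseline

# Link throttled: repeated loss drives the bitrate down fast.
for _ in range(3):
    rate = adapt(rate, baseline, packet_loss=True)
print(rate)  # 500000

# Throttle disengaged: the bitrate ramps steadily back to the baseline.
while rate < baseline:
    rate = adapt(rate, baseline, packet_loss=False)
print(rate)  # 4000000
```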
Below you can see the demo in action. Note that the demo simulates encoding and streaming the video over a network, which adds some processing latency; the image shown in the player is therefore not in sync with the presentation's audio.
If you have any questions regarding Hwangsaeul, or would like to learn how to harness the potential of SRT, GStreamer, or any other open source technologies in your media streaming projects, please don't hesitate to contact us; we'll be happy to help!