Network adaptive streaming with Hwangsaeul

Jakub Adam
February 08, 2021

Communication networks are... capricious. Even more so if we consider autonomous vehicles on the ground or in the air: always on the move, with a Wi-Fi hotspot or a cellular tower as their only link to the home base. Connectivity can change from great to terrible around any corner, and if we happen to be streaming a live feed from our drone's camera, the consequences for video quality can be disastrous.

Woowa food delivery bot—powered by Hwangsaeul—en route to customers.

When flying above a cityscape, we can't demand that our wireless provider build more base stations, or ask the passersby to kindly refrain from watching cat videos on their phones and let our stream through. We can't fix the network, but maybe we can adapt to it.

Hwangsaeul, or H8L, a remote surveillance streaming solution, uses libsrt's ability to collect statistics from open SRT sockets and, by continuously analyzing the available data, tries to detect potential connectivity issues. When a problem is diagnosed, customizable logic may decide to change some parameters of the video encoding process, sacrificing quality to preserve smooth and steady streaming at a lower bitrate.
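The core of this idea can be sketched in a few lines. The following is a minimal illustration in Python with invented names, not Gaeguli's actual code (which is written in C on top of GObject and libsrt): given SRT's link bandwidth estimate, pick an encoder bitrate that stays safely below it.

```python
def target_bitrate(bandwidth_estimate_kbps, baseline_kbps, headroom=0.8):
    """Pick an encoder bitrate from SRT's link bandwidth estimate.

    Streaming at the full estimate invites packet loss, so we target
    only a fraction of it (the headroom), and we never exceed the
    bitrate configured for a healthy network (the baseline).
    """
    return min(baseline_kbps, int(bandwidth_estimate_kbps * headroom))
```

For example, with a 2000 kbps baseline, an estimate of 1024 kbps yields a target of 819 kbps, while an estimate of 10 Mbps leaves the encoder at the baseline.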

The stream adaptor implements a continuous feedback loop from the network socket to the video encoder.

Since we realize that the strategy for which trade-offs to make when scaling the stream down under limited connectivity may differ from use case to use case, Hwangsaeul offers an API for writing your own decision-making plugins. The API gives access to the complete set of SRT statistics, so you can analyze them and choose how to tweak the encoder settings.
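To make the plugin idea concrete, here is a hypothetical decision-making plugin sketched in Python. The real Gaeguli API is a C/GObject interface and all names below are invented; the point is the shape of the logic: cut the bitrate sharply whenever send-side packet loss shows up in the statistics, and recover slowly otherwise.

```python
class LossAwareAdaptor:
    """Illustrative decision plugin: trade quality for stability.

    on_stats() receives a snapshot of SRT socket statistics and
    returns the encoder properties to change, if any.
    """

    def __init__(self, baseline_kbps, floor_kbps=100):
        self.baseline = baseline_kbps
        self.floor = floor_kbps
        self.current = baseline_kbps

    def on_stats(self, stats):
        if stats.get("pkt_snd_loss", 0) > 0:
            # Packets are being lost: cut the bitrate by 25%.
            self.current = max(self.current * 3 // 4, self.floor)
        else:
            # Clean interval: creep back toward the baseline in
            # small steps (5% of the baseline per interval).
            self.current = min(self.current + self.baseline // 20,
                               self.baseline)
        return {"bitrate": self.current}
```

Each statistics interval moves the bitrate one step, so a burst of loss is answered quickly while recovery takes many clean intervals.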

Trying it out

Hwangsaeul comes with an application that demonstrates adaptive streaming using a simple decision plugin that tries to adjust the video bitrate to the socket bandwidth estimate calculated by the SRT library. You can compile it using Hwangsaeul's fork of gst-build:

git clone https://github.com/hwangsaeul/gst-build.git -b h8l/adaptive-demo
cd gst-build
meson build
ninja -C build

This builds GStreamer's master branch and H8L's Gaeguli module, which contains the demo. Because we are using our own build of GStreamer, we need to enter the development environment, which replaces any GStreamer binaries preinstalled on your machine with the freshly compiled versions.

ninja -C build devenv

Switch to the directory with the demo:

cd build/subprojects/gaeguli/tests/adaptor-demo/

Before we launch it, we have to do one last manual step. In order to control network bandwidth, the demo uses Linux traffic control facilities through the gaeguli-tc-helper binary. Changing the bandwidth is a privileged operation, so the helper executable needs to be granted the cap_net_admin capability by root:

sudo setcap cap_net_admin+ep gaeguli-tc-helper

If you skip this step, the helper won't be able to access your network interface and will print the following error when you try to enable traffic control:

rtnl_qdisc_add failed: Operation not permitted

The demo executable requires the name of the camera device you want to use as the video source:

$ ./gaeguli-adaptor-demo -d /dev/video4 
Control panel URI:

Opening the address in a browser reveals the streaming control panel:

Adaptive streaming demo control panel and a video stream in Totem.

After choosing a codec and starting the stream, an SRT URI will appear that can be opened in a compatible player like Totem or gst-play. The sliders control the baseline bitrate and quantizer settings of your video, which apply when the network connection is not degraded.

Enabling traffic control will throttle the bandwidth of the local loopback interface, by default to 1024 kbps. As packet loss begins to appear, the stream will start to break, and the "SRT measured bandwidth" value should reflect the limited bandwidth. Soon, the stream adaptor, continuously monitoring the bandwidth estimate, should detect the issue and request that the encoder decrease the bitrate accordingly. This will show in the "Actual bitrate" value and in a video stream that, although of lower quality, is continuous and free of errors.

Disengaging traffic control or raising the bandwidth limit high enough should result in a steady return of the stream bitrate to the baseline level.
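This asymmetry, a quick cut when congestion appears followed by a gradual climb back, is a common way to avoid oscillation in such control loops. One control step could be sketched like this (hypothetical logic with invented names, not Gaeguli's exact algorithm):

```python
def next_bitrate(current_kbps, estimate_kbps, baseline_kbps,
                 headroom=0.8, up_step_frac=0.05):
    """One control step: cut quickly on congestion, recover slowly.

    An immediate cut limits how long packets keep getting lost; a
    slow, stepwise recovery avoids oscillating when the bandwidth
    estimate hovers near the current bitrate.
    """
    ceiling = max(int(estimate_kbps * headroom), 1)
    if current_kbps > ceiling:
        return ceiling                      # congestion: cut at once
    step = max(int(baseline_kbps * up_step_frac), 1)
    return min(current_kbps + step, baseline_kbps, ceiling)
```

With a 2000 kbps baseline and the link throttled to 1024 kbps, the bitrate drops to 819 kbps in a single step; once the throttle is lifted, it climbs back 100 kbps per step until it reaches the baseline.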

Below you can see the demo in action. Note that the demo simulates encoding and streaming the video over a network, which adds some processing latency; the image shown in the player is therefore not in sync with the presentation's audio.

If you have any questions regarding Hwangsaeul, or would like to learn how to harness the potential of SRT, GStreamer, or any other open source technologies in your media streaming projects, please don't hesitate to contact us. We'll be happy to help!
