HTML5 Live Streaming with MPEG-DASH

Streaming to browsers without reliance on third-party services

Written 2017 June 13
Updated 2020 February 4

When my organization started streaming live online, we went with YouTube. Since it was free and easy to set up, it was an obvious choice. But during the months we used it a few problems cropped up. Audio and video would frequently be out of sync, despite endless encoder tweaks and changes. Also, creating and configuring new events each week was a hassle.

Once we figured out YouTube wasn't going to fit our needs I started looking at other options. Other streaming services would likely have similar problems, since they give little control over the server-side encoding pipeline. Several turnkey self-hosted solutions were available, but as a FOSS geek I wanted to see what open source solutions were available.

As part of the move away from browser plugins like Flash, the Moving Picture Experts Group (MPEG) developed a new media streaming technique: DASH, Dynamic Adaptive Streaming over HTTP. DASH works with almost all major browsers via the dash.js player. (Apple earned yet another black mark in my book by not supporting DASH on iOS, requiring a fallback to Apple's similar-but-uglier HLS protocol.) These two pieces of software, along with FFmpeg (or Gstreamer if desired), come together to form an effective browser-based live streaming solution.


A bit about DASH

DASH works by taking an incoming media stream and splitting it into chunks, then keeping an index of chunks for viewers to download in sequence. (Apple's HLS works very similarly, but stores the index in a different format.) One of DASH's coolest features is adaptive streaming: if configured with multiple copies of a stream at different bitrates (or resolutions), it'll automatically switch bitrates to keep the stream from stopping to buffer. While YouTube and other big-name services have had this feature for a long time, it's not as commonly seen with open source streaming solutions.
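The core idea behind the adaptation is simple: the player measures its recent download throughput and picks the highest-bitrate copy that fits, with some safety margin. As a toy illustration only (dash.js's real adaptation rules are considerably more sophisticated), a minimal rate picker might look like:

```python
# Toy illustration of adaptive bitrate selection -- NOT dash.js's actual
# algorithm, just the core idea: pick the highest representation whose
# bitrate fits within the measured throughput, with a safety margin.

def pick_representation(bitrates_bps, measured_throughput_bps, safety=0.8):
    """Return the highest bitrate that fits in safety * throughput,
    falling back to the lowest representation if none fit."""
    usable = measured_throughput_bps * safety
    candidates = [b for b in sorted(bitrates_bps) if b <= usable]
    return candidates[-1] if candidates else min(bitrates_bps)

# The three bitrates used later in this guide's nginx-rtmp config:
variants = [500_000, 1_500_000, 5_000_000]

print(pick_representation(variants, 8_000_000))  # plenty of bandwidth -> 5000000
print(pick_representation(variants, 1_000_000))  # constrained link -> 500000
```

When throughput drops, the next chunk is simply fetched from a lower-bitrate copy; because all copies are chunked on the same schedule, the switch is seamless to the viewer.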

Streaming software: FFmpeg

The first piece of this streaming system is the streaming software. There are a lot of options out there, both open source and proprietary, but for this guide I'll be using FFmpeg.

FFmpeg is a multimedia swiss army knife that captures, converts, and streams just about every format under the sun. It's an incredibly powerful tool, but is also somewhat tricky to use due to its command-line-only interface and plethora of options.

I use FFmpeg because it's one of the few tools that allows for simultaneous encoding at different bitrates: instead of sending a single stream and re-encoding it on the server, I send multiple streams from the client, which reduces the server's workload and prevents the quality loss caused by re-encoding. The streaming PC needs sufficient processing power and upload bandwidth, however.

Streaming software: Gstreamer

Despite its awesomeness, sometimes FFmpeg doesn't quite cut it. After writing this guide I moved my org's video streaming duties from an Intel i7-based PC to a server box with four 12-core AMD Opteron CPUs. Despite the abundance of processing power, FFmpeg couldn't manage even a medium-quality stream due to inefficient multithreading. Gstreamer's greater degree of control allowed me to set up a pipeline that made better use of the Opterons.

Gstreamer is an open-source multimedia framework that works by connecting elements together in a pipeline to accomplish tasks. Knowledge of how media encoding and muxing work is required to create a useful pipeline; FFmpeg is generally simpler to use.

The server: nginx-rtmp

nginx-rtmp is a module for the popular nginx web server. The module receives the stream (or streams, in the case of adaptive streaming) from the streaming software and splits it into chunks suitable for DASH streaming. I'm using this fork of the module, which has additional adaptive streaming support.

The player: dash.js

dash.js runs in the viewer's browser, downloading and playing the chunks generated by nginx-rtmp. It's quite simple to embed and use in an existing website.

Setting up nginx-rtmp


nginx-rtmp has low resource requirements; for example, my org's install is hosted on Digital Ocean's lowest-tier plan and runs our three different-quality streams flawlessly. The server must have a good internet connection, however, since streaming video is quite bandwidth-intensive. Since the video chunks are stored on disk, an SSD is recommended.
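To put "bandwidth-intensive" in numbers: each viewer continuously downloads one variant's bitrate, so the server's total egress is roughly the sum of viewers times their chosen bitrate. A quick back-of-the-envelope check (the viewer counts here are invented purely for illustration):

```python
# Back-of-the-envelope server egress estimate: each viewer continuously
# pulls one variant, so egress ~= sum of (bitrate * viewers) per variant.
# Audience numbers below are made up for illustration.

def egress_mbps(viewers_per_variant):
    """viewers_per_variant: {bitrate_bps: viewer_count} -> total Mbit/s."""
    return sum(bps * n for bps, n in viewers_per_variant.items()) / 1e6

audience = {500_000: 20, 1_500_000: 50, 5_000_000: 30}
print(egress_mbps(audience))  # -> 235.0 Mbit/s
```

A hundred viewers can easily demand a couple hundred megabits of sustained upload, which is why the hosting plan's bandwidth matters more than its CPU.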

Any modern Linux distro should work as the server, but for this document I'll assume Ubuntu Server 16.04 LTS. For this guide's purposes I'll also assume the server is dedicated to only running nginx-rtmp; if you're using the same server for other purposes, you'll need to adjust the instructions somewhat.

Installing nginx-rtmp

If you're running Ubuntu 16.04 just copy-paste the following (as root):

apt-get build-dep nginx
apt-get source nginx
git clone
cd nginx-1.10.0
./configure --add-module=../nginx-rtmp-module
make install
wget -O /lib/systemd/system/nginx.service
systemctl daemon-reload
systemctl enable nginx.service

This will install nginx with nginx-rtmp to /usr/local/nginx, then configure systemd to start it at boot.


The first step is setting up SSL/TLS. If the website that will display the stream doesn't use HTTPS, this step can be skipped; however, since most browsers are restricting support for non-encrypted traffic, I highly recommend using HTTPS.

Let's Encrypt is a service that provides free SSL/TLS certificates. All you need is a domain name pointing to your server. (Acquiring and configuring a domain name is out of the scope of this document.) Generally you'll want a subdomain of your website's domain name for the streaming server, e.g. video.<your_domain> or stream.<your_domain>.

First, install the Let's Encrypt client with apt-get install letsencrypt. Next, stop nginx with systemctl stop nginx.service, then get your certificate by running letsencrypt certonly. Follow the prompts, entering your server's domain name when asked. Once the client finishes, you're ready for the next step.

nginx-rtmp's configuration is somewhat complicated; here's my organization's config as a starting point:

pid /run/nginx.pid;
worker_processes  1;

events {
    worker_connections  1024;
}

http {
    include       mime.types;
    sendfile        on;
    keepalive_timeout  65;

    server {
        listen 80;
        server_name <your_server_domain_here>;
        return 301 https://$host$request_uri;
    }

    server {
        listen       443 ssl;
        server_name  <your_server_domain_here>;

        ssl_certificate /etc/letsencrypt/live/<your_server_domain_here>/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/<your_server_domain_here>/privkey.pem;
        ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
        ssl_prefer_server_ciphers on;
        ssl_dhparam /etc/ssl/certs/dhparam.pem;
        ssl_session_timeout 1d;
        ssl_session_cache shared:SSL:50m;
        ssl_stapling on;
        ssl_stapling_verify on;
        add_header Strict-Transport-Security max-age=15768000;

        client_max_body_size 128M;

        add_header Access-Control-Allow-Origin * always;
        add_header Cache-Control no-cache always;

        # Redirect this domain to a different URL
        location / {
            root   html;
            return 301 <your_redirect_here>;
        }

        # Return an empty response, used by dash.js to sync with server time
        location /time {
            return 200;
        }

        # DASH files
        location /dash {
            root /tmp;
        }

        # HLS files
        location /hls {
            root /tmp;
        }
    }
}

rtmp {
    server {
        listen 1935;
        chunk_size 4096;

        publish_time_fix off;

        application dash {
            live on;
            record off;
            allow publish <your_sender_ip_here>;
            allow publish 127.0.0.1;
            deny publish all;

            # Copy incoming streams to the HLS application
            exec ffmpeg -re -i rtmp://localhost:1935/$app/$name -c:v copy -c:a copy -f flv rtmp://localhost:1935/hls/${name};

            dash on;
            dash_nested on;
            dash_path /tmp/dash;
            dash_fragment 3;
            dash_playlist_length 120;
            dash_cleanup on;

            dash_clock_compensation http_head;
            dash_clock_helper_uri https://<your_server_domain_here>/time;

            dash_variant _low  bandwidth="500000"  width="640"  height="360";
            dash_variant _med  bandwidth="1500000" width="1280" height="720";
            dash_variant _high bandwidth="5000000" width="1920" height="1080" max;
        }

        application hls {
            # I despise iOS devices!
            live on;
            hls on;
            hls_path /tmp/hls;
            hls_nested on;

            hls_variant _low  BANDWIDTH=500000;
            hls_variant _med  BANDWIDTH=1500000;
            hls_variant _high BANDWIDTH=5000000;
        }
    }
}
Copy this config into /usr/local/nginx/conf/nginx.conf, replacing the file's default contents. Replace all instances of <your_server_domain_here> with your server's domain name, and replace the instance of <your_redirect_here> with a URL to your website, in the unlikely event of users browsing to the video server's domain.

Replace <your_sender_ip_here> with the IP address of the stream sender. You can add multiple allow publish lines to permit sending from multiple IPs.

The dash_variant and hls_variant lines can be modified with the bitrate (in bits per second) and resolution of each of your stream qualities. You can have as few or as many different-bitrate streams as you like.

Sending the stream with FFmpeg

This is the most complex and vital piece of the streaming system. Here's the FFmpeg command line my organization used:

ffmpeg -re -i "Test Video.mp4" \
    -c:a aac -ac 2 -b:a 128k -c:v libx264 -pix_fmt yuv420p -profile:v baseline -preset ultrafast -tune zerolatency -vsync cfr -x264-params "nal-hrd=cbr" -b:v 500k -minrate 500k -maxrate 500k -bufsize 1000k -g 60 -s 640x360 -f flv rtmp://<your_server_domain_here>/dash/streamname_low \
    -c:a aac -ac 2 -b:a 128k -c:v libx264 -pix_fmt yuv420p -profile:v baseline -preset ultrafast -tune zerolatency -vsync cfr -x264-params "nal-hrd=cbr" -b:v 1500k -minrate 1500k -maxrate 1500k -bufsize 3000k -g 60 -s 1280x720 -f flv rtmp://<your_server_domain_here>/dash/streamname_med \
    -c:a aac -ac 2 -b:a 128k -c:v libx264 -pix_fmt yuv420p -profile:v baseline -preset ultrafast -tune zerolatency -vsync cfr -x264-params "nal-hrd=cbr" -b:v 5000k -minrate 5000k -maxrate 5000k -bufsize 10000k -g 60 -s 1920x1080 -f flv rtmp://<your_server_domain_here>/dash/streamname_high

This command plays the video file Test Video.mp4, encodes it at three different qualities, and sends them to the nginx-rtmp server. Adapt it to your purposes by pointing the rtmp:// output URLs at your nginx-rtmp server's domain name and using any stream name desired. You must use the same variant endings (in this example _low, _med, and _high) that you used in the nginx-rtmp configuration from the previous section. You can send multiple stream sets as long as they have different stream names.

Livestreaming a pre-recorded video file isn't very useful except for testing. The first line of the command can be modified to specify any input type supported by FFmpeg; there's a lot of them.

We use a Blackmagic Decklink Mini Recorder receiving the feed from our video mixer:

ffmpeg -f decklink -rtbufsize 702000k -deinterlace -i "DeckLink Mini Recorder@11" \

On Linux-based systems, you could capture video from a webcam with:

ffmpeg -f v4l2 -i /dev/video0

Using Gstreamer

For more control over the encoding and streaming pipeline (or if FFmpeg doesn't meet your needs), Gstreamer can also be used to feed video to nginx-rtmp. A Gstreamer primer is out of the scope of this guide, but here's the Python/Gstreamer script my organization is now using, which should be a decent starting point.

You'll need to replace videotestsrc and audiotestsrc with the appropriate elements for your desired media source; Gstreamer has lots of them.

Playing the stream using dash.js

Your stream URLs

Once FFmpeg is sending the stream it will be available over DASH and HLS. For DASH the URL will be https://<your_domain>/dash/<streamname>.mpd. Each streaming quality is available separately at https://<your_domain>/dash/<streamname>_<quality>/index.mpd, which is useful for checking each stream bitrate to ensure they're of acceptable quality. For HLS, the URL will be https://<your_domain>/hls/<streamname>.m3u8.
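If you script any monitoring or page generation, these URL patterns are easy to produce programmatically. A small sketch following the patterns above (the domain and stream name are placeholders):

```python
# Build the playback URLs described above from a domain, stream name,
# and the variant suffixes configured in nginx-rtmp. Domain and stream
# name below are example placeholders.

def stream_urls(domain, name, qualities=("_low", "_med", "_high")):
    urls = {
        "dash": f"https://{domain}/dash/{name}.mpd",
        "hls": f"https://{domain}/hls/{name}.m3u8",
    }
    # Per-quality DASH manifests, handy for checking each bitrate alone:
    for q in qualities:
        urls[f"dash{q}"] = f"https://{domain}/dash/{name}{q}/index.mpd"
    return urls

urls = stream_urls("video.example.com", "streamname")
print(urls["dash"])      # -> https://video.example.com/dash/streamname.mpd
print(urls["dash_low"])  # -> https://video.example.com/dash/streamname_low/index.mpd
```

The per-quality manifests are the quickest way to eyeball each encode individually before trusting the adaptive stream.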

The DASH-IF Reference player

DASH-IF (the DASH Industry Forum) provides a reference dash.js player, which is very useful for testing your stream. Just enter your stream URL and click "Load".

Code to embed

To embed dash.js on your website, use the following HTML and Javascript:

<video id="live-video" poster="<poster_image_here>" controls></video>

<script src=""></script>

<script>
document.addEventListener("DOMContentLoaded", function (event) {
    var video_element = document.getElementById('live-video');

    if (window.MediaSource) {
        // For good web browsers, use dash.js

        var player = dashjs.MediaPlayer().create();
        player.initialize(video_element, "<your_dash_stream_url>", true);

    } else {
        // For Safari on iOS, use HLS

        video_element.src = "<your_hls_stream_url>";
    }
});
</script>
This snippet checks if the browser supports DASH and initializes the dash.js player, or falls back to HLS if DASH is not supported. Replace <your_dash_stream_url> and <your_hls_stream_url> with your DASH and HLS stream URLs as described in the section above. (Leave the surrounding double quotes intact, i.e "".)


After completing the above steps you should have a working self-hosted live streaming setup. If you run into any issues or have any questions feel free to leave me a message in the comments below.
