Adaptive streaming is crucial for delivering high-quality video content to various devices under different network conditions. This DevTip demonstrates how to convert videos to HLS (HTTP Live Streaming) and MPEG-DASH (Dynamic Adaptive Streaming over HTTP) formats using Rust and FFmpeg.

Introduction to HLS and MPEG-DASH

HLS and MPEG-DASH are popular adaptive streaming protocols that split video files into small segments and offer multiple bitrate options. This segmentation enables clients to seamlessly switch to the optimal quality based on current network conditions and device capabilities.

Understanding adaptive streaming

Adaptive streaming typically involves:

  1. Encoding the video at several bitrates and resolutions.
  2. Splitting each encoded version into short segments.
  3. Generating manifest files that detail the available streams and segments.
  4. Allowing clients to automatically select the best-suited quality during playback.
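Concretely, the client's entry point is a manifest that lists the available renditions. For HLS, a master playlist describing two renditions might look roughly like this (the bandwidths, resolutions, and file names are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=2200000,RESOLUTION=1920x1080
stream_0.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1100000,RESOLUTION=1280x720
stream_1.m3u8
```

The player reads this file once, then fetches segments from whichever variant playlist best matches its measured throughput, switching variants as conditions change.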

Setting up a Rust project

Begin by installing the necessary system dependencies:

apt-get install -y ffmpeg pkg-config libavcodec-dev libavformat-dev libavutil-dev \
  libavfilter-dev libavdevice-dev libswscale-dev libswresample-dev clang libclang-dev

Create a new Rust project:

cargo new video_converter
cd video_converter

Add these dependencies to your Cargo.toml file:

[dependencies]
ffmpeg-next = "7.1.0"
anyhow = "1.0"
warp = "0.3"
tokio = { version = "1", features = ["full"] }

Integrating FFmpeg with Rust

We use the ffmpeg-next crate to interface with FFmpeg's libraries. Start by creating a function that initializes FFmpeg and confirms the bindings are working:

use anyhow::Result;
use ffmpeg_next as ffmpeg;

fn init_ffmpeg() -> Result<()> {
    ffmpeg::init()?;
    Ok(())
}

fn main() -> Result<()> {
    init_ffmpeg()?;
    println!("FFmpeg initialized successfully");
    Ok(())
}

Implementing video conversion to HLS and MPEG-DASH

To convert videos to HLS and MPEG-DASH, we invoke FFmpeg’s command-line tool through Rust’s std::process::Command. This approach leverages FFmpeg’s powerful CLI while keeping the Rust code straightforward.
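Before spawning conversions, it can help to verify that the ffmpeg binary is actually reachable. A minimal sketch (the helper name is our own):

```rust
use std::process::Command;

/// Returns true if an `ffmpeg` binary is reachable on the PATH.
fn ffmpeg_available() -> bool {
    Command::new("ffmpeg")
        .arg("-version")
        .output()
        .map(|out| out.status.success())
        .unwrap_or(false)
}

fn main() {
    if ffmpeg_available() {
        println!("ffmpeg found on PATH");
    } else {
        println!("ffmpeg not found; install it before running the converters");
    }
}
```

Failing fast here produces a clearer error than letting Command return a "No such file or directory" error mid-conversion.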

Converting to HLS

Use the following function to convert a video file to HLS format. It creates the necessary output directory and invokes FFmpeg with parameters tailored for HLS generation:

use std::process::Command;
use anyhow::Result;

fn convert_to_hls(input: &str, output_dir: &str) -> Result<()> {
    std::fs::create_dir_all(output_dir)?;
    // With -var_stream_map, the positional output is the variant playlist
    // pattern (%v expands to the variant index). FFmpeg writes the master
    // playlist, named by -master_pl_name, alongside it in output_dir.
    let variant_path = format!("{}/stream_%v.m3u8", output_dir);
    let status = Command::new("ffmpeg")
        .args(&[
            "-i", input,
            "-map", "0:v:0",
            "-map", "0:a:0",
            "-c:v", "libx264",
            "-c:a", "aac",
            "-profile:v", "main",
            "-preset", "medium",
            "-crf", "23",
            "-sc_threshold", "0",
            "-g", "48",
            "-keyint_min", "48",
            "-hls_time", "4",
            "-hls_playlist_type", "vod",
            "-hls_segment_filename", &format!("{}/segment_%v_%03d.ts", output_dir),
            "-master_pl_name", "master.m3u8",
            "-var_stream_map", "v:0,a:0",
            "-f", "hls",
            &variant_path,
        ])
        .status()?;
    if !status.success() {
        return Err(anyhow::anyhow!("FFmpeg HLS conversion failed"));
    }
    Ok(())
}

Converting to MPEG-DASH

Similarly, the following function converts a video file to MPEG-DASH format. It sets up segment naming and manifest generation appropriate for DASH streaming:

fn convert_to_mpeg_dash(input: &str, output_dir: &str) -> Result<()> {
    std::fs::create_dir_all(output_dir)?;
    let output_path = format!("{}/manifest.mpd", output_dir);
    let status = Command::new("ffmpeg")
        .args(&[
            "-i", input,
            // Map the video stream twice to produce two representations
            // (indices v:0 and v:1 below), plus the audio stream once.
            "-map", "0:v:0",
            "-map", "0:v:0",
            "-map", "0:a:0",
            "-c:v", "libx264",
            "-c:a", "aac",
            "-b:v:0", "2M",
            "-b:v:1", "1M",
            "-s:v:1", "1280x720",
            "-profile:v", "main",
            "-seg_duration", "4",
            "-use_template", "1",
            "-use_timeline", "1",
            "-init_seg_name", "init-$RepresentationID$.m4s",
            "-media_seg_name", "chunk-$RepresentationID$-$Number%05d$.m4s",
            "-adaptation_sets", "id=0,streams=v id=1,streams=a",
            "-f", "dash",
            &output_path,
        ])
        .status()?;
    if !status.success() {
        return Err(anyhow::anyhow!("FFmpeg DASH conversion failed"));
    }
    Ok(())
}

Note: For production usage, consider adding more robust input validation and progress reporting during conversion.
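As a starting point for such validation, here is a hedged sketch of a pre-flight check (the function name and the extension list are illustrative, not exhaustive):

```rust
use std::path::Path;

/// Minimal pre-flight validation: the input must exist and carry a
/// recognized video extension.
fn validate_input(input: &str) -> Result<(), String> {
    let path = Path::new(input);
    if !path.is_file() {
        return Err(format!("input file not found: {input}"));
    }
    match path.extension().and_then(|ext| ext.to_str()) {
        Some("mp4" | "mov" | "mkv" | "webm") => Ok(()),
        _ => Err(format!("unsupported input extension: {input}")),
    }
}

fn main() {
    // A missing file is rejected before FFmpeg is ever spawned.
    match validate_input("no_such_file.mp4") {
        Ok(()) => println!("input looks usable"),
        Err(err) => println!("rejected: {err}"),
    }
}
```

Calling this at the top of each conversion function turns a cryptic FFmpeg failure into an immediate, descriptive error.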

Building a simple streaming server in Rust

After converting your videos, you can serve the HLS or MPEG-DASH content using a basic HTTP server built with the warp crate. warp::fs::dir infers the Content-Type of each file from its extension, which covers .m3u8 playlists, .mpd manifests, and media segments, so the example only needs to add CORS:

use warp::Filter;

#[tokio::main]
async fn main() {
    let cors = warp::cors()
        .allow_any_origin()
        .allow_methods(vec!["GET", "HEAD", "OPTIONS"]);

    // Serve ./output as static files; Content-Type is guessed per file.
    let routes = warp::fs::dir("./output").with(cors);

    println!("Starting server at http://127.0.0.1:3030/");
    warp::serve(routes)
        .run(([127, 0, 0, 1], 3030))
        .await;
}

Testing the streaming setup

To test your streaming solution:

  1. Convert a video to HLS or MPEG-DASH format by invoking the conversion functions. For example:

    fn main() -> Result<()> {
        init_ffmpeg()?;
        convert_to_hls("input.mp4", "./output/hls")?;
        convert_to_mpeg_dash("input.mp4", "./output/dash")?;
        println!("Conversion complete. Starting server...");
        // Optionally, start the streaming server here or in a separate process
        Ok(())
    }
    
  2. Run your streaming server:

    cargo run
    
  3. Use a compatible player, such as VLC or the hls.js and dash.js web libraries, to play the stream from:

    • HLS: http://localhost:3030/hls/master.m3u8
    • MPEG-DASH: http://localhost:3030/dash/manifest.mpd

Conclusion

This DevTip demonstrated how to convert videos to HLS and MPEG-DASH formats using Rust and FFmpeg. We covered generating playlists and manifest files and setting up a basic streaming server for adaptive streaming. For a more robust solution, consider the following enhancements:

  • Encode videos at multiple bitrates and resolutions for truly adaptive streaming.
  • Experiment with advanced FFmpeg settings to optimize video quality.
  • Add support for additional features such as multiple audio tracks and subtitles.
  • Secure your streaming setup with HTTPS and token-based authentication.
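The first of these, multi-bitrate encoding, mostly comes down to extending the FFmpeg argument list. As a hedged sketch, a helper could build the arguments for a two-rendition HLS ladder (the bitrates, resolutions, and file names are example values only):

```rust
/// Builds illustrative FFmpeg arguments for a two-rendition HLS ladder.
/// Each rendition needs its own video and audio mapping, and
/// -var_stream_map pairs the mapped streams into variants.
fn hls_ladder_args(input: &str, output_dir: &str) -> Vec<String> {
    [
        "-i", input,
        "-map", "0:v:0", "-map", "0:a:0",
        "-map", "0:v:0", "-map", "0:a:0",
        "-c:v", "libx264", "-c:a", "aac",
        "-b:v:0", "2M", "-s:v:0", "1920x1080",
        "-b:v:1", "1M", "-s:v:1", "1280x720",
        "-var_stream_map", "v:0,a:0 v:1,a:1",
        "-master_pl_name", "master.m3u8",
        "-hls_time", "4",
        "-hls_playlist_type", "vod",
        "-f", "hls",
    ]
    .iter()
    .map(|arg| arg.to_string())
    // The positional output is the variant playlist pattern.
    .chain(std::iter::once(format!("{output_dir}/stream_%v.m3u8")))
    .collect()
}

fn main() {
    let args = hls_ladder_args("input.mp4", "./output/hls");
    println!("prepared {} ffmpeg arguments", args.len());
}
```

Keeping the argument list in a plain function like this also makes the encoding ladder easy to unit-test without invoking FFmpeg at all.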

If you are looking for a managed solution for video encoding and adaptive streaming, Transloadit offers powerful video processing capabilities that scale to meet your needs.