What are DASH and HLS?

DASH (Dynamic Adaptive Streaming over HTTP) and HLS (HTTP Live Streaming) are both popular streaming protocols used to deliver multimedia content over the internet, but they have some differences in how they operate.

DASH:

DASH is an adaptive bitrate streaming protocol that works by breaking the content into small segments and delivering them over HTTP. It allows the client device to request the appropriate bitrate and resolution based on factors like available bandwidth and device capabilities. DASH is codec-agnostic, meaning it can work with various video and audio codecs, including H.264, H.265, VP9, AAC, and others.

Pros:

  1. Codec Agnostic: Supports a wide range of codecs, allowing for flexibility in content creation and delivery.
  2. Dynamic Adaptation: Can adapt to changing network conditions by adjusting bitrate and resolution on-the-fly.
  3. Standardization: DASH is an open international standard (ISO/IEC 23009-1) developed and maintained by MPEG.

Cons:

  1. Complexity: Implementing DASH can be more complex compared to HLS due to its flexibility and various options.
  2. Browser Support: Historically, DASH has had less consistent support across different web browsers compared to HLS, although this has improved over time.

HLS:

HLS is another adaptive bitrate streaming protocol that breaks content into smaller segments and delivers them over HTTP. It was developed by Apple and is widely used for streaming video on iOS devices and web browsers that support it. HLS typically uses the .m3u8 playlist format and works with codecs like H.264 and AAC.

Pros:

  1. Widespread Support: HLS is widely supported across various devices and platforms, including iOS devices, web browsers, Android devices (with third-party players), smart TVs, and more.
  2. Simplicity: HLS is relatively straightforward to implement, especially for iOS app development.
  3. Security: HLS supports encryption and DRM (Digital Rights Management) for content protection.

Cons:

  1. Apple Ecosystem Dependency: While HLS works well within the Apple ecosystem, it may not be as seamless on non-Apple devices.
  2. Latency: HLS typically has higher latency than some other streaming protocols, which can make it less suitable for near-real-time use cases such as interactive or low-latency live streams.

In summary, both DASH and HLS are widely used for streaming video over the internet, each with its own set of pros and cons. The choice between them often depends on factors such as target platforms, codec preferences, and specific requirements of the streaming application.

How are they configured and implemented on the front end and back end?

HLS (HTTP Live Streaming):

Front-end Player: HLS playback on the front end typically involves using a JavaScript-based player library that supports HLS (a minimal hls.js setup is sketched after this list), such as:

  • hls.js: A popular JavaScript library that enables HLS playback in browsers without native HLS support.
  • Video.js with HLS.js plugin: Video.js is a customizable HTML5 video player, and the HLS.js plugin adds HLS support.
  • MediaElement.js with HLS provider plugin: MediaElement.js is another HTML5 video/audio player library that supports plugins for additional functionality, including HLS playback.
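
For example, here is a minimal hls.js setup; the manifest URL is a placeholder for wherever your playlist is hosted:

html
<video id="video" controls width="640" height="360"></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<script>
  // Hypothetical manifest URL; replace with your own playlist location.
  const src = 'https://example.com/hls/playlist.m3u8';
  const video = document.getElementById('video');
  if (Hls.isSupported()) {
    // MSE-based playback for browsers without native HLS support
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari and iOS can play HLS natively
    video.src = src;
  }
</script>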

Back-end Service: To generate HLS content on the back end, you need to encode your video files into multiple bitrate/resolution renditions and create an HLS manifest file (.m3u8). You can use tools like FFmpeg or specialized transcoding services to perform this task.

Example FFmpeg command to create HLS content from a video file:

bash
ffmpeg -i input.mp4 -vf "scale=w=1280:h=720:force_original_aspect_ratio=decrease" -c:a aac -ar 48000 -c:v h264 -profile:v main -crf 20 -sc_threshold 0 -g 48 -keyint_min 48 -hls_time 4 -hls_playlist_type vod -b:v 500k -maxrate 856k -bufsize 1500k -b:a 96k -hls_segment_filename '720p_%03d.ts' 720p.m3u8

This command generates HLS content with a 720p resolution and a 500kbps video bitrate.
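
The command above produces a single 720p rendition and its media playlist (720p.m3u8). For adaptive streaming you would repeat the encode for each rendition and then reference all of them from a master playlist; a minimal sketch (the 480p rendition and the BANDWIDTH figures are illustrative) might look like this:

m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=952000,RESOLUTION=1280x720
720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=854x480
480p.m3u8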

Example HTML code to embed HLS content (this markup assumes the Video.js player is loaded on the page; a bare <video> tag plays HLS natively only in Safari/iOS and a handful of other browsers):

html
<video id="my-video" class="video-js" controls preload="auto" width="640" height="360" data-setup="{}">
<source src="https://example.com/hls/playlist.m3u8" type="application/x-mpegURL">
</video>

Here’s a basic example of how you might serve HLS content on the back end using NGINX:

nginx
http {
    ...
    server {
        listen 80;
        server_name localhost;

        location /hls {
            alias /path/to/your/hls/files;

            # Map HLS file extensions to the correct MIME types
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }

            add_header Cache-Control no-cache;
            add_header Access-Control-Allow-Origin *;
            add_header Access-Control-Expose-Headers Content-Length;
            add_header Access-Control-Allow-Headers Range;

            # Answer CORS preflight requests
            if ($request_method = 'OPTIONS') {
                add_header Access-Control-Allow-Methods 'GET, OPTIONS';
                add_header Content-Length 0;
                add_header Content-Type 'text/plain; charset=utf-8';
                return 204;
            }
        }
    }
}

In this example:

  • Replace /path/to/your/hls/files with the actual path to the directory containing your HLS files.
  • The types block maps file extensions to MIME types. This is necessary for proper handling of HLS files by the client’s browser.
  • The Cache-Control: no-cache header prevents clients and proxies from serving stale playlists, while the Access-Control-Allow-Origin, Access-Control-Expose-Headers, and Access-Control-Allow-Headers headers enable CORS (Cross-Origin Resource Sharing) support.
  • The if block handles preflight requests for CORS support.

Once you have configured NGINX for HLS streaming, you can access your HLS content via URLs pointing to the HLS manifest files (.m3u8 files) and the individual media segments (.ts files).
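
As a quick sanity check (the file names below follow the earlier examples and are placeholders for your own), you can request the manifest and a segment with curl and confirm that the MIME types match the NGINX configuration:

bash
# The playlist should be served as application/vnd.apple.mpegurl
curl -I http://localhost/hls/playlist.m3u8

# Segments should be served as video/mp2t
curl -I http://localhost/hls/720p_000.ts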

Here’s an example of a basic HLS manifest file (playlist.m3u8):

m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.009,
segment0.ts
#EXTINF:9.009,
segment1.ts
#EXTINF:9.009,
segment2.ts
#EXT-X-ENDLIST

This .m3u8 file specifies the duration and sequence of the media segments (segment0.ts, segment1.ts, etc.). The client’s HLS player will parse this manifest file and request the individual media segments for playback.

To use HLS streaming on the client-side, you would provide the URL to the .m3u8 manifest file to your HLS-compatible player library (e.g., video.js with HLS.js plugin, hls.js). The player will then handle the HLS playback, fetching the media segments and managing adaptive streaming based on network conditions and device capabilities.
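
For instance, with Video.js (version 7 or later bundles HLS support via videojs/http-streaming, so no separate plugin is needed), the markup shown earlier can be driven from JavaScript; the CDN paths below are illustrative:

html
<link href="https://vjs.zencdn.net/8.10.0/video-js.css" rel="stylesheet">
<script src="https://vjs.zencdn.net/8.10.0/video.min.js"></script>
<script>
  // Attach Video.js to the <video id="my-video"> element defined above
  const player = videojs('my-video');
  // Point the player at the HLS manifest; the bundled engine handles
  // segment fetching and adaptive bitrate switching
  player.src({
    src: 'https://example.com/hls/playlist.m3u8',
    type: 'application/x-mpegURL'
  });
</script>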

DASH (Dynamic Adaptive Streaming over HTTP):

Front-end Player: For DASH playback on the front end, you can use JavaScript-based player libraries that support DASH (a minimal dash.js setup is sketched after this list), such as:

  • Shaka Player: An open-source JavaScript player library for DASH and HLS playback.
  • dash.js: Another popular open-source JavaScript player library specifically designed for DASH streaming.
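
For example, here is a minimal dash.js setup (the manifest URL is a placeholder):

html
<video id="dash-video" controls width="640" height="360"></video>
<script src="https://cdn.dashjs.org/latest/dash.all.min.js"></script>
<script>
  // Create a dash.js player and attach it to the video element;
  // the third argument enables autoplay.
  const url = 'https://example.com/dash/output.mpd';
  const player = dashjs.MediaPlayer().create();
  player.initialize(document.querySelector('#dash-video'), url, true);
</script>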

Back-end Service: To generate DASH content on the back end, similar to HLS, you encode your video files into multiple bitrate/resolution renditions and create a DASH manifest file (.mpd). FFmpeg's dash muxer can produce both the segments and the manifest, and dedicated packagers such as Shaka Packager or MP4Box (GPAC) are also commonly used.

Example FFmpeg command to create DASH content from a video file:

bash
ffmpeg -i input.mp4 -vf "scale=w=1280:h=720:force_original_aspect_ratio=decrease,format=yuv420p" -c:a aac -ar 48000 -b:a 96k -c:v libx264 -x264opts 'keyint=48:min-keyint=48:no-scenecut' -b:v 500k -maxrate 856k -bufsize 1500k -seg_duration 4 -use_template 1 -use_timeline 1 -f dash output.mpd

Example HTML code to embed DASH content (browsers do not play DASH natively, so a player library such as dash.js or Shaka Player must be attached to this element, as in the sketch above):

html
<video id="my-video" controls>
<source src="https://example.com/dash/output.mpd" type="application/dash+xml">
</video>

Summary:

  • Front-end Player: Use JavaScript-based player libraries like hls.js or Shaka Player for HLS and DASH playback in the browser.
  • Back-end Service: Encode your video files into multiple bitrate/resolution renditions and create manifest files (.m3u8 for HLS, .mpd for DASH) using tools like FFmpeg or transcoding services.

Remember to adjust the parameters in the examples (such as bitrate, resolution, and file paths) according to your specific requirements and video content.

Below is a basic example of how you might serve DASH content on the back end using NGINX. Pre-packaged DASH content (the .mpd manifest plus its media segments) can be served as plain static files; the optional ngx_http_mp4_module only adds pseudo-streaming (seeking via start/end query arguments) for ordinary progressive MP4 files and does not itself generate DASH. This example assumes you have already encoded and packaged your video content into multiple bitrates and resolutions suitable for adaptive streaming.

nginx
http {
    ...
    server {
        listen 80;
        server_name localhost;

        location /videos {
            alias /path/to/your/video/files;

            # Serve the DASH manifest and segments with the correct MIME types
            types {
                application/dash+xml mpd;
                video/mp4 mp4 m4s;
            }
            add_header Access-Control-Allow-Origin *;

            # Optional: pseudo-streaming for plain progressive MP4 files
            # (requires ngx_http_mp4_module)
            mp4;
            mp4_buffer_size 1m;
            mp4_max_buffer_size 5m;
        }
    }
}

In this example:

  • Replace /path/to/your/video/files with the actual path to the directory containing your packaged DASH files.
  • The types block maps the .mpd manifest and the media segments to the correct MIME types so that DASH clients can fetch them as ordinary static files.
  • The mp4 directive comes from ngx_http_mp4_module and enables pseudo-streaming of plain progressive MP4 files; it is optional for pre-packaged DASH content. mp4_buffer_size and mp4_max_buffer_size control its buffer sizes and can be adjusted based on your specific requirements and server resources.

Once you have configured NGINX for DASH streaming, you can access your video content via DASH manifest files, typically in the form of .mpd (Media Presentation Description) files. These manifest files describe the available video streams and their characteristics for adaptive streaming clients.

Below is an example of a simple DASH manifest file (video.mpd):

xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" minBufferTime="PT1.500S" type="static" mediaPresentationDuration="PT0H2M3.25S" maxSegmentDuration="PT0H0M2.000S" profiles="urn:mpeg:dash:profile:isoff-live:2011">
  <Period duration="PT0H2M3.25S">
    <AdaptationSet mimeType="video/mp4" codecs="avc1.42c01e" frameRate="24" maxWidth="1280" maxHeight="720" par="16:9" lang="und">
      <!-- $RepresentationID$ keeps the segment names unique per rendition -->
      <SegmentTemplate timescale="1000" media="video-$RepresentationID$-$Number$.mp4" initialization="video-$RepresentationID$-init.mp4" duration="2000"/>
      <Representation id="720p" bandwidth="520000" width="1280" height="720"/>
      <Representation id="480p" bandwidth="210000" width="854" height="480"/>
    </AdaptationSet>
    <AdaptationSet mimeType="audio/mp4" codecs="mp4a.40.2" lang="und">
      <SegmentTemplate timescale="1000" media="audio-$Number$.mp4" initialization="audio-init.mp4" duration="2000"/>
      <Representation id="audio" bandwidth="64000"/>
    </AdaptationSet>
  </Period>
</MPD>

This XML file describes the available video and audio streams for DASH streaming. It includes information such as codec details, bandwidth, resolution, and segment URLs.

To use this manifest file, you would pass its URL to your DASH-compatible player library (e.g., Shaka Player) on the client side. The player will then parse the manifest and handle adaptive streaming, selecting the appropriate video and audio streams based on network conditions and device capabilities.
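
For example, a minimal Shaka Player setup might look like this (the manifest URL and the CDN path/version are placeholders; check the Shaka Player documentation for current releases):

html
<video id="shaka-video" controls width="640" height="360"></video>
<script src="https://ajax.googleapis.com/ajax/libs/shaka-player/4.3.0/shaka-player.compiled.js"></script>
<script>
  // Install polyfills for older browsers, attach the player, and load the manifest
  shaka.polyfill.installAll();
  const video = document.getElementById('shaka-video');
  const player = new shaka.Player(video);
  player.load('https://example.com/dash/video.mpd')
    .catch((error) => console.error('Error loading manifest', error));
</script>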

 

Video tag

HLS (HTTP Live Streaming), DASH (Dynamic Adaptive Streaming over HTTP), and HTML <video> tag streaming are all methods used to deliver video content over the internet, but they differ in their underlying technologies and functionalities:

  1. HLS and DASH Streaming:
    • Adaptive Streaming: HLS and DASH are adaptive streaming protocols, meaning they dynamically adjust the quality of the video stream based on factors such as available bandwidth, device capabilities, and network conditions.
    • Segmented Delivery: Both HLS and DASH break the video content into small segments and deliver them over HTTP. This segmentation allows for smoother playback and better adaptability to changing network conditions.
    • Support for Multiple Bitrates: HLS and DASH support multiple bitrate and resolution renditions of the same video content, allowing viewers to experience the best possible quality based on their connection speed.
    • Wide Compatibility: HLS is supported natively on Apple devices and almost everywhere else through player libraries, while DASH is supported on most platforms via player libraries but is not played natively by Safari/iOS.
  2. HTML <video> Tag Streaming:
    • Native Browser Support: The HTML <video> tag is a standard part of HTML5 and is supported by most modern web browsers without the need for additional plugins or libraries.
    • Single File Delivery: With the HTML <video> tag, video content is typically delivered as a single file (e.g., MP4, WebM) without segmentation (see the example after this list). This can lead to less efficient streaming, especially over unreliable networks.
    • Limited Adaptability: The HTML <video> tag does not natively support adaptive streaming. While some browsers may support adaptive streaming through extensions or proprietary technologies, it’s not as widely supported or standardized as HLS and DASH.
    • Simplicity: Using the HTML <video> tag for streaming is generally simpler and more straightforward compared to implementing HLS or DASH, especially for basic video playback without advanced features like adaptive streaming.
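
For comparison, basic progressive playback with the plain <video> tag looks like this (the file URLs are placeholders); the browser downloads a single file from the first source it can play, with no adaptive switching:

html
<video controls width="640" height="360" preload="metadata">
  <!-- The browser plays the first source it supports -->
  <source src="https://example.com/videos/movie.webm" type="video/webm">
  <source src="https://example.com/videos/movie.mp4" type="video/mp4">
  Your browser does not support the HTML video element.
</video>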

In summary, HLS and DASH streaming offer adaptive bitrate streaming, segmented delivery, and support for multiple bitrates, providing a more robust and flexible solution for delivering video content over the internet. On the other hand, the HTML <video> tag provides a simpler approach to video playback but lacks some of the advanced features and adaptability offered by HLS and DASH.

 
