Service worker caching, playbackRate and Blob URLs for audio and video on Chrome for Android

Sometimes good things have boring names.

Case in point: the Unified Media Pipeline, UMP for short.

This may sound like a sinister Soviet-era directive, but in fact it's an important step towards consistent cross-platform audio and video delivery. Chrome on Android will now use the same media stack as desktop Chrome, rather than relying on the underlying platform implementation.

UMP enables you to do a lot:

  • Cache audio and video with service workers, since media delivery is now implemented directly within Chrome rather than being passed off to the Android media stack.
  • Use blob URLs for audio and video elements.
  • Set playbackRate for audio and video.
  • Pass MediaStreams between Web Audio and MediaRecorder.
  • Develop and maintain media apps more easily across devices — media works the same on desktop and Android.
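As an example of the Web Audio and MediaRecorder point above, here's a minimal sketch of wiring the two together. `recordGraph` is a hypothetical helper, and the oscillator is just an illustrative source; the key step is that a MediaStreamAudioDestinationNode exposes the graph's output as an ordinary MediaStream:

```javascript
// Sketch: pass a Web Audio graph's output to MediaRecorder via a MediaStream.
function recordGraph(audioCtx) {
    // The destination node's .stream property is a plain MediaStream
    // that MediaRecorder can consume directly.
    var dest = audioCtx.createMediaStreamDestination();
    var osc = audioCtx.createOscillator(); // illustrative audio source
    osc.connect(dest);
    osc.start();
    var recorder = new MediaRecorder(dest.stream);
    recorder.start();
    return recorder; // call recorder.stop() later to flush recorded data
}
```

Calling `recorder.stop()` fires `dataavailable`, at which point the recorded chunks can be assembled into a Blob, as shown later in this post.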

UMP took some hard engineering work to implement:

  • A new caching layer for improved power performance.
  • A new MediaCodec-based video decoder hosted in Chrome's GPU process.
  • Lots of testing and iteration on different devices.

Here's a demo of video caching with a service worker:

Screenshot of video playback.

Caching the video file and the video poster image is as simple as adding their paths to the list of URLs to prefetch:

<video controls poster="static/poster.jpg">
    <source src="static/video.webm" type="video/webm" />
    <p>This browser does not support the video element.</p>
</video>

var urlsToPrefetch = [
    'static/video.webm', 'static/poster.jpg'
];
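For completeness, here's a minimal sketch of a service worker that prefetches those files at install time and then serves them cache-first. `CACHE_NAME` and the handler names are assumptions for illustration, not the demo's actual code:

```javascript
// Sketch: cache-first service worker for the media files above.
var CACHE_NAME = 'media-cache-v1'; // hypothetical cache name
var urlsToPrefetch = ['static/video.webm', 'static/poster.jpg'];

function onInstall(event) {
    event.waitUntil(
        caches.open(CACHE_NAME).then(function(cache) {
            return cache.addAll(urlsToPrefetch); // prefetch video and poster
        })
    );
}

function onFetch(event) {
    event.respondWith(
        caches.match(event.request).then(function(cached) {
            return cached || fetch(event.request); // cache-first, network fallback
        })
    );
}

// Register the handlers when running inside a service worker scope.
if (typeof self !== 'undefined' && 'caches' in self) {
    self.addEventListener('install', onInstall);
    self.addEventListener('fetch', onFetch);
}
```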

The inability to change playbackRate on Android was a long-standing bug; UMP fixes it. In the demo, playbackRate is set to 2. Try it out!

Screenshot of video playback with playbackRate set to 2.
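Setting the rate itself is a one-liner on the media element. A minimal sketch — `setSpeed` is a hypothetical helper, not the demo's code:

```javascript
// Sketch: change playback speed of an <audio> or <video> element.
// With UMP this now works on Chrome for Android as well as desktop.
function setSpeed(mediaElement, rate) {
    mediaElement.playbackRate = rate; // e.g. 2 for double speed, 0.5 for half
    return mediaElement.playbackRate;
}

// In a page: setSpeed(document.querySelector('video'), 2);
```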

UMP enables blob URLs for media elements — which means that, for example, you can now play back a video recorded using the MediaRecorder API in a video element on Android:

Screenshot of playback in Chrome on Android of a video recorded using the MediaRecorder API

Here's the relevant code:

var recordedBlobs = [];

mediaRecorder.ondataavailable = function(event) {
    if (event.data && event.data.size > 0) {
        recordedBlobs.push(event.data);
    }
};

function play() {
    var superBuffer = new Blob(recordedBlobs, {type: 'video/webm'});
    recordedVideo.src = window.URL.createObjectURL(superBuffer);
}

In another demo, video is stored using the File APIs, then played back using a Blob URL:

function writeToFile(fileEntry, blob) {
    fileEntry.createWriter(function(fileWriter) {
        fileWriter.onwriteend = function() {
            log('Write completed');
        };
        fileWriter.onerror = function(e) {
            log('Write failed: ' + e.toString());
        };
        fileWriter.write(blob);
    }, handleError);
}

function readFromFile(fullPath) {
    window.fileSystem.root.getFile(fullPath, {}, function(fileEntry) {
        fileEntry.file(function(file) {
            var reader = new FileReader();
            reader.onloadend = function() {
                video.src = URL.createObjectURL(new Blob([this.result]));
            };
            reader.readAsArrayBuffer(file);
        }, handleError);
    }, handleError);
}

The Unified Media Pipeline has also been enabled for Media Source Extensions (MSE) and Encrypted Media Extensions (EME).

This is another step towards unifying mobile and desktop Chrome. You don't need to change your code, but building a consistent media experience across desktop and mobile should now be easier, since the media stack is the same across platforms. Debugging with Chrome DevTools? Mobile emulation now uses the 'real' audio and video stack.

If you experience problems as a result of the Unified Media Pipeline, please file issues on the implementation bug.


Relevant bugs

There are a couple of known bugs affecting <video>, service workers and the Cache Storage API.

Browser support

  • Enabled by default in Chrome 52 and above.