
Exfiltration of data from the Netflix of South India

Updated: Nov 13, 2018

The whole software industry is undergoing a paradigm shift with the wide-scale adoption of cloud-native architecture. Even though the DevOps community has produced a wide range of tools to support this transition, one factor that is often overlooked is the threat model. Below is a blog post from our consultant in which a security misconfiguration in a CDN led to the exfiltration of content from a VOD streaming service provider.

 

After cuddling with Chrome Developer Tools for a while, we could see that they were using a basic Akamai setup. The goal was to see whether it was possible to download high-quality content directly from the CDN (not from the local browser cache).


The web player received a set of index_x_av.m3u8 playlist files from the CDN, index_4_av.m3u8 being the highest resolution and index_0_av.m3u8 the lowest. Opening one in VLC loaded the stream, but the goal was to download the movie. Each index_x_av.m3u8 file listed URLs for all the segments of the movie. A simple bit of wget magic should do it, right?
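For context, a playlist like index_4_av.m3u8 is just a plain-text file of segment URLs. A rough sketch of what such a file might contain, with the segment naming taken from the URLs later in this post (the exact tags and durations here are assumptions):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/segment1_4_av.ts
#EXTINF:10.0,
http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/segment2_4_av.ts
...
#EXT-X-ENDLIST
```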


Since the URLs followed the same pattern across all the segments, I didn't even need to parse them out of the playlist file. The bash script below generated URLs for every segment of the movie and then attempted to download each one.

#!/bin/bash
echo "Generating Links..."
for i in $(seq 1 1013); do
  echo "http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/segment${i}_4_av.ts" >> links.txt
done
echo "Initiating Download.. Fasten your seat belts.."
wget -i ./links.txt
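As an aside, a slightly more robust sketch would pull the segment URLs straight out of the playlist instead of generating them, so a change in the numbering scheme wouldn't break anything. The here-doc below merely stands in for a downloaded index_4_av.m3u8; its contents are hypothetical:

```shell
# Stand-in for the playlist fetched from the CDN (hypothetical contents).
cat > index_4_av.m3u8 <<'EOF'
#EXTM3U
#EXTINF:10.0,
http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/segment1_4_av.ts
#EXTINF:10.0,
http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/segment2_4_av.ts
EOF

# HLS tags and comments start with '#'; every other line is a segment URL.
grep -v '^#' index_4_av.m3u8 > links.txt
# wget -i links.txt   # then download exactly as before
```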

We watched wget painfully download each segment at 40 kbps. Good ol' Madras days, eh? Bummer! It turns out Akamai throttles you hard if wget isn't sending a regular browser User-Agent. That's an easy fix, though; the command below sorted it out, and we were back to first-world speed.

wget -d --header="User-Agent: Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.97 Safari/537.11"  --header="Accept-Encoding: compress, gzip" -i links.txt 

Now that all 1013 segments were downloaded to our local disk, what remained was stitching them all together. ffmpeg is your one-stop shop for all things video. We spent hours trying to find the right arguments/codec for joining these segments, when we stumbled upon the -i option, which basically works like wget here: you pass ffmpeg the playlist URL, and it downloads and stitches the segments on the fly.
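For what it's worth, there is also a purely local route: MPEG-TS is designed so that segments can be joined by simple byte concatenation, so a plain cat over the downloaded files (filenames assumed from the script above) followed by an optional ffmpeg remux would also work. A sketch, using three tiny stand-in files just to illustrate the mechanics:

```shell
# Stand-ins for the real downloaded segments (three dummy files for
# illustration; the real run would cover segment1 through segment1013).
for i in $(seq 1 3); do
  printf 'payload-%d;' "$i" > "segment${i}_4_av.ts"
done

# MPEG-TS segments concatenate byte-for-byte into one continuous stream.
for i in $(seq 1 3); do
  cat "segment${i}_4_av.ts"
done > joined.ts

# ffmpeg -i joined.ts -c copy fullmovie.ts   # optional clean remux
```

With the real segments, the loop bound would be 1013 and the result handed to ffmpeg for the remux.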

ffmpeg -i http://xvh.akamaihd.net/i/Movie_Name/5.1/YA_1080,.mp4.csmil/index_4_av.m3u8 -t 10122 -c copy fullmovie.ts

The User-Agent issue struck back again, and this time the User-Agent handling in ffmpeg seemed to be broken.

After a while it occurred to us that, since we already had all the segments on our local disk, we could simply run ffmpeg -i on the playlist file after rewriting its URLs to point at our local copies, hosted on a local webserver. That way we could escape Akamai's User-Agent bashing.

After our modification, the playlist file looked like this:
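Roughly like so, with every CDN URL rewritten to point at the local webserver on port 1337 (the segment names and tags here are assumed):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
http://127.0.0.1:1337/segment1_4_av.ts
#EXTINF:10.0,
http://127.0.0.1:1337/segment2_4_av.ts
...
#EXT-X-ENDLIST
```

Something like `python3 -m http.server 1337`, run from the segment directory, is enough to serve both the playlist and the segments.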

And then running

ffmpeg -i http://127.0.0.1:1337/index_4_av.m3u8 -t 10122 -c copy fullmovie.ts

did the job for us.
