Most of the gotchas consist of what Chromecast does and doesn’t support. We will go through each one of them.
Even though subtitles are supported in m3u8 as chunks, Chromecast unfortunately doesn’t support them chunked, so you need a subtitle file that covers the whole length of the video. What that means is that you can’t start downloading a video file with embedded subtitles and immediately start streaming it to your TV. Well, you can, but without subtitles. To get the complete subtitles you will usually need the whole video downloaded first.
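As a sketch of what that looks like in practice, here is a minimal Go helper that shells out to ffmpeg to pull an embedded subtitle track out of a fully downloaded file and save it as a single WebVTT file, which Chromecast can load as a side-loaded track. The function name, paths, and the choice of the first subtitle stream are all illustrative, and it assumes ffmpeg is on your PATH:

```go
package main

import (
	"fmt"
	"log"
	"os/exec"
)

// extractSubtitles pulls the first subtitle track out of a fully
// downloaded video and writes it as a single WebVTT file.
// It only works once the whole input file is on disk.
func extractSubtitles(videoPath, vttPath string) error {
	cmd := exec.Command("ffmpeg",
		"-i", videoPath, // input must be the complete file
		"-map", "0:s:0", // first subtitle stream (illustrative)
		"-f", "webvtt",
		vttPath,
	)
	if out, err := cmd.CombinedOutput(); err != nil {
		return fmt.Errorf("ffmpeg failed: %v: %s", err, out)
	}
	return nil
}

func main() {
	// Placeholder paths for the example.
	if err := extractSubtitles("movie.mkv", "movie.vtt"); err != nil {
		log.Fatal(err)
	}
}
```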
As you download the file you can pipe it into ffmpeg. Luckily for us, ffmpeg happily accepts piped data, which means you don’t need to save the downloaded file to disk; you only need to store ffmpeg’s output, which makes things a bit faster.
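Here is a rough sketch of that piping in Go, assuming a plain HTTP download URL and ffmpeg on the PATH; the codec flags and paths are placeholders rather than the exact settings the real program uses:

```go
package main

import (
	"log"
	"net/http"
	"os"
	"os/exec"
)

// transcodeFromURL streams the download straight into ffmpeg's stdin,
// so only ffmpeg's output is ever written to disk.
func transcodeFromURL(url, outPath string) error {
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	cmd := exec.Command("ffmpeg",
		"-i", "pipe:0", // read the container from stdin
		"-c:v", "libx264", // Chromecast-friendly video codec (illustrative)
		"-c:a", "aac", // Chromecast-friendly audio codec (illustrative)
		outPath,
	)
	cmd.Stdin = resp.Body  // pipe the download directly into ffmpeg
	cmd.Stderr = os.Stderr // ffmpeg logs its progress to stderr
	return cmd.Run()
}

func main() {
	// Placeholder URL and output path for the example.
	if err := transcodeFromURL("http://example.com/movie.mkv", "converted.mp4"); err != nil {
		log.Fatal(err)
	}
}
```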
There are .mkv and .mp4 files that keep their stream information at the end of the file (for example, the MP4 moov atom). I’ve rarely found them online, but for such cases you will need to download the whole file before you can start converting it.
The media status message from the Chromecast protocol contains the timestamp the user is currently at. It is fairly reliable and lets us build features such as resuming playback from the timestamp where you last stopped watching the video.
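As an illustration, here is a trimmed-down sketch of pulling the playback position out of a MEDIA_STATUS payload from the Cast media namespace; the struct covers only a few of the fields the real message carries, and the helper names are made up for the example:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// mediaStatus mirrors the parts of the Chromecast MEDIA_STATUS payload
// we care about; the real message carries many more fields.
type mediaStatus struct {
	Type   string `json:"type"`
	Status []struct {
		MediaSessionID int     `json:"mediaSessionId"`
		CurrentTime    float64 `json:"currentTime"` // seconds into the video
		PlayerState    string  `json:"playerState"`
	} `json:"status"`
}

// resumePositionFrom extracts the playback position so it can be stored
// and used as the start time the next time the same video is cast.
func resumePositionFrom(payload []byte) (float64, bool) {
	var msg mediaStatus
	if err := json.Unmarshal(payload, &msg); err != nil || msg.Type != "MEDIA_STATUS" || len(msg.Status) == 0 {
		return 0, false
	}
	return msg.Status[0].CurrentTime, true
}

func main() {
	// Example payload, trimmed down from a real MEDIA_STATUS message.
	raw := []byte(`{"type":"MEDIA_STATUS","status":[{"mediaSessionId":1,"currentTime":734.2,"playerState":"PAUSED"}]}`)
	if t, ok := resumePositionFrom(raw); ok {
		fmt.Printf("resume at %.1fs\n", t)
	}
}
```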
If we want to build a web playback experience, video.js is perfect for it: the same video we convert for casting to the Chromecast plays in video.js without much config or hassle.
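For a concrete example, here is a minimal sketch of a Go server that serves such a converted file together with a bare-bones video.js player page; the CDN version, port, and file paths are assumptions you would adjust:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// A minimal player page using video.js from its public CDN.
// The CDN version and the /video.mp4 path are placeholders.
const playerHTML = `<!DOCTYPE html>
<html>
<head>
  <link href="https://vjs.zencdn.net/7.20.3/video-js.css" rel="stylesheet">
</head>
<body>
  <video id="player" class="video-js" controls preload="auto" width="960" height="540">
    <source src="/video.mp4" type="video/mp4">
  </video>
  <script src="https://vjs.zencdn.net/7.20.3/video.min.js"></script>
  <script>videojs('player');</script>
</body>
</html>`

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, playerHTML)
	})
	// Serve the same converted file we would cast to the Chromecast.
	http.HandleFunc("/video.mp4", func(w http.ResponseWriter, r *http.Request) {
		http.ServeFile(w, r, "converted.mp4") // placeholder path
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```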
I have written a small open source Go program that casts video from your local machine to your Android TV. If you want to use it, you can find it on GitHub as gochromecast.
We’ve learned some of the gotchas of building a media server, we’ve learned how the Chromecast protocol works, and we’ve dabbled a little bit in ffmpeg. With that we conclude this series of writings.