Tiri, Lin, and I really wanted to use the 9th floor studio (or a similar setup; we ended up using the micro studio) and decided to explore the possibility of livestreaming 360 video in that space. The idea we thought would be most interesting was also the first that came to mind: presenting a how-to for what we were doing in the form of a Coding Train episode.
Our final production pipeline ended up being pretty simple: Theta S (360 camera) -> OBS (streaming client) -> YouTube.

Before getting to this point we had tried many different avenues. Our first approach was to embed the stream into our own webpage and use a JavaScript library called A-Frame to VR-ify the stream content. The first problem with this was cross-origin header issues while embedding. We spoke to Rubin about this, and he explained that YouTube delivers streams via GET as HLS, which Chrome doesn't support natively; more importantly, solving the cross-origin issue would have required each viewer to install browser extensions, which isn't ideal. We were able to embed Twitch streams, but we couldn't block the ads on load and couldn't get a proper .mp4 or .m3u8 (HLS) stream URL to feed A-Frame, because Twitch embeds a webpage via iframe, effectively obscuring the actual video URL. At the end of the day, YouTube has a built-in VR feature and essentially plug-and-play 360-video streaming capabilities, so it made no sense to build our own custom implementation.
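For reference, here's roughly what that abandoned A-Frame approach looks like: an equirectangular video mapped onto the inside of a sphere with a-videosphere. The stream URL and the A-Frame version below are placeholders, and the crossorigin attribute on the video is exactly where our CORS and HLS problems showed up.

```html
<html>
  <head>
    <!-- A-Frame version here is illustrative -->
    <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <!-- crossorigin is required to use the video as a WebGL texture;
             this is where the cross-origin header errors surfaced.
             Also, a raw .m3u8 (HLS) URL like this won't play natively in Chrome. -->
        <video id="stream360"
               src="https://example.com/live/stream.m3u8"
               autoplay muted loop playsinline
               crossorigin="anonymous"></video>
      </a-assets>
      <!-- Map the equirectangular 360 video onto the inside of a sphere -->
      <a-videosphere src="#stream360"></a-videosphere>
    </a-scene>
  </body>
</html>
```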
The livestream imagery is composed of several image layers chroma-keyed together using multiple green screens.
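OBS handles all of the keying and compositing for us, but for anyone curious what "chroma-keyed together" means under the hood, here's a toy sketch (not part of our actual pipeline): pixels close to the key color become transparent, and the keyed layers are then drawn back to front over the background.

```js
// Toy illustration of chroma keying + layering, NOT our production code
// (OBS does this internally with its chroma key filter and scene layers).

// Make pixels close to pure green transparent.
function keyOutGreen(imageData, threshold = 100) {
  const d = imageData.data;
  for (let i = 0; i < d.length; i += 4) {
    const r = d[i], g = d[i + 1], b = d[i + 2];
    // crude distance from pure green (0, 255, 0)
    const dist = Math.hypot(r, 255 - g, b);
    if (dist < threshold) d[i + 3] = 0; // zero alpha = transparent
  }
  return imageData;
}

// Draw the background, then stack each keyed green-screen layer on top.
// background and layers are <img> or <canvas> elements.
function compositeLayers(ctx, background, layers) {
  const { width, height } = ctx.canvas;
  ctx.drawImage(background, 0, 0, width, height);
  for (const layer of layers) {
    const off = document.createElement('canvas');
    off.width = width;
    off.height = height;
    const octx = off.getContext('2d');
    octx.drawImage(layer, 0, 0, width, height);
    octx.putImageData(keyOutGreen(octx.getImageData(0, 0, width, height)), 0, 0);
    ctx.drawImage(off, 0, 0); // transparent pixels let the layers below show through
  }
}
```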
Our first test, simply getting a livestream going:
Then managing a 360 livestream:
The final output (viewer’s POV):
Final output (flattened, OBS POV):