System Overview
- The prototype uses the HTC Vive to track a streamer's position within a simple wooden frame.
- Seven webcams are mounted at different positions on the frame, and the active camera is triggered by the streamer's tracked position (sketched below this list).
- Moving to the standing table in the middle of the volume switches the livestream to a digital feed, driven by the streamer's movement through digital space.
- The space is annotated on the floor to communicate the different camera zones to the streamer.
- The livestream is projected onto the wall, giving the streamer feedback on the currently active camera angle.
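As a rough illustration of that trigger logic, here is a minimal sketch of mapping a tracked floor position to a camera zone. The zone bounds and camera names are hypothetical placeholders, not the values used in the prototype, and the sketch ignores the Vive's actual coordinate frame.

```python
# Minimal sketch: map a tracked (x, z) floor position to a camera zone.
# Zone rectangles and camera names are hypothetical placeholders.
CAMERA_ZONES = {
    "cam_front": (0.0, 0.0, 1.5, 1.0),  # (min_x, min_z, max_x, max_z) in meters
    "cam_side":  (1.5, 0.0, 3.0, 1.0),
    "cam_desk":  (0.0, 1.0, 1.5, 2.0),
    "cam_wide":  (1.5, 1.0, 3.0, 2.0),
}

def active_camera(x, z, fallback="cam_wide"):
    """Return the camera whose taped floor zone contains the tracked position."""
    for camera, (min_x, min_z, max_x, max_z) in CAMERA_ZONES.items():
        if min_x <= x < max_x and min_z <= z < max_z:
            return camera
    return fallback  # outside every taped zone: hold a wide shot

print(active_camera(0.7, 0.4))  # -> cam_front
```

In the prototype the equivalent mapping lives in Unity, working from the Vive's tracked pose rather than hard-coded rectangles.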
Design Details & Questions
The design details of this piece of furniture reflect specific questions I wanted to explore.
1. What is the relationship between the streamer and the set?
A set is usually a fixed space, but in a live-streaming context it is often a mobile and/or personal one. The wooden frame implies the outline of a static room, but the casters on the frame indicate that the frame itself is a piece of furniture.
2. What is the interaction between editing algorithm and the streamer?
Because control of the edit is handed to an automated system, the streamer has to work within the rules of the algorithm governing that system to take back control of the edit. The projection of the livestream on the wall is one way for the streamer to learn those rules as they use the furniture.
3. Does the underlying editing system affect the performance?
The taped floors indicate the different trigger zones of the furniture. By knowing the bounds of these trigger zones, the streamer starts to perform for the edit.
4. What is the relationship between physical and digital space in the context of live-streaming?
The prototype centers around a desk with a laptop sitting on it. Moving towards the table switches the trigger system from a spatial one to a digital one. Live-streaming is often a blend between virtual and physical space, and this was one way to explore the transitions between those spaces.
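One way to picture that handoff is a simple proximity check around the desk; in this sketch the desk position and switching radius are assumptions rather than measured values.

```python
import math

DESK_POSITION = (1.2, 1.8)  # assumed (x, z) of the standing desk, in meters
DESK_RADIUS = 0.6           # assumed distance at which the digital feed takes over

def trigger_source(x, z):
    """Return which trigger system drives the edit at this tracked position."""
    if math.dist((x, z), DESK_POSITION) < DESK_RADIUS:
        return "digital"   # near the laptop: the edit follows movement through digital space
    return "spatial"       # elsewhere: the edit follows the taped camera zones
```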
Process
This project was driven entirely by technical exploration. It started as a quick prototype that turned my personal studio space into a multi-cam livestream space. I used three cameras, each of which highlighted a different window into the set.
Then I started to explore spatial triggers, tracked with an HTC Vive controller, rather than purely digital ones to drive the choice of camera angle.
I then worked to integrate a full system that would run cohesively across multiple devices, while refining the visual language of the piece.
Some of the key technical and design challenges were:
- Mapping the spatial zones in Unity to specific camera angles (see the sketch after this list)
- Using OBS to incorporate a digital feed from a different computer
- Wiring!
- Refining the visual language of the piece
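One plausible glue layer for the zone-to-camera mapping is remote scene switching in OBS: each camera angle (and the digital feed from the second computer) lives in its own scene, and the zone logic asks OBS to cut between them. The sketch below assumes the obs-websocket plugin and the obs-websocket-py client; the host, port, password, scene names, and the exact request name (which differs between obs-websocket 4.x and 5.x) are assumptions, not a record of how the prototype was actually wired.

```python
# Sketch: push the chosen camera zone into OBS as a scene switch.
# Assumes OBS has the obs-websocket plugin enabled and one scene per camera
# (plus a scene holding the digital feed from the other computer).
from obswebsocket import obsws, requests  # pip install obs-websocket-py

ws = obsws("localhost", 4455, "password")  # placeholder connection details
ws.connect()

def switch_scene(scene_name):
    """Ask OBS to cut to the scene holding the given camera or digital feed."""
    # Request name follows the obs-websocket 5.x protocol;
    # older 4.x setups use requests.SetCurrentScene instead.
    ws.call(requests.SetCurrentProgramScene(sceneName=scene_name))

switch_scene("cam_front")  # e.g. the zone returned by active_camera() above
ws.disconnect()
```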