The following is documented footage of a working prototype.

The visuals are back-projected onto a suspended sheet of frosted acrylic. A light-based distance sensor detects anyone standing in front of it and changes the visuals based on that person’s distance.

The project’s visuals are saved as video files, with a new layer building up every half second over 10 seconds at 30 FPS. Ten seconds at 30 FPS gives a nice round total of 300 frames, which makes the visualisation easier to control.

The video files are encoded with HAP, then loaded into Max MSP. Distance data is read directly from an Arduino over a serial port; the Arduino collects it with a laser-based (LiDAR) proximity sensor, and the incoming values are mapped to frame numbers in the video. The patch also works with an ultrasound-based sensor, though the LiDAR output is more consistent and reliable. This way, I can easily determine how the visual behaves, as well as the effective distances, by changing a few numbers in the patch.
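
The Arduino code itself isn’t included above, but a minimal sketch of the sensor-reading side might look like the following. The VL53L0X time-of-flight sensor and the Adafruit_VL53L0X library are assumptions on my part (the exact LiDAR module isn’t named); the sketch simply streams one distance value per line over serial for the Max patch to pick up.

```cpp
// Minimal sketch (assumed hardware: VL53L0X LiDAR breakout + Adafruit_VL53L0X library).
// Streams one distance reading per line over serial, for Max MSP to read via its serial object.
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

Adafruit_VL53L0X lox = Adafruit_VL53L0X();

void setup() {
  Serial.begin(115200);
  while (!Serial) { delay(1); }            // wait for the serial port on boards that need it
  if (!lox.begin()) {
    Serial.println(F("Failed to find VL53L0X sensor"));
    while (true) { delay(10); }
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);        // pass true for debug output

  // RangeStatus 4 means "out of range"; only report valid readings
  if (measure.RangeStatus != 4) {
    Serial.println(measure.RangeMilliMeter);   // distance in mm, one value per line
  }
  delay(33);                                   // roughly matches the 30 FPS frame rate
}
```

On the Max side, a serial object reading this stream would supply the numbers that get mapped to frames.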

A Two Minute Hate: Prototype 1
A Two Minute Hate: Prototype 2

Because the visuals are all controlled by the same sensor, I can safely branch out into multiple video files, all sharing the same properties (10 seconds, 30 FPS, HAP encoding). The visuals are then sent to MadMapper via Syphon, where they can be mapped to the screens in the installation.
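
As an illustration of that mapping (not the actual Max patch, which does this with a few adjustable numbers), the scaling from a distance reading to a frame index might look like the sketch below. The minDist and maxDist values are placeholders standing in for the “effective distances” mentioned earlier; because every clip shares the same 300-frame length, the same index can drive all of them at once.

```cpp
// Illustrative C++ version of the distance-to-frame scaling performed in the patch.
// minDist / maxDist are placeholder values for the adjustable effective range.
#include <algorithm>

int distanceToFrame(float distMm,
                    float minDist = 300.0f,    // nearest distance that matters (mm) - placeholder
                    float maxDist = 2000.0f,   // farthest distance that matters (mm) - placeholder
                    int totalFrames = 300) {   // 10 s at 30 FPS
  float clamped = std::clamp(distMm, minDist, maxDist);
  float t = (clamped - minDist) / (maxDist - minDist);   // normalised 0.0 .. 1.0
  return static_cast<int>(t * (totalFrames - 1));        // frame index 0 .. 299
}
```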