A dead octopus in a washing machine cycle with a reimagined 360° camera. The recorded footage was manipulated by an algorithm pushed beyond its limits until it became defective.
Original Format: Single Channel / Run Time: 12:24. The project has since been expanded into a variety of further formats (see below).
The Reimagined 360° Camera
In a rudimentary design using a Christmas tree decoration, make-up pads and tape, a 360° camera was encased in a transparent sphere. This housing afforded the camera not only the ability to spin in any direction, but to do so without being anchored to a fixed point, such as a monopod or tripod, and without its movements needing to be programmed. Whilst the new camera design may stay still if placed, for instance, on a flat surface in a relatively quiet environment, its capacity to relate to more active environments was expanded. Taken to an extreme, when placed inside a washing machine, the camera was now able not only to spin, but also to roll, float, tumble and much more, simultaneously. To ensure that the full potential of this movement was recorded, the camera's image stabilisation options were not used.
The Defective Algorithm
The footage recorded using this new camera housing was then processed using Optical Flow. Optical Flow is a video time remapping algorithm available as a feature of the Adobe video editing program, Premiere Pro. It allows users to slow their footage down while maintaining, via interpolation, the appearance of continuous motion. Slow motion is typically achieved when footage is recorded at a fast frame rate and played back at a regular frame rate. When footage is slowed down without the additional frames provided by a faster recording frame rate, it may look stuttery. Optical Flow claims to smooth this out by building and inserting these missing frames into the footage. But this interpolation has some limitations. As stated on the Adobe website, ‘Optical Flow needs to calculate the motion of every pixel for each frame. It does not actually know the difference between the pixels that make up your subject and the pixels that are part of your background or other objects. So you may see some visual warping if visual elements that are the same color conflict with each other, or if parts of an object get occluded from one frame to the next’. The more complicated the footage is by these standards, and the further it is slowed down from its original recorded speed, the more difficulty the algorithm has in performing its calculations, and the greater the chance of “warping”.
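To make the interpolation step concrete, the sketch below shows the general principle behind optical-flow frame synthesis. It is not Adobe's implementation: OpenCV's Farneback method and a simple backward warp stand in for whatever Premiere Pro does internally, and real implementations also blend a warp of the second frame and handle occlusions.

```python
import cv2
import numpy as np

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Synthesize a frame t of the way from frame_a to frame_b."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Estimate per-pixel motion from frame_a to frame_b
    # (args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags).
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Backward-warp frame_a part of the way along the motion vectors.
    # This is a crude approximation: it cannot tell subject pixels from
    # background pixels, and occluded regions smear -- the "warping"
    # described in the Adobe quote above.
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - t * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - t * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```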
In “•”, Optical Flow was used to stretch the raw footage shot with the camera by over 20000%: 3.5 seconds of footage was extended to over 12 minutes. In doing this, the Optical Flow algorithm created over 20000 new frames to make up the missing footage. A lot of warping resulted.
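The figures above can be checked with simple arithmetic. The sketch below assumes a 30 fps timeline, which the text does not specify:

```python
fps = 30                          # assumed timeline frame rate
raw_duration = 3.5                # seconds of raw footage
final_duration = 12 * 60 + 24     # the 12:24 run time, in seconds

stretch = final_duration / raw_duration
print(f"stretched by {stretch * 100:.0f}%")   # -> stretched by 21257%

raw_frames = raw_duration * fps               # 105 frames recorded
final_frames = final_duration * fps           # 22320 frames played back
print(f"{final_frames - raw_frames:.0f} new frames interpolated")
# -> 22215 new frames interpolated
```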
You can read more about the techniques developed in the making of “•”, as well as how these techniques may be used by ethnographers, in Collapsing Scales: Entering Relations, published in Entanglements (2020) and presented at the Royal Anthropological Institute Film Conference (2021, see film below).
Select stills from “•”