Research project for my dissertation on the MSci Creative Computing programme at Goldsmiths. I trained a
type of artificial neural network called an autoencoder to reconstruct the individual frames from the
film Blade Runner. Once it had been trained, I got the network to reconstruct every frame from Blade
Runner and then resequenced it into a video. I also ran other films through the network trained on Blade Runner.
The model is a variational autoencoder trained with a learned similarity metric, as first proposed by Larsen et al. (2015). You can read more about the project in my Medium post, or in more technical detail in my dissertation.
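As a rough illustration of what sits at the core of any variational autoencoder, here is a minimal NumPy sketch of the reparameterization trick and the KL divergence term. This is not the actual model used in the project (which pairs the VAE with a discriminator whose intermediate features define the reconstruction loss, per Larsen et al.); the shapes and values are hypothetical toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    # Sample z = mu + sigma * eps, eps ~ N(0, I): the reparameterization
    # trick that lets gradients flow through the stochastic latent sample.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior,
    # summed over latent dimensions and averaged over the batch.
    return np.mean(-0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=1))

# Toy encoder outputs for a batch of 4 frames with 8 latent dimensions.
mu = rng.standard_normal((4, 8)) * 0.1
log_var = rng.standard_normal((4, 8)) * 0.1

z = reparameterize(mu, log_var, rng)
print(z.shape)                         # (4, 8)
print(kl_divergence(mu, log_var) >= 0) # KL is always non-negative
```

In the learned-similarity variant, the usual per-pixel reconstruction error is replaced by a distance measured in the feature space of a GAN discriminator, which produces sharper, more perceptually plausible reconstructions than a plain pixel-wise VAE.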
This project was featured in Vox, Boing Boing, Wired Italy, and Prosthetic Knowledge. I was also interviewed about the project for the CBC radio show Spark.
This work has been shown at The Whitney Museum of American Art, The Photographers' Gallery, and Art Center Nabi.