Final Proposal

I want to make a system that attempts to maximise some bodily response from a viewer.

This requires a parametric image and a way to measure bodily response in real time. Given the hardware available, the simplest options seem to be either heartbeat data or the Muse EEG headband.

The project works as follows: modify the parametric image; evaluate the response; estimate the gradient of the emotional response with respect to the image's parameters; take a gradient-ascent step in the direction in parameter space that increases the estimated response; repeat. Because the true gradient is not available, the estimate has to come from black-box methods such as reinforcement learning or genetic algorithms. An alternative route would be to train a neural network to predict emotional response and optimize the image through this surrogate model's gradient, which would enable stochastic gradient descent and much faster optimization.
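To make this loop concrete, here is a minimal sketch in Python, assuming the image is just a flat vector of colour-field parameters and the response is a scalar per image. The render() and measure_response() functions are hypothetical stand-ins for the real display and sensor pipeline, and the evolution-strategies style gradient estimate stands in for whichever black-box optimizer is ultimately used.

```python
import numpy as np

# Hypothetical stand-ins: in the real system, render() would map parameters to an image
# shown to the viewer, and measure_response() would return a scalar score derived from
# the sensor stream. Both are stubbed here so the loop runs on its own.
def render(params):
    return params.reshape(8, 8, 3).clip(0, 1)          # e.g. an 8x8 colour field

def measure_response(image):
    target = np.full_like(image, 0.7)                   # fake "viewer" preferring mid-bright colours
    return -np.mean((image - target) ** 2) + np.random.normal(0, 0.01)

def estimate_gradient(params, n_samples=16, sigma=0.1):
    """Evolution-strategies style black-box estimate of d(response)/d(params)."""
    grad = np.zeros_like(params)
    for _ in range(n_samples):
        noise = np.random.normal(0, 1, size=params.shape)
        grad += measure_response(render(params + sigma * noise)) * noise
    return grad / (n_samples * sigma)

params = np.random.rand(8 * 8 * 3)                      # image parameters: one value per channel per pixel
learning_rate = 0.05
for step in range(200):
    params += learning_rate * estimate_gradient(params) # gradient ascent on the measured response
    if step % 50 == 0:
        print(step, round(measure_response(render(params)), 4))
```

The surrogate-model route would replace estimate_gradient with backpropagation through a trained response predictor, so each step would no longer need fresh measurements from the viewer.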

Given the slow response of heartbeat data, I should use the Muse headband. In addition, we know the approximate timeframe a visual signal takes to be processed in the brain, although it remains to be seen whether the noisy data from the EEG headband can be optimized against.
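As a rough illustration of how a single measurement could be scored, here is a minimal sketch assuming the EEG arrives as timestamped samples at roughly 256 Hz; the 150-600 ms post-stimulus window and the power-based score are placeholder assumptions rather than Muse-specific values.

```python
import numpy as np

# Assumed: timestamped EEG samples at ~256 Hz. The post-stimulus window and the
# power-based score below are illustrative assumptions, not Muse specifics.
SAMPLE_RATE = 256
WINDOW_START, WINDOW_END = 0.15, 0.60          # seconds after the image changes

def response_score(samples, timestamps, stimulus_time):
    """Mean signal power inside the post-stimulus window, as a crude response measure."""
    mask = (timestamps >= stimulus_time + WINDOW_START) & \
           (timestamps <= stimulus_time + WINDOW_END)
    window = samples[mask]
    return float(np.mean(window ** 2)) if window.size else 0.0

# Example on synthetic data: two seconds of noise, stimulus shown at t = 0.5 s.
t = np.arange(0, 2, 1 / SAMPLE_RATE)
eeg = np.random.normal(0, 1, size=t.shape)
print(response_score(eeg, t, stimulus_time=0.5))
```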

This project parallels work done using biofeedback in therapy and meditation, although with the opposite goal. An example of a project attempting this is SOLAR (below), in which the VR environment is designed to guide the participant into meditation using biofeedback, presumably from a Muse-like sensor.

For the parametric image, there are a variety of options. Currently, I am leaning towards using either a large colour field or a generative neural network to provide me with a differentiable parametric output. It would be awesome to use BigGAN to generate complex imagery, but the simplicity of the colour field is also appealing. A midway option would be to use something like a CPPN, a neural network architecture that produces interesting abstract patterns and can be optimized into recognizable shapes.

Picbreeder: http://picbreeder.com
CPPN pattern examples from http://blog.otoro.net/2016/03/25/generating-abstract-patterns-with-tensorflow/
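For reference, a minimal CPPN sketch in NumPy: each pixel's colour is computed from its (x, y, r) coordinates by a small network, and that network's weights would be the parameters the loop above optimizes. The two-layer architecture and layer sizes here are arbitrary choices for illustration.

```python
import numpy as np

def cppn_image(weights, size=128):
    """Render an image by passing each pixel's (x, y, r) coordinates through a tiny network."""
    w1, w2 = weights                                   # the flattened weights are the image parameters
    xs = np.linspace(-1, 1, size)
    x, y = np.meshgrid(xs, xs)
    r = np.sqrt(x ** 2 + y ** 2)                       # radial coordinate gives symmetric patterns
    coords = np.stack([x, y, r], axis=-1).reshape(-1, 3)
    hidden = np.tanh(coords @ w1)                      # hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(hidden @ w2)))         # sigmoid keeps colours in [0, 1]
    return rgb.reshape(size, size, 3)

rng = np.random.default_rng(0)
weights = (rng.normal(0, 1.5, (3, 32)), rng.normal(0, 1.5, (32, 3)))
image = cppn_image(weights)
print(image.shape)                                     # (128, 128, 3)
```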