colour to sound
This project simulates hardware devices that allow colour-blind people to perceive colours by using their sense of hearing instead.
It uses the device's camera to find the colour at the centre of the image and turns it into a musical note, which is then played through the device's speakers.
This is all achieved with Web technologies such as the Web Audio and Media Streams APIs; the data processing is also done entirely locally, so no camera data is ever sent to a remote server for processing.
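The core loop is simple enough to sketch in a few lines. This is an illustrative reduction, not the project's actual code: the helper names are mine, and the hue-to-frequency mapping shown here is a naive placeholder (the real scale is discussed further down).

```typescript
// Illustrative sketch: grab the camera, sample the centre pixel every
// frame, and drive an oscillator's pitch from it. All processing stays
// in the page; nothing is uploaded anywhere.
const video = document.createElement('video');
const canvas = document.createElement('canvas');
const audioCtx = new AudioContext();

function rgbToHue(r: number, g: number, b: number): number {
  const [rn, gn, bn] = [r / 255, g / 255, b / 255];
  const max = Math.max(rn, gn, bn);
  const min = Math.min(rn, gn, bn);
  if (max === min) return 0; // grey has no hue; pick 0 arbitrarily
  let h: number;
  if (max === rn) h = ((gn - bn) / (max - min)) % 6;
  else if (max === gn) h = (bn - rn) / (max - min) + 2;
  else h = (rn - gn) / (max - min) + 4;
  return (h * 60 + 360) % 360;
}

// Placeholder mapping: one continuous octave across the hue circle,
// starting at F4 (~349.23 Hz) for red. The discrete scale comes later.
function colourToFrequency(r: number, g: number, b: number): number {
  return 349.23 * 2 ** (rgbToHue(r, g, b) / 360);
}

async function start(): Promise<void> {
  // Media Streams API: ask for the camera; frames never leave the page.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
  await video.play();

  // Web Audio API: a single oscillator through a gain node.
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.connect(gain).connect(audioCtx.destination);
  osc.start();

  const ctx2d = canvas.getContext('2d')!;
  const tick = (): void => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx2d.drawImage(video, 0, 0);
    // Read the single pixel at the centre of the frame.
    const { data } = ctx2d.getImageData(
      Math.floor(canvas.width / 2), Math.floor(canvas.height / 2), 1, 1);
    const freq = colourToFrequency(data[0], data[1], data[2]);
    // Glide smoothly to the new pitch instead of jumping.
    osc.frequency.setTargetAtTime(freq, audioCtx.currentTime, 0.05);
    requestAnimationFrame(tick);
  };
  tick();
}
```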
Inspiration
The idea to build this came after I saw that there is a new documentary about Neil Harbisson coming out.
He has famously declared himself to be a cyborg, as he has a light sensor mounted on an antenna attached to his skull. The light sensor is connected to a brain implant on the back of his head, and the implant turns the colours captured by the sensor into sound.
In other words, the implant allows him to hear colour, as he has achromatopsia, a type of colour blindness whose name means 'no colour', i.e. he sees everything in greyscale.
The combination of antenna, sensor and implant also seems to be known as an "Eyeborg" or "cyborg antenna". I saw mentions of other versions of this device on some websites, but since I couldn't find much detail about their specifics, I did not take them into account when building this experiment.
Simulation, not reproduction
Given the scarcity of technical information available, the output from this experiment is at best an approximation, but it broadly aims to produce the same sonochromatic scale that is discussed in the Wikipedia page for Sonochromatism, where red in the colour spectrum starts at F, orange is F#, yellow is G, and so on, all the way to magenta, which is E (in the next octave).
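In code terms, that mapping amounts to splitting the 360° hue circle into twelve 30° slices and starting the scale on F. A rough sketch of my reading of it (the function name is mine):

```typescript
// The twelve semitone slices of the sonochromatic scale: red (hue 0)
// starts at F, and magenta (hue 330..360) lands on E of the next octave.
const NOTE_NAMES = ['F', 'F#', 'G', 'G#', 'A', 'A#', 'B', 'C', 'C#', 'D', 'D#', 'E'];

function hueToNoteName(hue: number): string {
  // Each 30 degrees of hue is one semitone.
  return NOTE_NAMES[Math.floor((hue % 360) / 30)];
}

hueToNoteName(0);   // 'F'  (red)
hueToNoteName(30);  // 'F#' (orange)
hueToNoteName(60);  // 'G'  (yellow)
hueToNoteName(330); // 'E'  (magenta)
```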
A small settings panel can be opened to adjust several parameters, such as the number of semitones to transpose by (so the note where the scale starts can be selected), how many notes per octave and how many micronotes to use, the gain (to alter the output volume), etc.
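For illustration, the shape of those settings could be modelled like this (the field names here are hypothetical, not the experiment's actual code):

```typescript
// Hypothetical model of the settings panel; names are illustrative.
interface Settings {
  transposeSemitones: number; // shifts the note where the scale starts
  notesPerOctave: number;     // 12 for a standard chromatic scale
  micronotes: number;         // subdivisions between those notes
  gain: number;               // output volume, 0..1
}

const defaults: Settings = {
  transposeSemitones: 0,
  notesPerOctave: 12,
  micronotes: 3, // 12 x 3 = 36 steps per octave, as described below
  gain: 0.5,
};
```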
The generated scales are slightly different from the chromatic scales, which are the type of scale we tend to think of when we talk about Western music. While "normal" music uses octaves with 12 notes (semitones) each, the Eyeborg reportedly uses 360 notes, which I interpreted as "each octave contains 36 notes", i.e. each note can either have the same value as a note in the chromatic scale, or be slightly off in either direction, by about a third of the interval between "normal" notes. I chose this option because it stays close enough to the standard note naming convention that I could display approximate note names, giving people an idea of which note each colour is (is it an A, or is it a C?).
Then, to reach the 360 notes that the Eyeborg is said to be able to play, the tool can detect up to 10 levels of lightness for each pure base colour. This actually results in more than the stated 360 notes, since we get 36 initial values + 36 × 10 variations = 396 notes.
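Putting the two paragraphs above together, the frequency of each step follows the usual equal-temperament formula generalised to 36 steps per octave, and the note count works out as stated. How exactly the lightness levels map onto notes isn't documented anywhere I could find, so the sketch below only shows the arithmetic:

```typescript
// 36-step equal temperament: each step multiplies the frequency by the
// 36th root of 2. F4 as the base for red is an assumption, matching the
// "scale starts at F" description above.
const BASE_FREQUENCY = 349.23; // F4, in Hz

function stepToFrequency(step: number, stepsPerOctave = 36): number {
  return BASE_FREQUENCY * 2 ** (step / stepsPerOctave);
}

// 36 pure base colours plus 10 lightness variations of each:
const totalNotes = 36 + 36 * 10; // = 396, a little over the stated 360
```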
Potential follow-up ideas
I reached a lot of dead ends and made a lot of false starts, and I also wrote a lot of exploratory code to test ideas, either because I wasn't sure they would work for what I wanted to accomplish or, at points, because I wasn't even sure what it was that I was trying to accomplish!
However, because I wanted to keep as close to the original as possible, I kept leaving many of these ideas aside, which makes the end result a touch "bare", but hopefully it represents quite accurately what it is like to use one of these Eyeborgs, without having access to the original hardware or having seen it in person.
A happy outcome is that now I have a well of fresh ideas and concepts to keep digging into for further projects.
Some examples:
- it is not too noticeable because this experiment only outputs monophonic sound, but one follow-up idea I had was to play more than one note at once with this type of scale and see whether I encounter any interesting wave-overlap effects.
- play the x most common colours in an image (this is actually what I started with, before I realised that the Eyeborg is monophonic). This was quite spooky...
- allow listening to colours from an image selected or uploaded by the user, or from a video. This could be useful if the device has no camera or if permission can't be granted!
- use different types of oscillator waves
- use colour properties to alter parameters of the output, such as filter cut-off frequencies (I am not using a filter yet, but it could be something to experiment with; see the sketch after this list)
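As a taste of the last two items, both are cheap to prototype with Web Audio. This sketch (the saturation-to-cutoff mapping is an arbitrary choice of mine) swaps the oscillator wave for a sawtooth and drives a low-pass filter's cutoff from a colour property:

```typescript
// Sketch for the last two ideas: a non-sine oscillator routed through a
// low-pass filter whose cutoff is driven by a colour property.
const audioCtx = new AudioContext();

const osc = audioCtx.createOscillator();
osc.type = 'sawtooth'; // also: 'sine' | 'square' | 'triangle'

const filter = audioCtx.createBiquadFilter();
filter.type = 'lowpass';

osc.connect(filter).connect(audioCtx.destination);
osc.start();

// Map saturation (0..1) to a cutoff between 200 Hz and 5 kHz, on a
// logarithmic scale so the sweep feels perceptually even.
function setCutoffFromSaturation(saturation: number): void {
  const cutoff = 200 * (5000 / 200) ** saturation;
  filter.frequency.setTargetAtTime(cutoff, audioCtx.currentTime, 0.05);
}
```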
I also found a lot of things to write about, including, but not limited to, small bugs, Web Audio rendering differences between engines, colour processing, and much more, which I will write about on my blog and link to from here.
In the meantime, you can look at the references below or at any of my other experiments!
References
- Neil Harbisson on Wikipedia.
- The Cyborg Foundation, dedicated to promoting cyborg rights.