The Kromophone was originally developed by Zachary Capalbo and Dr. Brian Glenney in the Philosophical Psychology lab at Gordon College during the summer of 2008. We thought that using color as the basis for a sensory substitution device would allow visually impaired users to interact with their environment better than any of the other sensory substitution devices we had experimented with.
The original design mapped a color's hue, saturation, and luminosity to the pitch, pan, and volume of a sound, respectively. While this worked to a degree, we found that different colors were far too difficult to distinguish.
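The original mapping can be sketched as follows. The frequency range and pan convention here are illustrative assumptions, not the original implementation's actual parameters:

```python
import colorsys

def hsl_to_sound(r, g, b):
    """Map a color's hue, saturation, and luminosity to pitch, pan,
    and volume, in the spirit of the Kromophone's first design.
    The base pitch, octave span, and pan direction are assumed for
    illustration."""
    # colorsys works on 0..1 components and returns (hue, luminosity, saturation)
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    pitch_hz = 220.0 * (2.0 ** (h * 2))  # hue sweeps two octaves up from A3 (assumed range)
    pan = s * 2.0 - 1.0                  # saturation -> stereo pan, -1 (left) to +1 (right)
    volume = l                           # luminosity -> volume, 0..1
    return pitch_hz, pan, volume

# Pure red: hue 0 gives the base pitch, full saturation pans hard right,
# and mid luminosity gives half volume.
print(hsl_to_sound(255, 0, 0))
```

Because every perceptible color difference had to be carried by just three continuous sound parameters, nearby hues produced very similar pitches, which is consistent with the difficulty users had telling colors apart.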
We then moved to a red-green-blue model, in which each color component was given its own unique pitch and timbre. This was a huge improvement, but it still made colors like yellow or white difficult to identify.
We therefore added separate sounds for yellow and white. After this change, users were able to distinguish between different colors easily.
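A minimal sketch of this channel-based design, assuming illustrative pitches and a simple rule for splitting the white and yellow components out of an RGB value (the actual decomposition and tones used by the Kromophone may differ):

```python
def color_to_channels(r, g, b):
    """Decompose an RGB pixel (components 0..255) into five sound
    channels -- red, green, blue, yellow, white -- each with an
    assumed pitch and a loudness derived from that component's
    strength in the pixel."""
    white = min(r, g, b)       # the part shared by all three components -> white channel
    r2, g2, b2 = r - white, g - white, b - white
    yellow = min(r2, g2)       # the part shared by red and green -> yellow channel
    r2 -= yellow
    g2 -= yellow
    # (pitch in Hz, loudness 0..1); pitches are placeholders, not the original's
    return {
        'red':    (440.0, r2 / 255.0),
        'green':  (550.0, g2 / 255.0),
        'blue':   (660.0, b2 / 255.0),
        'yellow': (880.0, yellow / 255.0),
        'white':  (330.0, white / 255.0),
    }

# Pure yellow (255, 255, 0) drives only the yellow channel;
# pure white (255, 255, 255) drives only the white channel.
print(color_to_channels(255, 255, 0))
print(color_to_channels(255, 255, 255))
```

Giving yellow and white their own timbres means these common colors no longer have to be inferred from a blend of the red, green, and blue tones, which matches the improvement in identification described above.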
The original version of the Kromophone has been discontinued and is archived at http://www.zachcapalbo.com/projects/kromophone. The source code is still available for download, but it is not pretty, and it does not compile easily on modern Linux distributions (and not at all on Windows).
The next version of the Kromophone was developed by Bill Rosser. It is an Android port of the original Kromophone, but it is as yet incomplete. The source code is available at https://sourceforge.net/projects/kromophone/
The current version of the Kromophone is a complete rewrite using the Qt framework. This allows it to run on Linux (including embedded machines like the Raspberry Pi), Windows, macOS, and Android.
Head-mounted camera. Allows for localization of color perception through proprioception of the head and neck.
Blinder Goggles. Needed for sighted subjects to block out vision.
Headphones. Convey the sound. They should NOT be noise-canceling: the user needs to be able to hear what is going on in the environment in addition to the sounds from the Kromophone.
Remote control. Allows the user to fine-tune the Kromophone for the environment they are in.
Embedded Computer. Runs the Kromophone software.
For training, sighted users were given time to use the Kromophone with their vision unobscured. As an introduction, they were shown a picture of a color spectrum on a computer display; when they moved the mouse over any point on the image, the Kromophone produced the sound for that color. Blind subjects were given pieces of colored construction paper and told the color that corresponded to each piece.
To test the subjects' ability with the Kromophone, we placed various fruits (apple, orange, lime, lemon, banana) on a table in front of the subject. The subjects then had to identify which fruits were which, and where on the table they were placed, without touching the fruit.
Some Kromophone users also took part in Zach Reynolds' sensory substitution device training study; however, their data was not factored into the results of the study.
The Kromophone provides a unique and novel opportunity to experience visual artwork for both the sighted and blind alike. By hearing the colors of a painting or other artwork, the user of the Kromophone can experience even a very familiar piece of artwork in a new way.
This new experience enabled by the Kromophone has also inspired artwork, such as artist Sara Hendren's We Never Asked To Be Made Human art pieces and video. Musician Chad Wimberly even used it to create a techno song.
The results of the summer were published and presented in the proceedings of the Fifth Asia-Pacific Computing and Philosophy Conference. You can view a presentation that Zach gave shortly after the conference on YouTube.
The Kromophone has been cited in a number of academic papers since its creation, including:
Bologna, Guido, Deville, Benoit, and Pun, Thierry. Sonification of Color and Depth in a Mobility Aid for Blind People. Proceedings of the 16th International Conference on Auditory Display (ICAD 2010), Washington, DC, June 9-15, 2010. http://icad.org/Proceedings/2010/BolognaDevillePun2010.pdf
Mengucci, Michele, Henriques, J. Tomas, Cavaco, Sofia, Correia, Nuno, and Medeiros, Francisco. From Color to Sound: Assessing the Surrounding Environment. Proceedings of the International Conference on Digital Arts, 2012. http://ctp.di.fct.unl.pt/~sc/publicacoes/SeeThroughSound-ARTECH2012.pdf
Master's thesis by Farina, Mirko, with Kiverstein, Julian, Vierkant, Tillman, and Clark, Andrew. Slaying the Chimera: A Complementary Approach to the Extended Mind Thesis. University of Edinburgh, 2012. http://www.era.lib.ed.ac.uk/bitstream/1842/6294/1/Farina2012.pdf
Maidenbaum, Shachar, Amedi, Amir. Applying Plasticity to Visual Rehabilitation in Adulthood. Plasticity in Sensory Systems. Ed. JKE Steeves, L.R. Harris, Cambridge UP 2012: 229-253. Google Books Reference
Reich, Lior, Maidenbaum, Shachar, and Amedi, Amir. The Brain as a Flexible Task Machine: Implications for Visual Rehabilitation Using Noninvasive vs. Invasive Approaches. Current Opinion in Neurology 25.1 (2012): 86-95. http://mishkenot.org.il/Hebrew/docs/ethics/The%20brain%20as%20a%20flexible%20task%20machine%20implications.pdf
You can read an account by a test subject (David Botticello) of his experience using the Kromophone at the Notes Along the Way blog.