The Mill created a massive real-time data art installation built from IBM Watson's computational analysis of pop music, social media, and news media. Watson assigned emotional values to natural language and musical compositions, which we then translated into immersive visualizations navigable by time, emotion, and genre. The intricate color coding of the visuals was based on a five-color palette, one color each for joy, anger, disgust, sadness, and fear.
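The emotion-to-color mapping can be sketched in a few lines. This is a hypothetical illustration, not The Mill's actual pipeline: the hex colors, score format, and `dominant_color` helper are assumptions, loosely modeled on the per-emotion scores Watson's analysis services return.

```python
# Hypothetical sketch: map Watson-style emotion scores (0.0-1.0 per
# emotion) onto the installation's five-color palette. The specific
# hex values here are illustrative assumptions.
PALETTE = {
    "joy": "#FFD700",
    "anger": "#D32F2F",
    "disgust": "#7CB342",
    "sadness": "#1976D2",
    "fear": "#7B1FA2",
}

def dominant_color(scores: dict) -> str:
    """Return the palette color for the highest-scoring emotion."""
    emotion = max(PALETTE, key=lambda e: scores.get(e, 0.0))
    return PALETTE[emotion]

# A lyric scored as mostly sad would render in the sadness color.
scores = {"joy": 0.12, "anger": 0.05, "disgust": 0.02,
          "sadness": 0.71, "fear": 0.10}
print(dominant_color(scores))  # sadness has the highest score
```

The same lookup would drive both systems: a song's aggregate scores color its wave, and a single tweet's scores color its text.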
We sought to create art, but also to find insight. We believe that the overall emotional tenor of the world, as determined by analyzing news media and social posts, has a causal relationship with what is popular in mainstream music over the same period.
Post-election, we expect to see the first-ever death metal track chart on Billboard.
At its heart are two systems. The first, an abstract visualization, gives the user a gestalt visual experience, summing the vast and complex data sets gleaned from the lyrics and composition of Billboard Top 100 songs into a color-coded environment built of undulating waves.
The second system is a text-based visualization built on analysis of Twitter feeds and news headlines. The language was color-coded statement by statement to match the colors of the abstract system.
Both systems are linked by time. The user can navigate the temporal, emotional, and genre-based classifications via a custom-built hardware interface we designed to feel like a MIDI controller on a starship.
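The navigation model described above amounts to filtering one time-indexed dataset along three axes. A minimal sketch, assuming a simple `Song` record; the field names and filter signature are illustrative, not the installation's actual data model:

```python
# Hypothetical sketch: navigate a shared, time-linked dataset by
# time window, dominant emotion, and genre, as the hardware
# interface would. The Song record is an assumption.
from dataclasses import dataclass

@dataclass
class Song:
    title: str
    year: int
    emotion: str   # one of: joy, anger, disgust, sadness, fear
    genre: str

def navigate(songs, year_range=None, emotion=None, genre=None):
    """Return songs matching the selected time, emotion, and genre filters."""
    result = []
    for s in songs:
        if year_range and not (year_range[0] <= s.year <= year_range[1]):
            continue
        if emotion and s.emotion != emotion:
            continue
        if genre and s.genre != genre:
            continue
        result.append(s)
    return result

songs = [
    Song("Track A", 2015, "joy", "pop"),
    Song("Track B", 2016, "sadness", "rock"),
]
hits = navigate(songs, year_range=(2016, 2016), emotion="sadness")
```

Because both the wave system and the text system share the time axis, one such query can drive both views at once.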
Grammy-nominated artist Alex Da Kid was the first to use the system, drawing insight and inspiration from it in the process of writing his forthcoming album.