
Artist used AI and synesthesia to compose music from paintings

Shane Guffogg said AI helped him “unlock the musicality” in his paintings.
Soni Mei Images

  • Shane Guffogg is a multimedia abstract artist with synesthesia, which means he “hears color.”
  • Guffogg worked with AI experts and musicians to compose music to match his paintings.
  • He believes AI is still a tool that “needs supervision,” but it has improved his creative process.

This is an as-told-to conversation with Shane Guffogg, an American artist who earlier this month launched ‘At the Still Point of the Turning World – Strangers of Time’, an exhibition of 21 paintings at the Venice Biennale. This conversation has been condensed and edited for clarity.

I have synesthesia, which means I hear color.

So what I listen to when I paint is important. I listen to Indian classical music, Gregorian chants and some obscure composers such as György Ligeti, Leo Ornstein and Terry Riley. The music stimulates my creativity and ensures that I can be fully present in that moment.

For years I have wondered what my paintings could sound like. The AI revolution pushed me to look for experts who could help me. My first point of contact was Radhika Dirks, an expert in AI and quantum computing. We had a few Zoom sessions, and she told me that, as far as she knew, no AI program could help me. Instead, she suggested I create a visual alphabet that matched the musical chords I heard in my head to colors.

I thought it would be a way to stimulate my creativity. It also built on the idea of an unconscious alphabet that has influenced my art throughout my career.

I met musicians and AI experts to create a visual alphabet

I started looking for musicians to collaborate with and met Anthony Cardella, a young, incredibly gifted pianist in Los Angeles. He was a doctoral student at USC and happened to know, and even play, many of the obscure composers I listen to when I paint.

We started working together, sitting down and examining my paintings side by side. I would zoom in on a color in Photoshop, look at it and sense the musical note, then tell Anthony. For example, I would say, “I think this is the color of the note B.” He would press B and I would say, “No, it’s not that; why don’t you try a B-flat?” After a few tries, he would suddenly hit the right note. I would know because the colors would vibrate before me. Together we mapped out chords corresponding to 40 colors.
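The trial-and-error mapping described above amounts to a lookup table from colors to chords. Here is a minimal sketch of how such a “visual alphabet” could work in code, assuming a nearest-match lookup in RGB space; the specific colors and chords below are illustrative placeholders, not the actual 40-entry alphabet Guffogg and Cardella built.

```python
# Illustrative sketch of a color-to-chord "visual alphabet".
# The entries are assumptions for demonstration only, not the
# mapping described in the article.

VISUAL_ALPHABET = {
    (178, 34, 34): "B minor",    # a deep red
    (65, 105, 225): "E major",   # a royal blue
    (218, 165, 32): "G major",   # a gold
}

def nearest_chord(rgb, alphabet=VISUAL_ALPHABET):
    """Return the chord whose alphabet color is closest to `rgb`,
    using squared Euclidean distance in RGB space."""
    closest = min(alphabet, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, rgb)))
    return alphabet[closest]

print(nearest_chord((180, 30, 40)))  # near the deep-red entry -> "B minor"
```

A real alphabet would need a perceptual color space (such as CIELAB) rather than raw RGB distance, but the lookup idea is the same.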

Shortly thereafter, through mutual contacts, I met an AI researcher named Jonah Lynch, who works at the intersection of the digital humanities and machine learning. I invited him to my ranch in central California, explained the work I had done, and showed him how I made my paintings. We had long discussions about art, poetry and building an AI algorithm that the chords could be fed into.

He developed a program to ‘read’ my paintings and convert them into music. I gave him the main colors I used in each painting and the chords I hear when I see those colors. Jonah watched videos of me painting, studied the movement of my hands, and wrote software that sampled images of the paintings, tracked my hand movements, and assigned each color in the paintings to its corresponding chord. He then fed this sequence of chords into a neural network that had memorized most of the last 500 years of keyboard music, prompting the network to “dream up” new sequences, based on the color-chord sequences and the history of Western music, and generate pages of sheet music.
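The sampling step of the pipeline described above can be sketched roughly as follows. Lynch’s actual software, including the hand-movement tracking and the neural network trained on keyboard music, is not public, so every name, mapping and parameter here is a hypothetical illustration: sample an image’s pixels in reading order, map each sample to its nearest alphabet chord, and collapse consecutive repeats into the chord sequence that would seed a generative model.

```python
from itertools import groupby

# Illustrative color->chord alphabet (assumed entries, not the real one).
ALPHABET = {
    (178, 34, 34): "B minor",    # a deep red
    (65, 105, 225): "E major",   # a royal blue
    (218, 165, 32): "G major",   # a gold
}

def nearest_chord(rgb, alphabet):
    """Closest alphabet entry by squared Euclidean distance in RGB."""
    return alphabet[min(alphabet,
                        key=lambda c: sum((a - b) ** 2 for a, b in zip(c, rgb)))]

def image_to_chord_sequence(pixels, alphabet, step=1):
    """Sample every `step`-th pixel in reading order, map each sample
    to its nearest chord, and collapse consecutive repeats.  The result
    is the kind of chord sequence a generative model could be fed."""
    samples = [px for row in pixels[::step] for px in row[::step]]
    chords = (nearest_chord(px, alphabet) for px in samples)
    return [chord for chord, _ in groupby(chords)]

# A tiny 2x4 "painting": a red region followed by a blue region, twice.
painting = [
    [(180, 30, 40), (180, 30, 40), (60, 100, 220), (60, 100, 220)],
    [(180, 30, 40), (180, 30, 40), (60, 100, 220), (60, 100, 220)],
]
print(image_to_chord_sequence(painting, ALPHABET))
# -> ['B minor', 'E major', 'B minor', 'E major']
```

In practice the sequence would also be weighted by where the painter’s hand moved, which is the part of the pipeline this sketch omits.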

When I heard that music being played for me, it brought tears to my eyes. It was just a rough version of what I heard while painting, but I thought, “There it is.”

I took the music back to Anthony, the pianist. Amazingly, I was able to point to the sheet music and tell him what compositions I was listening to while painting, and he said, “Yes, I can tell by the chords.” The Indian ragas, the Gregorian chants, the Ligeti and Ornstein – they were all there.

Still, at that point the music consisted largely of a series of chords. Anthony said we could draw out melodies if we rearranged it a bit.

AI is still a tool that needs human supervision.

Guffogg’s piece Only Through Time Time is Conquered formed the basis for the sonata that Cardella played for guests at the Venice Biennale.
Shane Guffogg

We have composed music for several paintings and performed them for audiences all over the world. Last month we held a concert at the Forest Lawn Museum in Los Angeles, where I also had a few paintings in an exhibition. The audience was able to look at the paintings while Anthony played, which was a profound experience. A few people were crying.

At the launch of my latest exhibition during the opening week of the Venice Biennale, Anthony performed the world premiere of a sonata he composed, inspired by my painting Only Through Time Time is Conquered, to a live audience. After the performance I talked to several people and they said they could see where the colors and the notes met in the painting. It was something they had never experienced before.

I know that many people are deeply afraid of AI, and I too see it as a tool that needs human supervision. It is a means, not an end in itself. Still, it opened up a lot of possibilities and improved my creative process. I don’t know whether I could have truly unlocked the musicality in my paintings without it.
