September 5, 2023
Scientists translate brain activity into music
At a Glance
- Researchers reconstructed a song from the brain activity recorded while people listened to it.
- The findings shed light on how the brain processes music and could help brain-computer interfaces produce more natural-sounding speech.
Music is a universal human experience. Past research has identified parts of the brain that respond to specific elements of music, such as melody, harmony, and rhythm. Music activates many of the same brain regions that speech does. But how these regions interact to process the complexity of music has been unclear.
An NIH-funded research team, led by Drs. Ludovic Bellier and Robert Knight at the University of California, Berkeley, used computer models to try to reconstruct a piece of music from the brain activity it elicited in listeners. The study appeared in PLoS Biology on August 15, 2023.
The team had 29 neurosurgical patients listen to the song “Another Brick in the Wall, Part 1” by Pink Floyd. Electrodes that had been placed directly on the surface of the patients’ brains for epilepsy evaluation recorded their brain activity during the song. The researchers looked for correlations between the electrode signals and the song’s auditory qualities, then used this information to try to reconstruct the song from the brain signals alone. Similar methods have been used to reconstruct speech from brain activity, but this is the first time music has been reconstructed using such an approach.
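In broad strokes, this kind of stimulus reconstruction can be framed as a regression problem: a model learns to predict the song’s auditory representation (such as a spectrogram) from the recorded brain activity. The sketch below only illustrates that idea, using synthetic stand-in data and a simple regularized linear decoder; the study’s actual preprocessing, features, and models differ.

```python
# Minimal sketch of stimulus reconstruction ("decoding") from electrode signals.
# All data here are synthetic placeholders, not the study's recordings.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins: neural activity (time x electrodes) and the song's auditory
# representation, e.g. a spectrogram (time x frequency bins).
n_time, n_electrodes, n_freqs = 5000, 64, 32
spectrogram = rng.standard_normal((n_time, n_freqs))
mixing = rng.standard_normal((n_freqs, n_electrodes))
neural = spectrogram @ mixing + rng.standard_normal((n_time, n_electrodes))

def lagged_features(x, lags):
    """Stack time-lagged copies of the signals so the decoder can use activity
    shortly before and after each moment of the song (edge wrap-around ignored)."""
    return np.concatenate([np.roll(x, lag, axis=0) for lag in lags], axis=1)

X = lagged_features(neural, lags=range(-5, 6))   # a few time bins of context
y = spectrogram
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

# Regularized linear decoder: predicts the spectrogram from brain activity.
decoder = Ridge(alpha=10.0)
decoder.fit(X_train, y_train)
y_pred = decoder.predict(X_test)

# Reconstruction accuracy: correlation between predicted and actual spectrogram bins.
r = np.mean([np.corrcoef(y_pred[:, k], y_test[:, k])[0, 1] for k in range(n_freqs)])
print(f"mean reconstruction correlation: {r:.2f}")
```

A predicted spectrogram like this can then be turned back into audio, which is how a listenable, if degraded, version of the song can be produced from brain signals.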
The researchers found that 347 of the nearly 2,700 electrodes across the patients significantly encoded the music. A higher proportion of electrodes in the right hemisphere (16.4%) responded to the music than in the left (13.5%). This contrasts with speech, which evokes greater responses in the left hemisphere. In both hemispheres, most of the responsive electrodes were over a region called the superior temporal gyrus (STG), located just above and behind the ear.
The reconstructed song, based on data from all 347 significant electrodes, resembled the original, although with less detail. The words in the reconstructed song, for example, were much less clear.
Certain patterns of brain activity matched specific musical elements. One pattern consisted of short bursts of activity at a range of frequencies. These corresponded to the onset of lead guitar or synthesizer motifs. Another pattern involved sustained activity at very high frequencies. This occurred when vocals were heard. A third pattern corresponded to the notes of the rhythm guitar. Electrodes detecting each pattern were grouped together within the STG.
To narrow down which brain regions were most important for accurate song reconstruction, the researchers repeated the reconstruction with signals from various electrodes removed. Removing electrodes from the right STG had the greatest impact on accuracy. The team also found that the full set of significant electrodes wasn’t needed: removing almost 170 of them had no effect on accuracy.
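This electrode-removal analysis is essentially a “virtual ablation”: refit the decoder without a given set of electrodes and see how much reconstruction accuracy drops. The sketch below illustrates that logic with hypothetical electrode groupings and synthetic data; it is not the study’s actual analysis.

```python
# Minimal sketch of virtual ablation: re-run the reconstruction with some
# electrodes removed and compare accuracy to the full-set baseline.
# Electrode groupings and data are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_time, n_electrodes, n_freqs = 4000, 60, 16
spectrogram = rng.standard_normal((n_time, n_freqs))
neural = (spectrogram @ rng.standard_normal((n_freqs, n_electrodes))
          + rng.standard_normal((n_time, n_electrodes)))

# Hypothetical grouping of electrodes by anatomical site.
groups = {"right_STG": range(0, 20), "left_STG": range(20, 40), "other": range(40, 60)}
split = 3000  # train on the first part of the song, test on the rest

def reconstruction_score(keep):
    """Fit a decoder using only the kept electrodes; return the mean correlation
    between predicted and actual spectrogram bins on held-out time points."""
    X, y = neural[:, keep], spectrogram
    model = Ridge(alpha=10.0).fit(X[:split], y[:split])
    y_pred = model.predict(X[split:])
    return np.mean([np.corrcoef(y_pred[:, k], y[split:, k])[0, 1] for k in range(n_freqs)])

baseline = reconstruction_score(keep=list(range(n_electrodes)))
for name, idx in groups.items():
    keep = [e for e in range(n_electrodes) if e not in set(idx)]
    print(f"removing {name}: accuracy {reconstruction_score(keep):.2f} vs baseline {baseline:.2f}")
```

The group whose removal causes the largest drop in accuracy is the one carrying the most non-redundant information, which is how the right STG stood out in the study.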
These findings could provide the basis for incorporating musical elements into brain-computer interfaces. Such interfaces have been developed to enable people with disabilities that compromise speech to communicate. But the speech generated by these interfaces has an unnatural, robotic quality to it. Incorporating musical elements could lead to more natural-sounding speech synthesis.
“One of the things for me about music is it has prosody (rhythms and intonation) and emotional content,” Knight says. As the field of brain-machine interfaces progresses, he explains, this research could help add musicality to future brain implants for people with disabling neurological or developmental disorders that compromise speech. “It gives you an ability to decode not only the linguistic content, but some of the prosodic content of speech, some of the affect. I think that’s what we’ve really begun to crack the code on,” he adds.
—by Brian Doctrow, Ph.D.
Related Links
- Brain Decoder Turns a Person’s Brain Activity into Words
- Understanding How Sound Suppresses Pain
- Study Reveals Brain Networks Critical for Conversation
- Device Allows Paralyzed Man to Communicate with Words
- System Turns Imagined Handwriting into Text
- Scientists Create Speech Using Brain Signals
- Sound Health
References: Bellier L, Llorens A, Marciano D, Gunduz A, Schalk G, Brunner P, Knight RT. Music can be reconstructed from human auditory cortex activity using nonlinear decoding models. PLoS Biol. 2023 Aug 15;21(8):e3002176. doi: 10.1371/journal.pbio.3002176. eCollection 2023 Aug. PMID: 37582062.
Funding: NIH’s National Institute of Biomedical Imaging and Bioengineering (NIBIB), National Institute of Neurological Disorders and Stroke (NINDS), and BRAIN Initiative; Fondation Pour l’Audition.