AI reconstructing songs from brain scans

The 15-second audio clip sounds like a muffled version of Pink Floyd’s Another Brick in the Wall played underwater. Except Pink Floyd didn’t perform any of the music in the clip. Instead, the track was produced by a team of researchers at the University of California, Berkeley, working from the brain activity of more than two dozen people who listened to the song.

That data was then decoded by a machine learning model and reconstructed into audio — marking the first time researchers have been able to re-create a song from neural signals.
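The study's actual pipeline is more elaborate, but the core idea of such decoding, fitting a regression model that maps neural features to audio spectrogram frames and then evaluating its predictions on held-out data, can be sketched with synthetic numbers. Everything below (dimensions, the linear model, the noise level) is illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions -- not the study's actual values.
n_frames = 200      # time windows in the recording
n_electrodes = 32   # neural feature channels
n_freq_bins = 16    # spectrogram frequency bins per frame

# Simulate a ground-truth linear mapping from neural activity to spectrogram.
X = rng.standard_normal((n_frames, n_electrodes))                     # neural features
W_true = rng.standard_normal((n_electrodes, n_freq_bins))
S = X @ W_true + 0.1 * rng.standard_normal((n_frames, n_freq_bins))   # target spectrogram

# Fit a decoding model (ordinary least squares) on a training split.
train, test = slice(0, 150), slice(150, None)
W_hat, *_ = np.linalg.lstsq(X[train], S[train], rcond=None)

# Predict spectrogram frames from held-out neural data.
S_pred = X[test] @ W_hat

# Score decoding quality as the correlation between predicted and actual frames.
r = np.corrcoef(S_pred.ravel(), S[test].ravel())[0, 1]
print(f"held-out decoding correlation: {r:.2f}")
```

In a real reconstruction, the predicted spectrogram would then be inverted back into a waveform, which is why the resulting clip sounds recognizable but muffled.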
