AI model indicates innate musical instinct

Music, often hailed as a universal language, appears to share common threads across diverse cultures. A recent study by a KAIST research team, led by Professor Hawoong Jung, shows that music-selective functions can emerge in an artificial model of the brain without explicit musical training, pointing to an innate basis for musical instinct.

Using an artificial neural network model, the team identified organizational principles akin to those of the auditory cortex in a real brain. Previous studies had highlighted the universal presence of music across distinct cultures, emphasizing shared elements in beats and tunes. Drawing on Google's AudioSet, Professor Jung's team demonstrated that specific neurons in the network spontaneously become responsive to music, irrespective of genre.
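The idea of units "spontaneously responding to music" can be made concrete with a selectivity index: compare each unit's mean activation to music clips versus other natural sounds. The snippet below is a minimal sketch with synthetic activations standing in for the network's hidden units; the threshold, array shapes, and `selectivity_index` helper are illustrative assumptions, not the study's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for hidden-unit activations from a network
# trained on AudioSet clips: rows = units, columns = sound clips.
n_units, n_music, n_other = 8, 50, 50
music_resp = rng.normal(1.0, 0.3, (n_units, n_music))
other_resp = rng.normal(1.0, 0.3, (n_units, n_other))
music_resp[:2] += 2.0  # pretend units 0 and 1 prefer music

def selectivity_index(music, other):
    """(mean_music - mean_other) / (mean_music + mean_other), per unit."""
    m, o = music.mean(axis=1), other.mean(axis=1)
    return (m - o) / (m + o)

si = selectivity_index(music_resp, other_resp)
music_selective = np.where(si > 0.2)[0]  # threshold is illustrative
print(music_selective)
```

A positive index means the unit fires more for music than for other sounds; in the real study the analogous analysis is run over the trained network's responses to held-out audio, not synthetic data.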

These music-selective neurons, like those in the auditory cortex, encode the temporal structure of music. Intriguingly, suppressing them impairs the network's recognition accuracy for other natural sounds, suggesting that music processing may reflect an evolutionary adaptation for handling diverse auditory stimuli. Professor Jung envisions applications in AI music generation, music therapy, and cognitive research.
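The suppression experiment is an ablation test: zero out the selective units and see how much a downstream readout degrades. The toy sketch below assumes a fixed linear readout over synthetic features in which a few "music-selective" units carry most of the class signal; the weights and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ablation test: a linear readout classifies natural-sound
# features; zeroing the hypothetical "music-selective" units 0-2
# (which carry most of the signal here) should hurt accuracy.
n_samples, n_units = 200, 10
labels = rng.integers(0, 2, n_samples)
feats = rng.normal(0, 1, (n_samples, n_units))
feats[:, :3] += 2.0 * (labels[:, None] * 2 - 1)  # signal in units 0-2

w = np.zeros(n_units)
w[:3] = 1.0   # readout relies mainly on the informative units
w[3:] = 0.1

def accuracy(x, w, y):
    return ((x @ w > 0).astype(int) == y).mean()

base = accuracy(feats, w, labels)
ablated = feats.copy()
ablated[:, :3] = 0.0  # suppress the selective units
drop = accuracy(ablated, w, labels)
print(base, drop)
```

Here accuracy falls to near chance after ablation, mirroring the paper's logic that the selective units contribute to processing other natural sounds as well.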

While the study focuses on the foundational stages of processing musical information, it does not address the developmental learning processes that follow. Published in Nature Communications, the research by Dr. Gwangsu Kim and Dr. Dong-Kyum Kim marks a significant step toward understanding the innate musical instincts of the human brain.

Source: NeuroScienceNews

Author: Neurologica