Functional magnetic resonance imaging (fMRI) studies have taken diverse approaches to pinpointing areas involved in musical perception, providing "musical" stimuli ranging from human singing to synthesized piano melodies and other computer-generated sounds, and yielding equally varied results. Despite this variability, research is beginning to offer some clues about the regions of the brain involved in musical perception.

Music specificity

Based on Cortex, 59:126-37, 2014

Music activates diverse areas of the brain, from the primary auditory cortex to the amygdala. But the degree to which certain areas are specifically geared to processing music, as opposed to other sounds, is unclear. By comparing activation patterns in the brain while people listened to nonmusical human vocalizations, such as speech or laughter, or to instrumental music, researchers found that certain regions responded more strongly to one type of auditory stimulus than the other. For example, parts of the superior temporal...

Beat and pitch

Based on Cereb Cortex, 24:836-43, 2014 and Philos Trans R Soc Lond B Biol Sci, 370:20140093, 2015 (left); Front Psychol, 3:76, 2012 (right)

Some fMRI studies have focused on identifying the brain circuitry underlying specific components of auditory perception. For example, the primary auditory cortex (located in the superior temporal gyrus, or STG) and the thalamus are thought to play prominent roles in beat perception for both music and speech, and trained musicians may recruit additional language-processing areas, such as the supramarginal gyrus (SMG), when listening to complex rhythms. In addition, several regions considered part of the motor system have been associated with beat perception, including the supplementary motor area (SMA) and the premotor cortex (PMC), suggesting an important link between perceiving a rhythm and synchronizing movement to it.

Studies of pitch processing, meanwhile, have repeatedly highlighted a role for the auditory cortex, although evidence for the overlap between speech and music in this and other areas is mixed. Some regions, however, including the intraparietal sulcus (IPS, located on the parietal lobe), appear to be activated more by pitch in sung words than by pitch in spoken words. Additional observations revealed differential lateralized activity for song and speech: the left inferior frontal gyrus (IFG), for example, dominates in pitch processing for speech, while the right IFG takes over for song.
