Projects

Music affects our bodies and our minds

Whether we listen to sounds designed in a lab or to our own cherished playlists of favorite songs, could the simple act of listening deeply to music support our overall health and wellbeing? Can music be used to treat diseases such as epilepsy? The BEAtS lab studies the effects of music on our bodies and minds. We work at the intersection of neuroscience (fMRI, EEG, and iEEG imaging), machine learning, and computational music analysis to help answer fundamental questions about the power of music to induce beneficial changes.

Recent Publications:

Music effects in epilepsy

Prior research involving persons with drug-resistant epilepsy has demonstrated that listening to some music decreases the probability of clinical seizures and their related comorbidities. This article reviews recent research designed to elucidate the neural mechanisms behind these positive outcomes on biomarkers of the disease. Using novel music-analytical and neurophysiological experimental methods, our results showed positive effects on epilepsy using 15-second gamma-band (40-Hz) complex tones as well as Mozart's Sonata for Two Pianos in D Major (K448). We also observed greater effects with increased stimulus duration. Further analysis revealed effects localized to bilateral frontal brain regions, driven by transitions between musical phrases. Finally, music matched for patient preference from a range of musical styles was not as effective as 40-Hz tones or Mozart's K448. Understanding these results required expertise in both music and neuroscience, and they could yield reliable music-based interventions for epilepsy that may also be transferable to other brain disorders.
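
As a rough illustration of the kind of stimulus described above, a 15-second, 40-Hz complex tone can be synthesized in a few lines of Python. This is a minimal sketch, not the lab's stimulus-generation code: the sampling rate, number of partials, amplitude roll-off, and fade length are assumptions made only for the example.

```python
# Illustrative sketch: a 15-second complex tone on a 40-Hz (gamma-band) fundamental.
# Sampling rate, partial count, 1/n roll-off, and fades are assumed, not taken from the study.
import numpy as np

SR = 44100          # sampling rate in Hz (assumed)
DURATION = 15.0     # stimulus duration in seconds (from the text)
F0 = 40.0           # fundamental frequency in Hz (gamma band, from the text)
N_PARTIALS = 10     # number of harmonics in the complex tone (assumed)

t = np.arange(int(SR * DURATION)) / SR
tone = sum((1.0 / n) * np.sin(2 * np.pi * n * F0 * t)
           for n in range(1, N_PARTIALS + 1))
tone /= np.max(np.abs(tone))          # normalize to [-1, 1]

# 50-ms linear fade-in/out to avoid onset and offset clicks (assumed)
fade = int(0.05 * SR)
ramp = np.linspace(0.0, 1.0, fade)
tone[:fade] *= ramp
tone[-fade:] *= ramp[::-1]

# The array could then be written out, e.g. with the soundfile package:
#   import soundfile as sf; sf.write("gamma_40hz.wav", tone, SR)
```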

Neural decoding of heard and imagined music

Lloyd May, Andrea R. Halpern, Sean D. Paulsen, Michael A. Casey; Imagined Musical Scale Relationships Decoded from Auditory Cortex. J Cogn Neurosci 2022; 34 (8): 1326–1339. doi: https://doi.org/10.1162/jocn_a_01858

Notes in a musical scale convey different levels of stability or incompleteness, forming what is known as a tonal hierarchy. The levels of stability conveyed by these scale degrees are partly responsible for generating expectations as a melody proceeds, for emotions deriving from fulfillment (or not) of those expectations, and for judgments of overall melodic well-formedness. These functions can be extracted even during imagined music. We investigated whether patterns of neural activity in fMRI could be used to identify heard and imagined notes, and whether patterns associated with heard notes could identify notes that were merely imagined. We presented trained musicians with the beginning of a scale (key and timbre were varied). The next note in the scale was either heard or imagined. A probe-tone task assessed sensitivity to the tonal hierarchy, and state and trait measures of imagery were included as predictors. Multivoxel classification yielded above-chance results in primary auditory cortex (Heschl's gyrus) for heard scale-degree decoding. Imagined scale-degree decoding was successful in multiple cortical regions spanning bilateral superior temporal, inferior parietal, precentral, and inferior frontal areas. The right superior temporal gyrus yielded successful cross-decoding of scale degrees from heard to imagined notes, indicating a shared pathway between tonal-hierarchy perception and imagery. Decoding in the right and left superior temporal gyri and the right inferior frontal gyrus was more successful in people with more differentiated tonal hierarchies, and decoding in the left inferior frontal gyrus was more successful in people with higher self-reported auditory imagery vividness, providing a link between behavioral traits and the success of neural decoding. These results point to the neural specificity of imagined auditory experiences, even of such functional knowledge, but also document informative individual differences in the precision of that neural response.
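
To make the cross-decoding idea concrete, the sketch below trains a multivoxel classifier on patterns from heard notes and tests it on patterns from imagined notes. It is a minimal illustration, not the study's analysis pipeline: the data arrays are random placeholders, and the region-of-interest sizes, labels, and choice of a linear SVM are assumptions for the example.

```python
# Illustrative sketch of heard-to-imagined cross-decoding with a multivoxel classifier.
# All data here are random placeholders; with real ROI patterns, above-chance
# cross-decoding accuracy would indicate shared heard/imagined representations.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical ROI data: trials x voxels, with a scale-degree label per trial.
n_trials, n_voxels = 120, 300
X_heard = rng.standard_normal((n_trials, n_voxels))
y_heard = rng.integers(0, 2, n_trials)        # e.g., two scale degrees
X_imagined = rng.standard_normal((n_trials, n_voxels))
y_imagined = rng.integers(0, 2, n_trials)

clf = make_pipeline(StandardScaler(), LinearSVC(dual=True, max_iter=5000))

# Within-condition decoding: cross-validated accuracy on heard trials.
heard_acc = cross_val_score(clf, X_heard, y_heard, cv=5).mean()

# Cross-decoding: fit on heard patterns, test on imagined patterns.
clf.fit(X_heard, y_heard)
cross_acc = clf.score(X_imagined, y_imagined)

print(f"heard-note decoding accuracy: {heard_acc:.2f}")
print(f"heard-to-imagined cross-decoding accuracy: {cross_acc:.2f}")
```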
