There is a large body of research on the cognition of musical rhythm from an analytical, instrument-only perspective. Accordingly, present research cannot account for the differences in rhythm cognition created by the presence of spoken or sung lyrics in modern popular music. In some modern songs, however, there appears to be a dichotomy between the rhythm perceived in the instrumental version and the rhythm in the standard radio version. This study will tackle this question by examining (1) the differences in sensorimotor tapping and (2) the neurological basis for any differences in these tapping patterns between radio and instrumental versions of the same musical piece.

"Party in the USA" is a top-ten hit on the Billboard Hot 100 by the popular teenage singer Miley Cyrus. Listening to the studio-provided instrumental track makes clear that the experience of the piece differs in important ways without the lyrical content. This piece is ideal to examine because of the differences between the instrumental and radio versions: the lyrical content is not closely matched to the musical notation of the supporting instruments, yet it plays a large but not overwhelming role in the listener's conception of the piece.

The experiment will begin with a standard sensorimotor rhythm-tapping paradigm combined with electroencephalography (EEG), which will measure localized brain activity during the task. Audio will be presented stereophonically, with the lyrical content in one ear and the instrumental version of the piece in the other, the two tracks in complete synchronization. Participants will be asked to tap to the beat (tactus) with each hand independently for each presentation ear. Due to lateralization of brain function, language processing in Broca's and Wernicke's areas takes place in the left hemisphere of the brain. The literature has presented evidence that brain activity consolidates, to an extent, within individual hemispheres. Thus, participants presented with the lyrical content in the right ear and tapping with the right hand should produce tapping closest to the lyrically derived rhythm, with increasing dissimilarity following the pattern of language-processing (fixed, left hemisphere), hand, and presentation-ear modalities. The tapping will be characterized as either a metrical or a figural representation of the song.
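Although the proposal does not specify an analysis pipeline, one simple way to compare tapping across the hand-by-ear conditions described above is to measure each tap's offset from the nearest reference beat. A minimal, hypothetical sketch in Python (the function name and the example timings are illustrative assumptions, not part of the proposal):

```python
def mean_asynchrony(tap_times, beat_times):
    """Mean absolute offset (in seconds) between each tap and its nearest reference beat."""
    if not tap_times or not beat_times:
        raise ValueError("tap_times and beat_times must be non-empty")
    # For each tap, find the closest beat and record the absolute time difference.
    offsets = [min(abs(t - b) for b in beat_times) for t in tap_times]
    return sum(offsets) / len(offsets)

# Hypothetical example: a 120 BPM tactus (beats every 0.5 s) and taps that lag by 30 ms.
beats = [i * 0.5 for i in range(8)]
taps = [b + 0.03 for b in beats]
print(round(mean_asynchrony(taps, beats), 3))  # 0.03
```

Comparing this statistic between conditions (e.g., right hand/right ear vs. left hand/left ear) would give one behavioral index of which rhythm representation the participant is tracking.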
Bonnel, A., Faita, F., Peretz, I., & Besson, M. (2001). Divided attention between lyrics and tunes of operatic songs: Evidence for independent processing. Perception & Psychophysics, 63(7), 1201-1213. Retrieved from PsycINFO.
This article discusses an experiment designed to determine the interaction between the melodic and semantic processing that takes place while listening to music with lyrics. Excerpts from French operas were sung with the final word either congruent or incongruent with the semantics or with the melody (manipulated separately). The study also included a cost-benefit analysis and information on musical familiarity, showing that most participants (94-96%) were not familiar with the music, an important experimental consideration. The study demonstrated that, regardless of one's level of musical experience, people process melody and semantics separately. (Note: a similar study used EEG to test the same conditions via the P300 and N400 components, which is extremely relevant to the proposed experiment: Besson, M., Faita, F., Peretz, I., Bonnel, A., & Requin, J. (1998). Singing in the brain: Independence of lyrics and tunes. Psychological Science, 9(6), 494-498. doi: 10.1111/1467-9280.00091)
Brown, S., Martinez, M. J., & Parsons, L. M. (2006). The neural basis of human dance. Cerebral Cortex, 16(8), 1157-1167. doi: 10.1093/cercor/bhj057
This article investigates the neurology of dance, reporting an experiment in which PET scans were used to localize the brain regions involved in dance movements that were either synchronous or asynchronous with the rhythm. After first discussing the synchronization aspect of dance, including how dance patterns mimic the hierarchical structure of musical rhythm, the article provides detailed information about combining physical expressions of music (dance) with cognition of the rhythm itself. The findings, while complex, indicate that the sensorimotor regions of the brain synchronize with the rhythm-cognition regions. These specific findings can be combined with the EEG approach in the proposed experiment to assess both the novelty of the rhythm and the brain regions associated with the various dance tasks.
Stevens, C. J., Schubert, E., Wang, S., Kroos, C., & Halovic, S. (2009). Moving with and without music: Scaling and lapsing in time and in the performance of contemporary dance. Music Perception, 26(5), 451-464. Retrieved from PsycINFO.
This piece investigates the mechanisms by which dancers keep time. The article includes a thorough discussion of experimental methods, identifying the major body points used to quantify the dancers' movements. The authors propose that scaling (compressing or expanding parts of the music) and lapsing (insertions and deletions) are the two mechanisms by which dancers mentally reconstruct and actualize the music when dancing. The research found that scaling accounted for about 20% of the variance between dancing with music and dancing from memory, and that lapses accounted for another 20% (with more deletions than insertions).
Trainor, L. J., Gao, X., Lei, J., Lehtovaara, K., & Harris, L. R. (2009). The primal role of the vestibular system in determining musical rhythm. Cortex: A Journal Devoted to the Study of the Nervous System and Behavior, 45(1), 35-43. doi: 10.1016/j.cortex.2007.10.014
This piece details the role of the vestibular system in musical rhythm. The experiment applied small galvanic vestibular stimulation in specific patterns while participants heard a metrically ambiguous rhythm. When the stimulation mimicked head movement on every second beat, participants perceived the meter as duple; when it mimicked head movement on every third beat, they perceived the ambiguous meter as triple.