How does the interaction of music and story affect perceptions of emotion, meaning, and structure? Additionally, how does this affect memory and comprehension?

Balkwill, Laura-Lee, and William Forde Thompson. “A Cross-Cultural Investigation of the Perception of Emotion in Music: Psychophysical and Cultural Cues.” Music Perception: An Interdisciplinary Journal 17.1 (1999): 43-64. JSTOR. Web.

This article proposes a theory of emotion in music holding that emotion is communicated through a combination of universal (psychophysical) and cultural cues, and that it is these cues listeners use to perceive and understand the emotions a piece conveys. Western listeners heard 12 Hindustani ragas, each intended to convey one of four emotions: joy, sadness, anger, or peace. The subjects then rated each piece for those four emotions, as well as for four musical attributes: tempo, rhythmic complexity, melodic complexity, and pitch range. Subjects were sensitive to joy, sadness, and anger, and these judgments were related to their judgments of the musical attributes, suggesting that listeners can extract emotion from unfamiliar music and that psychophysical cues help them do so.

Boltz, Marilyn. “Temporal Accent Structure and the Remembering of Filmed Narratives.” Journal of Experimental Psychology: Human Perception and Performance 18.1 (1992): 90-105. PsycARTICLES. Web.

In this study, filmed narratives were interrupted by commercials placed either at major episode boundaries (so-called “breakpoints”) or within episodes (“non-breakpoints”). Viewers who saw the versions with the more logical commercial placements showed better recall of story details, better recognition, and better memory for temporal information. This suggests that people rely on episode boundaries to guide attention and remembering, and also that narratives have a natural rise and fall: within the larger arc of the story are smaller arcs that form a regular structure, one also found in other forms of media.

Brower, Candace. “A Cognitive Theory of Musical Meaning.” Journal of Music Theory 44.2 (2000): 323. Répertoire International de Littérature Musicale. Web.

This article puts forward a theory of how musical meaning, or metaphor, is created, drawing on two ideas from cognitive science: that pattern recognition and matching play a role in thought, and that we map our bodily experiences onto patterns in other domains. Through a mix of intra-domain mapping (matching patterns to patterns heard earlier in the piece, as well as to patterns conventionally found in music) and cross-domain mapping (mapping musical patterns onto bodily experiences, e.g., strong and weak beats, higher and lower pitches, expansion and contraction), we create musical meaning. The author explains these concepts and then applies them in an analysis of Schubert’s “Du bist die Ruh”.

Cohen, Annabel J. “Music as a Source of Emotion in Film.” Music and Emotion: Theory and Research. Ed. Patrik N. Juslin and John A. Sloboda. Oxford: Oxford UP, 2001. 249-72. Google Scholar. Web.

This chapter discusses the role of music in film: what it adds to the narrative and how it evokes emotion. Film music is something of an oddity in that it is directed not at the characters in the film but at the audience. Cohen outlines six different ways film music evokes our emotions.

Gratier, Maya, and Colwyn Trevarthen. “Musical Narratives and Motives for Culture in Mother-Infant Vocal Interaction.” Journal of Consciousness Studies 15.10-11 (2008): 122-58. PsycARTICLES. Web.

The researchers examined non-verbal communication between mothers and infants, theorizing that the narratives conveyed through gesture and other non-verbal exchanges help the child become a participant in culture. They also consider how these exchanges are organized in time, and at the end of the article they provide empirical evidence for their claims.

Imberty, Michel, and Maya Gratier. “Narrative in Music and Interaction: Editorial.” Musicae Scientiae 12.1 Suppl (2008): 3-13. PsycARTICLES. Web.

This editorial reflects on the definition of narrative and on narrative without words, including music. The authors also focus on the musicality of communication, including gesture and speech, and on the temporality of both narrative and music.

Klein, Michael Leslie, and Nicholas W. Reyland. Music and Narrative Since 1900. Bloomington: Indiana UP, 2013. Print.

This book focuses directly on the connection between music and narrative, especially in recent music, and challenges the claim that some modern music has lost its narrative. It traces the phenomenon of musical narrative over time, tracking how it has changed and how other kinds of narrative have shaped contemporary music and musical narrative. The book also presents many analyses of individual pieces that display musical narrative at work.

Malloch, Stephen, and Colwyn Trevarthen. “Brain, Music, and Musicality: Inferences from Neuroimaging.” Communicative Musicality: Exploring the Basis of Human Companionship. Oxford: Oxford UP, 2009. N. pag. Répertoire International de Littérature Musicale. Web.

This chapter reviews neuroimaging studies of many different kinds, looking for commonalities between the processing of music and the processing of language. The studies suggest strong connections and similarities between the two (overlapping processing areas, similar patterns of activation) but also subtle differences, such as music being processed more bilaterally, which suggests that the capacity to be affected by music is likely innate. This also supports the idea that the faculties we use in music cognition may help facilitate language acquisition. The studies further provide evidence for the localization of certain components of music cognition; for example, music activates areas that generally deal with emotion, as well as many other specific regions.

Miell, Dorothy, Raymond A. R. MacDonald, and David J. Hargreaves. Musical Communication. Oxford: Oxford UP, 2005. Print.

This book brings together ideas and concepts from many different fields to examine musical communication. Contributors cover themes such as “Music and meaning, ambiguity, and evolution”, “Singing as communication”, and “The role of music communication in cinema” in an attempt to understand how humans share emotions, intentions, meanings, and stories with one another through music.

Patel, Aniruddh D. Music, Language, and the Brain. Oxford: Oxford UP, 2008. Print.

Patel’s book explores the connections between language and music, covering such topics as rhythm, melody, syntax, and meaning. Patel reviews the relevant studies and summarizes the current state of the joint scientific study of music and language.

Porter-Reamer, Sheila Veronica. Song Picture Books and Narrative Comprehension. N.p.: n.p., 2006. Web.

This study compared reading a story to reading a story with a song, to measure whether narrative comprehension was better with song. While the results showed no such effect on comprehension, they did show that memory improved with the song picture books.

Ziv, Naomi, and Maya Goshen. “The Effect of ‘Sad’ and ‘Happy’ Background Music on the Interpretation of a Story in 5 to 6-year-old Children.” British Journal of Music Education 23.3 (2006): 303. PsycARTICLES. Web.

This article details an experiment in which children heard a neutral story read aloud while either sad music (minor mode, slower tempo), happy music (major mode, faster tempo), or no music was played. The children were then asked questions about the story and told to pick a sad, happy, or neutral face to describe certain moments in it. Children who heard the happy music were more likely to interpret the story and its character as happier, while children who heard the sad music interpreted them as sadder. This suggests that music affects the perception of other stimuli and stories.

3 thoughts on “How does the interaction of music and story affect perceptions of emotion, meaning, and structure? Additionally, how does this affect memory and comprehension?”

  1. An article I just read that I thought might be relevant, at least partially:

    James CE, Cereghetti DM, Roullet E, Oechslin MS
    University of Applied Sciences Western Switzerland, School of Health Sciences, Geneva; Faculty of Psychology and Educational Sciences, University of Geneva; Geneva Neuroscience Center, University of Geneva; Swiss Center for Affective Sciences, University of Geneva; International Normal Aging and Plasticity Imaging Center (INAPIC), University of Zurich. clara.james@hesge.ch

    The majority of studies on music processing in children used simple musical stimuli. Here, primary schoolchildren judged the appropriateness of musical closure in expressive polyphone music, while high-density electroencephalography was recorded. Refined in-key harmonic transgressions at closure were presented interspersed with regular endings. The children discriminated the transgressions well above chance. Regular and transgressed endings evoked opposite scalp voltage configurations peaking around 400ms after stimulus onset with bilateral frontal negativity for regular and centro-posterior negativity (CPN) for transgressed endings. A positive correlation could be established between strength of the CPN response and rater sensitivity (d-prime). We also investigated whether the capacity to discriminate the transgressions was supported by auditory domain specific or general cognitive mechanisms, and found that working memory capacity predicted transgression discrimination. Latency and distribution of the CPN are reminiscent of the N400, typically observed in response to semantic incongruities in language. Therefore our observation is intriguing, as the CPN occurred here within an intra-musical context, without any symbols referring to the external world. Moreover, the harmonic in-key transgressions that we implemented may be considered syntactical as they transgress structural rules. Such structural incongruities in music are typically followed by an early right anterior negativity (ERAN) and an N5, but not so here. Putative contributive sources of the CPN were localized in left pre-motor, mid-posterior cingulate and superior parietal regions of the brain that can be linked to integration processing. These results suggest that, at least in children, processing of syntax and meaning may coincide in complex intra-musical contexts.

  2. I’m not sure how helpful the studies linking music and film will be, but it’s not a bad idea to look into it. I like the fact that you have a good mix of sources from different disciplines. I’m not sure how helpful the Gratier & Trevarthen will be, unless you connect it to more research on gesture and music, e.g., the work of Zohar Eitan (https://telaviv.academia.edu/ZoharEitan), which I suggested before. Were you able to find sources? This one might be interesting:

    “How Music Moves: Musical Parameters and Listeners’ Images of Motion”

    Abstract: This article presents an empirical investigation of the ways listeners associate changes in musical parameters with physical space and bodily motion. In the experiments reported, participants were asked to associate melodic stimuli with imagined motions of a human character and to specify the type, direction, and pace change of these motions, as well as the forces affecting them. The stimuli consisted of pairs of brief figures, one member of a pair presenting an “intensification” in a specific musical parameter, the other an “abatement” (e.g., crescendo vs. diminuendo, accelerando vs. ritardando). Musical parameters manipulated included dynamics, pitch contour, pitch intervals, attack rate, and articulation. Results indicate that most musical parameters significantly affect several dimensions of motion imagery. For instance, pitch contour affected imagined motion along all three spatial axes (not only verticality), as well as velocity and “energy.” A surprising finding of this study is that musical-spatial analogies are often asymmetrical, as a musical change in one direction evokes a significantly stronger spatial analogy than its opposite. Such asymmetries include even the entrenched association of pitch change and spatial verticality, which applies mostly to pitch falls, but only weakly to rises. In general, musical abatements are strongly associated with spatial descents, while musical intensifications are generally associated with increasing speed rather than ascent. The implications of these results for notions of perceived musical space and for accounts of expressive musical gesture are discussed.
    Published in ‘Music Perception’, 2006

    I also thought of another strand of research I had not mentioned, i.e., (spontaneous) imagery and music. The main researcher is Andrea Halpern (http://www.bucknell.edu/x16803.xml). There are a few titles in her posted bibliography that sound potentially related, e.g.:

    – Lucas, B.L., Schubert, E., & Halpern, A. R. (2010). Perception of emotion in sounded and imagined music. Music Perception, 27, 399-412

    – Zatorre, R. J., & Halpern, A. R. (2005). Mental concerts: Musical imagery and auditory cortex. Neuron, 47, 9-12.
