When watching a film, we engage with much more than a combination of moving images. We combine what we see with what we hear, and what we hear often aids in the construction of a story. Although some researchers have examined how sound guides viewer expectations, there remains a need to explain how images, sounds, and other available cinematic modes interact to construct meaning. This article draws on research on embodiment, cognition, and multimodal artifacts to reveal how sound aids the construction of film narratives, focusing on examples in which sounds take the primary role in constructing narrative meaning. Additionally, by discussing recent theories of cognition and multimodality, this article shows how sounds can evoke conceptual and narrative information in ways that stabilize our understanding of cinematic representations through the joint contribution of all of the available modes.
Brad Jackson is a doctoral candidate at the University of British Columbia. His doctoral research focuses on the intersections of cognitive science, cognitive linguistics, and narrative film, specifically the role of multimodality in film analysis. Email: bradley.jackson@alumni.ubc.ca