The Magical Musical Mat (MMM) is a communicative platform that maps interpersonal touch to dynamically changing sounds. When two people stand, sit, or lie on the mat and establish skin-to-skin contact with one another, music plays and changes dynamically according to their touch-based gestures. The MMM has been developed iteratively with non-speaking autistic children, their families, and an autism clinic.
MMM has been a labor of love, spurred by the lived experiences of non-speaking individuals, and our team’s passion for fun, music, and tinkering. We hope that MMM will be increasingly driven by the non-speaking community.
Team Members: Rachel S.Y. Chen (Graduate School of Education, UC Berkeley; Special Education, SFSU); Arianna Ninh (Cognitive Science and Computer Science, UC Berkeley); Rebecca Abraham (Electrical Engineering and Computer Science, UC Berkeley). Research with the MMM has also been supported by Megha Krishnan (Cognitive Science, UC Berkeley), Megan Kym (Nanyang Technological University, Humanities), and Lim Weiyi (Nanyang Technological University, Humanities).
Mentors: Betty Yu (Speech, Language and Hearing Sciences, SFSU); Kimiko Ryokai (School of Information, UC Berkeley); Dor Abrahamson (Graduate School of Education, UC Berkeley); Jürgen Streeck (Communication Studies, UT Austin).
Duration: Spring 2019 – Present
Funding: Barbara White Bequest; Jacobs Innovation Catalyst Grant (winner of Jacobs Design Showcase 2019); Berkeley Center for New Media fellowship; Institute of Cognitive and Brain Sciences
Background: The MMM was ideated during Fall 2018 in Professor Kimiko Ryokai’s course on Tangible User Interface (TUI) design, where Arianna, Justine, Fang, and Rachel worked towards an interface that would facilitate novel interaction between people. Inspired by projects that brought people together through touch, we created a musical mat that amplified a variety of touch-based interactions through sound. Rachel, having studied the interactions of non-speaking autistic individuals for years, saw the potential in bringing the project into their homes and schools. Rachel, Arianna, and later Rebecca thus began working closely together, iterating on the mats, building a range of musical palettes, and having fun along the way.
Motivation: Non-speaking autistic children often have to accommodate the participatory expectations of speaking others. Toward more inclusive practices, how can interaction embrace the expressive repertoires of the autistic child? What sociomaterial environments might surface such interactions? The MMM goes beyond speech, embracing touch and music as means for people to connect. It reduces interactional asymmetries between diverse people and surfaces the basic human capacity to connect with one another.
Academic inspirations: The MMM draws on many academic inspirations; only a few are named here. It is informed by novel projects in Human-Computer Interaction, Marjorie Goodwin’s work on haptic sociality, Charles Goodwin’s work with Chil and his theory of co-operative action, Merleau-Ponty’s concept of intercorporeality, conversations with colleagues from the Joint Doctoral Program in Special Education, Jeanne Bamberger’s work on music, colleagues in the sound design and experimental music space, and, of course, collaborations with colleagues behind Special Education Embodied Design, as well as Dor Abrahamson and colleagues at the Embodied Design Research Laboratory.
Future plans: As of May 2022, Rachel is reaching the end of her PhD, and Arianna and Rebecca have graduated from UC Berkeley. Rachel plans to develop and disseminate the MMM within community and clinical partnerships in her future faculty position. She hopes to concretely involve non-speaking individuals as part of the future team behind the MMM. Music, improvisation, and sound design feed Rachel’s soul, and she is excited to keep growing the MMM as a rich environment for musical improvisation. Currently, Rebecca is generously working with Rachel to wrap up the current phase of the project, and otherwise geeking out with her about music, slam poetry, and life. Arianna is no longer involved in the MMM, but is still Rachel’s number one crafting buddy.
Publications:
Chen, R. S. Y. (2021, June). Embodied design for non-speaking Autistic children: the emergence of rhythmical joint action. In M. Roussou, S. Shahid, J. A. Fails, & M. Landoni (Eds.), Interaction Design and Children (pp. 648–651). Association for Computing Machinery. https://doi.org/10.1145/3459990.3463396
Chen, R. S. Y., Ninh, A., Yu, B., & Abrahamson, D. (2020). Being in touch with the core of social interaction: Embodied design for the nonverbal. In The Interdisciplinarity of the Learning Sciences, Proceedings of the 14th meeting of the International Society of the Learning Sciences (ICLS 2020) (Vol. 3, pp. 1681–1684). International Society of the Learning Sciences.