TeleTangibles

September 6, 2024

PIs Dor Abrahamson (EDRL, UC Berkeley), Jenna Gorlewicz (CHROME, Saint Louis University), and Emily Moore (PhET, U Colorado Boulder) have just received notice that their project proposal, “TeleTangibles: Flexible Inclusive Tangibles to Bring Sensorimotor Interaction Back into STEM Education,” has been accepted for funding by the National Science Foundation and awarded $900,000 under the Research on Innovative Technologies for Enhanced Learning (RITEL) program. To this collaboration, the PIs bring expertise in the Learning Sciences, with a focus on design-based research of digital manipulatives for mathematics learning (Abrahamson); accessible digital interaction for students of sensorimotor diversity (Moore); and research and development of robotics technologies (Gorlewicz).

The project’s vision is to inclusively bring hands-on, embodied learning into the current educational landscape by launching a new genre of STEM resources that the team calls TeleTangibles. TeleTangibles are flexible, tangible learning materials that can be reconfigured for different contexts and settings, can be used with or without digital interactive simulations, and come in variants spanning a spectrum of cost and functionality to meet learners’ diverse needs and the resources available to them. Just as telephones transmit and receive sound and televisions transmit and receive images, teletangibles transmit and receive physical manipulation. With the proposed technological enhancements, blind students will be able to learn collaboratively and remotely, even across the globe, with one student’s actions on the device in their own hands reshaping the device in the other student’s hands. The first teletangible created by the team, the “Quad” <insert link to BSE webpage item>, a device for blind students to learn about quadrilaterals, has won two international awards.

Abrahamson, with Postdoctoral Fellow Sophia Tancredi, will invent new theory-based teletangibles for STEM content, which will be engineered by Gorlewicz (mechatronics) and Moore (interactive spoken feedback). Abrahamson and Tancredi will lead the analysis of multimodal data from user testing as well as the theoretical modeling of the learning processes.