Mobile technology has individualized education, yet even when there is a sense of a connected learning community, the classroom itself remains a passive space whose only function is to contain students, faculty, and furnishings. This paper introduces a proof-of-concept system that combines motion-sensing input devices, projected visualizations, and immersive interfaces with physical objects, transforming the classroom into an active space, and discusses the impact this can have on students' learning process. The system uses real-time computing combined with visualizations projected onto a large surface, such as a desk or a wall, which acts as an augmented spatial immersive display. Together with gesture tracking from a motion-sensing device, it enriches the interaction between the interface and engaged users. This research presents a detailed exploration of the interface design and considers different ways to trigger interactions between the user and the interface, using three levels of physical body gestures: finger gestures, hand gestures, and gestures involving the arms. These gestures drive the interactions and augment the information from learning content. Interactions range from retrieving information from diverse web sources such as Google or Wikipedia, to sharing a document, to using ubiquitous classroom supplies such as Post-its and markers in a shared projected environment. An initial implementation of the system in a classroom setting is presented, with preliminary demonstrations as well as outcomes and insights from the interface design.
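The three gesture levels described above (finger, hand, and arm) could, for instance, be routed to interface actions through a simple dispatch table. The sketch below is a hypothetical illustration only; all names, gesture labels, and action strings are assumptions and do not reflect the paper's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class GestureEvent:
    """A recognized gesture from a motion-sensing device (illustrative)."""
    level: str  # "finger", "hand", or "arm"
    name: str   # e.g. "tap", "swipe", "sweep"


# One illustrative action per gesture level, loosely following the
# interactions listed in the abstract: web lookup, document sharing,
# and manipulating projected supplies such as Post-its.
ACTIONS = {
    ("finger", "tap"): "query web source (e.g. Wikipedia)",
    ("hand", "swipe"): "share document with the class",
    ("arm", "sweep"): "move Post-it across the projected surface",
}


def dispatch(event: GestureEvent) -> str:
    """Map a recognized gesture to an interface action, or ignore it."""
    return ACTIONS.get((event.level, event.name), "ignored")
```

A dispatch table of this kind keeps the gesture vocabulary small and makes the mapping between body movements and projected-surface actions easy to extend.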
Daniel Echeverri, Zayed University, UAE
Stream: Future Classrooms
This paper is part of the ACTC2015 Conference Proceedings.