Date: March 2016
JK Westlund, JJ Lee, L Plummer, F Faridi, J Gray, M Berlin, H Quintus-Bosz, R Harmann, M Hess, S Dyer, K Santos, S Adalgeirsson, et al. "Tega: A Social Robot."
In Proceedings of the International Conference on Human-Robot Interaction (HRI) Extended Abstracts, 2016.
Tega is a research platform designed to support highly expressive interactions with young children. The robot leverages smartphone technology not only to graphically display facial expressions but also for computation, including behavioral control, sensor processing, and motor control for its five degrees of freedom: head up/down, waist-tilt left/right, waist-lean forward/back, body-extend up/down, and body-rotate left/right. For increased perceptual awareness, we augmented the phone’s built-in sensing with an external camera that captures high-resolution images with a wider field of view of the environment and of users. To withstand long-term continual use, the efficient battery-powered system can run for up to six hours before requiring a recharge. We designed for robust and reliable actuator movements, including the ability to withstand rapid expanding and contracting motions (i.e., “squash and stretch” behaviors) between the base and the head, achieved through a lead-screw design. By combining movements across these degrees of freedom, Tega can physically express a wide range of affect, such as nodding to agree, leaning in to show engagement, tilting its head in thought, or straightening up in excitement.
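The idea of composing the five degrees of freedom into expressive poses can be sketched as follows. This is a minimal illustrative sketch, not the actual Tega control software: the joint names, the normalized [-1, 1] value convention, and the specific pose offsets are all assumptions made for the example.

```python
# Hypothetical sketch of pose composition across Tega's five degrees of
# freedom. Joint names and offset values are illustrative assumptions,
# not the robot's real API.

# The five degrees of freedom described above, as named joints.
DOFS = ("head_tilt", "waist_tilt", "waist_lean", "body_extend", "body_rotate")

# Neutral pose: every joint at its normalized zero position.
NEUTRAL = {dof: 0.0 for dof in DOFS}

# Each expressive behavior is a sparse offset on normalized joints in [-1, 1].
EXPRESSIONS = {
    "nod_agree":    {"head_tilt": -0.6},                     # head dips down
    "lean_engaged": {"waist_lean": 0.7, "head_tilt": 0.2},   # lean in toward user
    "think":        {"waist_tilt": 0.5, "head_tilt": 0.3},   # head tilted in thought
    "excited":      {"body_extend": 0.9, "head_tilt": 0.4},  # stretch up ("squash and stretch")
}

def compose(*names, weights=None):
    """Blend several expression offsets into one pose, clamped to [-1, 1]."""
    weights = weights or [1.0] * len(names)
    pose = dict(NEUTRAL)
    for name, weight in zip(names, weights):
        for dof, value in EXPRESSIONS[name].items():
            pose[dof] = max(-1.0, min(1.0, pose[dof] + weight * value))
    return pose
```

A blend such as `compose("lean_engaged", "excited")` would lean the body forward while extending it upward, which is the kind of combinatorial movement the paragraph above describes.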
As both tech lead and systems engineer, I led a team of designers, animators, and engineers to design and develop this social robot platform, taking it from original concept through mechanical design to working prototypes.