Faculty Advisor or Committee Member
Sonia Chernova, Advisor
The focus of this project was to construct a humanoid animatronic head with enough degrees of freedom to mimic human facial expression and head movement, and that could be animated using face-tracking software, reducing the time spent on the trial-and-error programming intrinsic to animatronics. Eight degrees of freedom were assigned to the robot: five in the face and three in the neck. From these degrees of freedom, the mechanics of the animatronic head were designed so that the neck and facial features could move with the same range and speed as a human being. Once the head was built, face-tracking software was used to analyze a pre-recorded video of a human actor and map the actor's eye, eyebrow, mouth, and neck motion to the corresponding degrees of freedom on the robot. The resulting values from the face-tracking software were converted into the required servomotor angles using MATLAB and then fed into Visual Show Automation to create a performance script controlling the motion and audio of the animatronic head during its performance.
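The abstract describes converting face-tracking outputs into servomotor angles. The thesis performed this step in MATLAB; as an illustration only, the following Python sketch shows one common way such a conversion can work (a clamped linear map from a normalized tracking value to a servo angle range). The function name, value ranges, and angle limits are hypothetical and not taken from the thesis.

```python
def to_servo_angle(value, in_min=0.0, in_max=1.0, angle_min=0.0, angle_max=180.0):
    """Map a normalized face-tracking value to a servo angle in degrees.

    Illustrative only: the actual MATLAB conversion and the robot's
    servo ranges are defined in the thesis, not here.
    """
    # Clamp the tracked value to the expected input range so noisy
    # tracking data cannot command the servo past its limits.
    value = max(in_min, min(in_max, value))
    # Linear interpolation from the input range to the angle range.
    fraction = (value - in_min) / (in_max - in_min)
    return angle_min + fraction * (angle_max - angle_min)
```

For example, a mid-range tracking value of 0.5 would map to a 90-degree servo command under these assumed ranges, and out-of-range inputs are clamped to the servo's end stops.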
Worcester Polytechnic Institute
All authors have granted to WPI a nonexclusive royalty-free license to distribute copies of the work. Copyright is held by the author or authors, with all rights reserved, unless otherwise noted. If you have any questions, please contact firstname.lastname@example.org.
Fitzpatrick, Robert J., "Designing and Constructing an Animatronic Head Capable of Human Motion Programmed using Face-Tracking Software" (2012). Masters Theses (All Theses, All Years). 615.
robotics, animatronics, face-tracking, RAPU, Visual Show Automation