Faculty Advisor or Committee Member

Sonia Chernova, Advisor

Co-advisor

Gregory Fischer

Identifier

etd-050112-072212

Abstract

The focus of this project was to construct a humanoid animatronic head with sufficient degrees of freedom to mimic human facial expression and head movement, and that could be animated using face-tracking software to reduce the time spent on the trial-and-error programming intrinsic to animatronics. Eight degrees of freedom were assigned to the robot: five in the face and three in the neck. From these degrees of freedom, the mechanics of the animatronic head were designed so that the neck and facial features could move with the same range and speed as a human being. Once the head was realized, face-tracking software was used to analyze a pre-recorded video of a human actor and map the actor's eye, eyebrow, mouth, and neck motion to the corresponding degrees of freedom on the robot. The resulting values from the face-tracking software were converted into the required servomotor angles using MATLAB, and these angles were fed into Visual Show Automation to create a performance script that controls the motion and audio of the animatronic head during its performance.
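The conversion step described in the abstract, turning a face-tracking measurement into a servomotor angle, can be sketched as a simple linear mapping. This is an illustrative sketch only: the project performed this step in MATLAB, and the function name, tracked ranges, and servo limits below are hypothetical.

```python
def track_to_servo(value, track_min, track_max, servo_min, servo_max):
    """Linearly map a face-tracking measurement onto a servomotor angle.

    Hypothetical helper: clamps the measurement to the tracked range so a
    noisy frame cannot command an angle outside the servo's mechanical
    limits, then rescales it into [servo_min, servo_max] degrees.
    """
    value = max(track_min, min(track_max, value))
    fraction = (value - track_min) / (track_max - track_min)
    return servo_min + fraction * (servo_max - servo_min)

# Example (assumed ranges): an eyebrow-height reading normalized to 0.0-1.0,
# driving a servo with a 30-150 degree travel.
angle = track_to_servo(0.5, 0.0, 1.0, 30.0, 150.0)  # mid-range reading -> 90.0 degrees
```

One such mapping per degree of freedom (five facial, three neck) would yield the per-frame servo angles that the performance script sequences.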

Publisher

Worcester Polytechnic Institute

Degree Name

MS

Department

Robotics Engineering

Project Type

Thesis

Date Accepted

2012-05-01

Accessibility

Unrestricted

Subjects

robotics, animatronics, face-tracking, RAPU, Visual Show Automation