Peter Balch visited a robot exhibit at his local museum and noticed that one of the most popular pieces was a robot head that would track and mimic visitors’ faces. Intrigued, Balch decided to replicate the project to learn how it was done. To do that, he first needed a robot head to work with. This Instructables tutorial explains how he built a skull-like android head that will eventually mimic human expressions.
Balch hasn’t yet tackled the facial detection and expression recognition portions of the project, which will require significant processing power. But he has built the android head that will receive the expression commands. It resembles a human skull with a copper tube framework that acts as both a support structure and a design accent. The head also has copper wire eyebrows (with heat-set insert ends) and plastic eyeballs from a cheap toy.
The robot can tilt its head up and down, rotate left and right, point its eyes in any direction, open and close its jaw, and pivot its eyebrows. That doesn’t cover the full range of human facial expression, but it does provide enough actuation for the robot to emote in a recognizable way. An Arduino Nano board drives the servos that handle the actuation. For now, the Arduino moves the servos only in response to explicit commands. But once Balch finishes the facial detection and expression recognition software, the device it runs on will send control commands to the Arduino to replicate the functionality he saw in the museum.
Read more about this on: Arduino Blog