GogginsBOT - an Animatronic Face

by mpede in Circuits > Arduino



IMG-20251205-WA0003.jpg

GogginsBot is an interactive animatronic inspired by fitness motivator David Goggins.

Its goal is simple: motivate the user during push-ups. Every time the user completes a push-up, the animatronic raises its eyebrow. Every five repetitions, the mouth moves while a motivational audio clip is played.

The interaction begins with a light sensor (LDR) placed on top of the animatronic’s head. When the user waves a hand over it, the light drop triggers a 19-second introductory message. After that, the user can start doing push-ups over a floor-mounted ultrasonic sensor, which automatically detects each repetition and triggers the animatronic’s reactions.

The Structure

IMG-20251205-WA0006.jpg

The structural base of GogginsBot is a simple plastic vase, chosen because it provides a lightweight, hollow container where all electronic components can be hidden and supported. This vase acts both as the housing and as the internal frame onto which the rest of the animatronic is mounted.

To create the character’s face, an AI-generated image of David Goggins was produced from a real photo, ensuring a perfectly frontal and symmetrical appearance suitable for animatronic movement. The printed face was glued onto a piece of cardboard to give it rigidity, and this panel was then attached to the front of the vase, forming the visible “head” of the device.

Sensors

IMG-20251205-WA0002.jpg
IMG-20251205-WA0004.jpg
IMG-20251205-WA0005.jpg

To give the animatronic its interactive behavior, two sensors were integrated into the physical structure. The first one, the LDR, was installed directly on the top of the vase. A small circular hole was cut into the upper surface so the photoresistor could sit flush with the opening and remain fully exposed to ambient light. This placement allows the system to detect the moment a user waves their hand above the head, triggering the start sequence when the light reading drops below the calibrated threshold.
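As a rough illustration of that trigger, the sketch below reads the LDR through a voltage-divider on an analog pin and fires once the reading falls under a fixed threshold. The pin number and threshold value are placeholders to be calibrated for your own lighting; they are not taken from this build.

const int LDR_PIN = A0;            // assumed wiring: LDR + fixed resistor divider into A0
const int LIGHT_THRESHOLD = 600;   // placeholder; calibrate to the room's ambient light

void setup() {
  Serial.begin(9600);
}

void loop() {
  int light = analogRead(LDR_PIN);   // 0-1023; with this wiring, covering the hole drops the value
  if (light < LIGHT_THRESHOLD) {
    Serial.println("Wave detected - start sequence");
    delay(2000);                     // crude debounce, just for this demo
  }
}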

The second sensor—the ultrasonic module—was placed inside a separate plastic vase base that acts as a dedicated floor station. Two holes were made in the front of this base so the HC-SR04 could be inserted cleanly, with both the trigger and echo transducers unobstructed. When positioned on the floor, this standalone base sits directly under the user during push-ups. As the user lowers their chest toward the sensor and rises again, the distance readings change accordingly, enabling the system to count each repetition in real time.
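A minimal sketch of a single HC-SR04 distance reading, assuming the trigger on pin 9 and the echo on pin 10 (the pin numbers are illustrative, not taken from the wiring diagram):

const int TRIG_PIN = 9;    // assumed pin
const int ECHO_PIN = 10;   // assumed pin

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);        // 10 µs pulse starts one measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // time out if no echo returns
  return duration / 58;                // roughly 58 µs per cm of round trip
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println(readDistanceCm());    // chest distance in cm, smaller as the user lowers
  delay(50);
}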

Actuators

IMG-20251205-WA0001.jpg

With the sensors installed, the next step was giving the animatronic its expressive motion. Inside the vase, just behind the cardboard face, two micro servos were mounted: one dedicated to raising the eyebrow and the other to controlling the movement of the mouth. Their placement was chosen so that all mechanics remained hidden, allowing the animatronic to maintain a clean and natural appearance from the front.

To achieve smooth and reliable mouth movement, a custom 3D-printed joint was created. This part connects the servo horn to the lower portion of the printed face, allowing the servo to open and close the mouth in a stable and controlled way. A standard, off-the-shelf linkage usually doesn't fit well in this type of build because the distance between the servo and the cardboard face depends on the diameter of the vase, the interior mounting points, and how far the face sits from the container. Printing a custom joint ensured the mouth wouldn’t protrude too much or sit too far inside, and that the motion stayed aligned with the facial features. For this reason, using a 3D-printed connector is highly recommended unless you already have a linkage that matches your vase’s exact dimensions.

The eyebrow servo, positioned above the mouth mechanism, was mounted more simply and lifts the eyebrow briefly each time a push-up is detected. This quick gesture adds personality to the animatronic and gives immediate feedback to the user.
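The sketch below shows how the two servos might be attached and exercised with deliberately small angle offsets. The pins and angle values are assumptions for illustration only; the Lessons Learnt section explains why the movement ranges should stay modest.

#include <Servo.h>

Servo mouthServo;
Servo browServo;

const int MOUTH_CLOSED = 90;   // assumed neutral position
const int MOUTH_OPEN   = 110;  // a small opening is enough for a talking effect
const int BROW_REST    = 90;
const int BROW_RAISED  = 105;  // brief lift after each push-up

void setup() {
  mouthServo.attach(5);        // assumed pin
  browServo.attach(6);         // assumed pin
  mouthServo.write(MOUTH_CLOSED);
  browServo.write(BROW_REST);
}

void loop() {
  // Quick eyebrow lift, as triggered when a repetition is detected.
  browServo.write(BROW_RAISED);
  delay(300);
  browServo.write(BROW_REST);
  delay(2000);
}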

Control Logic

Circuito.png

With the hardware assembled, the animatronic needed a control system capable of coordinating sensors, servos, and audio in real time. The Arduino program was designed around two main phases: the activation phase triggered by the LDR, and the workout phase driven by the ultrasonic sensor.
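A compilable outline of that two-phase structure could look like the state machine below. The helper functions are simplified stubs standing in for the behaviour described in the following paragraphs, and the threshold, pin, and serial token are assumptions.

enum Phase { WAIT_FOR_WAVE, INTRO, WORKOUT };
Phase phase = WAIT_FOR_WAVE;
unsigned long introStart = 0;

bool handWavedOverLdr() { return analogRead(A0) < 600; }   // assumed pin and threshold
void startIntro()       { Serial.println("PLAY_INTRO"); }  // assumed token; Python plays the clip
void updatePushUps()    { /* ultrasonic rep counting goes here */ }

void setup() {
  Serial.begin(9600);
}

void loop() {
  switch (phase) {
    case WAIT_FOR_WAVE:
      if (handWavedOverLdr()) {
        startIntro();
        introStart = millis();
        phase = INTRO;
      }
      break;
    case INTRO:
      if (millis() - introStart >= 19000UL) {   // the intro message lasts 19 seconds
        phase = WORKOUT;
      }
      break;
    case WORKOUT:
      updatePushUps();
      break;
  }
}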

When the device powers on, it continuously monitors the LDR value coming through the hole at the top of the vase. As soon as a hand passes over it and the light drops below the calibrated threshold, the system transitions into the introduction sequence. During this time, the mouth servo animates in a talking pattern while an external Python script plays a 19-second motivational message.
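One plausible sketch of the Arduino side of that handoff is shown below: the board prints a keyword over serial and toggles the mouth on a timer for the 19 seconds, while the Python script (assumed to be listening on the serial port) plays the clip. The token, pin, angles, and baud rate are all placeholders.

#include <Servo.h>

Servo mouthServo;
const int MOUTH_CLOSED = 90, MOUTH_OPEN = 110;   // assumed angles
const unsigned long INTRO_MS = 19000UL;          // length of the intro audio

void setup() {
  Serial.begin(9600);
  mouthServo.attach(5);                          // assumed pin
  mouthServo.write(MOUTH_CLOSED);

  Serial.println("PLAY_INTRO");                  // assumed token the Python script watches for
  unsigned long start = millis();
  while (millis() - start < INTRO_MS) {
    // Toggle the mouth roughly every 150 ms while the audio plays.
    bool open = ((millis() - start) / 150) % 2;
    mouthServo.write(open ? MOUTH_OPEN : MOUTH_CLOSED);
  }
  mouthServo.write(MOUTH_CLOSED);
}

void loop() {
  // Push-up mode would run here once the intro has finished.
}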

Once the introduction is complete, the device enters push-up mode. Here, the Arduino repeatedly measures the distance from the ultrasonic sensor embedded in the floor base. Whenever the user lowers their chest toward the sensor and crosses the “down” threshold, the system registers the beginning of a repetition. When the user rises back above the “up” threshold, the repetition is completed. Each successful rep triggers the eyebrow animation, giving immediate visual feedback.
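A hedged sketch of that hysteresis logic is shown below, with illustrative thresholds of 15 cm for "down" and 30 cm for "up"; your values will depend on the height of the floor station and the user.

const int TRIG_PIN = 9, ECHO_PIN = 10;           // assumed pins
const long DOWN_CM = 15, UP_CM = 30;             // illustrative thresholds

int repCount = 0;
bool goingDown = false;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH, 30000UL) / 58;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  long d = readDistanceCm();
  if (!goingDown && d > 0 && d < DOWN_CM) {
    goingDown = true;                            // chest has dropped below the "down" threshold
  } else if (goingDown && d > UP_CM) {
    goingDown = false;                           // user rose back above the "up" threshold
    repCount++;
    Serial.print("Rep: ");
    Serial.println(repCount);                    // this is where the eyebrow raise is triggered
  }
  delay(30);
}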

The software also keeps track of how many repetitions have been completed. Every time the user reaches a multiple of five, the Arduino activates the mouth animation again and sends a signal to Python to play a new motivational phrase. Importantly, all animations are implemented using non-blocking timing, allowing the Arduino to keep reading sensor data even while the servos are in motion. This ensures smooth, responsive interaction throughout the workout.
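To illustrate the non-blocking approach, here is a small sketch of how the eyebrow raise and the every-fifth-rep audio cue could be driven by millis() instead of delay(); the pin, angles, hold time, and the "PLAY_PHRASE" token are assumptions.

#include <Servo.h>

Servo browServo;
const int BROW_REST = 90, BROW_RAISED = 105;     // assumed angles
unsigned long browRaisedAt = 0;
bool browUp = false;

void startBrowRaise() {
  browServo.write(BROW_RAISED);
  browRaisedAt = millis();
  browUp = true;
}

void updateBrow() {
  if (browUp && millis() - browRaisedAt >= 300) { // hold the raise for ~300 ms
    browServo.write(BROW_REST);
    browUp = false;
  }
}

void onRepCompleted(int repCount) {
  startBrowRaise();
  if (repCount % 5 == 0) {
    Serial.println("PLAY_PHRASE");                // assumed token: Python plays the next line
  }
}

void setup() {
  Serial.begin(9600);
  browServo.attach(6);                            // assumed pin
  browServo.write(BROW_REST);
}

void loop() {
  updateBrow();                                   // runs on every pass, never blocks
  // ...sensor reading and rep detection would call onRepCompleted(repCount) here
}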

Lessons Learnt

During the development of this project, several important lessons emerged that would significantly help anyone attempting a similar build. One of the most crucial discoveries involved the ultrasonic sensor. At first, it seemed sufficient to align the Trigger and Echo transducers behind a panel with small holes. Even though the holes were centered, they weren’t large enough to fully expose the sensor, and this caused highly unreliable distance readings. The ultrasonic module is extremely sensitive to any obstruction—even a millimeter-thin overlap can distort the sound waves and produce completely incorrect measurements. The lesson here is simple: the ultrasonic sensor must be fully exposed, with absolutely nothing in front of either transducer. It should never sit behind a mesh, narrow hole, or partially obstructed surface.

A second important takeaway was the value of fully testing the system before placing everything inside the vase. Since the vase acts as a compact enclosure, accessing components once everything is installed becomes difficult. If a cable disconnects or a sensor needs to be repositioned, working inside the narrow opening can be frustrating and time-consuming. Thoroughly testing all sensors, servos, and logic on the table before final assembly prevents avoidable headaches during integration.

Finally, servo motion required careful tuning. Facial movements in this project need to be small and precise—large rotations can easily cause the servo horns or linkages to collide with the vase walls or with other components inside. Such collisions force the servo to stall, which can quickly overheat or damage the motor. Starting with minimal movement ranges and slowly increasing them only as needed proved to be the safest and most reliable approach.

Final Product
