Mechanical & Machine Design
Snapshot of this week's mechanical design, machine building, and midterm review milestones.
Swiper mechanism and coordinated tapping/swiping automation for phone interaction.
Real-time person tracking with following and stop behaviors for interactive machine control.
Complete actuation and automation system with all subsystems integrated and coordinated.
Spring-loaded phone holder mechanism and 3D-printed components.
Servo gear system and linear actuator stylus mechanism.
Wi-Fi livestreaming and on-device face detection with Edge AI.
Dual servo opposite-direction sweep pattern for synchronized tapping and swiping mechanisms.
Synchronized 4-step motion pattern (0° → 90° → 180° → 90° → 0°) for coordinated actions.
3D-printed tapper and swiper enclosures with integrated servo mounts and motion guides.
Machine building training session with xylophone demonstration.
System diagram and development timeline for midterm review.
Injection molding process overview with Dan covering mold design and machine operation.
Machine building principles, injection molding processes, mechanical design fundamentals, and midterm review preparation for final project documentation.
Design and build a machine with mechanism, actuation, automation, function, and user interface. Prepare comprehensive midterm review documentation.
Mechanical design principles, stepper motor control, real-time motion systems, injection molding workflows, and project planning.
Group machine design and manual operation, recitation notes on machine building kits, injection molding training summary, and individual midterm review documentation.
Primary references for mechanical design, machine building, and midterm review requirements.
The MIT Mechanical Design overview covers stress-strain relationships, materials selection (plastic, metal, rubber, foam, garolite, wood, cement, ceramic), fasteners, framing systems, drive mechanisms (gears, lead screws, belts), guide systems (shafts, rails, slides), bearings, and mechanical principles (academy.cba.mit.edu).
The Machine Design page covers mechanisms, structural loops, sensors, actuators, end effectors, power electronics, motion control (open-loop, closed-loop), control theory (bang-bang, PID, acceleration, model predictive), timing protocols, and machine control systems (academy.cba.mit.edu).
The Midterm page outlines required deliverables for the final project review (academy.cba.mit.edu).
Refined notes from Quentin Bolsee's machine building recitation, anchored to the Slack recap (Slack).
The control system uses a byte-passing protocol for device communication instead of address hopping.
Stepper motor control involves understanding signals for position, velocity, acceleration, jerk, snap, crackle, and pop. Reference: Stepper Motor Video.
StepDance is a modular real-time motion control system with components for inputs, interfaces, generators, kinematics, recording, outputs, and filters.
See recitation slides for additional references and detailed examples.
Wednesday presentation: Bring your machine and prepare a 15-minute presentation per machine. Win the presentation!
Design and build a machine that includes mechanism, actuation, automation, function, and user interface. Document the group project and your individual contribution.
Design a machine that includes mechanism + actuation + automation + function + user interface. Build the mechanical parts and operate it manually. Document the group project and your individual contribution.
[Placeholder: Group assignment documentation will be added here]
Actuate and automate your machine. Document the group project and your individual contribution. Prepare a demonstration of your machines for the next class.
[Placeholder: Group assignment documentation will be added here]
Document your individual contribution to group assignment 1 and group assignment 2.
I pitched and developed the initial concept for the group project, which helped initiate collaborative design discussions and whiteboard sessions. The concept evolved from a coin flipper machine to the final BrainrotBot design—a mobile robot that navigates and interacts with smartphones.
The initial design concept focused on a coin flipper machine with the following components:
A lever attached to a loaded spring under a platform flips a coin inserted into a curved box.
The lever pushes the spring-loaded platform past a stopper to actuate the coin flip.
A button activates a motor that pushes the lever, automating the coin flip.
Schrödinger's cat coin (minimal), heads or tails, 6-sided dice, 10-sided dice random number generator, magic 8-ball.
After the group settled on the BrainrotBot concept, I contributed to splitting the system into modular subsystems with defined interfaces. This modular approach enabled parallel development and clear integration points.
View subsystem breakdown document → | View subsystem references →
I contributed to key architectural decisions that separated the base chassis from the body, enabling an upgradeable design that could transition from two-wheel drive to omnidirectional drive.
Designed a phone holder with integrated passive amplifier for audio output. The design incorporates a spring-loaded mechanism for secure phone mounting and a horn-shaped amplifier for enhanced sound projection.
Developed multiple iterations of the stylus mechanism for touch screen interaction, progressing from simple manual designs to a linear actuator-driven system for precise control.
Designed a motor-driven system for tapping and swiping gestures using a linear actuator mechanism with servo control for precise horizontal movement.
Developed the camera subsystem with Wi-Fi livestreaming and edge AI inference capabilities for real-time object detection and face recognition.
The camera livestream implementation uses ESP32-S3's built-in camera and HTTP server capabilities to stream JPEG frames over Wi-Fi using MJPEG (Motion JPEG) protocol. The system initializes the camera with optimized settings for frame rate and quality, connects to Wi-Fi, and serves a continuous stream of JPEG images via HTTP multipart response.
For detailed pseudocode and implementation, see the Camera Code section in Design Files.
The Edge AI system uses a FOMO (Faster Objects, More Objects) model from Edge Impulse for real-time face detection. The model was trained on person/face classification data from the Model Zoo, converted to TensorFlow Lite format, and compiled as an Arduino library for deployment on the ESP32-S3.
The system processes camera frames through the on-device inference pipeline, outputs bounding box coordinates for detected faces, converts these coordinates to distance measurements, and sends byte packets to motor microcontroller boards for control. This enables real-time person tracking and machine interaction based on face detection.
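A hypothetical sketch of the bounding-box-to-distance conversion and byte-packet step described above. The calibration constants, packet layout, and all names here are illustrative placeholders, not the project's actual calibration or protocol:

```cpp
#include <cstdint>
#include <cmath>

// Pinhole-camera estimate: distance grows as the detected face's
// bounding-box height shrinks. Both constants are assumed placeholder
// calibration values, not measured from the project hardware.
constexpr float FOCAL_PX = 160.0f;      // assumed focal length in pixels
constexpr float FACE_HEIGHT_CM = 22.0f; // assumed real-world face height

float estimateDistanceCm(int bboxHeightPx) {
    if (bboxHeightPx <= 0) return -1.0f; // no detection
    return FOCAL_PX * FACE_HEIGHT_CM / static_cast<float>(bboxHeightPx);
}

// Pack the face's horizontal offset and distance into a small packet
// for the motor boards; this 3-byte layout is invented for illustration.
struct MotorPacket { uint8_t header; int8_t xOffset; uint8_t distDm; };

MotorPacket makePacket(int bboxCenterX, int frameWidth, float distCm) {
    MotorPacket p;
    p.header = 0xAA;                                  // sync byte (assumed)
    int off = (bboxCenterX - frameWidth / 2) * 127 / (frameWidth / 2);
    p.xOffset = static_cast<int8_t>(off);             // -127..127 steering
    p.distDm = static_cast<uint8_t>(std::fmin(distCm / 10.0f, 255.0f));
    return p;
}
```

On the real system the packet would be written out over a serial link to the motor microcontroller boards.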
Edge Impulse Model: View model in Edge Impulse Studio →
Development References: ChatGPT Transcript 1, ChatGPT Transcript 2, ChatGPT Transcript 3, ChatGPT Transcript 4
Designed the v1 GUI for manual control and monitoring of the machine's subsystems.
All design files organized by subsystem component:
Design files for the phone holder with integrated passive amplifier.
- phone-holder-print.3mf — Main phone holder 3MF file
- phone-stand-amplifier-print.3mf — Amplifier horn 3MF file

References: Spring Loaded Phone Holder (Thingiverse), Phone Amplifier Passive Speaker (Thingiverse)
Design files for the stylus mechanism.
- printable_stylus_with_built_in_stand.stl — Stylus with integrated stand

References: Printable Stylus (Thingiverse)
Design files for the linear actuator and servo-driven tapping/swiping mechanism.
- linear_motor.3mf — Linear motor assembly
- linear_motor_stylus.3mf — Linear motor with stylus mount
- Case_R.3mf, Linear_Case_L.3mf — Motor case components
- Gear.3mf, Linear_Rack_RL.3mf — Gear and rack components

References: Linear MG90S Micro Servo (Thingiverse), Linear Actuator Design (Thingiverse)
Arduino code for controlling two MG90S servo motors for tapping and swiping mechanisms.
Download Files:
- two_servo_spins.zip — Complete project for dual servo sweep test
- two_servo_spins.ino — Dual servo opposite-direction sweep control
- back_forth_test.zip — Complete project for 4-step motion test
- back_forth_test.ino — 4-step synchronized motion pattern (0° → 90° → 180° → 90° → 0°)

Vinyl sticker designs generated using the VDraw.ai black-and-white image converter to prepare artwork suitable for vinyl cutting.

- VDraw_1763512341238.png — "Swiper No Swiping" sticker design converted from original artwork
- VDraw_1763514225691.png — "Brainrot9000" logo sticker design generated from Gemini-created artwork

The VDraw.ai converter optimizes images for vinyl cutting by creating clean black-and-white designs with clear edges and minimal detail loss, ensuring successful cutting and weeding operations.
Complete design for the phone holder with integrated swiper and tapper mechanisms, including servo mounts, linear actuators, and motion guides.
- phone holder and movement v8.f3z — Fusion 360 design file (v8) for phone holder with integrated swiper and tapper mechanisms

The design includes all mechanical components for the phone holder, servo-driven linear actuators for tapping and swiping, mounting brackets, and protective enclosures for reliable operation.
PCB design files for the speaker/amplifier subsystem circuit board, including Gerber files for fabrication and design documentation.
- DFPlayer-F_Cu.gbr — Front copper layer Gerber file for PCB fabrication
- DFPlayer-Edge_Cuts.gbr — Edge cuts Gerber file defining the board outline
- pcb_design.png — PCB layout visualization showing component placement and trace routing
- pcb_schematic.png — Circuit schematic diagram showing electrical connections and component relationships

The PCB was milled using the Othermill machine following the standard operating procedures documented in Week 5 training documentation.
Arduino code for ESP32-S3 camera livestreaming and Edge AI face detection.
SETUP:
1. Initialize Serial communication (115200 baud)
2. Configure camera pins (from camera_pins.h):
- Data pins (Y2-Y9) for parallel data bus
- Control pins (XCLK, PCLK, VSYNC, HREF)
- I2C pins (SIOD, SIOC) for camera configuration
3. Create camera_config_t structure:
- Set LEDC channel and timer for clock generation
- Map all GPIO pins to camera interface
- Set XCLK frequency to 20MHz
- Set pixel format to JPEG
- Configure frame size (QVGA if PSRAM available, QQVGA otherwise)
- Set JPEG quality to 12 (if PSRAM available)
- Set frame buffer count (2 if PSRAM, 1 otherwise)
4. Initialize camera with esp_camera_init()
5. Connect to Wi-Fi network:
- Begin connection with SSID and password
- Wait until connection established
- Print local IP address
6. Start HTTP server:
- Create HTTP server configuration
- Register URI handler for root path "/"
- Set handler function to stream_handler
- Start server and print access URL
STREAM_HANDLER (HTTP request handler):
1. Set HTTP response type to "multipart/x-mixed-replace; boundary=frame"
2. Enter infinite loop:
a. Capture frame from camera (esp_camera_fb_get())
b. If capture fails, return error
c. Format HTTP multipart header:
- Boundary marker: "--frame"
- Content-Type: "image/jpeg"
- Content-Length: frame buffer length
d. Send header chunk via HTTP response
e. Send frame buffer data chunk
f. Return frame buffer to camera (esp_camera_fb_return())
g. Send boundary terminator "\r\n"
h. If any send operation fails, break loop
3. Return result status
LOOP:
- Minimal delay (10ms) to allow other tasks
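The per-frame multipart header from the stream handler (step 2c) can be sketched in plain C++; this shows only the string formatting, independent of the ESP-IDF HTTP server APIs:

```cpp
#include <cstdio>
#include <string>

// Build the header that precedes each JPEG frame in the MJPEG stream.
// The boundary string "frame" must match the one announced in the
// response Content-Type: multipart/x-mixed-replace; boundary=frame
std::string makePartHeader(size_t jpegLen) {
    char buf[96];
    int n = std::snprintf(buf, sizeof(buf),
        "--frame\r\n"
        "Content-Type: image/jpeg\r\n"
        "Content-Length: %zu\r\n\r\n", jpegLen);
    return std::string(buf, n);
}
```

The handler sends this header chunk, then the raw JPEG bytes, then a trailing `\r\n`, and repeats for every captured frame.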
Download Files:
- camera_stream.zip — Complete camera stream project (includes .ino and .h files)
- camera_stream.ino — Main Arduino sketch for camera livestreaming
- camera_pins.h — GPIO pin definitions for XIAO ESP32-S3 camera module

Edge Impulse Arduino library for FOMO-based face detection on ESP32-S3:

- ei-face-detection--fomo-arduino-1.0.90.zip — Edge Impulse Arduino library (v1.0.90)

Edge Impulse Model: View model in Edge Impulse Studio →
Group Collaboration: All design work was documented in the Slack thread after each working session, ensuring real-time communication and progress tracking throughout the project.
Co-developed servo motor control firmware and electrical connections for the tapper and swiper mechanisms with Hayley Bloch. The system uses two MG90S micro servos connected to GPIO pins on the ESP32-S3 for synchronized tapping and swiping motions. Development transcript →
| Component | Connection | ESP32-S3 Pin |
|---|---|---|
| Servo 1 (Tapper) Signal | PWM Control | GPIO1 |
| Servo 2 (Swiper) Signal | PWM Control | GPIO2 |
| Servo 1 & 2 Power | VCC (5V) | 5V Output |
| Servo 1 & 2 Ground | GND | GND |
SETUP:
1. Initialize Serial communication (115200 baud)
2. Allocate PWM timers for ESP32-S3 (timer 0 and timer 1)
3. Attach servo1 to GPIO1 with pulse range 500-2400μs (MG90S range)
4. Attach servo2 to GPIO2 with pulse range 500-2400μs
LOOP:
1. Sweep forward (0° to 180°):
- servo1: 0° → 180° (incrementing)
- servo2: 180° → 0° (decrementing, opposite direction)
- 10ms delay between steps
2. Sweep backward (180° to 0°):
- servo1: 180° → 0° (decrementing)
- servo2: 0° → 180° (incrementing, opposite direction)
- 10ms delay between steps
3. Repeat continuously
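The opposite-direction sweep reduces to a simple mirrored-angle rule. A plain C++ simulation of one forward sweep (the 10 ms delays and actual servo writes are omitted; only the commanded angle pairs are collected):

```cpp
#include <vector>
#include <utility>

// Simulate one forward sweep: as the tapper servo steps 0° -> 180°,
// the swiper servo mirrors it at 180° -> 0°. On hardware each step is
// followed by a 10 ms delay; here we just record the angle commands.
std::vector<std::pair<int, int>> forwardSweep() {
    std::vector<std::pair<int, int>> cmds;
    for (int a = 0; a <= 180; ++a)
        cmds.push_back({a, 180 - a}); // {servo1, servo2}, mirrored
    return cmds;
}
```

The backward sweep is the same sequence reversed, so the two servos stay mirrored at every step of the loop.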
SETUP:
1. Initialize Serial communication (115200 baud)
2. Allocate PWM timers (timer 0 and timer 1)
3. Attach both servos to GPIO1 and GPIO2 with 500-2400μs range
MOVE_BOTH function:
- Set both servos to same angle simultaneously
- Wait 120ms for MG90S to reach position (tunable delay)
LOOP (4-step pattern):
1. Move both servos to 90° (center position)
2. Move both servos to 180° (full extension)
3. Move both servos to 90° (return to center)
4. Move both servos to 0° (full retraction)
5. Repeat pattern
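The 4-step pattern above can be expressed as a lookup table indexed by a wrapping step counter; a minimal C++ sketch with the hardware calls omitted:

```cpp
#include <vector>

// Both servos visit 90° -> 180° -> 90° -> 0° each cycle; on the real
// MG90S hardware each step is followed by a ~120 ms settle delay.
const std::vector<int> kPattern = {90, 180, 90, 0};

int angleAtStep(unsigned step) {        // step counter wraps modulo 4
    return kPattern[step % kPattern.size()];
}
```

The main loop then just increments the step counter, commands both servos to `angleAtStep(step)`, and waits for the servos to settle.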
For complete code files, see Servo Motor Controls in Design Files.
Collaborated with Hayley Bloch on the mechanical design and 3D printing of tapper and swiper enclosures and actuators. The designs integrate servo mounting points, linear motion guides, and protective casings for reliable operation.
Designed, cut, transferred, and applied custom vinyl stickers to the assembled Brainrot9000 machine. The vinyl graphics enhance the machine's visual identity and provide clear labeling for different subsystems.
The vinyl designs were created using the VDraw.ai black-and-white image converter to prepare artwork for vinyl cutting. Two main designs were developed.
Co-designed the tapping and swiping automation system with Hayley Bloch, then assembled and troubleshot the mechanisms to ensure reliable operation. The system integrates servo-driven actuators with precise motion control for synchronized tapping and swiping actions.
Following the tapping and swiping automation, worked on early iterations of the person follower system. Shared references, helped with code logic, provided implementation code from references, discussed technical issues, and collaborated with programmers on the team to develop the face-tracking and person-following functionality.
Assembled and integrated the complete actuation and automation system with other subsystem teams. This involved coordinating the tapper, swiper, person follower, and camera systems into a unified control architecture.
Assembled the head inner subsystem, which houses the camera, display, and control electronics. Integrated this subsystem with other teams' components to create a cohesive machine head assembly.
Assembled and integrated the complete Brainrot9000 machine, bringing together all subsystem components into a fully functional automated system. Coordinated with multiple teams to ensure proper integration of mechanical, electrical, and software components.
Milled a custom PCB for the speaker/amplifier subsystem using the Othermill machine, creating the circuit board that interfaces the audio output with the phone holder amplifier system. The PCB was designed to integrate with the overall machine electronics and provide reliable audio signal routing. The milling process followed the standard operating procedures documented in Week 5 training documentation.
For complete design files including Gerber files for fabrication, see Speaker PCB in Design Files.
The midterm review was completed. On the final project site: posted a system diagram, listed tasks to be completed, made a schedule, and scheduled a meeting with instructors for a graded review.
The system diagram for the MirrorAge Intrinsic Capacity Mirror project was posted on the final project page, showing the multimodal sensing stack, on-device inference layers, and real-time feedback channels.
Updated block diagram highlighting the multimodal sensing stack (grip, voice, face, motion, wearables), on-device inference layers, and real-time feedback channels that feed the intrinsic capacity score. View full system diagram →
The remaining tasks for the MirrorAge project were listed and organized into five key areas:
A development timeline was created that aligned subsystem sprints with HTMAA milestones from Week 8 through Week 13:
A calendar hold was sent for Thursday, Nov 12 at 10:00 AM ET (38-501 conference room) per the shared HTMAA scheduling sheet. The meeting was held and the agenda covered subsystem demos, weekly documentation spot checks (Weeks 0–9), and next-sprint alignment.
The meeting slot was referenced in the midterm review schedule.
Midterm Review Completed: All required elements (system diagram, task list, schedule, and instructor meeting) were documented on the final project page midterm review section, which included featured subsystems, completed tasks, execution schedule, and review logistics.
Key concepts and processes from the injection molding training session, anchored to the Slack recap (Slack).
Injection molding is a manufacturing process for producing parts by injecting molten material into a mold. Reference: Schematic diagram of an injection molding machine.
Students can create injection molds using generic mold blanks with core and cavity components.
Reference: Injection molding animation — think of yourself as the plastic pellet traveling through the process.
[Placeholder: Contributions section will be added here]
This Week 10 documentation was created with the assistance of Cursor AI (Auto). The AI assistant helped with documentation, organization, HTML formatting, and content refinement. All design decisions, technical implementations, and project contributions documented here are my original work.
I maintained full control over all design decisions, technical implementations, content selection, and final review. All machine designs, component contributions, system architecture, and project work documented here represent my original contributions to the group project.
For full transparency, the complete transcript of AI assistance for Week 10 is available in both formats:
The transcript includes detailed documentation of all AI-assisted tasks, file management, content organization, and quality assurance processes.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.