Problem
MR visual overlays can interrupt precision work—occlusion, split attention, and constant glancing between the task and the HUD can break flow.
My name is Rodrigo A. Gallardo. I’m a graduate student at MIT in Architecture and Electrical Engineering & Computer Science, researching multimodal systems that integrate vision, haptics, and AI. My work focuses on developing interactive platforms that enhance usability, ergonomics, and task performance, with my thesis centered on textile-based actuation for soft, wearable haptic feedback.
A wireless, wrist/forearm-worn haptic band that delivers directional vibration cues—8 motors distributed around the forearm—to support mixed-reality (MR) tasks without relying only on visual overlays. The project emphasizes modularity, comfort, and fast donning/doffing, so both the electronics and the wearable structure can be reconfigured quickly for different layouts, patterns, and experiments.
Offload some guidance to the body using localized haptic patterns. For this phase, the focus was on building a functional, modular band with rich pattern control—mixed-reality integration is planned for future work.
8 vibration motors placed at different points around the forearm with driver-per-motor control, enabling different vibration effects/patterns with control over amplitude and frequency.
This phase focused on building and validating the modular haptic band itself. Mixed-reality integration will be tackled in future work.
The band runs untethered on battery power, addresses multiple motors independently, and plays repeatable directional patterns mapped to physical locations around the arm (left/right/rotate/on-array).
This wasn’t built as separate parts. The mechanical fit, wiring routing, and haptic clarity were tuned together through iteration—changes in one layer forced improvements in the others.
The focus of this final was building a functional end-to-end artifact. Mixed-reality triggers are the next integration step, but the platform and pattern library are already working and ready to be connected.
A set of vibration patterns was designed to represent different "messages":
Left: Directional cue indicating movement to the left. → View Arduino Code
Right: Directional cue indicating movement to the right. → View Arduino Code
Rotate clockwise: Sequential pattern indicating clockwise rotation. → View Arduino Code
Rotate counter-clockwise: Sequential pattern indicating counter-clockwise rotation. → View Arduino Code
All-on: All motors activate simultaneously for confirmation or attention. → View Arduino Code
Patterns can be swapped or tuned by updating effect parameters (amplitude, duration, tempo) and by remapping which motor(s) activate; a rough sketch of how one such cue can be sequenced is shown below.
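To make the pattern-to-motor mapping concrete, here is a minimal sketch of how a sequential cue could be driven with the parts listed in the bill of materials (one DRV2605L per motor behind a TCA9548A I2C multiplexer). It assumes the multiplexer sits at its default address 0x70 and uses the Adafruit_DRV2605 Arduino library; the effect number, tempo, and motor order are illustrative placeholders rather than the band's exact values.

```cpp
// Hedged sketch: sequence a "rotate clockwise" style cue around the 8-motor array.
// Assumes: TCA9548A I2C mux at its default address 0x70, one DRV2605L behind each
// mux channel (0-7), and the Adafruit_DRV2605 Arduino library. Effect numbers,
// timing, and motor order are illustrative, not the band's exact values.
#include <Wire.h>
#include <Adafruit_DRV2605.h>

const uint8_t TCA_ADDR   = 0x70;  // TCA9548A default address (assumption)
const uint8_t NUM_MOTORS = 8;

Adafruit_DRV2605 drv;             // one driver object, re-pointed by the mux

// Route I2C traffic to a single mux channel
void tcaSelect(uint8_t channel) {
  Wire.beginTransmission(TCA_ADDR);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  for (uint8_t m = 0; m < NUM_MOTORS; m++) {
    tcaSelect(m);
    drv.begin();                        // init the DRV2605L behind this channel
    drv.selectLibrary(1);               // ERM waveform library
    drv.setMode(DRV2605_MODE_INTTRIG);  // fire effects with go()
  }
}

// Play one ROM effect on one motor
void pulse(uint8_t motor, uint8_t effect) {
  tcaSelect(motor);
  drv.setWaveform(0, effect);  // slot 0: the effect
  drv.setWaveform(1, 0);       // slot 1: end of sequence
  drv.go();
}

void loop() {
  // "Clockwise rotate" cue: short clicks travel around the forearm
  for (uint8_t m = 0; m < NUM_MOTORS; m++) {
    pulse(m, 1);               // effect 1 = strong click in library 1
    delay(120);                // tempo between motors (tune to taste)
  }
  delay(1500);
}
```

Swapping the loop direction, the per-motor delay, or the effect number is how the other cues (left/right, counter-clockwise, all-on) would be built from the same primitives.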
The frame, TPU ends, and inner motor liner are separable components that can be reconfigured independently.
Motors + wiring are designed to be swappable so layouts can change "on the fly" for different experiments.
The same platform can support different motor placements, different pattern sets, and different study tasks.
A wireless wrist/forearm haptic band that delivers directional vibration cues using 8 motors around the forearm, so guidance can be felt without relying only on visual overlays.
Vibrotactile wrist/forearm cueing is commonly used for navigation and guidance. This project focuses on modularity (wearable + wiring) and per-motor pattern control for assembly-style tasks.
Hardware + documentation used while selecting parts and building the control system:
This section is meant to make the project “presentation-complete”: what existed before, and what references were used.
The table below is a retail estimate; actual line-item costs vary depending on what comes from lab stock versus new purchases.
| Item | Qty | Unit | Subtotal | Where from |
|---|---|---|---|---|
| XIAO ESP32-S3 | 1 | $7.50 | $7.50 | Seeed / lab stock |
| Haptic driver (DRV2605L breakout) | 8 | $8.00 | $64.00 | Adafruit / SparkFun / lab stock |
| I2C multiplexer (TCA9548A) | 1 | $7.00 | $7.00 | Adafruit / SparkFun / lab stock |
| Coin vibration motor (ERM) | 8 | $2.00 | $16.00 | Jameco / Amazon / lab stock |
| 9V battery | 1 | $3.00 | $3.00 | Local / lab stock |
| 9V → 5V regulator | 1 | $10.00 | $10.00 | Amazon / Adafruit / lab stock |
| PLA + TPU (prints) | — | — | $10–$20 | Shop filament |
| Stretch fabric + thread + glue | — | — | $5–$15 | Shop / local |
| Wire / connectors / copper tape | — | — | $5–$15 | Shop / lab stock |
| Estimated total | $120–$150 | |||
If you used a custom PCB instead of multiple breakouts, cost shifts mainly from breakouts → ICs + PCB fabrication.
Verified each channel could be addressed independently and that patterns played reliably across repeated runs.
On-arm checks for comfort, contact pressure, slippage, and repeatability of motor placement across don/doff cycles.
Confirmed untethered operation using a 9V source regulated to 5V while running multiple pattern sequences.
This phase validated the platform and pattern set; a future study will compare task throughput vs visual-only guidance.
A wearable haptic wristband that provides directional tactile feedback to guide assembly tasks with minimal visual dependence. The system aims to enable users to complete assembly sequences through haptic cues alone, reducing cognitive load and allowing eyes-free interaction during precision work.
Developing a "vocabulary" of haptic patterns to represent different actions. The wristband uses sequential motor activation to create perceivable directional cues—rotation, translation, and confirmation signals that users can understand without visual reference.
To validate the haptic guidance concept, I chose a tangram puzzle as a proxy assembly task. Seven geometric pieces must be arranged into specific configurations—simple enough to understand, complex enough to require spatial reasoning, and perfect for testing eyes-free assembly with haptic-only guidance.
The wristband requires independent control of multiple motors to generate distinguishable patterns. Early design work explored both vibration motors and alternative actuation methods like pneumatics and shape-memory alloys.
These initial explorations established the foundation for a haptic guidance system that could enable eyes-free assembly. The next phase would involve building functional prototypes and conducting user studies where participants attempt assembly tasks without visual feedback—relying entirely on haptic cues to position and orient objects correctly.
The tangram puzzle provides a controlled, repeatable task structure perfect for measuring the effectiveness of different haptic patterns, motor spacing configurations, and feedback timing strategies.
Machine and Workflow
Drew parts in Rhino → exported .dxf → opened in CorelDRAW. Set all vectors to Hairline and grouped by color
(black/red/blue/green) for different cut/engrave settings. Turned on the shop vacuum; focused with the stick (top surface for thin stock,
mid-thickness for thicker). In the ULS UI: set power/speed/frequency per color, positioned the job, and used the red pointer to check placement.
Characterization (notes): Material: corrugated cardboard (~1/8″). Dialed settings until edges were clean with minimal burn. Recorded settings: focus at the top surface (mid-thickness for thicker stock); power: 65%; speed: 50%.
Process notes: Within a color, cut order creates interesting toolpaths. We balanced speed and power to avoid charring while maintaining full separation, then produced the final piece shown above.
Parametric construction kit: design, lasercut, and document a press-fit kit that accounts for kerf and can assemble multiple ways.
Extra credit: include non-flat elements and combine engraving with cutting.
This week, I practiced and developed skills in soldering and embedded systems. The system I worked on was designed by Quentin Bolsee.
Components soldered:
Tools / materials: soldering iron, flux, tweezers, hot air, solder gel
Verification:
Printer: Bambu Lab P1S 3D Printer
Slicer: Bambu Studio
Goal: This week, I set out to 3D print an early version of the haptic bracelet I am designing. The design required two different materials: a rigid PLA housing for the motors, and flexible TPU bands that could stretch for donning and doffing.

Using the parametric model I created in Week 2, I completed the 3D models in Rhino. Because of its size and overhangs, this design could not have been fabricated subtractively, making additive manufacturing essential.



I went through multiple iterations to refine the design:
Overall, this process gave me a clearer understanding of material constraints and tolerance requirements for future versions of the bracelet.
Scanner/App: Reality Scan
Object: Clay object cast in dirt
This week I jumped into KiCAD. I picked it because, out of all the PCB design tools, it felt like the one with the most tutorials and documentation floating around online. It seemed like the best choice for me as a beginner.
For my final project I’ll eventually need to control five separate 3V vibration motors, each individually. At first, I tried hunting down ready-made footprint and schematic files online but couldn’t find what I needed. So instead of wasting more time, I decided to just keep it simple and use this week as practice.
The idea was to make a breakout board where I could plug things in more easily and test them out. I followed the hardware overview from lecture, added in pin sockets, and labeled SDA and SCL on pins 4 and 5. The naming conventions were a little confusing at first since they didn’t always match up, but I worked through it.







I did run into a few design rule check (DRC) warnings while routing. I used them as a checklist—fix a spacing issue, re-run DRC, repeat—until I cleared the critical ones.


Seeing the board in 3D really helped sanity-check footprints, connector clearances, and general orientation before sending anything out.


Overall, this week was more about getting my hands dirty in KiCAD than making something polished. I wanted to learn the basics: how to place components, wire things up, and troubleshoot when the layout got messy. It took a few retries, but I can already see how much smoother the workflow feels compared to when I started. Now I’ve got a simple breakout board under my belt and feel more ready to take on the complex designs I’ll need for the final project.
This week focused on taking a PCB design from schematic to physical board: milling traces, drilling holes, soldering components, and debugging connectivity issues. For my final project, I need independent control of multiple vibration motors, so I designed a board with a multiplexer to address each motor's haptic driver individually.
I designed this board in KiCad to control an array of haptic motors for my wristband project. The core challenge was routing I2C signals to multiple DRV2605L haptic drivers through an 8-channel multiplexer, allowing individual motor control from a single microcontroller.
Each DRV2605L haptic driver has configurable parameters for vibration effects. I tested different waveform libraries, amplitude settings, and timing configurations to understand the range of tactile feedback patterns the system could produce.
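As an illustration of that kind of exploration, the sketch below steps through the DRV2605L's built-in ROM effects on a single driver so each one can be felt and compared. It assumes the Adafruit_DRV2605 library and internal-trigger mode; the pacing is an arbitrary choice, and the actual test procedure on the milled board may have differed.

```cpp
// Hedged sketch of an effect sweep on one DRV2605L: play each built-in ROM
// effect in turn so the tactile differences can be compared by feel.
#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  drv.begin();
  drv.selectLibrary(1);               // ERM coin-motor waveform library
  drv.setMode(DRV2605_MODE_INTTRIG);  // trigger effects from code with go()
}

void loop() {
  // The DRV2605L ROM holds 123 predefined effects (clicks, buzzes, ramps, ...)
  for (uint8_t effect = 1; effect <= 123; effect++) {
    Serial.print("Effect ");
    Serial.println(effect);
    drv.setWaveform(0, effect);       // queue the effect
    drv.setWaveform(1, 0);            // terminate the sequence
    drv.go();
    delay(600);                       // pause so effects stay distinguishable
  }
}
```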
I used the Roland SRM-20 to mill the PCB traces and drill holes. The workflow was similar to what others have documented: export gerbers, generate toolpaths in mods, set the origin carefully, and run the 1/64" bit for traces followed by the 1/32" bit for the outline and holes.
Soldering the surface-mount components required careful attention to alignment and solder quantity. The DRV2605L drivers and multiplexer chip were the most challenging due to their small pitch and multiple pins.
After soldering, I ran into several connectivity issues. Using a multimeter, I tested continuity between components, checked for shorts, and verified that power and ground were properly distributed across the board.
Multiplexer failure: During testing, the multiplexer overheated and failed. This was likely due to a power supply issue or incorrect current limiting on one of the channels. The chip showed visible damage after the failure.
For the revised board design, I plan to:
Before using the CNC, I completed the shop's safety training. Key reminders included:
We also reviewed the standard bit types and the four cut types used in the lab: drilling, mortises/interior cuts, pockets, and profile cuts. To keep parts stable during milling, we used a thin onion skin layer that was later removed by hand.
Make (Design + Mill + Assemble) Something Big (~meter-scale)
Extra credit: no fasteners or glue

For my project, I designed a zero-waste lounge chair meant for sitting cross-legged. My goal was to use the entire 4×8 OSB sheet with minimal leftover material while creating a comfortable, low-angled form.
I started with sketches and modeled the design in Rhino, alternating between 2D nesting (for material efficiency) and 3D assembly (for ergonomics and joinery). Early versions felt too upright, so I adjusted the seat and back angles until the proportions felt balanced and supportive.
The final layout used nearly the entire sheet, with just thin perimeter offcuts.
I prepared all cut layers directly in Rhino, organizing toolpaths for drilling, pockets, and profiles. Once complete, I sent the file to the Arch FabShop for verification.
After feedback, I made small edits to the cut order and tolerances before submitting the final file. The machining was run by Geoffrey, the lab manager, while I was present during the cutting process to observe and take notes.
The parts were cut from a single OSB sheet. Once removed, I cleaned the onion-skin edges with a utility knife and sanded the press-fit joints. The pieces locked together without glue or screws, relying entirely on friction and geometry for stability.
The assembled chair supports a cross-legged sitting posture and highlights the geometry of each interlocking component. The design reflects an effort to balance ergonomics, efficiency, and zero-waste fabrication principles.
I'd probably make the chair a bit smaller next time, or design it so it can be taken apart more easily for transport. I'd also reinforce the seat area with extra supports since the middle began to flex slightly under weight. Overall, the build worked well, but those small adjustments would make it sturdier and more practical.
I wasn't able to take the chair to the show and tell because of its size; here is a picture of me struggling to move it out of the shop.
The Rhino 8 files, with toolpaths, for this project are available here: Fabrication Files
Group Assignment:
Individual Assignment:
During this week's training, Gert demonstrated how to use the oscilloscope to verify whether our motion detection sensor was working properly. By connecting the sensor to the oscilloscope, we could visualize the sensor's signal patterns in real time.
Key Learning Points:
The oscilloscope proved to be an essential diagnostic tool for understanding how analog signals respond to real-world inputs. Through this exercise, we learned to interpret signal patterns and distinguish between active detection states and idle states.
For this week's individual assignment, I designed and built a distance sensing system using a VL53L1X Time-of-Flight sensor connected to an Arduino Uno (SparkFun RedBoard) with three LEDs (red, yellow, and green). The system provides visual feedback based on the detected distance, with different colored LEDs illuminating depending on how close an object is to the sensor.
My objective was to create an intuitive distance detection system that provides immediate visual feedback. Rather than displaying numerical values on a screen, I wanted a traffic light-style indicator that anyone could understand at a glance:
This type of interface could be useful for various applications including parking assistance, collision avoidance systems, or interactive installations.
I decided to reuse the PCB board I designed back in Week 5. The reference from Adrián Torres' Fab Academy documentation turned out to be extremely helpful when planning the pin layout and connections. The board I designed proved to be well-suited not only for controlling the servo motor I used in weeks 5 and 6, but also for experimenting with various input devices.
Since I am still determining which types of sensors would be best to include in my final project, I decided to experiment with a distance sensor to understand its capabilities and limitations.
VL53L1X Distance Sensor
I selected the VL53L1X Time-of-Flight sensor by Pololu for several reasons:
Arduino Uno with SparkFun RedBoard
For this iteration, I used an Arduino Uno (SparkFun RedBoard) to streamline the prototyping process. While my Week 5 PCB board could have been used, the Arduino Uno provided:
Three-LED Visual Indicator System
I connected three LEDs (red, yellow, green) to digital pins on the Arduino to create a simple but effective distance visualization system.
The first step was to prepare the VL53L1X sensor for connection:
The VL53L1X sensor requires only four connections: VIN, GND, SDA, and SCL.
I used four female-to-male jumper cables to connect the sensor to the Arduino. The I2C pins on the Arduino Uno are A4 (SDA) and A5 (SCL).
For the LED connections, I wired the red LED to digital pin 3, the yellow LED to pin 5, and the green LED to pin 6, each in series with its resistor to GND.
Each LED requires a current-limiting resistor to prevent damage. I used 330Ω resistors, which provide appropriate current limitation for standard 5mm LEDs powered from Arduino's 5V pins.
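As a rough sanity check (assuming a typical forward voltage of about 2 V for a red LED): I ≈ (5 V − 2 V) / 330 Ω ≈ 9 mA per LED, comfortably below the ~20 mA continuous rating of a standard 5 mm LED and well within what an Arduino Uno pin can safely source.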
Before coding, I needed to install Pololu's VL53L1X library through the Arduino IDE's Library Manager.
I started with the basic example code from the Pololu library to verify the sensor was communicating properly:
//Fab Academy 2023 - Fab Lab León
//Time of Flight VL53L1X
//Basic Test

#include <Wire.h>
#include <VL53L1X.h>

VL53L1X sensor;

void setup()
{
  Serial.begin(115200);
  Wire.begin();
  Wire.setClock(400000); // use 400 kHz I2C

  sensor.setTimeout(500);
  if (!sensor.init())
  {
    Serial.println("Failed to detect and initialize sensor!");
    while (1);
  }

  sensor.setDistanceMode(VL53L1X::Long);
  sensor.setMeasurementTimingBudget(50000);
  sensor.startContinuous(50);
}

void loop()
{
  Serial.print(sensor.read());
  if (sensor.timeoutOccurred()) { Serial.print(" TIMEOUT"); }
  Serial.println();
}
The initial code didn't work as expected. Since I was confident that my board was correctly designed, I realized the issue might be in the code itself or possibly in the cable connections.
To verify the board's functionality, I modified the code to include LED interaction. Seeing a simple blinking light confirmed that the board was operating properly, but the sensor still showed no response.
I systematically checked:
After replacing the cables and double-checking all connections, the sensor began responding correctly.
Hardware Reliability Lesson: Jumper cables, despite appearing sound, can have internal breaks that cause intermittent failures. When troubleshooting sensor communication issues, replacing all cables is a quick and easy diagnostic step.
Once I confirmed the sensor was working, I developed the final code that maps distance readings to LED states:
// Distance Sensor with Three-LED Visual Indicator
// Red = Close range
// Yellow = Medium range
// Green = Far range

#include <Wire.h>
#include <VL53L1X.h>

VL53L1X sensor;

// LED pin definitions
const int redLED = 3;
const int yellowLED = 5;
const int greenLED = 6;

// Distance thresholds (in mm)
const int closeThreshold = 300;   // < 30cm = RED
const int mediumThreshold = 800;  // 30-80cm = YELLOW
                                  // > 80cm = GREEN

void setup()
{
  Serial.begin(115200);

  // Initialize LED pins
  pinMode(redLED, OUTPUT);
  pinMode(yellowLED, OUTPUT);
  pinMode(greenLED, OUTPUT);

  // Turn all LEDs off initially
  digitalWrite(redLED, LOW);
  digitalWrite(yellowLED, LOW);
  digitalWrite(greenLED, LOW);

  // Initialize I2C and sensor
  Wire.begin();
  Wire.setClock(400000);

  sensor.setTimeout(500);
  if (!sensor.init())
  {
    Serial.println("Failed to detect and initialize sensor!");
    // Blink red LED rapidly to indicate error
    while (1) {
      digitalWrite(redLED, !digitalRead(redLED));
      delay(200);
    }
  }

  sensor.setDistanceMode(VL53L1X::Long);
  sensor.setMeasurementTimingBudget(50000);
  sensor.startContinuous(50);

  Serial.println("VL53L1X Distance Sensor Ready!");
  Serial.println("Red = Close | Yellow = Medium | Green = Far");
}

void loop()
{
  uint16_t distance = sensor.read();

  if (sensor.timeoutOccurred()) {
    Serial.println("Sensor timeout!");
    return;
  }

  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.print(" mm - ");

  // Turn all LEDs off first
  digitalWrite(redLED, LOW);
  digitalWrite(yellowLED, LOW);
  digitalWrite(greenLED, LOW);

  // Activate appropriate LED based on distance
  if (distance < closeThreshold) {
    // Close range - RED
    digitalWrite(redLED, HIGH);
    Serial.println("RED (Close)");
  }
  else if (distance < mediumThreshold) {
    // Medium range - YELLOW
    digitalWrite(yellowLED, HIGH);
    Serial.println("YELLOW (Medium)");
  }
  else {
    // Far range - GREEN
    digitalWrite(greenLED, HIGH);
    Serial.println("GREEN (Far)");
  }

  delay(100); // Small delay for stability
}
The program operates on a simple but effective logic:
closeThreshold (300 mm): anything closer triggers the red LED
mediumThreshold (800 mm): distances between 300 and 800 mm trigger the yellow LED
Beyond mediumThreshold: the green LED turns on
The distance sensor with LED feedback system works reliably within the expected range. The VL53L1X sensor provides consistent readings up to approximately 2 meters in my testing environment, with excellent precision in the 0-1 meter range where the LED thresholds are set.
Key Achievements:
Measure the power consumption of an output device

Add an output device to a microcontroller board you've designed, and program it to do something.
For this week, I designed a custom PCB capable of controlling three individual LEDs.
Since my final project involves an array of 3 × 5 actuators, I created a smaller proxy PCB to test the control system on a simpler setup. The goal was to understand how to efficiently manage multiple outputs before scaling up to the full array.
Initially, I planned to design a board that could handle all fifteen actuators, but this required more complex routing and power management than I could complete within the week. To simplify the task, I focused on a three-channel LED control board, which still allowed me to explore PWM control, output regulation, and signal consistency.
I used the XIAO RP2040 as the microcontroller because of its compact form factor and sufficient I/O pins for this test configuration.

I designed the schematic in KiCad, connecting the RP2040 output pins to three NPN transistor channels to drive the LEDs safely. Each channel includes a current-limiting resistor and a clean connector footprint for modular testing.



The board was milled on the PCB mill using traces/outlines exported from KiCad. Toolpaths were generated in mods and the board was hand-soldered.
I programmed the board using the Arduino RP2040 core. Each LED was mapped to a PWM pin to test dimming and sequencing for timing/power response.
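A minimal version of that dimming/sequencing test might look like the sketch below, written against the Arduino RP2040 core. The pin numbers are placeholders for the custom board's three transistor channels, not the actual routed pins.

```cpp
// Hedged sketch of the PWM dimming/sequencing test on the three-channel board.
// Pin numbers are placeholders for the transistor channels on the custom PCB.
const int ledPins[3] = {26, 27, 28};   // assumed PWM-capable pins on the XIAO RP2040

void setup() {
  for (int i = 0; i < 3; i++) pinMode(ledPins[i], OUTPUT);
}

void loop() {
  // Step through the three channels, ramping each one up and back down
  for (int i = 0; i < 3; i++) {
    for (int duty = 0; duty <= 255; duty += 5) {   // fade in
      analogWrite(ledPins[i], duty);
      delay(10);
    }
    for (int duty = 255; duty >= 0; duty -= 5) {   // fade out
      analogWrite(ledPins[i], duty);
      delay(10);
    }
  }
}
```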

The three-channel prototype controlled each LED's brightness as intended and validated the control logic for the future 3×5 array. Next steps: scale output channels and improve power distribution for the full actuator grid.


The KiCad design files and Arduino code for this project are available here: Project Files
This week I used molding and casting to create a tangram puzzle, which I will later use as a proxy task for my final project, a haptic-guided assembly system. The tangram gives me a repeatable, table-top puzzle that still involves spatial reasoning, shape matching, and multi-step assembly.
The workflow followed the standard process from lecture:
I started by modeling a full tangram set: five triangles of different sizes, one square, and one parallelogram. I kept the geometry simple and planar so the task focuses on spatial arrangement rather than tricky undercuts.
Since I eventually want to cast multiple full sets, I modeled the tangram pieces as hollow shells instead of
solid blocks. This reduces print time and PLA usage, while still giving me clean exterior faces to mold from.
I exported the meshes as .stl files and printed them in PLA. I will link the exact model file in
this week's repo.
After printing, I cleaned up the parts, removing stringing and smoothing any obvious artifacts that might be exaggerated in the mold. I then glued the tangram pieces down to a flat base so they would not float or shift when I poured the silicone. I left a bit of space between each piece so the silicone could flow fully around them and create clear part lines.
This glued-down arrangement became the master plate: one board holding all seven shapes in place, ready to receive the silicone and produce a negative mold for the set.
For the mold, I used a two-part silicone (Part A and Part B). I measured equal parts by volume, mixed thoroughly, and tried to stir in a way that minimized bubbles. After mixing, I poured the silicone slowly into one corner of the mold box and let it flow across the tangram pieces.
The smaller triangles and the tips of the parallelogram were the most bubble-prone areas, so I gently tapped the sides of the mold and used a stick to nudge silicone into those corners. I also made sure the silicone layer was thick enough above the highest point of the pieces so the mold would not tear when demolding.
Once cured, I peeled the silicone block off the base and popped out each PLA tangram piece. The negative captured the edges and overall geometry well, including some of the original layer lines from the PLA prints, which is fine for this first iteration.
My first casting material was a gypsum-based plaster (Perfect Cast), mixed at roughly 100 parts powder to about 10–20 parts water by weight. I aimed for a smooth, pourable consistency that could still reach into the sharp corners of each tangram cavity.
I lightly dampened the silicone cavities so they would not trap dry spots, then mixed the plaster, tapped the container to bring bubbles to the surface, and poured slowly into each tangram cavity, slightly overfilling the top. After the plaster set, I demolded a full tangram set.
The plaster pieces captured the corners and edges well and have a matte, stone-like surface. They are also sandable, which means I can tune small imperfections or soften edges as needed. These plaster tangrams will work well as everyday puzzle pieces and are good candidates for painting or color-coding later.
I also experimented with metal casting into the same silicone mold using a low-melting-point alloy. The goal here was to see how the tangram behaves in a heavier material and how the weight and temperature of the pieces change the feel of the puzzle.
Before pouring, I warmed the silicone mold slightly so the thermal shock would not be as extreme. The metal cools and solidifies much faster than plaster, so the pour had to be steady but quick to avoid partially filled shapes or cold seams. Once solid, the metal tangram pieces came out with a very different tactile quality: heavier, more inert, and closer to coins or game tokens.
Some thin regions that feel a bit fragile in plaster, like the small triangle tips, feel more robust in metal, but the metal pieces are more sensitive to surface imperfections if bubbles reach the surface while cooling.
Next, I plan to cast additional sets in other materials such as tinted plaster, resin, or rubber-like compounds and eventually build a multi-material tangram:
All of this connects back to my final project. I want a standardized tangram puzzle that I can recast in multiple materials and use as a controlled task for testing haptic and mixed-reality guidance. The idea is to have users assemble specific silhouettes or configurations while receiving haptic cues from the wristband, using the tangram as a structured but flexible proxy for more complex assembly tasks.
This week was machine week. As the Architecture section, we built a numerically controlled Ouija board: a wooden board with engraved letters and symbols on top, and a hidden XY gantry with electromagnets underneath that drive a planchette across the surface. Full group documentation, electronics, and code are on the Architecture section page: Machine Week — Architecture .
The machine combines a classic Ouija board aesthetic with a fully hidden motion system. Two stepper motors move a carriage under the table; an array of permanent magnets and electromagnets couples that motion to the planchette on top so it can spell messages without any visible mechanism. My work focused on the cursor design, physical build and finishing of the planchette, and video documentation of the overall project.
I joined the cursor design subteam. We started by studying a commercial Ouija planchette to understand its proportions, how the viewing window frames letters, and how the feet sit on the board. From there we sketched a new design that matched the graphic language of the board that the rest of the section was developing.
I translated our sketches into vector linework and a 3D model in Rhino, laying out the eye, stars, and crescent motifs as curves that could be both engraved and read clearly at arm's length. In the solid model I built in pockets for magnets on the underside, a recessed seat for the acrylic lens, and enough wall thickness that the part would print cleanly without supports.
Once the model was finalized, I exported it and set up the print. We printed the cursor flat in PLA to keep the engraved details on the top surface as crisp as possible and to avoid unnecessary support material.
After printing, I cleaned up the surfaces, trimmed stray strings, and test fit the magnets and lens. Getting the magnet spacing right was important so the cursor would couple reliably to the moving magnets under the board without feeling too sticky or heavy.
To match the slightly haunted aesthetic of the board, we wanted the cursor to look worn, not like a fresh 3D print. I first painted the entire surface with black acrylic, pushing paint into all the engraved lines and letting it dry to a uniform matte finish.
Once the black layer dried, I lightly sanded the raised surfaces to bring the white PLA back through, leaving the paint inside the engraved areas. This created a high-contrast, slightly distressed look where the symbols read clearly but the cursor still felt aged and handled.
With the cursor assembled, I worked with the gantry and electronics teams to test it on the actual board. We verified that the magnets stayed aligned over long moves, that the cursor did not snag on engraved letters, and that the viewing window correctly framed single characters as it stopped over them.
I also helped document machine week. Together with my teammates, I shot video of each subsystem, recorded interviews, and captured the first full runs of the Ouija machine in action. I focused on camera work and framing, aiming to show both the hidden mechanics under the table and the more theatrical experience on top.
That footage fed into two videos on the Architecture section page: one that walks through the technical breakdown of the machine, and another more narrative piece that leans into the spooky Ouija theme. Between building the cursor and helping to tell the story of the project, this week was a nice bridge between fabrication, interaction design, and media.
How can we process remote video footage?
This week I built a wireless node using an ESP32-CAM (AI Thinker) that hosts a live video feed over WiFi. The camera acts as the local input device (video frames), and the browser-based web server acts as the local interface/output (a live stream + controls). The larger goal is to integrate this into my project so the feed can be processed (e.g., object/color detection) and used to trigger a cue.
The ESP32-CAM becomes a WiFi-connected node by joining a local network and hosting a small web server. On the network, it’s reachable by its IP address (network address), which lets any device on the same WiFi open the stream in a browser. In other words: the camera is the sensor input, and the webpage is the output/interface—both served over WiFi.
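Stripped of the camera-specific code, the "node on the network" pattern looks roughly like the sketch below: join WiFi, report the assigned IP, and serve a page at that address. The SSID and password are placeholders, and the actual camera firmware additionally exposes the live video stream, which is omitted here.

```cpp
// Hedged illustration of the networked-node pattern: join WiFi, print the
// assigned IP, and serve a tiny status page. Credentials are placeholders.
#include <WiFi.h>
#include <WebServer.h>

const char* ssid     = "MY_NETWORK";   // placeholder
const char* password = "MY_PASSWORD";  // placeholder

WebServer server(80);

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  Serial.print("Node reachable at http://");
  Serial.println(WiFi.localIP());      // the network address to open in a browser

  server.on("/", []() {
    server.send(200, "text/plain", "ESP32-CAM node is up");
  });
  server.begin();
}

void loop() {
  server.handleClient();
}
```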
The main friction points were hardware stability (secure connections) and serial configuration. I went through a few iterations of wiring and layout to make uploads consistent. A big “aha” moment was realizing the baud rate needed to match between the sketch and the Serial Monitor—once that was aligned, the output became readable and debugging moved much faster. Getting help (thanks Anthony) helped me isolate the issue instead of changing everything at once.
Once the stream was stable, I could open the live camera feed from any device on the same network. This is the key step for “remote video footage”: the camera is no longer local-only—its frames are accessible over the network and can be consumed by a browser or a processing script (e.g., OpenCV reading frames).
The same networking setup also supports the “message between two projects” requirement: once devices share a network,
they can communicate using simple web requests (HTTP) or sockets (UDP/TCP). In practice, this camera node can broadcast a lightweight
“event” (for example, a detection result like color=red or object=person) to a second project, which can then
respond (e.g., trigger a haptic cue, change a UI state, or log the event).
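A hedged sketch of that "detection → message" idea follows: the camera node sends a small HTTP GET to a second device on the same network when it has a result. The peer address, endpoint name, and query format are placeholder assumptions, not an implemented protocol.

```cpp
// Hedged sketch: send a lightweight key=value "event" from this node to a
// second project over HTTP. Network credentials and peer URL are placeholders.
#include <WiFi.h>
#include <HTTPClient.h>

const char* ssid     = "MY_NETWORK";               // placeholder
const char* password = "MY_PASSWORD";              // placeholder
const char* peerUrl  = "http://192.168.1.50/event"; // assumed address of the second project

// Send a small key=value event to the peer over HTTP
void sendEvent(const String& key, const String& value) {
  HTTPClient http;
  http.begin(String(peerUrl) + "?" + key + "=" + value);  // e.g., /event?color=red
  int code = http.GET();
  Serial.printf("Event sent, HTTP %d\n", code);
  http.end();
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  sendEvent("color", "red");   // the peer could respond by triggering a haptic cue
}

void loop() {}
```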
The key outcome is that the camera became a wireless networked node: the video feed is transmitted and accessed over WiFi through a browser interface, and the stream can be treated as remote footage for processing (frame-by-frame) and future “detection → cue” integration.
For interface week, I explored how an RC522 RFID reader could act as a physical "mode selector" for a browser-based GUI. The goal was to scan a tag, identify its UID, and use that signal to drive a simple web interface for selecting different haptic patterns (instead of relying on buttons or menus on the device itself).
The intended interaction (sketched in code after the list) was:
• The RC522 detects an RFID tag and reads its UID over SPI
• The microcontroller updates an internal "current UID" state
• A tiny local web interface uses that state to enable pattern selection (dropdown/buttons)
• The chosen pattern becomes the active output mode (for later integration with the haptics)
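Here is a rough sketch of that device-side flow under some assumptions: an ESP32-class board reads the tag UID with the MFRC522 Arduino library, stores it as the "current UID" state, and exposes it through a tiny HTTP endpoint the browser GUI can poll. Pin numbers, network credentials, and the endpoint name are placeholders, not the wiring or firmware actually used this week.

```cpp
// Hedged sketch: read an RFID UID, keep it as the current state, and expose it
// via a tiny web endpoint. Pins, credentials, and endpoint name are placeholders.
#include <SPI.h>
#include <MFRC522.h>
#include <WiFi.h>
#include <WebServer.h>

#define SS_PIN  5                      // placeholder SPI chip-select pin
#define RST_PIN 4                      // placeholder reset pin

const char* ssid     = "MY_NETWORK";   // placeholder
const char* password = "MY_PASSWORD";  // placeholder

MFRC522 rfid(SS_PIN, RST_PIN);
WebServer server(80);
String currentUID = "";                // internal "current UID" state

void setup() {
  Serial.begin(115200);
  SPI.begin();
  rfid.PCD_Init();

  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  // The browser GUI polls this endpoint to decide whether to unlock its controls
  server.on("/uid", []() {
    server.send(200, "text/plain", currentUID);
  });
  server.begin();
}

void loop() {
  server.handleClient();

  // Update the state whenever a new tag is read
  if (rfid.PICC_IsNewCardPresent() && rfid.PICC_ReadCardSerial()) {
    currentUID = "";
    for (byte i = 0; i < rfid.uid.size; i++) {
      if (rfid.uid.uidByte[i] < 0x10) currentUID += "0";
      currentUID += String(rfid.uid.uidByte[i], HEX);
    }
    Serial.println("Tag UID: " + currentUID);
    rfid.PICC_HaltA();
  }
}
```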
On the networking side, I had partial success: I was able to connect to a 2.4GHz hotspot and validate the "tiny server" idea. The main blocker this week was not the web UI—it was the RFID read reliability across different boards and wiring configurations.
A key issue was inconsistent tag detection. Even when the serial monitor showed "Ready. Scan a tag…", the reader sometimes wouldn't register the UID. I used a low-level SPI sanity check to confirm whether the RC522 was actually responding.
Below is the minimal sketch I used to confirm SPI communication by reading the RC522 VersionReg (0x37).
If the reader isn't wired correctly, you often see 0x00 or 0xFF instead of a real value.
#include <SPI.h>

#define SS_PIN 10

// RC522 register for version is 0x37
byte readReg(byte reg) {
  digitalWrite(SS_PIN, LOW);
  SPI.transfer(0x80 | ((reg << 1) & 0x7E)); // MSB set = read operation
  byte val = SPI.transfer(0x00);
  digitalWrite(SS_PIN, HIGH);
  return val;
}

void setup() {
  Serial.begin(115200);
  pinMode(SS_PIN, OUTPUT);
  digitalWrite(SS_PIN, HIGH);
  SPI.begin();
  SPI.beginTransaction(SPISettings(4000000, MSBFIRST, SPI_MODE0));
  delay(50);

  byte v = readReg(0x37);
  Serial.print("RC522 VersionReg = 0x");
  Serial.println(v, HEX);
  Serial.println("Expected: 0x91 or 0x92 if wiring is correct.");
}

void loop() {}
On the browser side, I drafted a simple "Haptic Pattern Selection" interface with a dropdown for patterns. The intent was to have the page respond to RFID state (e.g., tag present → unlock controls; tag removed → reset). Even though the RFID read reliability limited the full demo this week, the GUI structure is ready to connect to the live device state once the reader loop is stable.
→ View Haptic GUI Code
This week clarified the separation between "interface logic" and "hardware reliability." The tiny-server + GUI approach is straightforward, but the RFID reader has to be completely dependable for the interaction to feel seamless. Next, I want to lock down a single board + stable wiring, then connect UID events to the GUI so that scanning a tag becomes a clean, physical gesture for selecting and triggering haptic patterns.
For wildcard week, I explored 2D inflatables using ultrasonic welding. I took the sleeve cut out from my final project and fabricated an inflatable air chamber to test how pneumatics would work at this scale and on the arm. This was a practical experiment to validate the feasibility of pneumatic actuation for my wearable project before committing to the final design.
The goal was to create a working inflatable chamber that mimics the form factor of my final project sleeve. By testing inflation and deflation cycles, I could evaluate the material behavior, seam strength, and overall effectiveness of pneumatics as an actuation method for a wearable garment. This prototype would help answer key questions about comfort, range of motion, and the mechanical properties needed for the final system.
I participated in the 2D inflatables workshop led by Alfonso and Erik, learning the ultrasonic welding technique for creating airtight seams in thermoplastic materials. Ultrasonic welding uses high-frequency vibrations to generate localized heat, melting and fusing the material layers together without adhesives or external heat sources.
Materials tested: The workshop provided several options, including cheap nylon food-sealing bags for sketching, pure PA6 nylon (extremely strong and tear-resistant), polyimide with a shiny deposited layer requiring directional welding, and colorful camping materials with TPU layers. TPU (thermoplastic polyurethane) proved to be the most stretchy and weldable material for my application.
Process setup: The welding stack consists of a silicone base layer, the material to be welded, and a Teflon top layer that conducts heat while allowing the vacuum system to function. The seam width is approximately 2mm with a 4mm kerf, requiring careful planning in the design phase.
Design considerations: Seam widths between 2-6mm work well for crumpling and flexibility. Rounded corners are critical—sharp corners cause the welding head to lift up, adding excess heat and potentially creating holes. The recommended welding speed is around 20mm per second for TPU materials.
I designed a 2D pattern based on the sleeve geometry from my final project. The pattern needed to account for seam allowances, inflation expansion, and the curvature of the arm when worn. I created the design in vector format (DXF) to ensure compatibility with the ultrasonic welding machine's path planning system.
The fabrication session required careful material handling—aligning the layers precisely, setting the correct welding parameters, and testing small sections before committing to the full seam. The high-pitch sound of the ultrasonic welder was notable, but the process itself is relatively safe with localized heat generation and no toxic fumes compared to heat sealing methods.
After welding the chamber, I tested inflation using compressed air to evaluate several performance factors:
Seam integrity: The ultrasonic welds held pressure well with no leaks at the seams. The rounded corners I incorporated prevented the stress concentrations that could have caused failures.
Material behavior: The TPU material exhibited good elasticity and recovery, inflating smoothly and deflating without creasing or permanent deformation. The stretchiness of TPU was particularly important for wearable applications where the inflatable needs to conform to body movement.
Scale and wearability: Testing the chamber on my arm confirmed that pneumatic actuation at this scale is viable. The inflation provided noticeable but not restrictive pressure, and the sleeve maintained its position during movement. This validated my approach for the final project.
During the workshop, I learned about an interesting hybrid technique: 3D printing directly onto vacuum sealing bags to create rigid attachment points or reinforcement structures. By clipping the bag to the print bed, you can deposit PLA onto the plastic surface, creating integrated hard points for pneumatic fittings, mounting brackets, or structural reinforcement without adhesives.
This wildcard week experiment provided crucial data for my final project design. I confirmed that:
• TPU is the ideal material for wearable pneumatic actuators due to its flexibility and weld strength
• Ultrasonic welding creates reliable, airtight seams suitable for repeated inflation cycles
• The pneumatic system can generate sufficient force without being uncomfortable or restrictive
• Pattern design must prioritize rounded corners and appropriate seam widths for durability
• Inflation and deflation timing will be fast enough for responsive interaction
With these findings, I can move forward confidently with pneumatic actuation in my final project, knowing the fabrication method is sound and the materials will perform as needed. The ultrasonic welding technique opens up possibilities for creating complex, multi-chamber systems that could provide more sophisticated haptic feedback or motion assistance in future iterations.
For additional examples and inspiration on inflatable structures, I referenced Yue Yang's Kiri Inflate work, which demonstrates sophisticated patterns and applications for 2D inflatables. The workshop materials and technical guidance from Alfonso and Erik were invaluable for understanding the nuances of ultrasonic welding parameters and material selection.