You found me! I am Annie Li. My research focuses on
Reinforcement Learning and Bayesian Optimization in
Large Language Models (LLMs) and machine control, and I am especially
interested in how humans collaborate with AI in areas such as prosthetics
and fabrication, where machines extend human ability.
This site documents
my progress in MAS.863 How to Make (Almost) Anything, taught by Neil Gershenfeld, through
14 weekly projects and a final project, with notes written to be
clear enough to fully reproduce my work step by step.
I am reachable through email:
annieliy@mit.edu.
“Run toward the hardest problems. This approach has helped me to learn a tremendous amount from both success and failure.”
“We're so excited about technology. We can help turn the impossible into the possible.”
—— Lisa Su
“The people who are crazy enough to think they can change the world are the ones who do.”
—— Steve Jobs
Final Project — Marauder’s Map of MIT
The MIT Hackers Map! It tells you all the official and unofficial places you're at via live tracking. Maybe your professor is watching you...
In the future, I aim to integrate features that track people, e.g. a proximity alert for your professor!
System Overview
The system continuously scans nearby Wi-Fi access points, compares their
BSSIDs and signal strengths against known location fingerprints,
and maps the result to a specific MIT building or landmark. Once a location
is detected, the TFT screen updates with a customized message and visual effect.
Schematic again for Seeed XIAO ESP32S3 driving 2.8" SPI TFT display (ILI9341).
TFT display soldered onto board, check finals week for my location tracking code!!
Location display.
Location Detection via Wi-Fi Fingerprinting
Instead of relying only on GPS or cloud-based geolocation APIs, I implemented the Google Geolocation API (outdoors) plus a
local Wi-Fi fingerprinting approach (indoors); the fingerprinting works reliably indoors
and does not require external services.
Each location (e.g. Building 38, Killian Court, Media Lab) is represented by
a small set of known Wi-Fi access point BSSIDs. When the ESP32 detects one or
more of these BSSIDs above a signal strength threshold, it infers the
corresponding location.
Arduino logic for scanning nearby Wi-Fi networks and matching BSSIDs.
In this section, for example, Building 38 is identified by the strong and consistent EECS Shop and EECS Lab networks.
Arduino logic for displaying the matched position using boolean flags.
This approach allowed me to create a flexible, expandable location system:
adding a new building only requires collecting its Wi-Fi fingerprints and
defining a new detection function.
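The matching logic can be sketched hardware-free. Note the BSSIDs, the threshold, and the location names below are illustrative placeholders, not my real fingerprints; on the device, the scan list would come from the ESP32 Wi-Fi scan rather than a hand-built vector.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// One scanned access point: BSSID (MAC string) and signal strength in dBm.
struct ScanResult {
    std::string bssid;
    int rssi;
};

// Hypothetical fingerprint table: location name -> known BSSIDs.
const std::map<std::string, std::vector<std::string>> FINGERPRINTS = {
    {"Building 38", {"aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"}},
    {"Media Lab",   {"aa:bb:cc:dd:ee:10"}},
};

const int RSSI_THRESHOLD = -75;  // ignore weak, unreliable signals

// Return the location whose fingerprints match the most strong APs,
// or "unknown" if nothing matches above the threshold.
std::string inferLocation(const std::vector<ScanResult>& scan) {
    std::string best = "unknown";
    int bestHits = 0;
    for (const auto& [name, bssids] : FINGERPRINTS) {
        int hits = 0;
        for (const auto& ap : scan) {
            if (ap.rssi < RSSI_THRESHOLD) continue;
            for (const auto& known : bssids)
                if (ap.bssid == known) ++hits;
        }
        if (hits > bestHits) { bestHits = hits; best = name; }
    }
    return best;
}
```

Adding a building then amounts to appending one entry to the fingerprint table.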
(This is a Dropbox link; anyone with the link can download from it. Ignore "can't load this file" — that's just because Dropbox doesn't support previews for this file type.) If you cannot download it, please email me at annieliy@mit.edu.
TFT Output Design
The TFT display acts as the primary user-facing output device. When a location
is detected, the screen displays a welcoming message, the user’s presence,
and a themed visual effect.
Different locations trigger different color schemes and animations, reinforcing
the idea of a “living map” that reacts to the environment.
TFT display showing a detected location and custom message.
For example:
✅Building 38: Green text with matrix-style animation
🍀Killian Court: Formal welcome message and dome-inspired effects
🧠Media Lab: Glitch effects and high-contrast visuals
🎃Easter eggs: Fun proximity alerts (e.g. “Police Car on the Dome”)
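The location-to-theme mapping above can be sketched as a small lookup; the RGB565 values and animation names here are illustrative stand-ins for my actual effect routines, which on the device would call into the TFT drawing code.

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Hypothetical theme descriptor; on the device these would map to
// TFT color constants and animation routines.
struct Theme {
    uint16_t color;        // RGB565 text color
    std::string animation; // which effect routine to run
};

// Map a detected location to its display theme, mirroring the list above.
Theme themeFor(const std::string& location) {
    if (location == "Building 38")   return {0x07E0, "matrix_rain"};   // green
    if (location == "Killian Court") return {0xFFE0, "pulsing_dome"};  // yellow
    if (location == "Media Lab")     return {0xF81F, "glitch"};        // magenta
    return {0xFFFF, "idle"};  // unknown location: white, no animation
}
```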
Live Location Tracking!!
In the demo video below, I walk through different parts of campus while the device
dynamically updates its output based on the detected location.
I added effects such as Matrix Rain, a glitch effect, and a pulsing dome so it looks cool! Check out the final display effect below:
Interaction Flow
The ESP32 continuously scans nearby Wi-Fi networks
Strong signals are filtered and matched against known fingerprints
A location is inferred based on detected BSSIDs
The TFT screen updates with location-specific text and animations
If no match is found, the system remains in an “unknown location” state
I 3D printed MIT letters and painted them with acrylic paint.
The outer device is assembled from 3D-printed parts.
Final shape of device coming together.
HTMAA Convention - Annie's stand & all semester highlights!
Midterm Review Content + Plans & Task Status before Midterm
Project Direction: A compact handheld device that displays your approximate MIT location on a small OLED screen. The microcontroller (ESP32S3) receives simple coordinate or building-number data from a phone and prints phrases like
"You are near Building 26" or "East Campus" or "Lobby 10".
What it Does
ESP32 connects to a phone (BLE or WiFi) and receives lightweight location data.
Data is mapped to MIT areas (e.g., “Main Campus”, “E62 area”, “Building 32”).
Device displays location text on a 0.96" OLED (SSD1306).
Optional: capacitive touch to toggle views or refresh location.
All electronics sit on a custom-milled PCB fitted inside a stylish 3D-printed handle frame.
Progress So Far
Selected microcontroller: ESP32 (built-in BLE + easy OLED support).
Chosen display: SSD1306 miniature OLED (I²C).
Identified phone → ESP32 communication method (BLE or WiFi hotspot).
Tested OLED text rendering (“Labubulu!” demo).
Initial PCB concept: ESP32 + pads for OLED + power + optional touch zones.
Planned enclosure: handheld cage-style mount inspired by professional camera rigs.
Next Steps (Before Final)
Design and mill custom PCB: ESP32, I²C pins, USB power, optional capacitive pads.
3D-model the handle/body and print a first fit-test.
Write ESP32 firmware:
BLE service for receiving small location packets
Parsing coords → location labels
Display transitions and screen formatting
Write simple phone script/app to send location to ESP32.
Assemble PCB, attach OLED, test BLE connection.
Record 1-minute demo video + final documentation.
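The "parsing coords → location labels" step above could be sketched as a lookup over bounding boxes. The regions and coordinates below are made-up placeholders (not surveyed MIT geometry), just to show the shape of the firmware logic.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical axis-aligned regions in a local campus coordinate frame
// (meters from an arbitrary origin); real values would need to be surveyed.
struct Region {
    std::string label;
    double xMin, xMax, yMin, yMax;
};

const std::vector<Region> REGIONS = {
    {"Building 26", 0,   50,  0,   40},
    {"Lobby 10",    60, 100,  0,   30},
    {"East Campus", 120, 300, -50, 80},
};

// Translate a received coordinate packet into a display phrase.
std::string labelFor(double x, double y) {
    for (const auto& r : REGIONS)
        if (x >= r.xMin && x <= r.xMax && y >= r.yMin && y <= r.yMax)
            return "You are near " + r.label;
    return "Unknown location";
}
```

On the device, the BLE callback would parse the incoming packet into (x, y) and hand it to a function like this before drawing to the OLED.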
Hardware Overview
ESP32 Dev Module (BLE + WiFi)
0.96" OLED Display (SSD1306 I²C)
Custom milled PCB
USB-C power or LiPo battery + charger
Optional capacitive touch pads
3D-printed handheld “rig” enclosure
Status: Core architecture decided. OLED tested. BLE communication pathway defined. Ready for PCB design, enclosure modeling, and firmware.
Quick stickers to celebrate MAS.863! I traced the course marks and icons,
tuned stroke weights for vinyl, and cut them as high-contrast decals.
I experimented with ChatGPT to generate initial vinyl-sticker mockup PNGs. Then I vectorized them, adjusted paths in Inkscape, and prepared them for the vinyl cutter. Through this process, I learned the importance of clean outlines, kerf considerations, and simplifying shapes for weeding.
Reference artwork used for the vinyl traces. Final cut & not-yet-weeded stickers.
Process of tracing and preparing the design in Inkscape.
Process
Software & Setup.
Import the design into the vinyl-cutter software; connect the cutter to power and your computer.
Load Vinyl.
Feed the roll; choose Roll on the machine; align so the pinch rollers sit on the grit rollers.
Adjust Blade.
Set blade depth so it cuts the vinyl but leaves the paper backing intact.
Send Design.
Send the job from software. Set pixels to 200 (≥ 72; adjust as needed),
click Calculate, then Connect to the cutter and start the cut.
Unload & Apply.
Remove the sheet, weed the excess, apply application tape, then transfer to the surface.
Learning from Mistakes
In the bottom-left corner, I made a vinyl-cutter beginner mistake while designing the "Parametric Design" sticker. To get the final product, I had to remove the little squares inside the net-like structure bit by bit...
Therefore, use simple shapes & straight lines in vinyl-cutter designs.
We tested power/speed and measured kerf; our press-fit offset was +0.18 mm.
Parametric Press-fit Kit
Material: 3 mm cardboard
Parameters: thickness, tab_width, angle
Multiple assemblies: cube, lantern, lantern without handle
Extra: living-hinge arc panel (not flat) (didn't get to it this time; will try next time!)
Project Demo
My laser-cut design layout notes.
Laser-cut design layout with parameters labeled from Fusion 360.
Design Process
I parameterized all pieces around thickness, tab_width, and the
overall panel width. The side panels are mirrored; the front and back share
geometry; and all slots subtract 2 × thickness from the internal clearances.
Tabs are evenly spaced and sized to press-fit with the measured kerf offset (+0.18 mm).
I iterated through test cuts to tune slot_clearance for cardboard springiness, then finalized
the handle, base, and top.
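The kerf compensation above can be written as a tiny parametric helper. Treat this as a sketch under stated assumptions: the laser widens each cut by roughly one kerf width, and the measured +0.18 mm press-fit offset is added to the drawn slot width; the exact sign conventions vary by workflow, so always confirm with a test cut.

```cpp
#include <cassert>
#include <cmath>

// Drawn slot width for a press-fit joint, in mm.
// Assumption: the finished slot ends up wider than drawn by ~one kerf,
// so we subtract the kerf, then add the tuned press-fit offset.
double drawnSlotWidth(double stockThickness, double kerf, double offset) {
    return stockThickness - kerf + offset;
}
```

For the 3 mm cardboard, an illustrative kerf of 0.2 mm with the +0.18 mm offset would give a drawn slot of 2.98 mm; the real numbers come from the measured test cuts.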
Project Demo — design → cut → assembly (Week 1)
First Time Laser Cutting - Bringing 2D Designs into 3D Reality
I have some previous experience designing 3D shapes in CAD, but I had never
actually built them in real life. This was my first time using a laser cutter,
and I was instantly hooked—I couldn’t stop wanting to learn more.
(Shout out to Jesse for his amazing teaching style;
I didn’t zone out for a single moment!) I’m officially addicted now.
I initially set my slot width to slightly more than 2 mm (2.94 mm), based on recommendations
from my research[2]. But when I went to the makerspace and measured the actual
cardboard thickness, I became concerned that the slots might not fit
properly with the tabs.
I joined
Tushar
and
Mariam
for my first time laser cutting. We helped each other in the laser cut process
(such as Tushar’s magnificent use of the Blue laser engraving) and in CAD
(such as slot width, cardboard thickness, and kerf)!
First Time Reaction
I was fascinated by how the laser cutter decided its path—it seemed to cut
here and there instead of following one continuous line. Small fires and little
bursts of smoke appeared from time to time, yet the cutter moved with incredible
precision. It was mesmerizing to watch!
Laser Cutting
Big shout out to
Jesse
, for walking me through the laser cutter & safety training detail by detail, and watching me do the whole process while giving instructions!
Quick Steps
Wear safety glasses.
Plug in the USB drive.
Open the USB and locate your .svg or .dxf file.
Open Inkscape → New document.
File → Import… and choose the file from the USB.
Select all (⌘/Ctrl + A) and scale to fit the page.
Use ⌘/Ctrl + mouse wheel to zoom.
Set stroke color by intent:
Red — Cut through
Blue — Engrave / score
Print (⌘/Ctrl + P) → click More settings.
Material: Cardboard
Set Power and Speed (Please see Test Result Notes below). Hit Apply and OK, then Print.
Put material in laser cutter. Set XY home by bringing the icon to the edge of pattern, check inside the laser cutter to see if the laser is out of material margin.
Optional Focusing: Place the focal tool on the material. Manually lower the laser head or raise the machine's bed until the nozzle gently touches the top of the focus tool, pressing Ʌ/V on the laser cutter to adjust the Z-axis height (if people used the laser cutter before you, there is no need to adjust the Z height). Remove the focus tool from under the nozzle.
Last Check: Make sure the gas assist and the laser cutter are turned on.
Initiate Cutting: Start the cutting job from the control panel.
Finishing: Remove Parts, clean the area. Apply any necessary finishing touches, such as cutting, painting, or assembly.
First Test Cut
Test Results
Setting | Value | Observation
--- | --- | ---
Power | Red 100% | —
Power | Blue 100% | First layer of cardboard cut through
Power | Blue 75% | Engraving color too light
Power | Blue 85% | Perfect
Speed | Red 30% | First cut: did not cut through. Second cut: smoke & small fires, still incomplete; needed a knife.
Speed | Red 20% | Mostly cut through, but the second layer needed a knife. Result resembled a perforated Amazon box.
Speed | Red 15% or lower | Recommended to test
Speed | Blue 100% | —
Set Power and Speed
Set XY Home
Learning from Mistakes
My second attempt at laser cutting did not work for my final product, because my dimensions and ratios were off once I imported and adjusted them on the laser cutter's computer.
I also reused a cardboard sheet that already had sections cut out. In my second cut, the pattern overlapped with the cutout from trial 1, causing the laser to shoot through into the honeycomb bed underneath the material. I will make sure to avoid this next time; I had to run a second cut to get the shape!
Kerf
In my test trial, my tabs & slots were a loose fit due to kerf from cutting a second time.
In my actual laser cutting session, my tabs were too big (all the bad thoughts of having to go back to Fusion; my life flashed before my eyes), but luckily it was cardboard, so I pinched the tabs until they fit into my slots!
Engraving in Laser Cutting
Engraving vs. cutting in Inkscape and the lab software:
I follow the lab convention Blue = Engrave, Red = Cut, and set
power/speed based on the table above.
Steps
Separate geometry by operation. Select paths you want to engrave (fills, hatch, logos) and those you want to cut (outer profiles).
Assign stroke colors:
Blue (#0000FF) = Engrave
Red (#FF0000) = Cut
In Inkscape: Object → Fill and Stroke → Stroke paint. Set Fill = None. In Stroke style, use a thin width (e.g., 0.1 mm) for clarity.
Check real size. File → Document Properties to confirm page units (mm/in). With your artwork selected, verify W/H on the toolbar. Lock proportions and type exact dimensions if needed.
Export for the laser. Save as Plain SVG (or PDF if your lab prefers). Keep colors intact.
Map colors to processes in the laser software.
Blue (Engrave): set Speed high / Power lower for surface mark.
Red (Cut): set Speed lower / Power higher for through-cut.
Use the lab’s recommended values from the table above, then run a small test square.
Focus, frame, test. Check focus height, run a frame pass, and do a 10–20 mm test to confirm mark depth and cut-through without charring.
Engraving setup & color-layer mapping workflow.
Engraving mistake: tab dimensions mismatch with the front panel.
What Went Wrong (and Fix)
The engraved tab should match the front tab dimensions, but they differ here because I didn’t
re-check sizes in Inkscape before cutting. Fix: verify units and W/H for both parts,
use Align and Distribute to compare, and run a paper print check before sending to the laser.
Quick Pre-Flight Checklist
Units correct (mm or inches) and page size set.
All cut paths = Red, all engrave paths = Blue, fill = none.
No duplicate/stacked paths (Edit → Select Same → Stroke color, then Path → Combine as needed).
Dimensions match your sketch/CAD; do a paper print at 100% to confirm fit.
Test swatch for power/speed per the table; adjust if edges char or don’t go through.
Write & test programs that interact with local I/O (LEDs, touch, OLED).
Communicate data to the computer (serial + live plotting).
Extra: I have made plans to try PlatformIO and ESP32, but I don't know where to start yet.
Local Input & Output
1) LEDs
Following Quentin Bolsée's example, I defined pins for red, green, and blue LEDs and blinked them in sequence to verify digital GPIO control.
2) Touch Sensing on Copper Board
The copper pads on my custom board act as capacitive sensors. When a finger touches a pad, the effective capacitance increases. The RP2040 measures this by timing charge/discharge cycles; the measurement rises on touch, and plotting those values yields a real-time graph of touch intensity.
3) OLED Display (I²C)
Using the Adafruit SSD1306 library, I initialized a 128×64 display and printed text (e.g., “Labubu!”). This validates I²C communication between the microcontroller and display.
Communication with the Computer
The sketch streams touch values over Serial. In the IDE’s plotter, touches show up as spikes, satisfying the “interact + communicate” requirement.
Physics Rationale (Why Numbers Rise When I Touch)
A capacitor stores charge: Q = C · V. The copper pad and ground form a capacitor with air (and board materials) as the dielectric. Your body behaves like a large conductive object near ground; touching the pad increases the pad’s capacitance C. The microcontroller senses this via timing, so the reported value increases—what you see as rising numbers and peaks in the plot.
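As a numeric illustration of this, here is an idealized RC charging model (not the RP2040's exact internals): the time for the pad to charge to a fixed threshold voltage grows with capacitance, which is why the reported count rises on touch. All component values below are illustrative.

```cpp
#include <cassert>
#include <cmath>

// Time (seconds) for an RC-charged pad to rise from 0 V to vth,
// from the standard charging curve: t = R * C * ln(Vcc / (Vcc - Vth)).
double chargeTimeSeconds(double R, double C, double vcc, double vth) {
    return R * C * std::log(vcc / (vcc - vth));
}
```

Doubling C doubles the charge time, so a finger that adds capacitance directly lengthens the timed interval the microcontroller measures.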
Reflection
This week tied together sensing (capacitive touch), processing (RP2040), and output (LEDs/OLED) with simple physics. Arduino made it straightforward to connect hardware effects to visual feedback.
My Week 3 3D printing project is inspired by the traditional
玲珑球 (Línglóng Ball/Nested Ball), an extraordinary hand-crafted artifact
from the Qing Dynasty. These intricate puzzle balls, carved from a
single piece of ivory, often contain multiple concentric spheres
nested inside each other, each freely rotating. They reflect the
pinnacle of Chinese artisanship: precision, patience, and the
pursuit of impossibility.
What fascinates me is how these artifacts parallel the challenge of
non-subtractive design in digital fabrication. Just like artisans
achieved the impossible by hand, 3D printing allows us to create
nested and interlocked geometries that machines cannot carve
subtractively. My design takes direct inspiration from this tradition,
reimagined through modern additive manufacturing.
I also drew inspiration from Lingdong, an alumnus of this class,
who created a modern reinterpretation of the 玲珑球/Nested Ball in his own Week 6
project.
Fusion 360 Design
Here is my own attempt to reinterpret the Nested Ball using Fusion 360.
I modeled nested shells with Voronoi-patterned cutouts using Meshmixer, capturing the spirit
of the original hand-carved puzzle balls, while showcasing what 3D
printing makes uniquely possible.
Torus Knot
Finished Torus Knot checking for continuity.
Inspiration
This piece reinterprets the flowing, braided rhythm of traditional Chinese jade bracelets known as
绞丝 / 麻花镯. The form is a continuous tube that follows a circular path and twists
multiple times, creating a braided silhouette that reads as soft silk but is printed as a single body.
This design is inspired by the braided jade bracelets (绞丝纹 / 麻花手镯) of the Qing Dynasty,
important artifacts in the Shanghai Museum that exemplify the elegance of rhythmic linework
and the technical mastery of jade carving. These bracelets are considered standard forms of
Qing court art, with the twisted silk-like pattern representing both aesthetic refinement and
extraordinary craftsmanship.
My STL explores the rarer, high-craft variant: the bracelet is effectively
hollow and interlocking—the apparent strands weave around each other without being fused inside,
echoing museum pieces where carvers created the illusion of independent helices. Additive manufacturing lets
me realize this as a single print, while preserving the visual logic of interlaced jade cords.
Specs
Ring outer Ø: ~80–100 mm (parametric)
Tube Ø: ~2.5–3.5 mm
Twist: 720°–1440° (2–4 full turns around the ring)
Optional braided look via Circular Pattern (8–16 copies)
Modeling Steps (Fusion 360)
Full Fusion 360 Tutorial
On the XY plane, sketch a 30 mm diameter circle with its center
positioned 100 mm from the origin (this is the base path reference).
On the perimeter of that larger circle, sketch a 10 mm diameter circle (strand profile).
Use Trim to remove the portion of the large circle that lies inside the small circle, leaving two distinct circles.
Open Modify → Move/Copy. Select the entire sketch and set the pivot to the origin.
Click Confirm (to lock the pivot at the origin).
Make a copy of the sketch and rotate it –5° about the Y-axis (pivot at the origin).
Repeat the copy/rotate operation to create six additional copies, each rotated by an extra 5°.
Stop when you have 8 sketches total evenly distributed around the origin.
For the second sketch (the one at –5° about Y), select only its small circle.
In Modify → Move/Copy, set the pivot to the center of its corresponding large circle. Align the pivot by dragging the rotation manipulator (the small circle by the arrows) to the center of the small circle of the sketch.
Rotate that small circle by 45° about the Z-axis.
Repeat the previous step for the remaining small circles so they are sequentially offset by
90°, 135°, 180°, … relative to the first sketch’s small circle.
Use Trim to remove all remaining large circles. You should now have a ring of
small profile circles with progressive angular offsets.
With all profiles complete, select the entire sketch group and open Modify → Move/Copy.
Set the pivot at the origin and confirm. Create copies of the group,
rotating each by 40° increments until the sequence closes the full 360°.
Use Create → Loft and sequentially select the small circles in 3D to connect them into
a smooth, continuous strand around the ring.
Tip: If a loft fails, check profile order and spacing; adjust angles or add guide profiles as needed.
With the strand body created, open Modify → Move/Copy, set pivot to origin, confirm,
and make a copy rotated by 10°.
Create three copies total (original + 3), each offset by 10°, to form
four interwoven strands. Inspect continuity, then export as .stl.
Print note: Lay the ring flat; 0.16–0.20 mm layers, 3 perimeters, 10–20% gyroid infill.
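As a cross-check on the construction above, the centerline of one strand can also be written in closed form: a ring of radius R with a tube offset r that twists n times per revolution, with a per-strand phase phi. These parameters correspond to the spec section's ring radius, tube diameter, twist count, and the 10° offsets between strands; the function below is a sketch, not how Fusion computes the loft.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Point on one strand's centerline at angle theta around the ring.
// R: ring radius, r: tube offset, n: twists per revolution, phi: strand phase.
std::array<double, 3> strandPoint(double theta, double R, double r,
                                  int n, double phi) {
    double w = n * theta + phi;              // twist angle
    double x = (R + r * std::cos(w)) * std::cos(theta);
    double y = (R + r * std::cos(w)) * std::sin(theta);
    double z = r * std::sin(w);
    return {x, y, z};
}
```

Because n is an integer, the curve closes on itself after one full revolution, which is exactly the continuity you check before exporting the STL.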
Print Plan
Material: White PETG (The lesson is, the more rigid the material, the easier to remove support. Also, PETG is shiny!)
Nozzle: 0.4 mm • Layer: 0.16–0.20 mm
Perimeters: 3 • Infill: 10% gyroid
Supports: Organic, Everywhere.
Orientation: Lay the ring flat for strength and surface finish
Project Demo | 3D Printed Torus Knot Bracelet
Organic support within the independent spirals, surprisingly easy to remove.
Using Creality Ferret 3D, I scanned a banana I retrieved from the Banana Lounge.
Initial Scan of Banana (Hollow) and Caesar Statue by Creality Ferret Pro.
Ready to Print Solid Banana STL
Following Anthony's advice, I tried scanning the banana on its own a few times, but the Creality Ferret Pro just couldn't pick up the entire banana while it was spinning. Therefore, I placed the little Caesar statue next to it and got a full, complete scan of the banana.
I then imported the scan into Meshmixer, using Plane Cut to slice off the Caesar and the base, leaving a hollow banana; then I used Meshmixer to make it solid. Below are my steps:
Import and select Plane Cut. Adjust the angle of Plane Cut.
Filling a Hollow 3D Scan in Meshmixer
Open Meshmixer and go to
File → Import to load scanned .stl or .obj model.
Click the model to select it in the Object Browser (bottom-right corner).
Go to Edit → Make Solid to generate a solid, printable version of your scan.
In the Make Solid panel, adjust the parameters:
– Solid Type: Accurate (recommended for detailed scans)
– Solid Accuracy: Increase for smoother details
– Mesh Density: Increase for finer mesh
– Offset Distance: Set to 0 mm for same surface, or positive to thicken walls
For mine, I used default.
Once all the parameters are checked, hit Make Solid; this takes a few seconds.
Wait for the preview to update. Check that holes are filled and the interior is solid, then click
Accept.
To fix any remaining holes, go to Analysis → Inspector and click
Auto Repair All until the mesh is fully watertight.
Finally, export the new solid model via File → Export → STL and it’s ready for 3D printing.
Ready to Print Solid Banana STL
Printing Parameters
Big thanks to
Anthony
for walking me through every useful 3D printing parameter, it was very straightforward! I am excited to print some more next! OvO
(This is a Dropbox link; anyone with the link can download from it. Ignore "can't load this file" — that's just because Dropbox doesn't support previews for this file type.) If you cannot download it, please email me at annieliy@mit.edu.
Week 4 — Electronics Design
This week, I designed a custom PCB around the Seeed XIAO ESP32S3
to drive a 2.8" SPI TFT display (with optional touch) for my
location-tracking project.
Connector: 1×14 header footprint to match the TFT module pins
Power: 5V and GND routed to the display connector
Net classes: thicker traces for Power, standard for Signal
Schematic highlights
I started by creating clear net labels (SPI_SCK, SPI_MOSI,
SPI_MISO, TFT_CS, TFT_DC, TFT_RST,
5V, GND) and connecting them between the XIAO pins and the 14-pin
display connector.
Schematic: XIAO ESP32S3 mapped to 2.8" TFT SPI connector pins.
PCB layout + routing
After importing the netlist into PCB Editor, I placed the XIAO footprint and the 1×14 connector,
then routed the SPI + control lines. I also set up net classes to route power thicker than signals
and drew the board outline on Edge.Cuts.
PCB: component placement and routed traces (power + signal nets).
Checks (ERC / DRC)
I ran ERC to confirm the schematic connections, then ran DRC after routing. The remaining flags were
mostly silkscreen-related (text size / silkscreen clipping), which do not affect electrical function.
DRC: final check after routing (warnings mainly from silkscreen constraints).
Process recap
Labeled SPI + control nets on the XIAO pins
Added a 1×14 connector symbol and matched labels pin-by-pin to the TFT
Ran ERC and fixed “label not connected” issues by completing the connector mapping
Imported into PCB Editor and placed footprints
Set net classes (Power thicker than Signal) and routed traces
Drew the board outline on Edge.Cuts and ran DRC
Demo / build log
Short video walkthrough of the board design and routing:
(This is a Dropbox link; anyone with the link can download from it. Ignore "can't load this file" — that's just because Dropbox doesn't support previews for this file type.) If you cannot download it, please email me at annieliy@mit.edu.
Extra credit: case design for the screen + PCB
As extra credit, I designed a 3D-printable enclosure that
fits both the TFT display and the PCB, with cutouts for the screen and
sufficient clearance for assembly. I measured key dimensions in CAD and
verified fit before slicing.
After: nicely enclosed. I used four 4-40 screws and nuts to secure the screen onto the drilled holes; the case material is PLA.
Before: PCB and TFT display with no case.
I also cut a box which I glued to the screen case for full enclosure.
Case design in CAD.
Prusa slicer preview.
Week 5 — Electronics Production
Electronics Production
This week focused on the full electronics production workflow: preparing a Fab-ready PCB design,
generating milling files from KiCad, setting up the milling machine, changing tools, and fabricating
the final copper board. The goal was to translate a finished PCB design into a physical circuit board
using the lab’s CNC milling process.
Big thanks to the GOAT
Anthony
for teaching us how to use the milling machine — patiently, repeatedly, and with heroic restraint,
despite having explained the same steps approximately a thousand times.
I converted this ...
to this.
and this!
and this!
Finished PCB after milling and removal from the machine.
Preview
Exporting Fab-Ready Files from KiCad
I began by exporting fabrication-ready files from KiCad. This included carefully checking that the
correct layers were enabled:
Edge.Cuts — for the board outline
F.Cu — for copper traces
PTH — for plated through holes
These layers were exported and verified to ensure compatibility with the milling workflow used in
the lab. I double-checked trace widths, clearances, and overall board dimensions before proceeding.
Fab-ready PCB files opened and previewed before milling.
Milling
Tool Selection and Milling Parameters
For milling, I selected two different end mills depending on the task:
1/64" end mill — for fine copper traces
1/32" end mill — for outline cuts and larger features
My board is 3 cm × 7 cm; the machine's total estimated milling time was 20 minutes.
The board was secured to the milling bed using strong double-sided tape on the back of the copper
board to ensure it remained flat and stable throughout the process.
Double-sided tape applied to the back of the copper board before mounting.
Tool Changing Process
I first milled the traces using the 1/64" end mill. After the trace milling was complete, I
paused the machine and changed tools to the 1/32" end mill in order to mill the board outline.
During the tool change, I used two wrenches to safely remove and install the end mill, and used
wire clips to allow the machine to re-probe the bed accurately after the tool swap. This ensured
correct Z-height calibration for the second milling pass.
⚠️ Audio may be loud in the YouTube videos below. ⚠️
The first video below exists primarily so Neil can verify that I did, in fact, operate the machine myself.
After that, I’ve included a close-up tutorial video for everyone else, featuring the GOAT
Anthony
at original speed: no edits, no shortcuts, and no mercy‼️ for improper tool changes.
Milling Process
⚠️ Audio may be loud in the YouTube video below. ⚠️
Cleaning and Board Removal
After milling completed, I vacuumed the board and surrounding area to remove copper dust and debris.
Once cleaned, I carefully pried the finished board off the milling bed, taking care not to bend or
damage the copper traces.
⚠️ Audio may be loud in the YouTube video below. ⚠️
Finished PCB after milling and removal from the machine.
Soldered PCB with the ESP32S3 and eight 0 Ω resistors as jumper wires.
I tested the ESP32S3 with a simple LED test. Then I went onto soldering the 2.8 inch TFT display to my pcb.
TFT display soldered onto board, check finals week for my location tracking code!!
Week 6 — Computer-Controlled Machining
CNC-Milled Stool
For Machine Week, I designed, milled, and assembled a meter-scale stool
using CNC routing. The stool is constructed from sheet stock and assembled
using interlocking geometry.
Big thanks to
Anthony
and
Dave
at the EECS shop for turning a big, loud, slightly terrifying CNC machine into something approachable—and for patiently walking me through how routers, bits, saws, and assorted sharp spinning things actually turn wood into furniture.
Finished stool on work bench.
And it passes the weight test! It can hold heavier stuff than this water bottle.
Extra Credit: curved surface
This project explores structural press-fit design and
form-driven CNC geometry. In particular, I incorporated
curved surfaces along the stool legs and edges to improve ergonomics
and visual softness, satisfying the curved-surface extra credit.
(This is only partial view of CAD, complete one below) Curved edge design on stool.
Wood board 3D preview.
Assignment Checklist
✔ Designed, milled, and assembled a large (~meter-scale) object
✔ CNC-milled from sheet material
✔ Curved surfaces included (leg and edge profiles)
✔ Three-axis toolpaths used
Final Result
Finished CNC-milled stool. The structure is stabilized by horizontal stretchers
and interlocking joints.
Design Intent
The stool is designed around a simple, repeatable frame:
four vertical legs connected by horizontal stretchers and topped
with a flat seat panel. The geometry emphasizes structural clarity
while allowing the CNC process to define the final form.
Rather than keeping all edges rectilinear, I intentionally introduced
curved outer profiles along the legs. These curves soften the
visual weight of the stool and improve hand-feel, demonstrating how
subtractive digital fabrication can still produce expressive forms.
CAD & CAM Workflow
First, I drew the outlines and design in Inkscape; then I exported them as DXF into Fusion 360.
Line Design in Fusion.
In Fusion 360, I used Extrude to turn the DXF profiles into 3D bodies.
CNC toolpath preview before machining.
Modeled the stool in CAD with material thickness 11mm defined as a parameter,
allowing easy tolerance adjustment.
Designed press-fit joints sized based on measured stock thickness.
Added curved leg profiles using sketch splines and fillets.
Generated CNC toolpaths including 2D contours and
three-axis finishing passes for curved surfaces.
CAM Parameters
I created two 2D contour toolpaths for each manufacturing setup.
Compared to the default CAM settings, I modified the following parameters:
Setting | Value
--- | ---
Tool | 3/8" Flat Endmill
Spindle Speed | 12,000 rpm
Feedrate | 120 ft/min
Tab Width & Height | 0.375"
Bottom Height | −0.03" (to ensure cutting fully through the stock)
The slight negative bottom height ensured a clean cut-through across the full
thickness of the OSB board.
CNC Fabrication
The stool was milled from a single sheet of material using a CNC router.
All parts were cut in a single setup.
All parts after milling, before assembly.
A slightly deeper final pass ensured clean cut-through.
Tabs were removed manually and edges were lightly sanded.
Assembly
The stool assembles through interlocking joints and friction fit.
The horizontal members lock the legs in place and prevent racking.
Finished stool with the pieces cut out.
Assembled stool; it supports a heavy bag.
Failures & Fixes
One issue I encountered during fabrication was related to the tab geometry.
The original tab length did not fit into the slots as I had expected based on the CAD model.
To resolve this, instead of relying on a single long tab, I switched to
three shorter tabs stacked vertically. Each tab corresponded to the
standard OSB board thickness of approximately 11 mm, giving a
combined engagement height of 33 mm (11 mm × 3). I still could not avoid adhesive entirely: I used hot glue inside the slots and between the tabs to secure them further.
Reflection
This project reinforced the importance of material measurement,
tolerance testing, and toolpath strategy in CNC fabrication.
Small changes in joint sizing had a large impact on assembly quality.
If I were to iterate further, I would refine the curved profiles
using additional finishing passes and explore alternative seat
geometries.
This week, I designed and soldered a new PCB to drive a 1.8-inch TFT screen,
specifically for the Input Devices assignment. However, although the screen was wired correctly, the TFT display did not work.
(Sorry Neil!)
I then used one touch sensor GPIO (D1) on
Seeed Studio XIAO ESP32-S3, to give a touch sensing reading on my serial monitor.
My KiCad skills have significantly improved: no crossed traces and no jumper-wire resistors in sight!!
Milled.
Soldered and connected to the 1.8-inch TFT (left), compared with the final project board (right), which is designed to connect to a 2.8-inch TFT.
ESP32-S3 capacitive touch on GPIO 2 (D1)
The Seeed XIAO ESP32-S3 exposes several GPIOs that support capacitive touch sensing. I used
GPIO 2 (D1) in this test (any touch-capable GPIO would also work).
The ESP32-S3 measures a capacitance-related value on that pin; touching the pin/pad with a finger increases the
effective capacitance, producing a clear change in the reading.
How the test works
Input: finger touch on the touch-capable GPIO (GPIO 2 / D1)
Measurement: touchRead() returns a live numeric value
Output for verification: Serial Monitor prints values continuously
Code (Arduino)
// Capacitive Touch Test on Seeed XIAO ESP32-S3
// Pin used: GPIO 2 (D1) in this build (touch-capable pin)
void setup() {
Serial.begin(115200);
delay(1000);
Serial.println("ESP32-S3 Capacitive Touch Test (GPIO 2 / D1)");
}
void loop() {
// On many cores you can pass the GPIO number directly.
// If your core requires T0/T1 style enums, map GPIO 2 accordingly.
int val = touchRead(2); // GPIO 2 (D1)
Serial.println(val);
delay(200);
}
Note: Depending on your ESP32 core/library, touchRead() may accept either a GPIO number
(as shown) or a touch-channel macro. The key behavior is the same: you should see a stable baseline value and a
noticeable shift when touched.
The TFT backlight reliably turned on, but the display content did not render correctly.
I suspect a driver/compatibility mismatch (i.e., the TFT module is not the exact controller I
assumed), because I tested multiple libraries and initialization variants and still could not get stable pixels.
Week 8 — Output Devices
This week, I built an interactive, location-aware output system
using a custom PCB (Seeed XIAO ESP32S3) driving a
2.8" SPI TFT display. The device detects where I am on
MIT campus using Wi-Fi fingerprinting and displays
context-aware messages, animations, and “Easter eggs” directly on the screen.
Big thanks to the mighty
Quentin
for teaching me how to display on TFT screen using Arduino!!
TFT display soldered onto board, check finals week for my location tracking code!!
System Overview
The system continuously scans nearby Wi-Fi access points, compares their
BSSIDs and signal strengths against known location fingerprints,
and maps the result to a specific MIT building or landmark. Once a location
is detected, the TFT screen updates with a customized message and visual effect.
Schematic again for Seeed XIAO ESP32S3 driving 2.8" SPI TFT display (ILI9341).
TFT display soldered onto board, check finals week for my location tracking code!!
Location display.
Location Detection via Wi-Fi Fingerprinting
Instead of relying on GPS or cloud-based geolocation alone, I combined the
Google geolocation API (outdoors) with a local Wi-Fi fingerprinting approach (indoors),
which works reliably indoors and does not require external services.
Each location (e.g. Building 38, Killian Court, Media Lab) is represented by
a small set of known Wi-Fi access point BSSIDs. When the ESP32 detects one or
more of these BSSIDs above a signal strength threshold, it infers the
corresponding location.
Arduino logic for scanning nearby Wi-Fi networks and matching BSSIDs.
In this section for example, Building 38 is identified by the strong and consistent networks of EECS Shop and EECS Lab.
Arduino logic for displaying matched position using boolean.
This approach allowed me to create a flexible, expandable location system:
adding a new building only requires collecting its Wi-Fi fingerprints and
defining a new detection function.
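To make the matching idea concrete, here is a minimal sketch of the per-location check in plain C++. The struct and function names are illustrative, not the exact ones in my firmware:

```cpp
#include <string>
#include <vector>

// Hypothetical fingerprint for one location: a few known AP BSSIDs
// and a minimum signal strength (dBm) below which APs are ignored.
struct Fingerprint {
    std::string name;
    std::vector<std::string> bssids;
    int minRssi;
};

// One scan result: BSSID plus measured RSSI.
struct ScanEntry {
    std::string bssid;
    int rssi;
};

// A location matches if any of its known BSSIDs is seen above threshold.
bool matchesLocation(const Fingerprint& fp, const std::vector<ScanEntry>& scan) {
    for (const auto& entry : scan) {
        if (entry.rssi < fp.minRssi) continue;  // too weak, skip
        for (const auto& known : fp.bssids) {
            if (entry.bssid == known) return true;
        }
    }
    return false;
}
```

Adding a new building then really is just adding one more Fingerprint entry.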
(This is a Dropbox link; you, and anyone with this link, can download from it. Ignore the "can't load this file" message; that's just because Dropbox doesn't support STL previews.) If you cannot download it, please email me at annieliy@mit.edu.
TFT Output Design
The TFT display acts as the primary user-facing output device. When a location
is detected, the screen displays a welcoming message, the user’s presence,
and a themed visual effect.
Different locations trigger different color schemes and animations, reinforcing
the idea of a “living map” that reacts to the environment.
TFT display showing a detected location and custom message.
For example:
✅Building 38: Green text with matrix-style animation
🍀Killian Court: Formal welcome message and dome-inspired effects
🧠Media Lab: Glitch effects and high-contrast visuals
🎃Easter eggs: Fun proximity alerts (e.g. “Police Car on the Dome”)
Live Location Tracking!!
In the demo video below, I walk through different parts of campus while the device
dynamically updates its output based on the detected location.
I added some effects such as Matrix Rain, a glitch effect, and a pulsing dome so it looks cool! Check out the final display effect below:
Interaction Flow
The ESP32 continuously scans nearby Wi-Fi networks
Strong signals are filtered and matched against known fingerprints
A location is inferred based on detected BSSIDs
The TFT screen updates with location-specific text and animations
If no match is found, the system remains in an “unknown location” state
Reflection
This week pushed me to think about output devices as interfaces,
not just displays. By tying the TFT output directly to sensed environmental
data, the screen becomes an active participant in the system rather than
a passive indicator.
The Wi-Fi fingerprinting approach also highlighted how much contextual
information already exists in everyday signals, and how embedded systems
can leverage them creatively without relying on heavy infrastructure.
Week 9 — Molding and Casting
This week, I made a silicone Labubu from a 3D-printed mold!
Big thanks to
Jesse
,
Aijia
, and
Anthony
for great advice along the way and for helping make this possible!
Molding
Labubu 3D print mold negative model in fusion.
Labubu 3D print mold in PLA.
I printed a negative mold of a Labubu. The material is PLA (walls = 2, layer height 0.2–0.4 mm, no supports). The halves have registration keys and snap together nicely.
(This is a Dropbox link; you, and anyone with this link, can download from it. Ignore the "can't load this file" message; that's just because Dropbox doesn't support STL previews.) If you cannot download it, please email me at annieliy@mit.edu.
Pouring Holes
I used a flush cutter to create one larger hole for pouring (which I later learned was NOT big enough!) and one small hole to let air escape as I poured. The flush cutter is great; it cuts through PLA like butter!
Hot Glue Sealing
Per Jesse's advice, I used a hot glue gun to seal the mold, which prevents potential leakage. The hot glue can later be removed with alcohol.
Learning from Mistakes
If you have a closed mold like mine, please read this!! The pouring hole I created was too small, which made the pouring process very difficult. Any attempt to pour a larger flow of silicone caused the material to overflow from the hole. I had to stop frequently to wipe away the excess and then pour again very carefully, controlling the stream to be extremely thin. As shown, the opening is smaller than half the size of a thumbnail. I recommend making a pouring hole at least the size of a thumbnail. Because the small opening only allowed a very slow flow, it took me approximately 35 minutes to pour about 150 mL of silicone. The only silver lining was that the slow, steady pour prevented any air bubbles from forming.
Smoothing the layer lines: Bee's Wax Coating
I used the hot-air gun to melt beeswax: first I applied a fairly thin layer to the mold surface with a small brush, then used the hot-air gun to re-melt the wax on the surface, letting it flow around into an even, thin coat, and finally dripped the excess wax back onto the plate.
Although the coating from my first try is not perfectly even, it does the job of blurring most of the layer lines!
Casting
Mixing and Casting Silicone
Safety: Wear gloves and an apron to protect your hands and clothing from silicone residue.
Preparation: Find two clean stir sticks and two large empty mixing cups.
Components: Locate bottles labeled Component A and Component B.
Stir each thoroughly before pouring to ensure even consistency.
Measure Component A: Estimate half the total silicone volume needed for your mold
and pour that amount of Component A into the first cup.
Measure Component B: Estimate the same half-volume and pour Component B
into the second cup.
Verify Volumes: Ensure Components A and B are roughly equal in volume.
Stir each cup thoroughly.
Combine: Slowly pour Component B into Component A.
Mix: Mix thoroughly and consistently until uniform.
Avoid harsh movements that trap air bubbles — incomplete mixing may prevent proper curing.
Pour Preparation: Pinch the edge of the cup to form a pointed spout
for a controlled, precise pour.
Pouring Technique:
For closed molds, pour slowly and patiently to avoid overflow.
For open molds, start at one corner, let the silicone flow and settle to avoid air bubbles,
then continue pouring the rest evenly.
Timing: Complete pouring within ≤ 40 minutes after mixing.
Beyond this, silicone may become too viscous to pour properly.
Curing: Place the filled mold under a heat lamp
and allow it to cure for at least 4 hours. Longer curing yields better results.
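The 1:1 split above is just arithmetic, but as a quick illustration (the waste-margin idea and all names are mine, not a lab standard):

```cpp
// Given a mold volume and a waste factor (e.g. 1.1 = pour 10% extra),
// compute how much of Component A and Component B to measure out.
// Equal volumes of A and B, per the 1:1 mix above.
struct MixPlan {
    double partA_mL;
    double partB_mL;
};

MixPlan planMix(double moldVolume_mL, double wasteFactor) {
    double total = moldVolume_mL * wasteFactor;
    return { total / 2.0, total / 2.0 };
}
```

For my ~150 mL pour, that means roughly 75 mL of each component before mixing.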
Demolding
First, I used alcohol to remove the hot glue seal. I was briefly worried because the silicone outside the mold was sticky and uncured. Then I used a screwdriver to pry open the mold (moment of truth!).
The finished product looked great after removing the excess silicone! It cured nicely. The mold is also reusable; just re-apply beeswax and hot glue.
To resemble the real Labubu (which is not green), I touched up the silicone Labubu with acrylic paint (three main colors: Red + Green = Brown, Skin, White). It wasn't easy because acrylic paint doesn't stick to silicone. However, after it dried, I applied a thick layer, and it looked fine! My silicone Labubu now joins the lovely Labubu family.
Acrylic-painted silicone Labubu.
Seems familiar? Have you noticed my favicon? 😝
Week 11 — Networks & Communications
For Networks & Communications week, I built a wireless “scanner node” on an ESP32 that
live-scans nearby WiFi networks and prints SSID/BSSID/RSSI. I then used the stable SSIDs
as indoor “location fingerprints” for my final project: when the device sees familiar networks
(e.g., EECS Labs/Shop for Building 38; Media Lab SSIDs for Media Lab), it displays the detected
location on my final project PCB + TFT screen.
Assignment Requirements Checklist
Design, build, and connect wired or wireless node(s)
I used an ESP32 (Seeed XIAO ESP32S3) as a wireless node that performs active WiFi scanning.
With network or bus addresses
The scan outputs network identifiers: SSID (network name) and BSSID (AP MAC address),
plus RSSI. BSSID serves as a stable address-like identifier for each access point.
And local input and/or output device(s)
Output is shown live via the Serial Monitor during data collection, and the same “fingerprints”
are later used as an output on my final project (location name rendered on a TFT screen).
Demo
My wifi scanning script for ESP32S3.
This displays nearby WiFi SSIDs live!
Concept
GPS indoors is unreliable, so for my final project I used a practical alternative:
WiFi fingerprinting. As I walked around campus, I recorded a small set of
“stable” SSIDs/BSSIDs seen consistently at specific locations. Later, my final project device
checks the current scan and matches it against those fingerprints to infer where I am.
What I record per access point
SSID: human-friendly network name
BSSID: AP MAC address (more stable than SSID alone)
RSSI: signal strength (helps filter weak / far networks)
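The three fields above can be sketched as a small record plus an RSSI filter; the names here are illustrative, and my actual firmware differs:

```cpp
#include <string>
#include <vector>

// Hypothetical record for one scanned access point, mirroring the
// three fields I note per AP: SSID, BSSID, and RSSI.
struct ApRecord {
    std::string ssid;
    std::string bssid;
    int rssi;  // dBm; more negative = weaker
};

// Keep only APs strong enough to be useful as fingerprint material,
// dropping weak signals from far-away networks.
std::vector<ApRecord> filterStrong(const std::vector<ApRecord>& scan, int minRssi) {
    std::vector<ApRecord> out;
    for (const auto& ap : scan) {
        if (ap.rssi >= minRssi) out.push_back(ap);
    }
    return out;
}
```

This is the same filtering step the ESP32 applies before matching against the location lists.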
How it plugs into my final project
ESP32 scans WiFi periodically
Matches against location fingerprint lists
Displays “Building 38 / Media Lab / …” on TFT
Hardware
Parts
Seeed XIAO ESP32S3 (wireless node)
USB-C cable (power + programming)
Computer with Arduino IDE (Serial Monitor output)
Wiring
No external wiring required for WiFi scanning itself. The ESP32 only needs power over USB-C.
(In my final project, the same ESP32 is wired to a TFT for the location display.)
ESP32 board used for scanning.
Later integration: final PCB + TFT that displays detected location
Software
Workflow
Program ESP32 with a WiFi scan sketch
Open Serial Monitor to view live scan results
Walk around campus and record stable SSIDs/BSSIDs per location
Paste those fingerprints into my final project code for matching
As I moved around campus, I noted a small set of networks that consistently show up in each location.
This creates a lightweight indoor “signature” for each building.
Tip: SSIDs can be shared across buildings, so I prefer to record BSSIDs (AP MAC addresses)
when possible. RSSI helps ignore very weak signals from far away.
How This Becomes Indoor Location Tracking
Matching logic (high level)
Scan nearby WiFi networks
Check if any BSSID/SSID matches a known location list
Optionally require RSSI above a threshold
Pick the best match (or “Unknown”)
Final project output
Location name displayed on TFT
Updates live as I move around indoors
Same ESP32 node; now paired with a display
Results
The ESP32 reliably scans and lists nearby WiFi SSIDs in real time.
Recording a few stable SSIDs/BSSIDs per location created usable indoor fingerprints.
This directly supports my final project’s indoor location detection and display pipeline.
What I’d improve next
Prefer BSSID-based matching everywhere (less ambiguity than SSID-only).
Implement scoring (count matches + RSSI weighting) rather than first-match wins.
Add hysteresis / smoothing to avoid location “flicker” when walking past boundaries.
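The scoring and hysteresis ideas could look roughly like this; it's a sketch with made-up names and margins, not code tested on the device:

```cpp
#include <string>
#include <vector>

// Score one candidate location by its matched APs: count matches,
// with strong signals counting double (simple RSSI weighting).
int scoreMatches(const std::vector<int>& matchedRssis) {
    int score = 0;
    for (int rssi : matchedRssis) score += (rssi >= -60) ? 2 : 1;
    return score;
}

struct Candidate {
    std::string name;
    int score;
};

// Pick the best-scoring candidate, but only switch away from the
// current location if the new best clears a hysteresis margin.
// This avoids "flicker" when walking past building boundaries.
std::string pickLocation(const std::vector<Candidate>& cands,
                         const std::string& current, int margin) {
    const Candidate* best = nullptr;
    int currentScore = 0;
    for (const auto& c : cands) {
        if (c.name == current) currentScore = c.score;
        if (!best || c.score > best->score) best = &c;
    }
    if (!best || best->score == 0) return "Unknown";
    if (best->name != current && best->score < currentScore + margin)
        return current;  // not confidently better: stay put
    return best->name;
}
```

With a margin of 2, a neighboring building has to out-score the current one by at least 2 before the display switches, which smooths out boundary noise.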
Week 12 — Interface & Application Programming
Touch-Driven Application on ESP32-S3
For Interface and Application Programming week, I wrote an
embedded application that interfaces a user with a physical
input device I made using the ESP32-S3’s built-in capacitive touch sensor.
The application translates human touch into application-level state changes
and provides immediate, human-readable feedback through a serial interface.
Rather than printing raw sensor values, the firmware implements
event detection, state management, and user feedback,
forming a complete interaction loop:
user action → sensing → application logic → output.
Application behavior
The application treats the capacitive touch pin as a
touch button. Each touch event advances the system through
a set of predefined modes. The current mode is printed to the Serial Monitor,
making the interaction explicit and observable.
Input: Human finger touching GPIO 2 (D1), a touch-capable pin
Processing: Thresholding + edge detection + mode state machine
Output: Human-readable messages in the Serial Monitor
This turns a raw hardware capability (capacitive sensing) into an
interactive application that a user can control in real time.
Live user interaction
The following Serial Monitor output (captured in a video) shows the application
responding to repeated touch events. Each touch advances the mode and immediately
reports the new state to the user:
04:36:55.720 -> Current mode: DEBUG
04:36:58.004 ->
04:36:58.004 -> Touch detected!
04:36:58.004 -> Current mode: IDLE
04:36:59.712 ->
04:36:59.712 -> Touch detected!
04:36:59.712 -> Current mode: ACTIVE
04:37:01.323 ->
04:37:01.323 -> Touch detected!
04:37:01.323 -> Current mode: DEBUG
04:37:05.098 ->
04:37:05.098 -> Touch detected!
04:37:05.098 -> Current mode: IDLE
04:37:06.902 ->
04:37:06.902 -> Touch detected!
04:37:06.902 -> Current mode: ACTIVE
04:37:09.595 ->
04:37:09.595 -> Touch detected!
04:37:09.595 -> Current mode: DEBUG
This demonstrates a complete, repeatable interaction cycle where the user
intentionally controls application behavior through physical touch.
Application code
The code below implements the application logic. Touch events are detected using
a calibrated threshold based on observed sensor values, and a rising-edge detector
ensures that each physical touch triggers exactly one state change.
// =======================================================
// Interface & Application Programming
// Capacitive Touch User Interface
// Board: Seeed XIAO ESP32-S3
// Input: Capacitive touch on GPIO 2 (D1)
// Output: Serial Monitor
// =======================================================
const int TOUCH_PIN = 2; // GPIO 2 (D1)
const int TOUCH_THRESHOLD = 100000; // Based on measured values
bool lastTouchState = false;
int mode = 0;
const int NUM_MODES = 3;
void setup() {
Serial.begin(115200);
delay(1000);
Serial.println("========================================");
Serial.println("ESP32-S3 Touch Interface Application");
Serial.println("Touch the pad to change modes");
Serial.println("========================================");
printMode();
}
void loop() {
int touchValue = touchRead(TOUCH_PIN);
bool touched = (touchValue > TOUCH_THRESHOLD);
// Detect new touch (rising edge)
if (touched && !lastTouchState) {
mode = (mode + 1) % NUM_MODES;
Serial.println();
Serial.println("Touch detected!");
printMode();
}
lastTouchState = touched;
delay(100);
}
void printMode() {
Serial.print("Current mode: ");
switch (mode) {
case 0: Serial.println("IDLE"); break;
case 1: Serial.println("ACTIVE"); break;
case 2: Serial.println("DEBUG"); break;
}
}
Week 13 — Wildcard Week, Glass Engraving
Glass Engraving
For Wildcard Week, I used a glass engraving process to fabricate two small etched designs inside/through a
clear glass prism block:
Bowtie Charm: a CAD model I designed and exported as an STL, then engraved into the glass.
Banana: a 3D scan I captured, cleaned into a watertight mesh, exported as an STL, then engraved into the glass.