MAS.863/4.140/6.9020

How To Make (almost) Anything

CBA Section - Week 13 Group Assignment

2025

Tyler Hill - I used the WebSerial interface to set up a serial connection to a XIAO board running Arduino code. Nothing special was needed on the Arduino side, which was nice: all that code had to handle was sending things out over serial or reading in bytes as input. I did learn, however, that you cannot have the serial monitor open in the Arduino IDE, since that conflicts with the port the web interface opens. The website code was also pretty easy; it only took a few lines of JavaScript to open the serial port and start reading and sending bytes on it. I didn't go through the process of sending text characters, since that requires an encoder step, but I got all the functionality I needed by sending numbers that corresponded to certain options, which I then implemented hooks for in my code.

I also used the Flat UI package that was linked from the class page. It was really easy to use, and its downloadable packages took care of most of the CSS formatting. I did have trouble getting its slider to work with JavaScript, though, and I was never able to find the root cause, but a plain HTML slider worked fine and didn't change the aesthetic too much.

Matti Gruener - I used Snap's Lens Studio to create an application (they call it a "lens") for the Snap Spectacles device. The graphical elements were designed in Figma. The lens on the Spectacles can make HTTP requests to the server I set up last week, which lets clients of the API switch what the server considers the "current" image. An ESP32-C3 polls that server for the current image every 4 seconds; if it retrieves a new image, it displays it on a 200x200 pixel Waveshare e-ink display.
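A "current image" endpoint like the one described above could be sketched with just Python's standard library. The route name `/current`, the port, and the JSON shape are my assumptions, not the actual API from last week:

```python
# Sketch of a server with a switchable "current" image (assumed API shape).
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

state = {"current": "qr_0.bin"}  # hypothetical image identifier

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The ESP32 polls this to learn which image to draw.
        if self.path == "/current":
            body = json.dumps(state).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def do_POST(self):
        # Clients (e.g. the Spectacles lens) switch the current image here.
        if self.path == "/current":
            length = int(self.headers.get("Content-Length", 0))
            state.update(json.loads(self.rfile.read(length)))
            self.send_response(204)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port=8000):
    """Start the server on a background thread and return it."""
    server = HTTPServer(("127.0.0.1", port), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

An ESP32 client would then GET `/current` every 4 seconds and compare the returned identifier against the image it last drew.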

Lenses are developed using JavaScript or TypeScript. I used TypeScript for this assignment because it looks much more like the C# components that I'm used to from Unity. In Lens Studio, there's a library called UIKit that can be used for simple UIs. It is pretty barebones, but got the job done.

The UI I made shows QR codes in a list and lets the user select one by touching a button with their hands. The interaction system on the device also lets you point at a button and pinch your index finger and thumb together to press it.

My experience with Lens Studio was pretty good overall. I think the UIKit library needs some work, but it was sufficient for what I tried to get done this week. There's more information on my page.

Sara Fernandez - I used Python and the websockets library to make a simple interface for my charlieplexing LED board. It consisted of a simple webpage with clickable buttons that send commands over a WebSocket to a local server, which then forwards those commands to the Seeed XIAO RP2040 over serial. Once the WebSocket connection was up, this made a nice bridge between embedded control and a real user interface. The response time after pressing a button was imperceptible, and I could map each button to a specific pattern on the board. The main difficulties I encountered were keeping the server running while hosting the page locally and making sure the serial port wasn't being used by another program at the same time (by closing all other serial monitors). Overall, it was a really clean workflow, and because this approach is so well documented it was much easier to solicit advice from ChatGPT when needed. More can be found on my class page.
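A minimal sketch of that WebSocket-to-serial bridge, assuming the `websockets` and `pyserial` libraries and a made-up mapping from button names to single command bytes (the real commands and LED patterns are Sara's own):

```python
# Browser button -> WebSocket -> serial bridge (sketch, not the actual code).
import asyncio

# Hypothetical mapping from button names to single-byte serial commands
# that the RP2040 firmware would interpret as LED patterns.
COMMANDS = {"all_on": b"1", "all_off": b"0", "chase": b"2", "blink": b"3"}

def to_serial_command(message: str) -> bytes:
    """Translate one WebSocket message into the byte sent over serial."""
    try:
        return COMMANDS[message]
    except KeyError:
        raise ValueError(f"unknown command: {message!r}")

async def bridge(websocket):
    # pyserial imported locally so the mapping above works without it installed
    import serial
    port = serial.Serial("/dev/ttyACM0", 115200)  # adjust for your OS
    async for message in websocket:
        port.write(to_serial_command(message))

def main():
    import websockets
    # Note: recent websockets versions pass just the connection to the
    # handler; older ones pass (websocket, path).
    async def run():
        async with websockets.serve(bridge, "localhost", 8765):
            await asyncio.Future()  # serve forever
    asyncio.run(run())
```

The browser side then only needs `new WebSocket("ws://localhost:8765")` and `socket.send("chase")` wired to each button.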

Ben Weiss - Back in my days as a marine robotics engineer, I used Tkinter to create a GUI for an eDNA sampler: each virtual button was tied to a separate sampling pump and triggered it with specific serial commands. I decided to do something similar for this week's group project. I used Tkinter to create a grid of buttons, each of which, when pressed, sent a frequency between 0 and 1000 Hz to my RP2040, which was connected to a speaker via an H-bridge amplifier. See my site for more details.
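The button-grid idea can be sketched like this; the 4x4 size, the even frequency spacing, and the `F<freq>` serial command format are assumptions for illustration, not Ben's actual code:

```python
# Tkinter grid of buttons, each mapped to a frequency in 0..1000 Hz (sketch).
import tkinter as tk

ROWS, COLS = 4, 4  # assumed grid size

def button_frequency(index: int, count: int = ROWS * COLS) -> int:
    """Spread `count` buttons evenly over 0..1000 Hz."""
    return round(index * 1000 / (count - 1))

def build_gui(send):
    """Build the grid; `send` is called with the frequency on each press."""
    root = tk.Tk()
    root.title("Tone grid")
    for i in range(ROWS * COLS):
        freq = button_frequency(i)
        tk.Button(root, text=f"{freq} Hz", width=8,
                  command=lambda f=freq: send(f)).grid(row=i // COLS,
                                                       column=i % COLS)
    return root

# To run for real, pass a callback that writes to the serial port, e.g.:
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   build_gui(lambda f: port.write(f"F{f}\n".encode())).mainloop()
```

Separating the frequency math from the widget code keeps the mapping easy to change (or test) without touching the GUI.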

Miranda Li - I used a Python remote procedure call setup to read serial input into a Python script, a WebSocket to stream that to my static website, and JavaScript to pass the data into a p5.js sketch. I made my UI in HTML and CSS, just as god intended. The pipeline was fairly straightforward to set up and I had no issues with latency. As for the interface, I've used other frameworks before, but for small one-off projects I prefer to keep it simple: I made a bit of use of [Shoelace](https://shoelace.style/) for some components, but that was about it.
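The serial-to-WebSocket leg of a pipeline like this can be sketched as follows; the comma-separated line format, the port name, and the use of the `websockets` library are assumptions for illustration:

```python
# Stream serial lines to WebSocket clients as JSON (sketch).
import asyncio
import json

def line_to_json(line: str) -> str:
    """Turn one serial line like "12,340" into a JSON message for p5.js."""
    values = [float(v) for v in line.strip().split(",")]
    return json.dumps({"values": values})

async def stream(websocket):
    # Local imports so the helper above works without these installed.
    import serial
    port = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)
    while True:
        # Blocking read with a short timeout; fine for a one-client sketch.
        raw = port.readline().decode(errors="ignore")
        if raw.strip():
            await websocket.send(line_to_json(raw))
        await asyncio.sleep(0)  # yield to the event loop

def main():
    import websockets
    async def run():
        async with websockets.serve(stream, "localhost", 8765):
            await asyncio.Future()  # serve forever
    asyncio.run(run())
```

On the page, a `socket.onmessage` handler would `JSON.parse` each message and hand the values to the p5.js sketch.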

Edward Chen - I used Python's Tkinter and Pygame to visualize and interact with the IMU data streamed from the ESP32 over WiFi. A Python server listens for incoming UDP packets and parses the accelerometer, gyroscope, gravity, and quaternion data in real time.
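A sketch of such a UDP listener, assuming each packet is one comma-separated line of thirteen floats (accelerometer, gyroscope, gravity, then quaternion, which is my guess at the layout, not the actual packet format):

```python
# Listen for IMU packets over UDP and parse them into named fields (sketch).
import socket

def parse_packet(data: bytes) -> dict:
    """Split one packet into accel/gyro/gravity vectors and a quaternion."""
    v = [float(x) for x in data.decode().strip().split(",")]
    return {
        "accel": v[0:3],
        "gyro": v[3:6],
        "gravity": v[6:9],
        "quat": v[9:13],  # w, x, y, z
    }

def listen(port=4210, handler=print):
    """Block forever, passing each parsed packet to `handler`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        handler(parse_packet(data))
```

The `handler` callback is the seam where either the Tkinter canvas update or the Pygame game loop would pick up the data.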

Tkinter was used to create a simple desktop interface and canvas for displaying the sensor data. It provides basic UI elements such as windows, text, and drawing primitives, which made it easy to visualize orientation and motion data without a complex setup.

Pygame was used to build an interactive visualization and game driven by the IMU data. It handles real-time rendering, input, and timing, allowing the motion of the physical device to directly control objects on the screen. This made the IMU data more intuitive and engaging to explore.
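One common way to let the quaternion stream drive objects on screen is to convert it to roll and pitch and use those as velocities. The conversion below is the standard quaternion-to-Euler formula; the gain constant is arbitrary and the use as a velocity is just one possible mapping:

```python
# Map device tilt (from a unit quaternion) to screen velocity (sketch).
import math

def quat_to_roll_pitch(w, x, y, z):
    """Return (roll, pitch) in radians from a unit quaternion (w, x, y, z)."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp before asin to guard against floating-point drift past +/-1.
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    return roll, pitch

def tilt_to_velocity(w, x, y, z, gain=200.0):
    """Pixels-per-second velocity proportional to device tilt."""
    roll, pitch = quat_to_roll_pitch(w, x, y, z)
    return gain * roll, gain * pitch
```

In a Pygame loop, the returned velocity scaled by the frame time would be added to a sprite's position each frame, so physically tilting the device moves the object.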

Together, Tkinter and Pygame allowed me to quickly prototype both a data visualization tool and an interactive application using the same sensor stream, demonstrating how physical motion can be mapped to digital behavior.