As with most weeks, I started by brainstorming. I've never done embedded programming, so the hill to climb felt steep. I'd love to make something useful for my final project, but that's a big leap, so I started small: understand the board, build confidence with soldering, and get a minimal program interacting with the outside world.
Below is a snapshot of the components I was working with:
We started with a microcontroller, a screen, and a board with capacitive buttons and two 1000 Ω resistors. Alone, however, these weren't much use; they had to be soldered together.
After our first soldering attempt we hit a problem: the screen broke off and took some of the copper lining from the board with it. To get us back on track, our trusty TA Quentin came to the rescue and 3D printed a holder for the screen. This was just the mechanical support we needed.
Here's a snapshot of the holder on the board! Thank you Quentin!
Because a picture is worth a thousand words, a video is worth even more (as long as it's compressed). Here's a little video of me soldering my board!
All put together, we used the USB-C connection to speak with the controller. Thankfully, the connections seemed good, because the board flashed just as the factory settings would have us expect! This let us quickly check that the board was actually alive and responding to uploads over USB.
After confirming the basics, I moved on to testing the peripherals. Using Quentin’s QPAD-XIAO Arduino code, we verified that both the OLED display and the keypad matrix were functional. The Arduino sketch initializes the display driver and maps the key scanning routine, so we could immediately see whether pixels lit up correctly and if button presses registered in the serial monitor.
Running this known-good firmware was very helpful: instead of debugging both hardware and my own new code at the same time, I could first prove that the hardware itself was soldered correctly and that all traces were making contact. Once the screen displayed the expected "hello world" test UI and the keys produced the right serial output, I was confident the board was ready for my own experiments.
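I won't reproduce Quentin's firmware here, but the overall shape of such a test sketch looks roughly like this. It's a stand-in of my own, assuming an SSD1306-style 128x64 OLED over I2C; the real QPAD firmware scans its capacitive pads, which I've replaced with a plain digital read on a placeholder pin:

// Minimal display + key test (not Quentin's actual code): assumes an
// SSD1306-style OLED on I2C; KEY_PIN stands in for the QPAD's real
// capacitive key scan.
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 64, &Wire);
const int KEY_PIN = D2;  // hypothetical pin, not the real QPAD wiring

void setup() {
  Serial.begin(115200);
  pinMode(KEY_PIN, INPUT_PULLUP);

  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);  // 0x3C is the common I2C address
  display.clearDisplay();
  display.setTextSize(1);
  display.setTextColor(SSD1306_WHITE);
  display.setCursor(0, 0);
  display.println("hello world");             // the expected test UI
  display.display();
}

void loop() {
  if (digitalRead(KEY_PIN) == LOW) {          // active-low press
    Serial.println("key pressed");            // shows up in the serial monitor
    delay(200);                               // crude debounce
  }
}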
I extended the test into a simple interaction: each button press cycles through a set of states and prints structured, timestamped logs over serial. Next step: implement a "Magic Eight Ball" variant that selects from 10 responses on any button press and displays the result via OLED and serial.
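The extension itself was tiny. Here's roughly what it looked like (a reconstruction, with the same placeholder-pin assumption as above):

// Cycle through states on each press and log them with a millis() timestamp
const int KEY_PIN = D2;      // hypothetical pin, as before
const int NUM_STATES = 3;
int state = 0;
bool lastPressed = false;

void setup() {
  Serial.begin(115200);
  pinMode(KEY_PIN, INPUT_PULLUP);
}

void loop() {
  bool pressed = (digitalRead(KEY_PIN) == LOW);
  if (pressed && !lastPressed) {             // rising edge of a press
    state = (state + 1) % NUM_STATES;
    Serial.print("[");
    Serial.print(millis());                  // ms since boot
    Serial.print(" ms] state=");
    Serial.println(state);
  }
  lastPressed = pressed;
  delay(10);                                 // crude debounce
}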
With the display and keypad verified using Quentin’s reference firmware, I wrote a small sketch that turns the QPAD into a Magic 8-Ball: pressing any key selects a random answer from a list of ten phrases, shows it on the OLED, and logs it to Serial for easy capture.
The plan: in setup(), seed the random number generator from a noisy source (e.g., analogRead(A0)) XOR millis(), then on each press pick idx = random(0, 10), look up one of ten short phrases that fit cleanly on the OLED, update the display, and Serial.println() the answer. In code terms:

- Seed: randomSeed(analogRead(A0) ^ millis());
- Pick: int idx = random(0, 10);
- Show: look up answers[idx] and update the display.
- Log: Serial.println(answers[idx]); (timestamp optional).
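Putting those steps together, the whole sketch stays short. Here's a minimal version; the display setup matches the assumptions above, the key read is again a placeholder for the real scan, and the ten phrases below are stand-ins rather than my exact wording:

// Magic 8-Ball: any key press picks one of ten answers, shows it, logs it
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 64, &Wire);   // assumed OLED geometry
const int KEY_PIN = D2;                      // placeholder for the real key scan

// Ten placeholder answers (stand-ins, not necessarily the phrases I used)
const char* answers[10] = {
  "Yes", "No", "Maybe", "Ask again", "Definitely",
  "Unlikely", "For sure", "Doubtful", "Soon", "Never"
};
bool lastPressed = false;

void setup() {
  Serial.begin(115200);
  pinMode(KEY_PIN, INPUT_PULLUP);
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  randomSeed(analogRead(A0) ^ millis());     // noisy seed, as described above
}

void loop() {
  bool pressed = (digitalRead(KEY_PIN) == LOW);
  if (pressed && !lastPressed) {             // act once per press
    int idx = random(0, 10);                 // index 0..9
    display.clearDisplay();
    display.setTextSize(1);
    display.setTextColor(SSD1306_WHITE);
    display.setCursor(0, 0);
    display.println(answers[idx]);
    display.display();
    Serial.print(millis());                  // timestamp optional
    Serial.print(" ms: ");
    Serial.println(answers[idx]);
  }
  lastPressed = pressed;
  delay(10);
}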
For my final project, I'd like to make kinetic earrings that charge with the movement of the person wearing them. The capacitive buttons on this board reminded me of the capacitive touch sensors on my earphones that I use to change the volume, press play, or perform other controls. So let's see if we can make the controller communicate with a phone to control music playback!
I started by trying to figure out whether we could talk to my device's media playback from the Arduino environment. We ran an .ino sketch that did indeed pause a song playing on Spotify on my laptop.
// TinyUSB smoke test: send Play/Pause 3 s after boot
#include "Adafruit_TinyUSB.h"

Adafruit_USBD_HID usb_hid;
uint8_t const desc_hid_report[] = { TUD_HID_REPORT_DESC_CONSUMER() };

void setup() {
  usb_hid.setReportDescriptor(desc_hid_report, sizeof(desc_hid_report));
  usb_hid.setPollInterval(2);
  usb_hid.begin();

  delay(3000); // wait for the host to enumerate the device

  if (usb_hid.ready()) {
    usb_hid.sendReport16(0, 0x00CD); // HID consumer usage: Play/Pause
    delay(10);
    usb_hid.sendReport16(0, 0);      // release
  }
}

void loop() { }
Next steps: integrate this code with the keypad scan so that pressing a button sends a media key event, then explore richer interactions: long press vs. short press, multiple buttons, and so on. We did a first iteration of this playing around with code given by ChatGPT, and it worked! Have a look at the result below.
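For reference, the shape of that integration looks something like this. It's a sketch under the same placeholder-pin assumption as before, not our exact ChatGPT-assisted code: the TinyUSB consumer-control report from the smoke test, triggered by a button edge instead of a boot-time delay:

// Button-triggered Play/Pause over USB HID consumer control
#include "Adafruit_TinyUSB.h"

Adafruit_USBD_HID usb_hid;
uint8_t const desc_hid_report[] = { TUD_HID_REPORT_DESC_CONSUMER() };
const int KEY_PIN = D2;        // placeholder pin, as before
bool lastPressed = false;

void setup() {
  pinMode(KEY_PIN, INPUT_PULLUP);
  usb_hid.setReportDescriptor(desc_hid_report, sizeof(desc_hid_report));
  usb_hid.begin();
}

void loop() {
  bool pressed = (digitalRead(KEY_PIN) == LOW);
  if (pressed != lastPressed && usb_hid.ready()) {
    // send the usage code on press, zero on release
    usb_hid.sendReport16(0, pressed ? 0x00CD : 0);
    lastPressed = pressed;
  }
  delay(10);                   // crude debounce
}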