
Anemoia Device

{This page is in the process of being updated}

My final project for HTMAA 2024 is the Anemoia Device: A memory generator for memories you have never experienced.

The word anemoia means feeling nostalgia for a time or moment you've never experienced. I discovered it while researching memory and found the concept highly poetic and well suited to my research and techno-aesthetic goals.

Background:

For me, the goal of taking How To Make Almost Anything wasn't just to learn how to make, but to combine my thinking around why and what we make (alongside if indeed we should make) while enhancing my actual abilities to materialize sometimes highly abstract concepts like anemoia.

In my research I’m exploring the connection between memory and digital data, combining technologies like synthetic biology, machine learning, and biological computing, to materialize data into tangible, poetic interactions that reimagine our relationship with digital information and living systems.

For my final project the notion of a memory generator had been percolating since the first week. However, a few things then happened which helped me to fully define the concept.

My initial design idea was to use dials to take an input image and control LLM prompt generation which would lead to the creation of a ‘new memory’. I was keen to make using AI more tangible and thought seeing prompts update on the fly via dials would be interesting.

Concept

I set out to create something that could respond to these thoughts. I imagine memories that stretch from the recent to distant past, from ourselves to more than human alternate realities; to reconnect us with ourselves and the myriad of living and non-living beings.

When memories fade, can we generate hybrid memories to offer people a new kind of sensory intimacy with the unseen and the unremembered?

To create a machine that could do this I needed a reference and a metaphor. I came across the Antikythera Mechanism, an ancient (perhaps primordial) computing device that worked with novel technology at the time, mechanical gears, to connect us to the future of space and time. The Anemoia device then could be framed as a device working with a novel technology, generative AI, to connect us to the past of the unseen and unremembered.

For the design, the metaphor of distillation was crucial. It was coherent with the usage of scent and lent itself to the idea of a memory being input at the top and being distilled downwards towards an eventual output.

Creation Process

The development of the project can be broken into 4 high-level parts:

The project should be capable of taking an input in the form of an image, transforming that image into a usable/decipherable text caption, and generating instructions for ‘printing’ a scent. To achieve this, the following tasks were identified:

Hardware

Software

Package and Integration

Project management

At some point I did project management professionally, so it's not something I enjoy, especially for projects like this. However, it is a necessary evil, and I like to use a simple, lightweight approach that keeps me organised. To help plan, I used Notion to break everything into work streams and some individual tasks. I did not plot out everything but had some notion (pun intended) of what needed to be done by when.

Initially, I planned to use each week leading up to the final project to complete sections of the work. However, Machine week ended up being much more consuming than anticipated.

Another peculiarity was that I actually ended up finishing the Anemoia Device during Wildcard Week and then did my full Wildcard Week project in the last 48 hours of Final Project Week. This worked better for me, as my wildcard had no association with my final project, and jumping into another workflow and task would have made no sense.

Another important note, particularly given the scheduling of HTMAA: when I was finally ready to place orders, it was the Thanksgiving/Black Friday period, which on the one hand made things cheaper but on the other made delivery times extremely long. If you're reading this, do try to order before this period(!)

Hardware

Pumps and power

I decided to start with the part I was least confident about working (and which I assumed would cause the most problems) - the pumps. I was able to get some very powerful diaphragm pumps from Ozgun in my research group (Tangible Media Group at MIT Media Lab) and started out by simply trying to operate them. They were rated 12V, but when testing with a regulated power supply I found they were operable from 8V. In any case it seemed prudent to use a 12V power supply.

I then began researching how to safely control the pumps with a 12V supply, given that the microcontrollers I work with supply and operate at much lower voltages. Since the pumps I was planning to use are essentially DC motors, similar to what we used during machine week, I realized I needed a relay to control each pump with my microcontroller, because the pumps operate at 12V and draw more current than the microcontroller can handle directly.

The relay acts as a switch that lets the microcontroller control the higher voltage and current without being directly connected, ensuring the circuit is safe. It also provides electrical isolation, protecting the microcontroller from potential damage caused by electrical noise or surges from the pump.

I ended up using commercial relays for reliability. I initially considered (and designed a board around) a smaller H-bridge component, and may revisit that in the future. Even with the relays in place, it's surprisingly easy to fry your board; simply unplugging the USB-C of the XIAO ESP32S3 before the 12V power left one of my microcontrollers unusable.

To get to the final intended outcome, I went step-by-step starting with 1 pump and 1 relay with a very simple circuit on a breadboard. I gradually added complexity - increasing the number of pumps, switching to a PCB, controlling the pumps with potentiometers and web interfaces (during network week).

Electronics Production

I designed a total of 5 boards making this project. The first 2 boards were designed to work with an H-bridge from the CBA inventory. However, as mentioned, I changed my approach to commercial relay modules in the interest of reliability. The relays have 6 pins: VCC, GND, IN, NO (normally open), COM (common), and NC (normally closed).

Wiring was quite straightforward. VCC and GND connect to the XIAO’s pins (3.3V for the VCC worked best). IN was connected to any of the XIAO’s GPIOs. NO connected to the positive terminal of the pump and COM connected to 12V.

I used everything except NC as I wanted to ensure the pump was only activated when the relay was triggered, allowing the circuit to remain open (disconnected) by default for safety and to avoid unintended operation.

I started with a simple board, designed to house a XIAO ESP32S3, to control a single pump, with surface-mount connectors for a relay, through-hole connectors for a pump, and a surface-mounted jack for a 12V power supply.

This is the relay I ended up using, which was very affordable. The modules are rated at 5V, which worked with the XIAO ESP32 microcontrollers I was using. I decided to run jumper cables to surface-mounted male header pins for the relays.

I got a bunch of 1.4mm power jacks from Quentin. I initially struggled to find their footprint but eventually realised that they were in the CBA Fusion library.

They have a little registration feature to make them sit well on the PCB board which I appreciated once I found the actual footprint (and hated prior).

After testing the initial Anemoia PCB v0.3 and getting it to work well, I designed the next board v0.4 to accommodate all 4 pumps and relays. The board came out pretty well and initially seemed to work as intended when connected to 1 pump. So I continued to use the board during networking week assuming all was good, connecting it to rotary encoders over a wireless connection.

During soldering, I initially did not seat the male headers for the relay well, and actually ripped a set off while disconnecting jumper cables! It was my first time seeing that, and quite surprising, as I was sure my soldering was good - it looked shiny and strong. However, when I had a look under the scope, I could see that the amount of surface actually connecting the pins to the pads was minimal. It was a good learning experience. From then on, I became a strong advocate of flux, and my soldering has improved by around 100% since.

I thought that would be the only issue; the next week, however, I realised that my wiring was in fact wrong and I was connecting all the pumps to each other. After thinking through the issue, I realised I had made a very basic mistake: I had forgotten to specify that each numbered NO terminal of a relay should correspond to the positive terminal of the matching pump, e.g. Pump 1 connects to the NO pin of Relay 1, Pump 2 to Relay 2, etc.

Debugging was then relatively easy, and I was able to create a new schematic and make a new board within a few hours. The final board, Anemoia v0.5, maintained the organic design, and this time it actually worked with all 4 pumps.

Dials and Tribulations

I was advised against using rotary encoders by Quentin and Ling Dong. But I didn't listen, and in the end I was actually quite glad with how things worked out. I ordered some mechanical rotary encoders with a chunky feel and 18 detents, so the click for each turn feels quite satisfying and distinct, which is what I wanted.

Yes, they are super awkward to work with; they use a lot of pins, especially if you're using them as buttons, and it probably took 48 hours longer than it would have taken to get potentiometers and additional buttons working with the logic I set up. But I really wanted a minimalist layout, and the flexibility of rotary encoders that double as push buttons was ideal.

After quite a lot of frustration, I was able to get the encoders to precisely jump from one step to another. Initially they would do all kinds of wild things like jump ahead n+2 positions. Or push buttons would double commands. All of this was quite easy to see in the Arduino IDE serial monitor though so pretty easy to debug. During networking week I was able to communicate with the rotary encoders over WiFi to control pumps. In the interface week I focused more on the actual task I wanted them to do, which was to control prompt/caption generation.
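The wild jumps of n+2 positions are typical when quadrature encoders are polled naively. One common fix (a transition-table decoder that rejects invalid state changes as bounce) is sketched below in Python as an illustrative model of the algorithm - on the actual device this logic lives in the Arduino sketch, and the function name and detent count are my assumptions.

```python
# Illustrative model of quadrature decoding with a transition table.
# Index = (previous 2-bit AB state << 2) | current 2-bit AB state.
# Valid transitions map to +1/-1 raw counts; bounces map to 0 and are ignored.
TRANSITION = [0, -1, 1, 0,
              1, 0, 0, -1,
              -1, 0, 0, 1,
              0, 1, -1, 0]

def decode(states, counts_per_detent=4):
    """Accumulate raw counts from a sequence of 2-bit AB states and
    return the number of detents moved (sign indicates direction)."""
    count = 0
    prev = states[0]
    for curr in states[1:]:
        count += TRANSITION[(prev << 2) | curr]
        prev = curr
    return count // counts_per_detent

# One full detent cycle in one direction: 00 -> 01 -> 11 -> 10 -> 00
```

Because only adjacent Gray-code transitions count, a contact bounce (e.g. 00 -> 11) contributes nothing, which is what stops the position from skipping ahead.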

My initial plan was to create another PCB for the rotary encoders; however, I simply had too many other things to do, so I decided to use an Arduino UNO and its many pins for the dials (and screen). After all, both were going to be seated well within the eventual package/object, so a dedicated board felt potentially unnecessary. I also found that the WiFi connection between XIAOs was very unreliable, so a serial connection between the XIAO and the UNO was a much better bet for a reliably demoable end product.

My code for testing the encoders is here

Screen Time

The unassuming LCD screen ended up being the most challenging part of the electronics. Its job was simple: display captions generated by AI APIs based on image inputs.

I assumed it would be pretty much plug-and-play; however, even the demo code initially did not seem to work, even though the backlight turned on. After an hour or so of debugging code, I googled it and found that there is actually a potentiometer on the back of the screen that you can adjust with a screwdriver to set the contrast. Adjust until the screen works.

After this, while it was easy enough to send simple chunks of text to the screen, reading back variable and often extremely long captions generated from images was a challenge. Using the Anthropic Claude API is great - Claude is an excellent writer by all accounts, but it does like to go on a bit. This led to lots of gaps and buffering issues. Ling Dong provided some great assistance in understanding why this was happening and in writing code that gave the Arduino UNO the screen was connected to a bit longer to parse the data / characters.
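My understanding of the fix: the UNO's hardware serial receive buffer is only 64 bytes by default, so a long caption sent in one burst overflows it before the sketch can drain it. Pacing the writes avoids this. A minimal sketch of the host-side pacing, assuming a pyserial-style `ser` object (the helper names and chunk size are mine):

```python
import time

def chunk_text(text, size=16):
    """Split a caption into fixed-size chunks small enough for the
    UNO to parse without overflowing its serial receive buffer."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def send_caption(ser, caption, size=16, pause=0.05):
    """Write a caption over serial in paced chunks.
    `ser` is assumed to be a pyserial Serial object."""
    for chunk in chunk_text(caption, size):
        ser.write(chunk.encode("utf-8"))
        time.sleep(pause)  # give the UNO time to drain its buffer
    ser.write(b"\n")       # assumed end-of-message marker
```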

Code for testing the screens is available here:

Software

At this point everything was working well independently, and I needed to write a program to bring everything together. Until now I had been testing with an image stored on my computer, so I started with a program that reads images via webcam, capturing repeatedly and overwriting, so that there is always a fresh image for the system to process.

Image capture code available here

For the main program I needed to handle several steps, described below.

Taking a Photo

Taking a photo is covered by the capture code above. Captioning the image via the Claude API is achieved with the prompt below. The output is limited to 145 characters to ensure that the screen display is reliable and fast; it also makes for a better user experience.

            {
                "type": "text",
                "text": "Provide a detailed, comprehensive caption for this image in a few precise sentences, prioritizing the subjects. Limit the output to 145 characters."
            }
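For context, this text block sits inside a larger user message that also carries the image itself. A sketch of how that message can be assembled, following Anthropic's base64 vision message format (the helper name and the JPEG media type are my assumptions):

```python
import base64

def build_caption_message(image_bytes, media_type="image/jpeg"):
    """Assemble the Anthropic messages payload: the image (base64)
    followed by the captioning prompt. Helper name and JPEG media
    type are illustrative assumptions."""
    image_data = base64.standard_b64encode(image_bytes).decode("utf-8")
    return {
        "role": "user",
        "content": [
            {
                "type": "image",
                "source": {
                    "type": "base64",
                    "media_type": media_type,
                    "data": image_data,
                },
            },
            {
                "type": "text",
                "text": ("Provide a detailed, comprehensive caption for this "
                         "image in a few precise sentences, prioritizing the "
                         "subjects. Limit the output to 145 characters."),
            },
        ],
    }
```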
            

To generate the caption:

            try:
                response = client.messages.create(
                    model="claude-3-5-sonnet-20241022",
                    max_tokens=300,
                    messages=[message]
                )
                return response.content[0].text
            except Exception as e:
                print(f"Error generating caption: {e}")
                return None
            

Connecting to the Microcontrollers

It’s simply a matter of knowing which ports are connected; if using the same setup, these should be the same each time, e.g.:

        # Serial configuration for Arduino Uno and Xiao ESP32S3, may need to change...
        ARDUINO_SERIAL_PORT = "/dev/tty.usbmodem1201"  
        XIAO_SERIAL_PORT = "/dev/tty.usbmodem1101"     
        BAUD_RATE = 115200
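Since the port numbers can drift between machines and even between plug-ins, a small discovery helper can save debugging time before opening the connections. A sketch, assuming macOS-style device names (both helper names are mine):

```python
import glob
import re

def find_usbmodem_ports():
    """List candidate USB CDC serial devices on macOS, where boards
    show up as /dev/tty.usbmodemXXXX with numbers that can change
    between plug-ins."""
    return sorted(glob.glob("/dev/tty.usbmodem*"))

def looks_like_usbmodem(path):
    """Sanity-check that a configured port string has the expected shape."""
    return re.fullmatch(r"/dev/tty\.usbmodem\d+", path) is not None
```

Printing `find_usbmodem_ports()` at startup makes it obvious which of the two configured constants needs updating when a board enumerates differently.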
            

Handling Encoder States

For the encoders, as they will likely always be in different positions, it is important that they start at 0 and that the button state is defined as false, which is done as follows:

            # Encoder states
            encoder_positions = [0, 0, 0]
            last_encoder_positions = [-1, -1, -1]
            button_states = [False, False, False]
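These lists are then updated from lines arriving over serial from the UNO. A sketch of that update step; note the `"ENC:<index>:<position>"` / `"BTN:<index>"` wire format here is a hypothetical example, not the project's actual protocol:

```python
def parse_encoder_line(line, encoder_positions, button_states):
    """Update encoder/button state in place from one serial line.
    Wire format ("ENC:<index>:<position>", "BTN:<index>") is a
    hypothetical example for illustration."""
    parts = line.strip().split(":")
    if parts[0] == "ENC" and len(parts) == 3:
        idx, pos = int(parts[1]), int(parts[2])
        if 0 <= idx < len(encoder_positions):
            encoder_positions[idx] = pos
    elif parts[0] == "BTN" and len(parts) == 2:
        idx = int(parts[1])
        if 0 <= idx < len(button_states):
            button_states[idx] = True  # cleared after the press is handled
```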
            
Next, based on the generated caption, the code queries the API to determine the main subjects and classify them as living or non-living. This is done with the OpenAI API. If living, certain options are then available on the 2nd dial; if non-living, another set of options is presented on the screen.

            def extract_subjects(caption):
                prompt = f"""
                Extract the main subjects from this caption: "{caption}".
                Provide them as a numbered list in the following format:
                1. [Subject 1] (living or non-living)
                2. [Subject 2] (living or non-living)
                """
                response_text = query_gpt(prompt)
                subjects = []
                lines = response_text.split("\n")
                for line in lines:
                    if "." in line:
                        parts = line.split(".", 1)
                        if len(parts) > 1:
                            subjects.append(parts[1].strip())
                if not subjects:
                    subjects = ["Default Subject 1", "Default Subject 2"]
                return subjects
            

This then gives us specific options on the screen for the user to select, relating to the time period of interest for the living or non-living thing selected as the main subject.

                    # Determine time options based on the first subject
                    first_subject = subject_options[0]
                    if "non-living" in first_subject.lower():
                        time_options = ["raw-materials", "manufacture", "in-usage", "decay"]
                    else:
                        time_options = ["childhood", "youth", "adulthood", "elderly"]
                
                    mood_options = ["happiness", "sadness", "anger", "calmness"]
                

For the scent generation, I worked with AI to help convert the memory sentence into a specific scent formula. The AI selects from predefined smells (e.g., Campfire, Night Air, Rain) and assigns proportions to create a unique combination that best reflects the generated memory. The output is a formatted list of four scents with their proportions.

There is also a function, generate_pump_times, that assigns each scent to one of four pumps, translating the proportions into specific durations (in seconds) for which each pump should run.
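A sketch of what generate_pump_times can look like; the scent names, proportions, and the 10-second total runtime are illustrative assumptions, not the project's exact values:

```python
def generate_pump_times(scent_formula, total_seconds=10.0):
    """Map each scent's proportion to a run duration for its pump.
    `scent_formula` maps scent name -> proportion (summing to 1.0);
    pumps 1-4 are assigned in insertion order. The 10-second total
    is an illustrative assumption."""
    pump_times = {}
    for pump_number, (scent, proportion) in enumerate(scent_formula.items(), start=1):
        pump_times[pump_number] = round(proportion * total_seconds, 2)
    return pump_times

# Hypothetical formula (proportions invented for illustration):
formula = {"Campfire": 0.4, "Night Air": 0.3, "Rain": 0.2, "Moss": 0.1}
```

Run durations rather than flow rates keep the pump control simple: each relay is just switched on for its assigned number of seconds.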

Finally, we have the logic for the encoders, and buttons. The program continuously runs in a loop, checking for changes in the encoder positions and button presses. Users interact with the system by turning the encoders to select values for Subject, Time, and Mood and confirming these selections using buttons. Once all inputs are confirmed, the program generates a “final prompt” and the scent formula.

        while True:
            # encoder 1: Subject selection
            if encoder_positions[0] != last_encoder_positions[0]:
                index = encoder_positions[0] % len(subject_options)
                selected_subject = subject_options[index]
                display_message_to_arduino(f"Current Subject: {selected_subject}")
                last_encoder_positions[0] = encoder_positions[0]

            # encoder 2: Time selection
            if encoder_positions[1] != last_encoder_positions[1]:
                index = encoder_positions[1] % len(time_options)
                selected_time = time_options[index]
                display_message_to_arduino(f"Current Time: {selected_time}")
                last_encoder_positions[1] = encoder_positions[1]

            # encoder 3: Mood selection
            if encoder_positions[2] != last_encoder_positions[2]:
                index = encoder_positions[2] % len(mood_options)
                selected_mood = mood_options[index]
                display_message_to_arduino(f"Current Mood: {selected_mood}")
                last_encoder_positions[2] = encoder_positions[2]

            # handle button presses
            if button_states[0] and selected_subject is not None:
                display_message_to_arduino(f"Subject Confirmed: {selected_subject}")
                button_states[0] = False

                # Now subject = confirmed subject, recalculate time options inside the same block
                if "non-living" in selected_subject.lower() or "nonliving" in selected_subject.lower():
                    time_options = ["raw-materials", "manufacture", "in-usage", "decay"]
                else:
                    time_options = ["childhood", "youth", "adulthood", "elderly"]

            if button_states[1] and selected_time is not None:
                display_message_to_arduino(f"Time Confirmed: {selected_time}")
                button_states[1] = False

            if button_states[2] and selected_mood is not None:
                display_message_to_arduino(f"Mood Confirmed: {selected_mood}")
                display_message_to_arduino("Generating Final Prompt...")
                final_sentence = generate_sentence_with_mood(selected_subject, caption, selected_mood)
                final_sentence = final_sentence.replace('\n', ' ')  # remove newline chars
                final_sentence = " ".join(final_sentence.split())  # normalize spaces
                display_message_to_arduino(f"Final Prompt: {final_sentence}")
                button_states[2] = False
                time.sleep(8)
                display_message_to_arduino("Generating Anemoia...")
                scent_formula = generate_scent_formula(final_sentence)
                print(f"Scent Formula: {scent_formula}")
                

Fabrication

Creating the Anemoia Device involved a lot of 3D printing, some laser cutting, and quite a lot of trial and error. The goal was a very clean design, based on the metaphor of distillation that could support the electronics but retain an aesthetic and mysterious form. It should of course be capable of holding liquids and tubes and allow fragrance oils to flow.

The initial prototype for the device was made from scrap aluminium rods found in the TMG lab, held together with tape and glue to get a sense of shape and size. The final structure is composed of 4 × ½ inch aluminium rods and 8 × ¼ inch aluminium rods, cut to size on the circular saw in the CBA shop. The bottom and top panels are 3D printed for convenience, with slots for the aluminium rods.

It was quite challenging to figure out how to attach everything together cleanly and aesthetically. In the end this was achieved with a series of brackets that held acrylic sheets in place. These were modelled in CAD, and after a lot of test 3D printing on the Prusa MK4 and MK4S the right fit was eventually found. Final prints were mostly done on a Bambu Lab X1 Carbon.

To package things, I procured a nice set of 8×10 inch frosted acrylic sheets from Amazon at quite a good price, which were pretty much perfect for the job. The acrylic sheets were laser cut: 4 panels form a box that holds the electronics in place, separating the pumps (on a lower level) from the other electronics (on an upper level). The front panel has slots for the dials and the screen, the side panels slot together to hold things in place, and the top and bottom panels attach via the bracket system.

Below this, a further panel houses the fragrance oils in tube holders. Right at the top is a shield for the webcam, which allows it to sit flush and face down. All electronics are mounted carefully and secured in place for strain relief as far as possible. The PCB is seated within a 3D-printed cartridge slot. Everything uses jumper cables, so there's always a risk of them coming loose - something to improve for a v0.2.

The Anemoia device was successfully demonstrated at the How To Make (Almost) Anything demo day, with a memory generated about the Great Wall of China. Neil reminisced (jokingly) that it did make him feel nostalgic for the Great Wall. Next steps for the project are user testing to understand whether the device actually does instill a sense of nostalgia, or something else entirely. It does seem to provoke a reaction which is great and feedback thus far has been positive.