Hello!
I am a first-year student at the MIT Media Lab, in the Design Fiction group. My work often operates at the intersection of technology, the mediated body, culture & the sensorium.
As a researcher, I have a huge interest in extending sensory perception through technology, to expand the frontiers of human experience through novel applications of engineering. By designing with these experimental extensions of the body, I explore the relationships between technology, culture, and the creation of self.
As an artist and designer, I continually seek to discover the unexpected, through playful experimentation, intuition, and speculative storytelling. Some of my interests include prosthetics, perception, architecture, augmented reality, sensorial hacking, glitches, cyborgs, science fiction, and corgis.
I am a marvelously lucky participant of Neil Gershenfeld's "How to Make (Almost) Anything" course at MIT.
This is where I will chronicle my adventures and assignments from my first semester at MIT Media Lab.
This is the scenario: In the near future, humans, cultivated by the supremacy of a logic-based computational world, will become so obsessively hygienic and reason-driven that they will lose their connection to their biological and animal impulses. To aid in this tragedy and release the inner id in every computer scientist, I propose a robotic limb that will allow the refined urban dweller to release his or her animal urges for short recreational durations.
I made a mini slide show to create a character sketch for the narrative I am trying to build. All photos are from this amazing artist named Michael Wolf.
The antidote I would like to engineer for this numbed urban dweller is a prosthetic for losing control, a sort of phantom limb for emotions. A limb that allows me to express emotions that I am too repressed or obedient to express. Right now, I am experimenting with anger. As an Asian, as a woman, as someone living in this contemporary society, expressing anger feels embarrassing and unacceptable, like losing oneself or not maintaining control.
I propose a limb that frees the wearer from these norms and allows the repressed to come out to play, if only for a short recreational while.
This Week's Mission: Press Fit Construction
In this guide I will show you step by step how to make your own Corgi shelf, or any waffle construction object!
This week we learned how to use a laser cutter and vinyl cutter to take advantage of computer-controlled cutting. The assignment was to test different tolerances of cuts using the laser cutter, and create something formed by press-fit construction and cardboard. As I have been a lifelong fan of the world's cutest dog, I decided to take this opportunity to create a corgi companion for my work desk.
This is the world's cutest dog: The Corgi
I was inspired by an animal shelf I saw online. Can you believe it retails for over $600?
Step 1: Make a 3D model
Build or find a 3D model of an object you would like to build. I started by finding a 3D model of a corgi online. I found one acceptable to my high corgi standards over at Autodesk. I took the model over to Rhino, ogled its cuteness for a good while, and began the design process.
Step 2: Subdivide Model
I subdivided the mesh along three axes and extracted sectional planes. A helpful hint for those looking to make a waffle-construction model: Rhino has a command called "contour."
Diagram of all the original contoured pieces.
On the left you see the contours straight out of the program, on the right you see my final design.
The contour command is wonderful and almost magical to the first-time user, but it still requires some elbow grease and rebuilding. This is an example of how contours need to be patched and fixed. From left to right: the original mesh corgi, the contoured corgi, (in pink) the contour that needs rebuilding, the rebuilt contour.
Step 3: Extrude Thickness
After all the contours are rebuilt to your liking, extrude the curves to the thickness of your material. This was a tricky part for me because the thickness here has to be exactly right for the pressfit to, well, fit. I went down to the stockpile and did some measuring and test cuts at different settings before going forward.
Step 4: Create Notches
After you have given the pieces in your model a thickness, it is time to make the notches. To do so, I moved each of the pieces along the axis of the contour halfway up the depth of the extrusion, like so:
Then I used a nifty command called Boolean Split on both X and Y axes contours. This results in automatic perfect slots that will allow you to make the perfect joints!
This is an image of all of my notched pieces spread out.
Step 5: Make a Laser File
Now that you have all your pieces, it is time to lay them out into a laser file. I typically draw the dimensions of the laser bed and the material for reference. I made a big error here this week: I assumed the cardboard would be the exact size of the laser bed, and made my corgi too big! I had to go back several steps to resize the whole model without changing the thickness of the notches. Another note of advice: it helps to label your pieces! I haven't here, because the shapes of my corgi are very intuitive, but it can save you a lot of trouble with pieces that look very alike.
Step 6: Zapping Time!
Load your material into the laser bed and make a few test cuts before zapping your file, to make sure your tolerances are good. If your material is warped, try to flatten it, as the laser will not focus correctly in areas where the Z height is inconsistent.
Step 7: Lay out the pieces and Build Build Build!
I like to lay all of my pieces out to organize and streamline the building process. Building was smooth sailing for me; the construction file was designed correctly, phew!
Step 8: Show it off!
Mission accomplished! My new corgi shelf companion is born!
BONUS CNC: Vinyl Cutting!
I am simultaneously taking How to Grow (Almost) Anything right now, and George Church is giving a lecture this week. Earlier this year, using the CRISPR DNA editing technique, George Church and his genetics research team at Harvard successfully copied woolly mammoth genes into the genome of an Asian elephant... which clearly made my design fiction mind go into overdrive. Hello Jurassic Pleistocene Park!?! Clearly this kind of talk must make for very good chatter at the bar. I made George a custom sticker with woolly mammoths flanking his face and put it on a pint glass as a gift for him.
I had to learn the hard way that cutting the very thin Jurassic Park font panels is an art form, and very hard to execute and peel.
But persistence pays off:
The art of making the perfect drinking glass to tell the story of how you brought woolly mammoths back through CRISPR gene editing.
Photoshoot with George Church himself.
I hope you enjoyed this "How To!"
This Week's Mission: PCB Production
In this guide I will walk you through my journey of creating a FabISP in-circuit programmer from scratch.
This week I learned all about electronics and printed circuit board (PCB) fabrication. These skills were learned through the exercise of making an in-circuit programmer. An in-system programmer (ISP) is a piece of hardware used to load programs onto the ATtiny. This board will allow me to program the microcontrollers on other boards I will make throughout the semester! The process, roughly outlined, is as such:
1. Milling: Use the Modela CNC to mill a circuit design onto your PCB board.
2. Stuffing: Solder on the teeniest surface mount components you have ever seen.
3. Programming: Give your circuit brains!
Below and on the left is the schematic of the circuit design I chose to work with. It is Neil's hello.ISP.44.res.cad board, which is designed for production within a FabLab. Next is an image of the traces, and on the right I tried to customize this board with my name. Something to keep in mind is that the design should have at least 17 mil spacing between all traces for our Modela. I looked at Amanda Ghassaei's blog for reference, and she suggests a setting of 12 mil traces with 18 mil clearance.
Step 1: Milling
With my traces in hand, I proceeded to mill for the very first time. Our FabLab uses the Modela mill. David Mellis has made an amazing, comprehensive tutorial for the first-time user.
As you can see from the image below, my first mill job was a bit of a disaster.
But, practice makes perfect. I had to mill the board several times to learn the tricks of the trade. For my first board, I made the dual rookie error of not securing my plate with enough double-sided tape (causing the board, and thus the traces, to wiggle). The solution is to make sure the sacrificial plate is clean: vacuum it, scrape off residual tape, and wipe it down with a tiny bit of isopropyl alcohol. Another difficulty I encountered was recognizing when I was using a dull bit, which can cause burred edges. Solutions include scraping the burrs off with a straight-edged tool, slowing the speed of the mill job down, or simply changing the bit!
After you have a satisfactorily milled PCB, take it to the sink and give it a soapy wash to remove residual oils.
Step 2: Stuffing the Board
Next comes the soldering! I collected all of my components and labeled them carefully in my notebook. The main microcontroller is the Atmel ATtiny44A. I was pretty blown away by how tiny these components were.
Next, I organized all of the necessary tools into a workstation, which consisted of a soldering iron, solder, solder wick, safety goggles, circuit schematic, vice clamp and a multimeter.
Then came the actual soldering! This was challenging as the components were so tiny.
A technique that worked well for me was to drop a bit of solder first to serve as a tack. To do so, first heat up the designated component location with the tip of your soldering iron, and drop a tiny bit of solder on the board. Place the component over the dot of solder with tweezers, and heat both the solder and the component with your iron until the component sticks and is immobilized. Then continue with an actual solder on all leads of the component. If you make any mistakes, you can always desolder and try again!
Finally, after much soldering, desoldering, and squinting, my board was all done!
Step 3: Programming
Now was the moment of truth: testing whether my board had been fabricated properly! To program the ATtiny44 on the FabISP board, we needed another in-system programmer.
The process is fairly straightforward:
1. Plug the board to a USB for power.
2. Plug the board to the AVRISPmk2.
3. Load the firmware:
i. make clean
ii. make hex
iii. sudo make fuse
iv. sudo make program
After programming, we had to desolder the two jumpers (for which I used 0 Ω resistors) to make sure our programmer can forever write but never be written to again.
And that was my adventure into creating my first circuit board!
Addendum: FabISP & Nyan Cat....united at last!!
I hope you enjoyed this "How To!"
This Week's Mission: Design and 3D print an object that can only be made additively.
This week's post will take you through my trials and tribulations as I discovered the tolerances of the different 3D printers in the CBA labs.
3D Printing
This week I learned all about 3D scanning and printing. The goal is to make something that highlights the capabilities of additive manufacturing. Our class is printing on the Ultimaker 2. The primary material is polylactic acid (PLA), a plant-based thermoplastic polyester. It is awesome because it is made from renewable resources, such as corn starch! The machine heats up and extrudes a thin filament, layer by layer, to build up a 3D model. It is amazingly cheap and efficient, but has some drawbacks, as outlined below:
Design Constraints of Fused Deposition Modeling (FDM):
(Some limitations based on filament diameter:)
Cannot print too sharp and pointy
Cannot print too thin
Cannot print gaps too close to each other
Cannot print extremely high resolution
Sometimes the print might slip off the plate mid-print and cause a mess.
Despite these limitations, the Ultimaker is a very useful and affordable tool to prototype ideas quickly! However, I had to learn how to design to optimize its strengths, and to avoid mishaps. Dan Chen, the fabulous HTM TA and my lab mate, has written an excellent guide to the Ultimaker:
Along with "How to Make (Almost) Anything", I am concurrently taking "How To Grow (Almost) Anything", directed by George Church, professor of Genetics at Harvard Medical School. I have been immersed in learning about synthetic biology and the principles behind engineering genomes, and I thought it could be fun to design a bracelet in the form of DNA. Below are some shots from the design process.
It took a while for me to figure out how to model DNA in Rhino. Even after I discovered the "helix" command, my first model was inaccurate:
Hitting the trusty textbooks for reference, my second attempt yielded better results:
This was the DNA Bracelet I tried to print:
But, alas, calamity! Because the form hit the bed on so few points, and with such a tenuous amount of surface area, my print tended to peel up and get dragged off by the extruder, forming a mess. Motivated to get this print done, I tried altering the following settings:
- To tweak the bed temperature
- To change the build quality setting to "best"
- To change the build density from 20% to 100%
- To slow the speed of the build
- To lay down glue stick for enhanced surface adhesion
None of these changes helped. Images of the massacre follow:
Because of these repeated misprints, I learned the limits of what the Ultimaker is capable of building. I went back to the drawing board and tried to design another bracelet that would highlight additive manufacturing but hit the bed plate more solidly. I came up with the design below:
I simplified the design quite a bit and made each component much thicker. I piped a polygon into a cage, and built beads that would be trapped inside the bracelet. This was the final design I sent to the Ultimaker:
Next, I tweaked some of the settings in Cura, the printer interface for Ultimaker, and sent it to the Ultimaker with my fingers crossed!
An hour into the print, things are still looking good:
This is what it looks like fresh off the 3d Printer bed, with all the support material still attached:
I took some time to clean it off and buff off the rough edges with a soldering iron, which my classmate Daniel Windham discovered as a neat little trick.
Finally, after all of the testing and print errors, my bracelet was complete!
3D Scanning
The second half of this week's assignment was to 3d scan something, and optionally print it out. I chose to 3d scan my head as a sculptural letter to my grandmother, who lives in China and has not seen me for several years. Dan graciously offered to scan me:
I have to admit, the scan was rather terrifying. Seeing a 3D scan of yourself is akin to hearing a recording of your voice. (Is this how I really look???)
Tom printed it on the Polyjet printer for me, and the results were pretty good! I have to say, it is pretty uncanny to hold a miniature self as a giant.
Dan is my lab mate in Design Fiction, and he also 3D printed and scanned himself last year during his time in HTM. So obviously now it was time for a family photo shoot.
Edit! After all of my failed forays with the Ultimaker and my DNA bracelet, Tom Lutz offered to print it on the Eden for me! I finally achieved my 3D-printed circle of life.
My bio blueprint station is complete :)
And that concludes this week's adventure in 3D Scanning and Printing! I hope you enjoyed this "How To!"
This Week's Mission: Make something big!
This week is all about using the ShopBot to CNC mill. The bulk of what I learned this week is how to translate a 3D design into toolpaths, how to use the machine to cut the parts precisely, and lastly, how to post-process and assemble the parts.
Above is an image of the CNC machine that we used for this week, the ShopBot.
Designing Something Big and Transformable
I live in a relatively small apartment by myself, and don't have much furniture. I like living minimally, but I will admit that the barren state of my apartment has deterred me from having guests over. For this week, I decided to make a coffee table that could transform into a larger dining table for when I want to entertain guests.
With a combination of Rhino and Solidworks, I designed a table that extends in both length and height with the use of two sets of hinges.
A big part of the design process was making sure that the tolerances and clearances for both sets of legs were correct.
I wanted to challenge myself by designing most of the joints to be inherent in the design itself, with notching and pressfit. The only hardware used in this design are the hinges themselves.
Drawing of the pressfit / notched joint in the table leg:
After all the joints were properly designed, it was time to lay out all the parts.
These are the components laid out on the material. I had some extra space, so I added a chair design from OpenDesk, an awesome resource for opensource CNC furniture designs!
Milling
Now that the design phase was complete, it is time to start milling. The FabLab has an excellent guide on how to use the ShopBot. The basic workflow is as follows:
1. Convert CAD drawing to .dxf and load into PartWorks 2D, which is a software program that generates toolpaths for the ShopBot CNC mill.
2. Using PartWorks 2D, generate a tool path.
3. Refine toolpath by adjusting cutting parameters, which are Pass Depth, Stepover, Spindle Speed, Feed Rate, and Plunge Rate.
4. Generate toolpath, save toolpath.
The next steps involve setting up the ShopBot itself:
5. Select the right collet for your tool, thread the collet holder into the spindle, and use a collet wrench to snug the collet into place.
6. Add the dust collection skirt (also lovingly known as ShopBot's Mustache).
7. Insert key attached to collet into control box to turn it on.
8. Important! Press blue button to reset drivers so that the machine will talk to your computer!
9. Fix the material to be cut onto the bed, preferably securely with screws.
10. Now it is time to zero the cutting tool on X, Y, and Z axis.
11. Set spindle speed.
12. Turn on dust collector.
13. Load toolpath commands.
14. Put on Goggles!
15. Press green button to start spindle.
16. Press ok to start cutting.
17. Retrieve pieces and vacuum the bed of dust and scraps.
18. Beam with victory!
Importing my .dxf file into PartWorks 2D to generate toolpaths:
Zeroing the bit.
Ready, Set, Milling! With my HTM partner in crime Raphael Schaad.
Mission milling: Accomplished!
Time to post-process. Read: Sanding like there is no tomorrow.
Testing the tolerances.
That feeling when notches fit just right.
A "dogbone" clearance is typically used in CNC paths that have sharp corners, because the bit is round and cannot cut a sharp internal corner for press-fit friction joints. An extra rounded cut is added at each corner to remove the extra material. I chose to forgo the dogbones and hand-filed the press-fit openings on the side of the table to get the joints flush, for aesthetic purposes.
Testing the transformable coffee table!
Photoshoot time:
Bonus slender chair:
Photoshoot time:
And that concludes this week's adventure in Computer Controlled Machining!
I hope you enjoyed this "How To!"
This week was my first foray into electronics design. I found a lot of useful references and tutorials, including the SparkFun guide, Dan Chen's guide, and Pip's guide.
After reading all about components and learning how to use Eagle, instead of packing a circuit as compactly as possible, I thought it would be a fun challenge to lay a circuit out in the shape of Neil's face.
I browsed around and read some guides. Here are my Background Notes from a Newbie
The Inspiration
Pip's guide blinks an LED, which she called "winking," and it inspired me to make my own version, with Neil's face as the circuit. The idea was to include a capacitive touch pad in the shape of "Fab" in his brain, which would cause his LED to "wink" at you.
I should mention that I would never have embarked on this project had Eric VanWyck not mentioned that he was fixing up and testing the Trotec Speedy 100 Flexx, a two-in-one CO2 and fiber laser capable of etching a high-resolution image onto a circuit board.
The Famous Neil Gershenfeld Winking LED Circuit Board:
The Components and What They Do
The first step was to draw the schematic of the circuit in Eagle.
This was followed by laying out the traces of the schematic on a board. This can be challenging: you have to find enough room for paths that don't overlap.
I kept a vectorized image of Neil's face on an adjacent screen to give me a relative idea of where the components should go.
I spent a good while going back and forth trying to be clever about my circuit routing. Oddly, I felt that my training as an architect helped. It felt a lot like planning complex circulation for a building plan.
After all that work, Neil's circuit portrait was born!
It was time to bring it to the Trotec Speedy 100 Flexx laser. This is where my heartache began.
Unfortunately, although the image looked stellar coming out of the laser, we did not etch deeply enough, causing a heartbreaking moment with the multimeter, when I discovered my whole board was connected.
Not to be deterred, I went to mill it on the Modela. This is what Neil's face looks like as a toolpath, in case you were wondering. You're welcome.
But alas, there were problems there too. After much pain, advice-seeking, and googling, I discovered that it is important that the send command read "serial.py /dev/ttyUSB0 9600 rtscts". I also received advice that it is helpful to clear your browser's cache. If you don't, you can end up with a mill that inserts its own artistic expression (see below.)
After learning my lessons the hard way, I finally got around to soldering the components onto my circuit portrait. I got some help debugging this circuit from the amazing Brian Tice.
Here is a short video of Neil as the Terminator.
Some bonus images of my circuit under a microscope!
And that concludes this week's adventure in creative circuit design!
I hope you enjoyed this "How To!"
This week we learned all about molding and casting. I used this week to experiment with casting a variety of materials.
The Inspiration
I originally wanted to cast a necklace of my surname "Liu" in Chinese.
I modeled different calligraphy styles of the character and tried different forms. I modeled the dimensions of the block of wax we were given, and 3D modeled both the positive and negative in CAD.
In the end, the main constraint was the diameter of the CNC bit, which was 1/8 inch. In the diagram below, the original design is shown with the tool bit in red, showing where the tolerances would not be big enough. On the right is a design revised for the toolpath.
The result was a form that was too big to be a necklace, so I decided to make a customized mooncake mold for my family for the next Autumn festival.
After the design was finalized, it was time to import the .STL file into ShopBot PartWorks 3D. I will walk you through the process below.
After the toolpaths have been generated, it is time to set up the mill. Glue your block of wax to a piece of OSB with a bead of hot glue along the edges. Screw the OSB down to the mill bed. Zero the bit to the X and Y origin of your block.
Use the plate to zero the Z axis as well. Don't forget to ground the system by clipping the alligator clamp to the collet!
My block after the rough pass!
My block in the middle of the fine pass. It is very satisfying to watch the bit hug the curves and edges of the model.
After the block finished milling, I used cardboard and tape to build up the walls of a ready-to-pour mold!
I went to Reynolds Advanced Materials to buy food safe mold material.
There, I learned about the Shore Hardness Scale. This will eventually be relevant because I am thinking of casting a silicone soft robot for my final project.
My first cast was actually with the provided OOMOO 25. I wanted to make this mould so that I could experiment with the low temperature alloy.
I have never worked with silicone before and my first mixed batch had way too many bubbles.
Lucky for me, we had a vacuum in the shop, which was very effective.
The first mould came out satisfactorily. (Proof that the vacuum works wonders!)
My first cast was with the drystone. Again, I had trouble with bubbles, but in this case I thought the sponge effect looked interesting.
Dan did a demo during recitation of casting a low temperature alloy called Roto 281F.
The alloy melts at a low enough temperature to be handled with the kiln that we have in the CBA shop. I poured it during office hours under Tom's expert supervision.
My first time casting metal!
Next up was casting the food-safe mould. I used SORTA-Clear Translucent Silicone Rubber 18 from Smooth-On. This time I anticipated the bubble problem, and tried to mix and pour as slowly as possible. But the bubbles persisted.
Thankfully, I was saved by the vacuum, and a lot of manual vibration to release trapped air.
All of my moulds and casts together.
The individual pieces:
Bonus! Aluminum Casting! Sam Callish gave a demo on Aluminum casting today! Here are some pictures from the session. I will do a full blog of this demo another time!
And that concludes this week's adventure in Molding and Casting!
I hope you enjoyed this "How To!"
References
This week I read a lot of online guides and learned a lot from the alums of the Fab Academy. I am linking them here:
As I was learning as much as I could about microcontrollers this week, I tried to understand the relationship of the chip architecture to the ways in which I could program it. To start, the AVR is a modified-Harvard-architecture 8-bit RISC single-chip microcontroller, developed by Atmel in 1996. Wikipedia claims that the AVR was one of the first microcontroller families to use on-chip flash memory for program storage, as opposed to the one-time programmable ROM, EPROM, or EEPROM used by other microcontrollers at the time, but Neil handily debunked this in class.
Below are some acronyms that I encountered and looked up:
Types of Microchip Packages
Here are some messy sketches/notes I took while trying to learn about microcontrollers. I am attaching them here in case they help someone in the future.
On the left I was learning about the path of microcontroller programming work flow. On the right, I am learning about AVR architecture to see the path that the logic of the code takes.
Learning about microcontroller memory structure caused me to trace back and learn about computer architecture. (I am really starting from scratch here.)
To understand how C code drives the chip, I had to learn more about the structure of registers. This made me marvel at the wonders of bit-banging.
Which caused me to look into the difference between parallel and serial communication:
Lastly, I studied the pinouts and the concept of bootloaders:
One of the initial frustrations I had was with the disparity between the pin names in different charts. For instance, PB0 is not pin 0 nor pin 2, but in fact reads out as pin 10. I got this diagram from Dan's website and it helped immensely. Note that it is for the ATtiny44 model we used, and might not apply to your microcontroller.
I learned how to use the Arduino programming environment with my ATtiny44 from this guide from High-Low Tech.
This is the simple code I wrote to make the wink happen. It's not much, but it's pretty much the first code I've ever written from scratch.
Here is a video of the completed circuit with a button:
Here is a video of the completed circuit with capacitive touch:
And that concludes this week's adventure in Embedded Programming!
I hope you enjoyed this "How To!"
(Example of a system to drive pneumatic output. Image from Soft Robotics Toolkit)
Concepts
Control boards were full of electronic components and systems that I had no experience with. I am listing some concepts and links to guides that I read about this week:
Transistors
Transistors are devices that control the movement of electrons, and consequently, electricity. They can start and stop the current, and they also control its amount. Transistors can switch or amplify electronic signals, letting you control current moving through a circuit board with precision. Transistors require pure semiconductor materials, such as germanium and silicon. An integrated circuit is one piece of semiconductor material loaded with transistors and other electronic components.
Computers use those currents in tandem with Boolean algebra to make simple decisions. With many transistors, a computer can make many simple decisions very quickly, and thus perform complex calculations.
MOSFET
The MOSFET (Metal Oxide Semiconductor Field Effect Transistor) is a semiconductor device widely used for switching and amplifying electronic signals in electronic devices. It is a field-effect transistor that has a thin layer of silicon oxide between the gate and the channel. There are two ways in which a MOSFET can function. The first is known as depletion mode: when there is no voltage on the gate, the channel exhibits its maximum conductance, and as the voltage on the gate increases (either positively or negatively), the channel conductivity decreases. The second is called enhancement mode: when there is no voltage on the gate, there is in effect no channel, and the device does not conduct. A channel is produced by the application of a voltage to the gate. The greater the gate voltage, the better the device conducts.
H-Bridges
An H-bridge is an electronic circuit that enables a voltage to be applied across a load in either direction. These circuits are often used in robotics and other applications to allow DC motors to run forwards and backwards.
PID Controller
A proportional-integral-derivative controller (PID controller) is a control loop feedback mechanism (controller) commonly used in industrial control systems. A PID controller continuously calculates an "error value" as the difference between a measured process variable and a desired setpoint. The controller attempts to minimize the error over time by adjustment of a control variable, such as the position of a control valve.
Pulse Width Modulation
Pulse width modulation (PWM) is a fancy term for a type of digital signal. It is used in a variety of applications, including sophisticated control circuitry. A common use, as the SparkFun guide notes, is to control the dimming of RGB LEDs or the position of a servo motor.
Bipolar Junction Transistors
This is a bonus bit about transistors because they are fascinating and I cannot stop reading about them. A bipolar junction transistor (BJT) is named as such because it has conduction by two carriers: electrons and holes in the same crystal. From the very informative site, All About Circuits : "The bipolar junction transistor is an NPN three layer semiconductor sandwich with an emitter and collector at the ends, and a base in between. It is as if a third layer were added to a two layer diode. The key to the fabrication of a bipolar junction transistor is to make the middle layer, the base, as thin as possible without shorting the outside layers, the emitter and collector. We cannot overemphasize the importance of the thin base region." The part that blows my mind with amazement is this notion of "holes." Holes are spaces where an electron could be but presently is not. Like any hole in the macroscopic world, you can't move one; it's an absence. All you can do is fill the hole, which creates a new hole somewhere else. We can in some ways model this as an imaginary particle that's flowing the opposite direction from the electrons (and thus in the same direction as the current), but there's no actual particle moving in that direction. Like most models, it's a convenient fiction that makes the math easier.
In plain English: transistors are tiny switches that can be triggered by electric signals. Transistors rely on a quirk of quantum mechanics known as an “electron hole.” A hole is the lack of an electron at a spot where one could exist in semiconducting material. By introducing an electric signal to a transistor, electric fields are created that force holes and electrons to swap places. This allows regions of the transistor that normally insulate to conduct (or vice versa). All transistors rely on this property, but different types of transistors harness it through different means.
I think this aspect of my electronics research really struck me because I find its simultaneous ambiguity and precision quite wondrous, and learning about it gave me the sensation of a lovely framework shift. OK, yes, I have discovered a newfound crush on quantum mechanics...
For my final project, I am thinking of making a pneumatically actuated soft prosthesis that augments my bodily motions. In doing so, I am seeking to play with the relationship between physiology and psychology, and to experiment with how physical extensions shape and form our interior states. I am planning to keep a continuous blog about my final project; you can read more about its development here.
To actuate the airflow, I am going to be using solenoid valves. Solenoids are basically electromagnets: they are made of a big coil of copper wire with an armature (a slug of metal) in the middle. When the coil is energized, the slug is pulled into the center of the coil. This makes the solenoid able to pull (from one end) or push (from the other.)
I thought this week would be a good time to learn about transistors, PWM, and how to control a solenoid. I found this mini tutorial on solenoids from Adafruit that was helpful. To drive a solenoid you will need a power transistor and a diode. You will also need a fairly good power supply, as a lot of current will rush into the solenoid to charge up the electromagnet, about 500mA, so don't try to power it with a 9V battery!
Additionally, from Adafruit I learned that the solenoid valve I want to use should be paired with this N-channel MOSFET, because these FETs can switch over 60A and 30V, and my valve may require a lot of power.
To practice and test out general output control concepts, I tried out three FabLab boards: the DC Motor Board to learn about H-bridges, the Servo Board to learn about pulse width modulation, and the Speaker Board to learn about MOSFETs.
The schematics/ boards I tested:
I milled, deburred, and stuffed all three boards. I am getting faster and better at this process. Getting very comfortable with aspects of the Modela:
After deburring for quite some time, my boards are looking shiny and ready for soldering:
I soldered two MOSFET boards simultaneously.
Followed by the H-bridge board.
And the servo board.
While I was milling and chatting about solenoid valves, Sam Callish showed me the teeniest pump ever.
Boards all stuffed and ready to be programmed.
This marks the end of the happy part of my output device journey.
Even though this schematic is pretty simple, since my experience with electronics was extremely limited, I spent some time going backwards and learning some basic things, such as the difference between the types of transistors (PNP, NPN, BJT, MOSFET...) and the difference between the different types of diodes (Zener, Schottky, rectifier...)
A note here about transistors and inductive loads. Solenoids are inductive loads, and when you turn them off, some current flows the other way. You need to put a flyback diode across the transistor, and ideally the motor too, to prevent this kickback current from hurting your circuit. I learned this the hard way and blew out two transistors before I figured it out.
This was my first test with a solenoid valve. Oh the thrill of hearing that electromagnetically induced thump:
I got this plastic solenoid valve working. I added some LED lights so I had a visual indicator of when the solenoid should be going off.
Lastly, a video of the solenoid valve working to control airflow into a test silicone cast.
Video of my control board working:
And that concludes this week's adventure in Output Devices!
I hope you enjoyed this "How To!"
I was also inspired by this image that I came across this week. It is an array of sensors designed to fit on a beating heart, developed at Washington University in St. Louis. It can simultaneously sense information and apply therapy when distress is detected.
Tutorials and References
I looked at this guide from Parallax on their Polar Heart Rate receiver.
I also looked at this Adafruit tutorial on how to blink LEDs to your heart beat.
I also used the references from PulseSensor Amped on how to interface the signal with an Arduino.
My objectives this week were:
1. To learn about, build and test a circuit that uses an LED to read a pulse
2. To use the serial port to read the raw data from the sensor I designed.
3. Use an LED as an output device that blinks every time a pulse is detected
4. Use the input information to drive an output device that can physically change its position in sync with the frequency of your pulse.
5. Bonus points if I can use a smoothing algorithm to process the signal noise, but no promises, as I am such a newbie to all this.
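For objective 5, the simplest smoothing option is a moving average. A quick Python sketch of the idea (the real version would run on the microcontroller, but the math is the same):

```python
from collections import deque

def smooth(samples, window=5):
    """Moving average: each output is the mean of the last `window` samples,
    which knocks down high-frequency sensor noise at the cost of some lag."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out
```

A wider window gives a cleaner trace but a slower response to real changes in the pulse.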
To start, I used the PulseSensorAmped, an Arduino, and a servo to do some testing. This is a video of that test: the reading is extremely noisy, reporting a heart rate of about 90 BPM (inaccurate). However, you can see that it responds very well to big drops in light (e.g., if I lift my finger completely off the light sensor and cover it again; I do this 5 seconds into the video).
After lots of testing and debugging on the Arduino, I designed my own circuit of a pulse detector that uses an IR LED and receiver. I decided to design it as a breakout board for ease of modular use and placement. I designed the graphic to represent what it is doing: "seeing" your pulse with an LED.
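Under the hood, detecting a beat from a sensor like this boils down to watching for the signal to rise through a threshold. A hypothetical Python version of that logic (the function name and numbers are mine, purely for illustration):

```python
def count_beats(signal, threshold):
    """Count rising crossings of the threshold; each crossing is one beat.
    The `above` flag prevents a single broad peak from counting twice."""
    beats = 0
    above = False
    for s in signal:
        if s > threshold and not above:
            beats += 1
            above = True
        elif s < threshold:
            above = False
    return beats

# Two peaks in this toy trace -> two beats
print(count_beats([0, 1, 5, 1, 0, 6, 7, 0], threshold=4))
```

Timing the interval between crossings then gives BPM (60,000 / interval in milliseconds).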
This is an image of the schematic:
This is an image of the way I translated the schematic into a board. I exported the png from Eagle, and then imported it into illustrator to draw this.
This is an image of the sensor in action:
I milled this custom heart rate sensing PCB and milled a new valve control board as well:
And that concludes this week's adventure in Input Devices!
I hope you enjoyed this "How To!"
Tutorials and Resources:
Beginner Processing Tutorials
Sparkfun Tutorial on I2C
Sparkfun Tutorial on Connecting Arduino to Processing
Adam Marblestone's super awesome HTM on EEG Sensing
Arduino Guide on Interfacing with Software via Blynk
I was lucky enough to discover that PulseAmped had a Processing Visualizer for their sensor. I was able to read and learn from their code. For this week, my work was in modifying this sketch to create an interface that I found more visually pleasing, and to animate and display my pulse in a customized way of my design. The original code that I based my modifications off of can be found here: GitHub of PulseSensor
General Concept of how to Make Arduino Talk with Processing:
1. In Arduino, you will need to set up your program to begin serial communication to your computer. Use:
2. Now, in Arduino, you will need to set up a loop so that this communication keeps happening as long as the program is running:
3. In Processing, you will first need to import the serial library:
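The code for these steps lived in screenshots, but the general pattern is simple: the Arduino prints one reading per line, and Processing reads the serial port line by line and converts each line to a number before drawing it. Here is that parsing step sketched in Python (the helper name is mine, for illustration only):

```python
def parse_sample(line):
    """Turn one raw serial line, e.g. b'512\r\n', into an int,
    returning None for anything that isn't a clean number."""
    text = line.decode("ascii", errors="ignore").strip()
    return int(text) if text.isdigit() else None

print(parse_sample(b"512\r\n"))   # 512
print(parse_sample(b"noise\n"))   # None
```

Ignoring malformed lines this way keeps the visualizer from crashing on the occasional garbled byte.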
This is a video of a live stream of my pulse, from the sensor into my microcontroller into Arduino, and from the Arduino serial port into Processing:
And that concludes this week's adventure in Interface & Application Programming!
I hope you enjoyed this "How To!"
I had another final the day after my HTM final, and had to do a hasty documentation. After taking a breather and finally sleeping, I am doing a proper update. This outline format is inspired by my very lovely classmate Caroline Jaffe.
Self-portraiture has a deep heritage in art history, dating all the way back to ancient Egypt. In today's cultural landscape, the selfie looms large and egocentrically, with great prevalence. Given my new skills in fabrication and electronics design, I wanted to push the envelope and create a new kind of dynamic self-portrait: a biometric self-portrait. I designed a machine that would take something emotionally charged and very personal, my own heart rate signal, and project it into a mechanical, kinetic, and expressive sculpture.
If you read further down this blog, you will see that my original intent was to construct a different final project altogether. When many of the planned components failed (note to the future HowToMake student: have many backup plans!), I had to reuse as many components of the previous project as possible and create a new one that would work. I think it was a valuable lesson in spiral development, being able to scale both up and down in ambition. The design of this biometric self-portrait was salvaged from different parts of different prototypes I had been experimenting with, and the design breaks down into several components, described in mini categories below:
I had spent the semester learning about pneumatics and valve control. Many of the challenges I faced had to do with noisy pumps, the cumbersome size of compressed air canisters, and the tradeoff between strength of airflow and size of pump. These were all problems because I wanted to create a wearable inflatable garment, but they could be mitigated in a sculpture that had a base and could be plugged into an outlet.
This is a sketch of how the system would work conceptually:
I knew from the beginning that I wanted to incorporate and integrate as many of the fabrication skills we learned in class as possible. I also knew that I wanted to mill a wooden base for the juxtaposition between an organic material and an algorithmically derived process. The design of the base features a signal from my pulse, captured during my application and programming week, displayed in two ways. The first is a 2d linear cut, used to house the wire of the pulse sensor. The second sweeps that pulse around a circle, creating a base for the valve that would be displaying my heart rate.
(Above) Diagram of integration.
3d modeling for component tolerances (above)
3d modeling for material study (below)
STL file for PartWorks.
For the more didactic part of my biometric self portrait, I chose to include a 3d scan of my head as a sculptural aside. Dan Chen graciously offered to scan me:
This is the [eerie] result of the scan:
Tom printed it on the Polyjet printer for me, and the result carried my resemblance.
I had worked towards the schematic of this circuit when I studied output devices. The goal is to drive a 12V motor in correspondence with the input from a pulse sensor. I chose the ATmega328 because initially I had grander plans and wanted to be able to scale up.
This is the schematic of the PCB. As I have been developing this technique all semester, I have integrated the circuit into a graphic and symbolic representation of a heart and valve. You can see the clear breakout between the circuit used to power the ATmega328 and the circuit to power the valve, which is slightly below it.
This is actually the intended way I wanted the circuit to look, but at the last minute discovered an error and did not have time to redraw the whole circuit in this veiny way:
Now came the fun and nail-biting process of putting it all together.
First up: milling the base. Big thanks to Tom Lutz and Sam Callisch for their patience and expertise. This was my first time milling wood, and I learned all about feed rates and spindle speeds for machining. Cutting speed is the speed difference between the cutting tool and the surface of the thing you are cutting, expressed in units of distance along your workpiece per unit time (usually surface feet per minute, sfm). Feed rate is the relative velocity at which your cutter advances along your workpiece; it depends on the motion of your tool, such as whether you are cutting straight paths or curved paths. Its unit is usually distance per spindle revolution (in/rev or ipr). Main variables to consider include: hardness of your workpiece material, your cutter material, the nature of the toolpath, and the desired finish. There are calculators online to help you find the sweet spot:
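The standard formulas those calculators use are short enough to write down. A sketch in Python (the 600 sfm and 0.005 ipr numbers below are placeholders for illustration, not recommendations for any particular wood or cutter):

```python
import math

def spindle_rpm(sfm, tool_diameter_in):
    """Spindle speed from surface speed: rpm = (sfm * 12) / (pi * D)."""
    return (sfm * 12.0) / (math.pi * tool_diameter_in)

def feed_rate_ipm(rpm, flutes, chip_load_ipr):
    """Feed rate (inches/min) from chip load per flute per revolution."""
    return rpm * flutes * chip_load_ipr

rpm = spindle_rpm(sfm=600, tool_diameter_in=0.25)   # roughly 9167 rpm
feed = feed_rate_ipm(rpm, flutes=2, chip_load_ipr=0.005)
```

The sweet spot is the pairing of rpm and feed where each flute takes a healthy chip: too slow a feed rubs and burns the wood, too fast a feed snaps the endmill.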
An image of the milling in process. The form here is my pulse wave, swept in a circle:
A video of the endmill caressing my heartbeat:
CATASTROPHE! I encountered this error at an utterly terrible moment.
Sam and Tom came to the rescue.
To save some unlucky future students some pain, I wrote this quick tutorial:
WHAT TO DO IF SHOPBOT FAILS
> Take a picture of everything.
> Things to note:
_line of gcode that it failed at
_ xyz position that it failed at
_ don’t move the position of the spindle at all
> Go into partworks text editor and open your original gcode file
> Make a copy of that file so you have a safe file
> Find the line you were last at.
_ note the position of xyz of your failure, and compare it to where it should have been
> Delete all M3 lines before the fail point (maybe save one line before the fail for reference)
> Note: keep track of your Z position. The ShopBot might blast through all of your material trying to return to where it was
HOW TO FIND THE NEW ZERO.
> Take note of the x, y, z position where the fail was
> Tell shopbot that this is the new zero
> Subtract those coordinates from the new zero. This should be the old zero!
> Note: my zeroing was not 100% accurate, but it got most of the way there. We ran the file with the new zero, cut a tiny bit, and then used calipers to measure the difference in the x and y dimensions
Using this new info, we were able to get a super accurate, almost seamless continuation of the shopbot file where it left off
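The coordinate bookkeeping in the steps above is just a translation of the origin. A tiny Python sketch of the arithmetic (the coordinates are made-up examples):

```python
def old_zero_in_new_frame(fail_xyz):
    """After re-zeroing at the fail point, the old origin sits at minus the
    fail point's coordinates (as recorded in the old coordinate system)."""
    fx, fy, fz = fail_xyz
    return (-fx, -fy, -fz)

def to_old_frame(point_new, fail_xyz):
    """Translate a point from the new frame back into the old frame."""
    return tuple(p + f for p, f in zip(point_new, fail_xyz))

fail = (3.0, 1.5, -0.4)                 # where the spindle stopped (old frame)
print(old_zero_in_new_frame(fail))      # (-3.0, -1.5, 0.4)
print(to_old_frame((0, 0, 0), fail))    # (3.0, 1.5, -0.4)
```

This is also why the calipers check matters: any error in the recorded fail position propagates directly into the recovered zero.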
Milling successfully finished despite glitch.
Onto PCB fabrication. I have spent more hours with the Modela this semester than any other machine.
My traces:
My cutout:
Milling on the Modela. Sometimes the machine is feeling kind, sometimes not. While I milled the final file, six traces popped off, which led to acrobatic bootloading, as you will see two images down.
Board stuffed. First of two.
Because my traces broke off while milling, I spent many painful hours trying to bootload and program by soldering directly onto pins and partial traces. It was crazy, but miraculously successful. I have to thank Sands, Thras, and Dan for their steady hands and constant encouragement.
I have learned that you need just a teeny tiny momentary connection to load a program onto the microcontroller. It is kind of amazing.
Onto casting an actuator in gypsum to dampen its noisy operation.
Demolding the cast. I liked the way it looked like some sort of ancient ruin.
Miscellaneous other tasks: laser cutting backup cover in case circuit doesn't work out, and hole needs to be patched.
Physical set up in progress.
Final setup: it Works!
This is the bill of materials for this project. Some of the parts, such as the PCB stock and microcontroller, are doubled because I made errors. There are components such as FTDI headers that I used but that do not show in the final circuit because I desoldered them for aesthetic reasons. A challenge I took up in the design process was integrating high tech components with traditional materials to create integrated and interesting juxtapositions.
I demoed successfully at the final review, but haven't taken a video yet, because my phone, which serves as my camera, broke that very morning. I am still looking forward to doing a proper documentation of the final product.
I have learned so many things in this class that have little to do with actual fabrication but are a very integral part of the process. Skills such as patience, having (many) solid backup plans, and creatively applying one skill towards varied outputs were very useful. In a way, I had to learn how to learn, but more importantly, I had to learn how to fail, and how to learn from failure. I have immensely enjoyed suffering through, and learning, all these new skills. I am excited to continue the journey of fabbing/making/programming, especially designing more PCBs with embedded aesthetics and making pneumatics that work well, consistently, and expressively. Thank you Neil for the great learning experience!
Probably against all better judgement, I had the idea for a different kind of pneumatic actuation, which is a small array of tubes that would act like "arm hairs" that could stand straight up when filled with air.
This is an image of the design. While I have CNC milled and 3d printed my previous molds, I decided to laser cut this one, because I thought it would be faster. (It was not, but more on that later.) On the left is the initial sketch for how to make such a device. On the right is the model with fabrication details such as holes for aligning and clamping with screws, extra layers for overflow material, and air holes for debubbling.
This is an image of the design with a section cut. The grey represents the mold for the hairs. The pink represents the voids where pneumatic actuation will happen. The blue are air channels for air to move to each hair.
I laser cut small samples first to test for tolerances with the screws and plexiglass rods.
But alas, even though the tests came out fine, the laser did not cut consistently across the entire bed. I spent a lot of time removing pieces that cut well and recutting some areas over 4 times. It is the small details that end up making your project take four times as long as you anticipated. :(
I finally got all the pieces cut, removed all the backing, and got to gluing. I used a syringe and SCRGRIP 4, which is a water thin plexi bonding adhesive that works very well. You can get it locally at Altec Plastics.
Images of the finished mold, and a detail shot of the chambers.
An image of the two part mold.
The air chamber positives
Silicone casting time!
It helps a lot with soft robots and silicone casting to use a vacuum to pull out the bubbles. Here is a video of this process if you've never seen it:
Image of each cast chamber with a positive for the voids:
Heartbreak Haven: I recast this twice with two different techniques, but I could not get the whole array to actuate. Instead of continuing to innovate a new method of actuation and debug the fabrication technique, I have moved on to focus on a deployable project for the final review. Here are some images of the (naively optimistic) detour into pneumatic array actuation:
A video of the single hair actuating:
The last few days I have been working on designing circuits that I have successfully tested out on the Arduino. Because I am a designer, I could not help but wonder if I could embed another layer of design into the PCB, in addition to its electronic function.
I designed six different boards: three are complete control boards with microcontrollers on them, and three are breakout boards. Of those, two are for valve control (which have design allusions to a heart), and one is for pulse detection (which illustrates the sensing via the hand). These are some of the circuits, followed by some photos of the boards. I am still in the process of stuffing and testing. I am happy to report that I am getting a lot better at debugging circuits. I feel that I am actually chasing electricity around the board, seeing where it goes and what it accomplishes (or, as is often the case, does not accomplish) on the journey.
This is the schematic for the control board for solenoid valve control using a MOSFET:
This is the schematic for valve control breakout:
This is the schematic for pulse detection:
Some images of the boards:
Video of my control board working:
This is still a working title, but these past few days I made another conceptual breakthrough in my project.
I have been doing a lot of searching about what it is that I am trying to build with this bodily extension I have been so obsessed with, and in the last week or so I started to think about it differently...
...that maybe this desire to build a prosthetic to express the repressed is actually an attempt to build something to be understood.
I started to think a lot about how to blur the differences between the self and the other, which made me realize that I was thinking about empathy.
The last few days I have been working on taking a biometric reading (currently heart rate) and mapping it to an actuator. While I work out the technical aspects, I did this little sound collage and rhythm test to see how my device might feel, to consider and inform my design process going forward. Right now I am considering whether the asymmetric distribution adds a desirable uncanniness or not. I am also learning that the rhythm of a heart requires two quick pulses, which might be difficult for my valve/fish tank pump to deliver. I am looking into compressed air canisters and other methods of actuation.
I also made this composition. A gut feeling made me take a photo. Not sure what else to say for now.
I am continuing to cast and test different materials and shapes to see how different morphologies will respond to pneumatic actuation. While the previous tests achieved a satisfying pulsating effect, this week I wanted to aim towards actual motion. My first test was cast in a small Tylenol bottle with a piece of jumper wire that I lubricated and pulled out to create a small, thin void. The material was EcoFlex 20. This was the result:
The test above produced a decent amount of motion. Motivated, I cast into a bigger tube, this time a container for urine samples, and placed two pieces of threaded rod to create the voids. This was the result:
Placed on a table, actuation simulates a feeling of taking my pneumatic tentacle out for a walk:
I am also continuing to test the chambered forms developed at the Wyss institute, this time varying the thickness of different walls to achieve a different actuation effect:
After some testing with automating valve action, I was ready to try to connect the system to an automated circuit. This was totally a moment of "IT'S ALIVE" thrill for me.
I enjoy the sound of one of my solenoids because it sounds a bit like a beating heart:
Just to see what would happen, I also connected a small pump, with no valve, to the soft robot:
A lot of my challenges had to do with coupling the air pump opening (1/8" diameter) to the large opening of my solenoid valve (3/4" diameter) and through to my soft robot (incision roughly 1/6"). Sharing an embarrassingly cluttered photo of my desk to show the coupling problems. Many, many thanks to Eric VanWyck, who helped me obtain a few valve couplers.
This weekend I had a conceptual breakthrough, and I am now working towards creating an empathy machine.
This was my first test with a higher quality solenoid valve. Oh the thrill of hearing that electromagnetically induced thump:
After switching solenoid valves, I was able to get the cheaper plastic valve working as well. I added some LED lights so I had a visual indicator of when the solenoid should be going off.
I plugged a bunch of different output devices in here to see how they would react in this setup with a transistor.
After playing around with the first cast, I wanted to do more iterations with different stiffness and pliability of materials, as well as different forms that could potentially be easier to cast. I got busy testing:
I made smaller molds this time, so that I could prototype and test more economically. I started with these open source designs from Soft Robotics Toolkit.
Testing out two durometers of mold material: Ecoflex 20 and DragonSkin 10. Added more combinations by inserting paper into some of the molds for a stiffer base, and cross bonding Ecoflex chambers on DragonSkin and vice versa.
Despite their petite size, the molds were incredibly difficult to take apart. Even though I printed at 80% density, many of my molds ripped apart. I suggest printing on the Eden if you have access; FDM printers are just not that great for this kind of mold making.
The moment of inflation always gives me thrills.
But it is definitely not always smooth sailing.
Testing pneumatic actuation with the new formwork:
Now that I had a prototype, I could test out some basic manual actuation, but more importantly for me, the affective components of its design and concept.
A test to see the inflation effect just by hand pumping.
Actuation test to see its emotional qualities next to my body:
Actuation test to see if I could sensorially connect it with breathing, or an autonomic response, such as heartbeat.
Test to see if it could feel like an extension of my flesh.
Testing for body placement and affective qualities.
I wanted to take the casting & molding week to create a prototype for my prosthetic device, but this cast turned out to be trickier than I thought, and I resorted to blogging about a simpler mold. But now I can finally blog about my journeys with the tentacle!
As I was searching around for prosthesis inspiration, I stumbled across this Robo-Tentacle Guide and immediately became obsessed with the idea of a soft robot for my final project.
For practice and experience, I followed the guide exactly to recreate my own version to experiment with. Of course, the journey was not straightforward at all, and I failed many times.
The general idea of this pneumatic robot is that it is made from soft silicone with three voids inside. Depending on which void is inflated, the tentacle will move away from the inflation. This gives the soft robot three degrees of motion.
What this means is that the molding and casting need to be done in several stages: 1. Create the mold/formwork for the entire exterior. 2. Create the positive formwork for the interior hollows. 3. Create the negative mold for the interior hollows. 4. Cast the entire exterior embedded with the negative molds. 5. Excavate the interior hollow material from the tentacle.
One of the first problems was that the recommended mold size could not fit into the Ultimaker. I had to scale down to 85%, but I forgot to scale down the interior hollow mold as well. This required the placement of the hollows to be very precise, or else some walls of my mold would be very thin. In the end a few did rupture, and I learned that I could patch them with SilPoxy. However, this oversight continued to be a problem throughout this prototyping iteration.
I started by 3d printing a positive form of the wax pieces that would eventually become the hollow negatives of the final product. This mold had some gaps between the filament that I tried to fuse with a hot soldering iron, and then eventually by rubbing wax into the gaps. I taped off the entire exterior to make sure it wouldn't leak and make a mess in the shop.
I cast OOMOO directly into the 3d print after a light spray of mold release. I destroyed the original formwork in the process of demolding.
Lesson learned: For 3d printed molds, set density to 80% or higher on the Ultimaker for robustness.
I used paraffin wax because I wanted a clear wax that would not stain or be too noticeable in case I couldn't excavate all of it. However, the paraffin wax was prone to breaking because it was so oily and soft.
Lesson learned: For lost wax casting that requires structural integrity, paraffin wax is probably too soft.
The next headache came with printing the exterior molds on the Ultimaker. Even after constraining the size of my mold to the maximum the bed could take, setting an infill of 80% made each print (1/3 of the mold) take about 27 hours each to print. Additionally, when they were finished, because of the orientation necessary to make the print fit, a lot of post processing was necessary.
Lesson learned: Orient the valuable part (aka the part you are casting into) of your 3d prints cleverly (facing up) to avoid post processing.
I clipped off the excess filament with angle cutters, filed and sanded the casting surface smooth, and then applied a thin layer of wax to further smooth out the ridges.
Lesson learned: A Dremel and sandpaper go a long way in cleaning up FDM prints.
I put the 3 part mold together and prepared to cast. This photo is actually from my second try. You can see in this iteration I reinforced the wax with chopsticks.
Lesson learned: Use struts to reinforce paraffin wax when using the lost wax casting method.
You can see the broken wax inside my first cast. This creates an air chamber that will not actuate properly because the void is not continuous.
Image of the reinforcement method for the wax.
It yielded much better results.
I placed the whole mold in the oven at 200 degrees F to melt the wax out of the interior.
Now that I had a prototype, I could test out some basic manual actuation! I hooked up some appropriated hand pumps from blood pressure cuffs. The next post will be about testing my two prototypes.
This marks the beginning of the concept ideation. I am including the sketches from the first week as a reference.
This is scenario: In the near future, humans, cultivated by the supremacy of a logic based computational world, will become so obsessively hygienic and reason driven that they will lose a connection to their biological and animal impulses. To aid in this tragedy and release the inner ID in every computer scientist, I propose a robotic limb that will allow the refined urban dweller to release his or her animal urge for short recreational durations. I made a mini slide show to create a character sketch for the narrative I am trying to build. All photos are from this amazing artist named Michael Wolf.
Conformity, self-control, rationality, and emotional regulation are traits we commonly aspire to in a civilized, regulated world (and for good reason). However, this overly hygienic regime can be numbing. The antidote I would like to engineer for this desensitized urban dweller is a prosthetic to lose control, a sort of phantom limb for emotions. A limb that allows me to express emotions that I am too repressed or obedient to express.
Right now, I am experimenting with anger. As an Asian, as a woman, as someone living in this contemporary society, expressing anger feels embarrassing and unacceptable, like losing oneself or not maintaining control. These sketches experiment with the idea of an angry arm that bangs on the table and flips people off.
I’m still in the process of working through it; there are many repressed emotions, and I feel like this is still a very simple one. It is definitely in progress.
I propose a limb that frees the wearer from these norms and allows the repressed to come out to play, if only for a short recreational while.
Thanks for reading my ongoing progress on my final project.
I hope you are enjoying the insights into the creative process!
Team CBA kept a blog about our Modular Machine Progress here.
BACKGROUND:
Inspired by, and based upon, the Machines That Make project. In MIT's How to Make (Almost) Anything class, we learn different digital fabrication skills each week. Towards the end of the course, these skills are integrated in a group project where we collaboratively build a machine based on the Modular Machines that Make project developed by Nadya Peek. This is a collaborative document from the CBA section showing the machine we made and how we made it.
TECHNICAL DESCRIPTION: Before we had even decided what our machine would actually do, we needed to produce and assemble some of the machine components. Following these instructions, we broke into teams to tackle different components. Vera and Kim used the cold saw to cut machine shafts of the correct length. Caroline and Ali used the Modela to mill the driver board for the Gestalt nodes, and then went to the basement lab to stuff the board and construct ribbon cables. Dhruv, Yasmine, Raphael and Camille used the laser cutter to cut cardboard for the stages. Yasushi, Sands and Joshuah installed the PyGestalt software that will allow us to control our machine. Dan, Sooyeon, Camille and Vera used glue and plastic clips to assemble the laser-cut cardboard into the stages. Thanks to Nadya and Dan for instruction and oversight!
Camille proposed creating a gestural interface between the user's tongue and a CNC-controlled tongue. The machine will be used to gain psycho-sensory satisfaction by virtually licking a cake while watching a robotic tongue act on the cake. There will be 4 degrees of freedom for the tongue's expressivity: x, y, z, and theta. The z-axis motion can be accomplished either by moving the toolhead up and down or by moving the cake up and down. The theta motion can be accomplished by installing a servo at the base of the tongue for rotation or by installing pneumatic actuation. The toolpath is generated by tracking the motion of the user's own tongue movements. The x, y movements of the user's tongue will be mapped to x, y movements of the CNC tongue or of the cake pan. Ideally, this would be accomplished through computer vision, potentially by adding a white dot at the end of the user's tongue for easier tracking. Alternatively, an iPad or iPhone application can be created to obtain haptic sensor data through the touch screen. The team tested this idea by licking the phone screens. It worked just as well as swiping with fingers. The tongue can be connected to the actuators by screws or forks. The effector team will decide.
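The x, y mapping described above can be sketched as a small function that converts a tracked touch (or tongue) position into stage coordinates. This is only an illustration: the screen and stage dimensions here are placeholder values, not measurements from our machine.

```python
def touch_to_machine(x_px, y_px, screen_w=750, screen_h=1334,
                     stage_w_mm=100.0, stage_h_mm=100.0):
    """Map a touch-screen point to x, y stage coordinates in mm.

    Screen resolution and working-area size are hypothetical
    placeholders, not the dimensions of our actual machine.
    """
    # Normalize to 0..1, clamping in case the tracker drifts off-screen.
    u = min(max(x_px / screen_w, 0.0), 1.0)
    v = min(max(y_px / screen_h, 0.0), 1.0)
    # Scale to the working area, centering the origin on the cake pan.
    x_mm = (u - 0.5) * stage_w_mm
    y_mm = (v - 0.5) * stage_h_mm
    return x_mm, y_mm
```

The clamp matters because computer-vision tracking of a tongue tip will occasionally jump outside the frame, and we don't want the toolhead chasing it off the cake.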
The tongue can be made by casting gelatin or silicone; by 3D printing; or by creating a pneumatic actuator. The tongue will be tested with materials of various levels of hardness.
Ani will make conceptual drawings. Ani and Daniel L. will explore pneumatic robotic actuation. Ani will test casting techniques for pneumatic actuators. Daniel L. will explore 3D printing elastic materials that have different levels of hardness and elasticity. Joshua Jest will design 3D models of different tongue forms and tongue textures. Sooyeon will update the group document based on the meeting notes. Camille and Sooyeon will cast the prototype static tongues. Caroline and Ani will liaise between the effector group and the toolhead group.
The task of modeling a tongue fell on Joshuah. He designed the tongue using Rhino. Sectional cuts of a tongue were eyeballed in 2D from a sketch. These sectional contours were then used to define a 3D form using Rhino's loft command (which takes some finessing). The more sections, the better the outcome. Once the form had been generated, the dimensions could be fine-tuned: first to the true scale of a tongue, and then stretched to approximate different tongue shapes (pointed, normal, flat). Eventually the normal tongue was selected to be milled and cast. For the prototype, and for potential variety in different end effectors (tongue shapes), tongues in OOMOO, drystone and clay were cast/molded. Camille cast a drystone tongue so that there was a model that could be used for Ani's pneumatic tongue casting. Furthermore, many clay tongues were molded to allow for differently shaped tongue end effectors. We wanted to play with the variety of tongue shapes in order to give the user more options for potential cake decorating/licking. We thought about options such as a clover-shaped tongue, or a rolled tongue, to see if these could give a different quality of cake licking. We also cast the tongue with Dragon Skin silicone rubber, fixing a screw inside the cast so it could be attached to the actuator.
Ani explored pneumatic actuation for the tongue mold and tongue-like molds. This is a test casting with two chambers as a proof of concept. Kim, Vera, Harpreet, Raphael, and Ali met to discuss the design of the frame. After a test assembly, the frame was modeled in Rhino. The main components are the three stages in a vertical “Delta Bot configuration”. They’re held in place on a bottom platform in slots and by a top cover. The platform was cast out of hydrostone and the top laser cut from cardboard. The detailed plans show the measurements that were important for the math of the whole mechanism. The dimensions are chosen so that we can fit a cake in the center that serves the whole group. The mold was milled out of foam using the ShopBot. Three layers of foam were glued together using Gorilla Glue and clamped overnight.
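For the curious, the "math of the whole mechanism" for a vertical delta configuration boils down to inverse kinematics: given where we want the effector (tongue) to be, compute how high each of the three stages must move. Here is a minimal sketch; the tower radius, arm length, and tower angles are hypothetical values, not our machine's measured dimensions.

```python
import math

# Hypothetical geometry -- placeholders, not our machine's measurements.
TOWER_RADIUS = 80.0   # mm from the center to each vertical stage
ARM_LENGTH = 120.0    # mm, carriage-to-effector linkage

# Standard delta layout: three towers 120 degrees apart.
TOWERS = [
    (TOWER_RADIUS * math.cos(math.radians(a)),
     TOWER_RADIUS * math.sin(math.radians(a)))
    for a in (90, 210, 330)
]

def carriage_heights(x, y, z):
    """Inverse kinematics for a linear delta: given an effector
    position, return the height each of the three stages must take."""
    heights = []
    for tx, ty in TOWERS:
        d2 = (tx - x) ** 2 + (ty - y) ** 2
        if d2 > ARM_LENGTH ** 2:
            raise ValueError("point outside reachable envelope")
        # Each arm is a rigid link, so the carriage sits higher than
        # the effector by the vertical leg of a right triangle.
        heights.append(z + math.sqrt(ARM_LENGTH ** 2 - d2))
    return heights
```

By symmetry, a point at the center gives three equal carriage heights, and raising the effector in z raises all three carriages by the same amount.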
The mold was sealed with gesso and paste wax was applied as a release agent. Hydrostone was used to cast the base of the machine.
The Arm Team went through several nomenclature iterations (first Toolhead Team, eventually Arm Team). We met several times during the course of the project. In our first meeting (with Caroline, Eric, Andy K, and Ani as a liaison for the Tongue Team), we came up with the idea to make our machine a Delta Bot, which uses the combined motions of three vertical axes to effect motion in the x, y, and z planes, and looks roughly like the following image. We posited this idea to the rest of the group, who received it graciously. Our task was to build the linkage system that interfaces with the cardboard stages (from the Frame Team) and the tongue (from the Tongue/End Effector Team). We coordinated among ourselves to carry out the following subtasks. Here are some pictures of our process in progress and the assembled frame. The cake-licking machine requires real-time communication between a touch screen and the machine itself. After discussing and examining different solutions for the implementation, the software team decided on the following architecture: phone client > WebSockets server (HTML) > Python server > Gestalt. We divided responsibilities. The code we wrote can be found in this git repository. The app that sends "licks" to the server and then to the machine is available at lickr.herokuapp.com (assuming only one user is on this page when operating the machine). When integrating the software with the machine we ran into a few obstacles: a malfunctioning power cable, uneven behavior of motors, and more. Some pictures from our meetings and the work on integrating the software with the machine:
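As a sketch of what the Python layer in that pipeline does: it receives one "lick" event forwarded from the WebSockets bridge and turns it into a move command for the Gestalt layer. The JSON schema and the 100 mm working area here are assumptions for illustration, not the exact format the lickr app used.

```python
import json

def handle_lick(message):
    """Decode one 'lick' event from the phone client and convert it
    into a move tuple for the motion-control layer.

    Assumed (hypothetical) schema: {"x": 0..1, "y": 0..1}, with touch
    positions normalized by the web client before sending.
    """
    event = json.loads(message)
    u = float(event["x"])
    v = float(event["y"])
    # Scale into a hypothetical 100 mm square working area.
    return ("move", round(u * 100.0, 2), round(v * 100.0, 2))
```

Keeping the phone-facing format as normalized coordinates means the web client never needs to know the machine's physical dimensions; only the Python server does.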
Many thanks to the CBA section for taking on this adventure!!! This will be a lasting HTM memory. #grateful Bonus: Look how happy we made Neil!
And that concludes this week's adventure in Modular Machines!
Creating the mold was fairly straightforward. The tongue form was rotated so that the two-part mold would cut the tongue axially, minimizing overhang in each half and creating shallow molds, which would later help with silicone casting. The tongue was then boolean-differenced out of the two blocks to form the negative mold. As the tongue geometry was fairly complex, this didn't work on the first try. Rhino is notorious for boolean fails. If geometry is ill-defined, or misaligned by any non-zero value, you're screwed. There are plenty of workarounds, and sometimes even rotating the geometry by 90 degrees in any direction can solve the issue. In this case Rhino's explode command (typing this command can also be cathartic when dealing with a failed boolean for too long) was able to isolate the surface of each of the tongue halves, around which the wax cube could be reconstructed with simple surfaces. All surfaces were then combined using the join command to create a solid - a closed geometry essential for both boolean commands and for milling.
Milling went smoothly, and the results can be seen below.
Casting Prototype Static Tongues
Casting and Prototyping Pneumatic tongues
And here is a video with a larger prototype complete with a "simulated" cake:
This is an in process image of the tongue being cast with pneumatic chambers:
Frame Team (Kim, Vera, Harpreet, Raphael, Yasushi, Ali, Dhruv)
Initial Meeting: Saturday 11/16
Milling and Casting
Arm Team (Caroline, Andy K, Eric, Dhruv)
Software Team (Andy S, Miguel, Mike, Sands, Jasmin, Yasushi, Viirj)
I hope you enjoyed this "How To"!
Tutorials and Resources:
Beginner Processing Tutorials
Sparkfun Tutorial on I2C
Sparkfun Tutorial on Connecting Arduino to Processing
Adam Marblestone's super awesome HTM on EEG Sensing
Arduino Guide on Interfacing with Software via Blynk
I was lucky enough to discover that Pulse Sensor Amped had a Processing visualizer for their sensor. I was able to read and learn from their code. For this week, my work was in modifying this sketch to create an interface that I found more visually pleasing, and to animate and display my pulse in a customized way of my own design. The original code that I based my modifications on can be found here: GitHub of PulseSensor
General Concept of how to Make Arduino Talk with Processing:
1. In Arduino, you will need to set up your program to begin serial communication with your computer. Use:
2. Now, in Arduino, you will need to set up a loop so that this communication keeps happening as long as the program is running:
3. In Processing, you will first need to import the serial library:
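Once the three steps above are in place, the receiving side just reads newline-terminated values and parses them. The same parsing logic is sketched here in Python rather than Processing for brevity. The 'S'/'B'/'Q' prefixes follow the PulseSensor visualizer convention as I understand it (raw signal, BPM, and inter-beat interval); double-check against the linked GitHub repo.

```python
def parse_pulse_line(line):
    """Parse one serial line from a PulseSensor-style Arduino sketch.

    Assumed line format: a single prefix letter followed by an integer,
    e.g. "S512" (raw signal), "B72" (BPM), "Q650" (inter-beat interval).
    Returns (prefix, value), or None for malformed/unrelated lines.
    """
    line = line.strip()
    if not line or line[0] not in "SBQ":
        return None  # ignore lines that don't match the protocol
    try:
        return line[0], int(line[1:])
    except ValueError:
        return None  # prefix present but the payload wasn't a number
```

Tolerating malformed lines matters in practice: serial streams often start mid-line when the visualizer connects, so the first read is frequently garbage.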
Haven't gotten to this week yet!
If you are hungry for more, check out my previous work!
And that concludes this week's adventure in Interface & Application Programming!
I hope you enjoyed this "How To"!
Probably against all better judgement, I had the idea for a different kind of pneumatic actuation, which is a small array of tubes that would act like "arm hairs" that could stand straight up when filled with air.
This is an image of the design. While I have CNC milled and 3d printed my previous molds, I decided to laser cut this one, because I thought it would be faster. (It was not, but more on that later.) On the left is the initial sketch for how to make such a device. On the right is the model with fabrication details such as holes for aligning and clamping with screws, extra layers for overflow material, and air holes for debubbling.
This is an image of the design with a section cut. The grey represents the mold for the hairs. The pink represents the voids where pneumatic actuation will happen. The blue are air channels for air to move to each hair.
I laser cut small samples first to test for tolerances with the screws and plexiglass rods.
But alas, even though the tests came out fine, the laser did not cut consistently through the entire bed. I spent a lot of time removing pieces that cut well and recutting some areas over 4 times. It is the small details that end up making your project take four times as long as you anticipated. :(
I finally got all the pieces cut, removed all the backing, and got to gluing. I used a syringe and SCIGRIP 4, which is a water-thin plexi bonding adhesive that works very well. You can get it locally at Altec Plastics.
Images of the finished mold, and a detail shot of the chambers.
An image of the two part mold.
The air chamber positives
Silicone casting time!
It helps a lot with soft robots and silicone casting to use a vacuum to pull out the bubbles. Here is a video of this process if you've never seen it:
I was able to stuff, bootload, and program my ATtiny control valve boards. Here are some videos of the board in action, along with some soft actuators that I 3d printed the molds for and cast in EcoFlex-20:
The last few days I have been working on designing circuits that I have successfully tested out on the Arduino. Because I am a designer, I could not help but wonder if I could embed another layer of design into the PCB, in addition to its electronic function.
I designed six different boards: three are complete control boards with microcontrollers on them, and three are breakout boards. Of the breakouts, two are for valve control (with design allusions to a heart), and one is for pulse detection (illustrating the sensing via the hand). These are some of the circuits, followed by some photos of the boards. I am still in the process of stuffing and testing. I am happy to report that I am getting a lot better at debugging circuits. I feel that I am actually chasing electricity around the board, seeing where it goes and what it accomplishes (or, as is often the case, doesn't accomplish) on the journey.
This is the schematic for the control board for solenoid valve control using a MOSFET:
This is the schematic for valve control breakout:
This is the schematic for pulse detection:
Some images of the boards:
This is still a working title, but these past few days I made another conceptual breakthrough in my project.
I have been doing a lot of searching about what it is that I am trying to build with this bodily extension I have been so obsessed with, and in the last week or so I started to think about it differently...
...that maybe this desire to build a prosthetic to express the repressed is actually an attempt to build something to be understood.
I started to think a lot about how to make the differences between the self and the other less distinct, which made me realize that I was thinking about empathy.
The last few days I have been working on taking a biometric reading (currently heart rate) and mapping it to an actuator. As I work out the technical aspects, I did this little sound collage and rhythm test to see how my device might feel, to consider and inform my design process going forward. Right now I am considering whether the asymmetric distribution adds a desirable uncanniness or not. I am also learning that the rhythm of a heart requires two quick pulses, which might be difficult for my valve/fish tank pump to deliver. I am looking into compressed air canisters and other methods of actuation.
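To reason about the two-quick-pulses problem, it helps to write out what a "lub-dub" valve schedule would actually look like. This sketch generates open/close events for the solenoid; the timing constants are guesses chosen for feel, not physiological values.

```python
def lub_dub_schedule(bpm, beats=2, pulse_s=0.08, gap_s=0.12):
    """Return (time_s, state) valve events for a lub-dub heartbeat.

    Each beat is two short puffs ('lub' and 'dub') separated by a small
    gap, then rest until the next beat. pulse_s and gap_s are guessed
    durations, tuned by ear rather than taken from cardiology.
    """
    period = 60.0 / bpm  # seconds per heartbeat
    events = []
    for i in range(beats):
        t = i * period
        # first puff
        events.append((round(t, 3), "open"))
        events.append((round(t + pulse_s, 3), "closed"))
        # second puff after a short gap
        t2 = t + pulse_s + gap_s
        events.append((round(t2, 3), "open"))
        events.append((round(t2 + pulse_s, 3), "closed"))
    return events
```

Writing the schedule out this way makes the valve's constraint obvious: at 60 BPM the second puff must open only 0.2 s after the first, which is exactly the kind of fast re-actuation a cheap fish-tank-pump setup struggles with.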
I also made this composition. A gut feeling made me take a photo. Not sure what else to say for now.
I am continuing to cast and test different materials and shapes to see how different morphologies respond to pneumatic actuation. While the previous tests achieved a satisfying pulsating effect, this week I wanted to aim for actual motion. My first test was cast in a small Tylenol bottle with a piece of jumper wire that I lubricated and pulled out to create a small thin void. The material was EcoFlex 20. This was the result:
The test above produced a decent amount of motion. Motivated, I cast into a bigger tube, this time a container for urine samples, and placed two pieces of threaded rod to create the voids. This was the result:
Placed on a table, actuation simulates a feeling of taking my pneumatic tentacle out for a walk:
I am also continuing to test the chambered forms developed at the Wyss Institute, this time varying the thickness of different walls to achieve a different actuation effect:
After some testing with automating valve action, I was ready to try to connect the system to an automated circuit. This was totally a moment of "IT'S ALIVE" thrill for me.
I enjoy the sound of one of my solenoids because it sounds a bit like a beating heart:
Just to see what would happen, I also connected a small pump, with no valve, to the soft robot:
A lot of my challenges had to do with coupling the air pump opening (1/8" diameter) to the large opening of my solenoid valve (3/4" diameter) through to my soft robot (incision roughly 1/6"). Sharing an embarrassingly cluttered photo of my desk to show the coupling problems. Many, many thanks to Eric VanWyck, who helped me obtain a few valve couplers.
This weekend I had a conceptual breakthrough, and I am now working towards creating an empathy machine.
This was my first test with a higher quality solenoid valve. Oh, the thrill of hearing that electromagnetically induced thump:
After switching solenoid valves, I was able to get the cheaper plastic valve working as well. I added some LED lights so I had a visual indicator of when the solenoid should be going off.
I plugged in a bunch of different output devices in here, to see how they would react in this setup with a transistor.
After playing around with the first cast, I wanted to do more iterations with different stiffness and pliability of materials, as well as different forms that could potentially be easier to cast. I got busy testing:
I made smaller molds this time, so that I could prototype and test more economically. I started with these open source designs from Soft Robotics Toolkit.
Testing out two durometers of casting silicone: Ecoflex 20 and Dragon Skin 10. I added more combinations by inserting paper into some of the casts for a stiffer base, and cross-bonding Ecoflex chambers on Dragon Skin and vice versa.
Despite their petite size, the molds were incredibly difficult to take apart. Even at 80% print density, many of my molds ripped apart. I suggest printing on the Eden if you have access; FDM printers are just not that great for this kind of mold making.
Moment of inflation always gives me thrills.
But it is definitely not always smooth sailing.
Testing pneumatic actuation with the new formwork:
Now that I had a prototype, I could test out some basic manual actuation, but more importantly for me, the affective components of its design and concept.
A test to see the inflation effect just by hand pumping.
Actuation test to see its emotional qualities next to my body:
Actuation test to see if I could sensorially connect it with breathing, or an autonomic response, such as heartbeat.
Test to see if it could feel like an extension of my flesh.
Testing for body placement and affective qualities.
I wanted to take the casting & molding week to create a prototype for my prosthetic device, but this cast turned out to be trickier than I thought, and I resorted to blogging about a simpler mold. But now I can finally blog about my journeys with the tentacle!
As I was searching around for prosthesis inspiration, I stumbled across this Robo-Tentacle Guide and immediately became obsessed with the idea of a soft robot for my final project.
For practice and experience, I followed the guide exactly to try to recreate my own to experiment with, and of course, the journey was not straightforward at all, and I failed many times.
The general idea of this pneumatic robot is that it is made from soft silicone with three voids inside. Depending on which void is inflated, the tentacle will move away from the inflation. This gives the soft robot three degrees of motion.
What this means is that the molding and casting needs to be done in several stages: 1. Create the mold/formwork for the entire exterior. 2. Create the positive formwork for the interior hollows. 3. Create the negative mold for the interior hollows. 4. Cast the entire exterior embedded with the negative molds. 5. Excavate the interior hollow material from the tentacle.
One of the first problems was that the recommended mold size could not fit into the Ultimaker. I had to scale down to 85%, but I forgot to scale the interior hollow mold down as well. This meant the placement of the hollows had to be very precise, or else some walls of my mold would be very thin. In the end a few did rupture, and I learned that I could patch them with Sil-Poxy. However, this oversight continued to be a problem for this prototyping iteration.
I started by 3d printing a positive form of the wax pieces that would eventually become the hollow negatives of the final product. This print had some gaps between the filament that I tried to fuse with a hot soldering iron, and then eventually by rubbing wax into the gaps. I taped off the entire exterior to make sure it wouldn't leak and make a mess in the shop.
I cast OOMOO directly into the 3d print after a light spray of mold release. I destroyed the original formwork in the process of demolding.
Lesson learned: For 3d printed molds, set density to 80% or higher on the Ultimaker for robustness.
I used paraffin wax because I wanted a clear wax that would not stain or be too noticeable in case I couldn't excavate all of it. However, the paraffin wax was prone to breaking because it was so oily and soft.
Lesson learned: For lost-wax casting that requires structural integrity, paraffin wax is probably too soft.
The next headache came with printing the exterior molds on the Ultimaker. Even after constraining the size of my mold to the maximum the bed could take, setting an infill of 80% made each print (1/3 of the mold) take about 27 hours. Additionally, when they were finished, a lot of post processing was necessary because of the orientation needed to make the print fit.
Lesson learned: Orient the valuable part (aka the part you are casting into) of your 3d prints cleverly (facing up) to avoid post processing.
I clipped off the excess filament with angle cutters, filed and sanded the casting surface smooth, and then applied a thin layer of wax to further smooth out the ridges.
Lesson learned: A Dremel and sandpaper go a long way in cleaning up FDM prints.
I put the 3 part mold together and prepared to cast. This photo is actually from my second try. You can see in this iteration I reinforced the wax with chopsticks.
Lesson learned: Use struts to reinforce paraffin wax when using the lost-wax casting method.
You can see the broken wax inside my first cast. This creates an air chamber that will not actuate properly because the void is not continuous.
Image of the reinforcement method for the wax.
It yielded much better results.
I placed the whole mold in the oven at 200 degrees F to melt the wax out of the interior.
Now that I had a prototype, I could test out some basic manual actuation! I hooked up some appropriated hand pumps from blood pressure cuffs. The next post will be about testing my two prototypes.
This marks the beginning of the concept ideation. I am including the sketches from the first week as a reference.
This is the scenario: In the near future, humans, cultivated by the supremacy of a logic-based computational world, will become so obsessively hygienic and reason-driven that they will lose their connection to their biological and animal impulses. To aid in this tragedy and release the inner id in every computer scientist, I propose a robotic limb that will allow the refined urban dweller to release his or her animal urges for short recreational durations. I made a mini slide show to create a character sketch for the narrative I am trying to build. All photos are from the amazing artist Michael Wolf.
Conformity, self-control, rationality, and emotional regulation are traits we commonly aspire to in a civilized, regulated world (and for good reason). However, this overly hygienic regime can be numbing. The antidote I would like to engineer for this desensitized urban dweller is a prosthetic to lose control, a sort of phantom limb for emotions. A limb that allows me to express emotions that I am too repressed or obedient to express.
Right now, I am experimenting with anger. As an Asian, as a woman, as someone living in this contemporary society, expressing anger feels embarrassing and unacceptable, like losing oneself or not maintaining control. These sketches experiment with the idea of an angry arm that bangs on the table and flips people off.
I’m still in the process of working through it; there are many repressed emotions and I feel like this is still a very simple one. It is definitely in progress.
The antidote I would like to engineer for this numbed urban dweller is a prosthetic to lose control, a sort of phantom limb for emotions. A limb that allows me to express emotions that I am too repressed or obedient to express. I propose a limb that frees the wearer from these norms and allows the repressed to come out to play, if only for a short recreational while.
Thanks for reading my ongoing progress on my final project.
I hope you are enjoying the insights into the creative process!