This week I wanted to continue working on my two ideas:
1. What if Foodcam was a hat? And, probably more interestingly:
2. What if a plant and a satellite could dance?
The second idea came out of a lecture by Benjamin Bratton, where he spoke about how the satellites orbiting Earth are now creating a kind of figurative skin around the planet. I had a number of what I think are interesting (weird) ideas during the lecture, but I found the thought about satellites and extensions towards planetary computing very evocative. It got me thinking immediately about this week's assignment.
I haven't spent much time thinking about satellites previously, but it suddenly felt incredible to me that an increasingly large number of advanced technologies are hurtling through space around the planet, powering many aspects of our lives, yet we don't have much of a relationship with them.
It immediately triggered a thought about the contrast between these satellites, advanced human-made devices blasted into space and moving at around 17,000 mph, and, say, a tree or a plant: spawned of nature, utterly grounded, and for all intents and purposes stationary.
One of my research interests is interaction between species, particularly outside of the human context. As a lot of my work in the past has been about plants, I naturally started thinking about the ways in which a plant and a satellite might exchange. What might a plant be able to sense from a satellite?
What kinds of connection, relationship, or exchange might occur between a plant and a satellite? Of course, satellites are not a traditional species, but as more and more objects orbit Earth, they start to form a new ecosystem there, and as technology advances, particularly via AI, these objects may in the future be seen in a new light. They may even form relationships with each other or with other species.
I liked the metaphor of a dance, especially as immediately I thought about working with a robot arm to mediate and assist an unmoving plant in its potential waltz with a satellite.
After some research, I found hints that there were pathways towards realizing this idea. The general high-level plan was something like this:
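Roughly, reconstructing it from the steps that follow: satellite positions arrive via GPS, get stored as an array of (x, y, z) direction vectors, those get mapped (for now, crudely) to joint angles, and the angles are sent to a robot arm holding a plant, so the plant can follow the satellites overhead.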
As a quick recap from last week, I managed to get the GPS to send me coordinates of satellites, and I stored these in the array below as a proxy for live GPS data. This week was more about translating that into movement.
The data gives me a timestamp, satellite number, coordinates (x, y, z), and the SNR. Of course, all I really needed for now was a simple array of positional data, so I formatted it as follows:
coordinates = [
[0.895, 0.078, 0.438],
[0.731, 0.571, 0.375],
[-0.321, -0.476, 0.819],
[0.087, -0.989, 0.122],
[0.752, 0.146, 0.643],
[0.087, -0.989, 0.122],
[0.895, 0.078, 0.438],
[0.109, 0.884, 0.454],
[0.086, 0.017, 0.996],
[0.109, 0.884, 0.454],
[0.085, 0.021, 0.996],
[0.544, 0.838, 0.052],
[-0.313, -0.464, 0.829],
[0.109, 0.884, 0.454],
[-0.689, 0.714, 0.122],
[0.085, 0.021, 0.996],
[0.752, 0.146, 0.643],
[0.087, -0.989, 0.122],
[0.099, 0.034, 0.995],
[0.099, 0.034, 0.995],
[0.120, 0.980, 0.156],
[-0.689, 0.714, 0.122],
[0.120, 0.980, 0.156],
[-0.688, 0.712, 0.139],
[0.600, 0.797, 0.070],
[0.137, 0.978, 0.156],
[0.894, 0.094, 0.438],
[-0.603, 0.524, 0.602],
[-0.595, 0.517, 0.616],
[0.765, 0.135, 0.629],
[0.086, -0.986, 0.139],
[0.097, 0.039, 0.995],
[0.097, 0.039, 0.995],
[0.768, 0.122, 0.629],
[0.768, 0.122, 0.629],
[0.726, 0.588, 0.358],
[0.731, 0.571, 0.375],
[-0.689, 0.714, 0.122],
[-0.313, -0.464, 0.829],
[0.765, 0.135, 0.629],
[0.765, 0.135, 0.629],
[0.731, 0.571, 0.375],
[0.099, 0.034, 0.995],
[0.109, 0.884, 0.454],
[0.085, 0.021, 0.996],
[0.544, 0.838, 0.052],
[-0.313, -0.464, 0.829],
[0.109, 0.884, 0.454],
[-0.689, 0.714, 0.122]
]
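Each triple is, near enough, a unit direction vector. For context, here is a minimal sketch of how the elevation/azimuth pairs in the GPS's GSV sentences could be turned into such vectors; the function and the east-north-up convention are my assumptions for illustration, not the exact code from last week:

import math

def sat_direction(elevation_deg, azimuth_deg):
    # Convert a satellite's elevation/azimuth (in degrees) into a
    # unit direction vector (x, y, z), with z pointing straight up.
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    x = math.cos(el) * math.sin(az)  # east component
    y = math.cos(el) * math.cos(az)  # north component
    z = math.sin(el)                 # up component
    return [round(x, 3), round(y, 3), round(z, 3)]

# e.g. a satellite 26° above the horizon at a bearing of 85°:
print(sat_direction(26, 85))  # -> [0.895, 0.078, 0.438], like the first row above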
Based on the GPS data from the satellites, I did a first test with a large robot arm, with the help of Kye, who has been working with it for some time.
It was programmed in Python and RoboDK.
I was then able to track down a smaller robot arm, the myCobot 280 M5, to test in more detail. Miana gave me some high-level training, but it also seemed like the robot arm was having a lot of issues during the training:
In any case, I decided to give it a go…
I started out by tracking down the GitHub repo and reading through the docs. I set up a (conda) environment as described here:
https://docs.elephantrobotics.com/docs/gitbook-en/7-ApplicationBasePython/7.1_download.html
Reading through the docs, I also found that you can 3D print a base for the robot arm, amongst other things, here:
https://www.3dfindit.com/en/cad-bim-library/manufacturer/elephant-robotics?path=elephant_robotics
To communicate with the robot arm, we plug it in via USB-C and also have to supply it with power. Within the 'Transponder' menu, we then select USB UART, and it should display something like Atom: OK, which I believe means it's connected to one of its two microcontrollers.
In the terminal, we can then run ls /dev/tty* to see which serial ports are available for communicating with the robot arm; in my case it was "/dev/tty.usbserial-0252EBFC". The baud rate should be set to 115200.
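As a quick sanity check that the link is up, reading back the joint angles should work; get_angles() is part of the pymycobot API, and the port string is of course machine-specific:

from pymycobot.mycobot import MyCobot

# open the serial connection found above
mc = MyCobot("/dev/tty.usbserial-0252EBFC", 115200)

# if the connection works, this prints the six current joint angles in degrees
print(mc.get_angles())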
I tried to run some basic Python, as we had done with Kye, to move the robot arm around, but it did not seem to like moving very much.
At other times, it seemed to have suicidal tendencies. So I looked into how I could properly calibrate this wild creature.
I then downloaded the Elephant Robotics software called MyStudio: https://github.com/elephantrobotics/myStudio
I saw that they had just released a new version, which is either good or bad. I downloaded it, but unfortunately, the calibration software failed to run. I tried uninstalling it and installing an older version, but it seems that version had exactly the same issue, which this new release was supposed to fix.
So I filed a GitHub issue and tried to continue.
As a temporary fix, I calibrated the robot manually by setting its positions. There are small indicators at each axis, which are locked by the servos when the robot is connected. So I went into the calibration menu and manually calibrated each position, hoping that might help. (It did not.)
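If I understand the pymycobot docs correctly, the same zeroing can also be done from code with release_all_servos() and set_servo_calibration(); a rough sketch of that idea:

from pymycobot.mycobot import MyCobot
import time

mc = MyCobot("/dev/tty.usbserial-0252EBFC", 115200)

# release the servos so the joints can be moved by hand
mc.release_all_servos()
input("Line up the indicator marks on every joint, then press Enter...")

# tell each of the six servos that its current position is its zero
for servo_id in range(1, 7):
    mc.set_servo_calibration(servo_id)
    time.sleep(0.1)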
I then reset the arm and tried again.
From the GitHub repo, I found some examples to run, starting with one that just tests the LEDs. I edited it, of course, to use my own connection details.
from pymycobot.mycobot import MyCobot
import time

# connect over the serial port found earlier
mc = MyCobot("/dev/tty.usbserial-0252EBFC", 115200)

# cycle the Atom's LED matrix through a few RGB colors
mc.set_color(255, 0, 0)
print("Hello, World!")
time.sleep(5)
mc.set_color(255, 255, 0)
print("Hello, 2!")
time.sleep(1)
mc.set_color(255, 255, 255)
print("Hello, 3!")
time.sleep(1)
mc.set_color(100, 50, 255)
print("Hello, 4!")
time.sleep(1)
mc.set_color(255, 50, 255)
print("Hello, 5!")
time.sleep(1)
Next, I wanted to try out some positions. You can quite easily send angles to the robot arm via something like the below:
mc.send_angles([j1, j2, j3, j4, j5, j6], speed)
This function sends a list of six angles, one per joint, telling each joint to move to a specific angle; the second argument sets the speed (below, 80).
The angle readout from the robot for one pose was roughly: 1.49°, 115°, 145.45°, 30°, 33.42°, 137.9°.
So with that in mind, we can send:
mc.send_angles([-1.49, 115, -145.45, 30, -33.42, 137.9], 80)
In the reference, there is also a function to send coordinates:
mc.send_coords([x, y, z, rx, ry, rz], speed=80, mode=0)
Here the list is a Cartesian pose rather than joint angles: the speed argument works as before, and mode selects linear (1) or nonlinear (0) movement, the latter being the default. More information about how this works is detailed here.
I tried this out with my array of coordinates; however, the robot arm seemed to immediately nosedive whenever coordinates were used, no matter what they were. Again, potentially a calibration issue (or a skill issue).
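A more cautious test I'd try next, assuming the calibration is sound: read the current pose back with get_coords() (also part of the pymycobot API) and nudge a single axis, rather than jumping straight to an arbitrary point:

from pymycobot.mycobot import MyCobot
import time

mc = MyCobot("/dev/tty.usbserial-0252EBFC", 115200)

# read back the arm's current Cartesian pose: [x, y, z, rx, ry, rz]
coords = mc.get_coords()
print("Current pose:", coords)

# nudge z up by 10 mm and move there slowly, in linear mode
coords[2] += 10
mc.send_coords(coords, 20, 1)
time.sleep(3)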
I found another repo specifically for the myCobot 280 here, with some examples, including one that uses G-code. However, when I ran the code, it behaved much like the coordinates example and seemed like it might break the myCobot 280.
More information about that repo can be found here.
I then did some research on how I could convert coordinates into angles. This would involve inverse kinematics, which I would love to learn but I think is a bit out of scope for this week, especially given my current schedule.
As a compromise, I figured I could use angles, since these were working reliably. A basic mapping would give me a potentially usable array to demo the idea.
# List of coordinates
coordinates = [
[0.895, 0.078, 0.438],
[0.731, 0.571, 0.375],
[-0.321, -0.476, 0.819],
[0.087, -0.989, 0.122],
[0.752, 0.146, 0.643],
[0.087, -0.989, 0.122],
[0.895, 0.078, 0.438],
[0.109, 0.884, 0.454],
[0.086, 0.017, 0.996],
[0.109, 0.884, 0.454],
[0.085, 0.021, 0.996],
[0.544, 0.838, 0.052],
[-0.313, -0.464, 0.829],
[0.109, 0.884, 0.454],
[-0.689, 0.714, 0.122],
[0.085, 0.021, 0.996],
[0.752, 0.146, 0.643],
[0.087, -0.989, 0.122],
[0.099, 0.034, 0.995],
[0.099, 0.034, 0.995],
[0.120, 0.980, 0.156],
[-0.689, 0.714, 0.122],
[0.120, 0.980, 0.156],
[-0.688, 0.712, 0.139],
[0.600, 0.797, 0.070],
[0.137, 0.978, 0.156],
[0.894, 0.094, 0.438],
[-0.603, 0.524, 0.602],
[0.529, 0.847, 0.052],
[0.097, 0.039, 0.995],
[0.768, 0.122, 0.629]
]
# converting coordinates to approximate angles
def convert_to_angles(x, y, z):
    base_angle = x * 180      # simplified base angle calculation
    shoulder_angle = y * 90   # simplified shoulder angle calculation
    elbow_angle = z * 90      # simplified elbow angle calculation
    wrist_angle_1 = 0
    wrist_angle_2 = 0
    wrist_angle_3 = 0
    return [base_angle, shoulder_angle, elbow_angle, wrist_angle_1, wrist_angle_2, wrist_angle_3]

# Convert coordinate set to angles
angles = [convert_to_angles(x, y, z) for x, y, z in coordinates]

# print each set of angles
for i, angle_set in enumerate(angles):
    print(f"Coordinates {coordinates[i]} -> Angles {angle_set}")
This gave me a new array of angles rather than coordinates:
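To give a sense of it, the first few printed lines come out as (values rounded):

Coordinates [0.895, 0.078, 0.438] -> Angles [161.1, 7.02, 39.42, 0, 0, 0]
Coordinates [0.731, 0.571, 0.375] -> Angles [131.58, 51.39, 33.75, 0, 0, 0]
Coordinates [-0.321, -0.476, 0.819] -> Angles [-57.78, -42.84, 73.71, 0, 0, 0]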
I then used this as an array of angles to send to the robot to carry a plant and dance with the satellites (in theory).
from pymycobot.mycobot import MyCobot
import time

PORT = "/dev/tty.usbserial-0252EBFC"
BAUD = 115200

if __name__ == "__main__":
    # MyCobot class initialization requires two parameters:
    #   The first is the serial port string, such as:
    #       linux: "/dev/ttyUSB0" or "/dev/ttyAMA0"
    #       windows: "COM3"
    #   The second is the baud rate:
    #       M5 version is: 115200
    #
    #   such as:
    #       mycobot-M5:
    #           linux: mc = MyCobot("/dev/ttyUSB0", 115200)
    #                  or mc = MyCobot("/dev/ttyAMA0", 115200)
    #           windows: mc = MyCobot("COM3", 115200)
    #       mycobot-raspi:
    #           mc = MyCobot(PI_PORT, PI_BAUD)

    # Initialize a MyCobot object
    mc = MyCobot(PORT, BAUD)

    # List of approximate angle mappings for each XYZ coordinate point
    angle_mapping = [
        [10, 20, 15, 0, 30, 0],
        [15.66, 89.01, 10.98, 0, 0, 0],
        [-20, -30, 35, 0, 45, 0],
        [5, -60, 10, 10, 20, 0],
        [20, 35, 20, 0, 25, 0],
        [161.1, 7.02, 39.42, 20, 0, 0],
        [131.58, 51.39, 33.75, 0, 45, 0],
        [-57.78, -42.84, 73.71, 0, 0, 0],
        [135.36, 13.14, 57.87, 0, 0, 0],
        [161.1, 7.02, 39.42, 0, 0, 0],
        [19.62, 79.56, 40.86, 0, 0, 0],
        [15.48, 1.53, 89.64, 0, 0, 0],
        [19.62, 79.56, 40.86, 0, 0, 0],
        [15.3, 1.89, 89.64, 0, 0, 0],
        [97.92, 75.42, 4.68, 0, 0, 0],
        [-56.34, -41.76, 74.61, 0, 0, 0],
        [19.62, 49.56, 40.86, 0, 0, 0],
        [-104.02, 64.26, 10.98, 0, 0, 0],
        [15.3, 1.89, 89.64, 0, 0, 0],
        [135.36, 13.14, 57.87, 0, 0, 0],
        [17.82, 3.06, 89.55, 0, 0, 0],
        [21.6, 18.2, 14.04, 0, 0, 0],
        [17.82, 3.06, 89.55, 0, 0, 0],
        [21.6, 18.2, 14.04, 0, 0, 0],
        [-123.84, 64.08, 12.51, 0, 0, 0],
        [108.0, 71.73, 6.3, 0, 0, 0],
        [24.66, 88.02, 14.04, 0, 0, 0],
        [160.92, 8.46, 39.42, 0, 0, 0],
        [17.46, 3.51, 89.55, 0, 0, 0],
        [138.24, 10.98, 56.61, 0, 0, 0]
    ]

    # Speed setting (0 to 100, with 100 being the fastest)
    speed = 30

    # Move through each set of angles in the mapping
    for angles in angle_mapping:
        print(f"Moving to angles: {angles}")
        mc.send_angles(angles, speed)
        time.sleep(2)  # Wait for 2 seconds to observe the movement

    # After completing all positions, return to a neutral position
    neutral_position = [0, 0, 0, 0, 0, 0]
    print("Returning to neutral position.")
    mc.send_angles(neutral_position, speed)
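One refinement I'd consider, assuming is_moving() behaves as described in the pymycobot reference (returning 1 while the arm is in motion): poll until each move actually finishes instead of sleeping a fixed two seconds, so slower moves aren't cut short:

import time

# possible refinement: wait for each move to finish instead of sleeping
def wait_until_stopped(mc, timeout=10):
    start = time.time()
    while mc.is_moving() == 1 and time.time() - start < timeout:
        time.sleep(0.1)

Then, inside the loop, mc.send_angles(angles, speed) would be followed by wait_until_stopped(mc) instead of time.sleep(2).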