Feedback control project

Project Overview

PnP (pick-and-place) machines usually have a camera attached to the head so the machine can move to the exact position where it picks or places a component.
DIY Pick and Place Motion Test with Camera - YouTube : https://www.youtube.com/watch?v=sTWdujEdT1k


I will implement feedback control using image recognition.
As a first step, I will implement a face-tracking function for a webcam, using a servo motor as the camera platform.
The face detection function in OpenCV is based on a Haar feature-based cascade classifier.
I will also research the "template matching" methodology in image recognition, which can be applied to the visual feedback functions of PnP machines.
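The core idea of template matching can be sketched without any library: slide the template over the image and score each position, for example by the sum of squared differences (SSD), then pick the best-scoring position. The pure-Python function below is only an illustration of that idea, not the OpenCV implementation (OpenCV provides cv2.matchTemplate with several scoring methods and is far faster).

```python
def match_template(image, template):
    """Return (row, col) of the best match of `template` inside `image`.

    Both arguments are 2D lists of grayscale pixel values. The score is
    the sum of squared differences (SSD); lower is better.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for r in range(ih - th + 1):          # every vertical offset
        for c in range(iw - tw + 1):      # every horizontal offset
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A 4x4 image with a bright 2x2 patch at row 1, col 2
img = [[0, 0, 0, 0],
       [0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 0, 0]]
print(match_template(img, [[9, 9], [9, 9]]))  # (1, 2)
```

For a PnP machine the same search would run on the camera frame with a template image of the fiducial or component, and the returned offset feeds the position correction.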





Set up Raspberry Pi w/o keyboard

There are plenty of guides for setting up a Raspberry Pi headless. I reused the setup I did last semester.
Complete Guide to Set Up Raspberry Pi Without a Keyboard and Mouse : https://sendgrid.com/blog/complete-guide-set-raspberry-pi-without-keyboard-mouse/

SSH connection to Raspberry Pi

Install Needed libs & packages
$ sudo apt-get update && sudo apt-get install git python-opencv python-all-dev libopencv-dev

Check whether the USB webcam is recognized properly
$ lsusb

Install and setup Motion

$ sudo apt-get install motion

Note: apt-get cannot run in parallel with another apt-get in a different SSH session (the package lock is held).

$ cd /etc/motion
$ ls
$ nano motion.conf

Changed the following configuration values to allow remote access:
     webcam_localhost on -> off
     control_localhost on -> off

$ sudo motion

Somehow I could not open 10.0.0.21:8081 in Chrome; it was redirected to 10.0.0.21:8080. However, the stream opened fine in Safari.


Control a servo motor from Raspberry Pi

I tried ServoBlaster (https://github.com/richardghirst/PiBits/tree/master/ServoBlaster) this time, which has a good reputation in the Raspberry Pi community. It is a GPIO wrapper for controlling servo motors from a Raspberry Pi.

$ git clone git://github.com/richardghirst/PiBits.git
$ cd PiBits/ServoBlaster/user
$ make servod
$ sudo make install


Wire the servo motor and Pi

Logical pin assignments of ServoBlaster.

Physical pin assignments of the Pi. I used pins 02 (5V VCC), 07 (GPIO 4), and 39 (GND), connected directly to the three wires of the servo motor.
Raspberry Pi • View topic - Pi B+ 40 pin GPIO how to use buttons and how many can I use? : http://www.raspberrypi.org/forums/viewtopic.php?f=78&t=82397

The colored wires are the VCC, GND, and signal wires.


Could control the servo from the command line

It worked, more or less.

$ echo 0=60 > /dev/servoblaster
$ echo 0=140 > /dev/servoblaster
$ echo 0=240 > /dev/servoblaster

It sometimes moved in a weird way, probably due to an insufficient power supply. I will power the servo separately instead of using the Raspberry Pi's VCC pins.
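The same echo commands can be scripted from Python. This is a minimal sketch assuming servod is running and has created /dev/servoblaster; ServoBlaster's command format is `<servo>=<width>`, with the width in steps of 10 µs by default (so 140 is a 1.4 ms pulse). The sweep values mirror the echo commands above.

```python
import os
import time

DEVICE = "/dev/servoblaster"  # created by servod (ServoBlaster)

def servo_command(servo, width):
    """Format a ServoBlaster command string: '<servo>=<width>\n'."""
    return "%d=%d\n" % (servo, width)

def set_servo(servo, width):
    """Write one command to the ServoBlaster device file."""
    with open(DEVICE, "w") as dev:
        dev.write(servo_command(servo, width))

if __name__ == "__main__" and os.path.exists(DEVICE):
    for width in (60, 140, 240):   # sweep like the echo commands above
        set_servo(0, width)
        time.sleep(0.5)
```

The guard on os.path.exists keeps the script harmless on a machine without the device file.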



OpenCV programming

To check the accuracy and capability of OpenCV's face detection, I cloned this OpenCV + node.js + WebSockets project and ran it locally on my laptop (https://github.com/drejkim/face-detection-node-opencv).

$ git clone https://github.com/drejkim/face-detection-node-opencv.git
$ cd face-detection-node-opencv/server
$ npm install
$ node server.js

Open a browser and go to http://localhost:8080. It sometimes recognizes random objects as a face or loses track of the face, but it could track my face most of the time.


Feedback control to the servo

The logic I will implement is:
  1. Detect the center point (the intersection of the red lines) of the rectangle produced by the OpenCV face detection
  2. Calculate the distance from that center point to the center line (blue) of the screen
  3. Command the servo motor to move in the direction that reduces the distance between the red center point and the blue center line
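A minimal proportional version of this loop can be sketched as follows. The face-rectangle format (x, y, w, h), the frame width, the gain, and the ServoBlaster pulse-width limits are assumptions for illustration, and the sign of the correction depends on how the servo is mounted; the computed width would then be written to /dev/servoblaster.

```python
FRAME_WIDTH = 320               # assumed camera frame width in pixels
GAIN = 0.05                     # proportional gain (tune experimentally)
MIN_WIDTH, MAX_WIDTH = 60, 240  # pulse-width limits used in the sweep above

def next_servo_width(face_rect, current_width):
    """One step of the feedback loop.

    face_rect is an assumed (x, y, w, h) rectangle from the face
    detector. Move the servo so that the face center approaches the
    vertical center line of the frame.
    """
    x, y, w, h = face_rect
    face_center_x = x + w / 2.0
    error = face_center_x - FRAME_WIDTH / 2.0   # pixels off-center
    width = current_width - GAIN * error        # proportional correction
    return max(MIN_WIDTH, min(MAX_WIDTH, width))

# Face to the right of center -> positive error -> width decreases
print(next_servo_width((200, 80, 40, 40), 140))  # 137.0
```

Each camera frame produces one correction step, so the loop converges as long as the gain is small enough not to overshoot.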




$ sudo apt-get install python-setuptools
$ sudo easy_install -U RPIO

I got this error: VIDIOC_QUERYMENU: Invalid argument
I have not fixed it yet.

Reference: OpenCV in python. Getting webcam stream works, but error message is printed - Stack Overflow : 
http://stackoverflow.com/questions/14958167/opencv-in-python-getting-webcam-stream-works-but-error-message-is-printed



Capability - HTML5 VM

I have read fabmodules, the gestalt framework, and the gestalt node AVR code to understand the current processes.
I also studied the current RepRap firmwares and communication architecture as a reference, mainly to understand the differences between G-code, RML, and other current languages, and what the gestalt framework aims to achieve.

My current challenges are (1) understanding the thread/queue programming in the gestalt framework, which is closely related to packet handling, and (2) deciding to what extent I can simplify the communication for the initial implementation.
Nadya gave me a module for the client version of fabmodules which communicates with gestalt nodes.
The next step will be implementing a JavaScript module that can communicate with physical gestalt nodes as an output module of fabmodules.


RS-485 (or FABNET)
noise resistant (differential signaling)


RepRap Options - RepRapWiki : http://reprap.org/wiki/RepRap_Options


Motherboard 1.2 - RepRapWiki : http://reprap.org/wiki/Motherboard_1.2#RS485_Comms_.2B_Power
"RS485 is how the RepRap motherboard communicates with all the tool controllers"
Both RepRap and MakerBot use RS485 communication channels to control their extruders.


Read serial communication module of Gestalt Framework

Communication interfaces are implemented in interface.py.





xy_plotter.py is an example implementation of virtual machines.
086-005a.py is an example implementation of virtual nodes.
      




      


Virtual Machine
xy_plotter.py

__main__

stages = virtualMachine(persistenceFile = "test.vmp")
     // Instantiate the subclassed virtualMachine class and assign it (why reference it this way?)
     Excerpt:
     initInterfaces
           else: self.fabnet = interfaces.gestaltInterface('FABNET', interfaces.serialInterface(baudRate = 115200, interfaceType = 'ftdi', portName = '/dev/ttyUSB0'))
               At initialization, interface = interfaceShell(interfaces.serialInterface())
                This calls initAfterSet, which starts the transmitterThread

          interfaces.py
           self.startInterfaceThreads() #this will start the receiver, packetRouter, channelPriority, and channelAccess threads.

            #define standard gestalt packet
            self.gestaltPacket = packets.packet(template = [

               packets.py
encoders, decoders, etc.

     initControllers
          self.xAxisNode = nodes.networkedGestaltNode('X Axis', self.fabnet, filename = '086-005a.py', persistence = self.persistence)
          self.yAxisNode = nodes.networkedGestaltNode('Y Axis', self.fabnet, filename = '086-005a.py', persistence = self.persistence)
          stages.xyNode = nodes.compoundNode(self.xAxisNode, self.yAxisNode)

          nodes.py
               class compoundNode(object):
                    '''A compound node helps distribute and synchronize function calls across multiple nodes.'''
     
     initFunctions
          self.move = functions.move(
               virtualMachine = self,
               virtualNode = self.xyNode,
               axes = [self.xAxis, self.yAxis],
               kinematics = self.stageKinematics,
               machinePosition = self.position,
               planner = 'null'
          )
          self.jog = functions.jog(self.move) #an incremental wrapper for the move function
          pass



If the persistence file does not exist, the axes are assigned by pressing the button on each node

# This is for how fast the stages move
stages.xyNode.setVelocityRequest(8)

     virtualNodes 086-005a.py

     class virtualNode
          class setVelocityRequest
               class actionObject 
                    def init(self, velocity)
converts velocity, then calls self.setPacket({velocity: xx}).commitAndRelease().waitForChannelAccess().transmitPersistent()

     core.py
          setPacket
               self.packetSet = self.packetEncoder(packet)
packetEncoder (set to serviceRoutine.packetSet at initialization)

          functions.serviceRoutine
               packets.packetSet
          
          commitAndRelease
               self.release
                    clearToRelease
                         threading.Event()

               self.interface.commit
                    commit
                         put into the queue

          waitForChannelAccess
               (channelAccessGranted = threading.Event()).wait
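The waitForChannelAccess pattern above (one thread blocks until another signals it) can be sketched with threading.Event. This is an illustrative minimal example under assumed names, not gestalt's actual code:

```python
import threading

channel_access_granted = threading.Event()

def channel_scheduler():
    """Stand-in for the scheduler thread that arbitrates channel access."""
    # ... decide which actionObject may use the channel ...
    channel_access_granted.set()   # wakes every thread waiting on the event

def wait_for_channel_access(timeout=1.0):
    """Block the calling thread until access is granted, like gestalt's
    waitForChannelAccess; returns False if the timeout expires first."""
    return channel_access_granted.wait(timeout)

t = threading.Thread(target=channel_scheduler)
t.start()
granted = wait_for_channel_access()
t.join()
print(granted)  # True: the scheduler set the event before the timeout
```

The key property is that wait() costs nothing while blocked and resumes immediately on set(), which is why gestalt can park many actionObjects cheaply.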

threading is used in interface.py as follows:

Queue (the synchronized queue class)
     Used to exchange data and synchronize between multiple threads

スレッド - クラスライブラリ応用 (in Japanese) : https://www.mlab.im.dendai.ac.jp/javalib/thread3/


The "producer / consumer" problem
In multithreaded programming, the "producer / consumer" problem asks how to design a program in which the producer of data and the consumer of data run as independent threads, with production and consumption proceeding in step.
The producer produces data item after item, and the consumer consumes item after item.
Model the marketplace as a fixed-size queue shared by the producer and the consumer.
The producer appends each produced item to the queue; the consumer takes items from the queue and consumes them.
When the queue is full, the producer must stop producing and wait.
When the queue is empty, the consumer must wait until data is added to the queue.

When such a wait occurs, the producer or consumer is said to be blocked.

The consumer waits while the queue is empty; the producer waits while the queue is full.
When the situation changes, the waiting thread is notified and resumes.
In Java, these behaviors are implemented with the wait / notify (notifyAll) methods.
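In Python the same pattern is provided by the standard queue.Queue, which blocks exactly as described: put() blocks while the queue is full, get() blocks while it is empty. A minimal sketch:

```python
import queue
import threading

q = queue.Queue(maxsize=2)   # the fixed-size "marketplace"
consumed = []

def producer():
    for item in range(5):
        q.put(item)          # blocks while the queue is full

def consumer():
    for _ in range(5):
        consumed.append(q.get())   # blocks while the queue is empty
        q.task_done()

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(consumed)  # [0, 1, 2, 3, 4]
```

No explicit wait/notify calls are needed; the blocking and wake-ups are handled inside Queue, which is presumably why the gestalt interface threads exchange packets through queues.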



# Some random moves to test with
moves = [[10,10],[20,20],[10,10],[0,0]]

# Move!
for move in moves:
     stages.move(move, 0) // move command (x and y simultaneously, as the VM) -> move class in functions.py
     status = stages.xAxisNode.spinStatusRequest() // wait until the next command
     # This checks to see if the move is done.
     while status['stepsRemaining'] > 0:
          time.sleep(0.001)
          status = stages.xAxisNode.spinStatusRequest() // wait until the next command
---
test.vmp (example)

# This Gestalt persistence file was auto-generated @ 2013-10-16 15:18:45.997847
{
'gigapan.py.Z Axis':[95, 35],
'gigapan.py.Y Axis':[142, 74],
'gigapan.py.X Axis':[5, 24],
}

Virtual Nodes
086-005a.py - 340 lines // virtual node code


examples/machines/gigapan/images/086-005a.py

class spinStatusRequest(functions.serviceRoutine):
     class actionObject(core.actionObject):
          def init(self):
               self.setPacket({})  - core.py: setPacket()
                         ---
                          def setPacket(self, packet, mode = 'unicast'):
                              self.packetSet = self.packetEncoder(packet)
                              self.mode = mode

                         
                        ---
               self.commitAndRelease()
               self.waitForChannelAccess()
               if self.transmitPersistent():
                    return self.getPacket()
               else:
                    notice(self.virtualNode, 'unable to get status from node.')


From the Makefile and the code, it appears that gestalt.cpp handles the communication with the VM, and that the user layer below it, 086-005a.cpp, controls the stepper motor.

gsArduino/gestalt.cpp

gsArduino $ grep -e "){" -e "//-" gestalt.cpp

//--INCLUDES--
//--DEFINE IO--
//--DEFINE NODE VARIABLES--
//--EEPROM LOCATIONS--
//--BOOTLOADER--
//--BOOTLOADER CONSTANTS--
//--BOOTLOADER STATE VARIABLES--
//--DEFINE TRANCEIVER STATE VARIABLES--
//--DEFINE PACKET FORMAT--
//--DEFINE PORTS----



//--DEFINE TRANCEIVER SETTINGS--
//--FLAGS--

#ifdef standardGestalt
//This is being compiled as an independent program, not using the arduino IDE.
int main(){
     setup();
     while(true){
          loop();
     }
}
#endif

// -- FUNCTION: SETUP --
// Basic functionality for communication with the PC is configured here.
void setup(){


}

//----RECEIVER CODE-------------------------------------
//--RECEIVER INTERRUPT ROUTINE--
ISR(USART0_RX_vect){ //atmega324
ISR(USART_RX_vect){ //atmega328, default for arduino
if (rxPosition == lengthLocation){ //check if current byte is packet length byte
if ((rxPosition < lengthLocation)||(rxPosition < rxPacketLength)){ //packet is not finished
if ((rxBuffer[startByteLocation]==multicast)&&(rxPacketChecksum==rxData)){ //multicast packet good
if ((rxBuffer[startByteLocation]==unicast)&&(rxPacketChecksum==rxData)&&(rxBuffer[addressLocation]==networkAddress[0])&&(rxBuffer[addressLocation+1]==networkAddress[1])){

//--RECEIVER WATCHDOG INTERRUPT ROUTINE--
ISR(TIMER2_OVF_vect){
if (watchdogTime == watchdogTimeout){

//------TRANSMITTER CODE----------------------------------------------
//--TRANSMITTER INTERRUPT-------------
ISR(USART0_TX_vect){//atmega324
ISR(USART_TX_vect){
if (txPosition < txPacketLength){ //still in packet
if (txPosition == txPacketLength){ //transmit checksum byte

//--ALIAS TO TRANSMITTER INTERRUPT--

//--START TRANSMISSION OF PACKET
void transmitPacket(){
void transmitUnicastPacket(uint8_t port, uint8_t length){
void transmitMulticastPacket(uint8_t port, uint8_t length ){

//------MAIN CODE---------------------------
void loop(){
     packetRouter();
     userLoop();
}

void packetRouter(){
if (packetReceivedFlag == true){

If nothing matches, fall through to userPacketRouter:
     default: //try user program
          userPacketRouter(destinationPort);
     break;

//--PORT TABLE--
switch(destinationPort){

//------SERVICE ROUTINES---------------------
//--IDENTIFY NODE-- press the button to assign the axis
void svcIdentifyNode(){
while(counter < 1500000){
//--REQUEST URL--
void svcRequestURL(){
for(offset = 0; offset<urlLength; offset++){
//--SET IP ADDRESS--
void svcSetIPAddress(){
if (rxBuffer[startByteLocation] == multicast){ //wait for button press
while(*IO_buttonPIN & IO_buttonPin){
if (counter == 500000){ //blink frequency
if ((counter2 == 15)||(packetReceivedFlag==true)){ //exit condition, n blinks or packet received (presumeably from other responding node)
for(offset = 0; offset<urlLength; offset++){ //transmit URL
for(offset = 0; offset<urlLength; offset++){ //transmit URL
//--STATUS REQUEST--
void svcStatus(){
void svcResetNode(){
while(1){};
//--BOOTLOADER FUNCTIONS--
void svcBootloaderCommand(){
if (command == 0){
if (command == 1){
void bootloaderInit(){
void applicationStart(){
while (packetOutboundFlag == true){ //wait for response packet to be sent
void svcBootloaderData(){
void writePage(){ //note: code based on http://www.nongnu.org/avr-libc/user-manual/group_avr_boot.html
for (i=0; i<pageSize; i+=2){
void svcBootloaderReadPage(){ //returns page
for (i=0; i<pageSize; i++){

//------UTILITY FUNCTIONS---------------------
//--SET URL--
void setURL(char *newURL, uint8_t newURLLength){




https://github.com/imoyer/gestalt
086-005a.cpp - for atmega on the controller board. 392L





gestalt/functions.py

gestalt $ grep -e "class" -e "def" functions.py





machines.py