Monday, October 20, 2014

A simple Quadruped Robot

Introduction

For a long time I have been fascinated by walking robots, especially the awesome creations from Boston Dynamics. Their 26km/h WildCat made headlines last year. But quadruped robots like that are something you can't build by yourself. They require a huge amount of money and programming skills. Or do they?

Ideas


Most four-legged robots that you can find on the internet use a spider-like leg configuration, where the legs are arranged symmetrically around the center of the robot. Although this would be more stable and easier to program, I decided to build a mammal-inspired robot instead, mostly because I wanted a challenge and it leaves more room for a proper design.

Electronics

The most expensive parts of legged robots are often the servo motors. Most professional robots use "smart servos" like Dynamixel or Herkulex. They are better than regular servos in just about every way (torque, speed, accuracy), but normally cost between $60 and $500 per piece. And we need twelve of them. For my robot I decided to use HobbyKing servos instead, which are ridiculously cheap:
I paid around $40 for them in total. Despite the price, the quality is pretty good. They don't have much trouble lifting the robot's weight.

As a servo controller I wanted to use a Pololu Maestro at first, but had some problems with it. It did not accept the power supply I was using, which led to servo jittering and uncontrollable movement. Now I am using an Arduino MEGA, which has enough pins to control all twelve servos on its own.

As a battery for the servos I am currently using a 6.6V 3000mAh 20C LiFePO4 receiver pack:
http://www.hobbyking.com/hobbyking/store/__23826__Turnigy_nano_tech_3000mAh_2S1P_20_40C_LiFePo4_Receiver_Pack_.html.
The Arduino is powered over USB by a laptop or, for untethered testing, by a 5V 1A power bank.

Because the robot will not be autonomous for the time being, it needs a remote of some sort. I decided to write an Android app and plug a Bluetooth module into the Arduino.

In a later update I added an ultrasonic distance sensor to the robot's head, which can be rotated using a small servo. This way, the robot can scan its environment and detect whether it is about to run into a wall.
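A minimal sketch of such a scan could look like the following. The sensor type (an HC-SR04-style trigger/echo module), the pin numbers and the sweep range are assumptions for illustration, not the robot's actual wiring:

#include <Servo.h>

// Pin numbers are placeholders for this example
const int TRIG_PIN = 30;
const int ECHO_PIN = 31;
const int HEAD_SERVO_PIN = 13;

Servo headServo;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  headServo.attach(HEAD_SERVO_PIN);
}

// one distance sample in centimeters (trigger/echo style sensor)
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  return duration / 58;                              // echo time to centimeters
}

void loop() {
  // sweep the head from side to side and report the distance at each angle
  for (int angle = 45; angle <= 135; angle += 5) {
    headServo.write(angle);
    delay(100);                 // give the servo time to reach the angle
    Serial.print(angle);
    Serial.print(" deg: ");
    Serial.print(readDistanceCm());
    Serial.println(" cm");
  }
}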

Mechanical Design

To connect the servos and electronics, some kind of chassis is needed. At first I thought about 3D-printing one, but that would break the budget, as I don't have my own printer and would have to order all the parts. That meant I needed a 2D-like design. Wood does not look professional enough and aluminum is too heavy, so I went with acrylic glass.

Four acrylic plates (5mm) form the robot's body. They are connected by nuts on threaded rods, which are covered in aluminum tubes. The inner plates are rounded and have foam on the sides to protect the robot when it falls over. The outer plates hold the counter bearings, which are made of a T-nut, a piece of threaded rod and a small ball bearing. Another horizontal plate serves as a mount for both the Arduino and the battery. Upper and lower legs are connected by acrylic parts, also using ball bearings for the joints. For the lower legs I had to improvise: they are made of cheap camera tripod legs, secured with cable ties. The rubber ends ensure good grip.


Hand-made prototype:
Laser-cut parts from Formulor:
The finished robot:

Cat for scale:
Every robot needs a head, so I added one later. The ultrasonic sensor serves as the robot's eyes.

Everything from the first sketches to the final design was done in Sketchup. At first I made some prototype parts with a fretsaw, which took me hours. Later I decided to order the parts online and came across Formulor, which offers cheap laser-cut parts and quick shipping. The precision is impressive; even the faceted curves caused by Sketchup are visible. I underestimated the stiffness of 5mm acrylic glass; 3mm would be enough and would save weight.

The unpowered robot would fall over immediately, so I made a simple stand for it using a three-legged stool. Two strings on the front and back of the robot can be attached to a plastic rod on top of it.

If you want to build the robot yourself, I am providing templates for the mechanical parts:
https://www.dropbox.com/s/qaavo54booxvpmb/Quadruped%20templates.zip?dl=0

Arduino Sketch

The Arduino sketch is divided into multiple files (tabs): Main, IK, Gait, Serial and Servo.
The main file calls the other functions and initializes most of the variables.
To move the robot's feet you need to calculate the inverse kinematics for each leg. This is basically a set of equations where you input coordinates and get servo angles as a result. You can either work it out with simple school math or search the internet for solutions. Trying the calculation in Excel before flashing it to the Arduino can save a lot of time. Once this is implemented, the robot can move its body precisely along the XYZ axes. To rotate the robot you need a rotation matrix. This is another system of equations where you input a coordinate and values for roll, pitch and yaw and get the rotated coordinate as the output. I found this blog post by Oscar Liang really helpful: http://blog.oscarliang.net/inverse-kinematics-implementation-hexapod-robots/
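To give an idea of what such an IK function can look like, here is a minimal sketch for one mammal-style leg with a hip-roll, hip-pitch and knee servo. The link lengths, axis conventions and signs are assumptions and will differ from the real robot, so take it as a starting point rather than a drop-in solution:

// One-leg IK sketch: hip-roll, hip-pitch and knee servo.
// Coordinate frame: x forward, y sideways, z downward, origin at the hip.
const float L1 = 70.0;   // upper leg length in mm (assumption)
const float L2 = 70.0;   // lower leg length in mm (assumption)

void legIK(float x, float y, float z, float &hipRoll, float &hipPitch, float &knee) {
  hipRoll = atan2(y, z);              // roll the leg plane over the foot

  float zr = sqrt(y * y + z * z);     // hip-to-foot distance inside the rolled plane
  float c  = sqrt(x * x + zr * zr);   // straight-line hip-to-foot distance
  if (c > L1 + L2) c = L1 + L2;       // clamp unreachable targets

  // law of cosines for the knee and the hip pitch
  knee     = acos((L1 * L1 + L2 * L2 - c * c) / (2.0 * L1 * L2));
  hipPitch = atan2(x, zr) + acos((L1 * L1 + c * c - L2 * L2) / (2.0 * L1 * c));
  // results are in radians; convert with degrees() and add the per-servo
  // offsets before writing them to the servos
}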
I made an Excel sheet for both calculations:


The Serial tab manages the incoming data from the Bluetooth module and the ultrasonic sensor. This works the same way as receiving characters in the serial monitor of the Arduino IDE.
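A stripped-down illustration of that pattern is shown below; the real Serial tab parses the key/value messages sent by the Android app, and the Serial1 wiring is only an assumption:

// Illustration of the receive pattern only, with single test characters
char command = 's';            // current command, 's' = stand still

void checkSerial() {
  while (Serial1.available() > 0) {
    char c = Serial1.read();   // read one character, just like the serial monitor
    if (c == 'w' || c == 's' || c == 'l' || c == 'r') {
      command = c;             // walk, stop, turn left, turn right
    }
  }
}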

Servos are controlled directly using the servo library.

The entire program works like this: first the Bluetooth function is called to look for new input. Then the right gait function is selected; it calculates the coordinates for every leg. After that the rotation and translation are calculated. To get the right servo angles, the coordinates have to be converted from “full body” to “single leg”, then the inverse kinematics function can work its magic. Finally the servos are moved to the calculated angles.
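In simplified form, the loop could be structured like the sketch below. All function names, pin numbers and array shapes are placeholders; the real tabs fill in the details:

#include <Servo.h>

// Simplified skeleton of the program flow (placeholders, not the real sketch)
const int SERVO_PINS[12] = {22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33};
Servo servos[12];
float footPos[4][3];      // target foot coordinates per leg (body frame)
float angles[12];         // resulting servo angles in degrees

void setup() {
  for (int i = 0; i < 12; i++) servos[i].attach(SERVO_PINS[i]);
}

void loop() {
  readBluetooth();                                // 1. look for new remote input
  calcGait(footPos);                              // 2. selected gait sets the foot targets
  applyBodyRotation(footPos);                     // 3. roll/pitch/yaw and translation
  for (int leg = 0; leg < 4; leg++) {
    bodyToLegFrame(leg, footPos[leg]);            // 4. "full body" -> "single leg"
    legIK(leg, footPos[leg], &angles[leg * 3]);   // 5. inverse kinematics
  }
  for (int i = 0; i < 12; i++) {
    servos[i].write(constrain(angles[i], 0, 180)); // 6. move the servos
  }
}

// empty stubs so the skeleton compiles
void readBluetooth() {}
void calcGait(float p[4][3]) {}
void applyBodyRotation(float p[4][3]) {}
void bodyToLegFrame(int leg, float p[3]) {}
void legIK(int leg, float p[3], float *a) {}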

Walking Gait

Gaits are probably the most complicated thing about legged robots. Because I wanted quick results, I made a simple walking gait based on sine functions. As you can see in the diagram, each leg is lifted one after the other using a function (red) like -|sin(x)|. When one leg is in the air, the robot tends to fall over. Therefore a second function (blue) moves the body away from the lifted leg. The advantage of this gait is that the movements are fluid; the downside is that the robot can only move in one direction.
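A rough sketch of this idea could look like the following. The amplitudes, the quarter-cycle timing and the body-sway function are illustrative values, not the ones from the diagram:

// Each leg owns one quarter of the gait cycle and is lifted with -|sin(x)|;
// a second sine sways the body away from the lifted leg.
const float LIFT_HEIGHT = 30.0;   // mm (assumed)
const float SWAY        = 15.0;   // mm (assumed)

void gaitStep(float t, float footZ[4], float &bodySway) { // t runs from 0 to 1
  for (int leg = 0; leg < 4; leg++) {
    float phase = t * 4.0 - leg;                          // this leg's quarter of the cycle
    if (phase >= 0.0 && phase < 1.0) {
      footZ[leg] = -LIFT_HEIGHT * fabs(sin(phase * PI));  // the -|sin(x)| lift
    } else {
      footZ[leg] = 0.0;                                   // foot stays on the ground
    }
  }
  // body shift away from the lifted leg; the exact shape depends on the
  // leg order and is best tuned in the simulator
  bodySway = SWAY * sin(TWO_PI * t);
}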

I tried to program a trot-gait later, but the rapid leg movement is too much for the servos and the frame.

Processing Simulator

Figuring out the gaits is a tedious task. Even the smallest change to the code requires you to flash the Arduino, plug the battery back in and lift the robot out of its stand. If you make a mistake, the robot falls over or the servos move in the wrong direction, possibly destroying themselves or other mechanical parts. After this happened to me a few times I began to search for an easier way of testing. There are some robotics simulators (e.g. Gazebo, V-REP), but I found them too complicated; importing the robot's geometry alone took me forever.
I ended up writing my own simulator. Processing is the perfect choice for this: it has libraries for graphical user interfaces and can import and render different CAD files. The best thing is that you can practically drag and drop Arduino code into Processing (the Arduino language was originally based on Processing). There are a few minor differences, for example arrays are initialized differently, but if you work around them you can simulate the exact behavior of the robot and copy the working code back to the Arduino.
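As a small example of the kind of difference involved, array declarations are written differently in the two environments:

// Arduino (C++): the size goes in the brackets, memory is allocated statically
float servoAngles[12];

// Processing (Java): the equivalent array has to be created with "new"
// float[] servoAngles = new float[12];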

Compiling the Processing sketch only takes around 5 seconds.
The final version of the simulator can display the robot in color and has a user interface with buttons and sliders; there are different modes for manual servo control and for moving the robot around different axes (I even found a bug in my IK code this way).

Android App

I needed a quick way to change variables on the Arduino without flashing it, so I bought a Bluetooth module, but I couldn't find the right Android app. In the end I wrote my own app using Processing for Android. It is basically the same app I posted an article about last year: http://coretechrobotics.blogspot.com/2013/12/controlling-arduino-with-android-device.html
The only difference is an additional sonar display that shows the values of the ultrasonic sensor.

Conclusion


Considering the low budget I am pretty happy with the robot so far. The stability and servo power are acceptable. I am already working on more advanced gaits, but even with the simulator this is a time-consuming process.

Monday, August 11, 2014

A Universal Bluetooth Remote App


About half a year ago I programmed a small app for Android that communicates with an Arduino over Bluetooth (http://coretechrobotics.blogspot.com/2013/12/controlling-arduino-with-android-device.html). It had some very basic features and a simple user interface, which was just enough for the project it was designed for. But no two projects are alike.

Because of that I decided to make a universal remote that would be customizable for all possible applications.

Ideas

The main idea is that the user should be able to create his own remote without having to learn Java programming. This requires the app to have some kind of built-in editor.
Another part is multitouch, which is essential for any touchscreen application. It seems simple at first, but as I knew from my previous attempt at an Android app, it makes things extremely complicated. Imagine you don't just have one X/Y position of your mouse pointer but a whole array of them.

Implementation

Instead of finally learning how to program in pure Java I went with Processing again. This helped me a lot and reduced the time I had to spend on the project.
At first I needed multitouch. This was a difficult part; I had to read through a lot of Android documentation and internet articles until it finally worked. Maybe I will write a separate post about this in the near future.
The next step was proper UI elements like sliders and buttons. My previous app used the controlP5 library by Andreas Schlegel (http://www.sojamo.de/libraries/controlP5/), but I wanted something of my own. I chose to write a UI library from scratch, with a simple, edgy design and full multitouch integration. The four UI elements are sliders, joysticks (2D sliders), buttons and switches.
The editor is the main part of the app; from there the user can drag and drop the elements onto the screen. An element's size and position can be changed by moving its center or a corner.
The color can be selected in a popup window with RGB sliders.
Another important thing: if you look at an RC remote for a car, the steering wheel centers itself with a built-in spring. This way the car will drive straight if you let go of the wheel. The same applies to the sticks of an airplane remote. For this purpose I made a second window that lets you select which sliders, or which axes of a joystick, should jump back to the middle position.
Saving those UI layouts is also implemented. You can create up to 8 layouts and save or load them; the most recent layout is loaded automatically when the app starts.
Finally, with a press of the back button on your Android device the editor buttons vanish and you can use the controls you created.
This means that the slider values and button states are transmitted over Bluetooth. I actually wanted to write the Bluetooth part myself but had to give up after a while; the process of selecting the Arduino from a menu alone is complicated enough. To save development time I am using the Ketai Bluetooth library again (https://code.google.com/p/ketai/).

Protocol

The protocol is basically the same as in my previous app, with some additions to support switch states and slider centering. If a UI element changes, its key (a character A-Z) is transmitted, followed by its value. This way, the Arduino knows which variable to write the value to. If no control is touched, nothing is sent. If multiple elements are changed at the same time, all their values are transmitted, one at a time. This works fine for up to 3-4 touch points, but you can cause some errors if you try to touch too many elements at once, which should be impractical anyway.


Installation Instructions

Unfortunately I don't have a Google Developer account, so you won't find my app on the Play Store. You can download the .apk file from this link:


1. Download the .apk and transfer it to your Android device
2. Open it with a file explorer and install it
3. Done! You can now create your own Bluetooth Arduino remote

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Example Arduino Sketch

I also wrote a simple example sketch you can upload to an Arduino board of your choice. It handles the serial communication with a Bluetooth board and writes the values of the remote to variables.

This is not a fully featured library, so you will have to make some manual adjustments depending on which UI elements are used in the Android app. But this is nothing more than copy and paste.
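As an illustration of what the receiving side boils down to, here is a hedged sketch (not the original example download). It assumes the Bluetooth module is on the hardware serial port, that values arrive as single bytes and that the app uses the keys 'A' and 'B':

int sliderA = 0;          // value of the slider with key 'A'
boolean buttonB = false;  // state of the button with key 'B'

void setup() {
  Serial.begin(9600);     // baud rate has to match the Bluetooth module
}

void loop() {
  // every message is one key character followed by one value byte
  if (Serial.available() >= 2) {
    char key  = Serial.read();
    int value = Serial.read();
    switch (key) {
      case 'A': sliderA = value;       break;
      case 'B': buttonB = (value > 0); break;
      // add a case for every UI element used in the app
    }
  }
  // use sliderA and buttonB here, e.g. to drive a motor or an LED
}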

Potential Improvements



I put a lot of time into it, but the app still isn't flawless. There is a settings button, which currently has no function because there are no changeable settings. The Bluetooth protocol could use some reworking, too. But overall I am very pleased with the outcome of this project.

Thursday, May 1, 2014

HeadUp - A Webcam Posture Monitor


Since receiving a lot of feedback about my posture sensor, I have been thinking about how to improve it. Although it works, there are a few problems, like the continuous clicking of the ultrasonic sensor and the way it is mounted on the chair's rest.


Ideas

One way of fixing the clicking noise would be to use an infrared-based proximity sensor. These are available in different operating ranges and at different prices, but sadly I could not find one that was cheap enough and had the right range for the project.
Improving the chair holder is really difficult, too. It would require a lot of 3D-printed parts and still not work on chairs with low rests.
This meant the old idea had to be discarded completely, and the project was suspended until I came across this great blog post about a PC-monitor-mounted sensor:

That's how I got the idea of simply using a webcam as the sensor. Most people have cameras on their laptops and even desktop PCs, so no additional hardware is needed.


Implementation

Because I had already used it in other projects, I went with Processing as the programming language again:
To detect the position of the head, the software has to scan the webcam image. This is a nearly impossible task without the help of a powerful library. OpenCV is the solution:
OpenCV is an open-source library of computer vision functions that was originally developed by Intel. Face detection is just a small part of it.
The “OpenCV for Processing” library by Greg Borenstein can easily be downloaded with the Library Manager in Processing and already contains some example sketches to start with.


The software I wrote works in a similar way to the Arduino code in the posture sensor. It scans the webcam image and compares the height and the size (~distance) of the face against limit values. If the limits are exceeded for more than two seconds, the alarm sounds and then goes silent after a few seconds. If the user gets up from his chair or the face is not visible, the alarm is not triggered. The alarm itself is just the Windows error sound. The user interface consists of the live image in the background; three buttons are used for setting the limit values and for pausing the alarm.
Finally, Processing can export everything into a standalone application.


Download

If you have a computer with a webcam you can give “HeadUp” a try:

Instructions:
1. Download HeadUp.zip
2. Unzip everything into one folder
3. Run HeadUp.exe
Compatibility: Currently only Windows is supported, and there are a few webcams it will not work with, depending on the drivers. If there are problems, try running the application in administrator mode. The newest version of Java needs to be installed as well.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

I am also releasing the source code so you can customize and improve it:

Copyright (c) 2014 Coretech Robotics

import gab.opencv.*;
import processing.video.*;
import java.awt.*;
import java.awt.Toolkit;//this is needed for the windows sound

int ypos; // height
int rSig; //distance
int almTimer; 
int trigHeight = 0; //height limit
int trigDist = 0; //distance limit
boolean alm, pause;

Capture video;
OpenCV opencv;

void setup() {
  size(320, 240);
  //The next lines initialize the camera and OpenCV
  video = new Capture(this, 640/4, 480/4);
  opencv = new OpenCV(this, 640/4, 480/4);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);  
  video.start();
}

void draw() {
  getDistance(0);  // run the OpenCV routine
  if (trigHeight!=0 && trigDist!=0 && !pause) { //check if limits have been initialized
                                                //and if pause is off
    if (rSig > trigDist || ypos > trigHeight) { //compare values to limits
      alm = true; 
    }
    else {
      alm = false;
    }
  }

  if (alm==false) almTimer = millis()+2000;  //reset alarm timer if alarm is off
  else if (millis() > almTimer) { //check if alarm timer has expired
    if (millis()-2000 < almTimer) { //do this for additional 2 seconds
      Toolkit.getDefaultToolkit().beep(); //call the windows alarm sound
      delay(150); 
    } 
  }
  
//The following part draws the two limit buttons and the pause switch and checks if they were pressed
  
  textSize(14);
  fill(0, 255, 0);
  text("set distance", 18, 220);
  text("set height", 136, 220);
  text("pause", 248, 220);
  
  stroke(0, 255, 0);
  
  noFill();
    if(mousePressed && mouseOver(10, 200, 100, 30)) {
      trigDist = rSig+3;
      fill(0, 255, 0);
    }
  rect(10, 200, 100, 30);  
  
  noFill();  
    if(mousePressed && mouseOver(120, 200, 100, 30)) {
      trigHeight = ypos+3;
      fill(0, 255, 0);
    }
  rect(120, 200, 100, 30);
  
  noFill();
  if(pause) fill(0, 255, 0); // this part draws the pause switch
  rect(230, 200, 80, 30);
 
}

void getDistance(int interval) { //OpenCV functions
  //pushMatrix and popMatrix prevent the buttons from being scaled with the video
  pushMatrix(); 
  scale(2); // scales the video to the window size
  opencv.loadImage(video);
  
  image(video, 0, 0 ); // this draws the webcam image

  noFill();
  if (alm) stroke(255, 0, 0); //draw all lines red if alarm is active
  else stroke(0, 255, 0);
  strokeWeight(2);
  Rectangle[] faces = opencv.detect();
  int dist = 0;
  for (int i = 0; i < faces.length; i++) {
    println(faces[i].x + "," + faces[i].y);
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
    rSig = faces[i].height;
    ypos = faces[i].y;
    int delta = trigDist-faces[i].height;
    //the following line draws a second box with the limit distance
    if (trigDist!=0) 
    rect(faces[i].x-delta/2, faces[i].y-delta/2, faces[i].width+delta, trigDist);
  }
  //This draws a line at the limit height:
  if (trigHeight!=0) line(0, trigHeight, width, trigHeight);
  popMatrix();
}

void captureEvent(Capture c) { //called by Processing whenever a new camera frame is available
  c.read();
}

void mouseReleased(){ //check if mouse was released so the switch gets triggered only once
  if(mouseOver(230, 200, 80, 30)) pause = !pause;
}

boolean mouseOver(int xpos, int ypos, int rwidth, int rheight){ 
  //return true if mouse is over a given rectangle
  if(mouseX > xpos && mouseX < xpos+rwidth && 
      mouseY > ypos && mouseY < ypos+rheight) return true;
  else return false;
}


Conclusion

The new software-based posture sensor is reliable and has none of its predecessor's flaws. You can easily run the program in the background while working on the computer and it will remind you if you are slouching into the chair or getting too close to the screen.

About slouching: you should be aware that sinking into your chair and leaning back is not only more comfortable than sitting upright but also relieves your spine. What the software really prevents is hunching over, which is even unhealthier.

Monday, February 17, 2014

A very simple Posture Sensor



If you are reading this, you are most likely sitting in front of a computer of some sort, maybe even slouching in your chair. Often we don't care about our posture, which can result in back pain or worse. If someone tells you to sit upright, that is easy to do, but it won't last for more than a few minutes. Now wouldn't it be great to have a device that reminds you to improve your posture?

Ideas


There already are some very intelligent approaches to this topic. One of the best known is probably Lumoback, or Lumo Lift as the new version is called. As far as I understand, they use an accelerometer as a sensor to measure the angle at a certain point of the spine. While these sensors should be relatively accurate, they also need tiny electronics and clever power management to be wearable, which is really difficult to achieve with simple electronics and tools.



What is left is making a chair-mounted device. One solution would be to use a few pressure sensors on the chair to measure the weight distribution. The downside is that a lot of those sensors are needed for a reliable result, and pressure sensors are not too cheap.




Something that changes significantly between good and poor posture is the position of the head. So what I finally came up with is a device that simply measures the distance to the head.


Hardware

The simplest distance sensors are ultrasonic or infrared sensors. I went with an HC-SR04 because it is cheap and sufficiently precise. There are no special requirements for the controller, so I am using an ATtiny85. A small piezo speaker provides acoustic feedback to the user. The only thing left is the power supply, which needs to deliver 5V because of the ultrasonic sensor. You could easily use a USB port, but I did not want to rely on a computer; three button cells deliver around 4.5V and should work for a few days.


Soldering everything together is very easy; the circuit does not need a single resistor, although the piezo speaker could use one if it is driven for longer periods. The components are soldered onto a piece of prototyping PCB, together with some pin headers to connect the sensor and the power supply.


The whole device is not mounted directly onto the chair, but is connected to a piece of fabric that hides the battery and can be laid over the chair’s rest.

Software

The ATtiny85 is programmed using the Arduino IDE, and luckily there is no difference in using the HC-SR04. There are basically three modes: the configuration mode waits until the user holds his head still and saves the distance of a comfortable position. After this the watch mode starts, which compares the current distance to the saved distance. If your head is too far away, it sounds an alarm. If you move your head back, the alarm stops immediately; if not, the device beeps a few times and then mutes. After some time it enters standby mode. This is meant for leaving the device alone; the sensor reads the distance only every few seconds to save energy. If you get back to your chair, the configuration mode starts again.
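A rough sketch of how these three modes can be strung together is shown below. Pin numbers, thresholds and timings are assumptions for illustration and do not come from the original firmware:

const int TRIG = 0, ECHO = 1, BUZZER = 2;       // ATtiny85 pins (assumed)

int readCm() {                                  // one ultrasonic reading in cm
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG, LOW);
  return (int)(pulseIn(ECHO, HIGH, 30000UL) / 58);
}

void beep() {                                   // short beep on the piezo
  for (int i = 0; i < 100; i++) {
    digitalWrite(BUZZER, HIGH); delayMicroseconds(500);
    digitalWrite(BUZZER, LOW);  delayMicroseconds(500);
  }
}

void setup() {
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);
  pinMode(BUZZER, OUTPUT);
}

void loop() {
  // configuration mode: wait until two readings a second apart agree,
  // then remember that distance as the comfortable position
  int saved = readCm();
  delay(1000);
  int current = readCm();
  while (abs(current - saved) > 2) {
    saved = current;
    delay(1000);
    current = readCm();
  }
  beep();                                       // confirm the saved position

  // watch mode: alarm when the head drifts too far back, mute after a few beeps
  int away = 0, alarms = 0;
  while (away < 10) {                           // ~10 "user gone" readings in a row
    int d = readCm();
    if (d == 0 || d > saved + 40) { away++; delay(1000); continue; }
    away = 0;
    if (d > saved + 8) {                        // head too far from the sensor
      if (alarms < 5) { beep(); alarms++; }     // beep a few times, then mute
    } else {
      alarms = 0;                               // posture is fine again, re-arm
    }
    delay(500);
  }

  // standby mode: the user has left, so check only every few seconds to save
  // energy; once someone sits down again, loop() restarts with configuration mode
  int gone = readCm();
  while (gone == 0 || gone > saved + 40) { delay(5000); gone = readCm(); }
}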


Programming this was not too difficult, but there are still some strange bugs I cannot get rid of.

Conclusion

Apart from the minor software issues, the project was a success. This sensor can easily be built for less than $10.
But keep in mind: This is simply meant as a reminder to sit upright, not as a medical application.