Friday, March 4, 2016

A DIY Wearable Posture Sensor



Introduction


Posture sensors/monitors have been a recurring theme on this blog. They are supposed to remind you of your posture and keep you from slouching, which can be a cause of back pain and headaches. While my previous sensors were fixed to a chair or desk, this time I wanted to create a wearable version that allows free movement. As always, one of the main goals was to make this project cheap and easy to reproduce. 


Concept


There are different ways to measure body posture. My first chair-based sensor used an ultrasonic sensor to measure the distance between the chair's backrest and the seated person's head. As the person slouches, the head automatically moves forward, which can easily be detected. The second sensor did basically the same thing with a webcam and face recognition. Neither version was exactly flawless. The ultrasonic sensor used an annoying piezo buzzer and was difficult to set up. The webcam monitor suffered from compatibility issues and the constant feeling of being watched. But the biggest problem was that these sensors only worked in one specific location and were by no means portable or wearable. 


A wearable posture sensor obviously needs a different approach. One solution would be to measure the curvature of the spine using flex sensors (https://www.adafruit.com/products/182). The main downside is that the device would have to be attached to the back with very tight clothing or tape. 


But it is not even necessary to measure how much the spine is bent. You can simply measure its angle at certain points using an accelerometer. This idea is of course not new. In a blog post two years ago I already mentioned the Lumo Back and Lumo Lift. These devices measure the angle of the spine at the lower back or upper chest. While the Lumo Back has to be worn like a belt, the Lumo Lift can be attached to a regular shirt with a magnet. I initially thought that it would be too difficult to build a device like that, mostly because the components need to be tiny and draw very little power. But as it turns out, it is not even that difficult.


Electronics



As the accelerometer I am using the ADXL335 breakout board from iTead. It is small, cheap and analog. Analog outputs are in this case an advantage over the better-known MPU6050, which uses I2C: the angles can simply be read with analogRead() instead of having to go through the whole I2C setup.
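As a rough sketch of what that looks like in code (the pin assignment and scaling factors are assumptions for illustration, not the exact values from my build, and the ADXL335 needs proper calibration):

// Read one ADXL335 axis on the Attiny and convert it to a tilt angle.
// Pin assignment and scaling are placeholders.
const int yPin = A2;   // ADC2 (PB4) on common Attiny cores; adjust for your wiring

void setup(){
}

void loop(){
    int raw = analogRead(yPin);              // 0..1023, ratiometric to the supply
    float g = (raw - 512) / 102.0;           // very rough -1g..+1g scaling
    float angle = asin(constrain(g, -1, 1)) * 180.0 / PI;   // tilt in degrees
    delay(100);
}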

The microcontroller of choice is the Attiny85 (again), which can be programmed using an Arduino. Its five I/O pins are more than sufficient for this application.

A vibrating motor provides haptic feedback to the user. I actually took this one out of an old Nintendo Rumble Pak, but you can find motors of that size on eBay or Banggood as well. The Attiny can only supply a limited amount of current on its I/O pins, so a transistor is needed to power the motor. Normally the circuit would also require decoupling capacitors to deal with the motor noise, but I wanted to keep everything as simple and minimal as possible.

I decided to use a CR2025 coin cell as the power source. It is relatively thin, and with 170 mAh it should last for about a week. A rechargeable LiPo battery would be a more eco-friendly alternative, but that would make the project more complicated and require a charger.
Additionally, a power switch is used to turn the posture sensor on and off.




Housing and Assembly



To protect the components some kind of housing is needed. I used Onshape to design two 3D printed shells that are connected by press-fit:
Some parts, like the motor, have to be fixed with a drop of glue. The Attiny and the transistor have to lose parts of their legs to fit into the case, which also means that the Attiny85 has to be programmed before soldering it in. For the electrical connections you will need very thin wire; I used 30 AWG wire from Adafruit. A coin cell holder is too large to fit into the case, so I used conductive copper tape instead. This also makes it possible to replace the battery without too much effort.


A custom PCB would have looked much cleaner, but sadly I couldn't find an accelerometer in a package that can be soldered by hand.

The Lumo Lift uses a magnetic clasp to make the sensor wearable, which is a really clever idea. I used a regular thin magnet that sticks to the button cell inside the case. As long as it doesn't get too close to the motor, this shouldn't be a problem.


Programming


The code works very similarly to the chair-based sensor. There are four modes: when the device is turned on, it goes into a configuration mode, which means it waits for you to assume a good posture and stay still for a few seconds. In "watch mode" the calibrated angle is then compared to the current angle. If they differ too much, the motor starts to vibrate until the calibrated angle is reached again; this is the "alarm mode". If the user doesn't correct their posture, the motor is turned off and the device goes into "standby mode".
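A stripped-down sketch of that mode logic might look something like this (pin numbers, thresholds and timings are made-up placeholders, not the values from the code linked below):

// Simplified mode logic of the posture sensor (illustrative values only).
const int sensorPin = A2;    // ADXL335 axis
const int motorPin  = 1;     // transistor driving the vibration motor

enum Mode { CONFIG, WATCH, ALARM, STANDBY };
Mode mode = CONFIG;

int calibrated = -1000;      // start far away so the first reading recalibrates
unsigned long stillSince = 0, alarmSince = 0;

void setup(){
    pinMode(motorPin, OUTPUT);
}

void loop(){
    int angle = analogRead(sensorPin);

    switch(mode){
      case CONFIG:                        // wait until the user sits still in a good posture
        if(abs(angle - calibrated) < 5){
          if(millis() - stillSince > 3000) mode = WATCH;
        } else {
          calibrated = angle;
          stillSince = millis();
        }
        break;

      case WATCH:                         // compare the current angle to the calibrated one
        if(abs(angle - calibrated) > 40){
          mode = ALARM;
          alarmSince = millis();
          digitalWrite(motorPin, HIGH);
        }
        break;

      case ALARM:                         // vibrate until the posture is corrected...
        if(abs(angle - calibrated) < 10){
          digitalWrite(motorPin, LOW);
          mode = WATCH;
        } else if(millis() - alarmSince > 10000){   // ...or give up after a while
          digitalWrite(motorPin, LOW);
          mode = STANDBY;
        }
        break;

      case STANDBY:                       // stay quiet until a good posture is resumed
        if(abs(angle - calibrated) < 10) mode = WATCH;
        break;
    }
    delay(50);
}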

You can download the code here:
https://www.dropbox.com/s/quhnd9fs01zfh49/posture_3.zip?dl=1

There is still a lot to improve. At the moment I am not using any power-saving modes on the Attiny; adding them should improve battery life considerably. Also, I am only using one axis of the ADXL335, so the sensor has to be oriented properly.
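For reference, a power-down sleep with a watchdog wake-up on the Attiny85 could look roughly like this (a sketch assuming a standard avr-libc based Attiny core; this is not part of the code above):

#include <avr/sleep.h>
#include <avr/wdt.h>
#include <avr/interrupt.h>

ISR(WDT_vect) { }                        // empty: the interrupt only wakes the chip

void setupWatchdog(){                    // call once in setup()
    MCUSR &= ~(1 << WDRF);               // clear the watchdog reset flag
    WDTCR |= (1 << WDCE) | (1 << WDE);   // timed sequence to allow changes
    WDTCR  = (1 << WDIE) | (1 << WDP2) | (1 << WDP1);   // interrupt mode, ~1 s timeout
}

void sleepNow(){                         // call between measurements in loop()
    ADCSRA &= ~(1 << ADEN);              // ADC off while sleeping (a big consumer)
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    sleep_mode();                        // sleep until the watchdog interrupt fires
    ADCSRA |= (1 << ADEN);               // ADC back on for the next analogRead()
}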


Conclusion


I am actually quite happy with how the project turned out. The posture sensor works better than expected and the vibrating motor is much more subtle than a piezo buzzer. The best thing is that you only need 10-20€ worth of components.

Saturday, December 26, 2015

Attiny Canbot

Introduction


There were a few projects last year that just didn't work out, mostly because they were too ambitious and I did not have enough time and patience to finish them. So I finally wanted to build a robot that just works, without any gimmicks.

Electronics

The main part of the robot is an Attiny85 again.
The servos are two HXT900 from Hobbyking:
 http://www.hobbyking.com/hobbyking/store/__662__HXT900_9g_1_6kg_12sec_Micro_Servo.html
I removed the potentiometers and the end stops, so that they are continuously rotating.
I am using a 3.7 V 600 mAh LiPo to power both the Attiny and the servos. A small power switch disconnects the battery after use.
An HC-SR04 ultrasonic sensor enables autonomous driving to a certain degree.
To control the robot a 38kHz infrared receiver is needed as well.




Hardware

For the body of the robot I wanted to make use of my 3D printer. There are two half shells that hold the servos, the ultrasonic sensor and all remaining electronic components. These halves are connected by four screws. The holes are designed to fit the mounting screws that came with the servos.  Both wheels are screwed into the servo shafts. The heavier parts like the battery need to be in the lower half to help the robot remain balanced.



The stl-files are available on Thingiverse:


Depending on your printer and servos you will have to sand the shells down a bit. The remaining electronic parts are then simply hot-glued in.

Programming

This part was pretty simple as well, because I could reuse most of the code from my biped robot (http://coretechrobotics.blogspot.de/2013/12/an-attiny85-ir-biped-robot.html).
The Attiny listens for infrared signals and moves the servos depending on what it receives.
I also built in an autonomous mode that can be activated by pressing a certain key on the remote. It simply drives forward until the ultrasonic sensor detects an obstacle, then it turns and drives on.
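The obstacle-avoidance part boils down to something like the following sketch (pin numbers are placeholders and the two drive helpers are empty stubs standing in for whatever continuous-rotation servo commands your build uses, e.g. via a software servo library for the Attiny):

// Simplified autonomous mode (illustrative pins, placeholder drive helpers).
const int trigPin = 3;
const int echoPin = 4;

void driveForward() { /* set both servos to rotate forward */ }
void turnRight()    { /* rotate the servos in opposite directions */ }

long readDistanceCm(){
    digitalWrite(trigPin, LOW);  delayMicroseconds(2);
    digitalWrite(trigPin, HIGH); delayMicroseconds(10);
    digitalWrite(trigPin, LOW);
    long duration = pulseIn(echoPin, HIGH, 25000);   // timeout so it never blocks forever
    return duration / 58;                            // echo time in microseconds -> cm
}

void setup(){
    pinMode(trigPin, OUTPUT);
    pinMode(echoPin, INPUT);
}

void loop(){
    long d = readDistanceCm();
    if(d > 0 && d < 20){          // obstacle closer than ~20 cm: turn away
      turnRight();
      delay(600);
    } else {
      driveForward();             // otherwise keep driving straight
    }
}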
You can download the code from the Thingiverse page or directly from Dropbox:

https://www.dropbox.com/s/g5myobjc0wyziqw/Canbottiny_source.zip?dl=1

Keep in mind that this particular code may not work for you if you don't have the same Sony remote as me. At the very least you will have to change the IR codes; it may not work at all.

Conclusion

As this robot was just supposed to be a quick weekend project, I am very happy with the overall outcome. A lot of people on Thingiverse seemed to like it as well, and a few even built replicas. The only problem left is the balancing: if you want the robot to stay level while rolling, you will have to put some additional weight in the lower half. 

Monday, July 27, 2015

Creating a Robotics Simulator


Introduction

When you start writing code for a robot, it inevitably takes a number of iterations until everything works as planned. Arduino already makes this easier, but especially with complex walking robots it can be a tedious task to reprogram the board again and again. This is where a simulator can be very useful.
For my quadruped robot project I programmed a simulator using Processing. I didn't describe it in more detail at the time, but I want to make up for that now.

Processing

I have mentioned Processing in almost every blog post so far. That's because I think it is a really powerful yet easy-to-learn programming language/IDE, and it's a shame that it only has such a small community compared to Arduino. But what makes it special?
  • The IDE is only ~200 MB and can be run from a USB drive. Compared to Matlab or Visual Studio this is definitely an advantage.
  • The IDE is very similar to Arduino and so is the language. As long as you are not using arrays or hardware libraries, you can copy and paste blocks of code between Arduino and Processing. 
  • Code can be compiled for Windows, Linux, Mac and even Android with very few alterations. (http://coretechrobotics.blogspot.com/2014/08/a-universal-bluetooth-remote-app.html)
  • The community may be small but you still can find tons of examples and tutorials online. Also there are libraries for anything, just like with Arduino.

Programming the Simulator

1. Basics

First you need to download and unpack Processing:
https://processing.org/download/?processing    (This tutorial is based on Processing 2.2.1)
Be sure that the newest Java version is installed as well.
When this is done, start the Processing IDE.

We will start with drawing a simple rectangle.
There are two basic functions that need to be in every sketch: setup() is called once on startup, and draw() runs in a loop until you close the sketch window. In the setup function we need to define the size of the sketch in pixels. For this tutorial we will stick with 1200x800. Now we already have a working sketch. Next we insert a rectangle by using rect(). The four parameters of this function are x-position, y-position, x-size and y-size. Other shapes like ellipses follow the same scheme. This diagram from the Processing website describes the directions in 2D and 3D space:

Additionally you can fill the rectangle with a color by using fill(). Under Tools/Color Selector you can pick a color and paste its value as a parameter. It is important to use fill() before drawing the rectangle. Changes in something like color or stroke only affect the subsequent shapes.
We won't need outlines for this, so we will eliminate them with noStroke().

void setup(){
    size(1200, 800);
}

void draw(){  
   fill(#FF9F03);
   noStroke();
   rect(100, 100, 500, 500);
}



2. Make it 3D

Turning the square into a box requires a few changes. We need to switch to a 3D renderer by adding OPENGL to the size() function. box(size) creates an equal-sided cube at the origin (the top left corner). The function translate(x, y, z) can be used to move it away from that corner. Rotation is done using rotateX(angle), rotateY(angle) and rotateZ(angle). width and height reference the values we passed to size(), so translate(width/2, height/2) always makes the cube appear at the center.
To enable anti-aliasing we need to call the smooth() function. This will not work without adding background(color), which gets called every cycle to clear the screen. lights() turns on default lighting and adds shading to the cube.

void setup(){
    size(1200, 800, OPENGL);
}

void draw(){  
   background(32);
   smooth();
   lights();
   
   fill(#FF9F03);
   noStroke();
   
   translate(width/2, height/2);
   rotateX(-0.5);
   rotateY(0.5);
   box(300);
}



3. Mouse controls

3D is kind of boring if you can't interact with it. The easiest way to do this is to replace the fixed rotation values with the mouse position, so the cube can be rotated while the sketch is running. We need to create two variables, rotX and rotY, that will be used as the view rotation. The function mouseDragged() writes the mouse movement to these variables while a mouse button is pressed.

float rotX, rotY;

void setup(){
    size(1200, 800, OPENGL);
}

void draw(){  
   background(32);
   smooth();
   lights();
   
   fill(#FF9F03);
   noStroke();
   
   translate(width/2, height/2);
   rotateX(rotX);
   rotateY(-rotY); 
   box(300);
}

void mouseDragged(){
    rotY -= (mouseX - pmouseX) * 0.01;
    rotX -= (mouseY - pmouseY) * 0.01;
}



4. Importing geometry

Unless you are building Wall-E, a cube won't be a good representation of your robot. Luckily Processing is able to import various 3D files including .obj-files.
For the next steps you will have to download the parts I prepared:
https://www.dropbox.com/s/ymn59u6qw7zbjyi/robot%20parts.zip?dl=1
Create a new folder in the directory of your sketch file and name it "data". Unpack the 5 obj-files into that folder.
We can now import these objects into our sketch by creating a PShape for each of them and using loadShape("file") to assign the obj-file. Replace the box with shape(base) and Processing will draw the geometry. Depending on the units, we will have to scale(factor) the object to better fit the screen. I also used the translate command to position the part lower on the screen, because otherwise the robot would be off-center later.

PShape base, shoulder, upArm, loArm, end;
float rotX, rotY;

void setup(){
    size(1200, 800, OPENGL);
    
    base = loadShape("r5.obj");
    shoulder = loadShape("r1.obj");
    upArm = loadShape("r2.obj");
    loArm = loadShape("r3.obj");
    end = loadShape("r4.obj");
}

void draw(){  
   background(32);
   smooth();
   lights();
   
   noStroke();
   
   translate(width/2,height/2);
   scale(-4);
   translate(0,-40,0);
   rotateX(rotX);
   rotateY(-rotY);    
     shape(base);
}

void mouseDragged(){
    rotY -= (mouseX - pmouseX) * 0.01;
    rotX -= (mouseY - pmouseY) * 0.01;
}



5. Rotating/Aligning multiple parts

Now we will assemble the robot by adding the remaining parts. Use the translate/rotate functions to position the parts. Translation and rotation values always add up. That means all parts form a chain where each link is moved relative to its predecessor.
If you are using your own robot parts you can find the right translation values in the cad file. If the base is 60mm high you have to translate the next part 60 units and so on. Rotation values are in radians and sometimes it will take a few attempts to find the right ones.
By defining three rotation values as variables we will be able to move the joints in the next step.
If you export your obj-files from a CAD software there will be a second mtl-file containing the color settings and Processing will render it that way. If not, disableStyle() can be used to render objects with the standard fill/stroke setting.

PShape base, shoulder, upArm, loArm, end;
float rotX, rotY;
float alpha = -1, beta = -2, gamma;

void setup(){
    size(1200, 800, OPENGL);
    
    base = loadShape("r5.obj");
    shoulder = loadShape("r1.obj");
    upArm = loadShape("r2.obj");
    loArm = loadShape("r3.obj");
    end = loadShape("r4.obj");
    
    shoulder.disableStyle();
    upArm.disableStyle();
    loArm.disableStyle(); 
}

void draw(){  
   background(32);
   smooth();
   lights();
   
   fill(#FFE308); 
   noStroke();
   
   translate(width/2,height/2);
   scale(-4);
   translate(0,-40,0);
   rotateX(rotX);
   rotateY(-rotY);    
     shape(base);
     
   translate(0, 4, 0);
   rotateY(gamma);
     shape(shoulder);
      
   translate(0, 25, 0);
   rotateY(PI);
   rotateX(alpha);
     shape(upArm);
      
   translate(0, 0, 50);
   rotateY(PI);
   rotateX(beta);
     shape(loArm);
      
   translate(0, 0, -50);
   rotateY(PI);
     shape(end);
}

void mouseDragged(){
    rotY -= (mouseX - pmouseX) * 0.01;
    rotX -= (mouseY - pmouseY) * 0.01;
}



6. Kinematics

For this step we will add a second tab to the sketch where the inverse kinematics and movements are calculated. For this tutorial I reused some of the code from my quadruped robot. Basically, the IK() function converts three coordinates into three angles. setTime() generates a time value from 0 to 4. writePos() calls both functions and generates a sine-based path that looks like a horizontal figure eight, making for smooth movements of the robot. 
The only thing we need to change in the main sketch tab is calling the writePos() function.
If you look at the code in the second tab, it could easily be run on an Arduino without alterations. This is what I did with my quadruped simulator: I tested the code and later copied the entire thing into my Arduino sketch.

Main Tab
PShape base, shoulder, upArm, loArm, end;
float rotX, rotY;
float posX=1, posY=50, posZ=50;
float alpha, beta, gamma;

void setup(){
    size(1200, 800, OPENGL);
    
    base = loadShape("r5.obj");
    shoulder = loadShape("r1.obj");
    upArm = loadShape("r2.obj");
    loArm = loadShape("r3.obj");
    end = loadShape("r4.obj");
    
    shoulder.disableStyle();
    upArm.disableStyle();
    loArm.disableStyle(); 
}

void draw(){ 
   writePos();
   background(32);
   smooth();
   lights();
   
   fill(#FFE308); 
   noStroke();
   
   translate(width/2,height/2);
   rotateX(rotX);
   rotateY(-rotY); 
   scale(-4);
   
   translate(0,-40,0);   
     shape(base);
     
   translate(0, 4, 0);
   rotateY(gamma);
     shape(shoulder);
      
   translate(0, 25, 0);
   rotateY(PI);
   rotateX(alpha);
     shape(upArm);
      
   translate(0, 0, 50);
   rotateY(PI);
   rotateX(beta);
     shape(loArm);
      
   translate(0, 0, -50);
   rotateY(PI);
     shape(end);
}

void mouseDragged(){
    rotY -= (mouseX - pmouseX) * 0.01;
    rotX -= (mouseY - pmouseY) * 0.01;
}


Inverse Kinematics Tab
float F = 50;
float T = 70;
float millisOld, gTime, gSpeed = 4;

void IK(){

  float X = posX;
  float Y = posY;
  float Z = posZ;

  float L = sqrt(Y*Y+X*X);
  float dia = sqrt(Z*Z+L*L);

  alpha = PI/2-(atan2(L, Z)+acos((T*T-F*F-dia*dia)/(-2*F*dia)));
  beta = -PI+acos((dia*dia-T*T-F*F)/(-2*F*T));
  gamma = atan2(Y, X);

}

void setTime(){
  gTime += ((float)millis()/1000 - millisOld)*(gSpeed/4);
  if(gTime >= 4)  gTime = 0;  
  millisOld = (float)millis()/1000;
}

void writePos(){
  IK();
  setTime();
  posX = sin(gTime*PI/2)*20;
  posZ = sin(gTime*PI)*10;
}



7. Final touches

With the code above this is already a fully functional robotics simulator. But there are tons of other things you can do with Processing. For the last step I added an effect that was supposed to look like a spray can. It ended up a little differently, but still looks nice. I also added a directional light, which makes the robot appear a little more realistic.
If you want, you can export the entire project as an executable program for Windows or any other operating system by clicking on "Export Application".

PShape base, shoulder, upArm, loArm, end;
float rotX, rotY;
float posX=1, posY=50, posZ=50;
float alpha, beta, gamma;

float[] Xsphere = new float[99];
float[] Ysphere = new float[99];
float[] Zsphere = new float[99];

void setup(){
    size(1200, 800, OPENGL);
    
    base = loadShape("r5.obj");
    shoulder = loadShape("r1.obj");
    upArm = loadShape("r2.obj");
    loArm = loadShape("r3.obj");
    end = loadShape("r4.obj");
    
    shoulder.disableStyle();
    upArm.disableStyle();
    loArm.disableStyle(); 
}

void draw(){ 
   writePos();
   background(32);
   smooth();
   lights(); 
   directionalLight(51, 102, 126, -1, 0, 0);
    
    for (int i=0; i< Xsphere.length - 1; i++) {
    Xsphere[i] = Xsphere[i + 1];
    Ysphere[i] = Ysphere[i + 1];
    Zsphere[i] = Zsphere[i + 1];
    }
    
    Xsphere[Xsphere.length - 1] = posX;
    Ysphere[Ysphere.length - 1] = posY;
    Zsphere[Zsphere.length - 1] = posZ;
   
   
   noStroke();
   
   translate(width/2,height/2);
   rotateX(rotX);
   rotateY(-rotY);
   scale(-4);
   
   for (int i=0; i < Xsphere.length; i++) {
     pushMatrix();
     translate(-Ysphere[i], -Zsphere[i]-11, -Xsphere[i]);
     fill (#D003FF, 25);
     sphere (float(i) / 20);
     popMatrix();
    }
    
   fill(#FFE308);  
   translate(0,-40,0);   
     shape(base);
     
   translate(0, 4, 0);
   rotateY(gamma);
     shape(shoulder);
      
   translate(0, 25, 0);
   rotateY(PI);
   rotateX(alpha);
     shape(upArm);
      
   translate(0, 0, 50);
   rotateY(PI);
   rotateX(beta);
     shape(loArm);
      
   translate(0, 0, -50);
   rotateY(PI);
     shape(end);
}

void mouseDragged(){
    rotY -= (mouseX - pmouseX) * 0.01;
    rotX -= (mouseY - pmouseY) * 0.01;
}




You can download the finished simulator from here:
https://www.dropbox.com/s/sr4gk1y5mlxrrid/demoRobot.zip?dl=1

Conclusion

As you can see, programming a simulator like this is not rocket science. If you need help making your own, check out the Processing reference https://processing.org/reference/ and the forums http://forum.processing.org/two/. I hope this tutorial was helpful to you, and it would be awesome to see more robots in Processing in the future.

Friday, April 3, 2015

BQ Prusa i3 Hephestos Review

Introduction

Looking at sites like Hackaday and Instructables, there is an increasing amount of projects that make use of 3D printers. And it is incredible to see how good the print quality has become in the past few years. Because of this I finally decided to get my hands on a 3D printer myself.
When I first found out about the BQ Hephestos I expected a catch somewhere. It offers a large build volume (215x210x180mm) and an LCD screen for just 500€. This may not look especially cheap compared to other sub-500€ printers like the Printrbot Simple Metal, but those are hard to come by in Europe, and including taxes they often cost the same as high-end printers.
The Hephestos is an open source 3D printer based on the successful Reprap Prusa i3 that includes several improvements added by BQ, for example a completely redesigned extruder. BQ itself is a Spanish smartphone and tablet producer whose robotics and innovation department, among other things, develops 3D printers.

Hardware

The kit was ordered directly from the manufacturer and I received the shipment shortly after. BQ really put a lot of effort into the design and packaging of the parts. There is a box corresponding to each step in the manual, which makes it easy to keep track of the countless screws and bolts. The manual itself is nicely done as well. There is a picture with detailed descriptions on every page; it kind of feels like building a Lego kit. If the manual is not enough, there is also a build video on BQ's YouTube channel in which the printer assembles itself. It must have taken someone days to animate that.
The parts themselves are of decent quality and are easy to assemble. A few tools are already provided, e.g. different sized wrenches and needles to clean the nozzle. The screws are fitted into the plastic parts using a soldering iron, so sanding was only needed on some of them. One issue is the linear bearings: they are extremely noisy and have too much friction. I was lucky to have ordered some cheap bearings from eBay a few months ago, so I could use those instead.
The extruder comes pre-assembled and is the same one used in the Witbox, BQ's bigger 1600€ printer. It has a powerful cooling fan which has proven to be very effective: the Hephestos can easily print steep angles and bridge larger gaps without ruining the print. I clogged the nozzle once because the temperature was set too low, but the clog was simple to remove.
The electronic components seem to be of good quality as well. They consist of an Arduino Mega with a RAMPS 1.4 shield, 5 motors and drivers, endstops, an LCD panel and a power supply.
It took me about three days to assemble the printer, and it makes a very stable impression. Normally you would need to shorten the motor cables to get clean wiring, but I was too lazy to do that. I later added a little switch to the z-axis endstop to turn the power on and off without pulling the power cord.

Software

While many 3D printers include their own software or a customized version of Repetier-Host, the Hephestos only comes with a set of Cura configurations. This would be perfectly sufficient if there were no obvious errors in them: a Cura configuration file includes the start and end gcode, the code for priming the extruder and for moving it out of the way after each print. With the file provided by BQ, the z axis will move up 200mm after printing. The Hephestos cannot even move that far up, so it will hit the upper end and possibly destroy itself if you can't pull the plug fast enough. Secondly, the coordinates in BQ's code are relative instead of absolute. This means that instead of retracting the filament a bit after the print, it will rewind the entire length of filament you printed. These errors are simple to fix if you already know about 3D printers, but without prior knowledge you will eventually end up damaging the printer.
At least the provided Marlin firmware works fine. The only complaints I have are the strange LCD menu, where you need to tap through 6 submenus to move the axes, and the poor temperature control, which is sometimes 5 degrees off during prints.

You can download the fixed Cura profile with improved purging here:
https://www.dropbox.com/s/ywu9uwuzllz1jon/CTHephestosCuraProfile.ini?dl=0

Print Quality

Only a little calibration was necessary to make the first print, and I was really impressed by the quality. Up close you can see that the printed parts were not made by a high-end printer, but you have to keep in mind that this is practically the cheapest 3D printer you can buy in Europe at the moment. The prints are smooth and have acceptable tolerances, which should be more than enough for everyday objects.
I am currently printing on painter's tape, to which the plastic sticks very well. Previously I had been using a glue stick on the glass bed, but the undersides of the prints came out very rough that way. Hairspray did not work for me either; maybe it was just the wrong brand.
The print quality strongly depends on the filament. For the first few weeks I was using blue PLA from BQ, which is sold for 20€ per kilo. This is really cheap and unfortunately you can see that in the prints as well. Later I switched to white Formfutura PLA, which costs 30€. As you can see in the pictures below, the layers look much more accurate and smooth.



Summary

The BQ Prusa i3 Hephestos is a great 3D printer if you are on a low budget and don't need to print highly detailed objects. With its big print bed and solid construction it is a great printer for hobbyists and tinkerers.
I'm not sure the printer is suited for total beginners, as BQ advertises, but since it is an open source design you can profit from the RepRap community, and many questions have already been answered in their forums. You can download every printed and non-printed part from BQ's GitHub page, and many people have already made their own additions on Thingiverse.


Printed Models:
     Scripted Vase www.thingiverse.com/thing:104694
     Stanford Bunny http://www.thingiverse.com/thing:3731

Monday, October 20, 2014

A simple Quadruped Robot

Introduction

For a long time I have been fascinated by walking robots, especially the awesome creations from Boston Dynamics. Their 26 km/h WildCat made the headlines last year. But quadruped robots like that are something you can't build by yourself. They require a huge amount of money and programming skills. Or do they?

Ideas


Most four legged robots that you can find on the internet use a spider-like leg configuration, where the legs are symmetrical to the center of the robot. Although this would be more stable and easier to program, I decided to build a mammal-inspired robot. Mostly because I wanted a challenge and it leaves more space for a proper design.

Electronics

The most expensive parts of legged robots are often the servo motors. Most professional robots use "smart servos" like Dynamixel or Herkulex. They are better than regular servos in just about every way (torque, speed, accuracy), but normally cost between $60 and $500 apiece. And we need twelve of them. For my robot I decided to use HobbyKing servos instead, which are ridiculously cheap:
I paid around $40 for them in total. Despite the price, the quality is pretty good. They don't have much trouble lifting the robot's weight.

As a servo controller I wanted to use a Pololu Maestro at first, but had some problems with it. It did not accept the power supply I was using, which led to servo jittering and uncontrollable movement. Now I am using an Arduino MEGA, it has enough pins to control all twelve servos on its own.

As a battery for the servos I am currently using a 6.6V 3000mAh 20C LiFePo4 receiver pack:
http://www.hobbyking.com/hobbyking/store/__23826__Turnigy_nano_tech_3000mAh_2S1P_20_40C_LiFePo4_Receiver_Pack_.html.
The Arduino is powered over USB by a laptop or, for untethered testing, by a 5V 1A power bank.

Because the robot will not be autonomous for the time being, it needs a remote of some sort. I decided to write an Android app and plug a Bluetooth module into the Arduino.

In a later update I added an ultrasonic distance sensor to the robot's head, which can be rotated using a small servo. This way, the robot can scan its environment and detect whether it is about to run into a wall.

Mechanical Design

To connect the servos and electronics, some kind of chassis is needed. At first I thought about 3D printing one, but that would break the budget, as I don't have my own printer and would have to order all the parts. That meant I needed a 2D-like design. Wood does not look professional enough and aluminum is too heavy, so I went with acrylic glass.

Four acrylic plates (5mm) form the robot's body. They are connected by nuts on threaded rods, which are covered in aluminum tubes. The inner plates are rounded and have foam on the sides to protect the robot when falling over. The outer plates hold the counter bearings, which are made of a t-nut, a piece of threaded rod and a small ball bearing. Another horizontal plate serves as a mount for both the Arduino and the battery. Upper and lower legs are connected by acrylic parts, also using ball bearings for the joints. On the lower legs I had to improvise: They are made of cheap camera tripod legs, secured with cable ties. The rubber ends ensure good grip.


Hand made prototype:
Laser cut parts from Formulor:
The finished robot:

Cat for scale:
Every robot needs a head, so I added one later. The ultrasonic sensor serves as the robot's eyes.

Everything from the first sketches to the final design was done in Sketchup. At first I made some prototype parts using a fretsaw, which took me hours. Later I decided to order the parts online and came across Formulor, which offers cheap laser-cut parts and quick shipping. The precision is impressive; even the edgy curves caused by Sketchup are visible. I somewhat underestimated the stability of 5mm acrylic glass; 3mm should be enough and would save weight.

The unpowered robot would fall over immediately, so I made a simple stand for it using a three-legged stool. Two strings on the front and back of the robot can be attached to a plastic rod on top of it.

If you want to build the robot yourself, I am providing templates for the mechanical parts:
https://www.dropbox.com/s/qaavo54booxvpmb/Quadruped%20templates.zip?dl=0

Arduino Sketch

The Arduino sketch is divided into multiple files (tabs): Main, IK, Gait, Serial and Servo.
The main file calls the other functions and initializes most of the variables.
To move the robot's feet you need to calculate the inverse kinematics for each leg. It is basically an equation where you input coordinates and get servo angles as a result. You can either use simple school math or search the internet for solutions. Trying the calculation in Excel before flashing it to an Arduino can save a lot of time. Once this is implemented, the robot can move its body precisely along the XYZ coordinates. To rotate the robot you need a rotation matrix. This is another system of equations where you input a coordinate and values for roll, pitch and yaw and get the rotated coordinate as the output. I found this blog post by Oscar Liang really helpful: http://blog.oscarliang.net/inverse-kinematics-implementation-hexapod-robots/
I made an Excel sheet for both calculations:
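For reference, a condensed version of the two calculations could look roughly like this in Arduino-style code (the segment lengths, axis conventions and signs are assumptions for illustration; the structure mirrors the IK tab from the simulator tutorial above rather than the exact quadruped code):

// Condensed leg IK plus a yaw rotation. F and T are the upper and lower leg
// lengths; all values and axis conventions are illustrative.
const float F = 50.0;    // upper leg length
const float T = 70.0;    // lower leg length

float hipYaw, hipPitch, knee;

void legIK(float x, float y, float z){
    float l   = sqrt(x*x + y*y);       // horizontal distance from hip to foot
    float dia = sqrt(z*z + l*l);       // straight-line distance from hip to foot

    hipYaw   = atan2(y, x);
    hipPitch = PI/2 - (atan2(l, z) + acos((T*T - F*F - dia*dia) / (-2*F*dia)));
    knee     = -PI + acos((dia*dia - T*T - F*F) / (-2*F*T));
}

// Rotation matrix reduced to yaw: rotate a foot coordinate around the vertical axis.
void rotateYaw(float psi, float &x, float &y){
    float xr = x*cos(psi) - y*sin(psi);
    float yr = x*sin(psi) + y*cos(psi);
    x = xr;
    y = yr;
}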


The serial tab manages the incoming data from the Bluetooth module and the ultrasonic sensor. This works the same way as receiving chars in the serial monitor of the Arduino IDE.

The servos are controlled directly using the Servo library.

The entire program works like this: first the Bluetooth function gets called to look for new input. Then the right gait function is selected; it calculates the coordinates for every leg. After that, the rotation and translation are calculated. To get the right servo angles, the coordinates have to be converted from "full body" to "single leg"; then the inverse kinematics function can work its magic. Finally, the servos are moved to the calculated angles.
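As a rough skeleton (the function names here are placeholders standing in for the tabs described above, not the actual names in my sketch):

// Placeholder stubs standing in for the tabs described above:
void readBluetooth()        { /* Serial tab: check for new commands from the app */ }
void calcGait()             { /* Gait tab: foot coordinates for the current time step */ }
void calcBodyPose()         { /* rotation matrix and translation of the whole body */ }
void toLegFrame(int leg)    { /* convert "full body" coordinates to "single leg" */ }
void solveLegIK(int leg)    { /* inverse kinematics -> three servo angles */ }
void writeServos()          { /* Servo tab: move all twelve servos */ }

void setup(){
}

void loop(){
    readBluetooth();
    calcGait();
    calcBodyPose();
    for(int leg = 0; leg < 4; leg++){
        toLegFrame(leg);
        solveLegIK(leg);
    }
    writeServos();
}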

Walking Gait

Gaits are probably the most complicated thing about legged robots. Because I wanted quick results, I made a simple walking gait based on sine functions. As you can see in the diagram, the legs are lifted one after the other using a function (red) like -|sin(x)|. When one leg is in the air, the robot tends to fall over. Therefore a second function (blue) moves the body away from the lifted leg. The advantage of this gait is that the movements are fluid. The downside is that the robot can only move in one direction.
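In code, one time step of such a gait could be sketched like this (the amplitudes, the cycle length of 4 and the per-leg phase offsets are made-up values for illustration):

// Sketch of the sine-based gait: each leg gets a quarter of the cycle to lift,
// while the body is shifted sideways. All values are illustrative.
const float liftHeight = 15.0;   // how far a foot is lifted
const float shiftAmpl  = 10.0;   // how far the body leans away from the lifted leg

// t runs from 0 to 4 (one full cycle), legIndex from 0 to 3.
void gaitStep(float t, int legIndex, float &footLift, float &bodyShift){
    float phase = t - legIndex;                        // this leg is active for 0 <= phase < 1
    if(phase >= 0 && phase < 1){
      footLift = liftHeight * fabs(sin(phase * PI));   // the red -|sin(x)|-style lift curve
    } else {
      footLift = 0;                                    // foot stays on the ground
    }
    bodyShift = shiftAmpl * sin(t * PI / 2);           // the blue body-shift curve
}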

I tried to program a trot-gait later, but the rapid leg movement is too much for the servos and the frame.

Processing Simulator

Figuring out the gaits is a tedious task. Even the smallest change to the code requires you to flash the Arduino, plug the battery back in and lift the robot out of its stand. If you make a mistake, the robot will fall over or the servos will move in the wrong direction, eventually destroying themselves or other mechanical parts. After this had happened to me a few times, I began to search for an easier way of testing. There are some robotics simulators (e.g. Gazebo, V-REP), but I found them too complicated; importing the robot's geometry alone took me forever.
I ended up writing my own simulator. Processing is the perfect choice for this. It has libraries for graphical user interfaces and can import and render different CAD files. The best thing is that you can practically drag and drop Arduino code into Processing (this is because the Arduino language was originally based on Processing). There are a few minor differences, for example arrays are declared and initialized differently, as shown below. But if you avoid those, you can simulate the exact behavior of the robot and copy the working code back to the Arduino. 
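For example (the variable name is just for illustration):

int servoAngles[12];                 // Arduino (C++) array declaration
// int[] servoAngles = new int[12];  // the same array in Processing (Java)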

Compiling the Processing sketch only takes around 5 seconds.
The final simulator version can display the robot in color and has a user interface with buttons and sliders. There are different modes for manual servo control and for moving the robot around different axes (I even found a bug in my IK code this way).

Android App

I needed a quick way to change variables on the Arduino without flashing it. Therefore I bought a Bluetooth module, but couldn’t find the right Android app. So I had to write my own app using Processing for Android. This was basically the same app I posted an article about last year: http://coretechrobotics.blogspot.com/2013/12/controlling-arduino-with-android-device.html
The only difference is an additional sonar display showing the values of the ultrasonic sensor.

Conclusion


Considering the low budget, I am pretty happy with the robot so far. The stability and servo power are acceptable. I am already working on more advanced gaits, but even with the simulator this is a time-consuming process.