How Not To Build A Robot

I’ve gained a lot of experience over the last few months with regard to Fusion 360, 3D printing, electronics and more besides. I thought I’d share some of those lessons.

As Complex As You Make It

The most important lesson, as with any project, is to have an idea of what you’re building from the start and how long you have to build it. Even if it’s a relatively simple design, you’ll still come across a lot of issues that take added time to figure out, doubly so if you’re learning as you go. My robot concept was complex to start with, more so than I expected, and I had a lot more to learn than I realised too. However long you think you need, add more, and if possible simplify your design.

In retrospect, more of a plan than a quick sketch wouldn’t have gone amiss…

I had a bunch of early wins. I used existing parts from an RC car to make early proofs of concept, which sped things up, and this gave me a little too much confidence. I was designing elements in Fusion 360 in isolation, assuming they’d work, and that burnt me a lot. I went through a number of different chassis prototypes in the early stages, and it wasn’t until I realised I needed a more complete design in CAD, so I could see how everything fitted together, that I started saving serious time. I’m still not great at this but certainly getting better.

Longer term I need to learn how to use joints in Fusion 360 so that I can actually see how things fit together and what the constraints are.

I wasted a lot of time designing what ended up being seven different robots. I couldn’t have got to where I am without doing it, though, so it’s a difficult balance to strike.

Seriously, Make A List. Then Check It Again…

I had the vague idea that I’d have the StereoPi up top in the head for stereo vision, which would open up a lot of opportunities for computer vision too. Around the chassis would be a ring of sensors; ultrasonics were what I had in mind to start with, and though they’re simple to work with, they’re quite large. I didn’t really know better, so that’s what I went with. Later on I learned of the VL53L0X, a really cheap lidar sensor that’s a lot smaller too. They have the quirk of sharing the same i2c address by default, so you either need an i2c multiplexer or need to wire them up so you can reset their addresses one at a time on first boot… More complexity!
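To give an idea of the re-addressing dance, here’s a rough sketch of the shutdown-pin approach using Adafruit’s adafruit_vl53l0x library on the Pi; the GPIO pins and new addresses are placeholders for whatever the final wiring ends up being.

```python
# Sketch: giving several VL53L0X sensors unique i2c addresses at boot.
# Assumes each sensor's XSHUT pin is wired to its own GPIO (placeholder
# pins below) and the Adafruit Blinka + adafruit_vl53l0x libraries.
import board
import busio
from digitalio import DigitalInOut
import adafruit_vl53l0x

XSHUT_PINS = [board.D17, board.D27, board.D22]  # hypothetical wiring

i2c = busio.I2C(board.SCL, board.SDA)

# Hold every sensor in reset so none of them answer on the bus.
xshuts = [DigitalInOut(pin) for pin in XSHUT_PINS]
for xshut in xshuts:
    xshut.switch_to_output(value=False)

sensors = []
for index, xshut in enumerate(xshuts):
    xshut.value = True  # wake this one sensor; it boots at the default 0x29
    sensor = adafruit_vl53l0x.VL53L0X(i2c)
    sensor.set_address(0x30 + index)  # move it off the default address
    sensors.append(sensor)

for index, sensor in enumerate(sensors):
    print(f"sensor {index}: {sensor.range} mm")
```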

Again, we’ve all got PhDs in hindsight, but having a more solid plan and spending more time on research and planning in the early stages would’ve paid off in the long run.

Burnout

Look. After. Yourself.

As I mentioned earlier, I had lots of early successes which gave me an awful lot of false confidence. As soon as the easy wins came and went and the real struggle began, the build got a lot more difficult, both technically and mentally. Those who know me or have been reading the blog for a while will know I suffer from anxiety and depression; they’re a bugger individually, but when they join forces they’re truly evil. A few weeks before I applied to enter PiWars, my beloved cat, Willow, passed away. To say this was hard on me is an understatement. Coupled with the year tailing off, getting darker and colder, and things going from win after win to struggle after struggle, things got rough.

I tried to push through it. That was a big mistake, so I made the best decision for the project: take a breath and start again. With a lot of support from my girlfriend, the rest of the PiWars community, friends, family, and colleagues alike, I slowly got out of the funk while making slow but consistent progress. The Epic Rebuild Began.

Conclusions and Next Steps

I’ve learned a lot and come an awfully long way in many regards, and though I’ve still a lot to do, I’m in a better place and so is the robot. The next steps are to get the controller up and running and the robot drivable again.

In the next blog post, I’ll talk about the plans for the challenges. As it stands I’ve almost finished one arm and only need to complete the hand, then add a bunch of sensors and remote control. I have a minimum spec in sight and will at least be able to compete.

MacFeegle Prime Architecture Overview

It’s been a long while since the last post, more on that in an upcoming post titled “How Not To Build A Robot”, but I thought I’d give an update on the general architecture that is manifesting for MacFeegle Prime.

The Robot

The robot will have at its core a Raspberry Pi, in this case a Raspberry Pi 3 Compute Module hosted on a StereoPi board. This board is designed to take advantage of the Compute Module’s two camera ports and allows for GPU-accelerated stereo vision.

Latest render of MacFeegle Prime, showing a robot with tank-style treads, a head, and one arm.

For motor control, and for some of the servos, I’ll be using a RedBoard+ by RedRobotics. This has everything you’ll need for most robots, including a pair of 6A motor controllers, 12 servo headers, support for NeoPixels, and most importantly great documentation and support from its creator, Neil Lambeth. The HAT also includes a power regulator, so it powers the StereoPi too, which is incredibly handy.
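As a taste of what driving those two motor channels will involve, here’s a minimal tank-steer mixing sketch; the set_motor function is a stand-in rather than the actual RedBoard API.

```python
def mix(throttle: float, turn: float) -> tuple[float, float]:
    """Mix throttle and turn inputs (each -1.0..1.0) into
    left/right track speeds for differential (tank) drive."""
    left = throttle + turn
    right = throttle - turn
    # Scale back proportionally if either side exceeds full speed.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale


def set_motor(channel: int, speed: float) -> None:
    # Stand-in for the real motor driver call.
    print(f"motor {channel} -> {speed:+.2f}")


left, right = mix(throttle=0.8, turn=0.3)
set_motor(0, left)
set_motor(1, right)
```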

Connected to the Pi will be a Teensy 4 board; this will handle and collate data from the various sensors around the robot. Alongside it will be an i2c servo board to control the arms, and potentially an NRF24 RF transceiver too.

The Controller

The controller will also run on a Raspberry Pi, in this case a standard 3 Model B connected to a 7″ touchscreen display. This will also have a Teensy 3.6 board, which will be used to interface with various buttons and potentiometers, and possibly another NRF24; it depends on whether control via a WiFi access point will be stable enough.

The sort of thing I have in mind is similar to these controllers for cranes and diggers.

PLUS+1® remote controls
https://www.danfoss.com/en/products/electronic-controls/dps/plus1-remote-controls/

I just love the industrial design of them, and with the complexity of all the arms and similar it seemed a valid excuse to build one… I have a pair of 4-axis joysticks; these have X and Y as you’d expect but can also rotate. The 4th axis is a button on the top, which I can use as a modifier or to toggle modes.

One thing I’d love to do is a waldo controller, similar to the one James Bruton developed for his performance robot, but I’d prefer it to be smaller, and I think that’s out of scope for the competition.

James Bruton’s puppeteering rig from his video

Better yet would be one similar to the controller Naomi Wu showed in her video about the Ganker robot. It attaches around her waist and allows her to control not only the arms but the motion of the robot too, as the “shoulders” of the controller are essentially mounted on a joystick.

Still taken from Naomi’s video

This controller is incredibly intuitive; coupled with stereo vision via the StereoPi and an Android phone in a Google Cardboard headset, I think it’d be an exceptional combo. Definitely one for future development!

Software

The software for this will be written in Python but make use of the Robot Operating System (ROS). Despite the name, this isn’t an operating system but a collection of libraries and frameworks that allow the components of a robot to work together, even when spread across multiple machines. I’ll be running it in Docker, as I’ve had pain trying to get it installed natively and there’s an image available already.

This will run on both robot and controller, and the intention is that it’ll allow for control over WiFi as well as telemetry back to the controller. If a WiFi access point, likely a phone in hotspot mode, isn’t stable enough for control, I’ll fall back to the NRF24 transceiver option. Handily, there is an Arduino library that allows for sending and receiving messages in a format suitable for ROS to parse, so hopefully that’ll be fairly easy to swap in.
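To give a flavour of what that looks like in code, here’s a minimal sketch of a rospy node the controller could run, publishing drive commands as a standard Twist message; the topic name and rate are assumptions, not settled design.

```python
#!/usr/bin/env python
# Minimal ROS node publishing drive commands from the controller.
import rospy
from geometry_msgs.msg import Twist


def main():
    rospy.init_node("controller_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(20)  # 20 Hz command stream
    while not rospy.is_shutdown():
        msg = Twist()
        msg.linear.x = 0.5   # placeholder: read from the joysticks instead
        msg.angular.z = 0.0
        pub.publish(msg)
        rate.sleep()


if __name__ == "__main__":
    main()
```

On the robot side a matching subscriber would turn those messages into motor commands, and because ROS topics work across machines, running it over WiFi needs no extra code.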

Summary

There is a lot of work to do. The hardware is mostly done and just needs mounting; only the end effectors (hands) still need designing, along with a few tweaks to the head and the mount for the nerf gun.

I’m a professional software engineer by trade, so I’m hoping that writing the code shouldn’t be too bad a job (DOOM! FORESHADOWING! ETC!), and I have the week before the competition off too, to allow for last minute hacking…

MacFeegle Prime – Overview

This is definitely going to change with time but this is the current vague, but slowly solidifying, plan for MacFeegle Prime, my competitor for PiWars 2020!

Concept

This robot is heavily inspired by Johnny Five from Short Circuit. To that end, it’ll be tracked and have a pair of arms, a head, and a shoulder-mounted nerf cannon. There will have to be lasers and blinkenlights in there somehow too! The demeanour and style of the robot will be very heavily influenced by the Wee Free Men from Terry Pratchett’s Discworld series… No, I’m not sure what that’s going to look like either, but I’m looking forward to finding out!

Cover of Wee Free Men, pretty sure this is fine under fair use…
Original Designer, Paul Kidby
By Rik Morgan (Rik1138, http://www.handheldmuseum.com) – http://props.handheldmuseum.com/AuctionPics/Johnny5_03.jpg  CC BY-SA 1.0, https://commons.wikimedia.org/w/index.php?curid=2693174

Hardware

Unsurprisingly, this will primarily run on a Raspberry Pi, which will take data from the controller, sensors, and cameras, then send appropriate control signals to the motors, servos, and lights.

As he has a head, I was planning on using a pair of cameras. The “simple” option is just to stream these over WiFi to a phone and use a Google Cardboard headset to get a 3D live stream from the robot’s perspective. Longer term I’ll be using OpenCV or similar to generate a depth map to allow for autonomous navigation and object detection. I was initially thinking of using two Pi Zeros with cameras attached; they could be dedicated to rescaling and streaming at different qualities, a higher quality stream for an HMD (head mounted display) and a lower quality one for another Raspberry Pi to run OpenCV on. In the end, I went with the StereoPi, as it’s designed for this very task! To that end, I’ve got the Deluxe Starter Kit on order, which includes a Raspberry Pi 3 Compute Module and a pair of cameras with wide angle lenses.
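For a rough idea of the depth map step, here’s a minimal OpenCV sketch computing a disparity map from a rectified stereo pair; the filenames and tuning numbers are placeholders, and real use needs the cameras calibrated first.

```python
import cv2

# Load a rectified stereo pair (placeholder filenames).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: nearer objects show larger disparity.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)

# Normalise to 0-255 so the result is viewable as a greyscale image.
view = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("disparity.png", view.astype("uint8"))
```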

Motors are controlled via an Arduino with a motor shield, in this case an Adafruit Feather with a motor wing, and they drive a pair of PiBorg MonsterBorg motors, which are beasts! I started off with motors from an RC car and rapidly hit the limit of what they could move. The robot already weighs 1.6 kg…

For the arms I’ll be using servos, a whole bunch of them! I’m aiming for 7DOF arms, the same as a human’s, with the shoulder servos more powerful than the others, as they’ve more weight to move around. The head and nerf cannon will also have a pair of servos for moving them around. The torso will need to be actuated too, but I’m probably going to use a motor for that, as it’ll lift the whole weight of the robot from the waist up. To control all of these I’ve an Adafruit 16-servo HAT; I may need another…

For sensors I’ve a load of ultrasonic sensors, inertial measurement units, and optical sensors. The ultrasonic sensors will be mounted around the robot to get a decent set of returns to build a map from, the IMUs will be good for checking whether the robot is level and how it’s moving, and the optical sensors should be handy for line following. These will almost certainly be fused together via an Arduino; this means the real-time bits can be done on dedicated hardware, and we don’t have to worry about timing issues on the Pi, as it isn’t real time.
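The Pi side of that split could be as simple as parsing a serial stream; here’s a sketch using pyserial, with the port and the one-reading-per-line framing invented for illustration.

```python
import serial  # pyserial

# Placeholder port/baud; the Arduino streams one reading per line,
# e.g. "ULTRA,2,431" (sensor type, index, value) -- an invented framing.
port = serial.Serial("/dev/ttyACM0", 115200, timeout=1.0)

while True:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue  # timeout with nothing received
    kind, index, value = line.split(",")
    if kind == "ULTRA":
        print(f"ultrasonic {index}: {value} mm")
```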

Software

I’m expecting to use Python for the lion’s share of the code for this; the Pi, StereoPi, and the servo HAT I’m using all have excellent support for it, and I’ve a bit of experience with it already. The Arduinos run C, and if I need to write something for the phone to enable a head-up display, I’m going to use the Unity games engine, which uses C#. It’s what I use at work, so I know both Unity and C# very well.

Construction

I’m using an aluminium plate for the base of the chassis and 3D printing custom components to mount on it, currently all hot-glued in place for rapid prototyping/laziness…

Timescale

I’m close to having the drive part of this done, at least the first iteration, and I’m expecting that to be sorted over the next few weeks. There will no doubt be improvements over time, so it’ll not be a one-shot thing. For the torso, arms, and head, I’m aiming to get the first iterations sorted by the end of November. This will give me plenty of time to improve things and get the software written before next March.

That’s it for now, it’s mostly a brain dump while I work on things in more detail.

1st Gen Roomba Overview

Before I can rebuild Bumblebee, my 1st generation Roomba, I need to figure out how he works.  I’m going to split this into three sections: power, motors, and sensors.  I’ll cover how to interface with each of these in future posts.

Power

This was simple enough: I charged the battery and put a multimeter across the terminals, which showed 16 V.

Motors

A quick count shows that there are five motors: one for each wheel, one for the brush, one for the side sweeper, and one for the vacuum.  From the fact they all seem to have a black and red wire going into them, and from the age of the device, I took an educated guess that they are simple DC motors.  To test this theory I took the probes from my voltmeter, plugged them into my bench supply, and poked at the motor terminals with the voltage and current limits set low.  With this simple setup I was able to give the motors different voltages and easily reverse the polarity; sure enough, the speed changed with voltage and the direction changed with polarity.  The wheel motors will need to run in either direction, but the other three only need to run in one.

Sensors

There turned out to be a lot more sensors than I realised; it’s quite a packed little robot!  The sensors fall into two categories: IR sensors and switches.  The microswitches are on either wheel and on the caster wheel at the front; it looks like all three are wired to the same header, so the robot only knows that a wheel is up, not which one.  The rest of the sensors are a bit more convoluted.

Wheel Encoders

The drive wheels each have an encoder with four wires going in; once I’d opened one up, it turned out they’re comprised of an IR LED and a light-dependent resistor (LDR).  I checked that they were IR by giving them just over a volt: no visible light, but through my phone camera I could see the telltale purple glow.  Shortly after this I realised the error of my ways as the LED went out; without a current-limiting resistor I’d burnt it out!  (A few hundred ohms would have saved it; e.g. from a 5 V supply, an IR LED at roughly 1.2 V forward voltage and 20 mA wants (5 − 1.2)/0.02 ≈ 190 Ω.)  Thankfully the LDR worked with visible light, so I ended up replacing the LEDs on both sides with red ones.

Cliff Sensors

Along the underside of the bumper there appear to be four cliff sensors, again IR LED/LDR combos, which in this configuration are known as IR distance sensors.  I used these long ago when I built a PIC16F84-based robot at college, so these aren’t a mystery.  The resistance of the LDR varies depending on how much light bounces back; you need to calibrate them in your code or circuit, but they are simple enough.
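The calibration itself can be as simple as averaging the reading over known-good floor and flagging anything well below that; here’s a toy sketch, with the read function simulating whatever ADC ends up in front of the LDR.

```python
import random


def read_cliff_sensor() -> int:
    # Stand-in for an ADC read of the LDR divider (0-1023);
    # here it just simulates floor-level reflectance.
    return random.randint(600, 700)


def calibrate(samples: int = 50) -> int:
    """Average readings over known-safe floor to get a baseline."""
    return sum(read_cliff_sensor() for _ in range(samples)) // samples


def is_cliff(baseline: int, margin: int = 150) -> bool:
    # Over a drop, far less IR bounces back, so the reading
    # falls well below the floor baseline.
    return read_cliff_sensor() < baseline - margin


baseline = calibrate()
print("cliff detected:", is_cliff(baseline))
```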

Wall Sensor

This is an IR distance sensor on the right-hand side of the bumper; it works the same as the cliff sensors.

Bumper

This one confused me for a while, as I couldn’t see any switches on the ends of the arms of the bumper.  I ended up taking the bumper out, which required removing the logic board, and the penny dropped: at either end of the logic board there is an IR LED/LDR pair, and when the bumper is hit the light level changes.  I wondered at first why they didn’t just use a switch, but the video linked at the top of this page explains it all.  A switch would be hammered so often it would fail in no time, and the design of the bumper mount also cleans the area between the LED and LDR, which is handy.

IR “Eye”

On the top of the bumper at the front is a 360-degree lens which directs light onto an IR sensor of some kind; I’ve not dug deeper into this one yet.  I believe it acts like the IR receiver for a TV remote, as it’s used with the Roomba’s virtual wall.  If the robot detects the IR code being sent out by the virtual wall, it acts as though it hit a solid object, which is useful for preventing your hoover from escaping.

I’ll cover how I use each of the above in upcoming articles.

Project Bumblebee

Bumblebee?


Bumblebee is my Roomba, so named as he lost his voice long ago.  About a year ago his logic board started playing up, and though he was still able to clean, at the end of each cleaning cycle he wouldn’t go into standby and his battery would drain in no time.  At that point he stopped actively gathering dust and started doing it passively as he sat behind my sofa.

Bumblebee MK2

Since I was a kid I’ve always wanted to build a robot, so I figured I’d kill two birds with one stone and use Bumblebee as the chassis for a mobile robot, he already is one after all, while also aiming to restore his base functionality as a robot hoover.

The Plan

Bumblebee is an original-model Roomba from 2002; he was a gift from a friend who knew I loved to tinker and gave me him broken.  If I could fix him, I could keep him; thankfully it was an easy fix, as the battery was dead.  This model is very simple in its function and behaviour: it has no mapping capability, no dock, and no wireless control.  It can apparently be controlled using IR, but I’ve never had a remote.  It also lacks the diagnostics port that the newer models have, which makes hacking a modern Roomba really easy, so this is going to be a bit trickier, a lot more educational, and most importantly more fun!

The parts I’ve used to partially resurrect him are an Arduino Leonardo and an Adafruit Motor Controller Shield.  I’ve also got a Raspberry Pi 3 to add into the mix, for WiFi control and more processor-intensive tasks.  The idea is to use the two to their strengths: the Arduino will control the motors and read the sensors, allowing for real-time overrides in case of collision, and the Pi will be able to sit back and give out the orders.  It’s a classic split for mobile robots and thankfully very cheap to implement now.

Current State

As I said, I’ve been working on this for a while, so I’ve a load of notes to type up and loads of “learning experiences” to share, mostly from when I made a rookie error and burnt out one of the motor controllers…  I’ve now got the motors under control over serial, and I’ve a simple console application that lets me drive him around and toggle the sweeper/vacuum fans.  Here’s a video of him in action:

Next Steps

My next item to look at is getting sensor data into the Arduino, first up the encoders.  Encoders are devices that let you measure the movement of a wheel (you’ve likely used a rotary encoder on a hi-fi to control the volume), and the Roomba has one in each wheel.  Right now I can control how much power goes to each wheel, but because of differences in the state of the gearboxes, the carpet, and who knows what other factors, the wheels spin at different speeds.  By measuring the speed from the encoders we can compensate for this, and we can also use them to calculate the distance we’ve travelled.
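The compensation can start out as a simple proportional controller, nudging each wheel’s power until the measured tick rate matches the target; here’s a toy sketch, with the encoder counts made up and the motor interface left abstract.

```python
KP = 0.05  # proportional gain; tune on the real robot


def update_wheel(target_ticks: float, measured_ticks: float, pwm: float) -> float:
    """Return an adjusted PWM duty (0.0-1.0) for one wheel based on
    the difference between target and measured encoder ticks."""
    error = target_ticks - measured_ticks
    pwm += KP * error
    return min(max(pwm, 0.0), 1.0)  # clamp to the valid duty range


# One control step per wheel, with made-up encoder counts:
left_pwm = update_wheel(target_ticks=20, measured_ticks=17, pwm=0.50)
right_pwm = update_wheel(target_ticks=20, measured_ticks=23, pwm=0.50)
print(left_pwm, right_pwm)  # the slow wheel speeds up, the fast one slows
```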

After that comes the rest of the sensors. Those I’ve found so far are:

  1. Cliff sensors – these are under the bumper and detect drops to prevent him falling down stairs; I think there are four of them and they appear to be IR distance sensors
  2. Bumper sensors – these detect collisions; I think there is one at either end of the bumper, so I’ll know if I’ve hit something to the left or right
  3. Wall sensor – another IR distance sensor, mounted on the right of the bumper; this allows for wall following behaviour
  4. Wheel up/down switches – One on each of the drive wheels and one on the caster at the front.  They detect if the wheels are up or down and can be handy for detecting when we’ve beached ourselves.
  5. Wheel encoders – these are IR LEDs paired with a light-dependent resistor.  I blew one of the LEDs by accident so replaced them both with red LEDs.
  6. Beacon IR Receiver – Not sure how this works yet; it’s a 360-degree lens on the top that receives a beam from the virtual wall, a device you place by your door to stop him escaping.  I’m hoping to add more sensors to make this redundant.
  7. Buttons – there are three buttons for room size to start different cleaning cycles.  They could be useful though I may not bother with them.

Once I’ve got all the sensors connected, I’ll be able to hook up the Raspberry Pi and start working on reproducing his original behaviour.  After that I’ll be able to build up his capabilities over time and add a load of new ones too.  I’m not intending this to be just a hoover, but a mobile robot platform that happens to be able to hoover.

If you’ve got this far, kudos!  That’s it for now; more posts will trickle out as I write up my old notes.  I’m going to carry on having fun building things and write posts in the order they happened.  Hopefully I’ll catch up fairly quickly!

Controlling a Syma S107G with an Xbox 360 Controller

As a tie-in to my No ‘Air Ambulance Challenge, a sponsored body wax for the local Air Ambulance charity, I’ve decided to upgrade my old project by swapping the RC car out for a Syma S107G.  If you enjoyed this post, please donate a few quid or whatever you can here.  It’s for a very good cause, and as it’ll put me through a *lot* of pain, you can be sure I’m not asking for my health!

I tried to implement the IR protocol in C# using the .NET Micro Framework on the Netduino, but it proved more than a little tricky to get the timing right.  As there are a few implementations out there for the Arduino, I decided to stand on the shoulders of giants and build on top of existing code, two in particular:

http://www.kerrywong.com/2012/08/27/reverse-engineering-the-syma-s107g-ir-protocol/
http://abarry.org/s107g-helicopter-control-via-arduino/

One thing worth noting is that my helicopter uses Channel B; as you can see in the comments in my code (heavily based on Kerry Wong’s), it’s easy to switch between the two channels.  For the serial control aspect I implemented a similar method to the one in my Netduino project and in Andrew Barry’s implementation.  For the Xbox 360 controller interface, I updated my previous application to listen to the trigger for the throttle and the right analogue stick for pitch and yaw.  I had some fun when I got the values backwards and slammed the copter into a wall; thankfully they’re built tough enough to survive kids and geeks alike…
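For flavour, here’s a rough sketch of what the PC side could look like in Python, reading the pad with pygame and shipping throttle/yaw/pitch bytes to the Arduino over serial; the COM port, axis indices, and three-byte framing are all invented for illustration, the real protocol is in the linked code.

```python
import pygame
import serial  # pyserial

# Placeholder port; the Arduino turns these bytes into IR pulses.
port = serial.Serial("COM3", 9600)

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)
pad.init()

while True:
    pygame.event.pump()  # refresh controller state
    # Map axes (-1.0..1.0) to 0-255 bytes; axis numbers vary by driver.
    throttle = int((pad.get_axis(2) + 1.0) / 2.0 * 255)
    yaw = int((pad.get_axis(3) + 1.0) / 2.0 * 255)
    pitch = int((pad.get_axis(4) + 1.0) / 2.0 * 255)
    # Invented framing: a sync byte followed by the three channels.
    port.write(bytes([0xFF, throttle, yaw, pitch]))
    pygame.time.wait(50)  # roughly 20 updates a second
```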

The IR emitter is a 500 Ω resistor inline with a pair of IR LEDs, connected to pin 3 on the Arduino.  Video demo below:


Xbox 360 controller interface to enable control of a Syma S107G.

You can find the code here: Syma360.

Microsoft Surface, MSR and Robots

Microsoft Research never cease to amaze me.  I was lucky enough to briefly visit the labs in Cambridge last week, and I’ll freely admit I was gushing like a star-struck fan!  Engadget have an article up featuring some of the work MSR are doing with Surface and search-and-rescue robotics, well worth checking out.

http://www.engadget.com/2011/08/11/microsoft-surface-controlled-robots-to-boldly-go-where-rescuers/

Keegan

Netduino Quadcopter

In my first post last year I stated that my New Year’s resolution was to build more stuff; the jukebox lights are wired up and that project is more or less done.  One of the other ambitions I mentioned was to design and build a UAV.

Now, I’m a Microsoft guy.  It’s what I know and do for a living, so I know their developer stack pretty well, and upon learning of the Netduino a plan came together and this year’s project was born!  So far I’ve got the Netduino, a Razor 9DOF sensor board, and a plan!  I’ve been following other similar projects such as Aeroquad, and I plan to blog the project as I go.

The plan is currently fluid; I’ve not built anything like this before, though I have flown model helicopters.  I’ve a few specifications that the copter will be built around:

  1. Autonomy – With the sensors onboard, the IMU and GPS, it will need to be able to fly a set path.
  2. Payload – Along with flying a path, the copter will have a camera or two to capture aerial photographs, ideally from multiple angles to have some fun with Photosynth.
  3. Telemetry – The copter will have a live telemetry link back to a PC and on board storage for later review of flight data.  Ideally a live video feed too which will be the stretch goal.
  4. Easy Mode – As with the Aeroquad, I’d like my copter to have stable and agile settings; I want my friends to be able to fly this easily with minimal practice.

I’ve set myself quite a challenge, but as with all my projects it should be a hell of a lot of fun.  Stay tuned for more…