Installing RedBoard+ Library on Ubuntu

MacFeegle Prime is heading back to PiWars!

A lot has happened since then. Ignoring the whole *gestures broadly at the world*, a lot has moved on in software terms: we now have Ubuntu 20.04 for the Pi, ROS Noetic has been released for it, and Approximate Engineering has rewritten the RedBoard+ library from scratch.

This library also includes support for the servo board I’m using, meaning all the servos can be controlled from the same library. That’s one less thing for the Teensy to do, so I thought I’d start here.

Notes

Installing the library with “pip install redboard” works a treat, and it includes a GUI that lets you tweak the configuration. If you try running “redboard-gui” on Ubuntu, however, you’ll find the pigpio daemon isn’t included; I had to install it from source, so follow the instructions here.
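
Once it’s installed and pigpiod is running, driving things from Python is pleasantly terse. Here’s a minimal sketch based on my reading of the library’s docs; the m0/m1 motor and s-numbered servo properties are the library’s convention, but treat the details as unverified:

    import redboard

    rb = redboard.RedBoard()   # talks to the board via the pigpio daemon

    rb.m0 = 0.5     # motor 0 at half speed forwards
    rb.m1 = -0.5    # motor 1 at half speed backwards
    rb.s20 = 0.0    # centre the servo on GPIO 20

    rb.stop()       # stop the motors and disable servo outputs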

The next issue I hit was that I didn’t have permission to access i2c. With Raspbian you can run raspi-config to enable and disable access to various interfaces, but that isn’t available on Ubuntu. I followed these steps to get it working:

First create a new file using nano:

sudo nano /lib/udev/rules.d/60-i2c-tools.rules

Then add the following lines:

KERNEL=="i2c-0" , GROUP="i2c", MODE="0660" 
KERNEL=="i2c-[1-9]*", GROUP="i2c", MODE="0666"

Reboot, then run “sudo pigpiod” followed by “redboard-gui” and you should get the following:

redboard-gui in a terminal window

You’ll need to run “sudo pigpiod” on startup too. To do that, run “sudo crontab -e” and add the following to the end of the file:

@reboot /usr/local/bin/pigpiod

Reboot and you’re good to go!

Big News, Going Pro!

I have taken the plunge and gone part time at work, dropping to three days a week, to give me far more time to concentrate on my lifelong passion of robotics, with the aim of making it my career!  This wasn’t exactly planned, if I’m honest, but rather the result of a recent epiphany, building on my experience with MacFeegle Prime at PiWars.

My plan is to use this time to continue learning ROS, finish development on MacFeegle Prime and its controllers, and restart development of a few stale projects.  I’ll be making all of these projects open source too, to give back to the community that has inspired, and continues to inspire, me to bigger things.

The projects should be familiar to long time followers of my blog and other channels, though as I’ll be rebuilding them to share they’ll have more sensible names.  Sensible, but all puns…

NE-1 (Bumblebee)

This will be a resurrection of the first robot project I tried since my A-Levels: rebuilding my Original Roomba, nicknamed “Bumblebee”.  I was given this robot in a broken state by a friend of mine; it turned out the issue was a duff battery, so it was easy to get working again.  It quickly developed a fault where its speaker died though, and it lost its voice, which is where the nickname comes from.  I used it as designed for more years after that before it just stopped working; none of the buttons worked and it was totally unresponsive, so it looked like the main circuit board had died.

Bumblebee

I made a series of videos showing the ins and outs of reverse engineering the robot, culminating in being able to drive it around using a gamepad. I never recreated its autonomous ability to hoover by itself, though.

Project NE-1 will resurrect this project as a way to show how a cheap old hoover, which can be found on eBay for as little as £25, can be reborn using a Raspberry Pi and given new purpose. The newer versions have a diagnostics port which makes this even easier, but this will concentrate on how to reverse engineer the hardware, and hopefully show how this, or an RC car, can be converted into a very affordable robotics platform. I’m hoping this will show that there are ways for anyone *cough* to get into robotics.

NE-5 (MacFeegle Prime)

MacFeegle Prime was my entry for PiWars 2020; it won the public vote for fan favourite and came second overall! This robot has had a lot of blood, sweat, and tears put into it, but it suffers from a lot of legacy problems from early in its development. Essentially I didn’t *really* know what I was doing, making it up as I went along, and in some ways it shows.

Very Early Prototype…
Winner! Celebrating on the day of Virtual PiWars 2020

The plan for MacFeegle Prime is to finish the build as per the MVP and get him to the basic level needed to compete in PiWars when it’s held next year. At that point I’ll have learned how to build an awesome robot and, more importantly, all the ways not to build one. Using all my experience, and the parts, from MacFeegle Prime I’ll design and build NE-5, redesigned from the ground up. The mechanical files, BOM, and code for NE-5 will be published so anyone can build one to use for research or just for a laugh.

MacFeegle Prime has been built using off-the-shelf parts, some slightly modified, along with 3d printed custom components. It’s this combination of easy access to components and relatively simple construction that makes for an incredibly capable mobile robotics platform, and a testbed for larger versions. In future phases there will be an NE-5L, and hopefully an XL, closer to the size of its inspiration…

NE-Where (The Luggage)

Has to be done!

The Luggage has been a daft project that has been on and off the back burner since EMF 2016, when I realised how annoying it was to walk back and forwards from my car to bring all my crap to the campsite. Fast forward a couple of years to EMF 2018 and Hacky Racers was born! I decided to resurrect the project as a racer, with a pair of 2kW motors and a lot of moxie! Again, I was making it up as I went and I didn’t have the experience with CAD I do now. I didn’t really have a plan so much as a pLn, left everything to the last minute, and it didn’t come together at all.

After the first couple of Hacky Racers events I realised I really like commentating and running the races; handy, as everyone else wanted to race! This also meant I didn’t have to worry about falling within the rules (which it already stretched into a lot of grey areas), nor about getting it done “in time” for anything, which meant it didn’t get done…

Fast forward to now and I have the majority of the parts to not only finish it but make it much more capable. Rather than being a racer, then a racer that could be remote controlled, it’s now going to be a robot that I can ride.

To complement the relative simplicity of the NE-1 and the complexity of NE-5, NE-Where will be my heavy duty rover. I’ll be redesigning it from scratch along similar lines to my bike trailer. It was literally built around an 84L Really Useful Box, and that’s my plan for NE-Where too. As well as having a daft alter ego in the shape of The Luggage, legs and all, it’ll also have a full set of wheels for more sensible uses.

A Really Useful Trailer

This will be used for research into logistics robots, rough terrain navigation, and as a base for NE-5XL too.

NE-Thing

Robots aren’t much use if they can’t be controlled and for PiWars I made a custom controller for MacFeegle Prime based around a pair of three-axis joysticks, a bunch of switches, and a Raspberry Pi.

This controller uses ROS, as does the rest of the robot, to communicate with and control it remotely. I got it to the point where it could drive the robot around and control its head too; with the live video stream I could operate it remotely. The plan was to have a driving mode and a manipulator mode, alas I didn’t get the arms integrated in time to get that far.

This was always intended to be reused for other projects, hence NE-Thing, and the design will be extended with that in mind. Along with the WiFi and Bluetooth available on-board the Pi, I’ll be adding an NRF24L01 transceiver as well as a GPS receiver. The former will be useful for outdoor projects where WiFi isn’t available, and GPS will allow for dynamic return-to-base control too.

In this phase the controller will be used for the robots above; longer term I intend to build a drone too, where the return-to-base functionality should come into its own.

NE-Body

One thing I’m going to research and develop further is teleoperation. With NE-5 having stereo vision, I want to build a waldo controller to allow more intuitive control over the robot’s arms.

A render of the waldo controller

The design above was thrown together in a few hours (that I can say that at all shows how far I’ve come), and it needs a lot of improvement to be actually usable. I’ll likely switch to encoders of some kind instead of potentiometers, but that needs researching. Due to the sheer number of potentiometers involved this will be based around either a Teensy or an Arduino Mega; I think it’s 16 joints in total, after all.

At the minute I’m not sure if this will connect directly with the robot or be a peripheral of the NE-Thing, I’ll have to make that call when I get around to it.

Funding…

Dropping to three days a week means I’ve still got a regular income and I’ll be able to cover my bills; I’m incredibly lucky to be able to say that, and the vast majority of parts for the projects above I’ve purchased over the years already. For the foreseeable future I’ve plenty to be getting on with and all the parts I need, but for future phases I’ll need to find that budget from somewhere. My plan is to launch a Patreon page to help fund future work, with the hope of taking this full time too.

If you stuck around to the end, thank you! Let’s see where this ride takes us!

Using The RedBoard+ From Within a Docker Container…

…because why do anything the easy way?

I’ve switched to using the StereoPi image for MacFeegle Prime as it offers the lowest latency for streaming video, the downside is I can’t get ROS to build on it so I’m using Docker which needs to interface with hardware…

I’m using the RedBoard+ by Red Robotics; it uses the pigpio library, which can work over a socket or pipe. I’m trying to figure out how to have the pigpio daemon running on the Pi host and access it from a Docker container that’s running ROS.

Security Note Of Doom!

The following changes have security implications, only do this if you’re running on a private, trusted network. I don’t really know what I’m doing so assume the worst and double check everything I suggest!

Things To Tweak

After some digging, this wasn’t as evil as I thought it would be. For pigpiod you need to change the service definition to remove the “-l” from the start command:

sudo nano /lib/systemd/system/pigpiod.service

Remove the “-l” from the ExecStart line.
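
For reference, on my install the relevant line looked something like the below (the path to the binary may differ if you built pigpio from source); just drop the -l:

    # before:
    ExecStart=/usr/bin/pigpiod -l
    # after:
    ExecStart=/usr/bin/pigpiod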

This is needed as, by default, pigpiod runs in localhost mode, which means it’ll only accept connections from the local machine. When connecting from inside a Docker container, the container is considered another host.

In order for the Docker container to access the hardware it’ll need privileged status; add this to the command when you create your container. E.g.:

docker run --privileged -d ros

From inside the Docker container, in the same shell instance as the code that calls the RedBoard library, you need to run this:

export PIGPIO_ADDR=[PI IP ADDRESS]

This sets the address used by the pigpio library that RedBoard uses to the host machine.
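
Under the hood the pigpio Python library reads that environment variable when it connects, and you can also pass the address explicitly. A minimal sketch to sanity-check the connection from inside the container (the IP address is a placeholder, swap in your own):

    import pigpio

    # pigpio.pi() honours PIGPIO_ADDR/PIGPIO_PORT if set; passing the
    # host explicitly works too. Replace the address with your Pi's IP.
    pi = pigpio.pi(host="192.168.0.10")

    if not pi.connected:
        raise RuntimeError("Can't reach pigpiod - was the -l flag removed?")

    print(pi.get_hardware_revision())  # any successful read proves the socket works
    pi.stop()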

Conclusion

After doing the above I was able to use the RedBoard keyboard_control.py script to control the robot from inside a docker container. I’ve not tried anything further yet but that proved the concept enough for today!

Raspberry Pi – Docker on USB Drive

As per my last post I’m using Docker on my robot with ROS. The last task is to get docker running from a dedicated USB drive to split resources between that and the SD card the OS is running from. A good guide to mounting a USB drive can be found here.

Note: rather than using umask=000 when mounting, you need to mount, then change the permissions of the “host” directory to 777. For example, mount to /media/usb as per the article, then chmod 777 /media/usb **WHILE MOUNTED**. This should allow you to mount, then set it to automount on boot.

If you are running headless and there is a problem with the fstab file it can get annoying, so to test in advance of a reboot run “sudo mount -a” to mount all volumes as per that file. If it succeeds, you can reboot.

I was having a problem mounting with fstab: I could manually mount the USB folder every time, but not using “mount -a”. The penny dropped when I ran “df -h” to see how much space was free and noticed /media was itself a mount point. I created a new root folder called “docker” and it worked a treat.
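
For reference, the sort of entry that ended up in my fstab looked something like this; the device name and filesystem are examples (check yours with lsblk or blkid), and nofail stops a missing drive from blocking boot:

    /dev/sda1  /docker  ext4  defaults,nofail  0  0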

Following this answer I moved the location of the docker containers and such to /docker.
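
The gist of that answer is pointing the Docker daemon at a new data directory via /etc/docker/daemon.json; something like the below, followed by “sudo systemctl restart docker”. (The key is called “data-root” on current Docker versions; older ones used “graph”.)

    {
        "data-root": "/docker"
    }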

I’ve run “docker pull ros” and it’s now happily using the USB drive instead of the SD card. 🙂

StereoPi Image Mods

I’ve been using the Ubiquity Robotics Raspberry Pi ROS image to run both the robot and controller, as it seemed the easiest way to get ROS running, but now that I’m trying to get low-latency streaming working from the cameras it’s proving tricky.

New plan: use the StereoPi image with Docker to host ROS images/containers.

Some Mods Needed…

In order to lower latency as much as possible, the StereoPi image is a heavily modified version of Raspbian; this includes custom partitions and the file system being set to read-only. Here are the steps I followed to get it to a state where I can restart development.

1. Get and Modify the Image

Head here and follow the steps to download and install the image, then follow the steps to get it on the WiFi; under the Advanced section you’ll see details on how to SSH to the Pi afterwards. Once logged in you’ll need to temporarily set the file system to read/write, then edit the fstab file.

Under the “SLP Structure Details” section you’ll find this command:

 sudo mount -o rw,remount /   

This will set the file system to read/write, at which point you can open fstab and make the change permanent by changing the ro to rw for the root entry:

nano /etc/fstab
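
The root line in fstab will look something like the below (yours may name the device or options differently); the change is just ro to rw in the options column:

    # before:
    /dev/mmcblk0p2  /  ext4  defaults,ro,noatime  0  1
    # after:
    /dev/mmcblk0p2  /  ext4  defaults,rw,noatime  0  1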

2. Resize the Partitions

I was trying to figure out how to do this on Windows, then realised I could just install gparted on the controller and remote to it… I put the micro SD card in a USB card reader and followed these instructions. The 1.8GB root partition was expanded to around 25GB and the recording drive slimmed down accordingly.

screenshot of gparted with the final sizes of the partitions

3. Install Docker

This next bit is trivial; run this command, as taken from the Raspberry Pi blog:

 curl -sSL https://get.docker.com | sh 

4. Get ROS Images

Next bit is easy too:

 docker pull ros

Considerations

One of the reasons StereoPi is very quick is that it’s tailored not to use the filesystem on the SD card; to that end it may be worth moving all of the Docker containers and images over to a USB drive. There’s more information on how to do that here, but I’ve not tried it yet:

https://stackoverflow.com/questions/32070113/how-do-i-change-the-default-docker-container-location

Next Steps

The next thing I need to do is convert my code to work in a Docker container. This shouldn’t be too tricky, but as the RedBoard library will need to talk to the hardware there will likely be complications.

Order of Operations (Time Management)

In an earlier post I talked about The Plan; that post has become a living document as I update it when bits get done. I’m trying to think a bit more about the order in which I’m doing things, as some tasks have prerequisites, and thought I’d share my thoughts on how to manage it.

For example, I can’t do remote controlled motion until the remote is sending messages, and in order for something to receive them I need the new head printed to take the StereoPi board back in place, and so on. To get the remote working I also need to rewire it slightly to use a hardware serial on the Teensy, as USB serial doesn’t work as I’d expect with a Pi. There’s a lot to do, but what’s the priority and how do I decide?

One thing that is time consuming, more for the build time than design time, is anything that requires 3d printing. Depending on the size of the model it can take an hour or overnight to print parts so this is an easy win for planning. If I have something that takes a while to print, prioritise that so I can get on with something else while my printer does my bidding! <evil laugh>

After that it’s a matter of thinking tasks through to mentally go through the process of doing the work. I like to make notes about each task in bullet lists so that I can go back, edit, move them around in the list, until I get my head around it.

For the example above, here’s a brain dump:

Motion control needs the StereoPi in the new head design, finish the design off and get it printing. For remote control to work we need to get messages from Control to Robot, this will need the test code expanding upon to send useful data. The controller also needs wiring up properly as though the Teensy is putting out the state of the joysticks and switches over serial, the Pi can’t actually open serial to the Teensy, using hardware serial on the Pi apparently mitigates this.

That brain dump then can be turned in to a list:

  1. Finish head redesign
  2. Print new head
  3. Rewire Teensy in controller to hardware serial
  4. Update ROS test code to relay message from Teensy
    1. Control side
    2. Robot side
  5. Add RedBoard library to the receiver code, hook up to motor control

This process doesn’t take long and it helps me a great deal; this project is incredibly complex and has a lot of moving parts that depend on one another. Once I get the above tasks done I can think about the next steps. After the head it’s controlling the arms, which will require a board to be soldered, new designs doing/printing, and more messages between controller and robot, so the tasks will likely be similar to the above. Running through the same process will help concentrate my effort and hopefully reduce the amount of stress involved.

Hope this helped someone, if nothing else if ever I’m struggling again if someone could point me towards this as a reminder I’d appreciate it!

Pi to Pi Comms Using ROS

I’ve decided to use ROS (Robot Operating System) for my PiWars project as it’s an industry standard and this is an excellent excuse to learn it. For some reason I thought that ROS was a realtime operating system; it turns out it’s a bunch of libraries and services that run on top of existing operating systems, though that’s selling it short. It’s been around since 2007 and there are *loads* of libraries available for it, including plenty of kinematics libraries, which I’m hoping to use to simplify navigation and control of the arms. I’m hoping to stand on the shoulders of giants here.

I’ve been playing around with the tutorials and have messages going from one Raspberry Pi to another so thought I’d share how I got here.

Left, the Raspberry Pi console on the controller. Right, the Raspberry Pi in the robot.

Setup/Prerequisites

I’m using the image provided by Ubiquity Robotics; it works and already supports stereo imagery using the StereoPi, so it seemed daft not to use it. Once you have two Raspberry Pis running with this image, get them both on your network. If you’re running Windows you may also want to install Bonjour Print Services; this includes the same service that the Raspberry Pis use to advertise themselves on the network and means you can find them more easily by hostname.

Tutorials

The ROS tutorials can be found here. If you want to do ROS development on a Windows machine, this may be of use: it’s instructions for installing ROS in the Windows Subsystem for Linux, Docker, or a VM.

The specific combination of tutorials I used was the Python pub/sub tutorial and the “running on multiple machines” one. I ran the former on each Pi first to make sure they were working, then followed the steps in the latter to set the robot as the master node, running the listener on the robot and the talker on the controller. You can do it either way around; I just like the idea of sending messages from controller to robot. 🙂

Python Publisher/Subscriber Tutorial
Running on Multiple Machines

If you follow these along at home you will need to go back a few steps: to run the pub/sub tutorial you need to build a package, and to do that you need to create and set up your workspace. The prerequisites for each tutorial are listed at the top of each article, so it’s easy to backtrack.
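
For flavour, the working heart of those two pub/sub scripts condenses to something like this (a trimmed sketch of the standard rospy tutorial code; the node and topic names are the tutorial’s):

    #!/usr/bin/env python
    # talker.py - run on the controller, publishes to the 'chatter' topic
    import rospy
    from std_msgs.msg import String

    rospy.init_node('talker')
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(1)  # one message per second
    while not rospy.is_shutdown():
        pub.publish("Hello, World!")
        rate.sleep()

    #!/usr/bin/env python
    # listener.py - run on the robot, prints anything heard on 'chatter'
    import rospy
    from std_msgs.msg import String

    def callback(msg):
        rospy.loginfo("I heard: %s", msg.data)

    rospy.init_node('listener')
    rospy.Subscriber('chatter', String, callback)
    rospy.spin()

The multiple machines part then mostly boils down to running roscore on the master (the robot, in my case) and pointing the other Pi at it with “export ROS_MASTER_URI=http://<robot hostname>:11311” before launching its node.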

Learnings

So, I can send “Hello, World!” from one machine to another. Woo I hear you say! It doesn’t sound like much but from here I can use these concepts to send and receive messages between controller and robot. For example, one node would publish sensor data, I would then have one or many listeners that use that data. Another would listen for motor control signals, telemetry data, etc…

Next up, use the RedRobotics samples on the robot to enable remote control and basic telemetry back to the controller. This will just be the battery level to start with but that’s a very important thing to know as I trust you’ll agree.

PiWars Postponed, New Plan…

With a heavy heart, PiWars has been postponed, and a Virtual PiWars has been announced for April 25th prior to a full competition being run. This is very much the right thing to do and I totally support it. One thing this does mean is more time to overcomplicate, er, expand functionality! Here are a few things I’ve been thinking about that I’m going to investigate further.

I’ll start running through these in order, based on the MoSCoW priority system. It’s a simple way of prioritising tasks.

This page is going to be updated each time a task is done with relevant links added to blog posts or videos.

To MoSCoW!

Must have, Should have, Could have, Won’t have. A logical follow-up to my previous post on the MVP of MFP.

Must Haves

The robot must:

  • Be drivable by remote control
    • Motion control: done 29/3/20
    • Arm control
      • Manual control
      • Preset positions for grab/lift/drop
    • Head control: done 31/3/20
    • Weapon control
  • Sensors
    • Perimeter distance sensors
    • Line follow/cliff sensors on bumpers
    • Video feed over WiFi
  • Blinkenlights!
  • On robot battery charging (BMS): charge port added 14/3/20
  • Controller internal batteries/charging (BMS): done 29/3/20

Should Haves

The robot should:

  • Provide a stereo feed to a Google Cardboard enabled phone
  • Provide sensor feedback over WiFi to the controller
  • Arms upgraded with compliant joints
    • I might be able to do this by modifying the servos to give me an analogue output from the potentiometer, similar to these.
    • Proof of concept video here! (14/3/2020)
  • A voicebox!
  • Power monitoring on the battery, displayed on controller
    • Voltage
    • Current
    • Charging state (LED on robot too)

Could Haves

  • Full motion and arm control using puppeteering rig
  • Computer vision for
    • Line following
    • Barrel sorting
  • Automation
    • SLAM/Similar
    • Navigation for the blind maze
    • Navigation for barrel sorting
    • Target detection for Zombie Defense

Must Not

  • Be on fire
  • Break down, unless incredibly funny
  • Gain self awareness and take over the world (unless it has a good plan for public services)

MFP MVP TBD…

MacFeegle Prime, Minimal Viable Product, To Be Decided…

Time’s pressing, and though I started with lofty goals I need to set a minimum that I’ll be happy with and that’s achievable; in software engineering (and probably other fields) we refer to this as the minimum viable product.

The Challenges

There are seven challenges in PiWars: one is autonomous only, a few are optionally autonomous for extra points, and some are suggested as remote control but can be done autonomously for bragging rights. The challenges are as follows:

  • Autonomous Only
    • Lava Palaver – Line Following
  • Remote Control/Autonomous Optional
    • Eco Disaster – Barrel Sorting
    • Escape Route – a blind maze
    • Minesweeper – find and disarm red squares
  • Remote Control
    • Pi Noon – Robot jousting!
    • Zombie Apocalypse – shooting gallery
    • Temple of Doom – Obstacle course!

Required Sensors

This robot will be powered by a StereoPi so it will have the capability for computer vision; whether I’ll be in a position to learn how to do that is a different matter. So what are the simplest sensors I can use to solve these problems?

Line Following

The simplest way to do this is an array of light sensors pointing down along the front bumper. The line will be brighter than the surrounding surface, so you can sense how far from centre you are and change your steering accordingly. I’ve a load of IR distance sensors from the ill-fated version one of the shooting gallery that I can press into service.
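
As a sketch of that steering logic (the functions below are illustrative stand-ins, not real library code; it assumes an odd number of sensors with brighter readings over the line):

    # Proportional line following over an array of downward-facing light sensors.

    def line_offset(readings):
        """Weighted centroid of brightness: 0.0 when the line is dead centre,
        negative when it's off to the left, positive when off to the right."""
        half = len(readings) // 2
        positions = range(-half, half + 1)
        total = sum(readings)
        if total == 0:
            return 0.0  # line lost entirely; hold the current heading
        return sum(p * r for p, r in zip(positions, readings)) / total

    def motor_speeds(readings, speed=0.5, gain=0.3):
        """Steer towards the line: speed one side up, slow the other down."""
        error = line_offset(readings)
        return speed + gain * error, speed - gain * error

    # Example: line slightly right of centre -> left motor speeds up.
    print(motor_speeds([0, 0, 200, 900, 400]))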

Blind Maze

For this I’ll need a bunch of distance sensors arrayed around the robot. I have used ultrasonic sensors in the past, but they’re physically quite large and the other competitors mentioned a better option. The VL53L0X is a small laser ranging (lidar) sensor that runs over i2c; it can run in continuous mode and you can request a reading on the fly. Being physically smaller, it’ll be easier to have more of them arrayed around the robot, but they do have a few downsides.

First off, all of these have the same i2c address by default, so you have to either change the address on boot-up, which requires a wire per sensor, or use an i2c multiplexor, which requires a few wires per sensor. I heard from one of my fellow competitors that the former was ropey at best when they tried it in the past, so multiplexor it is!
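
As a sketch of what the multiplexor route looks like in code, here’s how I’d expect it to go using Adafruit’s CircuitPython libraries (adafruit-circuitpython-vl53l0x and adafruit-circuitpython-tca9548a via Blinka, with a TCA9548A multiplexor); I’ve not verified this exact setup myself:

    import board
    import adafruit_tca9548a
    import adafruit_vl53l0x

    i2c = board.I2C()                      # the Pi's default i2c bus
    tca = adafruit_tca9548a.TCA9548A(i2c)  # multiplexor, default address 0x70

    # Each tca[n] channel behaves like its own i2c bus, so every sensor
    # can keep the default VL53L0X address.
    sensors = [adafruit_vl53l0x.VL53L0X(tca[channel]) for channel in range(4)]

    for channel, sensor in enumerate(sensors):
        print("Sensor %d: %d mm" % (channel, sensor.range))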

The other downside is that the performance of these sensors depends a great deal on the surface they’re reflecting from, white is the best for reflectance and black the worst, guess which colour the walls are at PiWars?

In finding links for this post I just spotted these ultrasonic rangefinders which are much smaller, pricy but they’d certainly do the job.

Minesweeper

The way this challenge works is that the robot is placed on a 4×4 grid that is lit from underneath. One of the squares will be lit up red, and if the robot moves to and stands on it, it’ll be defused. For a pure brute-force method you can use a colour sensor facing down on the bumper: have the robot bimble around at random, much like the early Roombas, and when it detects red, stop until the colour changes.
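
As a sketch of that brute-force loop (drive(), stop() and read_colour() below are hypothetical stubs standing in for the real motor and colour sensor code):

    import random
    import time

    def read_colour():
        return "white"  # stub: wire up the real colour sensor here

    def drive(speed, turn):
        pass            # stub: set motor speeds from speed and turn

    def stop():
        pass            # stub: all motors off

    def wander_until_defused():
        """Bimble around at random; park on anything red until it changes."""
        while True:
            if read_colour() == "red":
                stop()
                while read_colour() == "red":
                    time.sleep(0.1)  # sit on the mine until it's defused
            else:
                drive(speed=0.3, turn=random.uniform(-1.0, 1.0))
                time.sleep(0.5)      # commit to each heading briefly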

It’s not efficient, but it could work. I’m not sure if the extra points for doing it autonomously would be more fruitful than getting more mines by driving manually. I’ve seen someone post a proof of concept for doing this using computer vision so for this one I’ll go with manual with computer vision being the stretch goal.

Remote Control

The bulk of the challenges will be done manually, so we’re going to need a suitable controller. Ideally I’d have a full waldo controller and VR headset as per my aspiration, but I need to be more realistic. As a very basic method of control I have an Xbox controller rigged up to a Raspberry Pi with a display. It’ll connect via a WiFi hotspot, likely my phone, and issue commands over TCP. With the analogue sticks of the Xbox controller I’ll be able to control the movement with one stick and the head (cameras) with the other, much like in a first person shooter. If arm control using the sticks proves too tricky, I can preprogram a few positions: press one button to put the hand in front of the bot, another to close the hand, another to raise it slightly… It’d be all I need for the barrel challenge, but it wouldn’t be using the arms to the fullest.

Conclusion

We have a plan, or at least the idea of a plan… Having a smaller set of more constrained targets is a good focus, now I just need to get over this damn lurgy and get some energy back!

How Not To Build A Robot

I’ve gained a lot of experience over the last few months with regards to Fusion 360, 3d printing, electronics and more besides. I thought I’d share some of those lessons.

As Complex As You Make It

The most important lesson, as with any project, is to have an idea of what you’re building from the start and how long you have to build it. If it’s a relatively simple design, there will still be a lot of issues you’ll come across that will take added time to figure out, doubly so if you’re learning as you go. My robot concept was complex to start with, more so than I expected, and I had a lot more to learn than I realised too. However long you think you need, add more and if possible simplify your design.

In retrospect, more of a plan than a quick sketch wouldn’t have gone amiss…

I had a bunch of early wins; I used existing parts from an RC car to make early proofs of concept, which sped things up, and this gave me a little too much confidence. I was designing elements in Fusion 360 in isolation, assuming they’d work, and that burnt me a lot. I went through a number of different chassis designs as prototypes in the early steps, and it wasn’t until I realised I needed a more complete design done in CAD, to see how everything fitted together, that I started saving an awful lot of time. I’m still not great at this but certainly getting better.

Longer term I need to learn how to do joints in Fusion 360 so that I can actually see how things fit together and what constraints there are.

I wasted a lot of time in what amounted to designing seven different robots. I couldn’t have got to where I am without doing it, though, so it’s a difficult balance to strike.

Seriously, Make A List. Then Check it Again…

I had the vague idea that I’d have the StereoPi up top in the head for stereo vision; this would give a lot of opportunities for computer vision too. Around the chassis would be a ring of sensors. Ultrasonics were what I had in mind to start with, and though simple to work with they’re quite large; I didn’t really know better, so that’s what I went with. Later on I learned of the VL53L0X, which is a really cheap lidar sensor and a lot smaller too. They have the quirk of sharing the same i2c address by default, so you need to use i2c multiplexors or have them connected in such a way as to reset their addresses on first boot… More complexity!

Again, we’ve all got PhDs in hindsight, but having a more solid plan and spending more time on research and planning in the early stages would’ve paid off in the long run.

Burnout

Look. After. Yourself.

As I mentioned earlier I had lots of early successes, which gave me an awful lot of false confidence; as soon as the easy wins came and went and the real struggle began, the build got a lot more difficult, both technically and mentally. Those who know me or have been reading the blog for a while will know I suffer from Anxiety and Depression; they’re a bugger individually, but when they join forces they’re truly evil. A few weeks before I applied to enter PiWars my beloved cat, Willow, passed away. To say this was hard on me is an understatement; coupled with the year tailing off, getting darker and colder, and things going from win after win to struggle after struggle, things got rough.

I tried to push through it; that was a big mistake. I made the best decision for the project, which was to take a breath and start again. With a lot of support from my girlfriend, the rest of the PiWars community, friends, family, and colleagues alike, I slowly got out of the funk while making slow but consistent progress. The Epic Rebuild Began.

Conclusions and Next Steps

I’ve learned a lot, come an awful long way in many regards, and though I’ve still a lot to do I’m in a better place and so is the robot. The next steps are to get the controller up and running and the robot drivable again.

In the next blog post I’ll talk about the plans for the challenges. As it stands I’ve almost finished one arm and only need to complete the hand, add a bunch of sensors, and add remote control. I have a minimum spec in sight and will at least be able to compete.