PiWars 2024: The Challenges and Their Challenges

There are seven challenges at the in-person event for PiWars 2024, each with its own problems to solve. They are split into three categories: Autonomous Only, Autonomous/Remote, and Remote Only. For the second of those, if you are in the Advanced/Professional category, you have to attempt the challenge autonomously.

State of the Onion

NE-Five exists as a robot: the base hardware is all there and everything has met the minimum requirements as I’ve set them. The devil is in the detail though, and integration hell is totally a thing. Also, perfect is the enemy of done, and I really need to focus on what needs doing now versus what would be nice to have.

Motor control

Motor control has been overhauled with the switch from the Red Robotics RedBoard HAT to the Pimoroni Yukon. The big difference is that the latter has support for encoders, but it also has on-board processing, which takes that load off the Raspberry Pi. I also have a ROS node for the Yukon, so it can send and receive ROS messages such as motor velocity commands and odometry.
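
To make that concrete, here’s a minimal sketch of what such a bridge node can look like, assuming ROS 1 (Noetic) and rospy; the topic name, serial port, and command format are illustrative assumptions rather than the actual NE-Five code:

import rospy
import serial
from geometry_msgs.msg import Twist

# Hypothetical serial link to the Yukon; port and protocol are made up
yukon = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)

def on_cmd_vel(msg):
    # Forward the requested velocities; the Yukon runs the PID loops
    # against its encoders on-board, so the Pi's job ends here
    yukon.write(f"V {msg.linear.x:.3f} {msg.angular.z:.3f}\n".encode())

rospy.init_node("yukon_bridge")
rospy.Subscriber("cmd_vel", Twist, on_cmd_vel)
rospy.spin()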

Servo Control

I’ve implemented a new ROS node that not only takes commands to move the robot’s arms but also provides joint state feedback to the wider ROS system. This includes the neck servos, as they are the same Dynamixel smart servos used in the wrists. The linear actuator that adjusts how high the robot stands has also been hooked up to the Yukon, and its feedback line is factored in too. Each of the smart servos is independent of the Pi as well: you give them a command and they do it, handling any PID loops and monitoring internally as needed.
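
As a rough illustration of how little the Pi has to do, here’s a sketch using the Dynamixel SDK; the servo ID, port, and control table addresses (Protocol 2.0, X-series) are assumptions, not NE-Five’s actual configuration:

from dynamixel_sdk import PortHandler, PacketHandler

ADDR_TORQUE_ENABLE = 64      # X-series control table (Protocol 2.0)
ADDR_GOAL_POSITION = 116
ADDR_PRESENT_POSITION = 132
DXL_ID = 1                   # placeholder servo ID

port = PortHandler("/dev/ttyUSB0")
packet = PacketHandler(2.0)
port.openPort()
port.setBaudRate(57600)

# One command and the servo takes care of its own PID loop
packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)
packet.write4ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 2048)

# Read the position back to publish as joint state feedback
position, result, error = packet.read4ByteTxRx(port, DXL_ID, ADDR_PRESENT_POSITION)
print(f"present position: {position}")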

Camera System

I’m still using the Luxonis FFC-3P camera system, with two wide-angle global shutter cameras and a narrower-field rolling shutter camera in the centre position. Luxonis have recently released a big update that includes on-device pointcloud generation. Previously I was trying to do this on the Pi, and it was taking basically all its resources just to do that. Having this board run tasks itself and provide only the data the Pi needs is a big win for sure.
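
A minimal pipeline for the on-device pointcloud looks roughly like this (recent DepthAI releases provide a PointCloud node; the camera socket assignments here are assumptions):

import depthai as dai

pipeline = dai.Pipeline()
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
pcl = pipeline.create(dai.node.PointCloud)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("pcl")

mono_left.setBoardSocket(dai.CameraBoardSocket.CAM_B)   # assumed sockets
mono_right.setBoardSocket(dai.CameraBoardSocket.CAM_C)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)
stereo.depth.link(pcl.inputDepth)
pcl.outputPointCloud.link(xout.input)

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue("pcl", maxSize=4, blocking=False)
    points = queue.get().getPoints()   # Nx3 XYZ array, computed on-device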

I’ve also been playing around with object recognition. It works, and you can run custom neural networks on the device, which, again, means the Pi doesn’t have to do anything but use the data it produces.
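
For illustration, an on-device detection pipeline in DepthAI looks something like the sketch below; the model blob and preview size are placeholders for whatever custom network you’ve trained:

import depthai as dai

pipeline = dai.Pipeline()
cam = pipeline.create(dai.node.ColorCamera)
nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")

cam.setPreviewSize(300, 300)        # must match the network's input
nn.setBlobPath("custom_model.blob") # placeholder for your trained model
nn.setConfidenceThreshold(0.5)
cam.preview.link(nn.input)
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue("detections")
    for det in queue.get().detections:
        print(det.label, det.confidence, det.xmin, det.ymin, det.xmax, det.ymax)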

Back to the Challenges…

The robot works great in remote control mode; however, it’s currently the only mode, which isn’t ideal. There are five challenges I’ll have to tackle autonomously, so I’ll concentrate on those for now.

Lava Palava

This is a line-following drag race: the course has a black floor with a white strip down the middle, which the robot has to follow as quickly as possible. The course keeps the chicane from previous years, but this year will also have a speed hump.

With the motor encoders providing feedback on distance travelled and the camera system able to detect objects, I intend to combine the two: aim for a goal X metres in front of the robot and follow the line until it has travelled that far. Or until I push the e-stop button if it tries to run away…
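
As a sketch of the line-following half of that plan (not the actual implementation), OpenCV can reduce each frame to a steering value, with the odometry deciding when to stop:

import cv2
import numpy as np

COURSE_LENGTH_M = 7.0  # made-up course length; stop once odometry exceeds it

def steering_from_frame(frame_bgr):
    """Return a steering value in [-1, 1] from the white line's offset."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    roi = mask[int(mask.shape[0] * 2 / 3):, :]  # bottom third, nearest the robot
    m = cv2.moments(roi)
    if m["m00"] == 0:
        return 0.0  # line lost; hold course (or hit the e-stop!)
    cx = m["m10"] / m["m00"]
    offset = (cx - roi.shape[1] / 2) / (roi.shape[1] / 2)
    return float(np.clip(offset, -1.0, 1.0))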

Eco-Disaster

In this challenge you have to sort a number of red and green barrels into blue or yellow zones of the arena. Starting in one corner, NE-Five should be tall enough to detect all the objects it needs to look out for, and the start position and the two sorting zones are known locations, which helps. Using a similar setup to Lava Palava, I should be able to use the detected barrels to position the robot relative to them so it can pick them up. Using odometry, and being able to see the coloured sorting zones, it should then be able to navigate over and drop them off as needed.

Escape Route

This challenge has to be run without the robot’s operator being able to see the arena directly. For remote control, that means using cameras or having someone shout out commands; for autonomous, the operator still needs to be behind a screen but only presses “go” and hopes for the best.

The arena will be in a randomly selected configuration out of six possible layouts. There are three coloured blocks, each of known dimensions. The plan is that as soon as the challenge begins, the robot scans to see which block is closest and adds a waypoint to get past it. After it gets there, or while en route, it can look for the next block and figure out its next steps too.

Similar to Lava Palava, where the robot aims for a point a certain distance ahead, the end goal here is past the yellow line, with intermediate waypoints to get around each block. The depth camera already has an option to convert a depth image to a laser scan, so it should be relatively easy to detect a clear path.
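
Given a laser scan, picking a clear path can be as simple as finding the most open heading; a rough sketch, with the smoothing window and clearance threshold as made-up parameters:

import numpy as np

def clearest_heading(ranges, angle_min, angle_increment, min_clearance=0.5):
    """Return the angle (radians) of the most open direction in a scan."""
    r = np.asarray(ranges, dtype=float)
    r = np.where(np.isfinite(r), r, 0.0)       # treat invalid returns as blocked
    smoothed = np.convolve(r, np.ones(5) / 5, mode="same")  # ignore lone spikes
    best = int(np.argmax(smoothed))
    if smoothed[best] < min_clearance:
        return None                            # nowhere safe to go
    return angle_min + best * angle_increment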

Minesweeper

For this one, the robot will have to look for an illuminated red square and move to it. Once it has visited that square another will light up, and the process repeats. As with Escape Route and Lava Palava, I’ll be looking for a specific colour, but this time I’ll be setting the square itself as the waypoint to move towards. Once the robot detects it’s on top of a red square, it’ll stop. When the red switches off, it’ll start looking around for anything of the same colour; the wide-angle stereo cameras should give a good field of view for this, and the odometry once again comes into play.
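
Colour detection for the squares is standard OpenCV territory; a sketch, with HSV thresholds that would need tuning to the arena lighting:

import cv2

def red_square_centre(frame_bgr):
    """Return the (x, y) pixel centre of the largest red blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges
    mask = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x + w // 2, y + h // 2)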

Zombie Apocalypse

This one is currently the biggest unknown, as I don’t have a projectile launcher for the robot yet. I do have a pile of parts, however… Sample designs for the zombie targets have been released, so I’m planning to detect those and use them for the coordinates. I have parts from an electrically fired Nerf gun, so I’ll mount that on a pair of servos for pan/tilt and use them to aim at the target. I also have a green laser for this, so hopefully I’ll be able to detect when the laser is within an area at the centre of the target before firing.
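
The aiming loop could be as simple as proportional control on the target’s pixel offset; a sketch, with a made-up gain:

def aim_step(target_px, frame_size, pan, tilt, gain=0.02):
    """Nudge the pan/tilt angles (degrees) towards a target's pixel position."""
    tx, ty = target_px
    w, h = frame_size
    pan += gain * (tx - w / 2)   # proportional control on the image error
    tilt -= gain * (ty - h / 2)
    return pan, tilt

# Fire only once the detected laser dot sits within a small window around
# the target centre, e.g. abs(dot_x - tx) < 10 and abs(dot_y - ty) < 10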

Ifs, Buts, And Maybes…

Other than the last challenge, I pretty much have everything in place. The devil’s in the details with these things, but I’m in a considerably better position than at any previous competition, which is a great feeling. What are the priorities, though?

The Toad List

What needs doing?

  • Nerf gun and mounting hardware
  • Camera to provide coordinates of:
    • A white line; it’ll have length rather than being point data, so probably just “make sure the white line is in the middle of the view”
    • A zombie; there will be multiple at different heights, with the higher ones worth more points. Primarily we’ll need the X,Y coordinates, but detecting distance will help ensure we’re detecting the right things, as they’ll all be on one plane.
    • Coloured boxes; these will be used as signposts and will need to be avoided. Depth-to-laserscan for obstacle detection.
    • Coloured barrels to pick up and navigate around; this will need pose estimation.
    • Coloured flooring, for both Minesweeper and Eco-Disaster
  • Arm control, to ensure its coordinates are in the same system as the camera’s; this is for picking up the barrels
  • Waypoint system; hook into odometry to have the robot follow a path (see the sketch after this list)
  • Robot pose estimation; where is the robot and which way is it pointing?
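
For the waypoint item, a minimal follower can be a couple of proportional gains on distance and heading error; a sketch, with placeholder gains and tolerances:

import math

def drive_to_waypoint(pose, waypoint, k_lin=0.5, k_ang=1.5):
    """One control step towards a waypoint, given the odometry pose.

    pose is (x, y, heading_rad); waypoint is (x, y).
    Returns (linear_velocity, angular_velocity) for the motor controller.
    """
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - pose[2]
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    if distance < 0.05:          # close enough; move on to the next waypoint
        return 0.0, 0.0
    return k_lin * distance, k_ang * heading_error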

There is a common theme in that a lot of the challenges have overlapping needs, but there’s still a lot of work to do.

Loads of time though, right?

Installing RedBoard+ Library on Ubuntu

Macfeegle Prime is heading back to PiWars!

A lot has happened in that time; ignoring the whole *gestures broadly at the world*, a lot has moved on in software terms. We now have Ubuntu 20.04 for the Pi, ROS Noetic has been released for it, and Approximate Engineering has rewritten the RedBoard+ library from scratch.

This library also includes support for the servo board I’m using, meaning all the servos can be controlled from the same library. It’ll be one less thing for the Teensy to do, so I thought I’d start here.

Notes

Installation of the library using “pip install redboard” works a treat, and it includes a GUI that lets you tweak the configuration. If you try running “redboard-gui” on Ubuntu, however, you’ll find the pigpio daemon isn’t included; I had to install it from source, so follow the instructions here.

The next issue I hit was that I didn’t have permission to access I2C. With Raspbian you can run raspi-config to enable and disable access to various interfaces, but that isn’t available on Ubuntu. I followed these steps to get it working:

First create a new file using nano:

sudo nano /lib/udev/rules.d/60-i2c-tools.rules

Then add the following lines:

KERNEL=="i2c-0" , GROUP="i2c", MODE="0660" 
KERNEL=="i2c-[1-9]*", GROUP="i2c", MODE="0666"

Reboot, then run “sudo pigpiod” followed by “redboard-gui” and you should get the following:

[Screenshot: redboard-gui running in a terminal window]

You’ll need to run “sudo pigpiod” on every startup; to automate that, run “sudo crontab -e” and add the following to the end of the file:

@reboot /usr/local/bin/pigpiod

Reboot and you’re good to go!
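
From there the library itself is pleasingly terse; as I recall, motors and servos are exposed as properties, though do check the docs as these attribute names may not be exact:

import redboard

board = redboard.RedBoard()
board.m0 = 0.5            # motor 0, half speed forward (-1.0 to 1.0)
board.m1 = 0.5            # motor 1
board.s20 = 0.0           # centre a servo on GPIO 20
board.m0 = board.m1 = 0   # stop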

How Not To Build A Robot

I’ve gained a lot of experience over the last few months with regards to Fusion 360, 3D printing, electronics and more besides. I thought I’d share some of those lessons.

As Complex As You Make It

The most important lesson, as with any project, is to have an idea of what you’re building from the start and how long you have to build it. Even with a relatively simple design, there will be plenty of issues that take added time to figure out, doubly so if you’re learning as you go. My robot concept was complex to start with, more so than I expected, and I had a lot more to learn than I realised too. However long you think you need, add more, and if possible simplify your design.

In retrospect, more of a plan than a quick sketch wouldn’t have gone amiss…

I had a bunch of early wins: I used existing parts from an RC car to make early proofs of concept, which sped things up, and this gave me a little too much confidence. I was designing elements in Fusion 360 in isolation, assuming they’d work, and that burnt me a lot. I went through a number of prototype chassis designs in the early stages, and it wasn’t until I realised I needed a more complete design in CAD, to see how all the parts fitted together, that I started saving an awful lot of time. I’m still not great at this but certainly getting better.

Longer term I need to learn how to do joints in Fusion 360 so that I can actually see how things fit together and what constraints there are.

I wasted a lot of time in what amounted to designing seven different robots. I couldn’t have got to where I am without doing it, though, so it’s a difficult balance to strike.

Seriously, Make A List. Then Check it Again…

I had a vague idea that I’d have the Stereo Pi up top in the head for stereo vision, which would open up a lot of opportunities for computer vision too. Around the chassis would be a ring of sensors; ultrasonics were what I had in mind to start with, but though they’re simple to work with, they’re quite large. I didn’t really know better, so that’s what I went with. Later on I learned of the VL53L0X, a really cheap time-of-flight laser-ranging sensor that’s a lot smaller too. They have the quirk of sharing the same I2C address by default, so you need to use I2C multiplexers or wire them up in such a way that you can reset their addresses on first boot… More complexity!
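
For reference, the first-boot re-addressing trick looks something like this with the Adafruit CircuitPython library; the XSHUT pin assignments are placeholders:

import board
import busio
import digitalio
import adafruit_vl53l0x

i2c = busio.I2C(board.SCL, board.SDA)

# Hold every sensor in reset via its XSHUT line (pins are placeholders)
xshut = [digitalio.DigitalInOut(p) for p in (board.D17, board.D27, board.D22)]
for pin in xshut:
    pin.switch_to_output(value=False)

sensors = []
for i, pin in enumerate(xshut):
    pin.value = True                           # wake one sensor at a time
    sensor = adafruit_vl53l0x.VL53L0X(i2c)     # it boots at the default 0x29
    sensor.set_address(0x30 + i)               # move it to a unique address
    sensors.append(sensor)

for i, sensor in enumerate(sensors):
    print(f"sensor {i}: {sensor.range} mm")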

Again, we’ve all got PhDs in hindsight, but having a more solid plan and spending more time on research and planning in the early stages would’ve paid off in the long run.

Burnout

Look. After. Yourself.

As I mentioned earlier, I had lots of early successes which gave me an awful lot of false confidence; as soon as the easy wins came and went and the real struggle began, the build got a lot more difficult, both technically and mentally. Those who know me or have been reading the blog for a while will know I suffer from anxiety and depression; they’re a bugger individually, but when they join forces they’re truly evil. A few weeks before I applied to enter PiWars, my beloved cat, Willow, passed away. To say this was hard on me is an understatement, and coupled with the year tailing off, getting darker and colder, and things going from win after win to struggle after struggle, things got rough.

I tried to push through it; that was a big mistake. So I made the best decision for the project, which was to take a breath and start again. With a lot of support from my girlfriend, the rest of the PiWars community, friends, family, and colleagues alike, I slowly got out of the funk while making slow but consistent progress. The Epic Rebuild began.

Conclusions and Next Steps

I’ve learned a lot and come an awful long way in many regards, and though I’ve still a lot to do, I’m in a better place and so is the robot. The next steps are to get the controller up and running and the robot drivable again.

In the next blog post, I’ll talk about the plans for the challenges. As it stands I’ve almost finished one arm, and I only need to complete the hand, then add a bunch of sensors and remote control. I have a minimum spec in sight and will at least be able to compete.

Project Bumblebee

Bumblebee?


Bumblebee is my Roomba, so named because long ago he lost his voice. About a year ago his logic board started playing up, and though he was still able to clean, at the end of each cleaning cycle he wouldn’t go into standby and his battery would drain in no time. At that point he stopped actively gathering dust and started doing it passively as he sat behind my sofa.

Bumblebee MK2

Since I was a kid I’ve always wanted to build a robot, so I figured I’d kill two birds with one stone and use Bumblebee as the chassis for a mobile robot (he already is one, after all), while also aiming to restore his base functionality as a robot hoover.

The Plan

Bumblebee is an original-model Roomba from 2002. He was a gift from a friend who knew I loved to tinker and gave him to me broken; if I could fix him, I could keep him. Thankfully it was an easy fix, as the battery was dead. This model is very simple in its function and behaviour: it has no mapping capability, no dock, and no wireless control. It apparently can be controlled using IR, but I’ve never had a remote. It also lacks the diagnostics port that the newer models have, which makes hacking a Roomba really easy now, so this is going to be a bit trickier, a lot more educational, and most importantly more fun!

The parts I’ve used to partially resurrect him are an Arduino Leonardo and an Adafruit Motor Controller Shield. I’ve also got a Raspberry Pi 3 to add into the mix, for WiFi control and more processor-intensive tasks. The idea is to use the two to their strengths: the Arduino will control the motors and read the sensors, allowing for real-time override in case of collision, while the Pi can sit back and give out the orders. It’s a classic split for mobile robots, and thankfully very cheap to implement now.

Current State

As I said, I’ve been working on this for a while, and I’ve a load of notes to type up and loads of “learning experiences” to share. Mostly from when I made a rookie error and burnt out one of the motor controllers… I’ve now got the motors under control over serial, and I’ve also written a simple console application that lets me drive him around and toggle the sweeper/vacuum fans. Here’s a video of him in action:
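
That console application boils down to mapping keys to serial commands; a rough sketch of the idea, with a made-up command protocol rather than whatever the Arduino actually speaks:

import serial

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)

# Made-up command strings: "M <left> <right>" for motor power, "V" for fans
key_to_command = {
    "w": "M 150 150",    # forward
    "s": "M -150 -150",  # reverse
    "a": "M -100 100",   # spin left
    "d": "M 100 -100",   # spin right
    "x": "M 0 0",        # stop
    "v": "V",            # toggle sweeper/vacuum fans
}

while True:
    key = input("> ").strip().lower()
    if key in key_to_command:
        arduino.write((key_to_command[key] + "\n").encode())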

Next Steps

My next item to look at is getting sensor data into the Arduino, starting with the encoders. Encoders are devices that measure the movement of a wheel (you’ve likely used a rotary encoder on a hi-fi to control the volume), and the Roomba has one in each wheel. Right now I can control how much power goes to each wheel, but because of differences in the state of the gearboxes, the carpet, and who knows what other factors, the wheels spin at different speeds. By measuring the speed from the encoders we can compensate for this; we can also use them to calculate the distance we’ve travelled.
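
A rough sketch of that compensation idea (in Python for brevity; the real code will live on the Arduino), with a placeholder gain:

def balance_power(base_power, left_ticks, right_ticks, k=2.0):
    """Return (left_power, right_power) trimmed so tick rates converge."""
    error = left_ticks - right_ticks          # positive: left wheel running fast
    left = base_power - k * error
    right = base_power + k * error
    clamp = lambda p: max(-255, min(255, p))  # 8-bit PWM range
    return clamp(left), clamp(right)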

After that comes the rest of the sensors; those I’ve found so far are:

  1. Cliff sensors – these are under the bumper and detect drops to prevent him falling down stairs. I think there are four of them, and they appear to be IR distance sensors.
  2. Bumper sensors – these detect collisions. I think there is one at either end of the bumper, so I’ll know if I’ve hit something to the left or right.
  3. Wall sensor – another IR distance sensor, mounted on the right of the bumper; this allows for wall-following behaviour.
  4. Wheel up/down switches – one on each of the drive wheels and one on the caster at the front. They detect whether the wheels are up or down, which can be handy for detecting when we’ve beached ourselves.
  5. Wheel encoders – these were IR LEDs and a light-dependent resistor. I blew one of the LEDs by accident, so replaced them both with red LEDs.
  6. Beacon IR receiver – not sure how this works yet; it’s a 360° lens on the top that receives a beam from the virtual wall, a device you place by your door to stop him escaping. I’m hoping to add more sensors to make this redundant.
  7. Buttons – there are three buttons, one per room size, to start different cleaning cycles. They could be useful, though I may not bother with them.

Once I’ve got all the sensors connected, I’ll be able to hook up the Raspberry Pi and start working on reproducing his original behaviour. After that I’ll be able to build up his capabilities over time and add a load of new ones too. I’m not intending this to be just a hoover, but a mobile robot platform that happens to be able to hoover.

If you’ve got this far, kudos! That’s it for now; more posts will trickle out as I write up my old notes. I’m going to carry on having fun building things and write posts in the order they happened. Hopefully I’ll catch up fairly quickly!