PiWars 2024: The Challenges and Their Challenges

There are seven challenges at the in-person event for PiWars 2024, each with its own problems to solve. They are split into three categories: autonomous only, autonomous/remote, and remote only. For the second of those, if you are in the Advanced/Professional category you have to attempt the challenge autonomously.

State of the Onion

NE-Five exists as a robot: the base hardware is all there and everything has met the minimum requirements as I’ve set them. The devil is in the detail though, and integration hell is totally a thing. Also, perfect is the enemy of done, so I really need to focus on what needs doing now versus what would be nice to have.

Motor control

Motor control has been overhauled with the switch from the Red Robotics RedBoard HAT to the Pimoroni Yukon. The big difference is that the latter has support for encoders, but it also has on-board processing, which takes that load off the Raspberry Pi. I also have a ROS node for the Yukon so it can send and receive ROS messages such as motor velocity commands and odometry.
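
For anyone curious what that bridge looks like, here’s a minimal sketch of the Pi-side node, written against ROS 1’s rospy to match the tutorials elsewhere on this blog; the serial port and the “V …”/“O …” message framing are made up for illustration, not the actual firmware protocol.

#!/usr/bin/env python3
# Sketch of a Pi-side bridge node: forwards /cmd_vel to the Yukon over
# serial and republishes the encoder feedback as /odom. The serial
# framing ("V lin ang" out, "O x y theta" back) is illustrative only.
import rospy
import serial
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry

port = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)

def on_cmd_vel(msg):
    # Send the requested linear/angular velocity down to the RP2040.
    port.write("V {:.3f} {:.3f}\n".format(msg.linear.x, msg.angular.z).encode())

rospy.init_node("yukon_bridge")
rospy.Subscriber("cmd_vel", Twist, on_cmd_vel)
odom_pub = rospy.Publisher("odom", Odometry, queue_size=10)

while not rospy.is_shutdown():
    line = port.readline().decode().strip()
    if line.startswith("O "):            # odometry report from the Yukon
        _, x, y, theta = line.split()
        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.pose.pose.position.x = float(x)
        odom.pose.pose.position.y = float(y)
        # (converting theta to a quaternion is omitted for brevity)
        odom_pub.publish(odom)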

Servo Control

I’ve implemented a new ROS node that not only takes commands to move the robot’s arms but also provides joint state feedback to the wider ROS system. This includes the neck servos too, as they are the same Dynamixel smart servos used in the wrists. The linear actuator that adjusts how high the robot is standing has also been hooked up to the Yukon, and its feedback line is factored in as well. Each of the smart servos is independent of the Pi: you give them a command and they carry it out, handling any PID loops and monitoring internally as needed.
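
As a rough illustration of the feedback half, here’s a sketch of a joint state publisher using the Dynamixel SDK. The register address and tick scaling are for the X-series Protocol 2.0 servos; the servo IDs, joint names, and port are placeholders.

#!/usr/bin/env python3
# Sketch of a joint state publisher for Dynamixel servos. Present
# position on X-series Protocol 2.0 servos is register 132, with 4096
# ticks per revolution centred on 2048.
import math
import rospy
from sensor_msgs.msg import JointState
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_PRESENT_POSITION = 132
JOINTS = {"wrist_left": 1, "wrist_right": 2, "neck_pan": 3, "neck_tilt": 4}

port = PortHandler("/dev/ttyUSB0")
packet = PacketHandler(2.0)
port.openPort()
port.setBaudRate(1000000)

rospy.init_node("arm_joint_states")
pub = rospy.Publisher("joint_states", JointState, queue_size=10)
rate = rospy.Rate(20)

while not rospy.is_shutdown():
    msg = JointState()
    msg.header.stamp = rospy.Time.now()
    for name, servo_id in JOINTS.items():
        raw, _, _ = packet.read4ByteTxRx(port, servo_id, ADDR_PRESENT_POSITION)
        msg.name.append(name)
        msg.position.append((raw - 2048) * 2 * math.pi / 4096)  # ticks to radians
    pub.publish(msg)
    rate.sleep()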

Camera System

I’m still using the Luxonis FFC-3P camera system, with two wide-angle global shutter cameras and a narrower-field rolling shutter camera in the centre position. Luxonis have recently released a big update that includes on-device pointcloud generation. Previously I was trying to do this on the Pi and it was taking basically all of its resources just to do that. Having this board run tasks itself and only provide the data the Pi needs is a big win for sure.
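
To give a flavour of what that looks like, here’s a rough pipeline sketch condensed from memory of the Luxonis examples; the exact node and method names vary between depthai releases, so treat it as illustrative rather than definitive.

# Rough sketch of an on-device pointcloud pipeline with depthai.
import depthai as dai

pipeline = dai.Pipeline()
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
pcl = pipeline.create(dai.node.PointCloud)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("pcl")

mono_left.setBoardSocket(dai.CameraBoardSocket.CAM_B)
mono_right.setBoardSocket(dai.CameraBoardSocket.CAM_C)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)
stereo.depth.link(pcl.inputDepth)        # pointcloud computed on-device
pcl.outputPointCloud.link(xout.input)    # only the result crosses to the Pi

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue("pcl", maxSize=4, blocking=False)
    points = queue.get().getPoints()     # Nx3 array of XYZ points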

I’ve also been playing around with object recognition. It works, and you can run custom neural networks on the device which, again, means the Pi doesn’t have to do anything but use the data it produces.

Back to the Challenges…

The robot works great in remote control mode; however, it’s currently the only mode, which isn’t ideal. There are five challenges I’ll have to tackle autonomously, so I’ll concentrate on those for now.

Lava Palava

This is a line-following drag race: the course has a black floor with a white strip down the middle, which the robot has to follow as quickly as possible. The course keeps the chicane from previous years, but this year will also have a speed hump.

With the motor encoders providing feedback on distance travelled and the camera system able to detect objects, I intend to combine the two: have the robot aim for a goal X metres in front of it and follow the line until it’s travelled that far. Or until I push the e-stop button if it tries to run away…
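
The line-following half could be as simple as thresholding each frame and steering on the centroid of the white pixels. A minimal sketch, with untuned gains and the frame source left to the caller:

# Sketch of "keep the white line in the middle of the view": threshold
# a frame, find the line's centroid, and turn the offset into steering.
import cv2

def line_steering(frame_bgr, k_turn=0.005):
    """Return (forward, turn) from the white line's horizontal offset."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    roi = gray[int(gray.shape[0] * 0.6):, :]   # look just ahead of the robot
    _, mask = cv2.threshold(roi, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0, 0.0                        # no line visible: stop
    cx = m["m10"] / m["m00"]                   # centroid x in pixels
    error = cx - mask.shape[1] / 2             # positive: line is to the right
    return 0.5, -k_turn * error                # constant forward, proportional turn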

Eco-Disaster

In this challenge you have to sort a number of red and green barrels into blue or yellow areas of the arena. Starting in one corner, NE-Five should be tall enough to detect all the objects it needs to look out for, and the start position and two sorting zones are known locations, which helps. Using a similar setup to Lava Palava, I should be able to use the detected barrels to position the robot relative to them so it can pick them up. Then, using odometry and being able to see the coloured sorting zones, it should be able to navigate over and drop them off as needed.

Escape Route

This challenge has to be run without the robot’s operator being able to see the arena directly. For remote control this means using cameras or having someone shout out commands; for autonomous runs the operator still needs to be behind a screen, but only presses “go” and hopes for the best.

The arena will be in a randomly selected configuration out of six possible layouts. There are three coloured blocks, each with known dimensions. The plan is that as soon as the challenge begins, the robot scans to see which block is closest and adds a waypoint to get past it. After it gets there, or while en route, it can look for the next block and figure out its next steps too.

Similar to Lava Palava, where it’s aiming for a point a certain distance ahead, the end goal will be past the yellow line, with intermediate waypoints to get around each block. The depth camera already has an option to convert a depth image to a laser scan, so it should be relatively easy to detect a clear path.
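
Picking the clear path out of that laser scan could be as simple as finding the widest run of ranges beyond a safety distance and steering for its middle. A sketch of the idea, with placeholder thresholds:

# Find the widest clear gap in a laser scan and return its heading.
import numpy as np

def clearest_heading(ranges, angle_min, angle_increment, clearance=0.5):
    """Return the angle (radians) at the centre of the widest clear gap."""
    clear = np.asarray(ranges) > clearance
    best_len, best_mid, run_start = 0, None, None
    for i, ok in enumerate(np.append(clear, False)):  # sentinel closes last run
        if ok and run_start is None:
            run_start = i
        elif not ok and run_start is not None:
            if i - run_start > best_len:
                best_len, best_mid = i - run_start, (run_start + i - 1) // 2
            run_start = None
    if best_mid is None:
        return None                    # nowhere is clear: stop and rescan
    return angle_min + best_mid * angle_increment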

Minesweeper

For this one, the robot will have to look for an illuminated red square and move to it. Once it has visited that square another will light up, and the process repeats. As with Escape Route and Lava Palava I’ll be looking for the specific colour of the square, but this time I’ll be setting it as the waypoint to move towards. Once the robot detects it’s on top of a red square, it’ll stop. Once the red switches off it’ll start looking around for anything of the same colour; the wide-angle stereo cameras should give a good field of view for this, and the odometry once again comes into play.
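
Spotting the lit square is a colour-thresholding job. Red wraps around the hue axis in HSV, so it takes two bands; the bounds below are typical starting values rather than tuned ones.

# Find the pixel centre of the largest red region in a frame, or None.
import cv2

def find_red_square(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    low = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))      # red near hue 0
    high = cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))  # red near hue 180
    mask = low | high
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]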

Zombie Apocalypse

This one is currently the biggest unknown, as I don’t have a projectile launcher for the robot yet; I do have a pile of parts, however… Sample designs for the zombie targets have been released though, so I’m planning on detecting those to get target coordinates. I have parts from an electrically fired Nerf gun, so I’ll mount those on a pair of servos for pan/tilt and use them to aim at the target. I also have a green laser for this, so hopefully I’ll be able to detect when the laser dot is within an area in the centre of the target before firing.
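
The aiming loop might end up as simple as nudging the servos until the laser dot sits inside the target’s centre region; find_target, find_laser_dot, and the servo/trigger objects below are placeholder hooks rather than real APIs.

# Sketch of the aim-then-fire loop: close the gap between the laser
# dot and the target centre, then pull the trigger.
def aim_and_fire(camera, pan_servo, tilt_servo, trigger, k=0.002):
    while True:
        frame = camera.read()
        target = find_target(frame)        # (x, y) of the zombie centre
        laser = find_laser_dot(frame)      # (x, y) of the green dot
        if target is None or laser is None:
            continue                       # lost sight of one: keep looking
        dx, dy = target[0] - laser[0], target[1] - laser[1]
        if abs(dx) < 5 and abs(dy) < 5:    # dot within the centre region
            trigger.fire()
            return
        pan_servo.move_by(k * dx)          # proportional nudge towards target
        tilt_servo.move_by(k * dy)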

Ifs, Buts, And Maybes…

Other than the last challenge, I pretty much have everything in place. The devil’s in the details with these things, but I’m in a considerably better position than at any previous competition, which is a great feeling. What are the priorities though?

The Toad List

What needs doing?

  • Nerf gun and mounting hardware
  • Camera to provide coordinates of:
    • A white line; it’ll have length rather than being point data, so probably just “make sure the white line is in the middle of the view”
    • A zombie; there will be multiple at different heights, with the higher ones worth more points. Primarily we’ll need the X,Y coordinates, but detecting distance will help ensure we’re detecting the right things, as they’ll all be on one plane.
    • Coloured boxes; these will be used as signposts and will need to be avoided. Depth-to-laserscan for obstacle detection.
    • Coloured barrels to pick up and navigate around; this will need pose estimation.
    • Coloured flooring, for both Minesweeper and Eco-Disaster
  • Arm control, ensuring its coordinates are in the same system as the camera’s; this is for picking up the barrels
  • Waypoint system; hook into odometry to have the robot follow a path.
  • Robot pose estimation, where is the robot and which way is it pointing?

There is a common theme, in that a lot of the challenges have overlapping needs, but there’s still a lot of work to do.

Loads of time though, right?

NE-Five Upgraded? Yukon Bet On It!

Upgrades galore in this latest post! Raspberry Pi 5, Pimoroni Yukon, and a banging sound system?!

I’ve been hard at work with a few upgrades on the robot over the past few months and thought I’d share a long overdue update; if you’re here from the Raspberry Pi blog, welcome! In this post I’ll talk about some of the upgrades to the robot I’ve not covered yet, and will cover each in more detail in future posts. This includes motor control using PID loops, odometry from the encoders, and running a ROS node on a Raspberry Pi Pico/RP2040-based board.

The key upgrades? A Raspberry Pi 5, a Pimoroni Yukon, and a banging sound system! Wait, sound system?!

Raspberry Pi 5

The Raspberry Pi 4 was a massive upgrade from the model 3, the extra RAM being the biggest winner for me, but the CPU was still a bit slow compared to some other single board computers on the market. When the Raspberry Pi 5 was announced it sounded like it was the upgrade I needed! I managed to get on the waiting list only a few hours after the announcement and a month or so later the board arrived.

Plenty of others have covered the Pi 5 in more detail so I’ll not give a review here; needless to say though, it’s made on-robot development *much* easier. I was running VS Code on the robot before, which allowed me to develop on it remotely; this worked, but it has a big overhead that can swamp the Pi’s CPU and RAM if you don’t keep an eye on it. The extra power of the Raspberry Pi 5 makes the experience more responsive, to the point where it feels like I’m coding directly on my desktop. I’ve not scratched the surface of what it can do yet, but I’m getting closer!

Pimoroni Yukon

One of the primary issues to deal with in robotics is controlling motors, interfacing with sensors, and all the other things required to interact with the real world. Typically you have a computer of some kind coordinating all the components of a robot, along with dedicated systems that handle the actual hardware involved. Think of a robot as if it’s a ship: you have the captain in overall control, but they delegate specific tasks to others, who only interrupt them when something needs attention. They’ll tell the engine room what speed to go at, and the engineers deal with it unless there’s a problem with the engines. They may also have a lookout whose job it is to watch radar, sonar, or other sensors, and only tell the captain when something will cause a problem.

In this instance the Raspberry Pi 5 is the captain, the Luxonis FFC-3P is the lookout, and each servo in the arms and head looks after itself, but the motors were still controlled by the Pi. The Yukon has helped fix this by taking up the role of the engineers. It has a Raspberry Pi RP2040 chip on board, the same as the Pi Pico, and modules that allow for control of specific bits of hardware; in my case, a quad servo direct module, an LED strip module, and four big motor modules. Code runs on the Yukon that listens for commands from the Raspberry Pi and actions them; it gives feedback to the code running on the Pi but does the heavy lifting itself. This means I can improve speed control of the motors using encoder feedback, and also use that feedback to provide position data so the robot knows how far it’s moved from its starting position.
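
The engine-room loop itself is conceptually simple. Here’s a MicroPython-flavoured sketch of closed-loop speed control; the motor and encoder objects are shaped like the Pimoroni motor library rather than the exact Yukon API, and get_targets stands in for however the commands arrive from the Pi.

# Sketch of the RP2040-side control loop: the Pi sends target wheel
# speeds, the encoders report measured speeds, and a small PI loop
# closes the gap. Derivative term left out for brevity.
import time

KP, KI = 0.8, 2.0          # proportional/integral gains
DT = 0.01                  # 100 Hz control loop

def speed_control_loop(motors, encoders, get_targets):
    integrals = [0.0] * len(motors)
    while True:
        targets = get_targets()    # latest commanded wheel speeds from the Pi
        for i, motor in enumerate(motors):
            measured = encoders[i].capture().revolutions_per_second
            error = targets[i] - measured
            integrals[i] += error * DT
            motor.speed(KP * error + KI * integrals[i])
        time.sleep(DT)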

Banging Soundsystem

I’ve wanted to have microphones and speakers on the robot since day one, for voice interaction and teleoperation purposes, but also for fun. When I saw Eagle Prime throw its first punch at Maker Faire Bay Area years ago, the team had some issues to work through with the code, so they used its PA to blast out some tunes to keep themselves and the crowd occupied while they got things fixed.

I ended up going with a HiFiBerry MiniAmp and a pair of Tectonic speakers for this; Matt Perks (of DIY Perks fame) recommends them, and that’s good enough for me. I also have the Raspotify service installed on the robot so I’m able to use it as a Spotify speaker. Works a treat!

Mostly though, it’s the easter eggs I can add…

I’ll be going through each of the above in more detail in future posts, so stay tuned for updates!

NE-Five Mk4 – It’s All About That Base…

An update on the design of NE-Five Mk4

The road to PiWars continues! Most important of all is a solid base to build from, so that’s where we’re starting.

A render of the new design for the robot base. It has a mecanum wheel at each corner; these wheels have rollers around their circumference rather than a solid rim.

This is the first iteration of the base design; it’s the same width as before but slightly longer, to give more room inside the enclosure. Another big improvement, I hope at least, is that I’ve added suspension to each motor.

As this robot uses mecanum wheels it’s incredibly important that all four always have contact with the ground: all four wheels work together to allow the robot to move in any direction, and if one isn’t in contact then the effect that wheel would have won’t be present, and the robot will veer off course. I’ve added a hinge at the bottom of each mount, and the black part will be printed in flexible filament. By varying the wall thickness and infill I should be able to control how much travel each wheel has. That’s the hope, at least…

Another improvement is more for quality of life than anything, and that’s the method by which the upper part of the base (not pictured, or designed yet…) attaches. On previous iterations of NE-Five these parts have been attached using tabs that are simply screwed in place, which makes working on the robot tricky: if I need to get at the wiring, it’s simply not designed for it.

I’ve also made the switch from the Red Robotics RedBoard to the Pimoroni Yukon. The RedBoard has served me very well, but the lack of encoder support is a problem. There are ways around it, like using the Pi to Pico adapter that Neil developed, but the Yukon has motor controller and encoder modules all in one. It’ll also allow me to control the torso actuator and the LED lights, which are another issue on the Pi.

The NeoPixel library on the Pi requires you to run it as root, which makes running it as part of a ROS launch file a bit of a pain. By handing control of the LEDs off to the Yukon, that problem goes away.

The other big benefit of switching to the Yukon is that I can send it a message to do something and it’ll just do it, rather than burning CPU cycles on the Pi. Splitting hardware up between real-time and scheduled systems like this is very common and should work a treat here. The Yukon runs MicroPython too, so I should be able to use ROS Serial to connect to it and have it act like a ROS node, which it will be, just running on the microcontroller.

All of this is theory at the minute and there’s always little problems I miss until at least the third iteration, stay tuned to find out what mistakes I’ve made this time! 😅

Installing RedBoard+ Library on Ubuntu

Macfeegle Prime is heading back to PiWars!

A lot has happened in that time; ignoring the whole *gestures broadly at the world*, a lot has moved on in software terms. We now have Ubuntu 20.04 for the Pi, ROS Noetic has been released for it, and Approximate Engineering has rewritten the RedBoard+ library from scratch.

This library also includes support for the servo board I’m using, meaning that all the servos can be controlled from the same library. It’ll be one less thing for the Teensy to do, so I thought I’d start here.

Notes

Installation of the library using “pip install redboard” works a treat, and it includes a GUI that lets you tweak the configuration. If you try running “redboard-gui” on Ubuntu, however, you’ll find the pigpio daemon isn’t included; I had to install it from source, so follow the instructions here.

The next issue I hit was that I didn’t have permission to access I2C. With Raspbian you can run raspi-config to enable and disable access to various interfaces, but that isn’t available on Ubuntu. I followed these steps to get it working:

First create a new file using nano:

sudo nano /lib/udev/rules.d/60-i2c-tools.rules

Then add the following lines:

KERNEL=="i2c-0" , GROUP="i2c", MODE="0660" 
KERNEL=="i2c-[1-9]*", GROUP="i2c", MODE="0666"

Reboot, then run “sudo pigpiod” followed by “redboard-gui” and you should get the following:

redboard-gui in a terminal window

You’ll need to run “sudo pigpiod” on every startup; to automate that, run “sudo crontab -e” and add the following to the end of the file:

@reboot /usr/local/bin/pigpiod

Reboot and you’re good to go!

Big News, Going Pro!

I have taken the plunge and gone part time at work, dropping to three days a week, to give me far more time to concentrate on my lifelong passion of robotics, with the aim of making it my career! This wasn’t exactly planned, if I’m honest, but is the result of a recent epiphany, building on my experience with MacFeegle Prime at PiWars.

My plan is to use this time to continue learning ROS, finish development on MacFeegle Prime and its controllers, and restart development of a few stale projects. I’ll be making all of these projects open source too, to give back to the community that has inspired, and continues to inspire, me to bigger things.

The projects should be familiar to long-time followers of my blog and other channels, though as I’ll be rebuilding them to share, they’ll have more sensible names. Sensible, but all puns…

NE-1 (Bumblebee)

This will be a resurrection of the first robot project I tried since my A-Levels: rebuilding my original Roomba, nicknamed “Bumblebee”. I was given this robot in a broken state by a friend of mine; it turned out the issue was a duff battery, so it was easy to get working again. It quickly developed a fault where its speaker died though, and it lost its voice, which is where the nickname comes from. I used it as designed for more years after that before it just stopped working: none of the buttons worked and it was totally unresponsive. It looked like the main circuit board had died.

Bumblebee

I made a series of videos showing the ins and outs of reverse engineering the robot, culminating in being able to drive it around using a gamepad. I never recreated its autonomous ability to hoover by itself though.

Project NE-1 will resurrect this project as a way to show how a cheap old hoover, which can be found on eBay for as little as £25, can be reborn with a Raspberry Pi and given new purpose. The newer versions have a diagnostics port which makes this even easier, but this project will concentrate on how to reverse engineer the hardware, and hopefully show how a robot like this, or an RC car, can be converted into a very affordable robotics platform. I’m hoping this will show that there are ways for anyone *cough* to get into robotics.

NE-5 (MacFeegle Prime)

MacFeegle Prime was my entry for PiWars 2020; it won the public vote for fan favourite and came second overall! This robot has had a lot of blood, sweat, and tears put into it, but it suffers from a lot of legacy problems from early in its development. Essentially, I didn’t *really* know what I was doing, making it up as I went along, and in some ways it shows.

Very Early Prototype…
Winner! Celebrating on the day of Virtual PiWars 2020

The plan for MacFeegle Prime is to finish the build as per the MVP and get him to the basic level needed to compete in PiWars when it’s held next year. At that point I’ll have learned how to build an awesome robot and, more importantly, all the ways not to build a robot. Using all my experience, and the parts, from MacFeegle Prime, I’ll design and build NE-5, redesigned from the ground up. The mechanical files, BOM, and code for NE-5 will be published so anyone can build one to use for research or just for a laugh.

MacFeegle Prime has been built using off-the-shelf parts, some slightly modified, along with 3D printed custom components. It’s this combination of easy access to components and relatively simple construction that makes for an incredibly capable mobile robotics platform. It’s also a testbed for larger versions: in future phases there will be an NE-5L, and hopefully an XL, closer to the size of its inspiration…

NE-Where (The Luggage)

Has to be done!

The Luggage has been a daft project that has been on and off the back burner since EMF 2016, when I realised how annoying it was to walk back and forwards from my car to bring all my crap to the campsite. Fast forward a couple of years to EMF 2018, and Hacky Racers was born! I decided to resurrect the project as a racer, with a pair of 2kW motors and a lot of moxie! Again, I was making it up as I went, and I didn’t have the experience with CAD I do now. I didn’t really have a plan so much as a pLn, left everything to the last minute, and it didn’t come together at all.

After the first couple of events for Hacky Racers I realised I really like commentating and running the races; handy, as everyone else wanted to race! This also meant that I didn’t have to worry about falling within the rules (which the build was already pushing into a lot of grey areas), nor about getting it done “in time” for anything, which meant it didn’t get done…

Fast forward to now, and I have the majority of the parts to not only finish it but make it much more capable. Rather than being a racer, then a racer that could be remote controlled, it’s now going to be a robot that I can ride.

To complement the relative simplicity of NE-1 and the complexity of NE-5, NE-Where will be my heavy-duty rover. I’ll be redesigning it from scratch along similar lines to my bike trailer, which was literally built around an 84L Really Useful Box; that’s my plan for NE-Where too. As well as having a daft alter ego in the shape of The Luggage, legs and all, it’ll also have a full set of wheels for more sensible uses.

A Really Useful Trailer

This will be used for research into logistics robots and rough-terrain navigation, and as a base for NE-5XL too.

NE-Thing

Robots aren’t much use if they can’t be controlled, and for PiWars I made a custom controller for MacFeegle Prime based around a pair of three-axis joysticks, a bunch of switches, and a Raspberry Pi.

This controller uses ROS, as does the rest of the robot, to communicate with and control it remotely. I got it to the point where it could drive the robot around and control its head too, and with the live video stream I could operate it remotely. The plan was to have a driving mode and a manipulator mode; alas, I didn’t get the arms integrated in time to get that far.

This was always intended to be reused for other projects, hence NE-Thing, and the design will be extended with that in mind. Along with the WiFi and Bluetooth available on board the Pi, I’ll be adding an NRF24L01 transceiver as well as a GPS receiver. This will be useful for outdoor projects where WiFi isn’t available, and GPS will allow for dynamic return-to-base control too.

In this phase the controller will be used for the robots above; longer term I intend to build a drone too, where the return-to-base functionality should come into its own.

NE-Body

One thing I’m going to research and develop further is teleoperation. With NE-5 having stereo vision, I want to build a waldo controller to allow for more intuitive control over the robot’s arms.

A render of the waldo controller

The design above was a pretty quick one I threw together in a few hours (that I can say that at all shows how far I’ve come), and it needs a lot of improvements to be actually useable. I’ll likely switch to encoders of some kind instead of potentiometers, but that needs researching. Due to the sheer number of potentiometers involved, this will be based around either a Teensy or an Arduino Mega; I think it’s 16 joints in total, after all.

At the minute I’m not sure if this will connect directly with the robot or be a peripheral of the NE-Thing, I’ll have to make that call when I get around to it.

Funding…

Dropping to three days a week means I’ve still got a regular income and I’ll be able to cover my bills (I’m incredibly lucky to be able to say that), and the vast majority of parts for the projects above I’ve already purchased over the years. For the foreseeable future I’ve plenty to be getting on with and all the parts I need, but for future phases I’ll need to find budget from somewhere. My plan is to launch a Patreon page to help fund future work, with the hope of taking this full time too.

If you stuck around to the end, thank you! Let’s see where this ride takes us!

Raspberry Pi – Docker on USB Drive

As per my last post, I’m using Docker on my robot with ROS. The last task is to get Docker running from a dedicated USB drive, to split resources between it and the SD card the OS is running from. A good guide to mounting a USB drive can be found here.

Note: rather than mounting with umask=000, you need to mount first, then change the permissions of the “host” directory to 777. For example, mount to /media/usb as per the article, then chmod 777 /media/usb **WHILE MOUNTED**. This should allow you to mount, then set up automounting on boot.

If you are running headless and there is a problem with the fstab file it can get annoying, so to test in advance of a reboot run “sudo mount -a” to mount all volumes as per that file. If it succeeds, you can reboot.

I was having a problem mounting via fstab: I could mount the USB folder manually every time, but not using “mount -a”. The penny dropped when I ran “df -h” to see how much space was free and noticed /media was itself a mount point. I created a new root folder called “docker” and it worked a treat.

Following this answer I moved the location of the docker containers and such to /docker.

I’ve run “docker pull ros” and it’s now happily using the USB drive instead of the SD card. 🙂

StereoPi Image Mods

I’ve been using the Ubiquity Robotics Raspberry Pi ROS image to run both the robot and controller; it seemed the easiest way to get ROS running, but now that I’m trying to get low-latency streaming working from the cameras it is proving tricky.

New plan: use the StereoPi image with Docker to host ROS images/containers.

Some Mods Needed…

In order to lower latency as much as possible, the StereoPi image is a heavily modified version of Raspbian; this includes custom partitions and the file system being set to read-only. Here are the steps I followed to get it to a state where I can restart development.

1. Get and Modify the Image

Head here and follow the steps to get and install the image, including getting it on the WiFi; under the Advanced section you’ll see details on how to SSH to the Pi afterwards. Once logged in you’ll need to temporarily set the file system to read/write, then edit the fstab file.

Under the “SLP Structure Details” section you’ll find this command:

sudo mount -o rw,remount /

This will set the file system to read/write, at which point you can open fstab and make the change permanent by changing the “ro” to “rw” for the root partition.

nano /etc/fstab

2. Resize the Partitions

I was trying to figure out how to do this on Windows, then realised I could just install GParted on the controller and remote into it… I put the micro SD card in a USB card reader and followed these instructions. The 1.8GB root partition was expanded to around 25GB, and the recording partition slimmed down accordingly.

screenshot of gparted with the final sizes of the partitions

3. Install Docker

This next bit is trivial: run this command, as taken from the Raspberry Pi blog:

 curl -sSL https://get.docker.com | sh 

4. Get ROS Images

The next bit is easy too:

 docker pull ros

Considerations

One of the reasons the StereoPi is very quick is that it is tailored not to use the filesystem on the SD card; to that end it may be worth moving all of the Docker containers and images over to a USB drive. There’s more information on how to do that here, but I’ve not tried it yet:

https://stackoverflow.com/questions/32070113/how-do-i-change-the-default-docker-container-location

Next Steps

The next thing I need to do is convert my code to work in a Docker container. This shouldn’t be too tricky, but as the RedBoard library will need to talk to the hardware there will likely be complications.

Pi to Pi Comms Using ROS

I’ve decided to use ROS (Robot Operating System) for my PiWars project as it’s an industry standard, and this is an excellent excuse to learn it. For some reason I thought that ROS was a realtime operating system; it turns out it’s a bunch of libraries and services that run on top of existing operating systems, though that’s selling it short. It’s been around since 2007 and there are *loads* of libraries available for it, including plenty of kinematics libraries, which I’m hoping will simplify navigation and control of the arms. I’m hoping to stand on the shoulders of giants here.

I’ve been playing around with the tutorials and have messages going from one Raspberry Pi to another, so I thought I’d share how I got here.

Left, the Raspberry Pi console on the controller. Right, the Raspberry Pi in the robot.

Setup/Prerequisites

I’m using the image provided by Ubiquity Robotics; it works and already supports stereo imagery using the StereoPi, so it seemed daft not to use it. Once you have two Raspberry Pis running with this image, get them both on your network. If you’re running Windows you may also want to install Bonjour Print Services; this includes the same service that the Raspberry Pis use to advertise themselves on the network, and means you can find them more easily by host name.

Tutorials

The ROS tutorials can be found here. If you’re wanting to do ROS development on a Windows machine, this may be of use: it’s instructions for installing ROS in the Windows Subsystem for Linux, Docker, or a VM.

The specific combo of tutorials I used is the Python pub/sub and “running on multiple machines” tutorials. I ran the former on each Pi first to make sure they were working, then followed the steps in the latter to set the robot as the master node, then ran the listener on the robot and the talker on the controller. You can do it either way around; I just like the idea of sending messages from the controller to the robot. 🙂

Python Publisher/Subscriber Tutorial
Running on Multiple Machines

If you’re following along at home you will need to go back a few steps: to run the pub/sub tutorial you need to build a package, and to do that you need to create and set up your workspace. The prerequisites for each tutorial are listed at the top of each article, so it’s easy to backtrack.
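
For reference, the heart of the two scripts boils down to the following (condensed from the ROS wiki tutorial); run the talker on one Pi and the listener on the other, with ROS_MASTER_URI on both pointing at the master.

# talker.py -- publishes "Hello, World!" messages at 10 Hz.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rospy.init_node("talker")
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        pub.publish("Hello, World! %s" % rospy.get_time())
        rate.sleep()

# listener.py -- logs whatever arrives on the same topic.
def listener():
    rospy.init_node("listener")
    rospy.Subscriber("chatter", String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()    # block, handing messages to the callback as they arrive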

Learnings

So, I can send “Hello, World!” from one machine to another. Woo, I hear you say! It doesn’t sound like much, but from here I can use these concepts to send and receive messages between controller and robot. For example, one node could publish sensor data, and one or many listeners could use that data; others could handle motor control signals, telemetry data, etc…

Next up: use the RedRobotics samples on the robot to enable remote control and basic telemetry back to the controller. This will just be the battery level to start with, but that’s a very important thing to know, as I trust you’ll agree.

MFP MVP TBD…

MacFeegle Prime, Minimal Viable Product, To Be Decided…

Time’s pressing, and though I started with lofty goals I need to set a minimum that I’ll be happy with and that is achievable; in software engineering (and probably other fields) we refer to this as the minimum viable product.

The Challenges

There are seven challenges in PiWars: one is autonomous only, a few are optionally autonomous for extra points, and some are suggested as remote control but can be done autonomously for bragging rights. The challenges are as follows:

  • Autonomous Only
    • Lava Palava – Line Following
  • Remote Control/Autonomous Optional
    • Eco Disaster – Barrel Sorting
    • Escape Route – a blind maze
    • Minesweeper – find and disarm red squares
  • Remote Control
    • Pi Noon – Robot jousting!
    • Zombie Apocalypse – shooting gallery
    • Temple of Doom – Obstacle course!

Required Sensors

This robot will be powered by a StereoPi so will have the capability for computer vision; whether I’ll be in a position to learn how to do that is a different matter. So, what are the simplest sensors I can use to solve these problems?

Line Following

The simplest way to do this is with an array of light sensors pointing down along the front bumper. The line will be brighter than the floor, so you can sense how far from centre you are and change your steering accordingly. I’ve a load of IR distance sensors from the ill-fated version one of the shooting gallery that I can press into service.

Blind Maze

For this I’ll need a bunch of distance sensors arrayed around the robot. I have used ultrasonic sensors in the past, but they’re physically quite large, and the other competitors mentioned a better option. The VL53L0X is a LIDAR sensor that runs over I2C; it can run in continuous mode and you can request a reading on the fly. These are physically smaller, so it will be easier to have more of them arrayed around the robot, but they do have a few downsides.

First off, all of these have the same I2C address by default, so you have to either change the address on boot-up, which requires a wire per sensor, or use an I2C multiplexer, which requires a few wires per sensor. I heard from one of my fellow competitors that the former was ropey at best when they tried it in the past, so multiplexer it is!

The other downside is that the performance of these sensors depends a great deal on the surface they’re reflecting from: white is the best for reflectance and black the worst. Guess which colour the walls are at PiWars?

While finding links for this post I spotted these ultrasonic rangefinders, which are much smaller; pricey, but they’d certainly do the job.
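
For what it’s worth, reading a bank of VL53L0Xs through a TCA9548A multiplexer is only a few lines using the Adafruit CircuitPython libraries (via Blinka on the Pi); the channel count and wiring below are illustrative.

# Read several VL53L0X sensors through a TCA9548A I2C multiplexer.
# Each sensor keeps its default address because only one multiplexer
# channel is switched through at a time.
import board
import busio
import adafruit_tca9548a
import adafruit_vl53l0x

i2c = busio.I2C(board.SCL, board.SDA)
tca = adafruit_tca9548a.TCA9548A(i2c)
sensors = [adafruit_vl53l0x.VL53L0X(tca[channel]) for channel in range(4)]

while True:
    print([sensor.range for sensor in sensors])   # distances in mm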

Mine Sweeper

The way this challenge works is that the robot is placed on a 4×4 grid that is lit from underneath. One of the squares will be lit up red, and if the robot moves to and stands on it, it’ll be defused. For a pure brute-force method you could use a colour sensor facing down on the bumper: have the robot bimble around at random, much like the early Roombas, and when it detects red it can stop until the colour changes.

It’s not efficient, but it could work. I’m not sure if the extra points for doing it autonomously would be more fruitful than defusing more mines by driving manually. I’ve seen someone post a proof of concept for doing this using computer vision, so for this one I’ll go with manual, with computer vision as the stretch goal.

Remote Control

The bulk of the challenges will be done manually, so we’re going to need a suitable controller. Ideally I’d have a full waldo controller and VR headset as per my aspirations, but I need to be more realistic. As a very basic method of control I have an Xbox controller rigged up to a Raspberry Pi with a display. It’ll connect via a WiFi hotspot, likely my phone, and issue commands over TCP. With the analogue sticks of the Xbox controller I’ll be able to control movement with one stick and the head (cameras) with the other, much like in a first-person shooter. If arm control using the sticks proves too tricky, I can just preprogram a few positions: one button to put the hand in front of the bot, another to close the hand, another to raise it slightly… It’d be all I need for the barrel challenge, but it wouldn’t be using the arms to the fullest.
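
As a sketch of how simple the controller side can be, here’s the left stick read with pygame and pushed over TCP; the robot’s address and the message format are placeholders.

# Read the left analogue stick and send drive commands over TCP.
import socket
import pygame

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)
stick.init()

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("192.168.0.10", 5000))    # robot's address on the hotspot

clock = pygame.time.Clock()
while True:
    pygame.event.pump()                 # refresh the axis values
    forward = -stick.get_axis(1)        # pushing up reads as negative
    turn = stick.get_axis(0)
    sock.sendall("drive {:.2f} {:.2f}\n".format(forward, turn).encode())
    clock.tick(20)                      # 20 commands per second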

Conclusion

We have a plan, or at least the idea of a plan… Having a smaller set of more constrained targets is a good focus; now I just need to get over this damn lurgy and get some energy back!