The road to PiWars continues! Most important of all is a solid base to build from, so that’s where we’re starting.
This is the first iteration of the base design: it’s the same width as before but slightly longer, to give more room inside the enclosure. Another big improvement, I hope at least, is that I’ve added suspension to each motor.
As this robot uses mecanum wheels, it’s incredibly important that all four always have contact with the ground. All four wheels work together to let the robot move in any direction; if one loses contact, the force that wheel should contribute disappears and the robot veers off course. I’ve added a hinge at the bottom of each mount, and the black part will be printed in flexible filament. By varying the wall thickness and infill, I should be able to control how much travel each wheel has. That’s the hope, at least…
Another improvement is more for quality of life than anything: the method by which the upper part of the base (not pictured, or designed yet…) attaches. On previous iterations of NE-Five these parts were attached using tabs that were simply screwed in place, which made working on the robot tricky; if I needed to get at the wiring, the design simply wasn’t made for it.
I’ve also made the switch from the Red Robotics RedBoard to the Pimoroni Yukon. The RedBoard has served me very well, but its lack of encoder support is a problem. There are ways around it, like using the Pi to Pico adapter that Neil developed, but the Yukon has a motor controller and encoder module all in one. It’ll also let me control the torso actuator and the LED lights, the latter being another pain point on the Pi.
The NeoPixel library on the Pi requires you to run it as root, which makes launching it as part of a ROS launch file a bit of a pain. By handing control of the LEDs off to the Yukon, that problem goes away.
The other big benefit of switching to the Yukon is that I can send it a message to do something and it’ll just get on with it, rather than using CPU cycles on the Pi. Splitting hardware between real-time and scheduled systems like this is very common and should work a treat here. The Yukon runs MicroPython too, so I should be able to use ROS Serial to connect to it and have it act like a ROS node, which it effectively will be, just running on the hardware.
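As a taste of how that hand-off might look, here’s a minimal MicroPython sketch of the kind of command listener the Yukon could run. The pin number, LED count, and “LED r g b” command format are placeholders rather than my actual protocol:

    # Minimal sketch: the Pi sends simple text commands over USB serial,
    # the Yukon does the timing-critical NeoPixel work. Pin, LED count,
    # and command format are all assumptions for illustration.
    import sys
    from machine import Pin
    from neopixel import NeoPixel

    pixels = NeoPixel(Pin(15), 8)  # data pin and LED count are placeholders

    while True:
        line = sys.stdin.readline().strip()  # wait for a command from the Pi
        parts = line.split()
        if len(parts) == 4 and parts[0] == "LED":
            colour = tuple(int(p) for p in parts[1:])
            for i in range(pixels.n):
                pixels[i] = colour
            pixels.write()  # no root access needed on the Pi side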
All of this is theory at the minute, and there are always little problems I miss until at least the third iteration. Stay tuned to find out what mistakes I’ve made this time! 😅
One common question I get asked about NE-Five is “is it open source?” and I always answer “mostly…”, and there’s a reason for that. The code has been available for a long while, along with a basic simulation you can run in Gazebo, but the CAD model has never been shared. That’s because it’s a mess, to say the least! I hadn’t used Fusion 360 in anger until I started working on MacFeegle Prime, the v1 of NE-Five. As such, it’s an absolute tangle of dead ends, redundant designs, and terrible usage of the component system…
The Road to NE-Five Mk4
There have been various versions of NE-Five over the years, the biggest change being from MacFeegle Prime to NE-Five. In the early days they had tank treads for the true Johnny Five aesthetic. Treads were a bit unreliable, at least the way I made them, so rather than getting bogged down trying to make them work while the rest of the robot was still a mess, I switched to wheels for a later version. I think, bizarrely, that the first NE-Five was actually the Mk2; I guess I considered NE-Five the class of robot and MacFeegle Prime was just the name of the first one? Sounds plausible, it’ll do as an explanation at least.
The Mk1 introduced an aesthetic we’ve pretty much stuck with ever since. It also moved the motor and servo control from the head to an enclosure on his caboose. You can see the mess of cables coming out of the back of his head in the photo below. The bundle was so tight that if he tried to look up too far, his head would pop off, as the servo couldn’t move the wires…
The Mk2 still used a StereoPi board in his head, but the RedBoard was moved to the caboose and hosted on a Pi Zero connected over USB. This meant there was only a USB cable and a wire for the NeoPixels between the head and the rest of the body, which was much less stressful on the neck mechanism.
Both of these designs still used hobby servos for the arms and neck mechanisms, along with the StereoPi for control and vision, so the Mk3 was the biggest leap from a technology perspective.
NE-Five Mk3
Aesthetically the Mk2 and Mk3 look very similar, but an awful lot changed internally.
Mk2, left. Mk3, right.
The overall form factor has been retained, but significant upgrades were made to the arms and vision system. The StereoPi was replaced with a Luxonis FFC-3P, and the Pi Zero in the caboose with a Raspberry Pi 4; the RedBoard is retained for power supply, NeoPixel, and motor control. The hobby servos were replaced with Robotis Dynamixels. The most obvious benefit is that they daisy-chain together, so the wiring is much tidier. More importantly though, they give position and status feedback, so you can detect if they’ve stalled, for example. They’re more expensive, but they’re worth it purely in savings compared to the number of servos I burned out without realising it!
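To give an idea of what that feedback looks like in practice, here’s a rough sketch using the Dynamixel SDK’s Python bindings. The port, servo ID, and control-table addresses are assumptions (they’re the X-series, Protocol 2.0 values), so check the e-manual for your model:

    # Rough sketch: reading position/load feedback from a Dynamixel.
    # Port name, servo ID, and control-table addresses are assumptions.
    from dynamixel_sdk import PortHandler, PacketHandler

    ADDR_PRESENT_LOAD = 126      # X-series control table; 2 bytes, signed
    ADDR_PRESENT_POSITION = 132  # 4 bytes
    DXL_ID = 1                   # assumed servo ID

    port = PortHandler("/dev/ttyUSB0")  # assumed serial adapter
    port.openPort()
    port.setBaudRate(57600)
    packet = PacketHandler(2.0)  # Protocol 2.0

    position, result, error = packet.read4ByteTxRx(port, DXL_ID, ADDR_PRESENT_POSITION)
    load, result, error = packet.read2ByteTxRx(port, DXL_ID, ADDR_PRESENT_LOAD)
    if load > 32767:
        load -= 65536  # two's complement; the sign gives the load direction
    print(f"position: {position}, load: {load}")  # a stalled servo shows a high load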
The Luxonis FFC-3P vision board is also a game changer. Where the StereoPi could generate a 320×240 depth map at around 18fps, the 3P can run at around 900×600px and 30fps. On top of that it can run neural networks for object recognition using its three cameras, with dedicated hardware for the task, and it compresses images for preview on the Pi or over wifi to massively reduce the bandwidth needed. This takes basically all the stress of computer vision away from the Pi: only the data we need goes over USB, rather than swamping the Pi with frames that would mostly be discarded.
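For the curious, the on-device processing is configured as a pipeline using the depthai Python library. Here’s a stripped-down sketch of the stereo-depth part, using the v2 API; the camera sockets are assumptions, as on an FFC board they depend on how the cameras are wired:

    # Stripped-down DepthAI (v2 API) sketch: stereo depth computed on-device,
    # with only the depth frames sent back over USB.
    import depthai as dai

    pipeline = dai.Pipeline()
    left = pipeline.create(dai.node.MonoCamera)
    right = pipeline.create(dai.node.MonoCamera)
    left.setBoardSocket(dai.CameraBoardSocket.LEFT)    # socket wiring is an assumption
    right.setBoardSocket(dai.CameraBoardSocket.RIGHT)

    stereo = pipeline.create(dai.node.StereoDepth)  # disparity computed on the device
    left.out.link(stereo.left)
    right.out.link(stereo.right)

    xout = pipeline.create(dai.node.XLinkOut)
    xout.setStreamName("depth")
    stereo.depth.link(xout.input)  # only depth frames cross the USB link

    with dai.Device(pipeline) as device:
        queue = device.getOutputQueue("depth", maxSize=4, blocking=False)
        while True:
            frame = queue.get().getFrame()  # 16-bit depth, in millimetres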
An example of the output from the Luxonis FFC-3P. Top left, a view from one of the two wide-angle cameras; top right, a coloured disparity image showing the distance of various objects; and bottom middle, the view from the narrower-FOV centre camera.
The 3P is configured with three cameras. Left and right are both AR0234 sensors with 110-degree wide-angle lenses; these are also global-shutter sensors, so ideal for reducing shutter artefacts when moving. The centre camera is an IMX378 with auto-focus and a much narrower 69-degree FOV; it has a rolling shutter, so it’s more for getting closer views of objects when stationary. Imagery from all three can be used simultaneously. I’ve barely scratched the surface of what these can do.
NE-Five Mk4, and Future
So, with all those lessons learned and improvements in mind, I’m going to design the Mk4 from scratch in Fusion 360. The reason is to get rid of all the baggage from the last four years of designs, so the files will be much easier to maintain and, more importantly, to share. There are also some cool scripts for Fusion 360 that can generate URDF (Unified Robot Description Format) files, but they only work if the design is laid out in a certain way.
Internally and externally it’ll look much like the Mk3 does now, but with what I hope will be a more “production” quality of finish. There will also be a new motor controller board I’m developing, which will read the encoders and an IMU to generate odometry.
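The wheel-speeds-to-body-velocity maths for mecanum wheels is well documented, so as a rough sketch of what that board will be calculating (the dimensions are placeholders, and the signs depend on how the rollers are oriented):

    # Illustrative mecanum forward kinematics: body velocity from the four
    # wheel angular velocities (rad/s), as measured by the encoders.
    WHEEL_RADIUS = 0.03  # metres (placeholder)
    LX, LY = 0.08, 0.10  # half wheelbase and half track (placeholders)

    def mecanum_odometry(w_fl, w_fr, w_rl, w_rr):
        vx = WHEEL_RADIUS / 4 * (w_fl + w_fr + w_rl + w_rr)                 # forward, m/s
        vy = WHEEL_RADIUS / 4 * (-w_fl + w_fr + w_rl - w_rr)                # strafe, m/s
        wz = WHEEL_RADIUS / (4 * (LX + LY)) * (-w_fl + w_fr - w_rl + w_rr)  # yaw, rad/s
        return vx, vy, wz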
The future part is an aspiration: I’m planning on developing this into an actual product, sold as kits or preassembled. I’ll still be releasing the CAD models and BOM in case people want to build their own, but for those in teaching or research who just want one that comes assembled and with a warranty, buying one may be preferable. It’s very early days on that at the minute, but I’m hoping it’ll be ready for sale towards the end of 2024. I’d always intended for this to be a portfolio piece for the company to help me get robotics work, but that hasn’t worked out too well. A few folks have said it’s almost a product already, so I figured I’d try and get it the rest of the way there.
Stay tuned for more info; this post is the precursor to me documenting the build of the Mk4, so a lot more will be coming soon.
I’ve been using Red Robotics’ excellent RedBoard for years now, and while upgrading NE-Five to use a Raspberry Pi 5 I discovered breaking changes which mean the available libraries won’t work.
Basically, the way GPIO works on the Raspberry Pi 5 is different, as it uses the shiny new RP1 chip to wrangle its devices. As such, legacy code that used the old way needs updating. In our case that’s pigpiod, and the developer has said the port is non-trivial and will take time. This is entirely reasonable; it’s an open source project and depends on the good will and free time of the developer.
I’ve been using Approximate Engineering’s RedBoard library, which abstracts a lot of the underlying bits away, so I thought I’d take a stab at updating it to use gpiozero instead. I’ve managed to get enough features working for NE-Five; the rest I’ll revisit when I’ve recovered a few spoons from the effort…
Long story short: you need to specify the pin factory for gpiozero to work in a virtual environment, and for whatever reason lgpio doesn’t work if you install it into a venv from pip. Instead, you need to install it from source, ignoring their instructions… This took an awful lot of trial and error to figure out, hence the spoon deficit.
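For anyone hitting the same wall, forcing the pin factory looks something like this; gpiozero also honours a GPIOZERO_PIN_FACTORY=lgpio environment variable:

    # Forcing gpiozero onto the lgpio pin factory (needed on the Pi 5).
    from gpiozero import Device, LED
    from gpiozero.pins.lgpio import LGPIOFactory

    Device.pin_factory = LGPIOFactory()  # set before creating any devices
    led = LED(17)                        # GPIO 17 is just an example
    led.on()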
If you go to my fork, there are instructions on installing everything which hopefully will get you working. Just make sure you’re on the develop branch!
I’ve been working on a remote control for my cargobot (more on those little teasers soon), and I’ve been having an annoying problem with something that should’ve been simple.
The Problem?
Switching it on.
The controller is running on a SparkFun ESP32 Thing Plus, which has an ESP32-S2 at its core. It has an EN (enable) pin that, if tied to ground, should put the chip into deep sleep, and it does this a treat. I expected, however, that when switched back on it would carry on as normal. When I switch it back on, the lights come back on, as do the GNSS receiver, the Wii Nunchuck breakout, and the serial port. It’s definitely running, except for the screen.
I’m not quite sure how to fix this so the screen comes back on, but I have discovered a nuclear workaround.
The ESP32 has lots of cool features, and one of them is the watchdog timer. Once set, it listens for a command to reset the timer; if it isn’t reset within a certain period, it will reboot the ESP32 on the assumption that something has gone wrong.
The Fix/Workaround
Relatively simple in the end, but it took a bit of trial and error to get working for me.
Add this import:

    #include <esp_task_wdt.h>

This definition:

    #define WDT_TIMEOUT 5

Add this to setup(), somewhere near the top seems more reliable:

    esp_task_wdt_init(WDT_TIMEOUT, true); // enable panic so ESP32 restarts
    esp_task_wdt_add(NULL);               // add current thread to WDT watch

Finally, add this to loop():

    esp_task_wdt_reset();
What this does is create a watchdog timer with a timeout of 5 seconds and tie it to the main thread; if the reset function isn’t called within the timeout, the board resets. One thing to note: if you’re using more than one of the ESP32’s cores, you’ll need to call esp_task_wdt_add on the appropriate thread.
It’s not ideal for my use case as it means there’s a delay waking from sleep. It does mean that I can charge the remote without it being switched on though as the battery and charge circuit are connected to the Thing Plus board.
Hope this helps someone, if not future me, and stay tuned for more info on what I’ve been up to over the last, erm, six months…
I’ve gained a lot of experience over the last few months with Fusion 360, 3D printing, electronics, and more besides. I thought I’d share some of those lessons.
As Complex As You Make It
The most important lesson, as with any project, is to have an idea from the start of what you’re building and how long you have to build it. Even a relatively simple design will throw up plenty of issues that take extra time to figure out, doubly so if you’re learning as you go. My robot concept was complex to start with, more so than I expected, and I had a lot more to learn than I realised too. However long you think you need, add more, and if possible simplify your design.
In retrospect, more of a plan than a quick sketch wouldn’t have gone amiss…
I had a bunch of early wins: I used existing parts from an RC car to make early proofs of concept, which sped things up, and that gave me a little too much confidence. I was designing elements in Fusion 360 in isolation, assuming they’d work, and that burnt me a lot. I went through a number of prototype chassis designs in the early stages, and it wasn’t until I realised I needed a more complete design in CAD, showing how everything fitted together, that I started saving an awful lot of time. I’m still not great at this but certainly getting better.
Longer term I need to learn how to do joints in Fusion 360 so that I can actually see how things fit together and what constraints there are.
A few of the prototypes, with the almost final form
I wasted a lot of time in what amounted to designing seven different robots. I couldn’t have got to where I am without doing it, though, so it’s a difficult balance to strike.
Seriously, Make a List. Then Check It Again…
I had a vague idea that I’d have the StereoPi up top in the head for stereo vision, which would open up a lot of opportunities for computer vision too. Around the chassis would be a ring of sensors; ultrasonics were what I had in mind to start with, and though they’re simple to work with, they’re quite large. I didn’t really know better, so that’s what I went with. Later on I learned of the VL53L0X, a really cheap laser time-of-flight sensor that’s a lot smaller too. They have the quirk of all sharing the same I2C address by default, so you need to use I2C multiplexers or wire them up so their addresses can be reassigned on first boot… More complexity!
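For illustration, the re-addressing trick goes something like this with the Adafruit CircuitPython library: each sensor’s XSHUT pin holds it in reset, and you wake them one at a time, moving each off the default address before waking the next. The GPIO pins here are made up:

    # Sketch: re-addressing two VL53L0X sensors at boot, assuming each
    # sensor's XSHUT pin is wired to a GPIO (example pin numbers).
    import board
    import busio
    import adafruit_vl53l0x
    from digitalio import DigitalInOut

    i2c = busio.I2C(board.SCL, board.SDA)
    xshuts = [DigitalInOut(board.D17), DigitalInOut(board.D27)]  # example pins
    for pin in xshuts:
        pin.switch_to_output(value=False)  # hold every sensor in reset

    sensors = []
    for i, pin in enumerate(xshuts):
        pin.value = True  # wake one sensor; it boots at the default 0x29
        sensor = adafruit_vl53l0x.VL53L0X(i2c)
        sensor.set_address(0x30 + i)  # move it to a unique address
        sensors.append(sensor)

    for sensor in sensors:
        print(sensor.range)  # distance in mm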
Again, we’ve all got PhDs in hindsight, but having a more solid plan and spending more time on research and planning in the early stages would’ve paid off in the long run.
Burnout
Look. After. Yourself.
As I mentioned earlier, I had lots of early successes which gave me an awful lot of false confidence; as soon as the easy wins came and went and the real struggle began, the build got a lot more difficult, both technically and mentally. Those who know me, or have been reading the blog for a while, will know I suffer from Anxiety and Depression; they’re a bugger individually, but when they join forces they’re truly evil. A few weeks before I applied to enter PiWars, my beloved cat, Willow, passed away. To say this was hard on me is an understatement. Coupled with the year tailing off, getting darker and colder, and things going from win after win to struggle after struggle, things got rough.
I tried to push through it, and that was a big mistake. The best decision for the project was to take a breath and start again. With a lot of support from my girlfriend, the rest of the PiWars community, friends, family, and colleagues alike, I slowly got out of the funk while making slow but consistent progress. The Epic Rebuild Began.
The evolution of the rebuild
Conclusions and Next Steps
I’ve learned a lot and come an awful long way in many regards, and though I’ve still a lot to do, I’m in a better place and so is the robot. The next steps are to get the controller up and running and the robot drivable again.
In the next blog post I’ll talk about the plans for the challenges. As it stands I’ve almost got one arm done and only need to finish the hand, add a bunch of sensors, and add remote control. I have a minimum spec in sight and will at least be able to compete.
Bumblebee is my Roomba, so named because long ago he lost his voice. About a year ago his logic board started playing up: though he was still able to clean, at the end of each cleaning cycle he wouldn’t go into standby, and his battery would drain in no time. At that point he stopped actively gathering dust and started doing it passively as he sat behind my sofa.
Since I was a kid I’ve always wanted to build a robot, so I figured I’d kill two birds with one stone and use Bumblebee as the chassis for a mobile robot (he already is one, after all), while also aiming to restore his base functionality as a robot hoover.
The Plan
Bumblebee is an original-model Roomba from 2002. He was a gift from a friend who knew I loved to tinker and gave him to me broken; if I could fix him, I could keep him. Thankfully it was an easy fix, as the battery was dead. This model is very simple in its function and behaviour: it has no mapping capability, no dock, and no wireless control. It apparently can be controlled using IR, but I’ve never had a remote. It also lacks the diagnostics port the newer models have that makes hacking a Roomba really easy now, so this is going to be a bit trickier, which means a lot more education and, most importantly, more fun!
The parts I’ve used to partially resurrect him are an Arduino Leonardo and an Adafruit Motor Controller Shield. I’ve also got a Raspberry Pi 3 to add into the mix for wifi control and more processor-intensive tasks. The idea is to play the two to their strengths: the Arduino will control the motors and read the sensors, allowing for real-time overrides in case of collision, while the Pi sits back and gives out the orders. It’s a classic split for mobile robots and thankfully very cheap to implement now.
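To give a flavour of the Pi’s half of that split, it’s just a serial link; something like the sketch below with pyserial, though the port name and the “L&lt;speed&gt; R&lt;speed&gt;” command format are invented for illustration, not the actual protocol:

    # Illustrative Pi-side sketch: sending drive commands to the Arduino
    # over USB serial. Port name and command format are made up.
    import serial  # pyserial

    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as link:
        link.write(b"L128 R128\n")  # e.g. both wheels forward at half power
        reply = link.readline()     # the Arduino could acknowledge or report sensors
        print(reply.decode().strip())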
Current State
As I said, I’ve been working on this for a while, so I’ve a load of notes to type up and loads of “learning experiences” to share, mostly from when I made a rookie error and burnt out one of the motor controllers… I’ve now got the motors under control over serial, and I’ve also got a simple console application that lets me drive him around and toggle the sweeper/vacuum fans. Here’s a video of him in action:
Next Steps
My next item to look at is getting sensor data into the Arduino, first up the encoders. Encoders are devices that let you measure the movement of a wheel (you’ve likely used a rotary encoder on a hifi to control the volume), and the Roomba has one in each wheel. Right now I can control how much power goes to each wheel, but because of differences in the state of the gearboxes, the carpet, and who knows what other factors, the wheels spin at different speeds. By measuring the speed from the encoders we can compensate for this, and we can also use them to calculate the distance we’ve travelled.
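The maths is simple enough; here’s the idea sketched in Python. The tick count and wheel size are placeholders, and a real controller would want something smarter than this naive proportional trim:

    # Back-of-the-envelope sketch: distance from encoder ticks, plus a
    # simple proportional trim to keep the wheels matched.
    import math

    TICKS_PER_REV = 32      # encoder ticks per wheel revolution (placeholder)
    WHEEL_DIAMETER = 0.065  # metres (placeholder)

    def ticks_to_metres(ticks):
        # Distance travelled by a wheel, from its encoder tick count.
        return ticks / TICKS_PER_REV * math.pi * WHEEL_DIAMETER

    def trim_power(left_ticks, right_ticks, base_power, gain=0.5):
        # If one wheel has counted more ticks, ease off its power a touch.
        error = left_ticks - right_ticks
        return base_power - gain * error, base_power + gain * error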
After that it’s the rest of the sensors; those I’ve found so far are:
Cliff sensors – these are under the bumper and detect drops to stop him falling down the stairs; I think there are four of them, and they appear to be IR distance sensors
Bumper sensors – these detect collisions, I think there is one at either end of the bumper so I’ll know if I’ve hit something to the left or right
Wall sensor – another IR distance sensor mounted on the right of the bumper, this allows for wall following behaviour
Wheel up/down switches – One on each of the drive wheels and one on the caster at the front. They detect if the wheels are up or down and can be handy for detecting when we’ve beached ourselves.
Wheel encoders – these were IR LEDs and a light-dependent resistor. I blew one of the LEDs by accident, so replaced them both with red LEDs.
Beacon IR Receiver – Not sure how this works yet; it’s a 360° lens on the top that receives a beam from the virtual wall, a device you place by your door to stop him escaping. I’m hoping to add more sensors to make this redundant.
Buttons – there are three buttons for room size to start different cleaning cycles. They could be useful though I may not bother with them.
Once I’ve all the sensors connected I’ll be able to hook up the Raspberry Pi to start working on reproducing his original behaviour. After that I’ll be able to build up his capabilities over time and add a load of new ones too. I’m not intending this just to be a hoover but a mobile robot platform that happens to be able to hoover.
If you’ve got this far, kudos! That’s it for now; more posts will trickle out as I write up my old notes. I’m going to carry on having fun building things and write posts in the order they happened. Hopefully I’ll catch up fairly quickly!