It's been a long while since the last post (more on that in an upcoming post titled "How Not To Build A Robot"), but I thought I'd give an update on the general architecture taking shape for MacFeegle Prime.

The Robot

The robot will have at its core a Raspberry Pi, in this case a Raspberry Pi 3 Compute Module hosted on a StereoPi board. This board is designed to take advantage of the Compute Module's (CM's) two camera ports and allows for GPU-accelerated stereo vision.

The latest render of MacFeegle Prime, showing a robot with tank-style treads, a head and one arm

For motor control, and for some of the servos, I'll be using a RedBoard+ by RedRobotics. This has everything you need for most robots, including a pair of 6A motor controllers, 12 servo headers, support for NeoPixels and, most importantly, great documentation and support from its creator, Neil Lambeth. This HAT also includes a power regulator, so it powers the StereoPi too, which is incredibly handy.
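Since the robot runs on tank-style treads, the two motor channels will need speeds mixed from a single joystick. As a minimal sketch (the function name and the -1..1 speed convention are my own placeholders, not the RedBoard library's API):

```python
def mix(throttle, steering):
    """Arcade-drive mix: joystick Y (throttle) and X (steering),
    each in -1..1, to (left, right) track speeds, also in -1..1."""
    left = throttle + steering
    right = throttle - steering
    # Scale back into range rather than clipping, so the ratio of
    # left to right (and hence the turn) is preserved at full stick.
    m = max(1.0, abs(left), abs(right))
    return left / m, right / m
```

The resulting pair can then be handed to whatever speed-setting calls the motor driver exposes.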

Connected to the Pi will be a Teensy 4 board, which will handle and collate data from the various sensors around the robot, along with an I2C servo board to control the arms and potentially an NRF24 RF transceiver too.
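On the Pi side, the collated sensor data from the Teensy only needs a small parser. A sketch of one, assuming a made-up `name:value` comma-separated line format (the field names and protocol are placeholders, not anything the Teensy actually sends yet):

```python
def parse_telemetry(line):
    """Parse one serial line like 'dist_l:142,dist_r:98,batt:7.4'
    into a dict of floats. The format is a hypothetical placeholder."""
    readings = {}
    for field in line.strip().split(','):
        name, _, value = field.partition(':')
        if name and value:
            try:
                readings[name] = float(value)
            except ValueError:
                pass  # skip a corrupt field rather than crash the loop
    return readings
```

Being tolerant of corrupt fields matters here, as serial links between boards drop characters more often than you'd like.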

The Controller

The controller will also be running on a Raspberry Pi, in this case a standard 3 Model B, connected to a 7″ touchscreen display. This will also have a Teensy 3.6 board, which will be used to interface with various buttons and potentiometers, and possibly another NRF24, depending on whether control via a WiFi access point proves stable enough.
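The potentiometer readings from the Teensy will need mapping to something useful, such as servo angles for the arms. A sketch of that mapping, assuming a 10-bit ADC reading and a 180° servo (both are assumptions on my part):

```python
def pot_to_angle(raw, raw_max=1023, angle_range=180.0):
    """Map a raw ADC reading (0..raw_max, e.g. a 10-bit pot read by
    the Teensy) to a servo angle in degrees. The reading is clamped
    so a noisy value can't command an out-of-range angle."""
    raw = max(0, min(raw, raw_max))
    return raw * angle_range / raw_max
```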

The sort of thing I have in mind is similar to these controllers for cranes and diggers.

PLUS+1® remote controls

I just love the industrial design of them, and with the complexity of all the arms and similar it seemed a valid excuse to build one… I have a pair of 4-axis joysticks; these have X and Y as you'd expect but can also rotate. The fourth axis is a button on the top, which I can use as a modifier or to toggle modes.
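Using that top button to toggle modes just means detecting a rising edge on the button state. A minimal sketch (the mode names are placeholders):

```python
class ModeToggle:
    """Cycle through control modes on each press (rising edge) of a
    joystick's top button. Holding the button doesn't re-trigger."""

    def __init__(self, modes=("drive", "arm")):
        self.modes = modes
        self.index = 0
        self._was_pressed = False

    def update(self, pressed):
        """Feed in the current button state; returns the active mode."""
        if pressed and not self._was_pressed:
            self.index = (self.index + 1) % len(self.modes)
        self._was_pressed = pressed
        return self.modes[self.index]
```

The same class works as a modifier instead, by checking `pressed` directly rather than cycling.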

One thing I'd love to do is a waldo controller, similar to the one James Bruton developed for his performance robot, but I'd prefer it to be smaller and I think that's out of scope for the competition.

James Bruton’s puppeteering rig from his video

Better yet would be one similar to the controller Naomi Wu showed in her video about the Ganker robot. It attaches around her waist and allows her to control not only the arms but also the motion of the robot, as the "shoulders" of the controller are essentially mounted on a joystick.

Still taken from Naomi’s video

This controller is incredibly intuitive; coupled with stereo vision via the StereoPi and an Android phone in a Google Cardboard headset, I think it'd be an exceptional combo. Definitely one for future development!


The software for this will be written in Python but make use of the Robot Operating System (ROS). Despite the name, this isn't an operating system but a collection of libraries and frameworks that allow the components of a robot to work together, even when spread across multiple machines. I'll be running it in Docker, as I've had pain trying to get it installed natively and there's an image available already.
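The core idea ROS provides is publish/subscribe messaging over named topics. To illustrate the model (this is a toy, plain-Python sketch, not the ROS API; real nodes would use rospy and could live on different machines):

```python
class TopicBus:
    """Toy illustration of publish/subscribe over named topics, the
    pattern ROS is built around. Not the ROS API itself."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for callback in self._subs.get(topic, []):
            callback(msg)

# e.g. the controller publishes drive commands; the robot subscribes
bus = TopicBus()
received = []
bus.subscribe('/cmd_vel', received.append)
bus.publish('/cmd_vel', {'linear': 0.5, 'angular': 0.0})
```

The appeal is that the controller and robot processes never reference each other directly, only the topic names.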

This will run on both the robot and the controller, and the intention is that it'll allow for control over WiFi as well as telemetry back to the controller. If a WiFi access point (likely a phone in hotspot mode) isn't stable enough for control, I'll fall back to the NRF24 transceiver option. Handily, there is an Arduino library that allows for sending and receiving messages in a format suitable for ROS to parse, so hopefully that'll be fairly easy to swap in.
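The fallback decision itself can be simple: prefer WiFi, but switch to the radio when the link drops or messages go stale. A sketch of that logic (purely illustrative; the timeout value and function are my own, not how ROS negotiates transports):

```python
def choose_transport(wifi_up, last_wifi_msg_age, timeout=1.0):
    """Pick the control link: prefer WiFi, fall back to the NRF24
    radio when the WiFi link is down or messages have gone stale.
    `last_wifi_msg_age` is seconds since the last WiFi message."""
    if wifi_up and last_wifi_msg_age < timeout:
        return 'wifi'
    return 'nrf24'
```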


There is a lot of work to do. The hardware is mostly done and needs mounting; just the end effectors (hands) need designing, along with a few tweaks to the head and the mount for the Nerf gun.

I'm a professional software engineer by trade, so I'm hoping that writing the code shouldn't be too bad a job (DOOM! FORESHADOWING! ETC!), and I have the week before the competition off too, to allow for last-minute hacking…