Nerf Gun Shooting Gallery

This is a work in progress as it has turned out to be far more complicated than I thought it would be! I’ve learned a lot about how to design parts for my 3D printer and how to use the laser cutter at Reading Hackspace too, so I’m already winning!

Over the past few years my friends in the Reading Beer Festival Games Team have been talking about building a shooting gallery using Nerf guns to keep things nice and safe.  This year I offered to give it a go and the spec quickly escalated/became more fun!

As I’m building this more for the team than the players it needs to be easy for them to use while dealing with the festival punters, who are typically moderately inebriated…  Resetting the board easily, keeping track of the score and setting up for the next game seem like the three things to concentrate on.

The game works, or will work, as follows:

  1. Press the start button, the counter should reset along with the targets
  2. Give the player a Nerf gun and a clip with 12 darts
  3. The player shoots the targets, which reset once hit, and the score is tallied

Having a bunch of clips loaded with 12 darts each and a button that resets the game means resetting should be a lot easier than doing it manually, so that’s an easy win – enter the Arduino!

I bought a bunch of hobby servos off eBay to use to reset the targets and designed a hinge with a magnet to hold up the target and a switch of some kind to track when a target is hit.

I’ve got a design working for the hinge but need to replicate it for all five targets.  More details once it’s complete but here’s an incredibly satisfying video of the mechanism in action!

Project Bumblebee



Bumblebee is my Roomba, so named as long ago he lost his voice.  About a year ago his logic board started playing up and though he was still able to clean, at the end of each cleaning cycle he wouldn’t go into standby and his battery would drain in no time.  At that point he stopped actively gathering dust and started doing it passively as he sat behind my sofa.

Bumblebee MK2

Since I was a kid I’ve always wanted to build a robot, so I figured I’d kill two birds with one stone and use Bumblebee as the chassis for a mobile robot (he already is one, after all) while also aiming to restore his base functionality as a robot hoover.

The Plan

Bumblebee is an original model Roomba from 2002; he was a gift from a friend who knew I loved to tinker and gave me him broken.  If I could fix him I could keep him – thankfully an easy fix, as the battery was dead.  This model is very simple in its function and behaviour: it has no mapping capability, no dock and no wireless control.  It can apparently be controlled using IR but I’ve never had a remote.  It also lacks the diagnostics port that makes hacking the newer models really easy, so this is going to be a bit trickier, a lot more educational and, most importantly, more fun!

The parts I’ve used to partially resurrect him are an Arduino Leonardo and an Adafruit Motor Controller Shield.  I’ve also a Raspberry Pi 3 to add into the mix, for WiFi control and more processor intensive tasks.  The idea is to use the two to their strengths; the Arduino will control the motors and read the sensors, allowing for real time override in case of collision, and the Pi will be able to sit back and give out the orders.  It’s a classic split for mobile robots and thankfully very cheap to implement now.

Current State

As I said, I’ve been working on this for a while; I’ve a load of notes to type up and loads of “learning experiences” to share.  Mostly from when I made a rookie error and burnt out one of the motor controllers…  I’ve now got the motors under control over serial, and I’ve also a simple console application that lets me drive him around and toggle the sweeper/vacuum fans; here’s a video of him in action:

Next Steps

My next item to look at is getting sensor data into the Arduino, first up the encoders.  Encoders are devices that let you measure the movement of a wheel (you’ve likely used a rotary encoder on a hifi to control the volume) and the Roomba has one in each wheel.  Right now I can control how much power goes to each wheel, but because of differences in the state of the gearboxes, carpet and who knows what other factors, the wheels spin at different speeds.  By measuring the speed from the encoders we can compensate for this, and we can also use them to calculate the distance we’ve travelled.

After that is the rest of the sensors, those I’ve found so far are;

  1. Cliff sensors – these are under the bumper and detect drops to prevent him falling down stairs, I think there are four of them and they appear to be IR distance sensors
  2. Bumper sensors – these detect collisions, I think there is one at either end of the bumper so I’ll know if I’ve hit something to the left or right
  3. Wall sensor – another IR distance sensor mounted on the right of the bumper, this allows for wall following behaviour
  4. Wheel up/down switches – One on each of the drive wheels and one on the caster at the front.  They detect if the wheels are up or down and can be handy for detecting when we’ve beached ourselves.
  5. Wheel encoders – these use IR LEDs and a light-dependent resistor.  I blew one of the LEDs by accident so replaced them both with red LEDs.
  6. Beacon IR Receiver – Not sure how this works yet; it’s a 360° lens on the top that receives a beam from the virtual wall, a device you place by your door to stop him escaping.  I’m hoping to add more sensors to make this redundant.
  7. Buttons – there are three buttons for room size to start different cleaning cycles.  They could be useful though I may not bother with them.

Once I’ve all the sensors connected I’ll be able to hook up the Raspberry Pi to start working on reproducing his original behaviour.  After that I’ll be able to build up his capabilities over time and add a load of new ones too.  I’m not intending this just to be a hoover but a mobile robot platform that happens to be able to hoover.

If you’ve got this far, kudos!  That’s it for now, more posts will trickle out as I write up my old notes.  I’m going to carry on having fun building things and write posts in the order they happened.  Hopefully I’ll catch up fairly quickly!

Satellite Applications Catapult Inventorthon – Bring Your Own Disaster

Videowall – (Operations room) 18 1080p screens, eight dual monitor workstations

The Satellite Applications Catapult are very lucky to have the facilities they have and believe they have the potential to help save lives. To that end they’re opening their toy box for everyone to play with to try and do just that at their next hackathon. The premise is simple; in a disaster scenario, how would you use their facilities to best help those in need?

They have some data from the recent Nepal earthquake to work with but if you have a scenario of your own then feel free to bring your ideas with you. Some of the ideas we’ve had so far include;

  • Earthquake/natural disaster response
  • Coordination of people/aid
  • Search and rescue

They have a great range of kit at their disposal including;

  • Two videowalls, both with 24 cores, 256GB RAM and at least three NVidia Quadro K6000 graphics cards each. One has 28 720p monitors at 9562×3072 and the other a whopping 18 1080p monitors to a resolution of 11512×3240! These graphics cards have 12GB RAM and 3072 processing cores EACH!
  • An Oculus Rift DK1 – a virtual reality headset
  • Two Kinect 2 for Windows sensors – These are the new sensors based on the Xbox One design; each can track six people with full colour and depth support, and they also have an excellent microphone array.
  • A Leap Motion sensor – this enables incredibly fine gesture tracking to help create more natural gesture based control systems.
  • Multiple large touchscreen devices – These include a number of electronic whiteboards with pen support and two four screen mini-video walls.
  • A Microsoft Surface Table – The SUR40, not to be confused with a tablet! This is a table PC that supports not only 50 simultaneous touches but also tag support, a great example of what it can do is NUIverse
  • A Microsoft Gadgeteer Kit – including loads of modules, this platform allows for rapid prototyping of hardware devices and includes GPS and GPRS (2g mobile data) support.
  • A van decked out with sat coms kit (I’m calling shotgun for the Zombie Apocalypse)
  • Raspberry Pi and Arduino devices – small single-board computers and microcontroller boards that can be programmed to perform functions without need of a full PC
  • 3D HD Dual Projection Facility – A large (5.5m x 1.9m) 3d HD projection system that can be used by up to 34 people simultaneously
  • Ovei Multimedia Pod – a multimedia pod that includes a surround sound system
  • AIS transceivers, a safety of life at sea transponder system
  • 3D screen
  • Parrot AR Drone – This can be controlled via wifi and has GPS and cameras on board
  • A transparent rear projected touchscreen


Satellite Applications Catapult, Harwell Campus
Dual 3D HD projector suite
Videowall (Saturn) – 28 720p screens.



The rest is up to you!

The event will be held at their office at Harwell Campus on September 12/13th, you are welcome to crash overnight too. To register for the event click here and add yourself to the group and event on Facebook to join the discussion!

Unity UI for DeskCycle Arduino Speedometer

[Updated, see end of article]

Since my last post I’ve improved the Arduino speedo code to respond to requests and also tweaked the gear ratio to be a bit closer to my road bike.  I’ve also implemented a simple speedo interface using Unity.

Unity DeskCycle Speedometer

The Unity application also automatically searches the COM ports on the machine until it finds the Arduino speedo; it’s a bit hacky but it works, and it means I don’t have to implement a COM port selection UI.

I’m going to add functionality to save off the readings to CSV log files too, at a later date I’ll add some kind of analysis in but getting the data saved is the important thing for now.

Update:  CSV functionality has been added and I’ve tweaked the interface too.

DeskCycle Arduino Speedo
DeskCycle Arduino Speedo Interface

For the Arduino portion of this weekend long hack, see here.

Arduino Speedometer for the DeskCycle

All my previous jobs were based in the town I live in so I used to be able to cycle to work, with my current job it’s far enough away I can’t reasonably cycle to work.  As such, and as the company is jokingly referred to as the “Cake-apult” for the amount of cake we seem to go through, my weight has inevitably increased.  To try and remedy this I’ve recently purchased a DeskCycle.  I would like to give a walking desk a go at some point but this seems a far easier solution and as sitting down for extended periods is linked to many problems I figured it worth a go.

It arrived earlier this week and I managed to cycle while sat at my desk for over two hours each day; I felt knackered by the end of it so it was certainly having an effect!  The only issue for me is the speedo.  The creators of DeskCycle designed the device such that the speedo is accurate when the resistance is set to maximum, which results in the speed and calories being calculated too high if you have the resistance set lower.  They provide a calorie calculator to give a more accurate set of results once you’ve punched in the values your speedo provides.

On to the how;

Looking at the bike it seemed the speedo works in the same way as the speedo on my road bike: a switch is closed once per revolution of the flywheel.  I connected my multimeter to it in continuity tester mode and it confirmed my theory.  As the bike uses a 3.5mm headphone jack for its cable it was simple enough to make a cable to connect to the header on my Arduino.

DeskCycle/Arduino Speedo Cable

The cable has a 3.5mm headphone jack at one end, tip and ground in use, and a pair of header pins at the other.  Connected to the Arduino via a bit of breadboard, I’ve connected using pin 7 in pullup mode with the other end of the switch connected to ground.

DeskCycle/Arduino Speedo Pinout

Once connected, I followed the timer tutorial provided by Amanda Ghassaei to calculate the RPM by counting the interval between revolutions.  One thing I learned is that the millis() function uses timer0 internally, so if you want to use that function and a timer interrupt then use timer1 or timer2.

The code can be found on github in the dcspeedo repo.

DeskCycle/Arduino Speedo Debug Screenshot

Next up is a simple application that reads the RPM and calculates speed and distance to display it on my PC to start with.  I’m intending to add some cool functions like map integration to do virtual challenges such as Lands End to John O’Groats and similar which should be good for a laugh.

Also, this same code will be the basis of the digital speedo adapter for my Mini so two birds with one stone!  As practice for my Mini speedo, and more practice for stuff for work, I’m going to write it using C# and Unity 5.

Update:  The Unity part is done, more information here.

Ending Illegal Fishing Using Games Technology

As we’ve just had our big media launch of the project I thought I’d share some information about the project I’ve been working on as part of my job at the Satellite Applications Catapult here in the UK.

For the past 18 months we have been working on a project with Pew Charitable Trusts with the goal of building a system that uses satellite derived data to track and deter illegal fishing at sea. Around one in five fish sold today is illegally caught, so it’s certainly a big problem to tackle.

Our system uses a live feed of vessel positions, currently provided by ExactEarth, using satellite AIS (Automatic Identification System) combined with a few other data sources to create a near real-time and historical view of fishing activity around the world.

We’re using the Unity games engine for data visualisation and as the interface to the system, both on our video wall and desktop machines, and we have a tablet version in the works. The video wall runs at 11536×3252 and Unity runs a treat!

We’ve a lot of vessels tracked at any point in time, all of which are rendered and animated on the screen at an accelerated rate for analysis. We are building the system using MMOs as inspiration as we’ve all seen how it’s possible to organise and work in a large group of people in raids to achieve a common goal. Having analysts working globally with large sets of data, it seemed a good model of interaction to follow.

A few screenshots and a video of the software in action are below, as the son of an engineer and ecologist I’m happy to say I’m proud of the work we’re doing and this is only the beginning!





A full screen, compressed, screenshot from our operations room running the Virtual Watchroom software as part of Project Eyes on the Seas.

More information on the project can be found here;

We’re considering doing a live presentation of the system in a few weeks using Twitch or similar; if anyone would be interested in a demo, or if you’ve any thoughts or comments, please feel free to leave them below.


Dad’s Clock

Work has been mental for a few months so despite doing odds and sods on Hugo and a buttload of work on our illegal fishing project at work (hopefully more on that soon) I’ve not posted anything in a while so thought I’d post about a clock I’m building for my Dad.

My dad’s an engineer, heavy fabrication mostly, designing and building access platforms for the nearby oil refinery for over 40 years, and since he drove me to school on my first day on his crane he’s been a big influence.

Dad's Crane

As a reference to his engineering heritage and my current work  with satellite data I’m building an Arduino based clock that uses the GPS time signal to set the time and uses a set of voltage panels as the face.  The GPS receiver is the Adafruit Ultimate GPS board and I’m using a DS1307 based board for the real time clock.  The Ultimate GPS board is a bit OTT but it does allow me to receive a time signal indoors, as I only need one signal to get the time rather than the multiple signals needed for a full fix it works quite well.

The code is a work in progress but you’ll find it on my github page.

Geoserver, Leaflet Angular Directive and viewParams

Catchy title, I know…

In the big GIS project I’m working on we have a large set of data, 100 million+ rows, which needs to be rendered as an overlay.  This turned out to be tricky as I’m not really a GIS developer and I’m learning as I go.  I figured it out through trial and error so I’m writing this post to help those like me; hopefully I’m using the keywords I was searching for so that SEO picks this up!

Our frontend architecture is as follows;

To create the tiles in Geoserver we ended up creating a parametric SQL layer, more info on that here, and we are adding it as a WMS layer.  What I discovered in the end is that to pass the viewParams through to Geoserver you need to add a layerParams option with viewParams within it, for example with start and end as SQL parameters;

layerParams: {
    viewParams: "start:'2014-01-01T00:00:00.000Z';end:'2014-01-31T00:00:00.000Z'"
}
In a WFS call the equivalent is to append a viewparams parameter to the request URL; one thing to note in both cases is to properly escape the special characters.
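For example, a GetFeature request with the same illustrative dates as above might look like this (workspace and layer names are placeholders, and special characters should be URL-encoded as needed):

```
http://example.com/geoserver/wfs?service=WFS&version=1.1.0&request=GetFeature
    &typeName=myworkspace:mylayer&outputFormat=application/json
    &viewparams=start:'2014-01-01T00:00:00.000Z';end:'2014-01-31T00:00:00.000Z'
```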

Hope this helps!

JavaScript IDE, Git and Deployment

So I was working on a set of PowerShell scripts to automatically commit and deploy HTML, JavaScript and CSS, but I’ve just discovered WebStorm, which does all this and also allows for live preview and debugging of code, so I’m not going to bother anymore!

After installing it I quickly managed to configure auto-deployment to a Linux server via SFTP and live previews in Chrome.  Having spent the last few months getting to grips with JavaScript programming I have to say I’ve missed having a good IDE; live preview of code that updates on the fly really is a brilliant touch and should make things far easier moving forward as I rewrite our UI using AngularJS.

Note:- I had an issue testing my SFTP connection when I was configuring the deployment location and received the following error;
“Conection to ‘’ failed.  Invalid descendent file name “Accept:”.”

It turns out that I’d managed to somehow create a file called “Accept:” in my home folder in Ubuntu and the WebStorm IDE doesn’t appear to deal well with filenames that contain ‘:’, at least I think that is why.  All I know is I deleted the file and was then able to connect without issue.  Hope this helps someone as it took a while to figure out this morning in my pre-caffeinated fugue.