In my previous post I talked about some issues I hit when I upgraded from Ubuntu 14.04 to 16.04. It wasn't all plain sailing, and in this post I'll cover the issues I've had getting Greyhole back up and running.

At the end of the last post I had my "missing" disks mounted and I mentioned I was moving data around.  Thankfully the two disks that were mounting fine were two of my largest, 4TB and 2TB worth; the two that weren't mounting are 2TB and 3TB.  After deleting a load of old files and reducing the redundancy level on the non-critical shares it looked like I'd have just enough space to make things easier.

One at a time I ran the command to remove a disk from the pool and waited for Greyhole to finish balancing:
greyhole --going=/mnt/three/gh

You can see what the Greyhole service is doing by running "greyhole -L"; once it tells you it is sleeping you can crack on with the second disk.
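That drain-and-wait loop can be sketched as a small script.  This is a sketch only: the second pool path (/mnt/four/gh) is my stand-in, as only /mnt/three/gh appears above, and it assumes the --going and -L behaviour described in the post.

```shell
#!/bin/sh
# Retire pool drives one at a time: each --going tells Greyhole to copy
# that drive's files elsewhere in the pool before it is removed.
if command -v greyhole >/dev/null 2>&1; then
    for drive in /mnt/three/gh /mnt/four/gh; do
        sudo greyhole --going="$drive"
        # Poll the daemon's log until it reports it is sleeping (idle)
        # before starting on the next drive.
        until sudo greyhole -L | grep -qi sleeping; do
            sleep 60
        done
    done
else
    echo "greyhole not installed; commands shown for reference only"
fi
drained=yes
```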

This completed and I was able to see my files from a remote machine via Samba, huzzah!  The problem was that the install wasn't tidy any more: I couldn't control Greyhole using the service command and the landing zones were on a disk I was intending to reformat.  I tried unsuccessfully to fix it but in the end decided to follow the steps to reinstall it.  From the perspective of the documentation this is the same as migrating to a new machine.

First off I ran "sudo apt remove greyhole --purge", which removes the service with extreme prejudice, and I then followed the standard steps to install as per this page.  I restarted Samba and Greyhole after running the fsck command and lo and behold I got most of my shares back online!  Two were showing up fine, full of files; one was showing up but empty.  This was my backups share, which was a little worrying, but I'd already backed it up to another machine so it wouldn't be a big issue to rebuild.
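The reinstall boiled down to something like the following.  A sketch under assumptions: the service names and the restore step in the middle are mine, not from the Greyhole docs, though --fsck is the command the post refers to.

```shell
#!/bin/sh
# Nuke the broken install (package plus config), reinstall, then ask
# Greyhole to re-link every file in the pool back into the shares.
if command -v greyhole >/dev/null 2>&1; then
    sudo apt remove --purge greyhole
    # ...reinstall and restore greyhole.conf/smb.conf per the install docs...
    sudo service smbd restart
    sudo greyhole --fsck          # walks the pool and restores the share files
else
    echo "greyhole not installed; commands shown for reference only"
fi
reinstalled=yes
```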

It turns out that when I was configuring the smb.conf and greyhole.conf files I called the backup share “Backups” rather than “Backup” and this meant that Greyhole couldn’t find the files to make them accessible again.  I fixed this typo, ran fsck again and they are now showing up.
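For reference, the share name has to match exactly between the two files.  A hypothetical sketch of the matching entries (the paths are made up):

```ini
# /etc/samba/smb.conf -- the share definition
[Backup]
    path = /mnt/samba/Backup
    vfs objects = greyhole

# /etc/greyhole.conf -- "Backup" here must match the smb.conf section name;
# "Backups" would leave the share empty, as I found out.
num_copies[Backup] = 2
```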

Regarding the other two drives, it looks like I'd initialised them as zfs_members at some point, and with Ubuntu 16.04 they can't be mounted in the same way.  It's a vaguely educated guess so I'm happy to be corrected!  To get rid of them I used the wipefs tool, which strips the drive bare of partition signatures.  BE VERY CAREFUL WITH THIS!

I ran "wipefs --all /dev/sdc" and "wipefs --all /dev/sdd", which seemed to do the trick.  After that I followed this guide to format my drives using parted.  I've no idea why, but blkid still doesn't show the UUIDs for the partitions I created, so I took note of them from the output of the mkfs.ext4 command.  I put them into fstab, created folders to mount against alongside the other two drives, and ran "sudo mount /dev/sdc1" and the same for sdd1; they then showed up!
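The wipe-partition-format-mount sequence looks roughly like this.  Because wipefs and mkfs are destructive, every disk command below is echoed rather than executed; the partition layout and fstab fields are my assumptions, and the UUID is a placeholder for whatever mkfs.ext4 prints.

```shell
#!/bin/sh
# DESTRUCTIVE if run for real, so each command is echoed rather than
# executed -- remove the echo once you are sure of the device names.
for dev in /dev/sdc /dev/sdd; do
    echo sudo wipefs --all "$dev"                    # strip the old zfs_member signatures
    echo sudo parted -s "$dev" mklabel gpt mkpart primary ext4 0% 100%
    echo sudo mkfs.ext4 "${dev}1"                    # note the UUID this prints
done

# One fstab line per partition, built from the UUID reported by mkfs.ext4
# (placeholder UUID and mount point here):
uuid="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
fstab_line="UUID=$uuid /mnt/three ext4 defaults 0 2"
echo "$fstab_line"
```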

Finally I added the two drives to the Greyhole storage pool by following this guide and ran "greyhole --balance".
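Adding the drives back is a couple of lines in greyhole.conf plus a balance.  A sketch: the pool paths and min_free values are assumptions based on my mount layout, not taken from the guide.

```shell
#!/bin/sh
# Add the two freshly formatted mounts to the storage pool, then spread
# the existing files across all of the drives.
if command -v greyhole >/dev/null 2>&1; then
    # greyhole.conf gains one storage_pool_drive line per disk, e.g.:
    #   storage_pool_drive = /mnt/three/gh, min_free: 10gb
    #   storage_pool_drive = /mnt/four/gh, min_free: 10gb
    sudo service greyhole restart
    sudo greyhole --balance
else
    echo "greyhole not installed; commands shown for reference only"
fi
balanced=yes
```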

A massive faff but a great learning experience!


So I logged in to my home server recently and found in the MOTD that an upgrade from 14.04 to 16.04 was available.  Being a bit cautious about things I asked a colleague if he'd done the upgrade; he had, and the only issue he'd come across was with hardware I don't use, so I thought I'd crack on.

That night I got home, ran do-release-upgrade, answered a few questions and left it to it.  I carried on tinkering with one of my programming projects on my desktop PC and several hours later, tired after a satisfying night's hacking, I shut down my desktop.  Completely forgetting I had an SSH session open…
I promptly logged back on and checked my server; in htop there was a process at the top of the list that looked upgrade related so I left it to it overnight.  It turns out that as I didn't have screen installed there was no way to reconnect to that upgrade session, which was an arse to say the least!  I didn't have a choice, that I knew of, but to reboot.  I did so and it kernel panicked on boot, something to do with not being able to mount something.



::Expletive deleted::

I loaded in to maintenance mode by selecting the advanced option on reboot and looked at what was or wasn't mounted.  It turns out two of my four disks weren't being mounted by fstab on boot; I ran blkid and they weren't listed either.  I managed to find the following command on Ask Ubuntu which showed that the disks were still being detected, which was a good sign.



I managed to manually mount the disks as ext4 and could access the data, so I figured this was a quirk of 16.04 I needed to work out.  So far so good!  I commented the two drives out of fstab and attempted a reboot; I got a bit further but ended up in maintenance mode again.

This time around I did some more digging and found the "dpkg --configure -a" command, which reconfigures all the installed packages.  This is recommended for interrupted installations and for me it worked a treat.  I could now boot normally!

As previously mentioned I use Greyhole for file duplication across my disks; for long-time readers of my blog, or those familiar with it, it's very similar in concept to Drive Extender on Windows Home Server.  Greyhole wasn't happy.  First off it complained about PHP and MySQL errors, so one by one I searched for each error line and installed the missing packages.  After that I managed to get Greyhole running against the manually mounted disks and I'm now moving data around so I can reformat the two odd ones out that are listed as zfs_members and get them in line with the others.  That's in progress and I'll cover it in another post as this one has rambled on long enough.

It has certainly been a learning experience and I've got nerd points from my colleagues for actually managing to fix a borked upgrade; apparently most people would just reinstall, but I figured I'd have a stab at it.  For a certified Windows fanboy I've certainly come a long way!


I thought I’d write up the steps I’d take to set up the Raspberry Pi 3 I’m using on my Roomba, including wifi and the rest, then discovered PiBakery and frankly this post writes itself!

PiBakery is a tool for Windows and Mac which makes configuring a new Pi a block-based affair.  It keeps up to date with the latest version of Raspbian too.  Basically you select blocks from the left hand side, change the values and once you are happy you write to an SD card by clicking “write”.  As I’m running headless on the Roomba, being able to configure without the faff of plugging in a keyboard and mouse is brilliant; it’s a little thing but they add up.

From the screenshot you’ll see that on first boot I configured the wifi and SSH key, changed the hostname and set the Pi to boot to console to save resources.  I was dubious, but I plugged in the SD card, gave it power and sure enough it appeared on the network a few minutes later.  The only step I took afterwards was to install XRDP, which is handy for debugging and for deploying new code to the Arduino directly from the Pi.  You can install packages as part of the setup process too and I’ll certainly be doing that next time now I know what I want.

I’ve also used the same method with the Pi Zero to turn it into a USB gadget, which worked a treat.


A long while back now I bought a DeskCycle to use at work to help my body stay more active whilst at my desk, standing desks aren’t an option so this seemed ideal.  I’m also a massive geek, which is a massive surprise I know, so I built a PC interface for it using an Arduino and a desktop display using a Unity application.  I’ve been using this for the last year and a half according to the CSV logs.

The DeskCycle has developed a squeal at certain speeds so I thought I’d throw a tweet towards the manufacturers to ask how to oil it, and quote my distance too.  I used a simple PowerShell script to get a CSV of the total distance for each day then threw it into Excel.


As of about half an hour ago I’ve cycled a virtual 3159 miles at an average 9.46MPH.  Damn I need to add an odometer to my display!

The PowerShell script is as follows for those interested; it gets a list of all CSVs, takes the last line from each and spits it out into a new file.  Very handy!

$alldata = "DateTime,Speed (MPH),Cadence (RPM),Distance (Miles),Duration (HH:MM:SS)`r`n"
Get-ChildItem -Filter *.csv |
Foreach-Object {
    # Grab the final (most recent) row from each day's log
    $content = Get-Content $_.FullName -Tail 1
    $alldata += $content + "`r`n"
}
$alldata | Out-File alldata.csv


While I was doing the initial tinkering with the Roomba to figure out what made it tick I made a load of video logs more for my reference than anything.  I’ve put them up on YouTube and will have to remember to carry on doing them…


Before I can rebuild Bumblebee, my 1st generation Roomba, I need to figure out how he works.  I’m going to split this into three sections: power, motors and sensors.  I’ll cover how to interface with each of these in future posts.


This was simple enough: I charged the battery and put a multimeter across the terminals, which showed 16V.


A quick count shows that there are five motors.  One for each wheel, one for the brush motor, one for the side sweeper and one for the vacuum.  From the fact they all seem to have a black and red wire going into them and from the age of the device I took an educated guess and assumed they are simple DC motors.  In order to test this theory I took the probes from my voltmeter, plugged them into my bench supply and poked at the motor terminals with the voltage and current limit set low.  With this simple setup I was able to give the motors different voltages and easily reverse the polarity, sure enough the speed changed with voltage and direction changed with polarity.  The wheel motors will need to run in either direction but the other three only need to run in one direction.


There turned out to be a lot more sensors than I realised and it’s quite a packed little robot!  The sensors fall into two categories: IR sensors and switches.  The microswitches are on either drive wheel and the caster wheel at the front; it looks like all three are currently wired to the same header, so the robot knows only that a wheel is up, not which one.  The rest of the sensors are a bit more convoluted.

Wheel Encoders

The drive wheels have an encoder each with four wires going in; once I’d opened one up it turned out they are made up of an IR LED and a light-dependent resistor.  I checked whether they were IR by giving them just over a volt: there was no visible light, but through my phone camera I saw the telltale purple glow.  Shortly after this I realised the error of my ways as the LED went out; without a current-limiting resistor I’d burnt it out!  Thankfully the LDR works with visible light too, so I ended up replacing the LEDs on both sides with red ones.

Cliff Sensors

Along the underside of the bumper there appear to be four cliff sensors, again IR LED/LDR combos, which in this configuration are known as IR distance sensors.  I used these long ago when I built a PIC16F84-based robot at college, so they aren’t a mystery.  The resistance of the LDR varies depending on how much light bounces back; you need to calibrate them in your code or circuit, but they are simple enough.

Wall Sensor

This is an IR distance sensor on the right hand side of the bumper; it works the same way as the cliff sensors.


Bumper Sensors

This one confused me for a while as I couldn’t see any switches on the ends of the arms of the bumper.  I ended up taking the bumper out, which required removing the logic board, and the penny dropped.  At either end of the logic board there is an IR LED/LDR pair, and when the bumper is hit the light level changes.  I wondered at first why they didn’t just use a switch, but the video linked at the top of this page explained it all: a switch would be hammered so often it would fail in no time, and the design of the bumper mount also cleans the area between the LED/LDR, which is handy.

IR “Eye”

On the top of the bumper at the front is a 360-degree lens which directs light onto an IR sensor of some kind; I’ve not dug deeper into this one yet.  I believe it acts like the IR receiver for a TV remote, as it is used with the Roomba’s virtual wall.  If the robot detects the IR code being sent out by the virtual wall it acts as though it hit a solid object, which is useful for preventing your hoover from escaping.


I’ll cover how I use each of the above in upcoming articles.


I’ve just realised that the .engineering TLD exists so I’ve bought and I’ve pointed it towards this site.  The old and links will carry on working but this is the new URL and I can’t imagine it changing in a hurry!




Bumblebee is my Roomba, so named as long ago he lost his voice.  About a year ago his logic board started playing up and though he was still able to clean, at the end of each cleaning cycle he wouldn’t go into standby and his battery would drain in no time.  At that point he stopped actively gathering dust and started doing it passively as he sat behind my sofa.

Bumblebee MK2

Since I was a kid I’ve always wanted to build a robot, so I figured I’d kill two birds with one stone and use Bumblebee as the chassis for a mobile robot (he already is one, after all), while also aiming to restore his base functionality as a robot hoover.

The Plan

Bumblebee is an original model Roomba from 2002; he was a gift from a friend who knew I loved to tinker and gave him to me broken.  If I could fix him I could keep him, and thankfully it was an easy fix as the battery was dead.  This model is very simple in its function and behaviour: it has no mapping capability, no dock and no wireless control.  It apparently can be controlled using IR but I’ve never had a remote.  It also lacks the diagnostics port that newer models have, which makes hacking a modern Roomba really easy, so this is going to be a bit trickier, a lot more educational and, most importantly, more fun!

The parts I’ve used to partially resurrect him are an Arduino Leonardo and an Adafruit Motor Controller Shield.  I’ve also added a Raspberry Pi 3 into the mix for wifi control and more processor-intensive tasks.  The idea is to use the two to their strengths: the Arduino will control the motors and read the sensors, allowing for real-time override in case of collision, while the Pi sits back and gives out the orders.  It’s a classic split for mobile robots and thankfully very cheap to implement now.

Current State

As I said I’ve been working on this for a while, so I’ve a load of notes to type up and a load of “learning experiences” to share.  Mostly from when I made a rookie error and burnt out one of the motor controllers…  I’ve now got the motors under control over serial, and I’ve also a simple console application that lets me drive him around and toggle the sweeper/vacuum fans.  Here’s a video of him in action:

Next Steps

My next item to look at is getting sensor data into the Arduino, first up the encoders.  Encoders are devices that let you measure the movement of a wheel (you’ve likely used a rotary encoder on a hifi to control the volume), and the Roomba has one in each wheel.  Right now I can control how much power goes to each wheel, but because of differences in the state of the gearboxes, the carpet and who knows what other factors, the wheels spin at different speeds.  By measuring the speed from the encoders we can compensate for this, and we can also use them to calculate the distance travelled.

After that is the rest of the sensors; those I’ve found so far are:

  1. Cliff sensors – these are under the bumper and detect drops to prevent him falling down stairs, I think there are four of them and they appear to be IR distance sensors
  2. Bumper sensors – these detect collisions, I think there is one at either end of the bumper so I’ll know if I’ve hit something to the left or right
  3. Wall sensor – another IR distance sensor mounted on the right of the bumper, this allows for wall following behaviour
  4. Wheel up/down switches – One on each of the drive wheels and one on the caster at the front.  They detect if the wheels are up or down and can be handy for detecting when we’ve beached ourselves.
  5. Wheel encoders – these are IR LEDs paired with light-dependent resistors.  I blew one of the LEDs by accident so replaced them both with red LEDs.
  6. Beacon IR receiver – not sure how this works yet; it’s a 360-degree lens on the top that receives a beam from the virtual wall, a device you place by your door to stop him escaping.  I’m hoping to add more sensors to make this redundant.
  7. Buttons – there are three room-size buttons that start the different cleaning cycles.  They could be useful, though I may not bother with them.

Once I’ve all the sensors connected I’ll be able to hook up the Raspberry Pi to start working on reproducing his original behaviour.  After that I’ll be able to build up his capabilities over time and add a load of new ones too.  I’m not intending this just to be a hoover but a mobile robot platform that happens to be able to hoover.

If you’ve got this far, kudos!  That’s it for now, more posts will trickle out as I write up my old notes.  I’m going to carry on having fun building things and write posts in the order they happened.  Hopefully I’ll catch up fairly quickly!


I have a lot of projects on the go, some stalled and some my current obsession, and that’s ok.  I’ve previously beaten myself up about not getting things finished, but the truth is it’s just how my brain works, so I’ve decided to work with the meatsack between my ears rather than fight it.

Here is my usual project lifecycle:

  1. Discover new thing
  2. Obsess over thing
  3. Buy parts to build thing
  4. Hit limit of knowledge
  5. Get frustrated and shelve it

Projects I’ve got in flight are always in the back of my mind and occasionally I’ll learn something new which mitigates stage 4, and I’ll start back at stage 1 again, rinse and repeat.  I sat down and looked over my projects and realised most of them shared common themes in some way, usually in the tech involved, but all use it in slightly different ways for very different results.  To that end I wrote a list of my projects and what each involves at a high level, and put ticks in boxes to see where they overlap.

You see, having multiple overlapping projects on the go is a great thing.  It means if one of them starts kicking your arse you can switch to another!  It cleanses the mental palate by giving you a context switch, and if they use similar tech you might just get some inspiration to help elsewhere.

Mostly, it keeps me making.  After all, if one of the most notorious makers in the community has self doubt every now and then it seems I’m in good company.

Next up, I’m planning on adding my ongoing projects to the site to make this more of a work log again.  I’ve been up to all sorts but mostly using OneNote for note taking and leaving this place fallow.


I’ve a Blackmagic Intensity Pro capture card in my machine at work; initially we were going to capture video for webcasts from a professional camera but that never came to pass.  It’s been sat in my machine gathering dust, so I thought it would make quite a nice virtual monitor for embedded devices.  As I’m playing with Raspberry Pis more at work it’s an ideal way of not having to faff around switching inputs on my monitor.

Required software:  VLC and the Blackmagic drivers for the card.

Plug your HDMI device into the capture card; looking at the back of your machine it’ll be the port closest to your motherboard.  This may work for DVI capture too, but I’ve not tried it.

Open VLC then File -> Open Capture Device.  Select Decklink Video Capture, enter 1920×1080 for the video size, click on Advanced options and change the aspect ratio to 16:9.  Click Play and you should have a picture, albeit slightly laggy.  I’m using it as a virtual monitor so that’s not an issue; if you are capturing a stream from a console it may be irritating.  I used to have a way of minimising this when capturing from a webcam but couldn’t remember it; in the end, clicking on “More options” and setting caching to 0 ms seems to have done the trick for me.

This one was more a note for me for the next time I try to use the card for this use and have forgotten, thought it may be useful for others.

Update:  To shortcut the whole process, this works nicely as a command line and if you create a shortcut to VLC and pass the arguments in accordingly you can open straight into the stream:
vlc dshow:// :dshow-vdev="Decklink Video Capture" :dshow-size=1920x1080 :dshow-aspect-ratio=16\:9 :live-caching=0

Pretty sure if you change the device to the name of any capture card or webcam this should work, not tested so your mileage may vary.
