Ball flinger

We were so impressed with last year’s ball flinger from Hitchin Hackspace, that we’ve decided to do our own take on it for this year.

It was a struggle making it fit in the 100mm space we are allowed, but I think it should work.  *crosses fingers again*


And of course with just a few days to go, we’ve had a major disaster on the robot.  This tiny gear has cracked:

And so it won’t stay on the shaft.  Sadly, it’s the main pinion gear of one of our motors, so we’re down a motor and we don’t have a spare. 🙁

We quickly ordered another from China (but that won’t arrive before PiWars).  We also ordered some similar-looking gear motors from Ireland – hopefully those will turn up in time, and the gearboxes will be similar enough that we can either transplant the entire ‘box or just steal the pinion gear from them.  *crosses fingers*

Or we’ll have to go to PiWars with a lame robot 🙁

Electrical Gremlins

Imagine my dread when I woke up to this message from John:

“Are you around tomorrow? I could do with some advice. I’m having an issue with the propeller board which is resetting itself randomly when the motors run at any speed. I think it’s an electrical problem rather than software, but really struggling to figure out what’s going on.”

The usual advice here – add decoupling capacitors everywhere!

After lots of hunting around, adding new 0.1uF caps to suppress the motors (one cap between the terminals, one from each terminal to the case), John eventually figured it out:

“I’ve been probing around with my ‘scope this morning and the power supply looks dead smooth, so my latest theory is that the reset line is the culprit and is picking up spikes from the motor. ”

And sure enough – a cap on the reset line to our Propeller hat fixed it!  Phew…

Brushless motors

Brushless motors are amazing.  Massive power to weight ratio – which is why they’re used heavily by the model flying community in their planes and quadcopters.

Their mass adoption also means they’re cheap and plentiful!

They’re basically 3-phase motors, which means they have 3 sets of coils inside.  They also have permanent magnets attached to the rotor, and by energising the coils in sequence (and changing the polarity), you can make a rotating magnetic field which drags the rotor magnets around.  You have to be quite precise, however, especially when starting up or dealing with variable torque – you need to know where the rotor is at all times.  Fortunately, someone has done all the hard work for you in the form of pre-packaged Electronic Speed Controllers (ESCs), which measure the rotor position by looking at the voltage induced in the coils themselves!  Genius.  And you can buy an ESC with this genius in it from eBay for a fiver!
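The “energising the coils in sequence” bit can be sketched in a few lines of Python – a simplified model of six-step commutation, not what any real ESC firmware looks like:

```python
# Simplified six-step (trapezoidal) commutation table for a 3-phase
# brushless motor.  Each step energises two of the three coils (A, B, C):
# +1 = connected to V+, -1 = connected to ground, 0 = floating.
# The floating coil is the one a sensorless ESC watches for induced
# voltage (back-EMF) to work out where the rotor is.
COMMUTATION = [
    # (A,  B,  C)
    (+1, -1,  0),
    (+1,  0, -1),
    ( 0, +1, -1),
    (-1, +1,  0),
    (-1,  0, +1),
    ( 0, -1, +1),
]

def coil_states(step):
    """Return the coil drive states for a given commutation step."""
    return COMMUTATION[step % 6]

# Stepping through the table in order rotates the magnetic field,
# dragging the rotor magnets around with it.
for step in range(6):
    a, b, c = coil_states(step)
    print(f"step {step}: A={a:+d} B={b:+d} C={c:+d}")
```

Reverse the order you step through the table and the field (and the rotor) spins the other way – which is exactly what a reversible ESC does.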

Of course if you’re going to use one in a robot, you really want it to go forwards and backwards.  And most ESCs on the market are single-direction (because for RC planes, that’s what you want).  So either look for reversible ESCs, or do it the hard way and re-program the ESCs you’ve already bought by mistake!

There are a couple of open source ESC firmwares out there – SimonK and BLHeli are the main ones.  You can download them, compile them and flash them to your ESC.

Some ESCs are helpful and include in-circuit programming pads to allow reprogramming, but if (like me) you bought cheap ones, you’ll have to break out the soldering iron and magnifying glasses to solder wires onto the right pins (see RCgroups) on the microcontroller:

Line following

Having failed at line following in previous years due to various problems with IR sensors (spacing, distance from the point of rotation, etc.), we’ve decided to go with mounting a Pi camera on the bottom of the robot and running an OpenCV algorithm to track the line.

OpenCV is amazing – you can do so much with only a few lines of code – e.g. face detection:

For line tracking, we’re finding the centre of the line with a Gaussian blur, then walking the gradient to the darkest points and joining them up:
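Our actual code isn’t shown here, but the gist can be sketched with plain NumPy – blur each row with a Gaussian kernel, then take the darkest point per row (function names are ours, for illustration only):

```python
import numpy as np

def gaussian_kernel(size=9, sigma=2.0):
    """1-D Gaussian kernel, normalised to sum to 1."""
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def line_centres(image):
    """Find the darkest point in each row of a greyscale image.

    Blurring first smooths out noise, so the minimum lands on the
    centre of the line rather than on a stray dark pixel.
    Returns one column index per row; joining them up traces the line.
    """
    kernel = gaussian_kernel()
    blurred = np.array([np.convolve(row, kernel, mode="same")
                        for row in image.astype(float)])
    return blurred.argmin(axis=1)

# Synthetic test image: white background with a dark vertical line
img = np.full((10, 40), 255, dtype=np.uint8)
img[:, 18:22] = 0
print(line_centres(img))   # roughly column 19-20 on every row
```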

Next job is to extract a vector from the image so we know which way to go.  We take the point where the line crosses the middle row, then do a polar transform and plot the amount of black in each direction.  The directions with the most black will be along the line – in either direction.  We then assume the correct direction is the one towards the front of the robot.
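A minimal sketch of that idea in NumPy (the naming is ours): bin the dark pixels by angle around the crossing point – a crude polar transform – and the peak gives you the line direction, in two opposite candidates:

```python
import numpy as np

def line_direction(mask, cx, cy, bins=36):
    """Estimate the direction of a line passing through (cx, cy).

    mask is a boolean image where True marks 'black' line pixels.
    Bins the line pixels by angle around the crossing point and
    returns the two opposite peak directions in degrees; the caller
    then picks whichever one faces the robot's front.
    """
    ys, xs = np.nonzero(mask)
    angles = np.degrees(np.arctan2(ys - cy, xs - cx)) % 360
    hist, edges = np.histogram(angles, bins=bins, range=(0, 360))
    peak = hist.argmax()
    centre = (edges[peak] + edges[peak + 1]) / 2
    return centre, (centre + 180) % 360

# Synthetic vertical line crossing the middle row at column 20
mask = np.zeros((41, 41), dtype=bool)
mask[:, 20] = True
print(line_direction(mask, cx=20, cy=20))
```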

You also then need to add behaviour for when the line is near the edge of the picture and in danger of not being in the next frame (i.e. move it back towards the centre)!  And code to deal with the case where there’s no line in shot at all (hunt around until you find it…)
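Those behaviours boil down to a simple decision function – here’s a sketch (names and thresholds made up for illustration; a real robot would map these onto motor speed commands):

```python
def steering_action(line_x, frame_width, edge_margin=20):
    """Decide what to do given the line's x position (or None if lost)."""
    if line_x is None:
        # No line anywhere in shot: spin slowly until we find it again
        return "hunt"
    centre = frame_width / 2
    if line_x < edge_margin or line_x > frame_width - edge_margin:
        # Line near the frame edge: steer hard to bring it back in
        return "steer_hard_left" if line_x < centre else "steer_hard_right"
    # Line comfortably in frame: steer gently towards it
    return "steer_left" if line_x < centre else "steer_right"

print(steering_action(None, 320))    # hunt
print(steering_action(5, 320))       # steer_hard_left
print(steering_action(150, 320))     # steer_left
```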

And finally, tuning.  Lots of factors affect how well the line following works – not least the speed of the robot and the update rate of your sensor/control loop.  Tuning can really only be done with a real course on the real robot, and you can get scuppered if the course designer has put sharp curves or hairpins on the course…

Good luck!