Brushless motors

Brushless motors are amazing. They have a massive power-to-weight ratio – which is why they’re used heavily by the model flying community in their planes and quadcopters.

Their mass adoption also means they’re cheap and plentiful!

They’re basically 3-phase motors, which means they have 3 sets of coils inside. They also have permanent magnets attached to the rotor, and by energising the coils in sequence (and changing the polarity), you can make a rotating magnetic field which drags the rotor magnets around. You have to be quite precise, however, especially when starting up or dealing with variable torque – you need to know where the rotor is all the time. Fortunately, someone has done all the hard work for you in the form of pre-packaged Electronic Speed Controllers (ESCs), which measure the rotor position by looking at the voltage induced in the coils themselves! Genius. And you can buy an ESC with this genius in it from eBay for a fiver!
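To make the “rotating field” idea concrete, here’s the classic six-step commutation pattern as a table – a generic textbook sketch, not the code inside any particular ESC (the coil names and step order are just the standard trapezoidal sequence):

```python
# Six-step (trapezoidal) commutation for a 3-phase brushless motor.
# Each step energises two of the three coils (A, B, C): '+' is driven
# high, '-' is driven low, '0' is left floating. The floating coil is
# the one a sensorless ESC watches for induced voltage (back-EMF) to
# work out where the rotor is.
COMMUTATION_STEPS = [
    {"A": "+", "B": "-", "C": "0"},
    {"A": "+", "B": "0", "C": "-"},
    {"A": "0", "B": "+", "C": "-"},
    {"A": "-", "B": "+", "C": "0"},
    {"A": "-", "B": "0", "C": "+"},
    {"A": "0", "B": "-", "C": "+"},
]

def step_for(position):
    """Return the coil states for a given step index (wraps round)."""
    return COMMUTATION_STEPS[position % 6]
```

Stepping through the table in order (and faster or slower as required) is what makes the field rotate and drag the magnets round with it.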

Of course, if you’re going to use one in a robot, you really want it to go forwards and backwards. And most ESCs on the market are single direction (because for RC planes, that’s what you want). So either look for reversible ESCs, or do it the hard way and re-program the ESCs you’ve already bought by mistake!

There are a couple of open source ESC firmwares out there – SimonK and BLHeli are the main ones. You can download them, compile them and flash them to your ESC.

Some ESCs are helpful and include in-circuit programming pads to allow reprogramming, but if (like me) you bought cheap ones, you’ll have to break out the soldering iron and magnifying glasses to solder wires onto the right pins (see RCgroups) on the microcontroller:

Line following

Having failed at line following in previous years due to various problems with IR sensors (spacing, distance from the point of rotation, etc.), we’ve decided to go with mounting a Pi camera on the bottom of the robot and running an OpenCV algorithm to track the line.

OpenCV is amazing – you can do so much with only a few lines of code – e.g. face detection: http://www.knight-of-pi.org/opencv-primer-face-detection-with-the-raspberry-pi/

For line tracking, we’re finding the centre of the line with a Gaussian blur, then walking the gradient to the darkest points and joining them up:
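As a rough sketch of that first stage (using a simple box blur in plain NumPy to stand in for OpenCV’s GaussianBlur, and a synthetic test image rather than a real camera frame – the kernel size is illustrative, not our tuned value):

```python
import numpy as np

def darkest_point_in_row(gray, row, ksize=5):
    """Smooth one image row, then return the column of the darkest
    pixel - a crude stand-in for finding the centre of a dark line.
    (The real code uses a Gaussian blur and walks the gradient, but
    the idea is the same.)"""
    kernel = np.ones(ksize) / ksize
    smoothed = np.convolve(gray[row].astype(float), kernel, mode="same")
    return int(np.argmin(smoothed))

# Synthetic test frame: white image with a dark line around column 40.
img = np.full((80, 80), 255.0)
img[:, 38:43] = 0.0
```

Blurring first matters: on a noisy camera image, the raw darkest pixel jumps about from frame to frame, while the blurred minimum sits near the centre of the line.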

The next job is to extract a vector from the image so that we know which way to go. We take the point where the line crosses the middle row, then do a polar transform and plot the amount of black in each direction. The directions with the most will be along the line – in either direction. We then assume the correct direction is the one towards the front of the robot.
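A simplified stand-in for the polar step – instead of a true polar transform, this just counts line pixels along rays in each direction from the crossing point (the angle count, ray length and test mask are all illustrative):

```python
import numpy as np

def line_direction(mask, cx, cy, n_angles=72, ray_len=30):
    """Given a binary mask (1 = line pixel) and a point on the line,
    count line pixels along a ray in each direction and return the
    angle (radians) with the most - i.e. the direction the line runs
    in. The opposite direction scores just as well; picking between
    the two is the 'towards the front of the robot' step."""
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    h, w = mask.shape
    best_angle, best_count = 0.0, -1
    for a in angles:
        count = 0
        for r in range(1, ray_len):
            x = int(round(cx + r * np.cos(a)))
            y = int(round(cy + r * np.sin(a)))
            if 0 <= x < w and 0 <= y < h and mask[y, x]:
                count += 1
        if count > best_count:
            best_angle, best_count = a, count
    return best_angle

# A vertical line through column 40: the best directions are up/down.
mask = np.zeros((80, 80), dtype=np.uint8)
mask[:, 40] = 1
```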

You also then need to add behaviour for when the line is near the edge of the picture and in danger of not being in the next frame (i.e. move it back towards the centre)! And code to deal with the case where there is no line in the shot (hunt around until you find it…)
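That behaviour can be sketched as a tiny decision function – the action names and the edge margin here are illustrative, not our actual tuning:

```python
def choose_action(line_x, frame_width):
    """Decide what to do given the detected line position in pixels
    (or None if no line was found in this frame)."""
    if line_x is None:
        return "hunt"            # spin slowly until the line reappears
    margin = frame_width * 0.2   # how close to the edge is "in danger"
    if line_x < margin:
        return "steer_left"      # bring the line back towards centre
    if line_x > frame_width - margin:
        return "steer_right"
    return "follow"              # line is comfortably in frame
```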

And finally, tuning. Lots of factors affect how well the line following works – not least the speed of the robot and the update rate of your sensor/control loop. That can really only be done on a real course with the real robot, and you can get scuppered if the course designer has put sharp curves or hairpins on the course…

Good luck!

Ragworm PCBs

The Ragworm PCBs arrived! Actually they arrived a while ago, but I’ve been too busy at work to do much with them. They’re a lovely orange with immersion gold pads.

These PCBs are for mounting hall sensors. They mount onto the back of the motor, above where the shaft sticks out. You glue (epoxy) a magnet to the shaft, and the hall sensor then gives you encoder pulses corresponding to the rotation of the motor.

Then you couple that with servo control software (we’ve written our own, running on a Propeller HAT from Pimoroni) and you have a high-power, continuous-rotation servo that can be powerful enough to run a CNC machine (depending on your motors, of course).
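The servo idea can be sketched like this – hall-sensor pulses feed a tick counter, and a proportional controller drives the motor towards a target count. The gains, names and update scheme are made up for illustration (the real code runs on the Propeller HAT, not in Python):

```python
class Servo:
    """Minimal position-servo sketch: encoder ticks in, motor power out."""

    def __init__(self, kp=0.5, max_power=1.0):
        self.ticks = 0            # updated by the hall-sensor edge handler
        self.target = 0           # where we want the shaft to be, in ticks
        self.kp = kp              # proportional gain (illustrative value)
        self.max_power = max_power

    def on_hall_edge(self, direction):
        """Called once per hall-sensor pulse; direction is +1 or -1."""
        self.ticks += direction

    def update(self):
        """Return the motor power (-max..+max) for this control step."""
        error = self.target - self.ticks
        power = self.kp * error
        return max(-self.max_power, min(self.max_power, power))
```

A real controller would add integral/derivative terms and velocity limits, but even this bare proportional loop turns a motor-plus-encoder into a crude position servo.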

PiWars 2017

Well – we’ve been beavering away in the background on the next Metabot, and we’ve taken 2 prototypes to the ‘remote control car’ stage so far.

The changes to skittles and the unknown quantity of the “slightly deranged golf” are causing mechanical design headaches, and we’ve got big plans for the software that might or might not come to fruition in time for the event. With less than 2 months to go, it’s certainly squeaky bum time!

This time I’ve tried experimenting with making my own PCBs:

IMG_20170121_180217

Before eventually deciding that I couldn’t get enough accuracy for a 0.5mm pitch SSOP device and getting Ragworm to make the boards for me.  I’ll post pictures of those when they arrive.  I’m sure I’ll have good enough accuracy for traditional through-hole PCBs at 0.1″ pitch though.

And here’s our latest prototype. TBD whether this will be the chassis we actually compete with, but it’s a good starting point for playing with the software:

IMG_1011 IMG_1012

Aftermath

PiWars 2015 was awesome. The most amazing thing was the number of different solutions to the same set of requirements. The competitors were friendly, the event was well organised and a good time was had (by us, anyway!). Tim and Mike should be very proud of what they’ve done here.

Metabot did us proud – coming second in the Obstacle Course and Skittles challenges and second overall in the “A4 and under” category.

Full results can be found here: http://piwars.org/2015-competition/results-of-the-2015-competition/

There are some things we should have done differently and things we did correctly.  We’ll be looking at what lessons we’ve learned over the next few weeks (and maybe prepare for PiWars 2016?!)

In the meantime, here are some videos of Metabot in action:


Final tinkering

Over the week we got the last hardware pieces finished – the ball flinger is complete and the body covers are cut and attached. And team mate Emily did a cracking job on the decorative touches (see below).

The rest of today is going to be a day of tinkering – software tuning and testing. The trick today will be to not regress anything – changes need to be small and self contained.

We’ll be at the meetup this evening – hope to see some of you there. I’ll leave you with a final glamour shot of the robot for those of you who won’t be there to see it in the flesh.

20151204_115044

The home straight

This time next week, we’ll be in the Computer Lab, setting up our mobile pit and staring in wonder at the other roboteers’ creations.

So where are we?

It’s always easier to list the things which still remain to be done, so let’s start with those:

  • Make more options configurable via the command line UI
  • Finish our ball flinger (for the skittles event)
  • Add decorative touches to the robot
  • Get the connection to the IMU working properly and write code to process the output into a useful form
  • Integrate that data into the event code
  • Lots of motion control improvements (smooth acceleration, tuning to get the most out of our motors)
  • 3 point turn code needs improvement

It’s clearly going to be a very busy week!

But what have we achieved?  If PiWars were today, how would we do?

  • Proximity Alert has code which would do, though it could always be closer
  • Speed Test has code which would do, though it could do with driving straighter
  • Pi Noon has code which would do and a wire holder which should work, but the driver needs more practice!
  • Obstacle course has code which would do (but see above for driver)
  • Line following has code which would do, but it can always be faster

So I’d better get on with it…

We’re planning to be at the meetup at the Cambridge MakeSpace on Friday evening – hopefully we’ll see some of you there 🙂

Proximity Alert

With the rebuild complete, we can finally move on to fine tuning the code for the events.  John had a great weekend here, with working code for the proximity alert.

The robot uses optical sensors to spot the wall – we’re using A to D converters on the Arduino to read the amount of reflected IR light.  Away from the wall, virtually no light is reflected.  As you get closer, the reflection gets brighter.  The trick is to decide at what level you should stop.  We have multiple sensors on the front so that if the robot doesn’t approach the wall perfectly straight on, one of the side sensors will still spot the wall and stop us before collision.
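The stop decision itself can be sketched like this – the threshold and the 0–1023 range (a 10-bit Arduino ADC) are illustrative values, not our tuned numbers:

```python
# Stop if ANY forward-facing IR sensor sees a bright enough reflection.
# Using several sensors covers the case where the robot approaches the
# wall at an angle and only a side sensor picks up the reflection.
STOP_THRESHOLD = 600  # brighter reflection = closer wall (illustrative)

def should_stop(readings, threshold=STOP_THRESHOLD):
    """readings: one ADC value (0-1023) per front sensor."""
    return any(r >= threshold for r in readings)
```

Picking the threshold is the whole game: too low and the robot stops early (or false-triggers off a pale floor), too high and it kisses the wall.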

The next task in the tuning will be to test many times against different types of wall – that’ll tell us what sort of repeatability we have, which will be key to getting a good result on the day.

Total rebuild 2

After a mammoth weekend effort, John got the robot back into (almost) working order in the new chassis.

We’ve had a fun evening driving it round the office, testing out the line following sensors and seeing what sort of slopes it can climb up.

Snags we’ve hit:

  • the line following doesn’t work
    • we *think* that the line following sensor array has been bolted on backwards, so the robot is turning away from the line instead of turning towards it.
  • The new chassis has a slight twist in it
    • But it doesn’t seem to affect the straightness of travel (phew)
      • probably because our front wheels don’t do any steering anyway
  • The motors were wired backwards initially

Things which are great about the new chassis:

  • We’ve managed to fit in bigger motors for moar powa!
    • Which makes us quite a lot heavier too, but hey-ho
  • The battery compartment is big enough for our bigger batteries
    • and as a result, the batteries lasted for this evening’s testing session
  • The logic power switch has been moved away from the motor power switch, so no more accidentally turning off the wrong one!
  • It has an external USB programming port for the Arduino
    • so no more fishing around a cramped compartment trying to get the connector to go in
  • It still fits inside the A4 footprint (with about 1mm to spare!)
  • It’s red!
    • Which is obviously the colour of fast things 🙂

20151116_223117 20151116_223148