Distance sensing

Last year we ran with ultrasonic ping sensors, but a lot of the teams were using the VL53L0X time-of-flight sensors with good results, so this year we thought we’d have a go with some of those too.  We got the ones on a Pololu carrier board for about 10 quid each.

And they’re *lovely*.  I think that under the covers they’re doing something very complicated/interesting and hiding all that from us, but the readings we get from them are very accurate and very consistent.  No need for any averaging or filtering code on the Pi side.

It’s not all roses though.  The minor downside of all that complexity is that you need to initialise the sensors with a C library.  If you’re using the Python library, that’s all taken care of for you, but we are writing our code in Golang, so we had to mess about linking in that C library.

Another quirk to be aware of – they are I2C devices, so they each need an I2C address, and they all come with the same fixed one from the factory.  If you only have one sensor this is fine, but with more than one they will all answer on the same address…  Other I2C sensors usually let you set the address with jumpers, but that doesn’t seem to be an option with these boards.  Instead, at startup you hold all but one in reset (using GPIO pins), send I2C commands to the free one to change its address, then repeat with a different one out of reset.  We decided we didn’t have enough GPIO pins for this.
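
The boot-time shuffle can be sketched like this.  The `Gpio`/`I2C` interfaces are hypothetical stand-ins for whatever GPIO and I2C library you use; 0x29 is the VL53L0X default address and 0x8A is its slave-address register in ST’s API:

```go
package main

import "fmt"

const (
	defaultAddr = 0x29 // every VL53L0X boots at this address
	addrReg     = 0x8A // register holding the 7-bit slave address
)

type Gpio interface{ Set(pin int, high bool) }
type I2C interface{ WriteReg(addr, reg, val byte) }

// assignAddresses holds every sensor in reset via its XSHUT pin, then
// releases them one at a time, re-addressing each while it is the only
// device answering on the default address.
func assignAddresses(g Gpio, b I2C, xshutPins []int, newAddrs []byte) {
	for _, p := range xshutPins {
		g.Set(p, false) // XSHUT low: hold in reset
	}
	for i, p := range xshutPins {
		g.Set(p, true) // release one sensor; it boots at 0x29
		b.WriteReg(defaultAddr, addrReg, newAddrs[i])
	}
}

// fake records the operations so the sequencing can be checked off-robot.
type fake struct{ log []string }

func (f *fake) Set(pin int, high bool) {
	f.log = append(f.log, fmt.Sprintf("gpio %d=%v", pin, high))
}
func (f *fake) WriteReg(addr, reg, val byte) {
	f.log = append(f.log, fmt.Sprintf("i2c %#02x reg %#02x <- %#02x", addr, reg, val))
}

func main() {
	f := &fake{}
	assignAddresses(f, f, []int{17, 27}, []byte{0x30, 0x31})
	for _, l := range f.log {
		fmt.Println(l)
	}
}
```

You can see why this eats a GPIO pin per sensor – which is exactly the cost we decided we couldn’t pay.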

Alternatively, you can use an I2C multiplexer like the TCA9548A (Adafruit do a nice carrier board for it too).  With this, you attach the ToF sensors to the different buses coming out of the multiplexer, then send commands to the multiplexer to select which bus you want to talk to.
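
The multiplexer is refreshingly simple to drive: it has a single control register, and writing a byte with bit N set connects downstream bus N.  A minimal sketch (0x70 is the datasheet default address with A0–A2 tied low; the actual I2C write would go through whatever library you use):

```go
package main

import "fmt"

const muxAddr = 0x70 // TCA9548A default address (A0-A2 low)

// channelMask returns the control byte that selects the given
// downstream bus. Setting several bits connects several buses at once.
func channelMask(channel uint) (byte, error) {
	if channel > 7 {
		return 0, fmt.Errorf("TCA9548A has channels 0-7, got %d", channel)
	}
	return 1 << channel, nil
}

func main() {
	for ch := uint(0); ch < 8; ch++ {
		mask, _ := channelMask(ch)
		// in real code: write 'mask' to muxAddr, then talk to the
		// ToF sensor at its (shared) address as normal
		fmt.Printf("channel %d -> control byte %#02x at mux %#02x\n", ch, mask, muxAddr)
	}
}
```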

Here’s Tigerbot wearing a few sensors on its front.

Motor choices

At the centre of the robot’s performance are the motors and motor drivers.  We’ve tried many options over the years: old drill motors (cheap and powerful, no position feedback), stepper motors (very controllable, heavy, expensive), brushed DC motors with gearboxes and encoders (fairly powerful, fairly expensive, good position feedback).

We’ve considered (but not yet chosen) brushless DC motors (most powerful for their size, controllers very expensive).

This year (like last year) we went with brushed DC motors with gearbox and encoders.  Last year’s units came from China via eBay, which caused us trouble when a gear cracked at the last moment and we were unable to get a replacement in time.  This year I decided that all our critical parts were going to come from suppliers in Europe, and be a brand name so that they could be easily purchased from multiple suppliers.

We went with Pololu gear motors – 25mm diameter units with gearboxes.  These motors come in a range of power/gearing options with the same form factor, and we could buy them from both RobotShop and TME.  The motor drivers were the same as last year: 13A Cytron units from RobotShop.  These should be able to deliver twice as much current as the motors can handle.

Here’s the populated chassis.  Motor drivers are on the left; Pi, Propeller and interconnect board are on the right.  Space for a LiPo battery is at the front.  And right at the bottom is a little 5V switch-mode power supply (as used in model aircraft) to power the logic boards.

Interconnect Board

At the heart of the robot is the Pi.  But how does it connect to everything else?  Via the interconnect board of course 🙂

This little board is where *everything* connects – where all the sensors, motor drivers, power supplies, Pi, Propeller, etc come together.  It’s a very custom board for every robot, so I generally make it by hand using “padboard”.  This is a cheap, 0.1 inch pitch board with drilled pads.  Unlike stripboard, it doesn’t have defined tracks – so you make your own with solder bridges.

This allows a more compact layout than stripboard, while still being fairly quick/easy to use.

Our board has headers on it for logic power supply (in the middle), motor drivers (6 pins, near the edges), sonar pingers (blue) and servos (yellow).

Introducing Tigerbot

Panic panic panic, must pull my head up out of the code and start blogging!

It’s been an eventful 18 months for the Metabot team.  The balance of the team moved from our old home at Metaswitch to a new company, Tigera.  We couldn’t resist a name change so I’d like to introduce Tigerbot….

(It’ll be orange and black striped by the big day, we promise!)

The picture above shows the bot’s bare bones before we added the motor drivers and power circuitry. Lance’s new 3D printer has been getting a good workout, churning out the new chassis, wheels and attachments.

Just in case there are any PiWars organisers reading, the bot’s a lot further along by now, of course!  Not long after the above photo, Lance had the bot to “remote control car” stage:

This is the traditional point for him to lose interest and move on to building whizzy attachments while the code monkeys on the team get to work.

Joking aside, “remote control car” is a huge milestone:

  • chassis printed
  • power electronics in place to drive the motors, Pi and sensors
  • Propeller hat mounted on the Pi; this little board gives us a fast little microcontroller to do hard real-time motor speed and servo position control; a big hat tip to John, who wrote all the Propeller Spin code for Metabot II, which we’re using largely unaltered
  • interconnect board soldered up, exposing the I2C bus, which we use to talk to the Propeller and sensors
  • remote control connected; we’re using a DualShock 4 controller again paired over Bluetooth
  • It’s a bot!  IT WORKS!!

Now, the rest is “just code”…

Ball flinger

We were so impressed with last year’s ball flinger from Hitchin Hackspace, that we’ve decided to do our own take on it for this year.

It was a struggle making it fit in the 100mm space we are allowed, but I think it should work.  *crosses fingers again*

Disaster!

And of course with just a few days to go, we’ve had a major disaster on the robot.  This tiny gear has cracked:

And so it won’t stay on the shaft.  Sadly, it’s the main pinion gear of one of our motors, so we’re down a motor and we don’t have a spare. 🙁

We quickly ordered another from China (but that won’t arrive before PiWars).  We also ordered some similar-looking gear motors from Ireland – hopefully they will turn up in time, and the gearboxes will be similar enough that we can either transplant the entire ’box or just steal the pinion gear from them.  *crosses fingers*

Or we’ll have to go to PiWars with a lame robot 🙁

Electrical Gremlins

Imagine my dread when I woke up to this message from John:

“Are you around tomorrow? I could do with some advice. I’m having an issue with the propeller board which is resetting itself randomly when the motors run at any speed. I think its an electrical problem rather than software, but really struggling to figure out what’s going on.”

The usual advice here – add decoupling capacitors everywhere!

After lots of hunting around and adding new 0.1uF caps to suppress the motors (one cap between the terminals, one from each terminal to the case), John eventually figured it out:

“I’ve been probing around with my ‘scope this morning and the power supply looks dead smooth, so my latest theory is that the reset line is the culprit and is picking up spikes from the motor.”

And sure enough – a cap on the reset line to our Propeller hat fixed it!  Phew…

Brushless motors

Brushless motors are amazing.  Massive power to weight ratio – which is why they’re used heavily by the model flying community in their planes and quadcopters.

Their mass adoption also means they’re cheap and plentiful!

They’re basically 3-phase motors, which means they have 3 sets of coils inside.  They also have permanent magnets attached to the rotor, and by energising the coils in sequence (and changing the polarity), you can make a rotating magnetic field which drags the rotor magnets around.  You have to be quite precise, however, especially when starting up or dealing with a variable torque – you need to know where the rotor is all the time.  Fortunately, someone has done all the hard work for you in the form of pre-packaged Electronic Speed Controllers (ESCs), which measure the rotor position by looking at the voltage induced in the coils themselves!  Genius.  And you can buy an ESC with this genius in it from eBay for a fiver!
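
To make “energising the coils in sequence” concrete, here’s a sketch of the six-step (trapezoidal) commutation scheme most hobby ESCs use.  At each step one phase is driven high, one low, and one left floating – and it’s that floating phase whose induced voltage (back-EMF) a sensorless ESC watches to decide when to advance:

```go
package main

import "fmt"

type Drive int

const (
	Float Drive = iota // undriven: the ESC measures back-EMF here
	High
	Low
)

// steps[s] gives the drive state of phases A, B, C at commutation step s.
var steps = [6][3]Drive{
	{High, Low, Float},
	{High, Float, Low},
	{Float, High, Low},
	{Low, High, Float},
	{Low, Float, High},
	{Float, Low, High},
}

// nextStep advances (or, reversed, retreats) around the six-step cycle.
func nextStep(step int, reverse bool) int {
	if reverse {
		return (step + 5) % 6
	}
	return (step + 1) % 6
}

func main() {
	names := map[Drive]string{Float: "float", High: "high", Low: "low"}
	for s, st := range steps {
		fmt.Printf("step %d: A=%-5s B=%-5s C=%-5s\n",
			s, names[st[0]], names[st[1]], names[st[2]])
	}
}
```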

Of course if you’re going to use one in a robot, you really want it to go forwards and backwards.  And most ESCs on the market are single direction (because for RC planes, that’s what you want).  So either look for reversible ESCs or do it the hard way and re-program the ESCs you’ve already bought by mistake!

There are a couple of open-source ESC firmwares out there – SimonK and BLHeli are the main ones.  You can download them, compile them and flash them to your ESC.

Some ESCs are helpful and include in-circuit programming pads to allow reprogramming, but if (like me) you bought cheap ones, you’ll have to break out the soldering iron and magnifying glasses to solder wires onto the right pins (see RCgroups) on the microcontroller:

Line following

Having failed at line following in previous years due to various problems with IR sensors (spacing, distance from point of rotation, etc), we’ve decided to go with mounting a Pi camera on the bottom of the robot and running an OpenCV algorithm to track the line.

OpenCV is amazing – you can do so much with only a few lines of code – e.g. face detection: http://www.knight-of-pi.org/opencv-primer-face-detection-with-the-raspberry-pi/

For line tracking, we’re finding the centre of the line with a Gaussian blur, then walking the gradient to the darkest points and joining them up:
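
Here’s a toy version of that centre-finding idea on a single grayscale image row (0 = black line, 255 = white floor).  A 3-wide box blur stands in for the Gaussian, and all names are ours, not OpenCV’s – the real thing works on the full 2D image:

```go
package main

import "fmt"

// blurRow applies a 3-wide box blur (a crude stand-in for a Gaussian).
func blurRow(row []int) []int {
	out := make([]int, len(row))
	for i := range row {
		sum, n := 0, 0
		for j := i - 1; j <= i+1; j++ {
			if j >= 0 && j < len(row) {
				sum += row[j]
				n++
			}
		}
		out[i] = sum / n
	}
	return out
}

// walkToDarkest follows the downhill gradient from a starting column
// until it sits on a local minimum -- the centre of the line.
func walkToDarkest(row []int, start int) int {
	i := start
	for {
		next := i
		if i > 0 && row[i-1] < row[next] {
			next = i - 1
		}
		if i < len(row)-1 && row[i+1] < row[next] {
			next = i + 1
		}
		if next == i {
			return i
		}
		i = next
	}
}

func main() {
	row := []int{250, 250, 200, 80, 20, 70, 240, 250} // dark line around index 4
	b := blurRow(row)
	fmt.Println("blurred:", b)
	fmt.Println("line centre:", walkToDarkest(b, 6))
}
```

The blur matters: without it, sensor noise creates little local minima that the walk gets stuck in before reaching the true centre.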

Next job is to extract a vector from the image so we know which way to go.  We take the point where the line crosses the middle row, then do a polar transform and plot the amount of black in each direction.  The directions with the most will be along the line – in either direction.  We then assume the correct direction is the one towards the front of the robot.

You also then need to add behaviour for when the line is near the edge of the picture and in danger of not being in the next frame (i.e. steer so it moves back towards the centre)!  And code to deal with the case where there is no line in the shot (hunt around until you find it…)

And finally, tuning.  Lots of factors affect how well the line following works – not least the speed of the robot and the update rate of your sensor/control loop.  That can really only be done with a real course on the real robot, and you can get scuppered if the course designer has put sharp curves or hairpins on the course…

Good luck!