Brushless motors are amazing. Massive power to weight ratio – which is why they’re used heavily by the model flying community in their planes and quadcopters.
Their mass adoption also means they’re cheap and plentiful!
They’re basically 3 phase motors, which means they have 3 sets of coils inside. They also have permanent magnets attached to the rotor, and by energising the coils in sequence (and changing the polarity), you can make a rotating magnetic field which drags the rotor magnets around. You have to be quite precise however, especially when starting up or dealing with variable torque – you need to know where the rotor is at all times. Fortunately, someone has done all the hard work for you in the form of pre-packaged Electronic Speed Controllers (ESCs), which measure the rotor position by looking at the induced voltage in the coils themselves! Genius. And you can buy an ESC with this genius in it from ebay for a fiver!
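As an illustration (not real ESC firmware), the six-step commutation pattern an ESC cycles through can be sketched as a table – note how one coil is always left floating, which is the one the ESC watches for induced voltage:

```python
# Illustrative six-step (trapezoidal) commutation sequence for a
# 3-phase brushless motor. Each step energises two of the three
# phases (A, B, C): +1 = tied to supply, -1 = tied to ground,
# 0 = floating. The floating phase is the one the ESC monitors for
# back-EMF to work out where the rotor is.
SIX_STEP = [
    # (A,  B,  C)
    (+1, -1,  0),
    (+1,  0, -1),
    ( 0, +1, -1),
    (-1, +1,  0),
    (-1,  0, +1),
    ( 0, -1, +1),
]

def phase_states(step):
    """Return the (A, B, C) drive state for a given commutation step."""
    return SIX_STEP[step % 6]
```

Stepping through this table faster or slower (and keeping it in sync with the sensed rotor position) is what sets the motor speed.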
Of course if you’re going to use one in a robot, you really want it to go forwards and backwards. And most ESCs on the market are single direction (because for RC planes, that’s what you want). So either look for reversible ESCs, or do it the hard way and re-program the ESCs you’ve already bought by mistake!
There are a couple of open source ESC firmwares out there – SimonK and BLHeli are the main ones. You can download them, compile them and flash them to your ESC.
Some ESCs are helpful and include in-circuit programming pads to allow reprogramming, but if (like me) you bought cheap ones, you’ll have to break out the soldering iron and magnifying glasses to solder wires onto the right pins (see RCgroups) on the microcontroller:
Having failed at line following in previous years due to various problems with IR sensors (spacing, distance from point of rotation, etc), we’ve decided to go with mounting a Pi camera on the bottom of the robot and running an openCV algorithm to track the line.
OpenCV is amazing – you can do so much with only a few lines of code – e.g. face detection: http://www.knight-of-pi.org/opencv-primer-face-detection-with-the-raspberry-pi/
For line tracking, we’re finding the centre of the line with a gaussian blur, then walking the gradient to the darkest points and joining them up:
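The real code uses OpenCV (a `cv2.GaussianBlur` followed by the gradient walk), but the idea can be sketched dependency-free on a single image row – the kernel and pixel values here are purely illustrative:

```python
def blur_row(row, kernel=(1, 4, 6, 4, 1)):
    """Smooth one grayscale row with a small Gaussian-ish kernel
    (clamping at the edges), so the line becomes a single smooth dip."""
    half = len(kernel) // 2
    total = sum(kernel)
    out = []
    for i in range(len(row)):
        acc = 0
        for k, w in enumerate(kernel):
            j = min(max(i + k - half, 0), len(row) - 1)  # clamp index
            acc += w * row[j]
        out.append(acc / total)
    return out

def walk_to_darkest(row, start):
    """From a starting column, follow the downhill gradient to the
    local intensity minimum - the centre of the dark line."""
    i = start
    while True:
        left = row[i - 1] if i > 0 else row[i]
        right = row[i + 1] if i < len(row) - 1 else row[i]
        if left < row[i] and left <= right:
            i -= 1
        elif right < row[i]:
            i += 1
        else:
            return i
```

Doing this on every row and joining up the minima gives the chain of line-centre points.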
Next job is to extract a vector from the image so that we know which way to go. We take the point where the line crosses the middle row, then do a polar transform and plot the amount of black in each direction. The directions with the most will be along the line – in either direction. We then assume the correct direction is the one towards the front of the robot.
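A rough sketch of that polar scan – the grid format, candidate angles and “front of the robot” convention are our assumptions for illustration, not the actual code:

```python
import math

def line_direction(dark, cx, cy, front_angles):
    """Scan rays outward from the line's crossing point (cx, cy) and
    count dark pixels along each direction; the best-scoring angle
    among those pointing towards the robot's front is taken as the
    way to go. `dark` is a 2-D boolean grid indexed dark[y][x],
    `front_angles` the candidate headings in degrees."""
    h, w = len(dark), len(dark[0])
    best_angle, best_count = None, -1
    for ang in front_angles:
        rad = math.radians(ang)
        dx, dy = math.cos(rad), -math.sin(rad)  # image y grows downward
        count, r = 0, 1.0
        while True:
            x = int(round(cx + dx * r))
            y = int(round(cy + dy * r))
            if not (0 <= x < w and 0 <= y < h):
                break  # ray has left the frame
            if dark[y][x]:
                count += 1
            r += 1.0
        if count > best_count:
            best_angle, best_count = ang, count
    return best_angle
```

Restricting `front_angles` to the forward half-plane is what implements the “assume the correct direction is towards the front” rule.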
You also then need to add behaviour for when the line is near the edge of the picture and in danger of not being in the next frame (i.e. steer to bring it back towards the centre)! And code to deal with the case where there is no line in the shot (hunt around until you find it…)
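Both behaviours can be sketched as one tiny steering function – the names and gain are invented for illustration, not the robot’s real API:

```python
def steer(line_x, frame_width, seen):
    """Return a turn command in [-1, 1]: negative = left,
    positive = right. If the line has been lost, spin on the spot
    to hunt for it; otherwise steer proportionally so the line is
    pulled back towards the centre of the frame before it escapes."""
    if not seen:
        return 1.0  # hunt: keep rotating until the line reappears
    centre = frame_width / 2
    return (line_x - centre) / centre
```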
And finally, tuning. Lots of factors affect how well the line following works – not least the speed of the robot and the update rate of your sensor/control loop. That can really only be done with a real course on the real robot, and you can get scuppered if the course designer has put sharp curves or hairpins on the course…
The Ragworm PCBs arrived! Actually they arrived a while ago, but I’ve been too busy at work to do much with them. They’re a lovely orange with immersion gold pads.
These PCBs are for mounting hall sensors. They then mount on to the motor above where the shaft sticks out the back of the motor. You glue (epoxy) a magnet to the shaft and then the hall sensor will give you encoder pulses corresponding to the rotation of the motor.
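Turning those pulses into a speed reading is then simple arithmetic – a sketch, assuming you’ve recorded a timestamp for each hall pulse:

```python
def rpm_from_pulses(timestamps, pulses_per_rev=1):
    """Estimate motor speed from hall-sensor pulse timestamps (in
    seconds). With a single magnet glued to the shaft there is one
    pulse per revolution; more magnets give more pulses per rev."""
    if len(timestamps) < 2:
        return 0.0
    # Average period between successive pulses.
    span = timestamps[-1] - timestamps[0]
    period = span / (len(timestamps) - 1)
    revs_per_sec = 1.0 / (period * pulses_per_rev)
    return revs_per_sec * 60.0
```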
Then you couple that with servo control software (we’ve written our own software running on a Propeller HAT from Pimoroni) and you have a high power, continuous rotation servo that can be powerful enough to run a CNC machine (depending on your motors, of course).
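The servo control idea can be sketched as a minimal PD position loop – the real software runs on the Propeller and will differ; the gains and the drive-equals-acceleration “physics” here are invented purely for illustration:

```python
def pd_servo(target, steps=400, dt=0.01, kp=9.0, kd=6.0):
    """Minimal PD position servo: drive the motor with a force
    proportional to the position error, minus a damping term on the
    velocity, and integrate crude motor physics (drive treated as
    acceleration) to watch the shaft settle on the target."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        error = target - pos
        drive = kp * error - kd * vel  # PD control law
        vel += drive * dt              # drive accelerates the shaft
        pos += vel * dt
    return pos
```

The encoder pulses from the hall sensor are what close this loop: they give you `pos`, and the control law does the rest.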
Well – we’ve been beavering away in the background on the next Metabot, and we’ve got 2 prototypes to the ‘remote control car’ stage so far.
The changes to skittles and the unknown quantity of the “slightly deranged golf” are causing mechanical design headaches, and we’ve got big plans for the software that might or might not come to fruition in time for the event. With less than 2 months to go, it’s certainly squeaky bum time!
This time I’ve tried experimenting with making my own PCBs:
Before eventually deciding that I couldn’t get enough accuracy for a 0.5mm pitch SSOP device and getting Ragworm to make the boards for me. I’ll post pictures of those when they arrive. I’m sure I’ll have good enough accuracy for traditional through-hole PCBs at 0.1″ pitch though.
And here’s our latest prototype. TBD if this will be the chassis that we actually compete with, but it’s a good starting point for playing with the software:
PiWars 2015 was awesome. The most amazing thing was the number of different solutions to the same set of requirements. The competitors were friendly, the event was well organised and a good time was had (by us anyway!). Tim and Mike should be very proud of what they’ve done here.
Metabot did us proud – coming second in the Obstacle Course and Skittles challenges and second overall in the “A4 and under” category.
Over the week we got the last hardware pieces finished – the ball flinger is complete and the body covers are cut and attached. And team mate Emily did a cracking job on the decorative touches (see below).
The rest of today is going to be a day of tinkering – software tuning and testing. The trick today will be to not regress anything – changes need to be small and self contained.
We’ll be at the meetup this evening – hope to see some of you there. I’ll leave you with a final glamour shot of the robot for those of you who won’t be there to see it in the flesh.
With the rebuild complete, we can finally move on to fine tuning the code for the events. John had a great weekend here, with working code for the proximity alert.
The robot uses optical sensors to spot the wall – we’re using A to D converters on the Arduino to read the amount of reflected IR light. Away from the wall, virtually no light is reflected. As you get closer, the reflection gets brighter. The trick is to decide at what level you should stop. We have multiple sensors on the front so that if the robot doesn’t approach the wall perfectly straight on, one of the side sensors will still spot the wall and stop us before collision.
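The stop decision itself is simple – a sketch, assuming 10-bit ADC readings (0–1023) and an invented threshold that real-world tuning would replace:

```python
def wall_ahead(readings, threshold=600):
    """Decide whether to stop based on reflected-IR readings from the
    front sensor array. Any single sensor over the threshold means
    part of the wall is close, even if the robot isn't approaching
    straight on - so one trigger is enough to halt."""
    return any(r >= threshold for r in readings)
```

The whole tuning problem collapses into picking that one threshold number, which is why testing against different wall surfaces matters so much.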
The next task in the tuning will be to test many times and against different types of wall – that’ll tell us what sort of repeatability we have, which will be key to getting a good result on the day.