Having failed at line following in previous years due to various problems with IR sensors (spacing, distance from the point of rotation, etc.), we’ve decided to mount a Pi camera on the bottom of the robot and run an OpenCV algorithm to track the line.
OpenCV is amazing – you can do so much with only a few lines of code – e.g. face detection: http://www.knight-of-pi.org/opencv-primer-face-detection-with-the-raspberry-pi/
For line tracking, we’re finding the centre of the line by applying a Gaussian blur, then walking the gradient to the darkest points and joining them up:
Next job is to extract a vector from the image so we know which way to go. We take the point where the line crosses the middle row, then do a polar transform and plot the amount of black in each direction. The directions with the most black will lie along the line – in either direction. We then assume the correct direction is the one towards the front of the robot.
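The direction-finding step might look something like this – an illustrative sketch, not our actual code. It assumes image y increases downwards and that “forward” is towards the top of the frame (which depends on how the camera is mounted):

```python
import numpy as np

def best_heading(grey, anchor, radius=20, n_angles=72):
    """Score rays from the anchor point by darkness; return the best
    forward-pointing angle (radians, 0 = right, pi/2 = down)."""
    h, w = grey.shape
    ay, ax = anchor
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    scores = np.zeros(n_angles)
    for i, theta in enumerate(angles):
        for r in range(1, radius):
            y = int(round(ay + r * np.sin(theta)))
            x = int(round(ax + r * np.cos(theta)))
            if 0 <= y < h and 0 <= x < w:
                scores[i] += 255 - grey[y, x]  # darker pixels score higher
    # The two best angles lie along the line in opposite directions;
    # keep only angles pointing towards the front (up = negative sin).
    forward = np.sin(angles) < 0
    return angles[np.argmax(np.where(forward, scores, -1))]
```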
You also then need to add behaviour for when the line is near the edge of the picture and in danger of not being in the next frame (i.e. steer to move it back towards the centre)! And code to deal with the case where there is no line in the shot at all (hunt around until you find it…)
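The steering and recovery logic can be as simple as this sketch (a hypothetical interface, not the actual robot code):

```python
def steer_for_line(centre_x, frame_width):
    """Return a turn command in [-1, 1]; None means the line was lost
    and the caller should enter hunt mode (e.g. rotate slowly)."""
    if centre_x is None:
        return None
    # Proportional steering: push the line back towards the frame centre
    # before it drifts out of shot.
    error = (centre_x - frame_width / 2) / (frame_width / 2)
    return max(-1.0, min(1.0, error))
```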
And finally, tuning. Lots of factors affect how well the line following works – not least the speed of the robot and the update rate of your sensor/control loop. That can really only be done with a real course on the real robot, and you can get scuppered if the course designer has put sharp curves or hairpins on the course…
The Ragworm PCBs arrived! Actually they arrived a while ago, but I’ve been too busy at work to do much with them. They’re a lovely orange with immersion gold pads.
These PCBs are for mounting Hall sensors. They mount onto the back of the motor, above where the shaft sticks out. You glue (epoxy) a magnet to the shaft, and the Hall sensor then gives you encoder pulses corresponding to the rotation of the motor.
Then you couple that with servo-control software (we’ve written our own, running on a Propeller HAT from Pimoroni) and you have a high-power, continuous-rotation servo – powerful enough to run a CNC machine (depending on your motors, of course).
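The encoder pulses are what make closed-loop speed control possible: count pulses per control tick and trim the motor power accordingly. A minimal PI sketch (gains and names illustrative – not our actual Propeller HAT firmware):

```python
class SpeedController:
    """Trim motor power so measured pulse counts track a target."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def update(self, target_pulses, measured_pulses):
        """Return a power adjustment for this control tick."""
        error = target_pulses - measured_pulses
        self.integral += error  # accumulated error removes steady offset
        return self.kp * error + self.ki * self.integral
```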
Well – we’ve been beavering away in the background on the next Metabot, and we’ve got two prototypes so far to the ‘remote control car’ stage.
The changes to skittles and the unknown quantity of the “slightly deranged golf” are causing mechanical design headaches, and we’ve got big plans for the software that might or might not come to fruition in time for the event. With less than two months to go, it’s certainly squeaky bum time!
This time I’ve tried experimenting with making my own PCBs:
Before eventually deciding that I couldn’t get enough accuracy for a 0.5mm pitch SSOP device and getting Ragworm to make the boards for me. I’ll post pictures of those when they arrive. I’m sure I’ll have good enough accuracy for traditional through-hole PCBs at 0.1″ pitch though.
And here’s our latest prototype. TBD if this will be the chassis that we actually compete with, but it’s a good starting point for playing with the software:
PiWars 2015 was awesome. The most amazing thing was the number of different solutions to the same set of requirements. The competitors were friendly, the event was well organised and a good time was had (by us anyway!). Tim and Mike should be very proud of what they’ve done here.
Metabot did us proud – coming second in the Obstacle Course and Skittles challenges and second overall in the “A4 and under” category.
Over the week we got the last hardware pieces finished – the ball flinger is complete and the body covers are cut and attached. And team mate Emily did a cracking job on the decorative touches (see below).
The rest of today is going to be a day of tinkering – software tuning and testing. The trick today will be to not regress anything – changes need to be small and self contained.
We’ll be at the meetup this evening – hope to see some of you there. I’ll leave you with a final glamour shot of the robot for those of you who won’t be there to see it in the flesh.
With the rebuild complete, we can finally move on to fine tuning the code for the events. John had a great weekend here, with working code for the proximity alert.
The robot uses optical sensors to spot the wall – we’re using the A-to-D converters on the Arduino to read the amount of reflected IR light. Away from the wall, virtually no light is reflected; as you get closer, the reflection gets brighter. The trick is deciding at what level to stop. We have multiple sensors on the front so that if the robot doesn’t approach the wall perfectly straight on, one of the side sensors will still spot the wall and stop us before a collision.
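The stop decision itself is simple once the level is tuned – something along these lines (the threshold value here is a placeholder, found by testing against the real wall):

```python
STOP_THRESHOLD = 600  # placeholder ADC level, tuned on the real course

def should_stop(adc_readings, threshold=STOP_THRESHOLD):
    """True if any front sensor sees enough reflected IR to mean
    'wall close' – so an angled approach still trips a side sensor."""
    return any(reading >= threshold for reading in adc_readings)
```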
The next task in the tuning will be to test many times and against different types of wall – that’ll tell us what sort of repeatability we have, which will be key to getting a good result on the day.
So what should you never do in the run up to a competition? Throw away your robot and start again. But that’s (almost) what we’ve done…
So the old robot chassis was always intended to be temporary – made of 3mm cardboard and aluminium brackets. We finally got round to replacing it with a chassis made of 5mm PVC foamboard cut on a CNC router. But that means the tedious task of removing all the parts from the old chassis, bolting them onto the new one and re-routing all the wires.
We’ll see shortly if we’ve managed to do that without error…
John and I had a session last week trying to get the line following sensors to produce the results we expected. To cut a long story short, we ended up putting blinkers on the sensors which seems to have made all the difference.
John then went away and put together some line following software – and it seems to work. There might be a bit of tuning to be done and some alterations to the calibration code, but it does seem to get round a test course: