With the rebuild complete, we can finally move on to fine-tuning the code for the events. John had a great weekend here, with working code for the proximity alert.
The robot uses optical sensors to spot the wall – we're using A-to-D converters on the Arduino to read the amount of reflected IR light. Away from the wall, virtually no light is reflected; as you get closer, the reflection gets brighter. The trick is deciding at what level to stop. We have multiple sensors on the front so that if the robot doesn't approach the wall perfectly straight on, one of the side sensors will still spot the wall and stop us before a collision.
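The decision logic is simple enough to sketch. The real code runs on the Arduino in C, but here's a minimal Python illustration of the idea – the threshold value and sensor readings are invented for the example and would need tuning against real ADC values:

```python
# Sketch of the stop-decision logic described above. The real code runs on
# the Arduino; STOP_THRESHOLD and the sample readings are made up.

STOP_THRESHOLD = 600  # ADC reading above which we consider the wall "close"

def should_stop(sensor_readings):
    """Stop if ANY front sensor sees a bright enough reflection, so an
    angled approach is still caught by one of the side sensors."""
    return any(reading >= STOP_THRESHOLD for reading in sensor_readings)

# Far from the wall: almost no reflected IR
print(should_stop([12, 8, 15]))      # False
# Approaching at an angle: one side sensor sees the wall first
print(should_stop([45, 120, 750]))   # True
```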
The next task in the tuning will be to test many times against different types of wall – that'll tell us what sort of repeatability we have, which will be key to getting a good result on the day.
So what should you never do in the run up to a competition? Throw away your robot and start again. But that’s (almost) what we’ve done…
So the old robot chassis was always intended to be temporary – made of 3mm cardboard and aluminium brackets. We finally got round to replacing it with a chassis made of 5mm PVC foamboard cut on a CNC router. But that means the tedious task of removing all the parts from the old chassis, bolting them onto the new one and re-routing all the wires.
We’ll see shortly if we’ve managed to do that without error…
John and I had a session last week trying to get the line following sensors to produce the results we expected. To cut a long story short, we ended up putting blinkers on the sensors, which seems to have made all the difference.
John then went away and put together some line following software – and it seems to work. There might be a bit of tuning to be done and some alterations to the calibration code, but it does seem to get round a test course:
So after the "serial over USB" problems (see last post), we removed USB completely, switching over to the built-in UART port on the Raspberry Pi and connecting directly to another serial port on the Due (it has lots!). Both devices run at 3.3 volts, so only three wires are needed – Tx, Rx and Gnd. A quick search-and-replace and recompile of the Due code to change the port it listens on, and we fired up the code again – and it worked!
Here’s video of us driving it round the office under joystick control:
Our design calls for an Arduino (to allow real-time stuff like outputting steps, running control loops, etc). We’re using an Arduino Due.
We had planned to get the Raspberry Pi to talk to the Due over ‘serial over USB’ – i.e. we connect the USB port on the Due into the Pi and then send bytes to the /dev/ttyACM0 device that gets created by the OS.
This worked, but after a while (30s or so of chatter) the port would stop responding. A quick google showed that a few other people have seen similar behaviour, but no one had a solution.
Now to find a workaround…
The plan here is to get last year’s Metabot code (with as few changes as possible) working on the new robot so that we can try it out and check the performance is sufficient. If we need new motors shipped from China, I’d like to know now rather than halfway through November!
The first 2 bytes in each frame are sync bytes (0x03, 0x01), then there are 7 pairs of bytes which tell you the position of each of the transmitter channels (joystick axes). Just read the bytes and make your robot dance!
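A minimal parser for that frame format might look like the sketch below. One caveat: the source only says each channel is a pair of bytes – the byte order here is an assumption, so swap it if your transmitter sends the low byte first:

```python
SYNC = bytes([0x03, 0x01])
FRAME_LEN = 2 + 7 * 2  # 2 sync bytes + 7 two-byte channel values

def parse_frame(frame):
    """Return the 7 channel values from one 16-byte frame,
    or None if the sync bytes don't match."""
    if len(frame) != FRAME_LEN or frame[:2] != SYNC:
        return None
    # Assumption: each pair is a big-endian 16-bit value -- switch to
    # "little" here if your transmitter sends the low byte first.
    return [int.from_bytes(frame[2 + 2*i : 4 + 2*i], "big") for i in range(7)]

# All seven channels centred at a made-up value of 1500
frame = SYNC + (1500).to_bytes(2, "big") * 7
print(parse_frame(frame))  # [1500, 1500, 1500, 1500, 1500, 1500, 1500]
```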
At first glance, the straight line speed test sounds like an easy event. But once you try it, you start to notice problems:
Robots don't go straight naturally. There's always some difference in the wheels, motors, speed controllers, etc., so the robot starts turning. And over the long, narrow speed test course, you'll probably hit the walls at some point. So how can you deal with this?
Drive under manual control
This is harder than it looks – most robots last year didn’t have fine direction control.
Detect the walls and avoid them
This is a good option – detect the walls somehow (e.g. with an IR or ultrasound sensor – much like the proximity alert sensors, but mounted facing sideways) and if you see one, either speed up the wheel on that side or slow down the one on the other side.
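As a hypothetical sketch of that correction (threshold and trim values invented – you'd tune them on your own robot), the motor-speed adjustment could look like:

```python
# Hypothetical sketch of the side-sensor correction: if a wall shows up on
# one side, trim the motor speeds to steer away from it. WALL_THRESHOLD
# and TRIM are made-up values for illustration.

WALL_THRESHOLD = 500   # sensor reading that counts as "wall seen"
TRIM = 0.15            # fraction of base speed to shift away from the wall

def corrected_speeds(base_speed, left_sensor, right_sensor):
    """Return (left_speed, right_speed): speed up the wheel on the wall's
    side (equivalently, you could slow the opposite wheel instead)."""
    left, right = base_speed, base_speed
    if left_sensor >= WALL_THRESHOLD:    # wall on the left: turn right
        left += base_speed * TRIM
    if right_sensor >= WALL_THRESHOLD:   # wall on the right: turn left
        right += base_speed * TRIM
    return left, right

print(corrected_speeds(100, 800, 20))  # (115.0, 100): steering away from the left wall
```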
Mechanically follow the wall
This is what we did last year with Metabot:
Place some sort of bumper (wood, teflon, bearings?) on the sides below the top of the wall
and optionally place the robot against the wall on the side it naturally turns towards
The downside of course is that this increases friction
Use a camera

Write code to process the image, spot the edges of the course and adjust the motors appropriately.
Image processing code is hard… People have used OpenCV with the RPi.
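To give a flavour of what "spot the edges and adjust the motors" means, here's a drastically simplified, self-contained sketch operating on a single image row of fake pixel values – real code would use OpenCV on the Pi, and the threshold here is invented:

```python
# Much-simplified sketch of the camera approach: scan one image row, find
# the dark course edges on each side, and compute how far off-centre we
# are. EDGE_THRESHOLD and the synthetic pixel values are invented.

EDGE_THRESHOLD = 80  # pixel values below this count as "edge/wall"

def steering_offset(row):
    """Return how far the course centre sits from the image centre
    (positive = course is to the right of us, so steer right)."""
    left = next(i for i, p in enumerate(row) if p >= EDGE_THRESHOLD)
    right = next(i for i in range(len(row) - 1, -1, -1)
                 if row[i] >= EDGE_THRESHOLD)
    course_centre = (left + right) / 2
    return course_centre - (len(row) - 1) / 2

# Synthetic row: dark walls (10) either side of a bright floor (200)
row = [10] * 20 + [200] * 50 + [10] * 30
print(steering_offset(row))  # -5.0: course centre is slightly to our left
```

The offset would then feed the same differential speed trim as the side-sensor option above.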
Use a compass

Regularly read the compass and adjust the motor speeds if it detects that you're turning
BUT – I’ve never tried this myself and I don’t know for sure that the compass will be sensitive enough to detect the very small rotations you’ll need to spot, or will work well in the presence of the magnets in your motors!
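With those caveats in mind, an untested sketch of the idea would be a simple proportional controller that nudges the motors whenever the heading drifts from where you started – the gain is an invented value and the wrap-around handling is the fiddly bit:

```python
# Untested sketch of the compass idea: proportional correction back to the
# starting heading. KP is an invented gain you'd have to tune.

KP = 2.0  # proportional gain

def heading_error(current, target):
    """Signed error in degrees, wrapped to -180..180 so that drifting from
    1 deg to 359 deg reads as -2, not +358."""
    return (current - target + 180) % 360 - 180

def motor_trim(current_heading, target_heading):
    """Positive trim = add to the left motor / subtract from the right,
    i.e. turn right to correct a leftward drift."""
    return -KP * heading_error(current_heading, target_heading)

print(heading_error(359, 1))  # -2: we've drifted 2 degrees left of target
print(motor_trim(359, 1))     # 4.0: turn right to correct
```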
Finally – think about how you're going to stop… If your robot is capable of any decent speed, the stopping area is quite short. We implemented a "dead man's handle" on Metabot last year – i.e. the robot ran at full speed while we held down the button and stopped immediately when the button was released. The event judge ruled that this was a safety device and NOT manual control, in case that matters to you.
I saw recantha’s post pointing people at Brian Corteil’s post on how to control your robot and was surprised to see that the method I used for Metabot last year wasn’t listed. So the obvious thing to do is to document it here, right?
Write the robot's controlling Python script – it listens for TCP connections on a port
The laptop runs a very simple pygame script which listens for joystick inputs:
These come into the script from the Pygame library as a dictionary.
The script then converts this dictionary to JSON (using the Python json library)
and sends the JSON over TCP to the IP address/port that the robot is listening on
The robot then converts the JSON back into a dictionary and reads the joystick values out of it.
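The steps above can be sketched end to end in one self-contained script – both ends run in a single process here purely for demonstration (on the real setup the server loop sits in the robot's controlling script and the pygame client on the laptop; the port number and dictionary field names are invented):

```python
# Stripped-down sketch of the scheme above: the "laptop" sends a joystick
# dictionary as JSON over TCP and the "robot" turns it back into a dict.
# Both ends run in one process here just so the sketch is runnable.

import json
import socket
import threading

HOST, PORT = "127.0.0.1", 5005  # invented port

received = {}
ready = threading.Event()

def robot_server():
    # The robot's controlling script: accept one TCP connection and read
    # one JSON message per line. The real script would loop forever.
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            line = conn.makefile().readline()
            received.update(json.loads(line))

server = threading.Thread(target=robot_server)
server.start()
ready.wait()  # don't connect until the robot is listening

# The laptop side: pygame would fill this dict from real joystick events.
joystick_state = {"x_axis": 0.5, "y_axis": -1.0, "button_0": True}

with socket.create_connection((HOST, PORT)) as client:
    client.sendall((json.dumps(joystick_state) + "\n").encode())

server.join()
print(received)  # {'x_axis': 0.5, 'y_axis': -1.0, 'button_0': True}
```

Newline-delimited JSON keeps message framing trivial, which is most of why this scheme is so little code.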
Note that if you want an easy control method, I do NOT recommend ours – it can be a pain to set up the WiFi access point, and the method you use depends on the chipset of the WiFi dongle you have.
Instead, I recommend the bluetooth/wiimote method: this tutorial explains it better than I can, and with code examples too. My experience with bluetooth dongles and wiimotes is that genuine Nintendo wiimotes are required – I tried a cheap knock-off wiimote and it wouldn't pair with the dongle, but my genuine ones worked fine.
We received some stepper drivers on Monday – TB6600-based boards. These are 4A bipolar stepper drivers and are very cheap on eBay (£7 per motor). The downside is that this sort of eBay driver is often a bit rubbish – this thread documents some of the bad things. The worst problems:
powering down the logic side of the circuit while leaving the motor supply connected = a burnt-out driver
output current is often less than it should be (due to shut-down circuitry)
the circuit uses a pin on the TB6600 to provide 5V to the supporting circuitry – but that pin is for a decoupling capacitor; it isn't meant to provide power to anything else
So last night we decided to find out exactly what might be wrong with our drivers by tracing the schematic. This way we can work out a method to avoid the worst problems and (hopefully) get reasonable performance out of our motors!
Here’s the schematic we came up with for our particular TB6600 driver – YMMV…
If you're reading this because you're interested in building your own robot, and you don't fancy mucking about with reverse-engineering cheap drivers from eBay, do yourself a favour and build/buy decent drivers – e.g.