Arduino serial over USB problems

Our design calls for an Arduino (to allow real-time stuff like outputting steps, running control loops, etc). We’re using an Arduino Due.

We had planned to get the Raspberry Pi to talk to the Due over ‘serial over USB’ – i.e. we connect the Due’s USB port to the Pi and then send bytes to the /dev/ttyACM0 device that the OS creates.

This worked, but after a while (30s or so of chatter) the port would stop responding. A quick google showed that a few other people have seen similar behaviour, but no one had a solution. 🙁
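For the record, the Pi side of this is only a few lines of Python. Here’s a minimal sketch of the sort of thing we were doing, assuming the pyserial library and that the Due shows up as /dev/ttyACM0 (the command bytes themselves are made up):

```python
# Minimal sketch of Pi -> Due chatter over serial-over-USB (assumes pyserial).
# /dev/ttyACM0 is where the Due normally appears; 115200 is just a common baud choice.
import serial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    port.write(b"STEP 100\n")   # hypothetical command - yours will differ
    reply = port.readline()     # after ~30s of this sort of traffic, our port went silent
    print(reply)
```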

Now to find a workaround…

The plan here is to get last year’s Metabot code (with as few changes as possible) working on the new robot so that we can try it out and check the performance is sufficient.  If we need new motors shipped from China, I’d like to know now rather than halfway through November!

Manual Control with an RC transmitter

I just discovered something on the internet that I’m very excited about.

Brian Corteil, in his post on manual control, said that while you could use an RC transmitter to control your robot, there was no simple solution for connecting the signal to your Raspberry Pi.

Well – I’ve just found one 🙂

It seems that Spektrum DSM2/DSMX satellite receivers connect to their master unit using a 3-pin cable. This cable carries 3V3, GND and a serial link, which can connect directly into the Raspberry Pi UART – set the baud rate to 115200bps (which I think is the default anyway).

The first 2 bytes in each frame are sync bytes (0x03, 0x01), then there are 7 pairs of bytes which tell you the position of each of the transmitter channels (joystick axes).  Just read the bytes and make your robot dance! 🙂
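Here’s a rough sketch of what reading those frames on the Pi might look like, assuming pyserial and the Pi UART at /dev/ttyAMA0. The split of each pair into a channel number and a 10-bit value is my assumption based on the common DSM2 encoding – check it against your own receiver:

```python
# Rough sketch of reading a DSM2 satellite frame on the Pi UART (assumes pyserial).
# Frame layout as described above: 2 sync bytes (0x03, 0x01), then 7 pairs of bytes,
# one pair per channel. The channel-id/value split assumes the common 10-bit DSM2
# encoding - verify against your own receiver.
import serial

def read_frame(port):
    """Block until a frame starting with the sync bytes arrives, return the 14 data bytes."""
    while True:
        if port.read(1) != b"\x03":
            continue
        if port.read(1) != b"\x01":
            continue
        return port.read(14)   # 7 channels x 2 bytes

with serial.Serial("/dev/ttyAMA0", 115200, timeout=1) as uart:
    while True:
        data = read_frame(uart)
        if len(data) < 14:
            continue                            # timed out mid-frame; try again
        for i in range(7):
            word = (data[2 * i] << 8) | data[2 * i + 1]
            channel = (word >> 10) & 0x0F       # assumed: channel id in the top bits
            value = word & 0x03FF               # assumed: 10-bit stick position
            print(channel, value)
```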

Straight line speed test

At first glance, the straight line speed test sounds like an easy event. But once you try it, you start to notice problems:

Robots don’t go straight naturally.  There’s always some difference in the wheels, motors, speed controllers, etc so that the robot starts turning.  And over the long, narrow speed test course, you’ll probably hit the walls at some point.  So how can you deal with this?

Options:

  • Drive under manual control
    • This is harder than it looks – most robots last year didn’t have fine direction control.
  • Detect the walls and avoid them
    • This is a good option – detect the walls somehow (e.g. with an IR or ultrasound sensor – much like the proximity alert sensors, but mounted facing sideways) and if you see one, either speed up the wheel on that side or slow down the one on the other side.
  • Mechanically follow the wall
    • This is what we did last year with Metabot
    • Place some sort of bumper (wood, teflon, bearings?) on the sides below the top of the wall
      • and optionally place the robot against the wall on the side it naturally turns towards
    • The downside of course is that this increases friction
  • Camera
    • Write code to process the image, spot the edges of the course and adjust the motors appropriately.
    • Image processing code is hard…  People have used OpenCV with the RPi.
  • Compass
    • Add a compass sensor to the robot
    • Regularly read it and adjust the motor speeds if it detects that you’re turning (see the sketch after this list)
    • BUT – I’ve never tried this myself and I don’t know for sure that the compass will be sensitive enough to detect the very small rotations you’ll need to spot, or will work well in the presence of the magnets in your motors!
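To make the compass option concrete, here’s a rough, untested sketch of the sort of correction loop I have in mind. read_heading() and set_motor_speeds() are hypothetical stand-ins for whatever compass library and motor code you use, and the gain will need tuning on a real robot:

```python
# Rough sketch of a compass-based "drive straight" loop (untested, illustrative only).
import time

KP = 0.02           # proportional gain - tune on the real robot
BASE_SPEED = 1.0    # full speed ahead

def read_heading():
    """Return the current heading in degrees (stand-in - wire your compass up here)."""
    return 0.0

def set_motor_speeds(left, right):
    """Send speeds to the motor drivers (stand-in - wire your drivers up here)."""
    pass

target = read_heading()                 # whatever direction we start pointing in
while True:
    error = read_heading() - target
    error = (error + 180) % 360 - 180   # wrap into -180..180 so passing north doesn't confuse us
    correction = KP * error
    # drifted right (positive error) -> slow the left wheel, speed up the right to turn back
    set_motor_speeds(BASE_SPEED - correction, BASE_SPEED + correction)
    time.sleep(0.02)
```

The same proportional-correction idea works for the wall-detection option too – just swap the heading error for the difference between your side sensor reading and the gap you want to keep.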

Finally – think about how you’re going to stop…  If your robot is capable of any decent speed, the stopping area is quite short.  We implemented a “dead man’s handle” on Metabot last year – i.e. the robot ran at full speed while we held down a button and stopped immediately when the button was released.  The event judge ruled that this was a safety device and NOT manual control, in case that matters to you.
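For what it’s worth, the dead man’s handle itself is only a few lines. A rough sketch, with drive() standing in for however your robot sets its motor speeds:

```python
# Rough sketch of the dead man's handle: full speed while the button is held,
# immediate stop the moment it isn't. drive() is a hypothetical motor interface.
FULL_SPEED = 1.0

def drive(left, right):
    """Stand-in for your motor control code."""
    pass

def on_control_message(button_held):
    if button_held:
        drive(FULL_SPEED, FULL_SPEED)
    else:
        drive(0.0, 0.0)
```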

Manual Control

I saw recantha’s post pointing people at Brian Corteil’s post on how to control your robot and was surprised to see that the method I used for Metabot last year wasn’t listed.  So the obvious thing to do is to document it here, right? 🙂

Our method:

  • Connect a cheap USB wifi dongle to the Pi
  • Set the Pi up as a WiFi access point
  • Connect a laptop to the Pi’s access point
  • Connect an xbox 360 joypad to the laptop
  • Write the robot’s controlling Python script – it listens for TCP connections on a port
  • The laptop runs a very simple pygame script which listens for joystick inputs (a sketch of both ends follows this list):
    • These come into the script from the Pygame library as a dictionary.
    • The script then converts this dictionary to JSON (using the python json library)
    • and sends the JSON over TCP to the IP address/port that the robot is listening on
    • The robot then converts the JSON back into a dictionary and reads the joystick values out of it.

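Here’s a stripped-down sketch of both ends. It isn’t our actual Metabot code – just the shape of it, using pygame, json and socket; the address, port and field names are made up:

```python
# Laptop side (sketch): read the joystick with pygame, package the axes as JSON
# and send them over TCP to the robot. Address, port and keys are illustrative.
import json
import socket
import pygame

ROBOT_ADDR = ("192.168.42.1", 10000)   # made-up address/port for the Pi's access point

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)
stick.init()

sock = socket.create_connection(ROBOT_ADDR)
clock = pygame.time.Clock()

while True:
    pygame.event.pump()                 # let pygame refresh the joystick state
    axes = {"axis%d" % i: stick.get_axis(i) for i in range(stick.get_numaxes())}
    sock.sendall((json.dumps(axes) + "\n").encode())
    clock.tick(20)                      # ~20 updates per second is plenty
```

And on the robot:

```python
# Robot side (sketch): accept one connection and turn each JSON line back into
# a dictionary of joystick values.
import json
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 10000))
server.listen(1)
conn, _ = server.accept()

buf = b""
while True:
    chunk = conn.recv(1024)
    if not chunk:
        break                           # laptop disconnected
    buf += chunk
    while b"\n" in buf:
        line, buf = buf.split(b"\n", 1)
        axes = json.loads(line)
        print(axes)                     # feed these values into your motor code
```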
Note that if you want an easy control method, I do NOT recommend ours – it can be a pain to set up the wifi access point, and the method you need depends on the chipset of the WiFi dongle you have.

Instead, I recommend the bluetooth/wiimote method – this tutorial explains it better than I can, and with code examples too.  My experience with bluetooth dongles and wiimotes is that genuine Nintendo wiimotes are required – I tried using a cheap knockoff wiimote and it wouldn’t pair with the dongle, but my genuine ones worked fine.


Stepper drivers

We received some stepper drivers on Monday – TB6600-based boards.  These are 4A bipolar stepper drivers and are very cheap on ebay (£7 per motor).  The downside is that this sort of ebay driver is often a bit rubbish – this thread documents some of the bad things.  The worst problems:

  • powering down the logic side of the circuit but leaving the motor supply connected = burnt out driver.  
  • output current is often less than it should be (due to shut-down circuitry)
  • the circuit uses a pin on the TB6600 to provide 5V to the supporting circuitry – but that pin is meant for a decoupling capacitor; it isn’t meant to power anything else

So last night we decided to find out exactly what might be wrong with our drivers by tracing the schematic.  This way we can work out a method to avoid the worst problems and (hopefully) get reasonable performance out of our motors!

Here’s the schematic we came up with for our particular TB6600 driver – YMMV…

If you’re reading this because you’re interested in building your own robot, and you don’t fancy mucking about with reverse engineering cheap drivers from ebay, do yourself a favour and build/buy decent drivers – e.g.

It begins

Earlier this week, John and I had a design evening.  A few things were accomplished:

I brought in a THB6064 stepper driver (from my CNC machine) and we plugged it into the stepper.  John had written an interrupt-based Arduino stepper-driving program (complete with acceleration control!) and we tried that out.  We proved to ourselves that the motors we’ve chosen can indeed reach the maximum speed we’re after, but we’re still not sure what acceleration they’ll give us.  We learned that smoothly accelerating steppers is important, and that once they start to slip, you have to drop the step rate right down to re-capture the rotor.  More to do here, we think…
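John’s code is interrupt-driven C on the Arduino, so I won’t reproduce it here, but the acceleration idea is easy to show in a few lines of Python. A rough, illustrative sketch of ramping the step rate up gradually instead of jumping straight to full speed (pulse() is a hypothetical stand-in for toggling the STEP pin, and the numbers are made up):

```python
# Rough sketch of a linear acceleration ramp for a stepper (illustrative only -
# the real thing runs as interrupt-driven C on the Arduino).
import time

ACCEL = 2000.0      # steps/s^2
MAX_RATE = 4000.0   # steps/s - roughly the top speed we're after
MIN_RATE = 200.0    # starting rate the motor can manage from standstill

def pulse():
    """Emit one step pulse (stand-in - toggle the STEP pin here)."""
    pass

def run(steps):
    rate = MIN_RATE
    for _ in range(steps):
        pulse()
        time.sleep(1.0 / rate)                      # the gap between steps sets the speed
        rate = min(MAX_RATE, rate + ACCEL / rate)   # constant-acceleration ramp
    # if the motor stalls, drop 'rate' right back down to re-capture the rotor

# run(2000) would take 2000 steps, speeding up smoothly as it goes
```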

We had a discussion about how we wanted the robot to look (which has a bearing on how the chassis should be constructed).  The likely answer (nothing is set in stone) is that we’ll have flat side panels with a nice profile cut out of the top, instead of the flat plate we had last year.  This should mean we can make the robot a little prettier and possibly lighter too.  The downside is that it’s harder to make accurately, but if we use my CNC to carve out the side panels, we should be OK.

Material choice was also discussed – wood is nice and cheap, but heavy and needs painting. Metal is probably overkill, but we’ll certainly use it for anywhere we need to reinforce.  After a bit of googling, I discovered PVC foamboard – this is a plastic foam sheet with machining properties similar to wood, but less dense.  It can also be heat formed, so curves are possible. We’ve got a sample on order from ebay 🙂

Finally, we played skittles with the set we bought (the one linked from the event page).  This gave us a clue as to how hard the ball will need to hit the skittles – and now I’d be very surprised if simply pushing the ball into the skittles was enough to get a strike.  Looks like we’ll need to design a mechanism to accelerate the ball…  Looks like Leo White has been doing the same thing 🙂


We got selected!

PiWars 2015 was so popular that they couldn’t accommodate all the teams. However, we were one of the lucky few to be chosen, so we look forward to seeing you there in December.

Now we’ve just got to build a robot 🙂

Here’s a shot of the motor test chassis (a very fancy name for a plywood sheet!):

[Image: Metabot2 motor test chassis]

We’ll be using this to determine whether the motors have enough grunt for the performance we want.  Initial indications are that the drivers (little Pololu stepper drivers) get scarily hot when driving the current the motors need for best performance – so we’ll need to figure something out or they’ll die: either a bigger heatsink for these drivers, or bigger drivers.  Sadly, this also means we can’t yet tell whether these motors are going to be up to the job.  Maybe we can borrow some big drivers from somewhere?

Entering PiWars 2015

We had such a good time at PiWars last year that we’ve decided to enter again this year.  The application is in – now the nervous wait to see if we will be accepted.

In the meantime, we’ve been brainstorming approaches to the various events.  The Skittles event interests me particularly 🙂

On the hardware side, we’ve got a couple of stepper motors and a plywood baseboard screwed together as a test bed (we’d like to see if the motors are powerful and controllable enough for what we have in mind).

We’re planning to blog about it here – it’s going to be an interesting challenge writing about the journey without giving too much away!

PiWars 2014

Metabot

On Saturday 6th December 2014, a large number of robots and their makers descended on the Institute of Astronomy at Cambridge University for the first PiWars!

We were lucky enough to snag a competitor ticket for the event when applications opened in September and the race was on to build a robot to compete in all the events. Our entry was Metabot!

Hopefully over the next week or two I’ll be updating this page and detailing how we went about creating Metabot. In the meantime here’s a compilation of the best bits to whet your appetite. Metabot is the large, black, chunky wooden robot which often has a ramp attached to the front.