In the ’80s, when I first started building robots, I found there were roughly three ways I could program my bot for autonomous navigation, depending on the sensors available. The first option was basic roaming: the robot would simply move in a given direction until a sensor detected something in its path, at which point it would choose a new path based on where the object was detected. The second option was to use beacons or a line to guide the robot from one point to another.
Finally, a third method known as dead reckoning was demonstrated to me via a plastic toy tank created by Milton Bradley in 1979 (see Figure 1). This six-wheeled toy tank, known as “Big Trak,” was probably the most advanced toy I had seen in a while. It used an internal optical wheel encoder to determine how far it moved and could execute commands telling it how far to go in a direction, how much to turn, and even how long to stop.
Wide Open Spaces
In a living room or kitchen, the Big Trak could be placed in a fixed spot at a fixed angle and navigate using dead reckoning with fairly good repeatability. However, the playing field opens up for vehicles that move faster and farther, such as radio controlled boats, planes, and even larger robots. In these situations, dead reckoning really isn’t that useful. To complicate things even more, most typical sensors don’t have the range to handle avoidance at the speeds these vehicles move. Lines become useless and beacons become impractical. In a larger arena, you really need something that works on a much larger scale.
Fast Forward
What wasn’t available to the general public 20 years ago has now become widely available and extremely accessible to the robotics hobbyist. GPS — or the Global Positioning System — can provide larger vehicles and robots with a new method of navigation. GPS receivers are now integrated into cell phones, vehicles, and even camcorders. For the hobbyist, GPS modules have become widely available from sources such as Parallax, Inc. for well under $100 (see Figure 2). Using GPS, a robot, boat, or plane can easily travel great distances autonomously with very little extra hardware.
How Does It Work?
GPS works by calculating a position based on signals received from a network of satellites orbiting the earth. Each satellite broadcasts its time of transmission, precise orbital information, and its health status. The GPS receiver uses this information to calculate its position on the globe through trilateration. Like dead reckoning, GPS is prone to errors, and a small timing error can translate into a large position error. At least three satellites are required for the receiver to calculate a basic two-dimensional fix (a fourth is needed for altitude); more satellites mean more accuracy.
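To see the principle in miniature, here is a sketch of trilateration on a flat plane; a real receiver solves the 3-D version and corrects for its own clock error, but the idea is the same. The beacon positions, ranges, and function name are made-up values for illustration:

```python
def trilaterate(b1, b2, b3, r1, r2, r3):
    """Find (x, y) given three beacon positions and the range to each.

    Each beacon defines a circle of possible positions. Subtracting
    the circle equations pairwise cancels the x^2 and y^2 terms,
    leaving a simple 2x2 linear system to solve.
    """
    (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = r1**2 - r2**2 - x1**2 - y1**2 + x2**2 + y2**2
    c2 = r1**2 - r3**2 - x1**2 - y1**2 + x3**2 + y3**2
    det = a11 * a22 - a12 * a21
    x = (c1 * a22 - c2 * a12) / det
    y = (a11 * c2 - a21 * c1) / det
    return x, y
```

With beacons at (0, 0), (10, 0), and (0, 10) and ranges measured from the point (3, 4), the function recovers that point exactly — which is why a range error from any one satellite shifts the computed fix.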
Bringing It All Down To Earth
If you really want to understand trilateration, there’s an excellent article on Wikipedia that explains how GPS receivers do this. For our purposes, we will be dealing with a flat map and a much smaller navigation area. The GPS receiver will provide us with several key pieces of information, such as latitude, longitude, altitude, and time. Most also provide heading and speed; however, I recommend using only the latitude, longitude, and time. Altitude on most GPS receivers is not nearly as accurate or consistent as position. Heading and speed are derived from changes in position and are also not very reliable. Since both require movement, they’re useless while the robot is standing still.
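Most hobby receivers stream this information as standard NMEA 0183 sentences over a serial line. As a sketch, assuming a `$GPGGA` sentence, the time, latitude, and longitude can be pulled out like this (the function name is mine, the sample sentence is a made-up example, and checksum validation is omitted):

```python
def parse_gpgga(sentence):
    """Pull UTC time and position (in decimal degrees) from a $GPGGA sentence.

    NMEA packs latitude as ddmm.mmmm and longitude as dddmm.mmmm,
    so the minutes portion must be divided by 60.
    """
    fields = sentence.split(",")
    utc = fields[1]
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return utc, lat, lon
```

Feeding it `"$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"` yields roughly 48.1173° N, 11.5167° E at 12:35:19 UTC.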
Sensor Fusion
Adding a compass module such as the HMC6352 to your GPS receiver allows you to get a more accurate heading. This compass (see Figure 3) can be obtained from Parallax as well. Fusing the data from the GPS and the compass module starts with knowing the destination coordinates as well as the current coordinates. The destination coordinates are often known in advance and programmed into the robot ahead of time.
This same concept is the basis for the RoboMagellan competition where robots are given access to the GPS coordinates of the waypoints on the course, and the robot must navigate from its current position to the point or points on the map. Each waypoint is designated by a large orange traffic cone allowing the robot to confirm it has reached that waypoint visually.
Going Back To School
In order to calculate the heading to the next waypoint from the current position, you have to do a little math. To simplify things, let’s assume we won’t be crossing the equator or the prime meridian, which is unlikely anyway. By doing this, we can treat all latitude and longitude values as absolute. So, to calculate distance and heading, you will plot a point at the coordinates you are heading to relative to the coordinates you are at. Looking at Figure 5, we will say that the point where the x and y axes intersect is our home point. This represents the current coordinates. Point A represents our waypoint destination.
Essentially, what we’ve done is create a virtual right triangle, which gives us two important values. By treating all values as absolute, you can simply subtract the x values to get a and subtract the y values to get b. Using the Pythagorean Theorem, you can now calculate c. Remember, a² + b² = c². This gives us our distance, which can be used on a robot with wheel encoders to ensure we travel far enough. Next, we calculate the heading.
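Treating the map as flat, the distance step can be sketched as follows (the function name is mine; note the raw result is in degrees of latitude/longitude, so you will want to scale it into your encoder’s units before using it):

```python
def flat_distance(home_lat, home_lon, wp_lat, wp_lon):
    """Hypotenuse of the virtual right triangle: a^2 + b^2 = c^2.

    a and b are the legs of the triangle formed by the current
    position (home) and the waypoint; the result is in map units
    (degrees here), fine for the short hops a robot makes.
    """
    a = abs(wp_lat - home_lat)   # north-south leg
    b = abs(wp_lon - home_lon)   # east-west leg
    return (a * a + b * b) ** 0.5
```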
Some trigonometry is involved in this step. In order to obtain θ (theta), you take the inverse tangent (arctangent) of the ratio shown in Figure 5. This gives you the heading you need to be on, in degrees. Based on your current heading from the compass, you can then calculate how far to turn. For the most part, you won’t have to worry about the calculations; they should all be tucked into a nice subroutine. About the only thing not covered here is some filtering to smooth out changes. Once things are dialed in, the robot shouldn’t veer much. As a safety measure, we can make it stop whenever there is no GPS lock.
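A sketch of that subroutine (function names are mine), using the two-argument arctangent so the result lands in the correct quadrant, with 0° as north and 90° as east, plus the turn needed relative to the compass reading:

```python
import math

def bearing_to_waypoint(home_lat, home_lon, wp_lat, wp_lon):
    """Heading from home to the waypoint: 0 = north, 90 = east."""
    a = wp_lon - home_lon   # east-west leg of the triangle
    b = wp_lat - home_lat   # north-south leg
    return math.degrees(math.atan2(a, b)) % 360.0

def turn_needed(current_heading, target_heading):
    """Smallest signed turn in degrees, -180..+180 (positive = clockwise)."""
    return (target_heading - current_heading + 180.0) % 360.0 - 180.0
```

The wrap-around in `turn_needed` matters: with the compass reading 350° and a target bearing of 10°, the answer should be a 20° clockwise turn, not a 340° spin the other way.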
Final Thoughts
The specifics of handling the data from the compass module will vary depending on which compass module you use and how its data is formatted. The specifics of dealing with the distance data will vary some as well, depending on your microcontroller, drive system, encoders, and overall resolution.
In Part 2, we will explore a real world example of having a robot navigate through several waypoints to reach a destination. Based on the information provided here, I can almost see the light bulbs over your heads and the wheels turning (both figuratively and literally). You’ve got a month to implement these ideas. Until then, brush up on your math and have fun experimenting!
Resources
Discuss this project on Savage///Chats
This project was published in the March 2010 issue of Servo Magazine
Note:
THIS IS A RESTORED ARCHIVE PROJECT AND AS SUCH MAY BE MISSING PHOTOS, PARTS LIST, SCHEMATIC, SOURCE CODE, ETC.
PLEASE FEEL FREE TO LEAVE YOUR COMMENTS, QUESTIONS, SUGGESTIONS OR FEEDBACK ON THIS POST.