The Robot
My robot is based on the Rug Warrior Pro platform, but with quite a few modifications and improvements. Below is a list of the standard Rug Warrior features; more information can be found here.
Motorola MC68HC11 microcontroller
Interactive C programming environment
LCD screen; two lines, 16 characters per line
32 KB of battery-backed RAM
RS-232 serial port
Full coverage collision detector
Dual photoresistor light sensors
Infrared obstacle emitters and detector
Microphone
Dual shaft encoders
Two-channel motor driver chip
Two 6 volt gear head drive motors
Piezoelectric buzzer
The standard Rug Warrior platform is a very nice starter kit for hobby robotics, but after a while the processing power of the microcontroller starts to become a barrier to the algorithms that can be implemented. For that reason I had to make a few additions. There are quite a few things that could still be improved (e.g. the sonar and compass placement), but the last few months I have been quite busy, so I have not had time to address them (and in any case the inaccuracies produced by the sensor placement can be used to test the robustness of the algorithms).
RF Link
In order to take the processor-intensive tasks off the on-board microcontroller, an RF link was required. I use the Radiometrix BIM transceivers, which are quite easy to interface to a standard serial port. One transceiver is connected to the 68HC11 serial port and the other to my Linux box, both running at 9600bps. The BIM modules do not take care of retransmission and error detection; they simply modulate and transmit the data received through the serial port. When working with RF there are a few things that must be considered: first, the link is unreliable and bit errors occur quite frequently; second, the transmitted data stream must have an even distribution of 1s and 0s.
I therefore implemented a simple error detection and retransmission protocol. Before the data are sent to the serial port they are first checksummed, and each byte is padded with its complement to ensure an equal distribution of 1s and 0s. To avoid collisions a master-slave architecture is used: the Linux box sends requests to the robot and the robot responds. If a response is not received, the request is retransmitted. The link is quite reliable within about 30m indoors, and the range can be extended to 100m if a better antenna is used (!). The protocol was quite easy to implement on the Rug Warrior using Interactive C. With all sensors sampled, I managed to get a 3Hz status update rate from the robot, which is more than enough for my application.
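As a minimal sketch of the framing described above, here is what the PC side could look like. The class and method names are mine and purely illustrative: each payload byte is followed by its complement so the stream has an even mix of 1s and 0s, and a checksum pair is appended for error detection.

    // Illustrative framing for the BIM link (names and layout are assumptions).
    // Every payload byte is followed by its one's complement, and a checksum
    // byte (plus its complement) is appended at the end of the frame.
    public class RfFrame {

        /** Encode a payload into byte/complement pairs followed by a checksum pair. */
        public static byte[] encode(byte[] payload) {
            byte[] frame = new byte[(payload.length + 1) * 2];
            int checksum = 0;
            for (int i = 0; i < payload.length; i++) {
                frame[2 * i] = payload[i];
                frame[2 * i + 1] = (byte) ~payload[i];   // complement padding
                checksum += payload[i] & 0xFF;
            }
            byte sum = (byte) checksum;
            frame[frame.length - 2] = sum;
            frame[frame.length - 1] = (byte) ~sum;
            return frame;
        }

        /** Decode a frame; returns null if a complement or the checksum does not match. */
        public static byte[] decode(byte[] frame) {
            if (frame.length < 2 || frame.length % 2 != 0) return null;
            byte[] payload = new byte[frame.length / 2 - 1];
            int checksum = 0;
            for (int i = 0; i < payload.length; i++) {
                byte b = frame[2 * i];
                if (frame[2 * i + 1] != (byte) ~b) return null;  // bit error detected
                payload[i] = b;
                checksum += b & 0xFF;
            }
            byte sum = frame[frame.length - 2];
            if (frame[frame.length - 1] != (byte) ~sum) return null;
            return (byte) checksum == sum ? payload : null;      // verify the checksum
        }
    }

On a decode failure (or a timeout with no response at all), the master side simply resends the last request.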
Range Sensor
The popular Polaroid sonar range sensor was chosen; mounted on an RC servo it is possible to take a 180deg sonar sweep. Two PIC16C84 microcontrollers are used to drive the servo and the sonar sensor. Both PICs are memory mapped into the 68HC11 address space. The Polaroid sensors can measure distances from about 15cm to 10m with 1cm accuracy. Of course sonar ranging has certain disadvantages; in my opinion ghost ranges are the most annoying. In certain situations (especially in corners), the ultrasonic pulse may be reflected several times instead of returning directly to the sensor. When the last reflection finally arrives, the increased time of flight results in a measured range that is much longer than the true one.
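The range itself follows directly from the time of flight: the pulse travels out to the obstacle and back, so the distance is half the round-trip time multiplied by the speed of sound. A minimal sketch of that conversion (the speed-of-sound value and the class and method names are illustrative):

    // Convert a sonar echo's round-trip time of flight into a range in metres.
    public class SonarRange {
        // Assumes roughly 343 m/s for the speed of sound at room temperature.
        private static final double SPEED_OF_SOUND = 343.0;  // m/s, illustrative value

        public static double rangeFromTimeOfFlight(double roundTripSeconds) {
            return SPEED_OF_SOUND * roundTripSeconds / 2.0;   // halve it: out and back
        }
    }

This is also why ghost ranges show up as overly long readings: every extra reflection adds path length, and therefore time of flight, before the echo reaches the transducer.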
Digital Compass
A digital compass is necessary to get an orientation fix, since the Rug Warrior wheel encoders suffer from noise when the motors are running. In my experience the Rug Warrior wheel encoders are not reliable for long range dead reckoning (but are good enough when combined with a more accurate positioning source). The Vector 2x digital compass is used; it gives up to 2deg accuracy and is controlled through an SPI port. Like any compass, it is influenced by magnetic fields. I found that when the motors are running a 5-10deg error is produced, and large metallic objects may also affect the reading. Care must be taken to mount the compass level with the ground, since large errors may be produced if the compass is inclined.
The PC Software
All the control software is written in Java. My development platform is the following:
PII at 330MHz
64MB RAM
Red Hat Linux 5.2
Sun JDK1.3
RxTx CommAPI (for controlling the serial port)
Jini 1.1
The following figure shows the software structure. In my implementation all the objects shown in the figure are Jini services that communicate using RMI. One thing to keep in mind is that this is also a learning exercise for me, so some things may seem redundant; the software is designed in such a way that the use of Jini or RMI is optional. In fact, the demonstration version that I am working on now (using the RP1 simulator) runs all objects in a single VM as a standalone application.
Robot Proxy
This is a generic proxy that handles the low level robot control and communication. For example, there are methods for controlling the motors, turning the robot, sweeping the range sensor, etc. I tried to keep the interface independent of my robot architecture so that it can be used by almost any mobile robot. All implementations must implement the RobotProxy interface. My present implementation, among other things, handles the RF communication protocol and translates high level commands into a sequence of low level commands. But the implementation can be whatever you want as long as the RobotProxy interface is implemented. Presently I am working on a RobotProxy for the Rossum RP1 simulator; that way it will be possible to develop and test the algorithms without even having access to a robot. The rest of the software components will not even know that the implementation is not a real robot, since they only see the RobotProxy interface.
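As a rough illustration of what such a hardware-independent interface might look like (the method names and signatures below are my own sketch, not the actual RobotProxy interface):

    // Illustrative sketch of a hardware-independent robot proxy interface;
    // method names and signatures are assumptions, not the real interface.
    public interface RobotProxy {

        /** Drive both motors; speeds normalised to the range [-1.0, 1.0]. */
        void setMotorSpeeds(double left, double right);

        /** Rotate the robot in place by the given angle, in degrees. */
        void turn(double degrees);

        /** Take a sonar sweep; returns one range in metres per servo step. */
        double[] sweepRangeSensor();

        /** Latest sensor snapshot (encoder counts, compass heading, bumpers, ...). */
        RobotStatus getStatus();

        /** Minimal placeholder for the status object written into the RobotSpace. */
        class RobotStatus {
            public long leftEncoderTicks, rightEncoderTicks;  // wheel encoder counts
            public double headingDegrees;                     // compass heading
            public boolean bumperPressed;                     // collision detector state
        }
    }

An implementation for the real robot translates each call into RF requests, while the RP1 version would talk to the simulator instead.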
Robot Space
Originally I was planning to use a JavaSpace as an object repository, but later I found out that some features I needed were missing, and in addition I didn't want to tie my software to Jini. For that reason I implemented a simple RobotSpace. When I started implementing the RobotProxy object I found that certain objects, like RobotStatus and RangeSensorSweep, would be needed by many other objects. That means whenever the range sensor is sampled, the results need to be delivered to a number of interested listeners. One way to do that would be to register listeners with the proxy object and invoke a callback whenever a range sensor sweep takes place. That would put extra load on the proxy, especially if one of the listeners stops responding for some reason. A more robust architecture is to use an object space (I am not sure if my implementation qualifies as a space). Any object that needs to receive results from another object first creates a queue in the space, supplying a Filter object at the same time. Whenever a result is produced by another object (e.g. a Location object by the PositioningController or a RobotStatus object by the RobotProxy), it is written into the space. The space then checks whether the result is accepted by any of the registered filters; if so, the result is put into the corresponding queue.
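A minimal sketch of that queue-and-filter idea, using modern Java collections for brevity (the class and method names are mine, not necessarily those of the actual RobotSpace):

    import java.util.Map;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.LinkedBlockingQueue;

    // Minimal sketch of an object space with per-consumer queues and filters.
    // Names are illustrative; the real RobotSpace interface may differ.
    public class SimpleRobotSpace {

        /** Decides whether an object written to the space belongs in a given queue. */
        public interface Filter {
            boolean accepts(Object result);
        }

        private final Map<String, BlockingQueue<Object>> queues = new ConcurrentHashMap<>();
        private final Map<String, Filter> filters = new ConcurrentHashMap<>();

        /** A consumer creates a queue and supplies the filter that feeds it. */
        public BlockingQueue<Object> createQueue(String name, Filter filter) {
            BlockingQueue<Object> queue = new LinkedBlockingQueue<>();
            queues.put(name, queue);
            filters.put(name, filter);
            return queue;
        }

        /** A producer writes a result; it is copied into every accepting queue. */
        public void write(Object result) {
            for (Map.Entry<String, Filter> entry : filters.entrySet()) {
                if (entry.getValue().accepts(result)) {
                    queues.get(entry.getKey()).offer(result);
                }
            }
        }
    }

A consumer that only cares about range data would register a filter that accepts RangeSensorSweep objects and then block on its own queue; a slow or unresponsive consumer only stalls its own queue, not the producer.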
Positioning Controller
The positioning controller is responsible for maintaining the global map and estimating the robot's current position. The controller receives sonar sweeps and estimates the current position based on the present global map; if enabled, it will also update the global map. The current position is generated as a Location object that is saved into the RobotSpace for further use by other objects (e.g. the ExplorationAlgorithm). The positioning algorithm presently used (the Feasible Pose Localization Algorithm) is much faster if the approximate position is known. To provide that, the common dead reckoning approach is used: the RobotStatus object is retrieved from the space and any position changes measured by the wheel encoders are processed. That way a location estimate is always available, and at regular intervals a sonar sweep is taken to correct the errors accumulated by dead reckoning. The positioning controller operates in one of a number of high level modes (presently exploration and navigation). The mode defines the action that should be taken when a sonar sweep is received.
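As a rough illustration of the dead reckoning step between sonar corrections, here is a standard differential-drive odometry update (the constants, field names and encoder scale are made up for the example):

    // Illustrative dead-reckoning update for a differential-drive robot.
    // Constants and names are assumptions, not the actual implementation.
    public class DeadReckoning {
        private double x, y;      // position in metres
        private double theta;     // heading in radians

        private static final double METRES_PER_TICK = 0.002;  // encoder scale (example)
        private static final double WHEEL_BASE = 0.25;        // wheel separation in metres (example)

        /** Update the pose from the encoder tick deltas reported since the last status. */
        public void update(long leftTicks, long rightTicks) {
            double dLeft = leftTicks * METRES_PER_TICK;
            double dRight = rightTicks * METRES_PER_TICK;
            double dCentre = (dLeft + dRight) / 2.0;        // distance moved by the robot centre
            double dTheta = (dRight - dLeft) / WHEEL_BASE;  // change in heading

            x += dCentre * Math.cos(theta + dTheta / 2.0);
            y += dCentre * Math.sin(theta + dTheta / 2.0);
            theta += dTheta;
        }
    }

Each update accumulates encoder noise, which is exactly why the compass fix and the periodic sonar sweeps are needed to pull the estimate back towards the true position.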
Positioning Controller GUI
This is the graphical front end for the positioning controller. It can be used to display sonar sweeps and the global map in a variety of ways.
Exploration Controller
The exploration controller determines where and how the robot should go so that all accessible space is properly explored and mapped.
Robot Space Logger
The space logger receives and stores all objects sent to the space. That way it is possible to reproduce a sequence of events. This is mostly useful during testing; for example, when a modification to the positioning controller is made there is no need to redo the range sensor sweeps. The logger can be used to store and replay these sweeps.
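A small sketch of how such a logger could record and replay space traffic, reusing the SimpleRobotSpace sketch above (names are illustrative, not the actual implementation):

    import java.io.EOFException;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;

    // Illustrative space logger: every object written to the space is serialized
    // to a log file, and replay() writes the recorded objects back into the space
    // so a session (e.g. a set of sonar sweeps) can be reprocessed later.
    public class SpaceLogger {
        private final ObjectOutputStream log;

        public SpaceLogger(String logFile) throws IOException {
            this.log = new ObjectOutputStream(new FileOutputStream(logFile));
        }

        /** Called for every object the space accepts (via an accept-all filter). */
        public synchronized void logObject(Object result) throws IOException {
            log.writeObject(result);
            log.flush();
        }

        /** Write every logged object back into the space in the original order. */
        public static void replay(String logFile, SimpleRobotSpace space)
                throws IOException, ClassNotFoundException {
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(logFile))) {
                while (true) {
                    try {
                        space.write(in.readObject());   // feed the recorded object back in
                    } catch (EOFException endOfLog) {
                        break;                          // no more recorded objects
                    }
                }
            }
        }
    }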
If you have any questions or suggestions about these pages, please drop me an email.
Copyright Vassilis Varveropoulos
Last updated 19/11/00