Last Update: 11/27/02
For Christmas 2000 we ordered our then five-year-old a RAD 2.0 toy robot from eToys. Since the robot still hadn't arrived a day or so before the big day, I made a quick trip out of town to get a standby robot in case the mail order didn't make it. The mail-order one arrived two days AFTER Christmas, so I was relieved I had made the emergency trip. However, that left me with an extra RAD. Of course we could have taken it back or shipped the one from eToys back for a refund... BUT with visions of both me and my son playing with our own RADs (which we do and enjoy) and the possibilities of hacking one of them, the decision was made to just keep it. So the project began. Generally this is just a fun, try-different-things project at this point.
Modify a R.A.D. 2.0 toy robot for autonomous and hobby use, for entertainment and educational purposes. I'm still looking at using an external laptop to receive data from the BX24 via a one-way RF data link and drive the robot based on that input. The onboard controller will read the encoder data, process all the sensors, and then transmit this data back to the laptop for processing and decision making. Actually, the BX24 will do the odometry calculations as well and send back X/Y data, heading, and all sensor information.
11/2002 - Encoders added, BX24 and H-Bridge added
Finally got interested in this again. Instead of driving the remote from a laptop, at this point I have an H-Bridge to control the drive motors by direct pulse width modulation from the BX24 controller. This provides better control and dead reckoning capabilities but may be bypassed once the data link is established. Read more below. Currently the base can roam about based on the dead reckoning targets provided and generally return to home within a few inches.
03/31/2001 - Camera Added
For the fun of it I stuck an extra wireless X10 color camera on top of the hacked RAD to see a 'robots eye' view. It's rather fun to drive the thing around using the 'tele-presence' provided by the camera.
02/25/2001 - Project Start
Created this web page. The idea has been around since early January 2001, but I'm just now really getting started. Surprised there aren't more web sites/others that have done this.
The overall goal is broken down into several sub-goals as follows:
Hacking the Remote Control (Done)
Implement Motor Control (Done)
Implement Onboard Micro controller (Done)
Implement Onboard Wheel Encoders (Done)
Implement Onboard IR Sensors
Implement Onboard Light Sensors
Implement Waist and Arm Positional Feedback
Implement Sonar Sensor
Implement PIR Sensors
The first step of the project is getting control of the robot by the computer or microcontroller. The existing 49 MHz or 27 MHz handheld remote control has been hacked to allow all existing functions of the robot to be controlled by the PC. These functions include Left Motor 1) Forward and 2) Reverse, Right Motor 3) Forward and 4) Reverse, 5) Arms Open, 6) Arms Close, 7) Waist Up, 8) Waist Down, 9) Fire Missiles, 10) Talk, and 11) 'Shift' Function.
The parallel port will be used to drive the remote control. Instead of using up multiple I/O lines, the lower three bits of the parallel port will drive a 74LS138 3-to-8 decoder, providing up to 8 digital outputs from three output pins. The 8 outputs will be used for the non-drive-motor functions, since the remote does not allow any of these to be used simultaneously anyhow. In reality you only get seven controls, since the first output of the 74LS138 is active whenever the lower bits are all zero and so can't be used as a command. We end up with the following lower-nibble values to control these features:
0 - Nothing, 1 - Arms Close, 2 - Arms Open, 3 - Waist Up, 4 - Waist Down, 5 - Fire Missile, 6 - Talk, 7 - Shift Function
The Shift function is used in conjunction with the Left/Right motor controls to turn on/off the audio and built-in light effects the remote has, which get very annoying quickly, so I'm sure turning them off will be the first thing the system does on startup.
The onboard controller is used to collect all the sensor data, make some basic decisions, and transmit the data back to the laptop or desktop PC. I selected a BX24 controller due to its expanded EEPROM memory and onboard floating-point math abilities. The BX24 is mounted in a home-built carrier board that provides onboard voltage regulation for the BX24 and a 5-volt output for other devices. Headers have been provided with a Gnd / +5V / IO pin layout, providing the ability to power any sensors attached. In reality the BX24 can be programmed to drive the robot by itself if the motors are connected to the BX24, which is the current plan.
The biggest downside to the BX24 is its limited RAM of 400 bytes. Although 32 KB of EEPROM is great, when you try to implement odometry, dead reckoning, and any PID control, the RAM problem comes up quickly.
For better control of the motors I am looking at taking the signal coming from the remote control and feeding it into the BX24 via digital inputs, letting the BX24 logic control the motors. If that works out, I will leave the H-Bridge in the base; otherwise I'll pull the H-Bridge, hook the onboard logic back to the motors, and just let the laptop control the robot via the remote. I'm just not sure what the best way to handle this is at this time.
To allow any possibility of knowing where the robot actually is, onboard wheel encoders were required and dead reckoning implemented. Obviously it will not be perfect, but it should provide enough data to give approximate X/Y coordinates and a heading for the robot. Since the robot is track-driven, it will be somewhat of a challenge to get good encoder data, but it should be good enough for my purposes.
Originally the concept was to pick the signal off the drive or bogey wheels, but that soon proved to be inadequate. Instead I ended up tapping off the secondary drive gear in the gearboxes to obtain the feedback. This provides much higher resolution for the encoders and is much more stable.
If I end up using the remote control to drive the motors, I will still have to know which direction the motors are moving, since the encoders are NOT quadrature based.
A TWS 433 / RWS 433 transmitter/receiver pair is to be used for the back-link data stream, which will be read by the laptop or PC to control the robot.
Initial object detection will be via a pair of IR sensors mounted in the base of the robot. Since the base has two areas that angle outward, it should provide a reasonable location to mount the IR detectors. Some simple Sharp binary detectors will be used, with the emitters driven by the BX24 at this point.
If required, bump sensors may be mounted on the base of the robot to cover anything the IR sensors cannot see such as black walls, items, etc.
Two CDS cells are to be mounted near the eye areas of the robot providing left/right light level sensing for light following or avoidance. These will be connected to two BX24 ADC inputs for tracking the light data.
The initial goal of this step is simply to attach feedback pots that can be read by the microcontroller's analog-to-digital inputs to feed a basic position of the waist and arms back to the PC for correctional control. Without this the controller would have to guess where the waist and arms were at any particular time.
A sonar sensor scavenged from a $6 ebay camera will be installed in the front of the robot either where the existing missile system is or in the mouth area where the speaker is now. I am somewhat concerned that the arms might create reflections for sonar if the transducer is mounted in the missile spot.
To provide some additional fun for the robot, two PIR sensors will be mounted, possibly in the eye areas, to detect and track humans or other warm-bodied occupants in the area. Code will then be developed for how to react to this presence.
Just for fun I stuck this camera on top of the bot to see what it is like trying to drive the thing by watching the monitor only. I can sit in the living room chair and drive the bot around the house (inside and outside) using only the monitor. Really funny watching the dog's reaction to the robot moving in her 'space' and me speaking through the speaker at her.
Proposed BX24 Pin Configuration
|Pin||Function||Type|
|5||74ls138 Output Bit 0||Digital Output|
|6||74ls138 Output Bit 1||Digital Output|
|7||74ls138 Output Bit 2||Digital Output|
|8||Left IR Input||Digital Input|
|9||Right IR Input||Digital Input|
|10||Sonar Echo Input (trig by 138)||Digital Input|
|11||Left Motor Direction Control||Digital Output|
|12||Right Motor Direction Control||Digital Output|
|13||Left Encoder||Digital Input|
|14||Right Encoder||Digital Input|
|15||Waist Position||Analog Input|
|16||Arms Position||Analog Input|
|17||Left Eye||Analog Input|
|18||Right Eye||Analog Input|
|19||Left PIR||Digital Input|
|20||Right PIR||Digital Input|
|25||IR Emitter||Digital Output (Pulsed)|
|26||Left Motor PWM||Digital Output (PWM)|
|27||Right Motor PWM||Digital Output (PWM)|