N94-34037

Ground Vehicle Control at NIST: from Teleoperation to Autonomy

Karl N. Murphy, Maris Juberts, Steven A. Legowik, Marilyn Nashman, Henry Schneiderman, Harry A. Scott, and Sandor Szabo
Robot Systems Division
National Institute of Standards and Technology
Gaithersburg, MD 20899

ABSTRACT

NIST is applying its Real-time Control System (RCS) methodology to the control of ground vehicles, both for the U.S. Army Research Lab, as part of the DOD's Unmanned Ground Vehicles program, and for the Department of Transportation's Intelligent Vehicle/Highway Systems (IVHS) program. The actuated vehicle, a military HMMWV, has motors for steering, brake, throttle, etc. and sensors for the dashboard gauges. For military operations, the vehicle has two modes of operation: a teleoperation mode, where an operator remotely controls the vehicle over an RF communications network, and a semi-autonomous mode called retro-traverse, where the control system uses an inertial navigation system to steer the vehicle along a prerecorded path. For the IVHS work, intelligent vision processing elements replace the human teleoperator to achieve autonomous, visually guided road following.

1. INTRODUCTION

NIST's involvement in unmanned ground vehicles started in 1986 with the U.S. Army Research Lab's (ARL, formerly LABCOM) techbase program. This program became part of the Defense Department's Robotics Testbed program, resulting in Demo I. NIST's responsibility included implementing a mobility controller and developing an architecture for unmanned ground vehicles (UGV) which would support integration and evaluation of various component technologies [1,2,3].

In a typical scenario, military personnel remotely control Robotic Combat Vehicles (RCVs) from an Operator Control Unit (OCU). Each vehicle contains: actuators on the steering, brake, throttle, transmission, transfer case, and parking brake; an inertial navigation system; a mission package which performs target detection, tracking, and laser designation; and data and video communication links. The OCU contains controls and displays for route planning, driving, operation of the mission package, and control of the communication links. A typical mission includes a planning phase where the operator plans a route using a digital terrain data base. The operator then remotely drives the vehicle to a desired location as the vehicle records the route using navigation data. The operator activates the mission package for automatic target detection, and when targets are detected, the mission package designates them with a laser. The vehicle then automatically retraces the recorded route, a process termed retro-traverse.

In 1992 NIST demonstrated vision based autonomous driving, expanding its vehicle control applications into the civilian arena as part of the Department of Transportation's Intelligent Vehicle/Highway Systems (IVHS) program [4-9]. IVHS is a major initiative of government, industry, and academia to improve the nation's surface transportation [10]. One IVHS component, Advanced Vehicle Control Systems (AVCS), employs advanced sensor and control technologies to assist the driver. In the long term, AVCS will provide fully automated vehicle/highway systems, replacing the human driver altogether.
The use of vision-based perception techniques for autonomous driving is being investigated in many programs in the United States as well as in other countries [11]. Use of machine vision as a primary sensor has promise in that the infrastructure impact is minimized relative to other approaches.

This paper describes the testbed vehicle and support van. It presents the RCS reference model architecture for an autonomous vehicle and its implementation on the NIST vehicle. The paper then briefly describes the applications of teleoperation, retro-traverse, and autonomous driving.

2. TESTBED AND SUPPORT VEHICLES

The unmanned vehicle, a HMMWV, was actuated by NIST, ARL, and the Tooele Army Depot as part of the DOD's Unmanned Ground Vehicles program [1,2]. Figure 1 is a photograph of the testbed vehicle. The vehicle contains electric motors for steering, brake, throttle, transmission, transfer case, and park brake, and sensors to monitor the dashboard gauges indicating speed, RPM, and temperature.

Figure 1. Testbed vehicle followed by support van.

A mobile computing and communications van was prepared to support NIST's development work [6,7]. This van houses development and support hardware, provides communication for operator control units during teleoperation, and contains the required computing systems to support lane following on public roadways. During lane following, video imagery is gathered by a camera on the HMMWV and is sent by a microwave link to the chase van. The image information is processed in the van. Vehicle control commands are computed and then sent back to the HMMWV control computer via an RF data link. Although the ultimate goal is to mount all vision processing and vehicle mobility controller real-time computational resources on the test vehicle, a portable development and performance evaluation facility will still be necessary.

3. RCS CONTROL ARCHITECTURE

One of the first steps performed by NIST to support its evaluation of autonomous vehicle component technology was to develop a reference model. The reference model describes what functions are to be performed and attempts to organize them based on a consistent set of guidelines [1,2]. Figure 2 shows a portion of the reference model architecture for an autonomous land vehicle. Modules in the hierarchy are shown with Sensor Processing (SP), World Modeling (WM), and Task Decomposition (TD). The sensory processing modules detect, filter, and correlate sensory information. The world modeling modules estimate the state of the external world and make predictions and evaluations based on these estimates. The task decomposition modules implement real-time planning, control, and monitoring functions. The roles of these submodules are further described in [8]. This reference model has not been fully implemented but has served as a guide throughout the years as various control nodes were completed and as the vehicle's capability increased from teleoperation to autonomous driving.

Figure 2. Reference model control architecture for an autonomous land vehicle.

The highest level of control for an individual vehicle, the Task level module, executes mission tasks phrased in symbolic terms, such as: Drive to exit 11 on I-270. A vehicle may be equipped with several subsystems, such as navigation, perception, and mission modules, which are directed by the Task level to achieve certain phases of the task.
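To make the SP/WM/TD decomposition concrete, the sketch below shows one control node of the hierarchy as it might look in code: SP filters raw measurements, WM maintains a state estimate, and TD maps a command plus the current estimate to subcommands for the level below. This is a minimal illustration of the reference model's structure, not NIST's implementation; all class and method names here are hypothetical.

    # Illustrative sketch of one RCS control node. Names are hypothetical.

    class SensorProcessing:
        def process(self, raw_measurements):
            # Detect, filter, and correlate sensory information.
            return {k: v for k, v in raw_measurements.items() if v is not None}

    class WorldModel:
        def __init__(self):
            self.state = {}

        def update(self, observations):
            # Estimate the state of the external world from observations.
            self.state.update(observations)
            return self.state

    class TaskDecomposition:
        def plan(self, command, state):
            # Real-time planning/control: map a symbolic command and the
            # current state estimate to a subcommand for the level below.
            return {"subcommand": command, "based_on": dict(state)}

    class RcsNode:
        """One level of the hierarchy (e.g. Task, Emove, Prim, or Servo)."""
        def __init__(self):
            self.sp = SensorProcessing()
            self.wm = WorldModel()
            self.td = TaskDecomposition()

        def cycle(self, command, raw_measurements):
            observations = self.sp.process(raw_measurements)
            state = self.wm.update(observations)
            return self.td.plan(command, state)

In this picture, each level runs its own sense-model-act cycle, and a node's output becomes the command input of the node below it.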
The implementation for the U.S. Army Research Lab at Demo I used the lower two levels, Prim and Servo, of the mobility part of the reference model architecture to perform the mission elements. The servo level mobility controller drives motors for steering, brake, throttle, transmission, etc. and monitors the dashboard gauges. Vehicle navigation sensor data (position, velocity, and acceleration) is processed and used to update the WM in the lowest level of the navigation subsystem. This data is used for steering and speed control of the vehicle during retro-traverse.

Extensions to the control system were necessary for implementing the IVHS autonomous road following [4,5]. The lower two levels, Prim and Servo, on the perception side of the generic vehicle control system were developed (see Figure 2). The vision perception system uses a model of the lane edges to assist in the prediction and tracking of the lane markers on the road. The computed coordinates of the center of the lane are then used to steer the vehicle, in a similar fashion to retro-traverse.

Additional work on car following and collision avoidance requires the implementation of the next higher level of the control system, Emove. In this case the control system uses the visual surface features of the rear of the lead vehicle for lateral/longitudinal control in order to perform platooning [9]. Eventually, the performance of higher level tasks such as obstacle recognition/avoidance and route planning will require further extensions to the Emove and implementation of the Task levels of the architecture.

4. APPLICATIONS

Teleoperation

Although the ultimate goal for robotic vehicles is a fully autonomous system, control technology has not advanced far enough to realize this goal. Some form of operator intervention is needed, at least part of the time. For IVHS needs, the driver resumes control when the automatic system can not function. In a military setting the vehicle is unmanned and operator control requires some form of teleoperation.

The ARL vehicles communicate to a variety of operator control units. One is a small suitcase controller developed by NIST for field testing, called the Mobility Control Station (MCS). A second operator station is housed in a tracked vehicle and is capable of controlling four unmanned vehicles at one time. This is called the Unmanned Ground Vehicle Control Testbed (UGVCT) and was developed by FMC for the Tank Automotive Command. Each system allows the operator to control all mobility functions. High level commands are issued using a touch screen display. A graphic display presents vehicle status to the operator.

Teleoperation is surprisingly difficult, hampered mostly, perhaps, by the difficulty in perceiving motion from a video image. To aid the operator, several areas are being investigated: force feedback, graphic overlays, and delay compensation.

Force feedback of the steering wheel provides the operator a feel for road conditions as well as a sense of turn rate and vehicle speed. Unfortunately, closing a high speed force reflection loop places increased demands on an already burdened communication link. A simulated force feedback is being investigated. Here, vehicle speed and the operator wheel position are used to emulate the straightening torque that would be felt on the vehicle. The operator cannot feel the bumps in the road, but can get a sense of wheel position and vehicle speed. In addition, safety limits can be imposed so the wheel is not allowed to turn past a limit which is a function of speed.
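As a rough sketch of this simulated force feedback and speed-dependent limit: a straightening torque is emulated from vehicle speed and wheel position, and the commanded wheel angle is clamped to a limit that tightens with speed. The linear torque model and every gain and limit value below are illustrative assumptions, not the values used on the ARL vehicles.

    import math

    def simulated_feedback_torque(wheel_angle_rad, speed_mps, k0=0.5, k1=0.2):
        # Hypothetical model: the straightening torque grows with wheel
        # angle and with speed, pulling the wheel back toward center.
        return -(k0 + k1 * speed_mps) * wheel_angle_rad

    def speed_dependent_limit(speed_mps, max_angle=math.radians(30),
                              min_angle=math.radians(5), v_full=20.0):
        # Allow full lock when stopped, tightening toward min_angle as
        # speed approaches v_full (m/s).
        frac = min(speed_mps / v_full, 1.0)
        return max_angle - frac * (max_angle - min_angle)

    def clamp_wheel(wheel_angle_rad, speed_mps):
        # Safety limit: the wheel may not turn past a speed-dependent bound.
        limit = speed_dependent_limit(speed_mps)
        return max(-limit, min(limit, wheel_angle_rad))

A control loop at the operator station would apply simulated_feedback_torque to the wheel each cycle and pass the commanded angle through clamp_wheel before transmitting it to the vehicle.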
In many situations, the operator can locate a clear path in the video but has trouble determining how much to turn the steering wheel in order to steer the vehicle over the clear path. To facilitate this, we are using a graphic overlay to represent the position of where the vehicle will travel at the given steering position. The projected vehicle position represented in the video assumes a flat ground plane and moves further ahead of the vehicle as forward speed increases.

Finally, we are investigating controller delay compensation. During teleoperation, several steps occur sequentially. The video camera takes an image, it is transmitted to the control station, the operator moves the steering wheel, the commanded wheel position is transmitted to the vehicle, and the actuator responds. Each step takes a finite amount of time, adding to the control delay. This delay can be very large, especially for some forms of video compression. During this delay, the vehicle moves and the location of the desired path as specified by the steering angle changes position relative to the vehicle. Using navigation sensors, the change in position during the delay can be measured and the location of the desired path relative to the current vehicle position can be determined.

Retro-traverse

For retro-traverse, the vehicle's path is recorded during teleoperation, allowing the vehicle to autonomously return along the path. During Demo I, this form of navigation allowed the vehicle to lay a smoke screen and travel through the smoke without operator input. Driving through a smoke screen rules out the use of a vision system by a remote operator, but some form of obstacle detection is necessary in cases where vehicles or humans wander onto the path. A microwave sensor that would allow the vehicle to detect obstacles is being investigated.

The retro-traverse path is stored during the teach phase as a series of X-Y (or Northing-Easting) points. During the playback phase, a goal point is selected that is on the path and is a specified distance in front of the vehicle. The steering angle is computed using the "pure pursuit" method [12]. The operator specifies the desired velocity and selects an automatic turnaround maneuver. The Modular Azimuth Position System (MAPS), an inertial navigation unit, is used to sense vehicle position and orientation. MAPS uses ring-laser gyros and accelerometers to determine vehicle motion. An interface board (called the Navigation Interface Unit) and software to integrate vehicle odometry with MAPS data were developed by Alliant Tech and used during Demo I. Details of the navigation portion of the driving package are in [3].

Autonomous Driving

There are two low level functions required to drive a vehicle down the road: stay on the road and do not hit anything. NIST has been developing a vision based perception system to perform these functions. The controller tracks the lane markers commonly painted on roadways and steers the vehicle along the center of the lane in the following steps. First, edges are extracted from the video image within a window of interest. Edges occur where the brightness of the image changes, such as where the image changes from a gray road to a white stripe. Then, quadratic curves that represent each of the two lane boundaries as they appear in the video image are updated. The system computes the coefficients of the curves using a recursive least squares fit with exponential decay. The steering wheel angle that steers the vehicle along the center of the perceived lane is calculated using the pure pursuit method used for retro-traverse; both steps are sketched below.
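The quadratic boundary update can be realized as recursive least squares with a forgetting factor, which is one standard way to produce the exponential decay of old edge measurements. The sketch below is that textbook RLS formulation under our assumptions (image-plane model u = a + b*v + c*v^2 for column u at row v, and a hypothetical forgetting factor of 0.98); it is not NIST's code.

    import numpy as np

    class LaneBoundaryRLS:
        """Recursive least squares with exponential forgetting for one
        lane boundary, modeled in the image plane as u = a + b*v + c*v**2,
        where (u, v) are the column and row of a detected edge point."""

        def __init__(self, forgetting=0.98):   # forgetting factor assumed
            self.lam = forgetting
            self.theta = np.zeros(3)           # coefficients [a, b, c]
            self.P = np.eye(3) * 1e3           # large initial covariance

        def update(self, v, u):
            x = np.array([1.0, v, v * v])
            Px = self.P @ x
            k = Px / (self.lam + x @ Px)       # RLS gain
            self.theta += k * (u - x @ self.theta)
            self.P = (self.P - np.outer(k, Px)) / self.lam
            return self.theta

Each video frame, every edge point found in the window of interest feeds one update; the lane center used for steering can then be taken as the average of the two fitted boundary curves.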
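A minimal sketch of the pure pursuit steering computation shared by retro-traverse and lane following, with the delay compensation step folded in: the goal point is first re-expressed in the vehicle frame predicted for the end of the measured delay. The bicycle-model conversion from curvature to wheel angle and all parameter values (including the wheelbase) are assumptions for illustration, not the vehicle's calibrated values.

    import math

    def compensate_for_delay(goal_x, goal_y, speed, yaw_rate, delay_s):
        # Dead-reckon the vehicle pose over the delay (measured by the
        # navigation sensors in practice) and re-express the goal point
        # in the predicted vehicle frame. Straight-line motion assumed.
        dx = speed * delay_s
        dpsi = yaw_rate * delay_s
        x, y = goal_x - dx, goal_y
        return (x * math.cos(dpsi) + y * math.sin(dpsi),
                -x * math.sin(dpsi) + y * math.cos(dpsi))

    def pure_pursuit_wheel_angle(goal_x, goal_y, wheelbase=3.3):
        # Pure pursuit: the circular arc through a goal point at (x, y)
        # in the vehicle frame (x forward, y left) has curvature
        # 2*y / L**2, where L is the distance to the goal point.
        lookahead_sq = goal_x ** 2 + goal_y ** 2
        curvature = 2.0 * goal_y / lookahead_sq
        return math.atan(wheelbase * curvature)   # simple bicycle model

During retro-traverse the goal point comes from the recorded X-Y path; during lane following it comes from the fitted lane center at the lookahead distance.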
Finally, navigation sensors compensate for the computation and transmission delay by adjusting the steering goal in accordance with the motion of the vehicle during the delay. More details of the vision processing and control algorithms can be found in [4,5]. Figure 3 shows the various scenes obtained when applying a window of interest to the road scene. The lateral position of the window of interest shifts in order to keep it centered on the road, and its shape changes as a function of the predicted road curvature.

Figure 3. Road Scene, Window of Interest, Masked Road Scene.

The Montgomery County DOT permitted NIST to test the instrumented vehicle on a public highway. During these tests, autonomous driving was maintained over several kilometers (gaps in the lane markings at intersections prevented test runs of longer distances) and at speeds up to 90 km/h. The vehicle has also been driven on various test courses under weather conditions ranging from ideal to heavy rain, and under various outdoor lighting conditions including night time with headlights on.

Besides following the road, an autonomous vehicle must track and avoid obstacles and other vehicles. In addition, if the system can track another vehicle, it can follow that vehicle, forming a platoon. Platooning is envisioned by the military to reduce manpower requirements. In the IVHS version, vehicles would platoon at two meter spacings in order to increase traffic throughput.

An approach to vision-based car following was developed that tracks the back of a lead vehicle or a target mounted on the back of the vehicle [9]. Since orientation is approximately constant during car following, the algorithm estimates only the relative translation of the lead vehicle. The system was tested using a video recording taken while the testbed vehicle was manually driven behind the lead vehicle. The system demonstrated tracking for vehicle separations of up to 15 meters.

5. SUMMARY

NIST's roles are to evaluate component technology for autonomous vehicles and to work with industry and academia to advance the state-of-the-art. To perform such a task, an architecture has been developed that will allow incremental development of autonomous capabilities in a modular fashion. The low levels of the control system have been implemented to support the DOD near term robotic tech base. That system was demonstrated at the 1992 Demo I. The control system was systematically extended to incorporate higher levels of autonomous capabilities to support further evaluations and developments in conjunction with the DOD tech base and DOT IVHS programs.

6. ACKNOWLEDGEMENTS

The authors would like to thank Chuck Shoemaker of the Army Research Laboratory and Richard Bishop of the IVHS Research Division at FHWA for their support and direction. Also, thanks to Roger V. Bostelman, Tom Wheatley, and Chuck Giauque of NIST for their help and dedication to the program.

REFERENCES

1. Szabo, S., Scott, H.A., Murphy, K.N., Legowik, S.A., Bostelman, R.V., "High-Level Mobility Controller for a Remotely Operated Unmanned Land Vehicle," Journal of Intelligent and Robotic Systems, Vol. 5: 63-77, 1992.
2. Szabo, S., Murphy, K., Scott, H., Legowik, S., Bostelman, R., "Control System Architecture for Unmanned Vehicle Systems," Huntsville, AL, June 1992.
3. Murphy, K., "Navigation and Retro-traverse on a Remotely Operated Vehicle," IEEE Singapore International Conference on Intelligent Control and Instrumentation, Singapore, February 1992.
4. Schneiderman, H., Nashman, M., "Visual Processing for Autonomous Driving," IEEE Workshop on Applications of Computer Vision, Palm Springs, CA, Nov 30-Dec 2, 1992.
5. Nashman, M., Schneiderman, H., "Real-Time Visual Processing for Autonomous Driving," IEEE Intelligent Vehicles '93 Conference, Tokyo, Japan, July 14, 1993.
6. Juberts, M., Murphy, K., Nashman, M., Schneiderman, H., Scott, H., Szabo, S., "Development and Test Results for a Vision-Based Approach to AVCS," 26th ISATA Meeting on Advanced Transport Telematics / Intelligent Vehicle Highway Systems, Aachen, Germany, September 1993.
7. Juberts, M., Raviv, D., "Vision Based Mobility Control for AVCS," Proceedings of the IVHS America 1993 Annual Mtg., Washington, D.C., April 1993.
8. Albus, J., Juberts, M., Szabo, S., "A Reference Model Architecture for Intelligent Vehicle and Highway Systems," ISATA 25th Silver Jubilee International Symposium on Automotive Technology and Automation, Florence, Italy, 1992.
9. Schneiderman, H., Nashman, M., Lumia, R., "Model-based Vision for Car Following," SPIE 2059 Sensor Fusion VI, Boston, MA, September 8, 1993.
10. "Strategic Plan for Intelligent Vehicle Highway Systems in the United States," Report No. IVHS-AMER-92-3, published by IVHS America, May 20, 1992.
11. Schneiderman, H., Albus, J., "Survey of Visual-Based Methods for Autonomous Driving," Proceedings of the IVHS America 1993 Annual Mtg., Washington, D.C., April 1993.
12. Amidi, O., "Integrated Mobile Robot Control," M.S. Thesis, Carnegie Mellon University, May 1990.
