The Second Running, October 2005
What is DARPA?
The Defense Advanced Research Projects Agency (DARPA) is the central research and development organization for the Department of Defense (DoD). The agency manages and directs selected basic and applied research and development projects for DoD, and pursues research and technology where risk and payoff are both very high and where success may provide dramatic advances for traditional military roles and missions. DARPA's reputation for technical excellence stems from its determination to seek out the best solutions to problems through forward-looking investments in research and technologies.

What is the DARPA Grand Challenge?
Created in response to a Congressional and DoD mandate, the DARPA Grand Challenge is a field test to accelerate research and development in autonomous ground vehicles that will help save American lives on the battlefield. The Grand Challenge brings together individuals and organizations from industry, the R&D community, government, the armed services, academia, students, backyard inventors, and automotive enthusiasts in the pursuit of a technological challenge. DARPA's mission is to leverage ingenuity and research to develop transformational technologies that give our armed forces a decisive edge. One promising area is autonomous ground vehicles, which will save lives by performing many of the hazardous battlefield functions that currently place our troops in harm's way. The first Grand Challenge was held in the Mojave Desert in March 2004, but there was no winner. The second Grand Challenge was held in Primm, NV on October 8, 2005, and the Stanford Racing Team won the $2 million prize by completing the 132-mile desert course in the shortest elapsed time under 10 hours – six hours, 53 minutes, and 58 seconds.

What is the purpose of the Grand Challenge?
DARPA's mission is to create technologies for future application on the battlefield. In recent years, Congress and the Department of Defense have envisioned unmanned vehicles teamed with people to create efficient, integrated, agile, and cost-effective forces that will lower the risk to American life. The Grand Challenge is a bold effort to draw widespread attention to the technology issues associated with autonomous vehicles and to generate discontinuous breakthroughs in performance. The event is designed to attract and challenge the most capable and innovative companies, institutions, and entrepreneurs in America. Many of these organizations have never before been involved with defense work, and their participation is helping the men and women in our armed forces.
The 2005 DARPA Grand Challenge is over, with Stanford University's "Stanley" winning the race. Once again, 3rd St. R & D provided a large, multi-channel communications system, data backbone, logistic support, and systems integration support for the event. Although this is a technical article, we developed a great respect and affection for the Mojave Desert and its secrets. Several biologists who were monitoring the desert for DARPA gave us significant insight into the animals and plants that inhabit the Mojave. We learned that a desert that appears barren and dead actually teems with life. So, bear with us as we throw in a few observations about it along with the technical material.
Deployment
Our equipment left our Missouri facility on July 10, 2005, and system deployment in the Primm, Nevada area began on July 15, 2005. The system went operational on July 24, 2005 – an elapsed time of 14 days, including travel.
(L-R) Site shelters, mobile repeater site, and shelter transport/placement vehicle
Site mechanical shop and warehouse trailer
Upon arrival at the operations area, our first unhappy discovery was that lightning had started an 80,000-acre desert fire several days earlier. Although heroic efforts by firefighters had extinguished the fire, it had burned something on the order of 84 power poles which supplied power to one of our remote sites.
Typical desert before the fire
Desert after the fire
These two shots were actually taken on either side of the same road, on the same day, northwest of Goodsprings, Nevada. The desert does not benefit from fire the way a forest does, and a biologist told us that it will take years for the desert to recover, if it recovers at all. This was a particularly hot fire due to the invasive grass (the yellow in the photo) that has come into the Mojave. It is woody and has a high fuel-load value. The Joshua trees cannot withstand that kind of fire, and if their bark is breached they burn from the inside out.
So, as we deployed the sites, one that was not planned to be a "primitive" site suddenly became one. A rush call to Aggreko and Donovan Driscoll in their LA office got us a generator with autostart delivered and ready to deploy in less than 24 hours.
Event Site Support
Beyond radio and network support, 3rd St. R & D provided site support for the buildings and services required to run the event. SAI, Inc. arranged for GE Capital to provide a 60' triple-wide office trailer for the Challenge Operations Center (COC), and a 60' double-wide office trailer for the Network Operations Center (NOC) and 3rd St. R & D's Voice Comm Office. 3rd St. R & D provided power distribution for the COC/NOC complex and the adjacent 3D display facility by Solid Terrain Modeling.
3rd St. R & D provided a 125A 3-phase 480V service consisting of 600' of camlock feeder from the Buffalo Bill's Casino generator compound to our power distribution hut adjacent to the COC/NOC complex. A 150 kVA step-down transformer could provide up to 250A of 120/208 volt 3-phase power by way of our distribution system to the buildings. There were approximately 10 kW of switching power supplies for computers and other electronics running in the buildings, generating powerline harmonics that could have caused problems in a 600' supply line. The Delta-to-Wye transformer absorbed these harmonics and kept the power clean. The over-rating of the transformer became important when ambient temperatures reached 117 degrees F. Even with forced-air cooling, the air leaving the power hut was too hot to put your hand into for more than a couple of seconds. The air-conditioning load for 50 people in 115 degree heat, plus the other electronics, added significant power load. A 150 kW backup generator and automatic transfer switch from Aggreko were integrated into our distribution, since the 480V source was not on the casino backup circuit.
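For readers who want to check the sizing numbers, here is a minimal sketch of the 3-phase arithmetic using only the figures quoted above. It is illustrative only: the 250A figure reflects the capacity of our distribution gear, not the transformer's full-load capability, and no derating for temperature or harmonics is included.

import math

def three_phase_kva(volts_ll, amps):
    """Apparent power (kVA) of a balanced 3-phase load."""
    return math.sqrt(3) * volts_ll * amps / 1000.0

def full_load_amps(kva, volts_ll):
    """Full-load current for a 3-phase transformer rating."""
    return kva * 1000.0 / (math.sqrt(3) * volts_ll)

# Figures from the article
print(f"480V / 125A service capacity    : {three_phase_kva(480, 125):5.0f} kVA")
print(f"150 kVA transformer FLA at 208V : {full_load_amps(150, 208):5.0f} A")
print(f"250A distribution at 120/208V   : {three_phase_kva(208, 250):5.0f} kVA")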
Voice Communications and Data Backbone

System Configuration
The communications system was arranged as a four-site, six-channel, conventional UHF voted system. The sites were connected in a chain via ADTRAN TRACER 4 x T1 microwave radios. ADTRAN TSU600 T1 channel banks with drop-and-insert modules and 24VDC power supply options delivered audio and keying signals to and from the sites via one of the T1's. Two additional T1's were utilized as two separate data networks. The data networks are discussed later in the article. Visit Adtran's website for more information on these products. For the 2004 event, we had the good fortune to have existing hard sites at all the necessary locations. For the 2005 event, two of the four sites were existing hard sites, the third was a portable site at the operations center, but the fourth was a primitive location in the saddle of a mountain pass. By primitive, we mean a difficult road even for 4WD and no AC power.

Microwave Network
We really cannot say enough about the reliability and ease of use of the Adtran TRACER series of 5.8 GHz microwave radios. Tracers were used exclusively in the networks, which, beyond the voice and data network between the sites, provided several "last mile" links to eliminate the need for temporary copper or fiber runs across unprotected ground, substantially improving reliability in SPF (single point of failure) situations.

Last Mile
The Internet access and telco circuits for the NOC/COC complex in Primm terminated at Sprint network interfaces deep within the Buffalo Bill's Casino complex in a secure data/voice connectivity room. That was as far as Sprint could construct the services. A 300' run of multipair copper in the ceiling of the complex basement got the circuits to a point where they could exit the building. It was an additional 1/8 mile from the exit point to the NOC/COC complex. Tracer 4 x T1 microwave was used to deliver two Internet T1's and a POTS T1 across the "air gap" between the casino complex and the NOC/COC. In the four months of continuous operation of this link, no failures occurred due to the microwave radios themselves. (A janitor unplugged the AC to the casino-end microwave rack one day, but.......) During the NQE (National Qualifying Event) in Fontana, California, a Tracer system was used again to provide last-mile connectivity from the point where Internet T1 access was available over to the TOC (Track Operations Center) in the racetrack infield.
Site Network
The course and operations area covered approximately 500 square miles, and the radio system covered approximately 1200 square miles. The 4 x T1 network that provided connectivity between the communications sites and the Primm NOC connected the sites in a chain. The path lengths were 7 miles, 23 miles, and 21 miles respectively. Again for 2005, we used Andrew 4' solid-surface coax-fed antennas with excellent results. With +20 dBm out of the Tracer radios, we were able to achieve full saturation of the paths after the final round of tweaking right before the event. The Andrew antennas, supplied by Talley Communications and modified by us for easy installation in temporary environments, stood up under constant wind velocities which exceeded 50 mph and gusts in excess of 60 mph during the monsoons.
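As a sanity check on those paths, here is a rough link-budget sketch. The dish gain and efficiency are generic estimates for a 4' dish at 5.8 GHz (not Andrew's published figures), and feedline losses, fade margin, and the Tracer receiver threshold are not included.

import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.55):
    """Estimated gain of a parabolic dish antenna."""
    wavelength = 3.0e8 / freq_hz
    return 10 * math.log10(efficiency * (math.pi * diameter_m / wavelength) ** 2)

TX_DBM = 20.0              # +20 dBm out of the Tracer radios
FREQ_MHZ = 5800.0          # 5.8 GHz band
GAIN = dish_gain_dbi(1.2, FREQ_MHZ * 1e6)   # 4' (1.2 m) dishes on both ends

for miles in (7, 23, 21):  # the three path lengths in the chain
    km = miles * 1.609
    rx = TX_DBM + 2 * GAIN - fspl_db(km, FREQ_MHZ)
    print(f"{miles:2d} mi: FSPL {fspl_db(km, FREQ_MHZ):6.1f} dB, estimated RX {rx:6.1f} dBm")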
Batteries as Prime Power
To ensure system operability in the event of power outages, which are a common occurrence in the Mojave, we decided to run all the equipment at the sites on 12VDC and 24VDC prime power. At sites with AC availability, the utility power would run the air-conditioning and the SENS float/recovery chargers which kept the battery banks up. At the primitive site, power consumption would be critical. Fortunately, it would be necessary to run a generator often at that site anyway to keep the interior of the equipment shelter cool with air-conditioning. As long as the external temperature was elevated (not a problem in the Mojave with July daytime temperatures over 110 degrees), the air-conditioning would call for cooling at regular intervals, and when the generator ran for cooling, it would also run the charger and keep the banks up. Each receive site contained a rack of six voting receivers, a receiver preamp/distribution amplifier, two network routers, a T1 channel bank, a vehicle tracking network RF node, a laptop computer which formatted the tracking data for transmission downrange over one of the networks, two microwave radios, and telemetry equipment to transmit critical site electrical parameters downrange. All told, the 24VDC load was a constant 10 amps, and the 12VDC load was a constant 4 amps. At the transmit site, this same equipment complement plus six 100W Motorola MTR2000 repeaters were on line. With all six repeaters transmitting, the 24VDC load at this site was about 80 amps. Battery bank condition and the presence or absence of AC power would be critical information to have.
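The article does not state the size of the battery banks, so the capacities in this quick runtime sketch are assumptions chosen only to show how the constant loads above translate into hours of autonomy.

def runtime_hours(capacity_ah, load_amps, usable_fraction=0.8):
    """Rough runtime estimate; ignores Peukert effect and temperature derating."""
    return capacity_ah * usable_fraction / load_amps

BANK_24V_AH = 400   # assumed 24V bank capacity (not from the article)
BANK_12V_AH = 200   # assumed 12V bank capacity (not from the article)

print(f"Receive site, 24V bank at 10 A constant : {runtime_hours(BANK_24V_AH, 10):4.0f} h")
print(f"Receive site, 12V bank at 4 A constant  : {runtime_hours(BANK_12V_AH, 4):4.0f} h")
print(f"Transmit site, all repeaters keyed, 80 A: {runtime_hours(BANK_24V_AH, 80):4.1f} h")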
What's Going On At The Sites?
With the sites distributed over a wide area, and travel to some of them being time-consuming or treacherous, we did not want to make those trips any more often than absolutely necessary. We would also have to break off a crew and additional equipment to go to Telluride, Colorado for the Telluride Film Festival for a ten-day period centered on the Labor Day weekend.

Since the DGC system in Nevada would be turned on and largely unattended while we were in Telluride, we had a requirement to monitor and evaluate critical site parameters remotely. We also needed the capability of overriding automatic systems. When we returned to the Nevada event site, we would still need that monitoring capability. The parameters monitored included battery bank voltages, battery bank charging currents, utility AC voltages, generator run status at the primitive site, and fuel levels at the primitive site. Remote control capabilities were chiefly the ability to remotely start or shut down the primitive site generator. We turned to IOTech and their line of networked PointScan data acquisition modules. At all the sites, PointScan 104 analog modules reported the analog voltage and current levels, and at the primitive site, an additional PointScan 129 digital module provided contact closure sensing and actuation to monitor thermostat activation and provide start/stop control for the generator. The data gathered by the IOTech modules was packaged as UDP traffic and routed via a second T1 in our microwave network downrange to the Network Operations Center (NOC) and into our Comm center. IOTech also provided DasyLab software, which allows a user to build visual control-panel surfaces with switches and analog and digital meter displays, and to perform actions based on the values which are reported. A computer at the NOC in Primm was set up to run these applications, and it could be accessed through a remote desktop connection over the web. While at the Telluride Film Festival in Colorado, we could call up DasyLab and several instances of Hyperterminal to watch the site parameters and monitor the microwave circuits in Primm, Nevada for deterioration and performance. This screenshot shows the DasyLab application.
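To make the data path concrete, here is a schematic sketch of a receive-side telemetry logger in the spirit of the setup above. The port number, the CSV-style payload, and the alarm limits are all assumptions for illustration; they are not the actual PointScan wire format.

import socket
import time

TELEMETRY_PORT = 5500        # assumed port, not the actual PointScan port

# Assumed alarm limits per channel (volts)
LIMITS = {"batt_24v": (24.0, 29.0), "batt_12v": (12.0, 14.8)}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", TELEMETRY_PORT))

while True:
    data, addr = sock.recvfrom(1024)
    # Assumed payload layout: "site,channel,value"
    site, channel, value = data.decode().strip().split(",")
    value = float(value)
    lo, hi = LIMITS.get(channel, (float("-inf"), float("inf")))
    flag = "" if lo <= value <= hi else "  <-- OUT OF RANGE"
    print(f"{time.strftime('%H:%M:%S')} {site:>12} {channel:>10} {value:7.2f}{flag}")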
This screenshot shows an actual operating condition where we have forced a generator run with remote start. The high charge current (22 amps) at the primitive site indicates that the batteries were in need of charge.
In this shot, the three remote sites are arranged in vertical columns. Note the remote fuel gauge in the third column. Aggreko personnel were very helpful in supplying specifications for the signal feeding the fuel gauge on the generator. The fourth column is the thermostat status at the primitive site. There are two thermostats: one which is tied into the IOTech networked modules, and a second which is arranged for "brute force" control of the generator. If anything happened to the network, the brute-force thermostat would start the generator at a temperature value slightly higher than the network thermostat, and shut the generator down at a temperature value slightly lower than the network thermostat. A feature of the PointScan network which was of particular value was the ability to automatically open all contact closures in the event of loss of network connectivity. This feature would prevent uncontrolled generator runs if there were to be an interruption in network connectivity. The fifth column contains an annunciator which indicates that the generator has actually received a start signal, and a manual start button and annunciator which can be used to manually start the generator. This capability became more important as the weather began to cool in September and October. On cool nights, it could be several hours before the temperature inside the shelter rose enough to cause a generator run. Values which are above or below a desired range display in red.

There was also another level of redundancy using Omron programmable timers. If there had not been a generator run for four hours (a 40 ampere-hour drain), the timers would force a one-hour run (35 ampere-hours of charge) to bridge the batteries, just in case. Thanks to Dave Bishop of Dave's Two-Way for building these little beauties. Dave actually built two sets of these timer arrays to solve a temporary problem at the mountaintop site while power crews rebuilt the power lines. Simple thermostat control of the generator at that site was not working out, because the site is at 6200 ft above sea level and the temperatures were cool enough that the generator would not run often enough to keep the batteries charged.

Also displayed are multiple instances of Hyperterminal. Each instance displays the status of an Adtran microwave radio pair at selected points in the microwave network.
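The layered control scheme (network thermostat, wider-band backup thermostat, and the four-hour forced-run timer) can be summarized in a few lines of logic. The temperature setpoints below are illustrative assumptions; only the four-hour/one-hour timing comes from the text above.

import time

NET_ON, NET_OFF = 85.0, 75.0        # assumed network thermostat setpoints (deg F)
BRUTE_ON, BRUTE_OFF = 90.0, 70.0    # backup thermostat: starts higher, stops lower
MAX_IDLE_S = 4 * 3600               # force a run if idle for four hours
FORCED_RUN_S = 1 * 3600             # forced run lasts one hour

def control_step(temp_f, network_ok, generator_on, last_run_end, forced_until, now):
    """One pass of the generator control logic; returns (generator_on, forced_until)."""
    on_sp, off_sp = (NET_ON, NET_OFF) if network_ok else (BRUTE_ON, BRUTE_OFF)
    if not generator_on and now - last_run_end > MAX_IDLE_S:
        forced_until = now + FORCED_RUN_S      # Omron-timer style forced run
    if now < forced_until or temp_f >= on_sp:
        return True, forced_until
    if temp_f <= off_sp:
        return False, forced_until
    return generator_on, forced_until          # inside the hysteresis band: hold state

# Example: cool shelter, but no run for five hours -> a one-hour run is forced
now = time.time()
print(control_step(72.0, True, False, now - 5 * 3600, 0.0, now))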
Tracking Data Network and Site Supervision Network
While all robot vehicles ('bots) competing in the Grand Challenge were autonomous and not remote-controlled, it was still necessary to be able to assert emergency stop and pause commands in the event that a 'bot left the course, went out of bounds, or in case a faster 'bot was to be allowed to pass a slower 'bot. Each 'bot was followed along the course by a Dodge pickup control vehicle which could run, pause, or disable its companion 'bot. DARPA utilized a vehicle emergency stop and tracking system designed and built by OmniTech Robotics, and supported by SRS Technologies. One of the features of the E-Stop system is that it transmits accurate vehicle location several times per second from every E-Stop equipped vehicle on the course. The tracking system is near real-time, with position updates on the order of 500 msec. Each of the four tracking nodes could be receiving data from a substantial number of vehicles at any time, with the probability that multiple nodes would be receiving the same data and sending it downrange, so the aggregate data traffic across the network could easily equal four times the actual data. There were 20 competitor vehicles and 20 control vehicles all sending location data, resulting in a substantial amount of data being gathered and transmitted back to the COC.

At each of our comm sites, an E-Stop node gathered tracking telemetry and packaged it for transmission to the COC as UDP multicast traffic. The node consisted of an E-Stop data radio and a laptop computer. The laptop conditioned the data and acted as an RS232-to-IP translator. To power the laptops efficiently from 24VDC prime power, we used Lind Electronics DC-DC converters. These converters gave us a substantial efficiency advantage over an inverter or UPS. The data was transmitted on the third T1 in our microwave network to the NOC, where it was massaged to strip duplicate packets before being routed to the COC for display on 43 judges' computers and large-screen situational displays. At each site, an ADTRAN 1224R integrated 24-port switch/router with dual T1 NIMs (WIC's in Cisco-speak) handled the high volume of data without a hitch and relayed the data downrange to the next router in the chain. We utilized the 24VDC-powered variant of the 1224 since all equipment at the sites had to operate on either 12VDC or 24VDC. This makes backup power much more efficient by avoiding the inefficiencies of inverters or UPS's.

For the Site Supervision Network data (DasyLab and microwave management), each site also contained a second ADTRAN 1224R 24-port switch/router which operated over a functionally identical, but isolated, parallel network on the second T1 in the microwave network. Each TRACER microwave radio has an RS232 supervision port. To get the RS232 data downrange, we used DigiOne RS232-to-IP device servers from B&B Electronics. These handy little gadgets take RS232 data and package it for transmission over an IP network. They are the perfect thing to use for an RS232-administered device which does not have SNMP capability, or, in the case of the microwave radios, a physical-layer device. It was possible to "see" every microwave radio in the network at each end of each leg of the network. We were also able to remotely access the repeater programming and diagnostics with the Motorola Radio Service Software (RSS) using these devices.

At the COC/NOC network center, a Cisco 3725 and a 2611 provided by DARPA and administered by Mike Amato of SRS handled outside network access and master sifting of all data arriving at the NOC. We brought in Cory Horn of ECFBN, Inc. to do our router configurations and network administration, and Cory was a key player in working with SRS to configure and administer the Adtran and Cisco routers. We cannot say enough good things about Cory and his talents.
He is a pit bull who will not eat or sleep until a problem is solved or a configuration is complete. He is the consummate team player.

Tracking data from the nodes at our sites was routed to SRS and inserted into the COC network. At the COC, a tracking and display application written by SRA International displayed the data as individual icons superimposed over a satellite photo of the course area. 3rd St. R & D's task in the tracking network was to install and optimize the 900 MHz antenna systems at our tower nodes and on the control vehicles. We also worked with competitor teams to make suggestions to optimize the performance of the E-Stop systems installed on their vehicles. When all was done, the tracking system performed beautifully. Over the whole 130+ mile course, covering a land area of approximately 500 square miles, there was an aggregate total of approximately 1000-1500 feet where tracking data was intermittent. Pretty amazing for 1-watt data radios mounted on vehicles in mountainous terrain.
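The article does not describe how the NOC stripped duplicate position reports, so the following is only a conceptual sketch of that step. The multicast group, port, and packet layout are assumptions, not the E-Stop system's actual format.

import socket
import struct

MCAST_GROUP = "239.1.1.1"   # assumed multicast group
MCAST_PORT = 6000           # assumed port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", MCAST_PORT))
mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

latest_seq = {}             # vehicle id -> highest sequence number forwarded so far

while True:
    data, _ = sock.recvfrom(512)
    # Assumed payload layout: "vehicle_id,sequence,lat,lon"
    vehicle, seq, lat, lon = data.decode().strip().split(",")
    seq = int(seq)
    if seq <= latest_seq.get(vehicle, -1):
        continue            # duplicate or stale report relayed by another receive site
    latest_seq[vehicle] = seq
    print(f"{vehicle}: report #{seq} at {lat}, {lon}")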
Radio System and Inventory
Motorola. What else do we have to say?
The RF infrastructure subsystem was composed entirely of Motorola equipment, and was arranged as a six-channel analog conventional voted system. The core of the system was the Digitac Voting Comparators, which acted as the central controllers and handled audio distribution and processing. Four sites were used in the DGC system:
Primary Transmit/Receive Site
The primary transmit/receive site consisted of six 100W MTR2000 UHF repeaters which were running in repeater configuration with SpectraTac operation enabled. Normally, you would expect that repeaters used in a voting system would be configured as base stations to allow the system comparators to have control of them. In our case, there was the outside chance that there might be a system failure in the comparators, in the microwave system, or some other catastrophic event unrelated to equipment failure such as vandalism, and the repeater function had to continue to operate. With the MTRs configured as repeaters, and transmit keying priority given to wireline/TRC, the repeaters would momentarily key up in-cabinet until the TRC signal from the comparators arrived about 100 msec later, whereupon the comparator keyup signal would take control. A longer repeater hang-time in the local keyup configuration would immediately alert us that connectivity had been lost and that the repeaters had reverted. In this configuration, it was also possible to take the control system or microwave system down for module replacement, tweaking, or adjustments without disabling repeater operation, although portables in the field would be impaired while the voting sites were down. Due to the reliability of all the systems, this scenario did not come into play except in the stress testing of the systems, where catastrophic failures were simulated.
Remote Receive Sites
The two remote receive sites at the mountaintop location and the primitive location each contained six SpectraTac receivers. It's hard to imagine a more stable and reliable satellite receiver platform than these proven performers. However, we look forward to a future refit with MTR's, as all of our remote sites will be converted to transmit/receive sites so they can function as independent standalone systems. The MTR's will have the added advantage of programmable audio levels which will not be affected by transport shock the way the SpectraTacs can be.
Mountaintop site microwave paths to the primitive site and the primary site
COC/NOC Receive Site
The fourth site, at the COC/NOC complex, was our self-contained mobile repeater/tower site with six Motorola MTR2000's programmed as base stations. Programmed in this way, the MTRs acted like receivers and were prevented from keying locally. After returning from the Telluride Film Festival over Labor Day, this site was integrated into the Primm system for the DARPA event. During the NQE event in Fontana, California, a seventh MTR2000 in Fontana was integrated into the Primm system with VoIP over the Internet. More on this interesting VoIP experiment later in this article. A user with a walkie in Fontana could talk to a user with a walkie in Nevada.
The mobile radio subsystem consisted of 34 Motorola Spectra 110W A7 UHF mobiles and 32 Motorola GM300's and Maxtrac 300's. In each control vehicle, two mobiles were installed. The Spectra was the primary comms radio, with the GM300/Maxtrac as a backup; but the main purpose of the GM/Maxtrac radio was to continuously monitor the Course Safety channel. Radio traffic was very busy on the "Course Control" channels, and the race director could use "Safety" as a quick-call channel if (when) the primary channels were busy. 3rd St. R & D vehicles were also equipped with Spectras, and on the day of the event were stationed at strategic high-ground locations to provide a tertiary backup comm path by relaying simplex transmissions if "the whole comm world came to an end". Didn't happen. The portable radio inventory consisted of 125 Motorola portables and a selection of audio accessories such as surveillance kits, high-noise-level headsets for use in the helicopters, and lightweight headsets for use in lower-noise areas. The zone switching function allowed HT1250 portable users to easily switch back and forth between the Primm channels, Fontana channels, and third-party channels. Six repeater and fourteen simplex channels were in use. The radio system had a near full-quieting walkie coverage area of approximately 1200 square miles in difficult desert mountain terrain.
VoIP Dispatch Subsystem
Once again the incredibly flexible Vega/Telex VoIP system solved a difficult problem.

Q: How do you support 43 sit-down radio users, each of whom requires access to, and simultaneous monitoring of, any of six radio channels, without generating an RF environment that can wreak havoc on the computers and video projectors concentrated in a 60' x 60' space?

A: With Vega C6200 consoles and the Vega C-Soft VoIP virtual dispatch console application.
The same laptops from the 2004 event were used, which still had C-Soft installed. An update to the control surface configuration quickly made them ready for 2005.
This panoramic view, taken prior to installation of all positions, shows the COC and all of the C-Soft/tracking operator positions, as well as the display screens at the front of the room for tracking-data situational displays.
Thirty judges' positions and thirteen support positions in the COC and NOC were equipped with laptops which concurrently ran the SRA tracking data display application and the Vega C-Soft application. The user could share the screen between tracking and C-Soft, or minimize C-Soft for a full-screen display of tracking. While C-Soft was minimized, its transmit function was operated by a custom foot pedal interface provided by 3rd St. R & D. The user only needed to bring C-Soft back to the top if an adjustment or channel change was required. The user could also play back the last 10 or 20 seconds of activity on the selected radio channel if necessary.
A pair of Vega C6200 dispatch consoles provided supervision of the radio system in 3rd St. R & D's Comm center, which was adjacent to the NOC and COC. This photo was taken before the second console was integrated, while it was still in Telluride, Colorado.
Tying Widely Separated Systems Together Over Internet Connectivity
While the systems at the DARPA Grand Challenge and the Telluride Film Festival were both running, and in preparation for the tie between Fontana and Primm, we did a concept demonstration to show how widely separated systems can be joined with VoIP and some router magic over Internet connectivity. Separate Vega C6200 consoles, each with VoIP and analog capability, were utilized in the Grand Challenge and Telluride systems. The analog interfaces were connected to the radio systems on each end, and the IP connections went to the Internet. The two consoles were arranged so that each had its local system in channel positions 1-6, and the remote system in channel positions 7-12. Either console could control both systems, and the Crosspatch function could bridge the systems together. A user on the street in Telluride could talk to a user in the desert in Primm, Nevada, either with radios or with laptops running C-Soft. Three different configurations of C-Soft were used:
- The Primm laptops were running the 6-channel version of C-Soft with the DGC channel layout.
- Two of the Telluride laptops were running the 6-channel version of C-Soft with the Telluride channel layout.
- A third laptop in Telluride was running the 12-channel version of C-Soft with both systems in a common layout.
The two consoles also had private intercom capability between them without going out over the air in either system. The two radio systems were very similar in configuration, but also significantly different:
- The DGC system was controlled with Digitac Comparators, while the Telluride system was controlled with SpectraTac Comparators.
- The DGC system utilized Class C DIA T1 connectivity to the Internet, while Telluride utilized a cable modem from Adelphia.
The intent of the demonstration was to show the applicability of the Internet cross-tie, and to do it with very different types of Internet connectivity. The cable modem connection had more than adequate bandwidth to handle significant VoIP loading. What surprised us was the almost complete lack of latency over the Internet connection. Everyone has had the irritating experience of talking on digital cellphones and dealing with the latency that is unavoidable. And if you've ever heard radio traffic audio bleeding across a cellphone call, you have experienced the delay between the real-time radio audio and the delayed cellphone audio. However, while testing the system tie, we noticed that the latency of the VoIP radio system connection was much less than that of a cellphone call between Primm and Telluride, so much so that the radio audio bleeding into the cellphone call was nearly simultaneous with it.
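We judged that latency by ear, but it could have been quantified with something as simple as a UDP echo probe between the two ends. The sketch below assumes a cooperating echo responder at the far end; the address and port are placeholders and were not part of the event network.

import socket
import statistics
import time

PEER = ("192.0.2.10", 9000)   # placeholder address/port for a far-end UDP echo responder

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(2.0)

samples = []
for i in range(20):
    start = time.monotonic()
    sock.sendto(f"probe {i}".encode(), PEER)
    try:
        sock.recvfrom(64)                            # wait for the echo
        samples.append((time.monotonic() - start) * 1000.0)
    except socket.timeout:
        pass                                         # lost probe; skip it
    time.sleep(0.25)

if samples:
    print(f"round trip: median {statistics.median(samples):.1f} ms, "
          f"max {max(samples):.1f} ms over {len(samples)} replies")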
Teardown and Retrograde
The voting system was taken off-line on October 9, 2005, and teardown began. The system repeaters remained in operation through end-of-day October 10. The last-mile microwave hop remained in operation through October 12 so that DARPA personnel could complete their inventory and have Internet access. Teardown and packing were completed, and the trucks left Nevada with the whole system on board on October 14, 2005.
The 2005 DARPA Grand Challenge was a rousing success, not only for DARPA, but for 3rd St. R & D. The only real disappointment is that somebody won the race, so we won't get to do it again. We had an extraordinary opportunity to develop new techniques and to provide comm support to an important event.

3rd St. R & D would be remiss if we did not mention and thank all the vendors, suppliers, manufacturers, and individuals who contributed to our team and made the project possible: Motorola - Radios and Infrastructure - Jerry West at Motorola; and Carter, Milton, Mark, John Sr. and John Jr. at Radiophone Engineering. And thanks to the extraordinary people, agencies, and contractors who made up the overall Grand Challenge team and who threw one great event. Thanks for having us on your team:
3rd St. R & D is available to consult on land mobile radio designs and installations, and on advanced networking disciplines in support of those systems. If we can be of service to you, please give us a call. Our deployable services are available for disaster restoration, special events, or any application where you need serious communications quickly. Thanks for visiting. If you enjoyed the article, drop us an email and let us know.