
Displays that show intent, as well as the algorithms that develop the intent, must be matured. Ground-breaking work in this area is currently being undertaken by J-UCAS and AFRL; further work is needed to migrate this technology to smaller and less expensive systems. These displays must also show the operator what is going on at a glance, and must fit into the lightweight system requirements outlined above. Additionally, significant work has been accomplished to improve man-machine interfaces in non-UA programs, and these improvements (such as tactile stimulation to improve situational awareness) need to be investigated as part of the UA C3 and ground control processes.
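As a concrete illustration of a display that conveys intent at a glance, the following is a minimal sketch of a hypothetical intent report a ground station might render; every state name and field here is an illustrative assumption, not drawn from the J-UCAS or AFRL work.

    from dataclasses import dataclass
    from enum import Enum

    class IntentState(Enum):
        """Hypothetical high-level intent states a UA might report."""
        TRANSIT = "transit"
        LOITER = "loiter"
        REROUTING = "rerouting"            # autonomy is replanning a route
        RETURN_TO_BASE = "return to base"

    @dataclass
    class IntentReport:
        """One at-a-glance intent record for an operator display."""
        vehicle_id: str
        state: IntentState
        rationale: str                     # why the autonomy chose this intent
        next_waypoint: tuple               # (lat, lon) of next planned point
        eta_seconds: float

        def summary(self) -> str:
            """Single line suitable for a lightweight display."""
            lat, lon = self.next_waypoint
            return (f"{self.vehicle_id}: {self.state.value} -> "
                    f"({lat:.4f}, {lon:.4f}) in {self.eta_seconds:.0f} s "
                    f"[{self.rationale}]")

    # What the operator sees at a glance for one aircraft.
    report = IntentReport("UA-07", IntentState.REROUTING,
                          "threat reported on planned route",
                          (34.1234, -117.5678), 240.0)
    print(report.summary())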

• Voice Control. One area that may not be receiving the attention it deserves is the capability to command the UA by voice. Voice recognition technology has existed for years, but only recently have advances in algorithms and hardware made it practical for small and critical applications. DoD Science and Technology (S&T) organizations continue to research and develop this technology, and DoD programs can also begin taking advantage of developments in the commercial sector to let the operator interface with a UA via voice. Now is the time to harvest that research and apply it to reducing the complexity of command and control interfaces for small UA (a sketch of one such command grammar follows this list).

• Multi-Vehicle Control. Advancing the state of the art in all of the areas discussed above allows a single person to control multiple aircraft. Highly autonomous aircraft have reduced requirements for ground equipment and communications and can leverage advances in displays and voice control. The benefits are reduced manpower, reduced hardware (and therefore a smaller logistics footprint), and increased effectiveness (see the supervisory-control sketch after the list below).
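As a concrete illustration of the voice-control idea above, the following is a minimal sketch of the command-parsing stage, assuming the speech-to-text step is handled by a separate recognizer; the phrase grammar and command names are illustrative assumptions, not any program's actual vocabulary.

    import re

    # Constrained command grammar: each pattern maps an utterance to a
    # (command, argument) pair. The vocabulary below is illustrative.
    COMMAND_PATTERNS = [
        (re.compile(r"orbit (?:at )?(\d+) feet"),  lambda m: ("ORBIT", int(m.group(1)))),
        (re.compile(r"proceed to waypoint (\w+)"), lambda m: ("GOTO", m.group(1))),
        (re.compile(r"return to base"),            lambda m: ("RTB", None)),
    ]

    def parse_voice_command(recognized_text):
        """Map text from a speech recognizer to a UA command tuple."""
        text = recognized_text.lower().strip()
        for pattern, build in COMMAND_PATTERNS:
            match = pattern.fullmatch(text)
            if match:
                return build(match)
        return ("UNRECOGNIZED", recognized_text)   # defer to the operator

    # Example utterances as a speech engine might emit them.
    for utterance in ["Orbit at 500 feet", "Proceed to waypoint ALPHA", "climb"]:
        print(utterance, "->", parse_voice_command(utterance))

A constrained grammar like this trades vocabulary breadth for reliability, which suits small UA where misrecognized commands are costly.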
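As a sketch of the multi-vehicle supervisory pattern, the following shows one operator cycle over a small fleet in which routine flight is handled by each aircraft's autonomy and only unresolved events reach the operator; the AutonomousVehicle interface is a hypothetical construct for illustration.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class AutonomousVehicle:
        vehicle_id: str
        route: deque              # remaining waypoints; autonomy flies these

        def step(self):
            """Advance one control cycle; return an exception event or None."""
            if not self.route:
                return f"{self.vehicle_id}: route complete, request new tasking"
            self.route.popleft()  # normal flight handled by the autonomy
            return None

    def supervise(fleet):
        """One operator cycle over the whole fleet; surface only exceptions."""
        operator_queue = []
        for vehicle in fleet:
            event = vehicle.step()
            if event is not None:
                operator_queue.append(event)
        return operator_queue

    fleet = [AutonomousVehicle("UA-1", deque(["WP1", "WP2"])),
             AutonomousVehicle("UA-2", deque())]
    print(supervise(fleet))       # only UA-2 needs the operator's attention

This management-by-exception pattern is what lets manpower scale sublinearly with fleet size: the operator's workload tracks the exception rate, not the number of aircraft.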

Flight Autonomy and Cognitive Processes

Advances in computer and communications technologies have enabled the development of autonomous unmanned systems. Vietnam-era remotely piloted vehicles (RPVs) were typically controlled by the manned aircraft that launched them, or by ground elements, and they required skilled operators. Some of these systems flew rudimentary mission profiles based on analog computers, but they remained primarily hand flown through the majority of the mission profile. In the 1970s, the Air Force embarked on the Compass Cope program to develop a high-altitude, long-endurance system capable of reconnaissance at long range. The Compass Cope systems were still hand flown.

In 1988 DARPA developed the first autonomous UA, a high-altitude, long-endurance aircraft called Condor, with a design goal of 150 hours at 60,000 feet. This aircraft was pre-programmed from takeoff to landing and had no direct manual inputs, e.g., no stick-and-rudder capability in the ground station. The system flew successfully 11 times, setting altitude and endurance records. The level of autonomy in this aircraft was limited to redundancy management of subsystems and selection of alternate runways, features it demonstrated several times during the flight test program. Next came Global Hawk and DarkStar, which advanced autonomy almost to Level 3 (see Figure D-5), with real-time health and diagnostics and substantial improvements in adaptive behavior to flight conditions and in-flight failures.
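To make the contingency behavior described above concrete, here is a minimal sketch of redundancy management over duplicated subsystems plus alternate-runway selection; the subsystem names and the divert rule are illustrative assumptions, not the Condor design.

    PRIMARY, BACKUP = "primary", "backup"

    def manage_redundancy(subsystem_health):
        """Pick the healthy unit for each duplicated subsystem."""
        selection = {}
        for name, units in subsystem_health.items():
            if units[PRIMARY]:
                selection[name] = PRIMARY
            elif units[BACKUP]:
                selection[name] = BACKUP
            else:
                selection[name] = "FAILED"   # triggers a recovery contingency
        return selection

    def choose_runway(selection, primary_rwy, alternate_rwy):
        """Divert to the alternate if any subsystem has fully failed."""
        return alternate_rwy if "FAILED" in selection.values() else primary_rwy

    # Primary navigation unit lost; the backup keeps the mission going.
    health = {"nav": {PRIMARY: False, BACKUP: True},
              "datalink": {PRIMARY: True, BACKUP: True}}
    sel = manage_redundancy(health)
    print(sel, choose_runway(sel, "RWY-18", "RWY-ALT"))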

The J-UCAS program is extending the work accomplished by these programs, advancing the state of the art in multi-aircraft cooperation. Cooperative decisions include coordinated navigation plan updates, communication plan reassignments, weapons allocations, and the accumulation of data from the entire squadron to arrive at an updated situational assessment. Cooperation in this context applies to cooperative actions among the J-UCAS aircraft. They will have inter-aircraft data links to allow transfer of information between them and the manned aircraft; the information may include mission plan updates, target designation information, image chips, and possibly other sensor data. Key mission decisions will be made based on the information passed between the systems. The J-UCAS will retain all of the subsystem management and contingency management autonomy of the previous generation of UA systems. The J-UCAS program plans to demonstrate at least Level 6 autonomy. Figure D-5 depicts where some UA stand in comparison to the ten levels of autonomy.
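As an illustration of the squadron-level data accumulation described above, the following is a minimal sketch of fusing track reports shared over an inter-aircraft data link into one updated situational picture; the message fields and the highest-confidence fusion rule are assumptions for illustration only.

    from dataclasses import dataclass

    @dataclass
    class TrackReport:
        sender_id: str          # which aircraft observed the target
        target_id: str
        position: tuple         # (lat, lon) estimate
        confidence: float       # sensor confidence in [0, 1]

    def fuse_situation(reports):
        """Keep the highest-confidence report per target across the squadron."""
        picture = {}
        for r in reports:
            best = picture.get(r.target_id)
            if best is None or r.confidence > best.confidence:
                picture[r.target_id] = r
        return picture

    # Two aircraft report the same target; the better look wins.
    reports = [TrackReport("UA-1", "T-42", (34.10, -117.50), 0.6),
               TrackReport("UA-2", "T-42", (34.11, -117.51), 0.9)]
    for target, best in fuse_situation(reports).items():
        print(target, "best seen by", best.sender_id, "conf", best.confidence)

Because every node runs the same fusion rule over the same shared reports, each aircraft arrives at a consistent situational assessment without a central controller.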
