Cyber Physical Systems – Situation Analysis
DRAFT – March 9, 2012

GPS absolute localization data with data computed by a vision system to provide accurate vehicle position and orientation. Usually, one integrates the position and orientation data into a global reference using a map of the environment and then estimates the localization parameters with a particle filter.

Vehicle-state sensing is lower level and concentrates on measuring a vehicle's movement and monitoring its actuators. Examples include detection of vehicle position, velocity, and acceleration; engine pressure and temperature; tire pressure and temperature; friction coefficients; and similar variables.

In-vehicle environment sensing involves collecting information about the driver and the passengers, i.e., behavior monitoring. Specific topics include monitoring the driver's eye movements, vigilance, and tiredness; the interaction inside the car; and so forth. Sensing inside the vehicle is as important as sensing outside it. The driver's diminishing vigilance level has become a serious problem in traffic safety. NHTSA estimates that in the United States, drowsy drivers cause 100,000 accidents each year, resulting in more than 1,500 fatalities and 71,000 injuries (www.aaafoundation.org/). Among the different approaches in this field, monitoring the driver's head position has received considerable interest. Head position could be used to infer the driver's fatigue level (especially when combined with a driver-eye-gaze tracking system) and to implement a "smart" airbag.

Several researchers have proposed novel ideas for inferring driver fatigue. New ideas come from a psychology perspective that defines monotony as an exogenous contributing factor of fatigue. An integrated fatigue detection system uses driver-head-pose and eye-gaze tracking as well as road monotony analysis.
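The map-based localization step described above, integrating position estimates and then refining them with a particle filter, can be sketched in simplified one-dimensional form. This is an illustrative sketch only; the function name, noise levels, and measurement model are assumptions, not taken from any particular vehicle system.

```python
import math
import random

def particle_filter_step(particles, motion, gps_fix, gps_sigma=5.0):
    """One predict/update/resample cycle of a 1-D particle filter."""
    # Predict: move each particle by the odometry estimate plus process noise.
    particles = [p + motion + random.gauss(0.0, 1.0) for p in particles]
    # Update: weight particles by the GPS likelihood (Gaussian around the fix).
    weights = [math.exp(-0.5 * ((p - gps_fix) / gps_sigma) ** 2)
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
# Particles start spread over a 100 m stretch of road (illustrative units).
particles = [random.uniform(0.0, 100.0) for _ in range(500)]
for _ in range(20):
    particles = particle_filter_step(particles, motion=1.0, gps_fix=50.0)
estimate = sum(particles) / len(particles)
print(round(estimate, 1))  # the particle cloud converges near the GPS fix
```

In a real vehicle the measurement update would combine several likelihoods (GPS, vision landmarks matched against the map) rather than the single Gaussian used here.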
NHTSA also pointed out that although airbags saved more than 6,000 lives by the end of 2000, they also killed more than 200 occupants through inappropriate deployment (www.iihs.org/safety_facts/qanda/airbags.htm). In response, NHTSA issued a set of regulations mandating smart airbags that can adapt intelligently to the occupant. The head-position algorithm must be robust to lighting conditions and uncontrolled driver postures. Infrared cameras can help eliminate the disturbance of poor lighting conditions, and different signal processing algorithms can help deal with occlusions and with competing head-like objects in the scene (such as hands).

Sensing outside the vehicle and inside the vehicle pose different challenges. The major difference is that the illumination environment inside the car can potentially be controlled, while the outside illumination environment is subject to the weather. The challenges for in-car analysis are related to the analysis of human behavior, and teams of engineers and psychologists will be needed to address these problems. Regarding out-of-vehicle sensing, the current trend of letting "the sensors do the work" by sensing across the spectrum (e.g., infrared, thermal imaging) is useful but limited by the difficulty of the motion segmentation problem. Specifically, the humans or other vehicles to be detected are usually themselves moving, and ambiguities can arise in which a human or vehicle cannot be detected because of the particular way in which it is moving. Addressing these ambiguities requires fusing information from a variety of sensors.

Advanced user interfaces are also needed to convey to the driver the multitude of information that the system captures.
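The fusion idea mentioned above, combining several sensing modalities so that one ambiguous modality does not dominate the decision, can be sketched with a toy independent-evidence rule. The sensor names, confidence values, and threshold below are illustrative assumptions, not a real detection pipeline.

```python
def fuse_detections(confidences, threshold=0.5):
    """Combine per-sensor confidences that an object is a pedestrian.

    Independent-evidence fusion: the fused probability that the object
    is NOT a pedestrian is the product of the per-sensor miss
    probabilities, so any single confident sensor can tip the decision.
    """
    p_miss = 1.0
    for c in confidences:
        p_miss *= (1.0 - c)
    fused = 1.0 - p_miss
    return fused, fused >= threshold

# A moving pedestrian may be ambiguous to camera-based motion
# segmentation alone, but a thermal detection resolves the ambiguity.
sensor_confidences = {"camera": 0.4, "thermal": 0.8}
fused, is_pedestrian = fuse_detections(sensor_confidences.values())
print(round(fused, 2), is_pedestrian)  # 0.88 True
```

The design choice here is that fusion is multiplicative over misses rather than an average: a weak camera score (0.4) plus a strong thermal score (0.8) yields a fused confidence (0.88) higher than either alone, which is exactly the behavior needed to eliminate single-modality ambiguities.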
Intelligent assistance systems can show drivers more information, for example using smart-tire-monitor sensors to display the status of the wheels or the space around the car. As a result, information visualization, display placement, and viewing methods are currently attracting attention in the broad engineering community and in the human-machine interaction disciplines. The newest ideas propose interfaces that maximize the information represented. One idea is to collapse many of the separate dashboard controls, displays, and systems into a single multifunction display. A more challenging idea is to switch the representation of information to match different driving situations
depending on context (e.g., city versus highway driving). At the same time, it is easy to overload the average human driver with information, so it is important to investigate this topic together with psychologists and ergonomists.

Aviation

NAS, which is currently used to manage air traffic, is generally thought of as safe and effective but not as coordinated and efficient as possible. For this reason, the Federal Aviation Administration (FAA) has envisioned NextGen replacing NAS. While NextGen will not be fully implemented until 2025, substantial benefits are already emerging. For example, in 2010 the FAA approved the use of Automatic Dependent Surveillance-Broadcast (ADS-B). ADS-B provides both air traffic control and pilots of ADS-B-equipped aircraft with a constantly updated display of real-time traffic and other information such as speed, altitude, and aircraft type. ADS-B also provides weather information, allowing for heightened situation awareness. ADS-B technology will be required for aircraft in most airspace by January 1, 2020.239

Another developing technology in NextGen is performance-based navigation (PBN). PBN allows for more efficient design of airspace and procedures, which leads to improved safety, airspace access, and predictability of operations, as well as reductions in delays, fuel use, emissions, and noise. The PBN framework defines aircraft performance requirements and provides a basis for the design and implementation of automated flight paths, airspace design, and obstacle clearance, and it is not constrained by the location of ground navigation aids.
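The performance-requirement idea at the heart of PBN can be illustrated with a toy monitoring check: the aircraft compares its estimated navigation uncertainty against the value required for the current operation and alerts the crew when the requirement is not met. The phase names and numeric values below are illustrative assumptions, not real avionics logic.

```python
# Illustrative required-performance values, in nautical miles, keyed by
# flight phase (assumed for this sketch).
RNP_REQUIREMENTS_NM = {
    "oceanic": 4.0,
    "terminal": 1.0,
    "approach": 0.3,
}

def rnp_alert(phase, estimated_uncertainty_nm):
    """Return True if the crew should be alerted: the estimated position
    uncertainty exceeds the performance required for this phase."""
    required = RNP_REQUIREMENTS_NM[phase]
    return estimated_uncertainty_nm > required

# The same 0.5 nm uncertainty qualifies for a terminal operation but
# triggers an alert on a tighter approach procedure.
print(rnp_alert("terminal", 0.5))   # False: within the 1.0 nm requirement
print(rnp_alert("approach", 0.5))   # True: exceeds the 0.3 nm requirement
```

This mirrors the key PBN shift described above: the operation is defined by a performance level the aircraft must demonstrate, not by proximity to any particular ground navigation aid.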
The two main components of PBN are Area Navigation (RNAV), which enables more flexibility for point-to-point operations, and Required Navigation Performance (RNP), which monitors the navigation performance of the aircraft and alerts the crew if any requirement is not met during an operation. Both RNAV and RNP allow the aircraft to determine whether it can safely qualify for an operation at the specified performance level. While the use of PBN is not yet widespread, its benefits are already being seen at some airports. In Atlanta, arrivals that used PBN procedures saved hundreds of thousands of gallons of fuel, and emissions of carbon dioxide and other air pollutants were reduced by thousands of tons.240

The use of UAVs is also indicative of an increase in autonomy in defense applications. UAVs are used frequently in military airspace and have even been introduced into civilian airspace for data collection purposes.241

BROAD CHALLENGES AND BARRIERS

There are a number of broad challenges, both technical and non-technical, that must be overcome before the envisioned smart transportation systems of the future can come to fruition. The challenges today are greater than those faced in the past and will continue to grow as individual systems evolve, operate with greater autonomy and intelligence, and operate as part of a networked system of systems. The challenges will also increase in complexity as future generations of unmanned air systems begin operating in national airspace.242

239 FAA. "NextGen Implementation Plan."
240 FAA. "NextGen Implementation Plan." http://www.faa.gov/news/fact_sheets/news_story.cfm?newsid=8768
241 Poovendran et al. "2008 HCTCPS Workshop Report."
242 Winter. "CPS in Aerospace."