ABSTRACT

The conventional focus of the data fusion community has been on using physical sensor sources such as visual and infrared imagery, radar, satellite, and acoustic sensor data to observe physical entities like troops, vehicles, weapon systems, or other objects. This sensor-based approach has been useful for performing tracking, situation assessment, and threat assessment in military operations [1,2]. Two recent factors have prompted a major reassessment of this paradigm. First, military emphasis has largely shifted from conventional warfare to the challenges of counterinsurgency and counterterrorism [3]. Second, the emerging concept of human-centered information fusion [4] explores new ways in which humans and computer systems can work together to optimally utilize the capabilities of physical sensors, computer hardware and software, supporting cyber-infrastructure, and human beings [5–7].