Information Fusion and Visualization | College of Information Sciences and Technology


Information Fusion and Visualization

About: Multi-sensor information fusion combines information from multiple sensors and sources to achieve inferences that are not feasible from any single sensor or source. The proliferation of micro- and nano-scale sensors, wireless communication, and ubiquitous computing enables the assembly of information from physical sensors, humans acting as observers, online data sources, and models. This information supports a wide variety of applications, such as environmental monitoring, crisis management, medical diagnosis, monitoring and control of manufacturing processes, and intelligent buildings. A key problem is how to integrate, or fuse, information from heterogeneous sources. Techniques for such information fusion are drawn from a broad set of disciplines, including statistical estimation, signal and image processing, artificial intelligence, and the information sciences. Major issues include architectures for distributed sensing and processing, selection and integration of algorithms, the role of the human-in-the-loop in analysis and decision-making, and the degree of automation and computer-aided cognition. A related research area, data visualization, explores how advanced visualization and human-machine interaction can support the understanding and analysis of large, complex data sets.
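As a minimal illustration of the statistical-estimation techniques mentioned above, the sketch below fuses independent noisy measurements of the same quantity by inverse-variance weighting, one of the simplest fusion rules. The sensor readings and variances are hypothetical, chosen only for illustration:

```python
def fuse(measurements):
    """Fuse independent Gaussian measurements, given as (value, variance)
    pairs, by inverse-variance weighting.

    Returns (fused_value, fused_variance). More precise sensors (smaller
    variance) pull the fused estimate toward their reading, and the fused
    variance is smaller than that of any single sensor.
    """
    total_precision = sum(1.0 / var for _, var in measurements)
    fused_value = sum(v / var for v, var in measurements) / total_precision
    return fused_value, 1.0 / total_precision

# Two hypothetical temperature sensors observing the same room,
# as (degrees C, variance):
readings = [(21.0, 4.0), (23.0, 1.0)]
value, variance = fuse(readings)
# The result lies closer to the more precise second sensor (22.6 C),
# with variance 0.8 -- below either sensor's individual variance.
```

The same weighting generalizes to many sensors and, in vector form, underlies standard estimators such as the Kalman filter update.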

Areas of Strength: The Center for Network-Centric Cognition and Information Fusion (NC2IF) provides a focus on sensor and information fusion and data visualization. Researchers associated with the center explore the information chain from energy detection via sensors and human observation through physical modeling, signal and image processing, pattern recognition, knowledge creation, and information infrastructure to human decision-making, all in the context of organizations and the nation. Our research focuses on the gap between the collection of reports and data in computer systems and the knowledge and decisions in the minds of computer users. NC2IF also hosts the Extreme Events Laboratory (EEL), designed primarily to support research and experimentation in hard and soft data fusion, visualization, and sonification (i.e., the translation of data into sound so that human hearing can be used for data analysis, classification, and anomaly detection). This facility allows our researchers to run end-to-end experiments that improve situational awareness and enhance our ability to leverage all available sensors, human observers, and technology to escape "information overload" and extract the meaning hidden within the vast volumes of available data.
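To make the idea of sonification concrete, the sketch below maps a data series to audible pitch, using only the Python standard library: each value becomes a short pure tone, with higher values producing higher frequencies, so an anomaly stands out as a pitch jump. The series, frequency range, and output filename are all illustrative assumptions, not details of the EEL's actual systems:

```python
import math
import struct
import wave

def sonify(values, path, rate=8000, tone_ms=150):
    """Map each data value to a pure tone whose pitch rises with the
    value, and write the tone sequence as a mono 16-bit WAV file."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat series
    samples_per_tone = int(rate * tone_ms / 1000)
    frames = bytearray()
    for v in values:
        # Linearly map the value into a 220-880 Hz (two-octave) range.
        freq = 220.0 + 660.0 * (v - lo) / span
        for i in range(samples_per_tone):
            sample = int(30000 * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", sample)  # little-endian 16-bit PCM
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# A spike in an otherwise flat series becomes an audible jump in pitch:
sonify([1, 1, 1, 9, 1, 1], "anomaly.wav")
```

Listening to the resulting file, the fourth tone is markedly higher than its neighbors, which is the basic mechanism by which sonification exposes anomalies to the ear.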

Faculty: Guoray Cai, Nicklaus Giacobe, Jacob Graham, Sharon Huang, Jeff Rimland, John Yen, Luke Zhang