Virtual Seminar/Symposium

The objective of this research is to fuse streaming data from multiple sources and identify rare events in order to alert the user and meet mission requirements. The user can ask for specific information, or the machine learning system will learn the needs and interests of the user and forward new incoming data with a relevance score. Research questions such as the trustworthiness of data and the variation of values from the same source (such as a sensor, video camera, user tweet, or police incident report) are addressed, since data accuracy is uncertain and observations are noisy.


Abstract:


This research utilizes the best of database systems, knowledge representation, and machine learning to get the right data to the right user at the right time, with completeness and low noise. If the user's need is unmet, queries evolve and are modified to come closer to satisfying mission needs, which may themselves be unclear. If the need is only partially met, newly arriving streaming data is connected to the relevant queries. Applications of this research include assisting with security at military bases and the "missing person" problem. When a suspect or a person goes missing, police want to locate him or her. The same problem arises in Amber Alerts, prison escapes, and missing-children cases. When an incident report or 911 call arrives at a police station, a physical description of the missing person (e.g., a white male of medium build wearing a blue shirt and black jeans) is available. Families may give additional details about a missing child. This research also seeks to improve police interactions with persons experiencing mental health issues.
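To illustrate how a standing query built from such a physical description might score incoming streaming reports, here is a minimal sketch in Python; the field names and the scoring rule are hypothetical illustrations, not the system's actual implementation.

```python
# Hypothetical sketch: score an incoming sighting report against a standing
# query built from a missing-person description. Field names are invented.
description = {"sex": "male", "race": "white", "build": "medium",
               "shirt": "blue", "pants": "black jeans"}

def relevance_score(report, query):
    """Fraction of query attributes the report matches; fields absent from
    the report neither help nor hurt, reflecting uncertainty in streaming data."""
    matched = sum(1 for k, v in query.items() if report.get(k) == v)
    return matched / len(query)

sighting = {"sex": "male", "shirt": "blue", "location": "Main St"}
score = relevance_score(sighting, description)  # 2 of 5 fields match -> 0.4
```

A real engine would weight attributes by distinctiveness and by source trustworthiness, but even this toy score is enough to rank incoming reports and decide which ones to push to the user.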

We utilize neural networks to extract relevant objects from video and latent semantic indexing techniques to model topics in unstructured text. We present a unique Situational Knowledge Query Engine that continuously builds a multimodal relational knowledge base constructed using SQL queries and pushes dynamic content to relevant users through triggers based on modeling of users' interests. At present, data from the West Lafayette police is being analyzed to help identify suspicious activity and respond to disasters such as a school shooting.
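The latent semantic indexing step mentioned above can be sketched as a TF-IDF term-document matrix followed by a truncated SVD; the toy corpus of invented report snippets below is purely illustrative and not the engine's actual pipeline.

```python
import numpy as np

# Toy corpus of invented incident-report snippets (illustrative only).
docs = [
    "missing person wearing blue shirt black jeans",
    "traffic accident on highway near campus",
    "report of missing child wearing blue shirt",
]

# Term-document count matrix over a simple whitespace vocabulary.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1.0

# TF-IDF weighting, then a rank-k SVD to obtain latent "topics".
idf = np.log(len(docs) / (A > 0).sum(axis=1))
A *= idf[:, None]
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_latent = (S[:k, None] * Vt[:k]).T   # each row: a document in topic space

def relevance(query):
    """Cosine similarity between a query and each document in latent space."""
    q = np.zeros(len(vocab))
    for w in query.split():
        if w in index:
            q[index[w]] += idf[index[w]]
    q_latent = U[:, :k].T @ q           # fold the query into topic space
    norms = np.linalg.norm(doc_latent, axis=1) * np.linalg.norm(q_latent)
    return doc_latent @ q_latent / (norms + 1e-12)

scores = relevance("missing person blue shirt")
```

Documents that share vocabulary with the query (the two missing-person snippets) score higher than the unrelated traffic report, which is the ranking behavior a relevance engine needs before pushing alerts to interested users.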

https://www.cs.purdue.edu/news/articles/2019/bhargava-realm-ng.html

 

Biosketch:

Bharat Bhargava is a professor of computer science at Purdue University. He is leading a Northrop Grumman-sponsored consortium on Real Applications of Machine Learning (REALM) with MIT, CMU, and Stanford. https://www.cs.purdue.edu/news/articles/2019/bhargava-realm-ng.html.

He is contributing to the Department of Defense's Science of Artificial Intelligence and Learning for Open-world Novelty (SAIL-ON) program and to another project with NGC on explainable AI and adversarial machine learning. He works with Sandia Corporation on Science and Technology for Advancing Resilience for Contested Space (STARCS) to maintain mission capabilities of the US Space Enterprise, with the Jet Propulsion Laboratory to predict attacks on space systems, and with Ford Motor Company on software-defined networking for vehicle-to-vehicle communication.

Prof. Bhargava has won eight best paper awards in addition to the Technical Achievement Award and Golden Core Award from IEEE, and is a Fellow of the IEEE. He was the major thesis advisor of the first African American woman to receive a Ph.D. in the history of the Computer Science department at Purdue, in May 2019. https://www.purdueexponent.org/campus/article_20698968-b471-11e9-bd28-9be90c622f07.html.

Professor Bhargava has worked extensively at the research laboratories of the Air Force and Navy. He has successfully completed several DARPA, Navy STTR, and AFRL projects. www.cs.purdue.edu/homes/bb
