Author/Creator ORCID




Information Systems



Citation of Original Publication


This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu
Access limited to the UMBC community. Item may possibly be obtained via Interlibrary Loan through a local library, pending author/copyright holder's permission.



Artificial Intelligence (AI) solutions in various domains have become an integral part of everyday life, and smart use of AI in disaster management can provide highly relevant information that saves lives, minimizes infrastructure damage by predicting disaster-related events, and extracts timely information for crucial decision-making. We have witnessed numerous disasters throughout the United States and in our own neighborhoods that required reliable, authentic, and quickly processed information to prioritize rescue and resource-allocation tasks during emergencies. In this work, we propose a scalable, holistic framework capable of handling environmental sensors and social media sensing for disaster management, and we investigate multiple AI solutions to assist in resource management during an emergency. We explore the deeper sentiments of social media users during an emergency event with an emotional change-point detection algorithm that extracts contextual information using an unsupervised sentiment analysis approach, and we also propose an algorithm to expand and enrich the emotion lexicon to improve the performance of the emotional analysis model. Next, we investigate transfer learning techniques (a machine learning research area focused on transferring knowledge gained in one domain to another) and pre-trained NLP models to classify disaster-relevant tweets with minimal labeling while fine-tuning the pre-trained language model on a disaster in a new target location. We achieved 95% accuracy with only 5% labeled data on a public dataset. We evaluated our proposed text classification methods against other pre-trained and non-pre-trained (simple) models, including on a new self-collected local dataset, and we discuss our findings and insights from comparing the different classification models.
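The emotional change-point idea described above can be illustrated with a minimal sketch: score each tweet against an emotion lexicon, then flag positions where the mean sentiment of adjacent windows shifts sharply. The toy lexicon, window size, and threshold below are illustrative assumptions, not the dissertation's actual lexicon or algorithm.

```python
# Toy emotion lexicon (assumed for illustration; the dissertation expands
# and enriches a real lexicon with a dedicated algorithm).
EMOTION_LEXICON = {
    "flood": -1.0, "scared": -1.0, "trapped": -1.0, "help": -0.5,
    "safe": 1.0, "relief": 1.0, "rescued": 1.0, "ok": 0.5,
}

def sentiment(tweet: str) -> float:
    """Average lexicon score over the words of one tweet (0.0 if no hits)."""
    hits = [EMOTION_LEXICON[w] for w in tweet.lower().split() if w in EMOTION_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def change_points(scores, window=3, threshold=0.8):
    """Flag indices where the mean sentiment of adjacent windows shifts sharply."""
    points = []
    for i in range(window, len(scores) - window + 1):
        before = sum(scores[i - window:i]) / window
        after = sum(scores[i:i + window]) / window
        if abs(after - before) >= threshold:
            points.append(i)
    return points

# A stream drifting from calm to distressed tweets:
stream = ["all ok here", "feeling safe", "rain but ok",
          "flood water rising help", "trapped and scared", "flood help"]
scores = [sentiment(t) for t in stream]
print(change_points(scores))  # the emotional shift is flagged near index 3
```

In practice the sentiment scores would come from an unsupervised model over an enriched lexicon rather than a hand-written dictionary, but the windowed mean-shift test captures the detection step.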
We evaluated our proposed classification model on public datasets (CrisisLex) and on our own dataset collected in Ellicott City, Maryland, USA, to demonstrate its adaptability. We then combine the physical sensors' data with our social-sensing text classification model to build a more reliable and accurate flood detection and classification model. We deployed physical sensor systems for flood monitoring and assessment in a neighborhood testbed (Ellicott City, Maryland, USA) and acquired data via environmental sensors and a social media platform (Twitter). We propose and demonstrate decision-level fusion of physical and social sensing data, which improves the performance of the flood-related tweet classification model. Finally, we propose another AI solution: an interactive question-answering (Q&A) chatbot, called "FloodBot," for localized flood inquiries. FloodBot combines all available data sources (camera images, social media, water level, weather, etc.) into a local multi-data corpus and uses BERT to extract answers to user questions. It serves as a two-way communication medium and support guide that regularly provides contextual, localized information about the disaster and answers users' specific questions. By incorporating multiple data sources for new disaster-event detection and by decreasing the need for labeled data, our scalable and holistic cyber-physical disaster management framework increases its resiliency and reliability.
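Decision-level fusion as described above can be sketched as a weighted combination of per-modality flood probabilities. The linear gauge-to-probability ramp, the flood-stage value, and the 0.6 sensor weight below are illustrative assumptions, not the configuration actually deployed in the Ellicott City testbed.

```python
def sensor_flood_prob(water_level_cm: float, flood_stage_cm: float = 100.0) -> float:
    """Map a water-level gauge reading to a [0, 1] flood probability
    (assumed linear ramp up to an illustrative flood stage)."""
    return max(0.0, min(1.0, water_level_cm / flood_stage_cm))

def fuse(p_sensor: float, p_text: float, w_sensor: float = 0.6) -> float:
    """Decision-level fusion: weighted average of the physical-sensor
    probability and the social-sensing text classifier probability."""
    return w_sensor * p_sensor + (1.0 - w_sensor) * p_text

p_sensor = sensor_flood_prob(water_level_cm=85.0)  # gauge near flood stage
p_text = 0.7                                       # classifier score on recent tweets
p_fused = fuse(p_sensor, p_text)
print(f"fused flood probability: {p_fused:.2f}")   # higher than the text score alone
```

Fusing at the decision level (combining each model's final score) rather than at the feature level keeps the two pipelines independent, so a failure or gap in one modality degrades the fused estimate gracefully instead of breaking a joint model.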