eScholarship@Morgan is an institutional repository for scholarship created by the faculty, students, and staff of Morgan State University. As part of the Maryland Shared Open Access Repository (MD-SOAR), eScholarship@Morgan provides long-term storage and public access to academic materials from the Morgan community, meeting the data management and open access requirements of grants and other funding agencies. The repository also includes historical materials relating to Morgan and the Morgan community.
To deposit materials, register at https://mdsoar.org. Once registered, a staff member from the library will contact you with instructions on uploading materials. Files may be up to 2GB in size, and may be deposited in any format. Please note that files may be converted to other formats for optimal preservation. The following formats are preferred:
Textual works: PDF
Images: TIFF, JPEG2000
Datasets: JSON, XML, TSV, CSV
Video: AVI, MOV, MP4
Please note that by submitting your work, you grant Morgan State University the right to reproduce, publicly display, and distribute your item at no cost to users world-wide.
For more information, contact the Earl S. Richardson Library at (443) 885-3477, or Jason Riggin at firstname.lastname@example.org.
Browsing eScholarship@Morgan by Subject "Aerospace engineering"
(2012) Sissinto, Paterne; Ladeji-Osias, Jumoke; Electrical and Computer Engineering; Doctor of Engineering
Visible and infrared image fusion technology effectively enhances image information content and rendering quality. There are many image fusion approaches that can be utilized to produce high-quality fused images. Outdoor scenes are nonlinear and random, and their images have independently distributed pixel colors. Many existing image fusion methods, including but not limited to Principal Component Analysis and Wavelet Transform based approaches, assume both linearity of image content and dependent pixel distributions when applied to grayscale fusion. Empirical Mode Decomposition (EMD) represents input images as Intrinsic Mode Functions (IMFs) carrying their spatial and frequency components about each pixel with no a priori assumption. Visible and infrared signals lie in different spectral ranges. Because spectral emissivity and reflectivity depend on wavelength, spatial contrast can be enhanced not only in visible video but also in infrared imagery. This work proposed a biologically inspired fusion of visible and infrared images based on EMD and opponent processing (Bio-EMD). First, registered visible and infrared captures of the same scene are decomposed into Intrinsic Mode Functions through EMD. Then, the fused image is generated by opponent processing the source IMFs. Finally, the results are evaluated based on the amount of information transferred, the clarity of details, the vividness of depictions, and the range of meaningful differences in lightness and chromaticity. Perceptual comparison of the results showed that opponent-processing-based techniques outperformed algorithms based on intensities, as well as some other techniques. Quantitative assessment confirmed that the proposed technique transferred twice as much information as the state-of-the-art methods did.
Bio-EMD provided a high level of sharpness, yielded more natural-looking colors, and produced visually meaningful differences of high magnitude in lightness and chromaticity for the fusion of low-light visible and infrared images. These results were obtained without optimizing the filters involved in the process.
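The fusion pipeline the abstract describes (decompose each registered image into components, then opponent-process them into a fused result) can be sketched in simplified form. The snippet below is not the authors' Bio-EMD: it substitutes a crude two-scale box-filter decomposition for true EMD/IMFs and a simple visible-minus-infrared difference for biological opponent processing, and all function names and weights are illustrative assumptions.

```python
import numpy as np

def two_scale_decompose(img, k=5):
    """Split an image into a coarse layer (box blur) and a detail
    residual -- a crude stand-in for EMD's Intrinsic Mode Functions."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    h, w = img.shape
    coarse = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            coarse[i, j] = padded[i:i + k, j:j + k].mean()
    return coarse, img.astype(float) - coarse

def opponent_fuse(visible, infrared, k=5):
    """Fuse registered grayscale visible/infrared images by pooling
    their detail layers and adding a V-IR opponent channel."""
    vc, vd = two_scale_decompose(visible, k)
    ic, id_ = two_scale_decompose(infrared, k)
    base = 0.5 * (vc + ic)                      # shared low-frequency structure
    detail = vd + id_                           # pooled high-frequency detail
    opponent = visible.astype(float) - infrared.astype(float)  # opponency term
    fused = base + detail + 0.25 * opponent     # 0.25 is an arbitrary weight
    return np.clip(fused, 0.0, 255.0)
```

A real implementation would replace `two_scale_decompose` with a proper sifting-based EMD (e.g. a library such as PyEMD) so that each IMF captures a distinct spatial-frequency band before opponent processing.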
(2012) Latimer, Bridgette A.; Scott, Craig J.; Electrical and Computer Engineering; Doctor of Engineering
The Federal Aviation Administration (FAA), National Transportation Safety Board (NTSB), National Aeronautics and Space Administration (NASA), numerous corporate entities, and research facilities have come together to determine ways to make air travel safer and more efficient. These efforts have resulted in the development of a concept known as the Next Generation of Aircraft (Next Gen). The Next Gen concept promises to be a clear departure from the way in which aircraft operations are performed today. The Next Gen initiatives require that modifications be made to the existing National Airspace System (NAS) concept of operations, system-level requirements, software (SW) and hardware (HW) requirements, and SW and HW designs and implementations. Another example of the changes in the NAS is the shift away from air traffic controllers having the responsibility for separation assurance. In the proposed new scheme of free flight, each aircraft would be responsible for assuring that it is safely separated from surrounding aircraft. Free flight would allow the separation minima for enroute aircraft to be reduced from 2,000 feet (ft) to 1,000 ft. Simply put, "Free Flight is a concept of air traffic management that permits pilots and controllers to share information and work together to manage air traffic from pre-flight through arrival without compromising safety." The primary goal of this research project was to create a conceptual model that embodies the essential ingredients needed for a collision detection and avoidance system. This system was required to operate in two modes: the air traffic controller's perspective and the pilot's perspective. The secondary goal was to demonstrate that the technologies, procedures, and decision logic embedded in the conceptual model were able to effectively detect and avoid collision risks from both perspectives.
Embodied in the conceptual model are five distinct software modules: Data Acquisition, State Processor, Projection, Collision Detection, and Alerting and Resolution. The underlying algorithms in the Projection module are linear projection and Kalman filtering, which are used to estimate the future state of the aircraft. The Alerting and Resolution module comprises two algorithms: a generic alerting algorithm and the potential fields algorithm. The conceptual model was created using Enterprise Architect®, and MATLAB® was used to code the methods and to simulate conflict scenarios.
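The Projection module's two techniques, Kalman filtering to estimate each aircraft's state and linear projection to extrapolate it, can be illustrated with a minimal sketch. This is not the dissertation's MATLAB implementation: the constant-velocity model, noise covariances, time step, and separation threshold below are all illustrative assumptions, shown here in 2D for brevity.

```python
import numpy as np

DT = 1.0  # assumed seconds between surveillance updates
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # only position is observed
Q = np.eye(4) * 0.01                          # assumed process noise
R = np.eye(2) * 0.1                           # assumed measurement noise

def kalman_step(x, P, z):
    """One predict/update cycle of a linear Kalman filter on the
    state x = [px, py, vx, vy] with covariance P and measurement z."""
    x = F @ x                                 # predict state forward
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)                   # correct with measurement
    P = (np.eye(4) - K @ H) @ P
    return x, P

def project_conflict(xa, xb, horizon, minima):
    """Linearly project two state estimates and report whether their
    separation falls below the minima within the look-ahead horizon."""
    for _ in range(horizon):
        xa, xb = F @ xa, F @ xb               # linear projection step
        if np.linalg.norm((xa - xb)[:2]) < minima:
            return True                       # collision risk detected
    return False
```

In a system like the one described, `kalman_step` would smooth noisy surveillance data into state estimates, `project_conflict` would feed the Collision Detection logic, and a detected risk would hand off to the alerting and potential-fields resolution algorithms.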