Abstract:
Navigating everyday environments poses significant challenges for
impaired individuals, particularly in aviation settings, where safety and
precision are paramount. This thesis presents the development of a mobile
application designed to help impaired individuals navigate these complex
environments with greater ease and confidence. Built on Google Maps, the
application incorporates object detection to enhance the user experience and
provide crucial navigational aid.
Unlike conventional navigation apps, ours integrates a camera view
directly into the navigation interface. Users hold their phones facing forward
while walking, and the app detects and recognizes objects in their path.
Leveraging real-time image processing and machine learning, the system
identifies obstacles and provides immediate feedback, ensuring a safer and
more informed journey.
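The obstacle-feedback step can be illustrated with a minimal sketch. The names below (Detection, feedback_message, the 3-meter alert radius) are hypothetical illustrations, not the app's actual API:

```python
# Hypothetical sketch of the obstacle-feedback logic described above.
# The detector's real output format and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # object class from the detector, e.g. "chair"
    distance_m: float   # estimated distance to the user, in meters
    bearing: str        # "left", "center", or "right" of the camera view

def feedback_message(detections, alert_radius_m=3.0):
    """Return feedback strings for obstacles inside the alert radius,
    nearest first, so the most urgent warning is announced first."""
    nearby = [d for d in detections if d.distance_m <= alert_radius_m]
    nearby.sort(key=lambda d: d.distance_m)
    return [f"{d.label} {d.distance_m:.0f} meters ahead, {d.bearing}"
            for d in nearby]
```

In a real deployment these strings would be passed to a text-to-speech engine so the warning is spoken rather than displayed.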
The application not only facilitates independent navigation but also
empowers impaired individuals by increasing their mobility and confidence in
unfamiliar or challenging settings. The development process, technical
specifications, and user-testing outcomes discussed in this thesis demonstrate
the app's potential to significantly improve quality of life for impaired
individuals, particularly in the aviation context.
Through this work, we aim to contribute to the growing field of assistive
technology and to underscore the importance of inclusivity in technological
advancement. We believe this app will serve as a vital tool in bridging the gap
between impaired individuals and their environments, fostering greater
independence and accessibility.