Abstract: Vision is one of the most important senses that help people interact with the real world. There are nearly 200 million blind people all over the world, and being visually impaired hinders many day-to-day activities. It is therefore essential for blind people to understand their surroundings and to know what objects they interact with. This project proposes an Android application to help blind people see through a handheld device such as a mobile phone. It integrates various techniques to build a rich Android application that not only recognizes objects around visually impaired people in real time but also provides audio output to assist them as quickly as possible. For object recognition and detection, the application uses the Single Shot Detector (SSD) algorithm as well as You Only Look Once (YOLO), both of which are well-known deep-learning-based object detection models. While YOLO is recognized for its high-speed, single-pass detection capability, SSD is preferred for mobile applications due to its balance between speed and accuracy. Both algorithms ensure efficient real-time performance. The application further utilizes Android TensorFlow APIs for deep learning processing and the Android TextToSpeech API to generate audio output. By combining these technologies, the application aims to provide a seamless and efficient way for visually impaired individuals to perceive and interact with their surroundings.
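The detect-then-speak pipeline the abstract describes can be sketched as below. This is a minimal, hypothetical illustration, not the paper's implementation: the `Detection` record stands in for one box returned by a TensorFlow Lite SSD/YOLO model, and on a device the string produced by `announce` would be passed to Android's `TextToSpeech.speak()`; both the model inference and the speech call require the Android runtime, so they are stubbed out here.

```java
import java.util.*;
import java.util.stream.*;

public class DetectAndSpeak {
    // Hypothetical stand-in for one detection from an SSD/YOLO model.
    record Detection(String label, float confidence) {}

    // Keep only confident detections, order them by confidence, and phrase
    // them as a short sentence suitable for audio output. The 0.5 threshold
    // is an assumption, not a value from the paper.
    static Optional<String> announce(List<Detection> detections, float threshold) {
        List<String> labels = detections.stream()
            .filter(d -> d.confidence() >= threshold)
            .sorted(Comparator.comparing(Detection::confidence).reversed())
            .map(Detection::label)
            .distinct()
            .collect(Collectors.toList());
        return labels.isEmpty()
            ? Optional.empty()
            : Optional.of("I see " + String.join(", ", labels));
    }

    public static void main(String[] args) {
        List<Detection> fake = List.of(
            new Detection("chair", 0.9f),
            new Detection("dog", 0.3f),
            new Detection("person", 0.8f));
        // On Android this string would go to TextToSpeech.speak() instead.
        System.out.println(announce(fake, 0.5f).orElse("nothing detected"));
    }
}
```

Filtering by confidence before speaking matters on a phone: announcing every low-confidence box would flood the user with audio faster than the text-to-speech engine can deliver it.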
Published: 25-8-2025 | Issue: Vol. 25 No. 8 (2025) | Pages: 397-406 | Section: Articles
License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.