Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person
Kalyani Mule, J. V. Shinde
Section: Research Paper, Product Type: Journal Paper
Volume-7, Issue-8, Page no. 88-93, Aug-2019
CrossRef-DOI: https://doi.org/10.26438/ijcse/v7i8.8893
Online published on Aug 31, 2019
Copyright © Kalyani Mule, J. V. Shinde. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
How to Cite this Paper
IEEE Style Citation: Kalyani Mule, J. V. Shinde, “Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person,” International Journal of Computer Sciences and Engineering, Vol.7, Issue.8, pp.88-93, 2019.
MLA Style Citation: Kalyani Mule, J. V. Shinde. "Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person." International Journal of Computer Sciences and Engineering 7.8 (2019): 88-93.
APA Style Citation: Kalyani Mule, J. V. Shinde (2019). Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person. International Journal of Computer Sciences and Engineering, 7(8), 88-93.
BibTex Style Citation:
@article{Mule_2019,
author = {Kalyani Mule and J. V. Shinde},
title = {Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {August 2019},
volume = {7},
number = {8},
month = {August},
year = {2019},
issn = {2347-2693},
pages = {88-93},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=4795},
doi = {10.26438/ijcse/v7i8.8893},
publisher = {IJCSE, Indore, INDIA},
}
RIS Style Citation:
TY - JOUR
DO - 10.26438/ijcse/v7i8.8893
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=4795
TI - Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person
T2 - International Journal of Computer Sciences and Engineering
AU - Kalyani Mule
AU - J. V. Shinde
PY - 2019
DA - 2019/08/31
PB - IJCSE, Indore, INDIA
SP - 88
EP - 93
IS - 8
VL - 7
SN - 2347-2693
ER -
Abstract
In today's life, the problems faced by visually impaired persons are increasing due to the rapid growth of urbanization in cities. Even a sighted person can get confused when arriving at an unfamiliar location. To address this problem, this paper proposes a new, robust system that helps the user navigate large industrial buildings. The system uses BLE (Bluetooth Low Energy) devices to communicate with the hardware carried by the user and then directs the user along the route. The process starts with the user speaking the desired destination; the system then determines the user's location by connecting the hardware to the surrounding BLE devices, and the user is navigated based on the signal strength received from each BLE device. If the signal strength of a BLE device decreases, the user is moving away from that device; similarly, if the signal strength of a particular device increases, the user is moving towards it. To obtain accurate results, we implement a three-dimensional triangulation technique in which the hardware carried by the user connects simultaneously to multiple BLE devices and then computes the required navigation route. In addition, we provide IR (infra-red) and SONAR sensors to detect any obstacle that comes between the user and the navigation path. A buzzer and LED lights are added to alert others to the obstacle.
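The abstract describes the triangulation step only at a high level; the following Python sketch illustrates one common way such a step can be realized: RSSI readings from several beacons are converted to approximate ranges with a log-distance path-loss model, and the user's 3-D position is recovered by least squares. The beacon coordinates, tx_power, and path-loss exponent below are illustrative assumptions, not values reported by the authors.

```python
# Minimal sketch (not the authors' implementation) of BLE-based 3-D triangulation:
# RSSI -> approximate distance (log-distance path-loss model), then a
# least-squares position fix from several beacons with known coordinates.

import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, path_loss_exp=2.0):
    """Estimate distance in metres from an RSSI reading in dBm.
    tx_power (RSSI at 1 m) and path_loss_exp are assumed calibration values."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def trilaterate(beacons, distances):
    """Least-squares 3-D position from known beacon positions and estimated ranges.

    beacons   : (n, 3) array of beacon coordinates
    distances : (n,) array of estimated ranges to each beacon
    """
    beacons = np.asarray(beacons, dtype=float)
    distances = np.asarray(distances, dtype=float)
    # Linearise the sphere equations by subtracting the last beacon's equation
    # from the others: 2*(p_i - p_ref) . x = |p_i|^2 - |p_ref|^2 - d_i^2 + d_ref^2
    ref, d_ref = beacons[-1], distances[-1]
    A = 2 * (beacons[:-1] - ref)
    b = (np.sum(beacons[:-1] ** 2, axis=1) - np.sum(ref ** 2)
         - distances[:-1] ** 2 + d_ref ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example with four beacons mounted at slightly different heights (assumed
# layout) so the 3-D solve is well-posed, and one RSSI scan in dBm.
beacons = [(0.0, 0.0, 3.0), (10.0, 0.0, 2.5), (0.0, 8.0, 2.5), (10.0, 8.0, 3.0)]
rssi_scan = [-65, -72, -70, -78]                      # one reading per beacon
ranges = [rssi_to_distance(r) for r in rssi_scan]
print("Estimated user position:", trilaterate(beacons, ranges))
```

In practice the raw RSSI values would be smoothed (e.g. averaged over several scans) before conversion, since indoor multipath makes single readings noisy; the sketch omits that filtering for brevity.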
Key-Words / Index Term
Indoor navigation, BLE beacon technology for triangulation, blind navigation, wayfinding, robotic navigation aid, pose estimation