Sign Language Based ATM Accessing System for the Visually Impaired

Authors

  • Sajad Ali A Department of ECE, Islamiah Institute of Technology, Bangalore, India
  • Daniya Afreen Department of ECE, Islamiah Institute of Technology, Bangalore, India
  • Noor Ayesha Department of ECE, Islamiah Institute of Technology, Bangalore, India
  • Revansidha Revansidha Department of ECE, Islamiah Institute of Technology, Bangalore, India

DOI:

https://doi.org/10.5281/zenodo.6543954

Keywords:

Hand gestures recognition, Real time image processing, ATM for Blind, ATM machines

Abstract

Around 285 million people worldwide are visually impaired, and about 35 million of them are blind. Visual impairment takes three forms: night blindness, complete blindness, and colour blindness. One of the difficulties blind people face in daily life arises during ATM transactions. Although ATMs are fitted with Braille labels, these do not fully eliminate the problems they encounter; the main difficulty is performing monetary transactions, since withdrawing money from an ATM is both hard and risky for a blind person. To improve their security while withdrawing money, we propose to create a secure environment for them by designing a sign language based ATM accessing system, realized using real-time video processing. It has also been observed that blind users struggle to insert their ATM card into the designated slot. Therefore, instead of a magnetic-stripe card, we propose a colour-coded ATM card: the account is accessed by placing the card in front of a camera, and the same camera is then used to identify the user's hand gestures as their PIN.
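The colour-coded card idea described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes each account is registered against a reference card colour, averages the colour of a camera frame, and matches it to the nearest registered colour. All names, colour values, and account IDs below are hypothetical.

```python
# Hypothetical sketch: map a card's dominant colour to an account.
# Colours, thresholds, and account IDs are illustrative assumptions.

# Registered card colours (RGB) -> account IDs.
CARD_COLOURS = {
    "red":   ((200, 40, 40), "ACC-1001"),
    "green": ((40, 200, 40), "ACC-1002"),
    "blue":  ((40, 40, 200), "ACC-1003"),
}

def mean_colour(frame):
    """Average RGB over a frame given as rows of (r, g, b) pixels."""
    n, sums = 0, [0, 0, 0]
    for row in frame:
        for px in row:
            for i in range(3):
                sums[i] += px[i]
            n += 1
    return tuple(s / n for s in sums)

def identify_card(frame):
    """Return the account whose registered colour is nearest (Euclidean distance)."""
    avg = mean_colour(frame)
    _, (_, account) = min(
        CARD_COLOURS.items(),
        key=lambda kv: sum((a - b) ** 2 for a, b in zip(avg, kv[1][0])),
    )
    return account

# Example: a 2x2 frame of mostly-red pixels maps to the red card's account.
frame = [[(210, 50, 30), (190, 45, 50)], [(205, 30, 35), (195, 40, 45)]]
print(identify_card(frame))  # -> ACC-1001
```

In a real system the frame would come from the ATM camera, and the same video stream would then be passed to the gesture-recognition stage that interprets the user's signed PIN.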


References

J. Wang and T. Zhang, “An ARM-based embedded gesture recognition system using a data glove,” presented at the 26th Chinese Control and Decision Conf., Changsha, China, May 31 - June 2, 2014.

Shukor, M. F. Miskon, M. H. Jamal Uddin, F. A. Ibrahim, M. F. Asyraf and M. B. Bahar, “A new data glove approach for Malaysian sign language detection,” Procedia Comput. Sci., vol. 76, no. 1, pp. 60-67, Dec. 2015.

N. Sriram and M. Nithiyanandham, “A hand gesture recognition based communication system for silent speakers,” presented at the Int. Conf. Human Comput. Interact., Chennai, India, Aug. 23-24, 2013.

S. V. Matiwade and M. R. Dixit, “Electronic support system for deaf and dumb to interpret sign language of communication,” Int. J. Innov. Res. Sci. Eng. Technol., vol. 5, no. 5, pp. 8683-8689, May 2016.

K. Murakami and H. Taguchi, “Gesture recognition using recurrent neural networks,” in Proc. SIGCHI Conf. Human Factors Comput. Syst., Apr. 1991, pp. 237-242.

A. Gruen, “Adaptive least squares correlation: a powerful image matching technique,” South African J. Photogrammetry, Remote Sensing and Cartography, vol. 14, no. 3, pp. 175-187, 1985.

S. Huang, C. Mao, J. Tao and Z. Ye, “A novel Chinese sign language recognition method based on keyframe-centered clips,” IEEE Signal Processing Letters, vol. 25, no. 3, pp. 442-446, March 2018.

N. Caporusso, L. Biasi, G. Cinquepalmi, G. F. Trotta, A. Brunetti and V. Bevilacqua, “A wearable device supporting multiple touch- and gesture-based languages for the deaf-blind,” in Advances in Human Factors in Wearable Technologies and Game Design (AHFE 2017), Advances in Intelligent Systems and Computing, vol. 608, T. Ahram and C. Falcão, Eds., 2018.

B. G. Lee and S. M. Lee, “Smart wearable hand device for sign language interpretation system with sensors fusion,” IEEE Sensors Journal, vol. 18, no. 3, pp. 1224-1232, Feb. 2018.

B. Lee and W. Chung, "Wearable Glove-Type Driver Stress Detection Using a Motion Sensor," in IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 7, pp. 1835-1844, July 2017.

T. Starner and A. Pentland, “Visual recognition of American sign language using hidden Markov models,” in Proc. IEEE Int. Conf. Autom. Face Gesture Recognit., 1995, pp. 1–52.

C. Wang, W. Gao, and S. Shan, “An Approach based on phonemes to large vocabulary Chinese sign language recognition,” in Proc. IEEE Int. Conf. Autom. Face Gesture Recognit., 2002, pp. 393–398.

P. Jangyodsuk, C. Conly, and V. Athitsos, “Sign language recognition using dynamic time warping and hand shape distance based on histogram of oriented gradient features,” in Proc. ACM Int. Conf. Pervasive Technologies Related Assistive Environments, 2014, pp. 1–6.

G. Marin, F. Dominio, and P. Zanuttigh, “Hand gesture recognition with leap motion and Kinect devices,” in Proc. IEEE Int. Conf. Image Process., 2014, pp. 1565–1569.

J. Zhang, W. Zhou, X. Chao, J. Pu, and H. Li, “Chinese sign language recognition with adaptive HMM,” in Proc. IEEE Int. Conf. Multimedia Expo., 2016, pp. 1–6.


Published

2022-04-05

How to Cite

[1]
S. Ali A, D. Afreen, N. Ayesha, and R. Revansidha, “Sign Language Based ATM Accessing System for the Visually Impaired”, pices, vol. 5, no. 12, pp. 113-115, Apr. 2022.
