Information Technology Augmentative and Alternative Communication Using Smart Mobile Devices

Authors

Iurii Krak, Ruslan Bahrii, Olexander Barmak

DOI:

https://doi.org/10.13052/jmm1550-4646.1743

Keywords:

Communication technology, sign language, leap motion, hand model, dynamic gestures, non-contact text entry

Abstract

The article describes an information technology for alternative communication implemented through non-contact text entry using a limited number of simple dynamic gestures. Non-contact text entry technologies and motion tracking devices are analysed. A model of the human hand is proposed that provides information on the position of the hand at each moment in time, and the parameters sufficient for recognizing static and dynamic gestures are identified. The process of calculating the features of the various movement components that occur when performing dynamic hand gestures is considered. Common methods for selecting letters in non-contact text entry are analysed. To implement the user interaction interface, a radial virtual keyboard with keys containing grouped alphabet letters is proposed. A functional model and a model of human-computer interaction for non-contact text entry have been developed, which enabled the development of an easy-to-use software system for alternative communication implemented through non-contact text entry with hand gestures. The developed software system provides a communication mechanism for people with disabilities.
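
The radial keyboard described above can be illustrated with a short sketch. The following minimal Python example is not the authors' implementation: the eight-sector layout, the letter grouping, and the 40 mm rest-zone radius are assumptions made for illustration. It shows how a 2D hand displacement, such as a palm position reported by a motion tracker like the Leap Motion controller, could be mapped onto one of several keys that each hold a group of letters; a second gesture would then disambiguate the letter within the selected group.

    # Illustrative sketch (not the paper's exact layout): mapping a 2D hand
    # displacement onto a radial virtual keyboard with grouped-letter keys.
    import math
    import string

    SECTORS = 8                              # assumed number of radial keys
    LETTERS = string.ascii_uppercase
    GROUPS = [LETTERS[i::SECTORS] for i in range(SECTORS)]  # e.g. 'AIQY', 'BJRZ', ...

    MIN_RADIUS = 40.0  # mm; displacements smaller than this stay in the rest zone

    def select_group(dx: float, dy: float):
        """Return the letter group whose sector contains the hand displacement
        (dx, dy) relative to the keyboard centre, or None inside the rest zone."""
        radius = math.hypot(dx, dy)
        if radius < MIN_RADIUS:
            return None
        angle = math.atan2(dy, dx) % (2 * math.pi)      # angle in 0..2*pi
        sector = int(angle / (2 * math.pi / SECTORS)) % SECTORS
        return GROUPS[sector]

    if __name__ == "__main__":
        # A hand moved roughly 60 mm up and to the right selects one sector's
        # letter group; a second gesture would pick a letter inside that group.
        print(select_group(45.0, 40.0))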

Author Biographies

Iurii Krak, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine

Iurii Krak, Full Doctor (1999), Full Professor (2000), Corresponding Member of the National Academy of Sciences of Ukraine (NASU). He is currently Head of the Theoretical Cybernetics Department at Taras Shevchenko National University of Kyiv (Ukraine) and Principal Researcher at the V.M. Glushkov Cybernetics Institute of NASU. His scientific interests lie in the fields of data and image processing, information classification and recognition, robotics, artificial intelligence, sign language modeling and recognition, facial emotion recognition, NLP, HCI, etc.

Ruslan Bahrii, Khmelnytskyi National University, Khmelnytskyi, Ukraine

Ruslan Bahrii obtained his bachelor’s and master’s degrees in mechanical engineering technology from the Technological University of Podillya (Khmelnytskyi) in 2000 and 2001, respectively, and the Candidate of Technical Sciences degree (PhD) from Ternopil National Economic University in 2018. He is currently an Associate Professor at the Department of Computer Science and Information Technologies, Faculty of Programming, Computer and Telecommunication Systems, Khmelnytskyi National University. His research areas include information technology for alternative communication, nonverbal communication, mobile multimedia, text prediction methods, statistical language models, and beyond.

Olexander Barmak, Khmelnytskyi National University, Khmelnytskyi, Ukraine

Olexander Barmak, Full Doctor (2013), Full Professor (2015). He is currently Head of the Computer Science and Information Technologies Department at Khmelnytskyi National University (Ukraine). His research interests include: (1) development and improvement of methods for the classification and clustering of information, and theoretical and applied foundations of human-oriented information technologies based on the principles of ethics and trust in artificial intelligence; (2) information technology for modelling virtual reality problems, including the development of new types of human-computer interfaces for Augmentative and Alternative Communication.

Published

2021-06-21

How to Cite

Krak, I., Bahrii, R., & Barmak, O. (2021). Information Technology Augmentative and Alternative Communication Using Smart Mobile Devices. Journal of Mobile Multimedia, 17(4), 527–554. https://doi.org/10.13052/jmm1550-4646.1743

Issue

Vol. 17 No. 4 (2021)

Section

Mobile Communication and Computing for Internet of Things and Industrial Automation