Whether in daily life, study, or work, reading is one of the main ways to obtain information. However, traditional books are mostly printed on paper and are not suitable for visually impaired people to read. Although this situation can be partly improved by Optical Character Recognition (OCR) technology, the quantity and timeliness of such resources still fall far short of the information sources available to the general public. While E-books have become increasingly popular, the lack of consideration for the special needs of the visually impaired, as well as the lack of appropriate reading systems, means that visually impaired readers still face many difficulties in E-book reading.
In this project, we propose a voice reading system for the visually impaired. The system provides an accessible E-book reading environment built on content parsing and speech synthesis technologies. In addition, an accessible E-book reader app with an interface designed for visually impaired users was developed. The system supports commonly used E-book formats, offers a better, faster, and more convenient E-book reading environment for the visually impaired, and alleviates the problems of insufficient quantity and poor timeliness. This improved E-book reading system would help visually impaired people in E-learning and information access.
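The pipeline described above (parse E-book content, then synthesize speech) can be illustrated with a minimal sketch. This is not the authors' actual implementation: it assumes the common EPUB format (a ZIP archive of XHTML chapters), and the `speak()` stub stands in for whatever speech-synthesis engine the reader app would call.

```python
# Illustrative sketch of an E-book voice-reading pipeline:
# 1) parse an EPUB (a ZIP of XHTML chapters) into plain text,
# 2) hand the text to a speech-synthesis engine.
# All names here are hypothetical, not the system described in the paper.
import zipfile
from html.parser import HTMLParser
from io import BytesIO


class _TextExtractor(HTMLParser):
    """Collect visible text from XHTML, skipping script/style content."""

    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def epub_to_text(epub_bytes):
    """Return the concatenated visible text of every XHTML chapter."""
    text = []
    with zipfile.ZipFile(BytesIO(epub_bytes)) as zf:
        for name in sorted(zf.namelist()):
            if name.endswith((".xhtml", ".html")):
                parser = _TextExtractor()
                parser.feed(zf.read(name).decode("utf-8"))
                text.extend(parser.chunks)
    return " ".join(text)


def speak(text):
    """Placeholder for a real text-to-speech call (platform speech API)."""
    print(f"[TTS] {text}")
```

A real accessible reader would additionally expose navigation (chapter, paragraph, sentence) and playback controls through the app's interface; the sketch covers only the parse-and-speak core.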
Hsiao Ping Lee, Chung Shan Medical University, Taiwan
Tzu-Fang Sheu, Providence University, Taiwan
I-Wen Huang, Chung Shan Medical University, Taiwan
Stream: Education & Difference: Gifted Education
This paper is part of the ACEID2020 Conference Proceedings.
To cite this article:
Lee H., Sheu T., & Huang I. (2020). A Voice E-book Reading System Designed for the Visually Impaired People. ISSN: 2189-101X – The Asian Conference on Education & International Development 2020 Official Conference Proceedings. https://doi.org/10.22492/issn.2189-101X.2020.17
To link to this article: https://doi.org/10.22492/issn.2189-101X.2020.17