Manuscript received December 20, 2022; revised March 16, 2023; accepted March 22, 2023.
Abstract—This paper describes the implementation of the Kids Can Read application, which assists children with learning disabilities in spelling and reading correctly. The researchers applied Quick Response (QR) code technology to the interaction design. Kids Can Read is installed on a smartphone or tablet and is used together with word cards that have QR codes integrated: a child scans the QR code, and the spelling instruction for the word appears along with its sound. The child can later reopen the application to review the word without re-scanning the QR code. The interface of Kids Can Read is easy to use and kid friendly, and most of the children in this study preferred to use Kids Can Read to help them learn and practice word reading and spelling. When the application was put into practice in schools, the sample in this study showed a statistically significant increase in spelling ability at the 0.05 level. This is consistent with the hypothesis that the ability of Grade 3 students with a learning disability in reading to spell and read words is higher after using Kids Can Read. Hence, we anticipate that all Grade 3 students with learning disabilities in the country will have the chance to use the application by the end of next year.
Index Terms—Assistive technology, augmented reality,
interaction design, learning disabilities, quick response code
Onintra Poobrasert and Sirilak Luxsameevanich are with the Assistive Technology and Medical Devices Research Center, National Science and Technology Development Agency, Pathumthani, Thailand.
Paweena Meekanon is with Trudy Busch Valentine School of Nursing,
Saint Louis University, MO, USA.
*Correspondence: onintra.poo@nstda.or.th (O.P.)
Cite: Onintra Poobrasert*, Sirilak Luxsameevanich, and Paweena Meekanon, "Using the Technique of Interaction Design (IxD) and Augmented Reality (AR) as Assistive Technology for Students with Disabilities," International Journal of Information and Education Technology, vol. 13, no. 8, pp. 1199-1207, 2023.
Copyright © 2023 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).
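The scan-and-review workflow described in the abstract (scan a word card's QR code, show the spelling instruction with its sound, then let the child review the word later without re-scanning) can be outlined in code. The following is a minimal Kotlin sketch of that flow, not the authors' implementation; all names (WordEntry, WordCache, the card-001 payload, and the audio path) are hypothetical placeholders.

```kotlin
// Hypothetical sketch of the scan-and-review flow described in the abstract:
// a QR payload is resolved to a word entry, shown with its spelling steps and
// audio, and cached locally so the child can review it without re-scanning.

data class WordEntry(
    val word: String,                 // the target word printed on the card
    val spellingSteps: List<String>,  // letter-by-letter spelling instruction
    val audioFile: String             // path of the pronunciation recording
)

class WordCache {
    private val reviewed = mutableMapOf<String, WordEntry>()

    // Called after a successful QR scan: keep the entry for later review.
    fun remember(qrPayload: String, entry: WordEntry) {
        reviewed[qrPayload] = entry
    }

    // Called from the review screen: no re-scan needed for words seen before.
    fun recall(qrPayload: String): WordEntry? = reviewed[qrPayload]

    fun allReviewedWords(): List<WordEntry> = reviewed.values.toList()
}

fun main() {
    // Hypothetical lookup table that a QR payload would index into.
    val dictionary = mapOf(
        "card-001" to WordEntry("cat", listOf("c", "a", "t"), "audio/cat.mp3")
    )
    val cache = WordCache()

    // 1) The child scans a word card; the QR payload identifies the word.
    val payload = "card-001"
    dictionary[payload]?.let { entry ->
        println("Word: ${entry.word}, spell: ${entry.spellingSteps.joinToString("-")}")
        cache.remember(payload, entry)  // 2) cache it for later practice
    }

    // 3) Later, the child reviews the word without re-scanning the card.
    val recalled = cache.recall("card-001")
    println("Recalled without re-scan: ${recalled?.word}")
    println("Review list: ${cache.allReviewedWords().map { it.word }}")
}
```

In an actual mobile implementation the dictionary lookup would likely be backed by the application's word database and the cache persisted on the device, which is what allows review without re-scanning as the abstract describes.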