RESNA Annual Conference - 2020

Exploring The Effects Of Individualized Interaction Interventions Using Smart And Assistive Devices For Students With Severe And Multiple Disabilities

Sejin Park1, Juhye Yook2

1,2Department of Rehabilitation Technology, Korea Nazarene University

INTRODUCTION

Communication is based on interaction between individuals. However, people with severe and multiple disabilities have considerable limitations in expressing their intentions and in receiving their conversation partners' messages. These limitations arise because their cognitive and language development is severely delayed and their disabilities are often accompanied by physical, visual, and hearing impairments. Because their expressions are non-symbolic or difficult to interpret, they are frequently ignored by conversation partners and thus tend to give up even basic attempts at communication and interaction.[1]

Students with severe and multiple disabilities express themselves through simple non-symbolic signals, such as facial expressions and body movements, which are characteristic of the early stages of communication.[2] According to Kang's study, children with disabilities come to understand how to interact with others when a conversation partner accepts behaviors such as finger movements, facial expressions, and vocalizations as communication signals and provides contextual feedback; this type of interaction has a great effect on children with disabilities.[3]

Recently, interventions using smart devices have been shown to increase the interest and attention of children with disabilities and to provide immediate interaction feedback through various apps and activities. Tablet PCs are easy to use with touch screens and access devices, respond sensitively to input, and are portable.[4] However, there are few studies on the effects of interaction interventions using tablet PCs or other smart devices with various apps for students with severe and multiple disabilities.[5]

In Korea, discussion of the national special education curriculum for students with severe and multiple disabilities has drawn meaningful attention since 2012.[6] Direct exploration, observation, and evaluation of interventions are critical for developing and applying interactive communication programs in the special education curriculum for these students.[6]

The purpose of this study was to explore how individualized interaction interventions using smart devices, assistive devices, and computer programs increased the voluntary device input actions and social responses of students with severe and multiple disabilities. The specific research questions were as follows. First, what were the effects of individualized interaction interventions with smart devices, assistive devices, and computer programs on the device input actions of students with severe and multiple disabilities? Second, what were the effects of these interventions on the social responses of students with severe and multiple disabilities?

METHODS

Design

Interaction interventions were conducted 9 to 13 times from March to June 2019. The intervention sessions were held once a week for each student after school in a classroom of a special school in Chungnam, Korea. The interventions were conducted by two assistive technology professionals: one was a professor in the department of rehabilitation technology with a doctoral degree in special education and 15 years of teaching experience, and the other was a graduate student in a master's program in rehabilitation technology.

First, the professionals met the three students with severe and multiple disabilities and their homeroom teachers and conducted an ecological evaluation through observations and interviews. After each student's long-term goal was decided, short-term goals were set by evaluating the intervention results of each session for that student. Each student had an individualized interaction intervention program with different intervention activities and goals, so that the interventions could be planned and implemented sensitively to their fluctuating health statuses, concentration and response levels, and moods. Consequently, smart devices, assistive devices, and computer programs were secured for various activities at different levels.

Participants

The criteria for selecting study participants were students who: (a) had severe and multiple disabilities, such as brain lesions, developmental delays, visual impairments, and hearing impairments, and required assistance with most activities of daily living; (b) lacked spoken language, intentional reactions, and communication modes; (c) regularly attended special schools; and (d) had not previously been involved in communication-related interventions. Three students with severe and multiple disabilities attending A special school in Chungnam, Korea, participated in the study. The students were girls between the ages of ten and thirteen. Their communication skills were all at very early levels, and their participation in the classroom was extremely limited. The participants' specific characteristics are shown in Table 1.

Table 1. Participants’ characteristics

Variables | Student A | Student B | Student C
Age / Gender | 11 / Female | 10 / Female | 13 / Female
Major disabilities | Brain lesions | Brain lesions | Brain lesions
Accompanying disabilities | Hearing impairment with a cochlear implant (able to recognize sounds); visual impairment with no eye tracking | Rett's syndrome | Overall developmental delays
Severity | Grade 1 (most severe) | Grade 1 (most severe) | Grade 1 (most severe)
Communication: expression | Nonverbal; laughter, babbling, and turning head | Nonverbal; laughter | Nonverbal; smiling and vocalization
Communication: acceptance | Her name and sounds from surroundings | Her name, a few favorite cartoon characters, and music | Her name and a few favorite sounds
Attention span | 5 min. | 10 min. | 5 min.
Posture and sitting / Movement of upper body | Sits in a wheelchair with an inner seating system because she cannot control her trunk and arms | Sits in a wheelchair; some movement of her neck, eye tracking, and left arm | Sits in a wheelchair with an inner seating system and shoulder belts because of limited control of her trunk, neck, and arms

Equipment

The participants responded when communication partners called their names but did not respond to other spoken messages. Therefore, various smart devices, assistive devices, and computer programs with interaction content and sounds that each participant preferred and responded to were used for the intervention. A tablet PC, a notebook computer, and a smartphone were used as smart devices. The assistive devices used in the interventions were a PCEye eye gaze system, wired and wireless switches, and switch toys. The interactive programs and apps installed on the smart devices were Look to Learn (eye gaze software); cause-and-effect programs featuring animals and plants; interactive sound apps for baby sleeping sounds, animal sounds, drums, piano, a pond, sounds of surroundings and daily living, and sound effects; and Pororo and other cartoon music.

Procedure

A special school in Chungnam, Korea, was contacted to select participants who met the criteria. The school referred three students with their parents' consent documents, and the students' homeroom teachers were interviewed about their socio-demographic information, disability characteristics, preferences, and communication needs. The professionals then evaluated each student's seating status, upper extremity motor skills, communication skills, ability to use the devices, and reactions to and time spent focusing on computer programs.

Each intervention session lasted 30 minutes and was held once a week. In the first five minutes, the assistive technology professionals called the student's name, greeted her while attempting to make eye contact, and played her favorite character music or songs. During the next 20 minutes, the student was provided with interactive activities using smart devices, assistive devices, and computer programs. If the student gave no response, other activities were provided. If she was highly responsive to the activities, varied content was provided and the level of difficulty was gradually increased. In the last five minutes, the assistive technology professionals showed some of the next session's activities, called the student's name while attempting to make eye contact, and said goodbye. Each intervention was video recorded.

After each session ended, the interventions and the students' responses were analyzed to plan the interventions needed for the next session. Once a month, the special education teachers, assistive technology professionals, and speech therapists held a case meeting as a multidisciplinary team to evaluate the suitability of the interventions and discuss their next directions.

Analysis

After the 13 sessions of interventions were completed, the recorded video files were analyzed for the effects of the intervention activities on the students' interactive device input actions and social responses. The number of accurate input actions was divided by the number of all device input actions and multiplied by 100 to calculate the rate of accurate device input actions. Likewise, the number of consistent social responses was divided by the number of all social responses and multiplied by 100 to calculate the rate of consistent social responses. If a student did not show accurate device input actions or consistent social responses, the frequencies of all input actions or social responses were measured.
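Written as formulas, the two rates described above are (the notation is introduced here only for illustration and does not appear in the original analysis protocol):

\[ \text{Accurate device input rate (\%)} = \frac{\text{number of accurate device input actions}}{\text{number of all device input actions}} \times 100 \]

\[ \text{Consistent social response rate (\%)} = \frac{\text{number of consistent social responses}}{\text{number of all social responses}} \times 100 \]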

RESULTS

1) Student A

Figure 1. The social response rates of student A

Student A's interaction intervention initially used a tactile toy and character songs on a tablet PC, but there was no response. The intervention activities were therefore changed to a daily-living sound application on a smartphone, and she showed consistent social responses to sounds such as a hair dryer, a car, and shower water. Consistent social responses to her preferred sounds, including laughter, babbling, wide-open eyes, and head movements, were apparent, although the rates varied from 0% to 100%.

2) Student B

Figure 2. The device input rates of student B

Student B's interaction intervention activity was initially pressing a wireless switch to operate cause-and-effect interaction programs featuring animals and plants. At first, she did not look at the screen and used her palm to press the switch. As the interventions were repeated, she looked at the screen and used her wrist, a finger, or the side of her hand, controlling her hand to press the switch effectively. At this stage, the eye gaze program Look to Learn was introduced with the PCEye eye gaze system. Her eye gaze time to perform the activities increased from 1 second to 3 seconds. Student B's accurate device input rates using the PCEye and a wireless switch increased and were maintained between 73% and 96%.

3) Student C

Figure 3. The frequencies of touch screen and switch inputs

Student C's interaction intervention initially focused on drum, piano, and pond interaction apps using the tablet PC's touch screen. She touched the tablet PC screen but did not show accurate device input actions or consistent social responses. The tablet PC was therefore replaced by a switch toy, a concrete object. She pressed the switch but still did not show accurate device input actions or consistent social responses. She was often absent from school until the 9th session and did not attend school after the 9th session due to health problems. She touched the screen 4 to 12 times a session and pressed the switch 3 to 33 times a session.

DISCUSSION

The results of this study showed positive effects of interaction interventions with smart devices, assistive devices, and interactive computer programs on the device input actions and social responses of individuals with severe and multiple disabilities. Instead of setting fixed goals at an early phase of the individual interventions, variables were modified sensitively through a process of analyzing results and setting the next goals at each session. The development of expressions, together with the continual discovery of the students' preferred activities and their timely application to the interventions based on the students' actual actions and responses, had a positive impact on the interaction outcomes. The limitations of this study were that different dependent and independent variables were analyzed for each student and that the number of participants was small. Replication of this study is needed to verify the results and to establish realistic methods of interaction intervention for these students in the future.

REFERENCES

  1. Beukelman D, Mirenda P. (2013). Augmentative and alternative communication: Supporting children with complex communication needs (4th ed.). Baltimore: Paul H. Brookes.
  2. Ha Na Ji, Su Kjeong Rhie. (2014). Action research on applying "Affective Communication Assessment (ACA)" for a student with severe and multiple disabilities. Zeitschrift fuer paedagogische Forschung, 19(1), 113-139.
  3. Hae Kyung Kang. (2008). A review of the early communication intervention on pre-linguistic period children with severe disabilities. The Korean Journal of Early Childhood Special Education, 8(3), 113-138.
  4. Ku Jong Yoo. (2012). A study on the development of program by using smart phones and tablet PC and its effects on scientific thinking of young children. The Journal of Korea Open Association for Early Childhood Education, 17(3), 85-110.
  5. Hye Rim Lee, So Hyun Lee. (2017). The effects of a mother mediated play activity with educational application on social interaction of young children with autism spectrum disorders. The Korean Journal of Early Childhood Special Education, 17(2), 101-129.
  6. Hye-Ri Kim. (2017). Review of research trends on curriculum for students with severe and multiple disabilities in Korea. Journal of Educational Innovation Research, 27(2), 79-102.
  7. Ji Sun Choi, So Hyun Lee. (2017). Responsive communication intervention based on connection of preschool and home. The Korean Journal of Early Childhood Special Education, 17(2), 79-100.

ACKNOWLEDGEMENT

This research was supported by the National Research Foundation of Korea (NRF) funded by the MSIT (NRF-2018R1A4A1025559) in 2020.