Systematic Review Of Text Entry Methods For People With Physical Impairments

Heidi Horstmann Koester, PhD1 and Sajay Arthanat, PhD, OTR/L, ATP2

1Koester Performance Research, Ann Arbor MI and 2University of New Hampshire, Durham NH

PURPOSE

A wide variety of computer access methods to support text entry is available for people with physical impairments, including standard keyboards and associated adaptations, speech recognition systems, on-screen keyboards, one- or two-switch scanning input, brain-computer interfaces, and others.  Choosing the most appropriate method, or combination of methods, for an individual is a critically important and complex process.

Because impairments and motor skills can vary widely across individuals, finding the best access method for someone is an individualized process that focuses heavily on the user’s specific strengths and limitations.  However, practitioners and users can also benefit from understanding the external evidence, i.e., the published outcomes for similar clients with similar needs. External evidence can provide insight into expectations for long-term performance and learning times, as well as comparisons between different candidate systems.

The purpose of this study is to perform a systematic literature review in order to develop a better understanding of the typing speeds that text entry methods provide for people with physical impairments.  This is the first step in a longer-term project to organize and synthesize the available literature in the area of computer access.

METHODS

Search Strategy

To systematically search the literature, we used the approach outlined by Schlosser et al. (2005).  The guiding question for the search was: “What are the reported speed and accuracy of various text entry methods that are relevant to people with physical limitations?”  We identified the Person, Intervention, and Outcome components of the question to generate search keywords:

  1. Person: categories related to disability such as “Disabled Persons” and “Motor Skills Disorders” and specific diagnoses that typically produce physical impairments
  2. Intervention: categories related to computer access methods, such as “Assistive Technology,” “Communication Aids for Disabled” and keywords specific to particular types of access (e.g., “mouthstick”)
  3. Outcome: keywords related to text entry rate, speed, and accuracy, as well as a general “Outcome” category.

We defined a complex search string for each component, using specific categories from each database’s thesaurus as well as common keywords.  The full search string was formed by combining the three components as Person AND Intervention AND Outcome.  For the complete search string, see the supplementary material for this paper (Koester, 2016).
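
As a purely illustrative sketch (not the actual search string used in the review, which appears in the supplementary material), the three components might be combined along these lines; the specific diagnosis and method keywords shown here are placeholders:

```python
# Hypothetical illustration of combining the three search components.
# The real keywords and thesaurus categories are listed in Koester (2016).
person = '("Disabled Persons" OR "Motor Skills Disorders" OR "cerebral palsy" OR "spinal cord injury")'
intervention = '("Assistive Technology" OR "Communication Aids for Disabled" OR mouthstick OR "on-screen keyboard")'
outcome = '("text entry rate" OR "typing speed" OR accuracy OR outcome)'

full_search_string = f"{person} AND {intervention} AND {outcome}"
print(full_search_string)
```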

We searched 10 databases in December 2015: PubMed, PEDro, OTseeker, ERIC, DARE, Cochrane, Google Scholar, ACM, CINAHL, and PsycINFO.  We manually sifted through the results of each search based on the title and abstract, keeping citations that focused on at least one assistive device or technology to enhance computer use or communication.  This initial search-and-sift phase yielded 617 articles.

Study Selection

We performed a second round of abstract screening to exclude abstracts where the dependent variables mentioned did not include some measure of text entry performance.  (Abstracts that didn’t include specific dependent variables were retained at this point.)  We also excluded abstracts that focused exclusively on pointing performance rather than typing, such as target acquisition with alternative mouse control.  This stage retained 362 articles.

We reviewed the full text of those 362 articles.  We kept the 143 articles that reported typing speed either in words per minute (WPM) or as a measure that could be converted to WPM.  For this paper, we sorted the articles based on whether at least one of the access methods in the paper is readily available for consumer use (N=91) or not (N=52); our focus here is only on those access methods that are actually available for use.  A brief manual search process added 11 more articles, and the resulting 102 articles on available access methods form the core database of literature related to our guiding question.

Organization of Evidence

We extracted data from the 102 studies into a spreadsheet based on the critical review form of the McMaster University Evidence-Based Practice Group (Law et al., 1998).  For each study, we extracted specific features related to the study purpose and design, participant characteristics and sample size, the access method(s), text entry measurement procedures, and quantitative typing speed results.
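
To illustrate, the following is a minimal sketch of the kind of record extracted for each study; the field names are hypothetical and simply mirror the features listed above, not the exact items on the McMaster form:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StudyRecord:
    """One row of the extraction spreadsheet (hypothetical field names)."""
    citation: str                    # e.g., "Koester 2004"
    purpose_and_design: str          # study purpose and design
    participants: str                # participant characteristics (diagnoses, experience)
    n_subjects: int                  # sample size
    access_methods: List[str]        # e.g., ["standard keyboard"]
    measurement_procedure: str       # how text entry performance was measured
    avg_wpm: Optional[float] = None  # quantitative typing speed results
    sd_wpm: Optional[float] = None
    min_wpm: Optional[float] = None
    max_wpm: Optional[float] = None
```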

We then did a validation check to remove any studies that: a. don’t actually include available technology (N=3); b. are simulations or only cite WPM from other studies (N=6); c. are duplicates of another study (N=2); or d. provide only anecdotal or unclear typing speed measurements (N=8).  This removed 19 studies from the database, leaving 83.

Finally, we removed the studies that included only able-bodied participants as subjects.  While the data from that population can in some cases provide a relevant benchmark, for this initial analysis, we wanted to include only those studies that measured directly from the population of interest.  The resulting database for this paper includes 56 studies (Koester, 2016).

Quantitative Analysis

For this paper, we organized the studies by access method (standard keyboard, automatic speech recognition (ASR), Morse code, direct selection on-screen keyboard (OSK), scanning OSK, brain-computer interface (BCI), and other) and by control site (e.g., hands, head, eyes).

We then analyzed each access method, beginning with the standard keyboard.  We created a spreadsheet containing the data from each study that involved standard keyboard input.  For studies that reported results across a group of subjects, we entered the average typing speed, along with the range and standard deviation where available.  For results reported individually for each subject, we computed the average typing speed, range, and standard deviation across those subjects.  If results were not reported directly in words per minute, we converted them to WPM by assuming 5 letters per word.  For some studies, we estimated the range from figures provided in the original article.
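
A minimal sketch of these two calculations, assuming a character-based rate is converted at 5 letters per word and that individually reported subject speeds are collapsed into a sample mean, range, and standard deviation:

```python
import statistics

def chars_per_min_to_wpm(chars_per_min, letters_per_word=5.0):
    """Convert a character-based entry rate to words per minute."""
    return chars_per_min / letters_per_word

def summarize_subjects(wpm_values):
    """Collapse individually reported subject speeds into study-level statistics."""
    return {
        "n": len(wpm_values),
        "avg": statistics.mean(wpm_values),
        "min": min(wpm_values),
        "max": max(wpm_values),
        "sd": statistics.stdev(wpm_values) if len(wpm_values) > 1 else 0.0,
    }

# Example: a study reporting four subjects individually (hypothetical values)
print(summarize_subjects([5.6, 12.4, 16.0, 10.6]))
```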

Using the sample size, average typing speed, and standard deviation, it is possible to combine the results mathematically across studies, using meta-analysis techniques.  We sorted the standard keyboard studies into four categories, roughly corresponding to the body site used for typing: cervical spinal cord injury (SCI) upper extremity (UE), mixed diagnosis UE, mouthstick or headstick, and foot.  We then computed the combined average and standard deviation of typing speed for each of these groups, and across the entire set of studies (Chang, 2016).
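
A minimal sketch of this pooling step, assuming the standard combined-groups formulas (a weighted mean, plus within-group and between-group sums of squares); the cited utility (Chang, 2016) may implement it somewhat differently:

```python
import math

def combine_groups(groups):
    """Combine (n, mean, sd) triples from several studies into an overall
    sample size, mean, and standard deviation."""
    total_n = sum(n for n, _, _ in groups)
    grand_mean = sum(n * m for n, m, _ in groups) / total_n
    # Sum of squares = within-group part + between-group part
    ss = sum((n - 1) * sd ** 2 + n * (m - grand_mean) ** 2 for n, m, sd in groups)
    combined_sd = math.sqrt(ss / (total_n - 1))
    return total_n, grand_mean, combined_sd

# The five UE cervical SCI studies from Table 3: (N, average WPM, SD)
sci_studies = [(30, 13.8, 7.2), (24, 8.0, 2.22), (1, 7.7, 0.0), (15, 10.76, 7.22), (6, 23.2, 6.8)]
print(combine_groups(sci_studies))  # -> (76, ~12.03, ~7.17), matching the Table 3 totals
```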

RESULTS

 

Table 3. An expanded view of Table 2, showing each study in four general categories for standard keyboard studies. Statistics are for words per minute typing speed.
UE Cervical Spinal Cord Injury N Avg SD Min Max
Pouplin 2016 C6-8 30 13.8 7.2 – –
Pouplin 2015 C5-8 24 8 2.22 6 14
Koester 2015 1 7.7 0 – –
Alcantud 2006 15 10.76 7.22 – –
Koester 1994 C4-6 6 23.2 6.8 – –
Total for UE Cervical SCI: 76 12.03 7.17 <= 6 >= 23.2
UE Mixed Diagnosis N Avg SD Min Max
Koester 2015 9 5.28 2.93 1.3 9.7
Koester 2015 4 26.4 14.79 12.6 39
Pires 2012 CP, keyguard 1 6.34 0 6.34 6.34
Chiaparrino 2011 CP, keyguard 1 1.5 0 1.5 1.5
Tam 2009 10 9.8 6.96 2.5 22
Mezei 2009 4 16.15 11.83 5.6 32.6
Garrett 2008 5 21.5 17.27 6 48
Koester 2007 11 9.9 9.47 1.8 35.22
Mezei 2005 3 12.4 4.61 7.2 16
Koester 2004 18 15 11.46 3.5 32.2
Tumlin 2004 5 8.63 5.64 2.5 14
Tam 2002 4 8.56 2.63 5.64 11.76
Manaris 1998 1-hand 1 29.16 0 29.16 29.16
Total for UE Mixed: 76 12.55 10.73 1.3 48
Mouthstick N Avg SD Min Max
Pouplin 2015 C4 1 10 0 10 10
Koester 2007 CP 1 7.06 0 7.06 7.06
Devries 1998 SCI 1 5 0 5 5
Devries 1998 GBS 1 6.4 0 6.4 6.4
Manaris 1998 1 13.68 0 13.68 13.68
Lau 1993 SCI, DMD 4 8 3.46 3.6 11.2
Smith 1989 post-polio 1 22.5 0 22.5 22.5
Pires 2012 CP, headstick 1 10.28 0 10.28 10.28
Total for Mouthstick: 11 9.72 5.17 3.6 22.5
Foot N Avg SD Min Max
Nguyen 2012 congenital armless 1 30 0 30 30
Pires 2012 CP, 1-foot 1 8.64 0 8.64 8.64
Total for Foot: 2 19.32 15.10 8.64 30
OVERALL TOTAL: 165 12.14 8.94 1.3 48

The 56 studies in the database include a total of 464 subjects with physical impairments. Seventeen studies also include some subjects who don’t have impairments.  The most common number of subjects in a study was one (16 studies), while the median number of subjects with physical impairments in a study was four.  In most studies (N=43), subjects were regular, experienced users of at least one of the access methods used.

Table 1 shows the number of studies for each basic type of access method.  Most studies (N=36) include only one basic access method, but may compare two or more “flavors” of that access method (e.g., with and without word prediction). 

Table 1. The access methods involved in the database of 56 studies.  OSK = On-screen Keyboard.  Note that a study can include more than one access method.

Access Method N of Studies
Standard keyboard 21
Speech recognition 7
OSK (direct selection) 19
OSK (scanning selection) 17
Morse code 2
Brain-computer interface 1
Other (tongue keypad, EdgeWrite) 8

At the time of this writing, we have completed the quantitative analysis for the standard keyboard studies only.  By the time of the presentation, we will have results for the remaining access methods as well.

Typing on Standard Keyboard

The standard keyboard is the largest and most diverse grouping of studies in the database.  It includes all forms of typing directly on a physical keyboard, such as the use of multiple fingers on both hands, bilateral typing with pencils or splints, one-finger typing, foot typing, and mouthstick typing.  Table 2 summarizes the typing speed data reported across the 21 standard keyboard studies, and Table 3 provides more detail for the studies within each category.

DISCUSSION

This systematic literature review provides a good start at organizing and understanding what is known about performance for typists with physical impairments.  This discussion focuses on what the results from the standard keyboard studies may tell us.

For the standard keyboard studies, there is a reasonable amount of consistency in the methods used.  Almost all of the data reflect experienced typists, performing typical text copy tests using letter-by-letter typing.

While the methods may be fairly consistent, the reported typing speeds are not.  Speeds range from a low of 1.3 wpm for an individual with cerebral palsy using one finger to a high of 48 wpm for a high school student with spinal muscular atrophy (see Table 3).  Given such a large range, are there ways to apply this evidence effectively?

First, the slower end of the range is a reminder that typing on a keyboard doesn’t necessarily yield productive results. Seven of the studies had minimums below 4 wpm, and all but one of these involved upper extremity typing.  It’s important to take measurements from individuals during and after assessments, in order to identify people who may be struggling and accommodations that could improve their performance.

Second, the higher end of the range highlights the fact that severe physical impairment doesn’t necessarily mean slow typing.  It is not unusual for someone with a C5 SCI, using bilateral typing splints, to type at 20 wpm or more. Some mouthstick typists can also achieve this level of speed.  This can help define aspirational goals for some users, while ongoing measurements can track progress toward those goals.

Third, even with the wide range in observed typing speed, the average speed for typists using their upper extremities is quite consistent across the cervical SCI and mixed diagnosis groups, at 12.03 and 12.55 wpm, respectively.

Table 2. Combined typing speed results for the 21 studies involving use of the standard keyboard. SD = Standard Deviation. (Note: some studies include subjects in more than one category.)
Category N studies N subjects Average WPM SD Range
UE Cervical SCI 6 76 12.03 7.17 (<= 6, >= 23.2)
UE Mixed Diagnosis 12 76 12.55 10.73 (1.3, 48)
Mouthstick (+ 1 headstick) 7 11 9.72 5.17 (3.6, 22.5)
Foot 2 2 19.32 15.10 (8.64, 30)
Overall 21 165 12.14 8.94 (1.3, 48)

Challenges

Performing a systematic review in this area is a challenge, since relevant results may be scattered across varied media in a wide range of fields, including rehabilitation, medicine, education, engineering, human-computer interaction, and assistive technology. The article by Schlosser et al. (2005) provided key guidance to the process, such as how to form effective search strings and which databases to search.

Even with this structured approach and a large number of relevant articles, we have likely missed some studies at this early stage, especially from conference proceedings such as RESNA, which are not usually included in literature databases.  We need to do a wider manual search, as well as a backward citation search, to expand the coverage of our database.

Future Work

One need is to work toward better-defined and replicable methods in computer access research.  While most articles mentioned the basic methodological elements, important details were almost always omitted.  A common structure for performing text entry studies would provide a stronger platform for accumulating results across studies over time.

Our current database of articles is a good foundation for addressing a variety of questions about computer access methods and for identifying gaps in the literature to guide future research. We’d like to find effective ways of maintaining and leveraging this database over time, possibly by allowing others to add studies and use the database to explore their own questions.

REFERENCES

1. Chang A. (2016). Utility to combine statistics from multiple groups into one group.  Retrieved February 12, 2016 from statstodo.com/ComMeans_Pgm.php.

2. Koester H. (2016). Citations for the 56 reviewed studies and additional search details can be found at kpronline.com/pubs.php.

3. Law M, Stewart D, Pollock N, Letts L, Bosch J, Westmorland M. (1998). Guidelines for Critical Review Form – Quantitative Studies. Retrieved from http://srs-mcmaster.ca/research/evidence-based-practice-research-group/.

4. Schlosser RW, Wendt O, Angermeier KL, Shetty M. (2005). Searching for evidence in augmentative and alternative communication: Navigating a scattered literature. Augmentative and Alternative Communication, 21(4), 233-255.