
Accessible But Not Usable: Improving Practices For Surveying People With Disabilities
Irina Nikivincze (Georgia Institute of Technology, College of Design)*; Julie Ancis (Georgia Institute of Technology)

Abstract

This paper reports on survey strategies employed in a large-scale research study of the impact of accessible technology (e-books) on students’ academic success. Participants were approximately 400 racial and ethnic minority students with print-related disabilities enrolled at 53 Minority Serving Institutions in the United States. To identify the most accessible online survey platform for our research, we conducted an accessibility evaluation of four commercial platforms (Qualtrics, SurveyMonkey, SurveyGizmo, FluidSurveys) from 2015 to 2018. We discuss the results of this evaluation and the challenges encountered, and we review the continuum of accessibility and the question of how much accessibility makes a survey usable by people with disabilities. Since online survey tools are increasingly used, it is important to understand their capabilities and limitations.

Introduction

Surveys, broadly defined as organized templates for gathering information by asking people questions, are popular tools for scientific and non-scientific inquiries.  Surveys are used to gather a range of individual feedback, for example, opinions, impressions, experiences, attitudes, or satisfaction. A more scientific approach, survey research, uses research methodology to design, sample, and collect information “for the purposes of constructing quantitative descriptors of the attributes of the larger population of which the entities are members” (Groves et al., 2009, p. 2). Historically, people with disabilities were excluded from many of these inquiries, often by “design” (Wilson et al., 2013).

In 1998, the National Council on Disability, in its report “Reorienting Disability Research,” criticized the research community for excluding people with disabilities from all stages of research (National Council on Disability, 1998). The report, which is still relevant today, makes a number of useful recommendations, starting with defining disability as “an interaction between an individual with an impairment and the environment rather than as a deficit of an individual.” Building on the report, Mulhorn (2006) offered five considerations for evaluating whether a survey is inclusive of and accessible to people with disabilities (PWDs):

  1. Were PWDs involved in planning or developing the survey?
  2. Were sampling strategies implemented to increase representativeness of PWDs (e.g., screening)?
  3. Was the survey instrument produced in multiple formats to accommodate PWDs (e.g., TTY, large type)?
  4. What efforts were made to produce the results in multiple formats for dissemination?
  5. Were questions asked to elicit information about environment and/or participation?

The design of data collection activities often does not account for differences in participants’ ability to see, hear, understand, and respond to the questions asked. Survey accessibility is an essential requirement for including people with disabilities. Although a number of publications over several decades discuss creating inclusive surveys, design considerations and practical guidelines for accessible survey development remain relatively difficult to locate (Chamie, 1989; Henry et al., 2006). This is particularly the case for online surveys.

Historically, people with disabilities were considered a “special population,” and specific strategies were employed to collect data from that population. Since paper-and-pencil surveys were often not accessible, researchers collected data in person or via telephone, and sometimes via proxy, a third person who could speak on behalf of a person with a disability (Hasnain et al., 2015). To learn what format works best for people with visual impairments, the New Jersey Commission for the Blind and Visually Impaired conducted a study using a multi-modal design in which participants completed a survey in one of four self-administered formats: large-print document, Braille, computer disk, or audio tape (Murray, 2006). The researchers concluded that self-administered survey designs can be used with people with visual disabilities and that a combination of large-print and telephone modes provides adequate access (Murray, 2006).

Some of the early efforts to survey people with disabilities, such as the California Disability Survey (1981), used pioneering approaches such as telephone and computer-based data collection procedures. A number of persistent and recurring issues were identified, such as 1) the difficulty of, and resources needed for, identifying and locating a small target group of people with disabilities, or with specific disabilities, in the general population; 2) changing the content of questions for specialized subgroups; and 3) using computers to handle the complexity of item branching (Shanks, Nicholls, & Freeman, 1981). In addition to automating branching, the use of computers allowed researchers to schedule callbacks, pre-screen respondents, queue appropriate questions for a later point in the survey, digitally store open-ended questions and demographic items, automatically insert the text of pre-coded answers from previous questions, store both English and Spanish versions of the instrument and responses, restart interrupted interviews at any point, and go back and correct previous questions (Shanks et al., 1981). Many innovative features of telephone and computer-assisted surveys are now taken for granted given the availability of online survey platforms.

Online Survey Platforms

Until recently, online survey providers had limited expertise in accessibility and did not fully follow the accessibility guidelines of WCAG 2.0, Section 508, or the WAI-ARIA recommendations. Gottliebson, Layton, and Wilson’s (2010) study of 11 popular online survey platforms for general WCAG compliance found that all had accessibility issues and were not “usable” by many assistive technology users. Only one survey platform worked reliably with screen-reading software, while the rest had varying degrees of non-compliance (Gottliebson et al., 2010). While many vendors claim that their products are accessible, this is not always the case (Gottliebson et al., 2010). Few vendors conduct usability testing with people with visual disabilities (Byerley, Chambers, & Thohira, 2007).

The implementation of online services is constantly evolving, and as developers fix issues, the title of “most accessible platform” shifts from one company to another. In 2008, the Web Accessibility Center at Ohio State University assessed the degree to which online survey tools were accessible by keyboard and by screen reader. Of the six online tools examined (SurveyGizmo, SurveyMonkey, Zoomerang, Checkbox, LimeSurvey, Snap Survey Professional), the highest grade of B+ was awarded to SurveyGizmo (Hasnain et al., 2015). Other reviews, by Gottliebson and colleagues (2010) and by Ken Petri in 2010 and 2012, gave the highest marks to SurveyMonkey for a reasonable compromise between accessibility and functionality. Audits often reveal varying degrees of accessibility and usability issues, making it difficult for users to decide whether a platform is actually accessible and reliable for data collection.

Below we report strategies used in the design of surveys for a study of the impact of accessible textbooks on student achievement. Our study monitored the accessibility of commercially available platforms (Qualtrics, SurveyMonkey, SurveyGizmo, FluidSurveys) from 2015 to 2018. In addition, we collected student feedback regarding the accessibility of the surveys. We also summarize specific accessibility issues encountered in the various online platforms and whether they were resolved by us or by the vendor (see Table 1 for an abridged version).

Accessible Survey Design

Survey Design

The design of an accessible survey begins with a consideration of the target population and its characteristics. Some modifications, such as streamlining the language, instructions, question types, and scales, need to be made during the survey design stage, while others can be left to survey implementation and testing. The survey must be understood by respondents, which requires clear and simple language for questions and instructions. It is important that question types be familiar and accessible to all users. Further, researchers should consider the relevance of questions to people with disabilities and solicit feedback about their questions from the studied population. Previously validated instruments may need to be revised for one’s audience. The length of the instrument or of its scales may strain respondents. We revised all scales for our instruments to be 1) short, no longer than seven points; 2) explicit, with numeric and verbal categories presented next to each other (see the sketch below); and 3) consistent across instruments when possible. Consistent with prior studies and our own, it is important to keep survey completion time under 20 minutes or risk incomplete or careless responses.
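
To illustrate the scale guidelines above, here is a minimal TypeScript sketch (all names are hypothetical, not any survey platform’s API) of how a short, explicit scale can be defined once and reused across instruments:

```typescript
// Hypothetical sketch: one shared scale definition reused across instruments,
// so every survey shows the same numeric value next to the same verbal label.
interface ScalePoint {
  value: number; // numeric category shown to the respondent
  label: string; // verbal category shown next to the number
}

// A single 7-point agreement scale, kept short and explicit.
const AGREEMENT_SCALE: ScalePoint[] = [
  { value: 1, label: "Strongly disagree" },
  { value: 2, label: "Disagree" },
  { value: 3, label: "Somewhat disagree" },
  { value: 4, label: "Neither agree nor disagree" },
  { value: 5, label: "Somewhat agree" },
  { value: 6, label: "Agree" },
  { value: 7, label: "Strongly agree" },
];

// Render each option as "1 - Strongly disagree", ... so numeric and verbal
// categories always appear together, consistently across instruments.
const optionText = (p: ScalePoint): string => `${p.value} - ${p.label}`;

AGREEMENT_SCALE.forEach((p) => console.log(optionText(p)));
```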

Table 1: Summary of Accessibility Issues

Accessibility Issue | Online Survey Platform
Navigation: “invisible” cells/spaces in tables were read by screen readers | Qualtrics (resolved)
Navigation: visibility of keyboard focus indicator | SurveyGizmo (resolved); Qualtrics (inconsistent)
Navigation: tabbing order issues | SurveyGizmo (resolved); Qualtrics (still issues)
Operability: default contrast ratio of survey navigation buttons, some themes | User fix: SurveyGizmo; Qualtrics; SurveyMonkey
Operability: progress bar not read by some screen readers | Qualtrics (still issues)
Operability: radio buttons and check boxes not confirmed as checked | Qualtrics (resolved); SurveyMonkey (resolved)
Operability: radio buttons read twice | Qualtrics (resolved); SurveyGizmo (resolved)

Implementation

In our experience, most accessibility issues arise during the implementation stage. When using online survey platforms, researchers rely on the platform’s implementation of and compliance with web accessibility standards, as well as on their own knowledge of best practices. Common adjustments include changes to the survey layout, color contrast, and the presentation of sections, questions, and scales.
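
For example, contrast adjustments can be checked against the contrast-ratio formula published in WCAG 2.0. The following TypeScript sketch implements that formula directly (it is not any platform’s API):

```typescript
// Sketch of the WCAG 2.0 contrast-ratio computation used to check survey
// buttons and themes; colors are given as [R, G, B] with channels 0-255.

// Linearize one sRGB channel per the WCAG relative-luminance definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a color (WCAG 2.0 definition).
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio of two colors: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(
  a: [number, number, number],
  b: [number, number, number]
): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Example: light-gray button text on a white background fails the WCAG AA
// threshold of 4.5:1 for normal text, a common default-theme problem.
const ratio = contrastRatio([153, 153, 153], [255, 255, 255]); // ~2.85
console.log(ratio >= 4.5 ? "passes AA" : "fails AA");
```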

Validation and Error Handling

If a survey includes required fields or validates the format of some responses, it is important to make sure that the instructions are explicit about the format and data-entry expectations (e.g., “Required field” or “Please provide a numeric response”). While it is best to avoid validation where possible, if one must have it, errors should be properly conveyed to screen readers, as sketched below. The absence of an audible alert and an explanation when a mistake has been made can be a frustrating experience for a survey taker.
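
A minimal sketch of how a validation error can be exposed to screen readers using standard WAI-ARIA attributes; the element IDs and helper name are hypothetical, not any survey platform’s API:

```typescript
// Sketch: expose a validation error to assistive technology using standard
// WAI-ARIA attributes. Element IDs and the function name are hypothetical.
function showFieldError(fieldId: string, message: string): void {
  const field = document.getElementById(fieldId);
  const error = document.getElementById(`${fieldId}-error`);
  if (!field || !error) return;

  // Mark the field as invalid and point it at the explanation text.
  field.setAttribute("aria-invalid", "true");
  field.setAttribute("aria-describedby", error.id);

  // role="alert" makes screen readers announce the message immediately,
  // instead of leaving the user to discover the error visually.
  error.setAttribute("role", "alert");
  error.textContent = message;
}

// Usage: announce an explicit, format-specific instruction.
showFieldError("age", "Required field. Please provide a numeric response.");
```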

Providing Multiple Ways to Complete a Survey

Providing multiple means of taking a survey, and of obtaining help if there are problems, can resolve many unanticipated issues. Assistance should be readily available and timely, and respondents should be told how to obtain it. In our survey, we provided contact information at the bottom of each page.

Testing

Testing can help identify technical and design issues. What looks “good” to a sighted user may not be accessible to a person with visual or physical impairments. Survey navigation is a good example: the keyboard focus is not always perceivable, the tabbing order can be off, screen readers may pick up hidden elements, and headings are not always properly used. Some of these issues can be caught with simple automated checks before testing with users, as sketched below.
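
As an illustration (a sketch against standard DOM APIs, not a substitute for testing with assistive technology users), two of the issues above can be detected programmatically:

```typescript
// Sketch of simple automated checks for two of the issues above: hidden
// elements that remain keyboard-focusable, and skipped heading levels.
// A real audit, and testing with screen-reader users, covers far more.

// Flag focusable elements that a screen reader is told to ignore.
function findHiddenFocusable(root: Document): Element[] {
  const focusable = root.querySelectorAll(
    "a[href], button, input, select, textarea, [tabindex]"
  );
  return Array.from(focusable).filter(
    (el) => el.closest('[aria-hidden="true"]') !== null
  );
}

// Flag heading levels that jump (e.g., an h4 directly after an h2).
function findSkippedHeadings(root: Document): Element[] {
  let previous = 0;
  const skipped: Element[] = [];
  for (const h of Array.from(root.querySelectorAll("h1,h2,h3,h4,h5,h6"))) {
    const level = Number(h.tagName[1]);
    if (previous > 0 && level > previous + 1) skipped.push(h);
    previous = level;
  }
  return skipped;
}

console.log(findHiddenFocusable(document), findSkippedHeadings(document));
```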

It is important to test with, and receive feedback from, people with disabilities. The diversity of disabilities has implications for using and interfacing with conventional and assistive technologies (Randolph & Hubona, 2006). Even experts may at times disagree about whether a survey element is accessible and usable.

Conclusions

The implementation of universal design principles can increase survey accessibility and usability for people with disabilities. In this paper, we provided best practices for surveying people with disabilities, common issues, and recommendations for increasing the accessibility and usability of surveys. While some adjustments can be made by researchers when designing a survey, others depend on the implementation of the survey platform. When choosing an online survey platform, it is essential to understand its degree of accessibility and its limitations. Furthermore, it is important to be flexible and creative (Kroll et al., 2006, p. xiv).

References

Byerley, S. L., Chambers, M. B., & Thohira, M. (2007). Accessibility of web-based library databases: The vendors’ perspective in 2007. Library Hi Tech, 25(4), 509-527.

Chamie, M. (1989). Survey design strategies for the study of disability. World Health Statistics Quarterly, 42(3), 122-140.

Gottliebson, D., Layton, N., & Wilson, E. (2010). Comparative effectiveness report: Online survey tools. Disability and Rehabilitation: Assistive Technology, 5(6), 401-410.

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology (2nd ed.). Hoboken, NJ: John Wiley & Sons.

Henry, A. D., Gallagher, P., Stringfellow, V., Olin, L., Hooven, F., & Himmelstein, J. (2006). Notes from the Field: Contemporary Strategies for Developing Surveys of People with Disabilities: The Masshealth Employment and Disability Survey. In T. Kroll, D. Keer, P. Placek, J. Cyril, & G. Hendershot (Eds.), Towards Best Practices for Surveying People with Disabilities (pp. 127-146). New York: Nova Science Publishers.

Hasnain, R., Shpigelman, C., Scott, M., Gunderson, J. R., Rangin, H. B., Oberoi, A., & McKeever, L. (2015). Surveying People with Disabilities: Moving Toward Better Practices and Policies. In T. P. Johnson (Ed.), Handbook of Health Survey Methods (pp. 619-642). Hoboken, NJ: John Wiley & Sons.

Mulhorn, K. A. (2006). Addressing Recommendations for Increasing the Rate of Response by Persons with Disabilities: A Comparison Across Six National Surveys. In T. Kroll, D. Keer, P. Placek, J. Cyril, & G. Hendershot (Eds.), Towards Best Practices for Surveying People with Disabilities (pp. 1-11). New York: Nova Science Publishers.

Murray, P. (2006). Multi-Mode Approach for Surveying Visually Impaired Populations. In T. Kroll, D. Keer, P. Placek, J. Cyril, & G. Hendershot (Eds.), Towards Best Practices for Surveying People with Disabilities (pp. 165-180). New York: Nova Science Publishers.

National Council on Disability. (1998). Reorienting Disability Research. Washington, DC: National Council on Disability. Retrieved from https://ncd.gov/publications/1998/April1998

Randolph, A. B., & Hubona, G. S. (2006). Organizational and individual acceptance of assistive interfaces and technologies. In D. Galletta & P. Zhang (Eds.), Human-computer interaction and management information systems: Applications (Vol. 5). Armonk, NY: M. E. Sharpe.

Shanks, J. M., Nicholls, W. L., & Freeman, H. E. (1981). The California Disability Survey. Sociological Methods & Research, 10(2), 123-140.

Wilson, E., Campain, R., Moore, M., Hagiliassis, N., McGillivray, J., Gottliebson, D., Bink, M., Caldwell, M., Cummins, B., & Graffam, J. (2013). An accessible survey method: Increasing the participation of people with a disability in large sample social research. Telecommunications Journal of Australia, 63(2), 24.1-24.13.

Acknowledgements

We thank John Rempel, AMAC QA Accessibility Analyst; Kare Romanski, (former) AMAC ICT Accessibility Compliance Director; and Christopher Lee for their help and suggestions.

The contents of this paper were developed under grant #P116F140452 from the U.S. Department of Education. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.