29th Annual RESNA Conference Proceedings

MED-AUDIT (Medical Equipment Device-Accessibility and Universal Design Information Tool): Usability Analysis

Rochelle Mendonca, MS, Roger O. Smith, PhD, OT


The MED-AUDIT prototypes have begun to demonstrate the potential of this measurement approach. This study provides a preliminary investigation of several aspects of usability for two versions of the MED-AUDIT: the Expert User version and the Black Box version. It was hypothesized that the Black Box version requires less time to score than the Expert User version and that individuals with more universal design knowledge find the Expert User version more usable. Independent-samples t-tests showed that the Black Box version required significantly less time than the Expert User version (p < 0.001). For the second hypothesis, universal design knowledge, background, and exposure were correlated with the usability of the two versions; none of these correlations reached significance. The results demonstrate various usability characteristics of the two versions and will play an important role in the further development of the MED-AUDIT.


Measurement; accessibility; MED-AUDIT; usability; medical instrumentation


According to the U.S. Census Bureau, more than 20% of the general population has some form of disability (1). National attention has recently begun to focus on the effects of inaccessible medical equipment on these individuals. Evidence shows that advances in healthcare technology are increasing the survival rates of individuals with severe disabilities and that disability rates for women have increased considerably in the past few decades (2, 3). Additionally, the growing trend toward minimizing in-hospital patient care is causing medical devices to migrate from medical facilities to patients' homes (1). These circumstances highlight the need to manufacture and deploy accessible medical instruments to facilitate access for individuals with disabilities. A substantial body of literature documents the scope and impact of inaccessible medical devices for people with disabilities, including the extreme consequence that medical instruments that cannot be used by individuals with disabilities may be life threatening (4,5,6,7,8,9). Medical instruments that are not accessible to people with disabilities also fail to comply with the intent of the Americans with Disabilities Act (ADA).

The accessibility of medical instruments is a relatively new area of study. Existing studies assessing the general issue of accessibility primarily use descriptive and qualitative measures. These provide an in-depth understanding of the accessibility issues that individuals with disabilities face; however, they fail to quantify accessibility. In addition, no studies specifically address the accessibility of medical devices. To address this need, a team from the RERC-AMI at Marquette University and the University of Wisconsin-Milwaukee developed a prototype assessment to quantitatively measure the accessibility of medical devices. This software-based assessment system, called MED-AUDIT (Medical Equipment Device Accessibility and Universal Design Information Tool), rates medical device procedural tasks, sub-tasks, and device features to calculate an integrated accessibility score. The MED-AUDIT, currently under development, has two versions with different usability characteristics that target two distinct score-elicitation approaches and user audiences. This paper presents results of a study addressing aspects of the usability, validity, and reliability of the MED-AUDIT. Two primary research questions are presented here: 1) What is the difference between the times required to score the two versions of the MED-AUDIT? and 2) What is the relationship between universal design knowledge and the usability of the two versions of the MED-AUDIT?


A convenience sample of thirty-eight student participants received course extra credit or a modest monetary incentive to participate in the study (based on an approved IRB procedure). Twenty-eight occupational therapy and ten biomedical engineering participants comprised the overall study sample. Each participant was randomly assigned to one of two groups: one scoring the Black Box Version and the other the Expert User Version.

The Black Box System Version includes about 800 distinct questions. These questions are arranged in a hierarchical outline, with Roman numerals at the major headings, broken down five or six levels through the taxonomy. The outline structure provides the branching options from level to level. Through the logic of branching, parts of the taxonomy that are irrelevant for a given type of device are naturally bypassed. Also, because of branching, the actual questions become more targeted, requiring less expertise and conceptual synthesis on the part of the user.
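To illustrate how such branching logic reduces the number of questions a rater must answer, the following sketch uses a hypothetical nested taxonomy (not the actual MED-AUDIT data structure): marking a branch as irrelevant skips its entire sub-branch.

```python
# Illustrative sketch only, not the MED-AUDIT implementation: a nested
# taxonomy where answering "does not apply" at a branch point causes all
# of that branch's sub-questions to be bypassed.

def count_presented(node, answers):
    """Count the questions a rater actually sees, given branch answers.

    `answers` maps a branch id to True (applies; descend) or False
    (irrelevant; skip the sub-branch). Unanswered branches default to True.
    """
    presented = 1  # the branch question itself is always asked
    if answers.get(node["id"], True):  # branch applies -> descend
        for child in node.get("children", []):
            presented += count_presented(child, answers)
    return presented

taxonomy = {
    "id": "I", "children": [
        {"id": "I.A", "children": [{"id": "I.A.1"}, {"id": "I.A.2"}]},
        {"id": "I.B", "children": [{"id": "I.B.1"}]},
    ],
}

# If branch I.A is irrelevant for this device, its two sub-questions
# are never presented: 4 questions instead of 6.
print(count_presented(taxonomy, {"I.A": False}))  # → 4
print(count_presented(taxonomy, {}))              # → 6
```

Skipping whole sub-branches in this way is what allows a bank of roughly 800 questions to remain tractable for a given device.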

The Expert User System Version includes about 700 questions, which are based on 16 core questions. In comparison to the more specific and narrow questions of the Black Box version, the Expert User version involves more integrative concepts and thus demands more expertise from the rater. Each of the 16 core questions is scored for both "easily/flexibly" and "safely" categories, and each is administered for up to 13 impairment categories. There is no hierarchical branching in this version.

All subjects assessed the accessibility of two different models of blood pressure monitor: an aneroid monitor and a digital monitor. Prior to scoring the MED-AUDIT, each participant answered a pre-test questionnaire that included demographic information; questions to assess universal design knowledge, background, and exposure; and two questions to assess the participants' subjective perceptions of the accessibility of the two monitors across thirteen different disabilities. After this, they were given instructions on the use of the software and time to familiarize themselves with the MED-AUDIT and the medical devices. Following this introduction to the task, each participant was allowed 45 minutes to score the first device on the MED-AUDIT. After a five-minute break, they scored the second device with the same version of the MED-AUDIT. Lastly, participants answered a post-test questionnaire, which included questions about the usability of the version they scored and the same two questions as the pre-test questionnaire about the accessibility of the two blood pressure monitors for the thirteen different disabilities. The MED-AUDIT software includes a new time-tracking module that records all events during the scoring procedure.

This study analyzed time differences in terms of cognitive load per question, efficiency per question, and total time to score each version. A correlational analysis assessed the relationship between universal design knowledge, background, and exposure and the usability of the two versions.


For the first hypothesis, differences in the time required to score the two versions, data were analyzed using independent-samples t-tests. Results revealed that the efficiency per question was greater for the Black Box Version compared to the Expert User Version (t(74) = -6.647, p < 0.001). The total time to score each version was greater for the Expert User Version compared to the Black Box Version (t(74) = -5.268, p < 0.001), as seen in Figure 1.

Figure 1: Histograms of the Total Time Required to Score the Black Box and Expert User Versions of the MED-AUDIT

However, the cognitive load per question was greater for the Black Box Version compared to the Expert User Version (t(74) = 4.633, p < 0.001), contrary to the first hypothesis.

For the second hypothesis, the Pearson product-moment correlation was used to analyze the relationship between universal design knowledge, background, and exposure and usability at α = 0.05 using a one-tailed test. None of the three correlations was significant.


Results for the first hypothesis indicate that the total test efficiency as well as the efficiency per question was greater for the Black Box Version compared to the Expert User Version. This is attributable to the scoring system used for the Black Box Version, called Trichotomous Tailored Sub-Branching Scoring (TTSS). TTSS increases the efficiency of scoring, thus decreasing the amount of time required to score the assessment.

On the other hand, for cognitive load, the Expert User Version was found to be more efficient. This may be due to the redundancy of questions across each of the thirteen disabilities. This possibility was further explored by comparing the time it took participants to score the first and the last disability. The t-test was significant (t(86) = 4.358, p < 0.001). The comparison between means showed that the time taken by participants to score the first disability (mean = 378.59) was approximately twice as long as the time taken to score the last completed impairment question (mean = 177.24). This indicates that practice may have played a role in the cognitive load per question being lower for the Expert User Version than the Black Box Version. The time results obtained in this study are relative to the device scored, that is, blood pressure monitors. Results may differ if other devices are scored, depending on the complexity of the device.

Results for the second hypothesis indicated no significant relationships between universal design knowledge, background, and exposure and usability. Factors that may have contributed to this include the unequal distribution of participants (occupational therapy = 28, engineering = 10) and the possible inadequacy of the questions used to assess the universal design parameters. However, a comparison between the two groups of participants did not indicate any significant differences.

This study demonstrates usability characteristics of the MED-AUDIT and provides considerable insight about the taxonomy, scoring structure, and target population. Results from this study will be used in the further development of the MED-AUDIT. This study also provides a platform for conducting additional usability studies on newer versions, as well as reliability and validity studies.


This work is supported in part by the National Institute on Disability and Rehabilitation Research, grant numbers H133A010403 and H133E020729. The opinions contained in this paper are those of the grantee and do not necessarily reflect those of NIDRR or the U.S. Department of Education. We also acknowledge Todd Schwanke, Eli Gratz, Jack Winters, Molly Follette Story, Kerri Grogran, and Teresa Snyder for their work and continued support on this project.


  1. Wilcox, S. B. (2003). Applying the principles of universal design to medical devices. Retrieved October 11, 2004, from http://www.devicelink.com/mddi/archive/03/01/contents.html
  2. Gans, B. M., Mann, N. R., & Becker, B. E. (1993). Delivery of primary care to the physically challenged. Archives of Physical Medicine and Rehabilitation, 74(Suppl), S15-S19.
  3. Kaye, H. S., LaPlante, M. P., Carlson, D., & Wenger, B. L. (1996). Trends in disability rates in the United States, 1970-1994. Disability Statistics Abstract, 17, 1-6.
  4. Nary, D. E., Froehlich, K., & White, G. W. (2000). Accessibility of fitness facilities for persons with physical disabilities using wheelchairs. Topics in Spinal Cord Injury Rehabilitation, 6(1), 87-98.
  5. Rimmer, J. H., Riley, B., Wang, E., Rauworth, A., & Jurkowski, J. (2004). Physical activity participation among persons with disabilities. American Journal of Preventive Medicine, 26(5), 419-425.
  6. McClain, L., Medrano, D., Marcum, M., & Schukar, J. (2000). A qualitative assessment of wheelchair users' experience with ADA compliance, physical barriers, and secondary health conditions. Topics in Spinal Cord Injury Rehabilitation, 6(1), 99-118.
  7. Veltman, A., Stewart, D. E., Tardiff, G. S., & Branigan, M. (2001). Perceptions of primary healthcare services among people with physical disabilities. Part I: Access issues. Medscape General Medicine, 3(2).
  8. Young, M. E., & Ellen, G. (2001). Managed care experiences of persons with disabilities. Journal of Rehabilitation, 67(3), 13-20.
  9. Grabois, E., Nosek, M. A., & Rossi, D. (1999). Accessibility of primary care physicians' offices for people with disabilities. Archives of Family Medicine, 8, 44-51.

Rochelle Mendonca, MS
Rehabilitation Research Design & Disability (R2D2) Center
University of Wisconsin-Milwaukee
P.O. Box 413
Milwaukee, WI 53201-0413
Phone No. (414) 229-6803
