Development of an Interface for Integration of Communication and Robotic Play

Mark Corrigan1, BA, Kim Adams1,2, MSc, Albert Cook1, PhD
University of Alberta1 and Glenrose Rehabilitation Hospital2, Edmonton, Alberta


Abstract

Children with disabilities may have decreased ability to play, and thus less opportunity to acquire the associated motor, cognitive, and linguistic skills. There is evidence that robotic play may help children acquire those skills, and our overall goal is to facilitate integration of spontaneous robotic play and communication. To investigate effective methods to do this, a testing platform was developed along with several integrated communication and robotic play interfaces. The interfaces underwent usability testing by five "expert" users with relevant backgrounds and were modified iteratively after input from each user. Subsequently, a detailed heuristic evaluation was applied to the interfaces. The resulting platform and interfaces are described here, along with results from the usability evaluations. Next steps involve testing with children with and without disabilities.


Keywords

Play, Lego, Robotics, Augmentative Communication, Interface Design


Background

Children without physical disabilities learn motor, cognitive, and linguistic skills through object manipulation during play. This ability to play, and to acquire the associated skills, may be hampered by a physical disability [1]. Children with disabilities might have opportunities to direct someone else to perform play activities using an augmentative communication method, but often they just watch other children play.

Research has shown that children with physical disabilities who use switches to control a robotic arm in play activities have been able to reveal more cognitive skills than had been observed previously [2,3]. A pilot study revealed that low-cost Lego Mindstorms™ robots could be used to perform similar play activities [4]. The low cost opens up more access to play, since parents and teachers can potentially purchase, build, and program these robots themselves. The pilot has since been expanded to a group of ten children aged 3 to 12 [5]. The project reported here extends these studies in two ways. First, in order to independently control a robot with six degrees of freedom, children need up to six switches (open and close for the gripper, up and down for the boom, and left and right for the robot body). Some children may have physical limits on the number of switches they can control, and the numerous alternative computer access methods available may be useful to them. Second, children vocalized more during robotic play, perhaps indicating an intent to communicate, but in previous studies they did not have access to an AAC device, so communication was limited. This is a common problem: children who use typical AAC devices must pause communicating in order to engage in play activities [6].


Objective

Our overall goal is to facilitate integration of spontaneous play and communication for children who have severe disabilities. To investigate effective methods to do this, a platform was developed, which:

  1. provides interfaces that integrate communication and robotic play commands,
  2. provides full access to robotic functionality, and
  3. accommodates varying physical, cognitive and linguistic abilities.

Specific Objectives for this Project

  1. Develop a flexible platform to test various integrated interface configurations.
  2. Perform usability evaluations with five experts from relevant backgrounds and make iterative interface modifications based on their feedback.



Method

Participants in the usability evaluations were five experts from a variety of relevant fields: a speech-language pathologist specializing in AAC; a rehabilitation engineer specializing in alternative computer access methods; two psychologists, one specializing in human factors and the other in pediatric psychology; and an adult with complex communication needs who uses an AAC device via direct keyboard access.


Figure 1. Robotic Arm and Mobile Roverbot.

Testing platform: The testing platform consisted of prototype software called ATCreator™, developed by Madentec Limited, running on a Sahara Slate PC™ tablet computer with a resistive touch screen. ATCreator is an on-screen keyboard with many flexible features. It allows import of digitized pictures and/or icons as symbols, and button labels in any text font, size, and/or color. It also allows clinicians to assign behaviors to buttons, such as infrared output, text-to-speech, or sound file output. A very helpful feature is JavaScript™ programming to incorporate situational decision making (e.g., avoiding a repeated movement).
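As an illustration, the kind of situational decision making described above could be scripted roughly as follows. This is a minimal sketch only; sendInfrared() and speak() are hypothetical stand-ins for ATCreator's infrared-output and text-to-speech behaviors, not its actual API.

```javascript
// Minimal sketch of situational decision making for a robot-control button.
// sendInfrared() and speak() are hypothetical stand-ins for ATCreator's
// infrared-output and text-to-speech behaviors.
const sent = [];
function sendInfrared(command) { sent.push(command); }
function speak(message) { /* text-to-speech feedback */ }

let lastCommand = null;

// Avoid a repeated movement: if the requested command would merely repeat
// the previous one, give spoken feedback instead of resending it.
function onButtonPress(command) {
  if (command === lastCommand) {
    speak("The robot already did that.");
    return;
  }
  sendInfrared(command);
  lastCommand = command;
}

onButtonPress("gripper_close"); // sent to the robot
onButtonPress("gripper_close"); // skipped: would repeat the movement
onButtonPress("boom_up");       // sent to the robot
```

A clinician could attach logic of this shape to any button, which is what makes the scripting feature useful for play scenarios that depend on the robot's current situation.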

The symbol set for this project consisted of digitized pictures of robots, users, and play items, as well as sample symbols from Imagine Symbols™. Infrared output was accomplished using a RedRat™, a two-way infrared controller that stores, sends, and receives infrared commands via a USB port. Two Lego Mindstorms™ robots were used: a robotic arm and a roverbot (see Figure 1).

Figure 2. Picture-in-picture video.

Robots and play materials were positioned in a play area on a table. Two video cameras, a mixing board, and an output video recorder were used to capture a picture-in-picture (PiP) view of robot actions within a view of the user's face (see Figure 2). The PiP view should assist in determining whether a child demonstrates an understanding of causality during play activities, since it lets us look for co-occurrences (e.g., activating a switch and then looking at the robot for a reaction).


Procedure: Each expert evaluated several interfaces while using them to engage in robotic play, and was encouraged to "think aloud" continuously, enabling the investigators to understand how that user viewed the system [7]. Investigators also observed the user, asking for clarification whenever confusion or unexpected outcomes appeared to occur. Users' comments were noted and reviewed by both experimenters to reach consensus on the required modifications. Interfaces were refined, edited, and expanded based on each expert's feedback, then presented to the next expert.

Following user testing, the interfaces were evaluated using the Xerox Heuristic Evaluation Tool [8]. This extensive heuristic evaluation checklist was completed individually by two investigators, one with a background in rehabilitation engineering and the other in educational psychology, who then reached consensus on any further modifications required. This method is similar to that of another AAC usability study [9].

Data Collection and Analysis:

Real-time data collection was via the picture-in-picture recording, which facilitated verification of the written notes from each session and determination of what emotions the user was experiencing (e.g., surprise, confusion, or frustration).

Figure 3. Morae software.

ATCreator also provides real-time data collection through a log file, a sequential record of user actions (e.g., infrared signals sent, text-to-speech output, or sound files played).

Morae™ usability testing software was used to integrate the data sources. Morae stores and synchronizes a video image (i.e., the user's face) and a recording of on-screen computer actions along a single timeline. User-defined events are color coded, so events such as surprise or excitement can be noted concurrently with interface events such as mouse-down activations (see Figure 3).


Results

Expert Evaluations:

Expert critique of the interfaces fell into categories of: modifications to the interface layout or behavior, ideas for future development, and suggestions for play and educational activities.

Some interface modifications were simple edits involving changing the type, position, size, or grouping of the graphics presented. Other modifications required software programming changes: for example, the ability to send infrared signals continuously while a button was held down, and the addition of highlighted color borders when a button was activated (to reinforce auditory and robot-movement feedback).

Ideas for future development included improvements to the hardware and ways to expand the educational potential of robotic play. Suggested changes included the types, complexity, precision, and reliability of robot movements, and the addition of a robot-mounted camera, virtual reality landscapes, and situation-sensitive robotic actions.

Suggestions for play activities included dressing, undressing, and bathing dolls, and the inclusion of familiar story lines with recognizable characters. Educational potential can be explored through interfaces with specific linguistic or cognitive objectives. Play scenarios can involve the child in sequencing, sorting, categorization, matching, choice making, and problem solving. Each expert's input improved a different aspect of the system, all from the child's perspective.

After evaluating the communication and play interfaces, the AAC user said, “This is my first actual time playing with stuff.” She explained that as a child, “I just watched my sister play [with] her toys”. Hence, this platform seems to be a step towards facilitating communication and play for children with disabilities.

Testing Platform and Interfaces:

Figure 4. Early and final interface.

A few initial interfaces served as the basis for discussion and for development of a variety of interfaces. Figure 4 shows how an interface evolved from a stand-alone communication page to an integrated communication and robotic play interface. The page on the left contains symbols for messages and for making animal sounds, and it linked to another page for making robot movements to pick up and drop animal toys. The integrated page on the right has communication symbols, which are accessible on every page, alongside robot control buttons to pick up and drop animals while playing animal sound wave files.

Figure 5. Three Roverbot Interfaces.

Figure 5 shows three roverbot interfaces, which accommodate the range from low to high physical, cognitive, and linguistic needs.

The resulting test platform can accommodate highly variable levels of physical skill. Because the interface is tablet-computer based, it supports alternative access methods such as touch screen, scanning, and head-pointing. Touch screen access was tested here, with interfaces varying from one to several buttons. The AAC device user revealed a technical "glitch" with the touch screen that had not been revealed by non-disabled testers: when a button is activated with a finger, and the finger is dragged in continuous contact to another part of the screen, the button remains activated. The required modification was communicated to the programmers.
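The required fix can be modeled as follows. This is an illustrative sketch only, not ATCreator's actual code: a button activated by a finger press should deactivate as soon as the finger, still in contact with the screen, is dragged off the button's area.

```javascript
// Illustrative model of the touch-screen fix: a pressed button should
// deactivate once the finger drags off it while still touching the screen.
class ScreenButton {
  constructor(x, y, width, height) {
    this.x = x; this.y = y;
    this.width = width; this.height = height;
    this.active = false;
  }
  contains(px, py) {
    return px >= this.x && px < this.x + this.width &&
           py >= this.y && py < this.y + this.height;
  }
  touchDown(px, py) { this.active = this.contains(px, py); }
  // The reported glitch: without this check, dragging the finger away
  // in continuous contact left the button activated.
  touchMove(px, py) {
    if (this.active && !this.contains(px, py)) this.active = false;
  }
  touchUp() { this.active = false; }
}

const button = new ScreenButton(0, 0, 100, 50);
button.touchDown(10, 10);   // finger lands on the button: active
button.touchMove(200, 200); // finger dragged off: must deactivate
```

The key design point is that activation state is re-checked on every movement event, not just on the initial press and final release.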

Varying levels of cognitive skill can be addressed by enabling the robot to perform greater amounts of the task (via pre-programmed functions) until the children build the skills required to control individual robot movements independently. For emerging assistive technology users, the platform accommodates a visual scene display, which indicates "hot spots" that highlight when they are selected. It can also accommodate grid arrays when appropriate.
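This graded approach can be sketched as follows. The command names are hypothetical labels for infrared signals the robot would be programmed to respond to; the sketch contrasts a simpler interface, where one button performs most of a task, with an advanced interface mapping each movement to its own button.

```javascript
// Sketch of graded autonomy. Command names are hypothetical labels for
// infrared signals the robot would be programmed to respond to.
const sent = [];
function sendInfrared(command) { sent.push(command); }

// Simpler interface: one button performs most of the task for the child
// via a pre-programmed sequence of movements.
const PICK_UP_ANIMAL = ["boom_down", "gripper_close", "boom_up"];
function pressSequenceButton(sequence) {
  for (const command of sequence) sendInfrared(command);
}

// Advanced interface: the child controls each movement independently.
function pressMovementButton(command) {
  sendInfrared(command);
}

pressSequenceButton(PICK_UP_ANIMAL); // emerging user: one press, whole pick-up
pressMovementButton("body_left");    // advanced user: one press, one movement
```

As the child's skills grow, buttons bound to sequences can be replaced by buttons bound to individual movements, without changing the rest of the interface.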

Linguistic skill can be accommodated by adding vocabulary to the interface as required. When the vocabulary storage capability of a visual scene is exhausted, the user can transition to a grid array. The platform can provide very motivating play experiences, which should encourage emerging communicators to speak more and possibly build their sequencing skills. With more built-in vocabulary, more advanced communicators can combine independent control of robot movements with specific vocabulary surrounding the activity.

Data Collection and Analysis: ATCreator's logfiling and Morae's usability testing software will be helpful for subsequent analysis with children. Logfiling will provide data for measures such as number of utterances or number of button presses per session. The Morae recording will facilitate observing for co-occurrences, as well as for communicative intent that cannot be captured in the logfile, such as eye gestures. Table 1 shows data from Morae on the left, adjusted in position so the times are on the same scale as in the logfile excerpt on the right. Morae allows third-party software to write directly to its search results, which will automate this synchronization process.

Table 1. Data Analysis

Morae Usability Analysis Search Results                                       ATCreator Logfile Output
Time     Event         Window Title        Clicks       Marker Label
02:30.4  Mouse Clicks  OK                  LButtonDown
02:30.4  Marker                                         Start logfile         #START :user4_20061215-103852.log
                                                                              #Logfile entries :
02:33.8  Mouse Clicks  DVdriver 1.0.2(6)   LButtonDown
02:34.6  Mouse Clicks                      LButtonDown
02:39.4  Mouse Clicks  Window: Window1     LButtonDown
02:40.8  Mouse Clicks  Window: Window1     LButtonDown
02:48.6  Mouse Clicks  Window: Window1     LButtonDown
02:51.0  Mouse Clicks  Window: Window1     LButtonDown
02:52.4  Mouse Clicks  Window: Window1     LButtonDown
03:02.6  Marker                                         Start of VSD single button
03:04.8  Marker                                         starts to press button
03:05.8  Mouse Clicks  Window: Window1     LButtonDown                        10:39:27 A: Square123- MouseDown
                                                                              10:39:27 A: Square123- MouseUp
                                                                              10:39:27 O: screeching sound #Audio out
03:08.1  Marker                                         Looks for robot action
03:09.4  Marker                                         Looks at assistant
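The time alignment in Table 1 can be sketched as follows. This is a hypothetical helper, not Morae's API; it assumes (from the logfile's name, user4_20061215-103852.log) that the logfile's wall-clock start time was 10:38:52, which the "Start logfile" marker places at 02:30.4 on the Morae timeline.

```javascript
// Sketch of mapping ATCreator logfile times (wall clock, "10:39:27") onto
// Morae's session-relative timeline ("02:30.4"), given one anchor pair:
// the Morae marker at which the logfile started.

function wallClockToSeconds(hms) {          // "10:39:27" -> seconds
  const [h, m, s] = hms.split(":").map(Number);
  return h * 3600 + m * 60 + s;
}

function moraeToSeconds(ms) {               // "02:30.4" -> seconds
  const [m, s] = ms.split(":").map(Number);
  return m * 60 + s;
}

function secondsToMorae(sec) {              // seconds -> "mm:ss.s"
  const m = Math.floor(sec / 60);
  const s = (sec - m * 60).toFixed(1).padStart(4, "0");
  return String(m).padStart(2, "0") + ":" + s;
}

// Map one logfile timestamp onto the Morae timeline.
function toMoraeTime(logTime, anchorLog, anchorMorae) {
  const offset = moraeToSeconds(anchorMorae) - wallClockToSeconds(anchorLog);
  return secondsToMorae(wallClockToSeconds(logTime) + offset);
}

// Anchor: the logfile start (assumed 10:38:52) was marked at 02:30.4 in
// Morae. The 10:39:27 MouseDown then lands near the 03:05.8 mouse click
// that Morae recorded.
const aligned = toMoraeTime("10:39:27", "10:38:52", "02:30.4");
```

Third-party software writing such remapped times directly into Morae's search results would automate the manual adjustment described above.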


Discussion

Our initial interfaces, while not perfect, gave the investigators a place to start and a means to generate discussion with the expert users. Though the experts were never looking at a finished product, having hands-on experience with a working test platform probably enabled them to give more concrete advice rather than just offering opinions or suggesting specifications.

Within limits, this project has succeeded in providing a medium for spontaneous play and communication. It was encouraging that the adult AAC user enjoyed her time playing and saw great potential for this project. One limitation was with the robots themselves, where low battery power levels resulted in inaccurate target acquisition. However, this provided an opportunity for communication and problem solving.

The current system hardware and software are expandable and upgradeable, and can address a wide variety of physical, cognitive, and linguistic needs. The next step involves usability evaluations with children with and without disabilities. Interface development will continue through subsequent phases of testing with children, and will incorporate skill development activities identified in other robotic play projects. One interesting challenge for future development will be integrating the interface for users who require scanning access methods: scanning through robotic play options in order to control a robot that is already moving.


References

  1. Musselwhite, C. (1986). Adaptive Play for Special Needs Children. College Hill Press.
  2. Cook, A.M., Bentz, B., Harbottle, N., Lynch, C., & Miller, B. (2005). School-based use of a robotic arm system by children with disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 13(4).
  3. Prazak, B., Kronreif, G., Hochgatterer, A., & Furst, M. (2004). A toy robot for physically disabled children. Technology and Disability, 16, 131-136.
  4. Schulmeister, J., Wiberg, C., Adams, K., Harbottle, N., & Cook, A. (2006). Robot assisted play for children with disabilities. RESNA Conference Proceedings, Atlanta, GA.
  5. Cook, A.M., Adams, K., & Harbottle, N. (accepted for 2007). Lego robot use by children with severe disabilities. CSUN Conference.
  6. Light, J., & Drager, K. (2002). Improving the design of augmentative and alternative technologies for young children. Assistive Technology, 14, 17-32.
  7. Nielsen, J. (1994). Usability Engineering. San Francisco, CA: Academic Press.
  8. Pierotti, D. (1995). Xerox Heuristic Evaluation – A System Checklist. Xerox Corporation.
  9. Adams, K., Portis, K., Bisantz, A., Buckley, M., Higginbotham, J., Schindler, K., & Sweeney, M. (2005). Experiences using a heuristic evaluation tool on AAC software interfaces. RESNA Proceedings, Atlanta, GA.

ATCreator is a product of Madentec ltd.

Imagine Symbols are a copyright of Imagine Symbols

JavaScript is a registered trademark of Sun Microsystems, Inc.

LEGO Mindstorms is a trademark of LEGO

Morae is a trademark of TechSmith

RedRat is a trademark of RedRat ltd.

Sahara Slate PC is a product of Honeycom Solutions PTY ltd.


Acknowledgments

We would like to thank Randy Marsden, Daniel Tse, and Jason Dutton of Madentec for the ATCreator platform and programming support; our expert users for their time; and the Glenrose Research Foundation, the University of Alberta Endowment Fund for the Future, and the Alberta Economic Development Medical Device Development Program for financial support.


Author Contact Information:

Kim Adams
Faculty of Rehabilitation Medicine
3-48 Corbett Hall
University of Alberta
Edmonton AB T6G 2G4



