EEG-Based Brain-Computer Interface Access To Tobii Dynavox Communicator 5

Kamilya A. Gosmanova1, Charles S. Carmack1, David Goldberg2, Kelly Fitzpatrick1, Bart Zoltan3, Debra M. Zeitlin3, Jonathan R. Wolpaw1,3, Ole Alexander Maehle2, Anders Borge2, Theresa M. Vaughan1,3

1National Center for Adaptive Neurotechnologies, Wadsworth Center, NYSDOH, Albany NY; 2Tobii Dynavox;

3Helen Hayes Rehabilitation Hospital, West Haverstraw, NY

Abstract

A brain-computer interface (BCI) records brain signals, extracts specific measures (or features) from them, and converts (or translates) these features into commands that operate applications that replace, restore, enhance, supplement, or improve natural central nervous system (CNS) outputs. For people diagnosed with amyotrophic lateral sclerosis (ALS), a BCI using event-related potentials allows users to choose among items in a matrix and provides a means for communication. This study seeks to establish BCI as an access method for the Tobii Dynavox Communicator 5 software package. Fourteen subjects used the Communicator 5 software during BCI system calibration. Selection rates (calculated offline) averaged 5.8 (±2.3) selections/minute with 97 (±7.6)% accuracy. Nine of the 14 subjects then used BCI control to move freely between three Communicator 5 menus in real time. They used an average of seven selections to complete a task that required a minimum of six selections. These results indicate that BCI can be used as an alternative or complementary access method for the Communicator 5 software, thus extending the usefulness of both the off-the-shelf software package and the BCI.

Introduction

An estimated two million people with complex communication needs (CCN) worldwide depend on some form of low- or high-tech augmentative and alternative communication (AAC) (Beukelman & Mirenda, 2005; National Joint Committee, 2017). Like the general population, most AAC users require multiple technologies, including eye tracking, to meet their needs across tasks and across the course of their injury or disease (Scherer, 2005).

People affected by severe motor disorders such as amyotrophic lateral sclerosis (ALS) and brainstem stroke may not be able to use even the most basic conventional assistive technologies, which all rely in one way or another on muscle control. A brain-computer interface (BCI) can give such individuals communication and control technology that does not depend on neuromuscular output (Wolpaw and Wolpaw, 2012).

Fig 1. A) A BCI-24/7 Home User, seated in his own wheelchair, waiting to begin a calibration task. He wears a cloth EEG cap that records brain activity and transmits it through a cable to an amplifier and then to a computer. B) Average ERP responses to target (red) and nontarget (green) stimuli, plotted as amplitude (Y axis) over time (X axis), recorded at location Pz. These and other features found in the EEG during the Calibration Step (S1) are used to classify the data for S2 and S3. C) The Communicator 5 screen used during Calibration (S1) and Validation (S2) contains 26 blue English letters arranged alphabetically in 4 rows and 7 columns (white field, red borders). A black and white image of a neutral male face covers a random group of six letters (two each in columns 2, 4, and 6). D) The 13-item Communicator 5 Home Screen used during Navigation (S3).
A BCI records brain signals, extracts specific measures (or features) from them, and converts (or translates) these features into commands that operate applications that replace, restore, enhance, supplement, or improve natural CNS outputs (Wolpaw and Wolpaw, 2012). For people diagnosed with ALS, a BCI using the P300 event-related potential allows users to choose among items in a matrix and provides a means for communication (Sellers et al., 2006).

The P300 is a positive deflection in the electroencephalogram (EEG) that occurs 200 to 700 ms after stimulus onset and is typically recorded over central-parietal scalp locations. The response is evoked by attention to rare stimuli in a random series of stimulus events (i.e., the oddball paradigm). Farwell and Donchin (1988) first discussed using the oddball paradigm for communication almost three decades ago.
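For readers unfamiliar with this analysis, the sketch below (illustrative Python with synthetic data, not the study's code) shows how averaging EEG epochs time-locked to each stimulus reveals a P300-like response to rare attended targets, while responses to frequent nontargets average toward zero.

```python
# Illustrative sketch (synthetic data): time-locked epoch averaging
# reveals the P300 to rare (target) stimuli in an oddball series.
import numpy as np

fs = 256                              # sampling rate (Hz), as in Methods
epoch = np.arange(0, int(0.8 * fs))   # 0-800 ms window after stimulus onset

rng = np.random.default_rng(0)
n_target, n_nontarget = 30, 150       # oddball ratio: targets are rare

def synth_epoch(has_p300):
    """Synthetic single-trial EEG: noise plus, for targets, a positive
    deflection peaking ~300 ms after stimulus onset."""
    x = rng.normal(0, 5.0, epoch.size)             # background EEG (uV)
    if has_p300:
        t = epoch / fs
        x += 4.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return x

targets = np.mean([synth_epoch(True) for _ in range(n_target)], axis=0)
nontargets = np.mean([synth_epoch(False) for _ in range(n_nontarget)], axis=0)

# The averaged target waveform shows the P300; nontargets stay near zero.
peak_ms = 1000 * epoch[np.argmax(targets)] / fs
print(f"target-average peak at ~{peak_ms:.0f} ms")
```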

More recently, researchers at the National Center for Adaptive Neurotechnologies and Helen Hayes Hospital of the New York State Department of Health have shown that people with limited eye control can learn to control and use distinct features of scalp-recorded EEG activity, including the P300 and other event-related potentials, to move a computer cursor in one or more dimensions, to select letters or icons, or even to move a robotic arm (reviewed in Vaughan et al., 2006). Further, we have demonstrated that people with ALS can use EEG-based BCIs for communication and control autonomously in their homes (Sellers et al., 2010; Wolpaw et al., 2013).

Tobii Dynavox Communicator 5 (C5) is a highly configurable software package designed for use with an eye-gaze device and other conventional switches. It can be used for communication and computer access including e-mail, text messaging, telephone, and environmental control (Tobii Dynavox, 2016).

Methods

Fourteen individuals (seven males, two with ALS; average age 35 (±24) years; range 16-73) took part in this study. The study was reviewed and approved by the Institutional Review Board of the New York State Department of Health, and all subjects gave informed consent.

All aspects of BCI use were controlled by BCI2000 (Schalk et al., 2004). The C5 dynamic screens were presented via a Perl script, and the program was accessed using the Microsoft Active Accessibility (MSAA) software development kit (SDK). MSAA is an Application Programming Interface (API) originally designed to improve computer access for physically and cognitively impaired individuals. It reports the location and purpose of the elements of the current Microsoft Windows graphical user interface (GUI) (https://msdn.microsoft.com/en-us/library/ms697707.aspx).
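The study accessed MSAA from a Perl script; as a rough illustration of the API only, the hypothetical Python sketch below (assuming Windows, the third-party comtypes package, and a made-up window title) queries the reported name and child count of a window's client area.

```python
# Hypothetical sketch of querying MSAA for on-screen elements. This is
# NOT the study's Perl implementation; it assumes Windows plus the
# third-party 'comtypes' package.
import ctypes
from ctypes import POINTER, byref

import comtypes.client
comtypes.client.GetModule('oleacc.dll')        # generate IAccessible wrapper
from comtypes.gen.Accessibility import IAccessible

OBJID_CLIENT = -4        # 0xFFFFFFFC: the window's client area
CHILDID_SELF = 0

def accessible_from_window(hwnd):
    """Return the IAccessible object for a window's client area."""
    acc = POINTER(IAccessible)()
    ctypes.oledll.oleacc.AccessibleObjectFromWindow(
        hwnd, OBJID_CLIENT, byref(IAccessible._iid_), byref(acc))
    return acc

# 'Communicator 5' is a hypothetical window title used for illustration.
hwnd = ctypes.windll.user32.FindWindowW(None, 'Communicator 5')
acc = accessible_from_window(hwnd)
print('name:', acc.accName(CHILDID_SELF))      # reported purpose of element
print('children:', acc.accChildCount)          # elements a BCI could target
```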

We recorded eight channels of EEG from frontal, central, and posterior scalp locations (Fz, Cz, P3, Pz, P4, PO7, PO8, Oz), referenced to the right mastoid and grounded to the left mastoid. Signals were amplified using a g.USBamp biosignal amplifier (g.tec Medical Engineering), sampled at 256 Hz, high- and low-pass filtered at 0.1 Hz and 60 Hz, respectively, and notch filtered at 60 Hz (58-62 Hz). A review of the method can be found in McCane et al. (2014).
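The filtering above is performed by the amplifier and BCI2000; the SciPy sketch below merely reproduces the stated parameters (0.1-60 Hz band-pass, 58-62 Hz notch, 256 Hz sampling) for readers who wish to replicate the preprocessing offline. The notch Q value is our assumption, chosen to give roughly the stated stop band.

```python
# Sketch of the reported preprocessing using SciPy; the study's actual
# filtering is done in hardware/BCI2000, not by this code.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 256                                     # sampling rate (Hz)

# 0.1-60 Hz band-pass, as stated in the Methods
b_bp, a_bp = butter(4, [0.1, 60], btype='bandpass', fs=fs)

# 60 Hz notch; Q = 15 gives a ~4 Hz (58-62 Hz) stop band
b_n, a_n = iirnotch(60, 15, fs=fs)

def preprocess(eeg):
    """eeg: array of shape (n_channels, n_samples) -> filtered copy."""
    x = filtfilt(b_bp, a_bp, eeg, axis=-1)   # zero-phase band-pass
    return filtfilt(b_n, a_n, x, axis=-1)    # remove residual line noise

eeg = np.random.randn(8, 10 * fs)            # stand-in for 8-channel EEG
clean = preprocess(eeg)
```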

TABLE 1. Performance during BCI/C5 operation

Subj   S1 % Correct   S1 Selections/min   S2 Correct Selections   S3 Total Selections
1      100            7.3                 4                       7
2      100            8.0                 4                       6
3      100            6.7                 4                       10
4      100            7.3                 4                       7
5      100            5.6                 4                       -
6      100            6.0                 4                       -
7      100            5.1                 4                       10
8      100            8.0                 4                       7
9      100            3.7                 4                       -
10     100            8.0                 4                       6
11     100            5.4                 4                       6
12     100            7.4                 4                       7
13*    79             1.2                 3                       -
14*    79             1.5                 3                       -
AVE    97 (±8)        5.8 (±2.3)          3.9                     7.3 (±1.6)
Performance data for all subjects. S1 represents optimized performance, i.e., results calculated offline using five-fold cross-validation. S2 represents the number of selections (max = 4) made correctly during Validation. S3 represents the number of selections needed to complete the sentence: “I want to watch TV.”

The subject sat in a comfortable chair or his/her own wheelchair at a comfortable viewing distance from a 50 cm screen (Fig 1A). The task had three steps. During the Calibration Step (S1), subjects attended to 21 cued targets, the letters of the words “THE,” “QUICK,” “BROWN,” “FOX,” and “JUMPS,” with a break of several minutes between words. In the Validation Step (S2), subjects selected four cued targets, the letters of the word “JULY,” with feedback. During Navigation (S3), the static keyboard (Fig 1C) was replaced with the C5 dynamic screen (Fig 1D). Subjects were instructed to complete the sentence “I want to watch TV.” To do this successfully, they needed to navigate among three different C5 screens, a task requiring a minimum of six selections. Results were displayed at the top of the Core Screen.

Each step required the subject to attend to a picture of a face flashed over the target item (Kaufmann et al., 2011) while all items on the screen flashed in groups of 4-6 items at a rate of four flashes per second (Townsend et al., 2010). Each step was explained and illustrated before it began. After each run, the subject was asked whether he/she wished to continue. The experiment, including consent, step instructions, cap application and removal, and data collection, took 45-60 min.
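As a toy illustration of this presentation scheme (not the Townsend et al. checkerboard implementation; the grouping and timing details here are simplified assumptions), the sketch below generates successive flash groups of 4-6 items at four flashes per second.

```python
# Toy sketch of the stimulus schedule: items flash in random groups of
# 4-6 at ~4 flashes/s. Grouping constraints of the actual paradigm
# (Townsend et al., 2010) are omitted for brevity.
import random

ITEMS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")   # the 26-letter matrix
FLASHES_PER_SEC = 4

def flash_sequence(n_flashes, rng=random.Random(0)):
    """Yield successive groups of 4-6 items to intensify."""
    for _ in range(n_flashes):
        yield rng.sample(ITEMS, rng.randint(4, 6))

for i, group in enumerate(flash_sequence(8)):
    onset_ms = 1000 * i / FLASHES_PER_SEC    # 250 ms between flash onsets
    print(f"{onset_ms:5.0f} ms: flash {group}")
```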

Personalized settings, i.e., classification coefficients derived using stepwise linear discriminant analysis (SWLDA) from the data collected during S1, were applied and validated in S2 and S3. The accuracies and optimized selections/min in Table 1 represent the results of five-fold cross-validation performed offline. Results reported for S2 and S3 were obtained in real time, with the selection rate held constant at three selections/min.
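For illustration of this calibration analysis: scikit-learn offers no SWLDA, so in the sketch below forward stepwise feature selection combined with LDA stands in for it; the data are synthetic and the feature dimensions are assumptions, not the study's.

```python
# Approximation of the S1 analysis: forward stepwise feature selection
# + LDA as a stand-in for SWLDA, scored with five-fold cross-validation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Stand-in features: one row per flash (e.g., downsampled epoch amplitudes
# across channels); labels mark target vs. nontarget flashes.
X = rng.normal(size=(180, 64))
y = rng.integers(0, 2, size=180)
X[y == 1, :8] += 1.0                 # injected target/nontarget difference

lda = LinearDiscriminantAnalysis()
model = make_pipeline(
    SequentialFeatureSelector(lda, n_features_to_select=10,
                              direction='forward'),
    lda,
)

# Five-fold cross-validation, as used for the offline S1 accuracies
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} (+/-{scores.std():.2f})")
```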

Results

All subjects were able to access the Tobii pages using the BCI, making a minimum of three correct selections during Validation (S2), which meets the accepted criterion for BCI use (Sellers et al., 2006). Calibration (S1) accuracy, calculated offline using five-fold cross-validation, and average selections/min were 97 (±8)% and 5.8 (±2.3) (range 1.2-8.0), respectively. The nine subjects who completed Navigation (S3) were able to move back and forth between the Communicator 5 screens, making an average of seven selections to complete a task in which six selections represented an errorless trial (Table 1).

Discussion

These early results indicate that the Tobii Dynavox Communicator 5 software can be accessed using a P300-based BCI. In future work, we will ask BCI home users to trial the full functionality of the software. Future research will address streamlining the system for easier setup and use by caregivers; developing a hybrid system that allows the user to transfer effortlessly between access methods; and developing a truly integrated system that uses both eye tracking and EEG control.

References

Beukelman, D.R., Mirenda, P. (2005) Augmentative and alternative communication: Supporting children and adults with complex communication needs (3rd ed.). Baltimore: Paul H. Brookes.

Farwell, L.A., Donchin, E. (1988) Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and Clinical Neurophysiology, 70:510-523.

Kaufmann, T., Schulz, S.M., Grünzinger, C., Kübler A. (2011) Flashing characters with famous faces improves ERP-based brain-computer interface performance. Journal of Neural Engineering 8(5):056016. doi: 10.1088/1741-2560/8/5/056016.

McCane, L.M., Sellers, E.W., McFarland, D.J., Mak, J.N., Carmack, C.S., Zeitlin, D., Wolpaw, J.R., & Vaughan, T.M. (2014). Brain-computer interface (BCI) evaluation in people with amyotrophic lateral sclerosis. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, 15(3-4), 207-215.

National Joint Committee for the Communication Needs of Persons with Severe Disabilities (NJC). 2017. Augmentative and alternative communication. Retrieved from http://www.asha.org/njc/aac/

Schalk, G., McFarland, D.J., Hinterberger, T., Birbaumer, N., Wolpaw, J.R. (2004) BCI2000: A general-purpose brain-computer interface (BCI) system. IEEE Transactions on Biomedical Engineering, 51(6):1034-1043.

Scherer, M.J. (2005) Living in the state of stuck: How assistive technology impacts the lives of people with disabilities (4th ed.). Cambridge, MA: Brookline Books, pp 122-149.

Sellers, E.W., Donchin, E. (2006) A P300-based brain-computer interface: initial tests by ALS patients. Clinical Neurophysiology, 117(3):538-48.

Sellers, E.W., Krusienski, D.J., McFarland, D.J., Vaughan, T.M., Wolpaw, J.R. (2006) A P300 event-related potential brain-computer interface (BCI): the effects of matrix size and interstimulus interval on performance. Biological Psychology, 73(3):242-252.

Sellers, E.W., Vaughan, T.M., Wolpaw, J.R. (2010) A brain-computer interface for long-term independent home use. Amyotrophic Lateral Sclerosis, 11:449-455.

Tobii Dynavox. (2016). Communicator 5. Retrieved from http://www.tobiidynavox.com/communicator5/

Townsend, G., LaPallo, B.K., Boulay, C.B., Krusienski, D.J., Frye, G.E., Hauser, C.K., Schwartz, N.E., Vaughan, T.M., Wolpaw, J.R., Sellers, E.W. (2010) A novel P300-based brain-computer interface stimulus presentation paradigm: moving beyond rows and columns. Clinical Neurophysiology. 121(7):1109-20.

Vaughan, T.M., McFarland, D.J., Schalk, G., Sarnacki, W.A., Krusienski, D.J., Sellers, E.W., Wolpaw, J.R. (2006) The Wadsworth BCI Research and Development Program: at home with BCI. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 14(2):229-233.

Wolpaw, J.R., Wolpaw, E.W. (2012) Something new under the sun. In: Wolpaw, J.R., Wolpaw, E.W. (Eds.), Brain-Computer Interfaces: Principles and Practice. New York, NY: Oxford University Press, pp 1-13.

Wolpaw, J.R., Bedlack, R.S., Ringer, R.J., Reda, D.J., Hill, K.J., Banks, P.G., Vaughan, T.M., Shi, H., Heckman, S.M., Carmack, C.S., Winden, S., McCane, L.M., Ruff, R.J. (2013) A clinical demonstration of EEG-based brain-computer interface (BCI): Long-term independent use of a P300-based BCI. Program number J45.55, 2013 Neuroscience Meeting Planner, San Diego, CA.

Acknowledgements

National Institutes of Health, National Institute of Biomedical Imaging and Bioengineering (NIBIB), grant number EB018783.