An augmented-reality fNIRS-based brain-computer interface: a proof-of-concept study

Augmented reality (AR) enhances the user’s environment by projecting virtual objects into the real world in real time. Brain-computer interfaces (BCIs) are systems that enable users to control external devices with their brain signals. BCIs can exploit AR technology to interact with the physical and virtual world and to explore new ways of displaying feedback. This allows users to perceive and regulate their brain activity, or to shape their communication intentions, while operating in the physical world. In this study, twelve healthy participants were introduced to and asked to choose between two motor-imagery tasks: mental drawing and interacting with a virtual cube. Participants first performed a functional localizer run, which was used to select a single fNIRS channel for decoding their intentions in eight subsequent choice-encoding runs. In each run, participants were asked to select one choice from a six-item list. A rotating AR cube was displayed on a computer screen as the main stimulus; each face of the cube was presented for 6 s and represented one choice from the six-item list. For five consecutive trials, participants were instructed to perform the motor-imagery task when the face of the cube that represented their choice was facing them (thereby temporally encoding the selected choice). At the end of each run, participants were provided with the decoded choice based on a joint analysis of all five trials. If the decoded choice was incorrect, the participant applied an active error-correction procedure. The choice list provided in each run was based on the decoded choice of the previous run. The experimental design allowed participants to navigate twice through a virtual menu that consisted of four levels if all choices were correctly decoded. Here we demonstrate for the first time that by using AR feedback and flexible choice encoding in the form of search trees, we can increase the degrees of freedom of a BCI system. We also show that participants can successfully navigate through a nested menu and achieve a mean accuracy of 74% using a single motor-imagery task and a single fNIRS channel.
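To make the temporal choice-encoding scheme concrete, the sketch below shows one way a single run could be decoded from the one selected fNIRS channel: six faces are shown for 6 s each, repeated over five trials, and the face window with the largest accumulated signal change is taken as the participant's choice. This is a minimal illustration, not the authors' actual analysis pipeline; the sampling rate, the use of a simple window mean, and the absence of hemodynamic-delay and baseline correction are assumptions made only for this example.

```python
"""Minimal sketch of decoding one choice-encoding run (illustrative only)."""
import numpy as np

FS = 10.0          # assumed fNIRS sampling rate in Hz (not stated in the record)
FACE_DURATION = 6  # seconds per cube face (from the abstract)
N_FACES = 6        # six-item list per run
N_TRIALS = 5       # five consecutive cube rotations per run


def decode_choice(channel_signal: np.ndarray, run_onset_s: float = 0.0) -> int:
    """Return the index (0-5) of the face with the strongest mean response.

    channel_signal: 1-D array, e.g. HbO concentration changes of the single
    channel selected in the functional-localizer run.
    """
    samples_per_face = int(FACE_DURATION * FS)
    scores = np.zeros(N_FACES)

    for trial in range(N_TRIALS):
        for face in range(N_FACES):
            start = int(run_onset_s * FS) + (trial * N_FACES + face) * samples_per_face
            window = channel_signal[start:start + samples_per_face]
            # Joint analysis across the five trials: accumulate the mean
            # signal change observed while this face was shown.
            scores[face] += window.mean()

    return int(np.argmax(scores))


if __name__ == "__main__":
    # Synthetic example: simulate a run where face 3 was the intended choice.
    rng = np.random.default_rng(0)
    n_samples = int(N_TRIALS * N_FACES * FACE_DURATION * FS)
    signal = rng.normal(0.0, 0.1, n_samples)
    samples_per_face = int(FACE_DURATION * FS)
    for trial in range(N_TRIALS):
        start = (trial * N_FACES + 3) * samples_per_face
        signal[start:start + samples_per_face] += 0.5  # imagined motor task
    print("Decoded face index:", decode_choice(signal))  # expected: 3
```

In the study itself, eight such runs were chained into a four-level menu traversed twice, with each run's six-item list determined by the choice decoded in the previous run; the sketch only covers the within-run decoding step.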

Identifier
DOI https://doi.org/10.34894/DF83FF
Metadata Access https://dataverse.nl/oai?verb=GetRecord&metadataPrefix=oai_datacite&identifier=doi:10.34894/DF83FF
Provenance
Creator Benitez-Andonegui, Amaia; Burden, R.; Benning, R.; Möckel, R.; Lührs, M.; Sorger, B.
Publisher DataverseNL
Contributor Benitez, Amaia; Faculty Data Manager FPN
Publication Year 2020
Rights CC0-1.0; info:eu-repo/semantics/restrictedAccess; http://creativecommons.org/publicdomain/zero/1.0
OpenAccess false
Contact Benitez, Amaia (Maastricht University); Faculty Data Manager FPN (Maastricht University)
Representation
Resource Type Dataset
Format application/vnd.openxmlformats-officedocument.spreadsheetml.sheet; application/zip; text/plain
Size 14320; 139538604; 1357
Version 2.0
Discipline Agriculture, Forestry, Horticulture, Aquaculture; Agriculture, Forestry, Horticulture, Aquaculture and Veterinary Medicine; Life Sciences; Social Sciences; Social and Behavioural Sciences; Soil Sciences