Summary: A newly designed dry sensor that can measure brain activity may one day enable mind control of robotic systems.
Source: American Chemical Society
It sounds like science fiction: don a special, electronic headband and control a robot using your mind. But a recent study published in ACS Applied Nano Materials has taken a step toward making that a reality.
By designing a special, 3D-patterned structure that doesn’t rely on sticky conductive gel, the team created “dry” sensors that can measure the brain’s electrical activity, even through hair and over the bumps and curves of the scalp.
Doctors monitor the electrical signals from the brain with electroencephalography (EEG), in which specialized electrodes are either implanted in the brain or placed on the surface of the head. EEG helps diagnose neurological disorders, but it can also be incorporated into a “brain-machine interface,” which uses brain waves to control an external device such as a prosthesis, robot, or even a video game.
Most non-invasive versions use “wet” sensors, which are stuck to the head with a gloopy gel that can irritate the scalp and sometimes cause an allergic reaction.
As an alternative, researchers are developing “dry” sensors that don’t require gel, but so far none have worked as well as the gold-standard wet variety.
Although nanomaterials such as graphene may be a suitable alternative, their flat and generally flaky nature makes them incompatible with the uneven curvature of the human head, especially over long periods of time. So, Francesca Iacopi and colleagues wanted to develop a 3D, polycrystalline graphene-based sensor that could accurately monitor brain activity without any sticky gel.
The team created several 3D graphene-coated structures with different shapes and patterns, each 10 µm thick. Of the shapes tested, a hexagonal pattern worked best on the curved, hairy surface of the occipital region—the base of the head where the brain’s visual cortex is located.
The team incorporated eight of these sensors into an elastic headband that held them against the back of the head. When paired with an augmented reality headset displaying visual cues, the electrodes could detect which cue was being viewed, and a computer then translated the signals into commands that controlled the motion of a four-legged robot, completely hands-free.
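The article does not describe the decoding pipeline, but steady-state visual paradigms like this are commonly decoded by finding which cue's flicker frequency dominates the EEG spectrum. The sketch below is a minimal, hypothetical illustration in Python; the sampling rate, flicker frequencies, command mapping, and simulated signal are all assumptions for illustration, not details from the study.

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs):
    """Return the index of the stimulus frequency with the strongest
    spectral power in an EEG segment (hypothetical pipeline)."""
    # Hann window reduces spectral leakage before the FFT
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the FFT bin nearest each candidate flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return int(np.argmax(powers))

# Hypothetical mapping of detected cues to robot commands
COMMANDS = ["forward", "backward", "left", "right"]

fs = 250                                # sampling rate in Hz (assumed)
stim_freqs = [8.0, 10.0, 12.0, 15.0]    # cue flicker frequencies (assumed)
t = np.arange(0, 2.0, 1.0 / fs)

# Simulated 2 s segment: a 10 Hz response buried in noise
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(len(t))

idx = classify_ssvep(eeg, fs, stim_freqs)
print(COMMANDS[idx])                    # prints "backward" (the 10 Hz cue)
```

Real systems use more robust detectors (e.g., canonical correlation analysis over multiple channels and harmonics), but the peak-picking idea above captures the basic principle of mapping a viewed cue to a command.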
Although the new electrodes do not yet work as well as wet sensors, the researchers say this work is a first step toward developing robust, easily implemented dry sensors that could help expand brain-machine interface applications.
Funding: The authors acknowledge funding from the Australian Government’s Defence Innovation Hub and support from the Australian National Fabrication Facility at the University of Technology Sydney and the Research and Prototype Foundry at the University of Sydney Nano Institute.
About this robotics research news
Author: Katie Cottingham
Source: American Chemical Society
Contact: Katie Cottingham – American Chemical Society
Image: Adapted from ACS Applied Nano Materials, 2023, DOI: 10.1021/acsanm.2c05546
Original Research: Open access.
“Noninvasive Sensor for Brain-Machine Interface Based on Micropatterned Epitaxial Graphene” by Sheikh Naeem Faisal et al. ACS Applied Nano Materials
Noninvasive Sensor for Brain-Machine Interface Based on Micropatterned Epitaxial Graphene
The availability of accurate and reliable dry sensors for electroencephalography (EEG) is essential to enable large-scale deployment of brain-machine interfaces (BMIs). However, to date, dry sensors have performed worse than gold-standard Ag/AgCl wet sensors.
Performance degradation with dry sensors is more pronounced when monitoring signals from hairy and curved areas of the scalp, which require the use of bulky and uncomfortable acicular sensors.
This work demonstrates a three-dimensional micropatterned sensor based on subnanometer-thick epitaxial graphene that detects EEG signals from challenging occipital regions of the scalp.
The occipital region, related to the brain’s visual cortex, is key to implementing BMIs based on simple steady-state visually evoked potential paradigms.
Patterned epitaxial graphene sensors show efficient skin contact with low impedance and achieve a signal-to-noise ratio comparable to that of wet sensors.
Using these sensors, we demonstrated hands-free communication with a quadruped robot through brain activity.