As technology learns to incorporate tactile feedback, recreate bodily sensations and talk directly to our brains, HCI is positioning itself at the vanguard of a better, brighter and more inclusive world.
Two years ago, my mom lost most of her sight. It happened suddenly: her left retina detached while she was waiting for the bus, which somehow prompted the cataracts in her right eye to worsen rapidly. It was an instant that changed our lives forever, and as the creeping darkness began engulfing the things my mom loved most (reading, driving, caring for her garden), she told me that for the first time in her life, the world scared her. I found myself back home, doing something I hadn’t done in a while: reading a whole lot of science fiction novels.
One of the books I had been recommended was Ernest Cline’s fabulous debut “Ready Player One.” In it, the main character, a high-school student called Wade, begins his adventure in the virtual world OASIS (Ontologically Anthropocentric Sensory Immersive Simulation) owning a standard immersion rig: a pair of virtual reality goggles and a pair of [haptic gloves](https://en.wikipedia.org/wiki/Haptic_technology). As he dives deeper into his journey, full-body immersion suits and form-fitting devices (even smell replicators) become his new norm.
The book marked my first serious encounter with haptic interfaces and defined, tinted perhaps by the situation, my subsequent interest in some of HCI’s latest developments and their applications for accessibility. Cline’s book was my pop-culture-fueled escape from the world. The same world to which my mom, in the next room, was desperately clinging. She smiled again, for the first time, when we installed an audiobook app on her tablet. I soon understood that technology, a threatening prospect before, was now her bridge.
Haptic and Brain-Computer Interfaces: What They Are and How They Work
The Interactor force-feedback vest, manufactured by Aura Systems, Inc., takes an audio signal from a game console or PC, processes it, and converts the lower frequencies into vibrations, jolts, and other similar physical sensations. Image: Wikimedia.
Haptics is an area of Human-Computer Interaction that allows users to receive feedback through tactile or bodily sensations. The technology uses purpose-built sensors that send electrical signals based on movements or interactions. A computer interprets the signal and, in turn, sends a signal back to the user’s body, usually through an input/output device such as a joystick or a data glove (also known as a wired glove).
Haptic communication — also called kinesthetic communication — recreates the sense of touch by applying vibrations, forces or motions to the user. When it comes to touch, most researchers distinguish between cutaneous, kinesthetic and haptic perception (haptic is usually associated with active touch, as opposed to passive feeling). One example of a haptic interface in use would be a person picking up a tennis ball with a data glove in a virtual reality environment. The computer senses the hand’s movement and moves the virtual ball in the display accordingly, while the glove’s actuators let the user ‘feel’ the ball in their hand.
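The tennis-ball example above boils down to a simple loop: read the glove’s position, check whether the hand touches the virtual object, and command the actuators accordingly. Here is a minimal sketch of one tick of that loop; all the names and numbers (ball position, radius, the intensity formula) are illustrative assumptions, not any real glove’s API:

```python
import math

# Hypothetical virtual scene: a tennis ball fixed at the origin.
BALL_CENTER = (0.0, 0.0, 0.0)  # metres
BALL_RADIUS = 0.033            # roughly a tennis ball's radius

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def haptic_feedback(fingertip_pos):
    """Return a vibration intensity (0.0-1.0) for one glove fingertip.

    If the fingertip penetrates the virtual ball's surface, the feedback
    grows with penetration depth, so the user 'feels' the surface."""
    depth = BALL_RADIUS - distance(fingertip_pos, BALL_CENTER)
    if depth <= 0:
        return 0.0                         # no contact, no feedback
    return min(1.0, depth / BALL_RADIUS)   # deeper contact, stronger buzz

# One tick: sensor reading in, actuator command out.
print(haptic_feedback((0.02, 0.0, 0.0)))   # fingertip inside the ball
print(haptic_feedback((0.10, 0.0, 0.0)))   # fingertip in empty space
```

A real system would run this loop hundreds of times per second and render force (not just vibration), but the sensor-compute-actuate shape stays the same.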
Brainwave (or Brain-Computer) interfaces, on the other hand, create a direct communication pathway between a wired brain and an external device.
Although research began in the 1970s (with UCLA at the vanguard, coining the expression “brain-computer interface” for the first time), it wasn’t until the 1990s that neuroprosthetic devices were implanted in humans. These devices, aided by the brain’s incredible cortical plasticity, helped restore damaged hearing, sight and movement. (BCIs are implanted in the person’s grey matter during neurosurgery. In 1978, 68 electrodes were implanted onto a man’s visual cortex, successfully producing the sensation of “seeing light”!)
Visually impaired users are perhaps among those who could benefit the most from haptic and brainwave technologies. However, finding a workaround for all the optical information the human brain processes sounded, at best, challenging.
That didn’t stop a team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) from trying. Why, if we can count steps with a phone, can’t we do better than a walking stick? Determined to give the puzzle a go, they created a system that helps visually impaired users navigate their way through a physical environment.
They opted for haptic technologies with the assistance of artificial intelligence: a depth camera, a vibration belt, an embedded computer and a refreshable Braille display. The system works through its sensors: it can identify free walkable spaces, recognize everyday objects, and communicate with the user through vibrating motors mounted on a belt. The Braille display is used for transmitting finer-grained information — actual encoded words the user can read.
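The core idea — map what the depth camera sees onto the belt’s motors — can be sketched in a few lines. This is a toy illustration under my own assumptions (five motors, a flat horizontal depth scan, a fixed safety distance); none of these names or thresholds come from the MIT system itself:

```python
# Hypothetical parameters, not from the CSAIL prototype.
SAFE_DISTANCE = 1.5  # metres; anything closer counts as an obstacle
NUM_MOTORS = 5       # vibration motors across the belt, left to right

def motor_commands(depth_scan):
    """Map a left-to-right list of depth readings (metres) onto belt
    motors: a motor vibrates (1) when its sector contains an obstacle."""
    sector = len(depth_scan) // NUM_MOTORS
    commands = []
    for m in range(NUM_MOTORS):
        readings = depth_scan[m * sector:(m + 1) * sector]
        blocked = min(readings) < SAFE_DISTANCE
        commands.append(1 if blocked else 0)
    return commands

# A wall on the user's right: only the rightmost motors vibrate.
scan = [3.0] * 6 + [0.8] * 4
print(motor_commands(scan))  # -> [0, 0, 0, 1, 1]
```

The real system adds object recognition and free-space detection on top, but the principle is the same: spatial information in, a spatial pattern of vibration out.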
You can see it in action here:
This is just one example of how blind or visually impaired people can benefit from intelligent user interfaces and experience the world in unprecedented ways.
These types of wearable devices go beyond the ability to find safe paths. What they offer is the possibility of better understanding (and interacting with) the world around us.
Brain-Computer Interaction provides a way to measure neuron activity directly and translate it into information or action. A person who cannot move their body, for example, can imagine a ball rolling in a certain direction and see a cursor do the same on a screen. Thanks in large part to electroencephalography (EEG) sensors, which work through an array of electrodes fastened to the scalp and measure the strength of the electric field at each spot, we have seen a steady advance in brain-computer technologies in recent years.
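To make the cursor example concrete: one classic (and much-simplified) approach relies on the fact that imagining a movement changes the signal power recorded over the opposite side of the motor cortex. The sketch below is illustrative only — real BCIs use calibrated band-pass filters and trained classifiers, and the electrode names (C3, C4) are the standard EEG positions over the left and right motor cortex:

```python
# Toy decoder under simplified assumptions; not a clinical algorithm.
def band_power(samples):
    """Mean squared amplitude of one EEG channel over a short window."""
    return sum(s * s for s in samples) / len(samples)

def decode_direction(c3_window, c4_window):
    """Imagining right-hand movement suppresses activity over the LEFT
    motor cortex (electrode C3), and vice versa, so the quieter side
    indicates the imagined direction."""
    return "right" if band_power(c3_window) < band_power(c4_window) else "left"

# Fake windows: C3 quieter than C4, so the decoder moves the cursor right.
print(decode_direction([0.1, -0.2, 0.1], [0.9, -1.1, 0.8]))  # -> right
```

Crude as it is, this is the same principle that lets a paralyzed user steer a cursor by thought alone: measure, classify, act.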
Haptics changes lives, but BCI makes it possible to truly imagine a brighter, broader and better future. Ask BrainGate, a consortium of researchers trying to give humans the ability to sense, control, and communicate with the outside world through the power of thought.
BrainGate’s practical, groundbreaking medical devices restore communication, mobility and independence to people affected by neurological disease, paralysis, or limb loss. This is so revolutionary, even science fiction needs to catch up a bit:
The electrode gear is safe, cheap and, for now, best suited to applications that only need to distinguish between sharply contrasting thoughts, such as left vs. right or up vs. down. Precision and speed are still an issue, but it’s just a matter of time until these systems make the jump towards restoring actual function (they are mapping out, after all, artificial workarounds for nervous system malfunctions caused by disease or accident).
The field is, luckily, progressing very quickly. BrainGate has already allowed people with spinal cord injury, ALS and brainstem stroke to control a computer cursor simply by thinking about the movement of their own paralyzed hand and arm. You can check their clinical trials here.
As fascinating as these developments are, it’s not difficult to begin worrying about the darker possibilities of haptic and brainwave technologies. And to a certain extent, we need to do so. As we learn to control things directly from the cortex, understand how sensation and action are coordinated, and even develop exoskeletons to become fully functional cyborgs, we need to understand the risks as much as the benefits (if not more!)
One important consideration is patients’ as well as caregivers’ expectations about recovering motor functions with BCI. These expectations should be reasonable in order to avoid causing psychological harm when desires and intentions are not completely realized. Walter Glannon’s “Ethical issues with brain-computer interfaces” addresses this and other issues, such as the differences in invasiveness and benefit-risk ratios for different types of electrodes.
The Center for Responsible Brainwave Technologies (CeReB) suggests three core principles, or reminders of our shared moral duty, when creating these types of technology:
Respect for persons, concern for welfare, and justice.
The proliferation of new interfaces without a common set of safety and privacy standards (as well as agreements on personhood, autonomy, responsibility and justice) constitutes a concern that must be proactively addressed. This is also paramount when it comes to BCI devices for the purpose of entertainment and gaming.
We need to ensure a dialogue among a broad diversity of perspectives, and make sure that the solutions to come are well-considered, financially achievable, meaningful, and effective. BCI, haptic technologies and similar developments should respect the individual’s integrity and autonomy, and their right to be safe and informed, under a shared ethical framework and practices.
BCI research and its potential translation to therapeutic intervention is a leap forward in our common desire for a fairer, better world. We just need to make sure we have practical solutions to our ethical challenges.
Yisela Alvarez Trentini lives and works in Frankfurt am Main building useful things. You should follow her on Twitter