Between humanity and technology

What makes us human? The UW Center for Neurotechnology partners with the Philosophy Department to examine the ethics of tomorrow.

Joan, an Army specialist, is rumbling down a remote gravel road, the hulking mountains of central Afghanistan barely visible through the dust kicked up by her convoy.

The next moment, the Humvee is nearly torn in half by an improvised explosive device.

It’s not until she wakes up later in a military hospital that reality hits her: She has lost her arm above the elbow, and her life is changed forever.

After months of rehab in the U.S., Joan participates in a clinical trial for a cutting-edge robotic arm controlled by a brain-computer interface (BCI). A chip implanted in her brain and a network of electrodes comb through the electric chatter, using artificial intelligence (AI) to decode Joan’s intentions, sending wireless signals to her robotic arm that enable her to perform basic functions.

Joan gets so used to the prosthetic that it’s no longer just a tool: she feels it has become an actual part of her.

Life begins to settle down again.

Joan is driving when her left arm jerks unexpectedly, causing the car to swerve off the road and plow through her neighbor’s fence.

Was it a malfunction? A disagreement between her “true” intentions and the device’s AI? Joan feels confused, guilty and alienated from her body, and from her identity.

Who is at fault? Joan, or the software and hardware in her body?

Should her neural data be analyzed by lawyers to determine blame? Does she have any right to personal privacy?

Where does humanity end and technology begin?

Raising the questions

Joan’s dilemma, devised by philosophers at the Center for Neurotechnology at the UW, is hypothetical, but it could one day be reality. Neurotechnology, the intersection of neuroscience, technology and computing, has brought within reach treatments and technology for some of the human body’s most vexing problems, including spinal cord injury, stroke, Parkinson’s disease and more. But with those treatments come many areas of potential ethical conflict, and the center aims to raise important questions and awareness before design decisions are already entrenched.

Based at the UW with core partners at MIT and San Diego State University, the Center for Neurotechnology was established in 2011 as an Engineering Research Center funded by the National Science Foundation. And from the very beginning, ethics has been one of its cornerstones.

The center’s neuroethics research is led by Sara Goering, a UW philosophy professor with a background in disability studies and bioethics, and Eran Klein, a neurologist and UW affiliate professor. Center members from the Department of Philosophy work closely with neuroscientists, engineers, doctors and industry professionals to develop effective technology in the most ethical ways.

But Goering and Klein aren’t necessarily here to give answers. They help identify important questions for researchers, industry and our society to grapple with before it’s too late.


Klein gives an example of a time when technology raced past our attention to the ethical consequences: “How were we thinking about privacy in the ’90s, as the internet was being developed? We could’ve been thinking about the costs and benefits of giving up all our data. It would’ve been nice to have that conversation up front.”

Goering summarizes with a punch: “People may want ‘normal’ function. But not at any cost.”

The ethics team aims to ensure that disability perspectives are integrated into the design process at the earliest stages. They also work directly with research participants who are testing the technology of the future, learning more about what works, what doesn’t, and what concerns users may have.

Tim Brown, who was a philosophy doctoral student of Goering’s, was at the forefront of one such project at the UW, as an embedded ethicist in Electrical and Computer Engineering Professor Howard Chizeck’s BioRobotics Lab. It was a deep collaboration that saw Brown working alongside engineering researchers every day, both in the lab and in pilot studies.


This technology is personal

As electrical engineers monitor the dips and spikes dancing on a computer monitor, Brown asks study participant Fred Foy to touch his own nose, then Brown’s finger, then his nose again, then Brown’s finger again. Known as the Fahn-Tolosa-Marin rating scale, it’s a way for researchers to gauge the severity of essential tremor, a condition that affects 7 million Americans.

Foy is in his 80s. He walks with a cane, likes to stay up reading magazines and is going on a date after his appointment. He also has an electrode implanted in his brain. Without it, his hands tremble constantly, making it difficult to do basic tasks like drinking water.

Normally, Foy’s implant uses open-loop deep-brain stimulation (DBS) to treat his condition. Tiny electrical pulses are delivered to Foy’s brain, reducing his tremors to a more manageable level. But “open loop” means it’s always on, so potential side effects, including trouble with speaking and balance, can also be constant.

Center researchers at the BioRobotics Lab are working on a solution. Their next evolution of deep-brain stimulation, known as closed-loop DBS, uses machine learning to sense incoming tremors and toggle on and off as needed, prolonging the life of the battery (which requires surgical replacement) and reducing side effects.
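The open- versus closed-loop distinction is, at its core, a control-loop idea: a closed-loop device senses a signal and decides for itself when to act. As a minimal, purely illustrative sketch (the function name, the fixed thresholds and the notion of a single "tremor score" are all invented here; the real system uses machine learning to detect tremor, not a fixed cutoff), the toggling logic might look like:

```python
# Illustrative sketch only: hypothetical closed-loop toggling logic.
# All names and thresholds are invented for illustration and are not
# the actual Medtronic/UW algorithm, which uses machine learning.

def closed_loop_step(tremor_score: float, stimulating: bool,
                     on_threshold: float = 0.7,
                     off_threshold: float = 0.3) -> bool:
    """Decide whether stimulation should be on for the next interval.

    Two thresholds (hysteresis) keep the device from rapidly
    flipping on and off around a single cutoff.
    """
    if not stimulating and tremor_score >= on_threshold:
        return True   # tremor detected: turn stimulation on
    if stimulating and tremor_score <= off_threshold:
        return False  # tremor has subsided: turn stimulation off
    return stimulating  # otherwise, keep the current state

# Example: a tremor score that rises, then falls
stim = False
history = []
for score in [0.1, 0.5, 0.8, 0.6, 0.4, 0.2]:
    stim = closed_loop_step(score, stim)
    history.append(stim)
print(history)  # [False, False, True, True, True, False]
```

The two-threshold hysteresis in this sketch is one way a device could avoid on/off chatter, which is what conserves battery life and limits side effects compared with always-on, open-loop stimulation.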

In partnership with medical device manufacturer Medtronic, the creation of this first in-human closed-loop DBS system was led by Jeffrey Herron, ’16, a UW assistant professor of neurological surgery and center faculty member who earned his Ph.D. at the UW. And Foy is one of the first to test this technology.

Engineering researchers temporarily upload new algorithms to Foy鈥檚 device, turning it from open- to closed-loop. Then Brown helps run speech and mobility tests.

Foy draws a spiral, which is much smoother with closed-loop DBS than without. He capably lifts a bottle as if taking a drink of water, a task made challenging by his essential tremor but made easier by closed-loop DBS. When he’s asked to name as many animals as he can think of in 30 seconds, the device automatically reduces stimulation, making it easier for Foy to speak.

Then Brown asks Foy deeper questions about his experience.

Have your moods, personality, thoughts or behaviors changed because of your device?

Have you ever felt that your actions were not your own because of the device?

Do you feel a stigma associated with this device?

The conversation soon turns to cyborgs, and Foy’s response is a frank reminder that this technology is personal:

“I don’t like the term ‘cyborg.’ I’m me, and this is a tool that helps me. That’s it.”

More than engineering

For four years, Brown had a desk in the BioRobotics Lab. Chizeck had specifically asked the center for an embedded ethicist.

“If you’re going to modify someone’s brain function and capabilities, there’s more to think about than just the hard engineering,” Chizeck says. “I wanted an ethicist in the lab, someone trained in the literature of ethics. I didn’t think there was any other way to do it.”

By being there, surrounded by electrical and computer engineering grad students, remote-control surgical robots, brain-computer interface devices, hacker magazines and stacks of empty coffee cups, Brown could talk with engineers as they went about their day-to-day work, helping them address concerns and familiarizing them with complex ethical concepts.

“If you don’t have a desk in the laboratory, you miss out entirely,” Brown says. “It allowed people to drop by and ask me random questions that popped into their heads. Big, tough questions.”

Brown earned his Ph.D. in December, but he continues to work with the UW neuroethics group as an NIH postdoctoral scholar. His current project, run by Goering and Klein, is to help create a conceptual map of how different types of brain-computer interfaces (BCIs) affect agency: a person’s sense of control over their own actions.

Not every BCI device impinges on a user’s sense of agency. But, for instance, if a person is fully paralyzed, and artificial intelligence helps “read” their brain activity and take action, they might feel they’ve traded in a significant amount of agency for technology that helps them accomplish daily tasks.

Fundamental questions

Goering, Klein and Brown are proud of how the Center for Neurotechnology is baking a philosophical perspective into the technology of the future. The center develops tools and ethical guidelines for this technology and shares them with an international audience.

There’s much to feel good about. Brown says he’s seeing more partnerships between ethicists and technological innovators at the UW and other universities. Philanthropic support may unlock even more important research and collaboration. And BCIs have tremendous potential to improve people’s lives. He notes, though, that there is still a lot to explore:

“We have an opportunity to answer some really fundamental questions about the way humans operate, questions that have deep implications for how we think we should be. How do we engage with technology? What is the boundary between humans and technology? And what makes us, us?”

Originally published May 2020

What you care about can change the world

The University of Washington is undertaking its most ambitious campaign ever: Be Boundless – For Washington, For the World. When you support the Center for Neurotechnology Neuroethics Fund, you can help us develop effective technology in the most ethical ways.