Sunday, 10 December 2017

BioEdge: Ethical standards urgently needed for neurotechnology, say researchers and ethicists

A group of researchers and ethicists delivered a warning in Nature in November about the dangers of neurotechnology and AI (sorry, guys, we missed this earlier). The Morningside Group, headed by Columbia University neuroscientist Rafael Yuste, claims that existing ethical standards have been outpaced by galloping technology:

we are on a path to a world in which it will be possible to decode people's mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions and decisions; where individuals could communicate with others simply by thinking; and where powerful computational systems linked directly to people's brains aid their interactions with the world such that their mental and physical abilities are greatly enhanced.
Most research in the area currently involves medical applications, such as helping people with brain and spinal cord damage. But the technology will soon have commercial and military applications, which raise important ethical issues:

the technology could also exacerbate social inequalities and offer corporations, hackers, governments or anyone else new ways to exploit and manipulate people. And it could profoundly alter some core human characteristics: private mental life, individual agency and an understanding of individuals as entities bound by their bodies.
Hence the Morningside scholars propose four ethical principles to be incorporated into ethical codes and legislation.

Privacy and consent. Most people are already hooked up to the internet through their smartphones, which can collect huge amounts of revealing data. Google says that people touch their phones 1 million times a year. “We believe that citizens should have the ability — and right — to keep their neural data private,” assert the authors. Opting out of sharing this data should be the default choice on devices. The transfer of neural data should be regulated like organ donation to prevent a market from developing.

Agency and identity. “Neurotechnologies could clearly disrupt people's sense of identity and agency, and shake core assumptions about the nature of the self and personal responsibility — legal or moral.” Therefore “neurorights” should be protected by international treaties. Consent forms should warn patients about the risk of changes in mood, sense of self and personality.

Augmentation. Neurotechnology could allow people to radically increase their endurance or intelligence, creating discrimination and changes in social norms. The researchers urge that “guidelines [be] established at both international and national levels to set limits on the augmenting neurotechnologies that can be implemented, and to define the contexts in which they can be used — as is happening for gene editing in humans.”

Bias. Research has shown that bias can be incorporated into AI systems, and can be devilishly hard to eliminate. The Morningside Group recommends that “probable user groups (especially those who are already marginalized) have input into the design of algorithms and devices as another way to ensure that biases are addressed from the first stages of technology development.”

Within academia these proposals may seem like Ethics 101. But history shows that once devices are commercialised, ethics are easily forgotten:

History indicates that profit hunting will often trump social responsibility in the corporate world. And even if, at an individual level, most technologists set out to benefit humanity, they can come up against complex ethical dilemmas for which they aren't prepared. We think that mindsets could be altered and the producers of devices better equipped by embedding an ethical code of conduct into industry and academia.

There has been so much “inappropriate behaviour” in the press over the past two months, ever since movie mogul Harvey Weinstein was exposed as a serial molester of women, that it deserves its own acronym, IB. IB is a euphemism for a range of appalling actions, like sexual harassment, sexual assault, and rape.

And now, thanks to an Arizona Congressman, we can add surrogacy to the IB list. Trent Franks resigned this week after revelations that he had pressured women on his staff to act as surrogate mothers. It’s not a pretty story, but it helps to expose the exploitative potential of surrogacy, which is so often depicted as generous and life-affirming.

Heads up: the next issue of BioEdge will be the last until January. Hey! Everyone deserves a holiday, even the staff of BioEdge.


Michael Cook
Editor
BioEdge
NEWS THIS WEEK
by Michael Cook | Dec 09, 2017

A bad hair day for pro-life politicians in Congress
20% higher among women who are currently using or have recently used them
After nearly 10 years, a tragic case may be drawing to a close
Otherwise we might end up as puppies for AI, as Elon Musk has warned
10 years or 55 years?
The medical profession must remain ever vigilant
BioEdge
Phone: +61 2 8005 8605
Mobile: 0422-691-615
