13/10/2016 - Events, News
Westworld. What do philosophers and psychologists have to say?
Roboethics experts were invited to comment on the recent Westworld series, whose script gives human visitors the chance to kill, rape and abuse android robots. Gianmarco Veruggio, who coined the term roboethics, noted that this kind of violent behaviour is already present in "kill everyone" video games. Philosopher Patrick Lin comments on sex robots.
Patrick Lin, Ph.D., the director of the Ethics and Emerging Sciences Group at California Polytechnic University, commented on Inverse.com about the role of sex robots.
(From Inverse.com) Lin fears that sexbots might make us less human. “The fear is that people will become less in the habit of seeking consent and more into imposing their will and desires on others,” he told Inverse in an e-mail. A Westworld-like scenario presents a slippery slope similar to the one posed by violent films or video games, perhaps even more troubling because sex robots are so “visceral” and “immersive” that they blur the line between virtual and physical reality.
“I don’t think these fears have played out, but robots and virtual reality are much more immersive and visceral than previous technologies, so we don’t really know,” he says.
Sex robots — at least the sophisticated, highly anthropomorphic kind — haven’t been around long enough for us to tell whether they’ll affect our sexual interactions with each other. In her research, Darling is beginning to explore how human-on-robot violence might affect behavior outside the lab, and the preliminary results are less Hobbesian than we feared. In an experiment in 2014, Darling instructed participants to play with a set of Pleos — cute, wide-eyed robot dinosaurs the size of small cats — and subsequently tie them up, strike them, and kill them. Participants, especially strongly empathic ones, afterward described the experience as “disturbing,” says Darling, who took this as evidence that humans are somewhat confused by life-like robots. In follow-up experiments using small bug-like robots called Hexbug Nanos, she observed similar, compassionately hesitant, reactions.
But in Westworld, there doesn’t seem to be any confusion. In one scene, a human couple beams at the death of a swarthily handsome robot they killed in a “shootout,” then poses with its corpse for a photo. Whether they’ll take those behaviors home with them remains to be seen, but Crichton’s assertion — that humans are inherently shitty — suggests they will.
It might seem pessimistic, but the evidence suggests that interactions between humans and robots will simply lack any language of consent.
“Then the question is, if it does have an impact on people’s perceptions of consent and their behavior toward humans, what’s the best way to go about preventing that?” Darling asks. Should we, for example, program sex robots to say no? That depends on what we want from our machines: Sex robots are tools designed to deliver satisfaction. They could deliver an education as well, if that’s something we decide we need.
Some ethicists would argue that they should. “It is important for robots to say ‘no’ to us,” Lin says, noting that it’s the job of a “smart tool” to refuse our orders when it knows better. This already happens all the time: Autonomous cars turn right when a left turn poses danger; nurse robots press critical medications on patients who refuse to take them. A sex robot that demands consent could be useful for stamping out our baser behaviors, in the same way meditation apps train us to stop drifting off, or exercise trackers buzz when we get too lazy.
But if we’re anything like the humans in Westworld, the last thing we’ll want from our sex robots is for them to be didactic. Not only would that defeat their purpose — aren’t they just Fleshlights and vibrators in elaborate casing? — but programming them with consent-seeking behavior would require a difficult admission on our part: that we need to be stopped.
https://www.inverse.com/article/21654-westworld-sex-robot-consent-rape-culture-science