
When Machines Get Mean

November 21, 2019

 (Picture credit: Carnegie Mellon University)


Science has long established that a person’s ability to perform a task is affected by what other people say, but a new study suggests that feedback loop extends to robots as well.

 

Researchers at Carnegie Mellon University gathered 40 study participants and had them play against a robot in “Guards and Treasures,” a game used to study rationality. Each person played the game 35 times, receiving either encouragement from the robot or almost hilariously mild trash talk. (One example: “I have to say you are a terrible player.” Another: “Over the course of the game your playing has become confused.”)

 

Every player understood that their opponent wasn’t sentient. “One participant said, ‘I don’t like what the robot is saying, but that’s the way it was programmed so I can’t blame it,'” said lead author Aaron M. Roth.

 

It’s likely that nobody ran away from the table crying at these robotic criticisms, but all the same, the effects showed up clearly in the gameplay. While every participant’s rationality improved as the rounds went on, those on the receiving end of the snipes played noticeably worse.

 

It’s perhaps not a tremendous surprise to anyone who has ever missed a turn, felt a little hurt when the GPS guide’s “Recalculating” seemed to sound a trifle judgmental, and then promptly missed the next turn too. We are social creatures, and while the logical systems in our brains can tell us we’re dealing with a pre-recorded voice responding to data inputs, our older emotional core never evolved any reason to make that distinction.

 

A shallower study might simply have concluded, “Wow, trash talk really works,” and recommended that we keep our best insults at the ready for the next round of golf or family game night. Of course, that’s not really what the Carnegie Mellon research was about.

 

As study co-author Fei Fang noted, our emotional malleability at the hands of even a relatively unsophisticated AI has troubling implications in this age of heavy internet use. “We can expect home assistants to be cooperative,” said Fang, “but in situations such as online shopping, they may not have the same goals as we do.” We know to be wary of used-car dealers; increasingly, we may also need to develop a wariness around shopping sites designed to prey on our feelings and manipulate us into spending more.
