Are Humans Actually More ‘Human’ than Robots?
Published: November 13, 2017

In a recent report, the Pew Research Center found that Americans are more worried than enthusiastic about automation technologies when it comes to tasks that rely on qualities thought to be unique to humans, such as empathy. They are concerned that robots, lacking certain sensibilities, are fundamentally limited in their ability to replace humans at those jobs; Americans, according to the report, don’t trust “technological decision-making.”
Human drivers don’t seem all that “human” when it comes to thoughtful decision-making, either. Federal fatal-crash data show that while deaths due to distracted or drowsy driving have declined, deaths tied to other reckless behaviors, including speeding, alcohol impairment, and not wearing seatbelts, have continued to increase. Roughly 37,000 traffic deaths last year were attributed to poor decision-making.
Humans aren’t necessarily better than robots at caregiving, either. The American Psychological Association in 2012 estimated that 4 million older Americans—or about 10 percent of the country’s elderly population—are victims of physical, psychological, or other forms of abuse and neglect by their caregivers, and that figure excludes undetected cases.
Nor do humans inherently excel at interpersonal skills. They incessantly use “strategic emotions” (emotions that don’t necessarily reflect how they actually feel) to achieve social goals, protect themselves from perceived threats, take advantage of people, and adhere to work-environment rules. Strategic emotions can help relationships, but they can also harm them if the other person can tell they aren’t genuine.
As an example, Jonathan Gratch, the director of emotion and virtual human research at the University of Southern California’s Institute for Creative Technologies, pointed to customer-service representatives, who tend to follow a script when speaking with people. Because they rarely express genuine emotions, they aren’t, according to Gratch, “really being human.” In fact, these rules surrounding professional conduct make it easier to program machines to do that sort of work, especially when Siri and Alexa are already collecting data on how people talk, such as their intonations and speech patterns. “There’s this digital trace you can treat as data,” he said, referring to the scripts on which customer-service reps rely, “and machines learn to mimic what people do in those tasks.”
Read more in The Atlantic.