The following article originally appeared in the New York Times on Dec. 13, 2019, and has been updated since.
In the article, Jackson worries that robot companions could make the lives of the elderly better, worse, or, oddly, both.
1. Do you agree with any of her worries? Explain why you agree or disagree. (5%)
2. Do you agree with her concern that care and companion robots might further isolate the elderly, allowing younger people to engage less with older relatives on the assumption that the robots provide companionship? Or do you think there are other, perhaps better or worse, reasons for avoiding or improving such robots? Draw an Implication Table detailing as many expected, unexpected, positive, and negative consequences of robot companions as you can think of. (10%)
3. Given the fatalities at elder care facilities during this pandemic, would robots be a way of ensuring that the elderly continue to get good care, of minimizing the spread of disease, and of protecting staff at care homes by reducing the risk to them (compared with the rest of us) during times of crisis, especially a future pandemic? (15%)
In answering these questions, be sure to draw on as much relevant course material as possible.
Would You Let a Robot Take Care of Your Mother? As the global population ages, robot companions are on the rise. By Maggie Jackson
An aging population is fueling the rise of the robot caregiver, as the devices moving into the homes and hearts of the aging and sick offer new forms of friendship and aid. With the global 65-and-over population projected to more than double by 2050 and the ranks of working age people shrinking in many developed countries, care robots are increasingly seen as an antidote to the burden of longer, lonelier human lives.
Winsome tabletop robots now remind elders to take their medications and go for a walk, while others still at the research-prototype stage can fetch a snack or offer consoling words to a dying patient. Hundreds of thousands of “Joy for All” robotic cats and dogs, designed as companions for older people, have been sold in the U.S. since their 2016 debut, according to the company that makes them. Sales of robots to assist older adults and people with disabilities are expected to rise 25 percent annually through 2022, according to the industry group International Federation of Robotics.
Yet we should be deeply concerned about the ethics of their use. At stake is the future of what it means to be human, and what it means to care.
Issues of freedom and dignity are most urgently raised by robots that are built to befriend, advise and monitor seniors. This is Artificial Intelligence with wide, blinking eyes and a level of sociability that is both the source of its power to help and its greatest moral hazard. When do a robot assistant’s prompts to a senior to call a friend become coercion of the cognitively frail? Will Grandma’s robot pet inspire more family conversation or allow her kin to turn away from the demanding work of supporting someone who is ill or in pain?
“Robots, if they are used the right way and work well, can help people preserve their dignity,” says Matthias Scheutz, a roboticist who directs Tufts University’s Human-Robot Interaction Lab. “What I find morally dubious is to push the social aspect of these machines when it’s just a facade, a puppet. It’s deception technology.”
For that is where the ethical dilemmas begin — with our remarkable willingness to banter with a soulless algorithm, to return a steel and plastic wink. It is a well-proven finding in the science of robotics: add a bit of movement, language, and “smart” responses to a bundle of software and wires and humans see an intentionality and sentience that simply isn’t there. Such “agency” is designed to prime people to engage in an eerie, seeming reciprocity of care.
Social robots ideally inspire humans to empathize with them, writes Maartje de Graaf of the University of Utrecht in the Netherlands, who studies ethics in human-robot interactions. Even robots not designed to be social can elicit such reactions: some owners of the robot vacuum Roomba grieve when theirs gets “sick” (broken) or count them as family when listing members of their household.
Many in the field see the tensions and dilemmas in robot care, yet believe the benefits can outweigh the risks. The technology is “intended to help older adults carry out their daily lives,” says Richard Pak, a Clemson University scientist who studies the intersection of human psychology and technology design, including robots. “If the cost is sort of tricking people in a sense, I think, without knowing what the future holds, that might be a worthy trade-off.” Still he wonders, “Is this the right thing to do?”
We know little about robot care’s long-term impact or possible indirect effects. And that is why it is crucial at this early juncture to heed both the field’s success stories and the public’s apprehensions. Nearly 60 percent of Americans polled in 2017 said they would not want to use robot care for themselves or a family member, and 64 percent predict such care will increase the isolation of older adults. Sixty percent of people in European Union countries favor a ban on robot care for children, older people, and those with disabilities.
But research suggests that many seniors, including trial users, draw a line at investing too much in the charade of robot companionship, fearing manipulation, surveillance, and most of all, a loss of human care. Some worry robot care would carry a stigma: the potential of being seen as “not worth human company,” said one participant in a study of potential users with mild cognitive impairments.