Robotics has wide potential for improving people's quality of life. Developing an ecosystem of service robotics pursues a solution especially to the growing need for social and health care among elderly people. However, utilizing robots in the welfare sector is a world of contradictions, differing attitudes, and balancing between beneficence and nonmaleficence. Do robots eliminate jobs – or rather create new work opportunities? Is a heavy-lifting robot also a safety hazard? Will social robots reduce loneliness or cut off all human contact? Are social robots terminators or careminators?
Social robots are machines that serve people in their everyday lives and are capable of learning in the process. Some carebots remind seniors about eating, drinking and taking medicine, while others are used for games and exercise, supporting mental and physical activity. An assistive robot can work as a monitoring system, raising an alarm if someone falls, or even help people get back up, as the teddy-bear-faced robot Riba can. However, one major concern is whether we can monitor people in the name of safety while sacrificing their privacy along the way.
Some robots like Riba recognize faces and voices and respond to commands. More social robots, such as the humanoid robot Pepper, even have some capacity for dialogue. Pepper doesn't have any skills besides socializing, so it is close to pet-like robots, which are described as either robo-pets or therapeutic robots.
Tele-operated robots are used in interactions between patients and healthcare personnel, for example when performing simple health checks from a distance. These are not social robots per se, but they nevertheless offer social contact via technology – without geographical restrictions. I myself have been testing the telepresence robot Double for a few weeks. Navigating Double around the university draws a lot of attention, from shy glances to enthusiastic conversations. One might ask: is this the way robots are social?
New technologies and social robots are meant to take some of the workload off nurses and free them for tasks that genuinely require a human touch and social interaction. If a hospital robot is gathering laundry, counting pills and calling in stand-in employees, nurses are able to care for their patients. The prospect of increased nurse–patient interaction reminds me of the good old days!
Instead of terminating jobs and replacing employees with robots, the claim is that social robots will be tools for nurses. Robots are meant to assist in strenuous tasks. The emphasis has been on the physical strain of healthcare and on how robotics can be a solution to it. However, the most frequent nursing care activities left undone include comforting or talking to patients and educating patients or their families (Ausserhofer et al., 2013). For obvious reasons, we can't delegate all of the educating or comforting of patients to robots. Still, are there parts we could?
The real question should be whether healthcare professionals find some of their work-related interactions overly laborious – and if they do, whether there could be a technical solution. Depending on the ward and field of work, violence and harassment are things many nurses have to be aware of. What about the strain of emotionally investing in a patient with dementia, who isn't able to remember the encounters or discussions the day after? Applying technology to these kinds of problems raises a lot of ethical questions, which are, and must be, discussed.
In principle, delegating emotional (or any other) work to robots is justified if it increases people's wellbeing, or, on the other hand, if robots are proven to be somehow better than people. Motives guide the things people do, yet robots lack motivation altogether. This very shortcoming might be where robots' competence lies. A lack of motivation means never getting tired or frustrated, and even the purest absence of discrimination. As Erving Goffman pointed out in his 1982 address: objectively based equal treatment of people can only occur when the server is eliminated and a machine is employed instead.
This brings up yet another contradiction. Earlier we wondered how to balance the customer's safety against the demand for privacy. Now we must ask how many of us would be willing to sacrifice true human interaction just to make sure we get treatment of uniform quality and guaranteed equality.
Ausserhofer et al. (2013). Prevalence, patterns and predictors of nursing care left undone in European hospitals: results from the multicountry cross-sectional RN4CAST study. BMJ Quality & Safety: The International Journal of Healthcare Improvement.
Goffman (1983). The interaction order: American Sociological Association, 1982 presidential address. American Sociological Review, 48(1).
Sharkey & Sharkey (2010). Granny and the robots: ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1).