Introduction

Long before the coronavirus disease 2019 (COVID-19) pandemic made 'sheltering at home' and 'physical distancing' well-worn phrases, the number of people living alone for sustained periods of time was already unparalleled in human history. In developed nations, one-person households are ubiquitous, representing over forty percent of households in Scandinavian nations; more than a third in France, Germany and England; and more than a quarter in the U.S., Russia, Canada, Spain and Japan (Klinenberg 2012). During the COVID-19 pandemic, social isolation has dramatically increased, with half the globe's population placed under strict physical distancing orders to prevent the spread of the SARS-CoV-2 virus (Minder et al. 2020). These orders typically require closing schools and "nonessential" businesses, while banning large group gatherings. Although the unintended effects of pandemic-style physical distancing have not been systematically studied, there is genuine cause for concern. Outside of pandemic situations, social isolation and loneliness not only threaten well-being, but represent a major social determinant of health. A 2020 report of the National Academies of Sciences, Engineering, and Medicine, summarizing four decades of research, documents robust evidence linking both social isolation and loneliness to increased risk for premature death, with the strongest findings associated with social isolation (National Academies of Sciences, Engineering, and Medicine 2020). Social isolation predicts all-cause mortality (Leigh-Hunt et al. 2017; Steptoe et al. 2013), matching well-documented clinical risk factors, such as smoking (Pantell et al. 2013), while eclipsing others, such as obesity (Holt-Lunstad et al. 2015). Both social isolation and loneliness are strongly associated with a greater incidence of major psychological, cognitive, and physical morbidities (National Academies of Sciences, Engineering, and Medicine 2020). Socially isolated older people experience worse memory, physical well-being and mental health than those who are not socially isolated (Pantell et al. 2013).

During the COVID-19 pandemic, older populations, who are at the greatest risk of becoming seriously ill and dying from the disease, may face extended quarantine and prolonged physical distancing over and above what is recommended for the general population. In the U.S., eight out of ten reported deaths from COVID-19 have occurred among people age sixty-five and older (Centers for Disease Control and Prevention 2020). Worldwide, evidence shows a strong age gradient in COVID-19 morbidity and mortality (Verity et al. 2020). Frontline physicians involved in the care of older adults describe the "profound isolation" of residents in long-term care facilities who are "prisoners in their one-bedroom homes, isolated from each other and the outside world" (Eghtesadi 2020). During future infectious disease outbreaks involving other pathogens, experts predict that older age groups will face higher risk of morbidity and mortality than the general population, due to age-related decline in immune responses that renders them less able to mount an effective defense (Wu et al. 2020), along with higher rates of underlying chronic disease (Huang et al. 2020).

This paper sets forth an innovative, albeit controversial, response. It proposes reducing the adverse health outcomes wrought by social isolation and loneliness by deploying robots to function as social companions and friends to socially isolated people. Since older people are among the hardest hit by the COVID-19 pandemic, the proposal focuses on older age groups. To date, most of the discussion of roles for robots during the COVID-19 pandemic has focused on other functions, such as decontamination and telemedicine; logistics, such as food delivery and handling of contaminated waste; and reconnaissance, such as monitoring compliance with quarantines (Yang et al. 2020). Diagnostic roles for robots have also garnered attention, including the piloting of a prototype robot to remotely collect nasopharyngeal swabs for testing (Wang et al. 2020); a field hospital in Wuhan, China staffed by robots to relieve healthcare workers (Hornyak 2020; Katz 2020); and research to develop an automated intensive care unit (ICU) with negative-pressure wards equipped with robotic capabilities (Guizzo 2020). Some see the pandemic as a tipping point that will quicken the pace of modernization and deployment of technology, while at the same time cautioning that sustained coordination between government funders, robotics researchers and frontline clinicians is needed to prepare for expected future infectious disease outbreaks (Yang et al. 2020). Yet, in debates about preparing for future infectious disease outbreaks, little mention has been made of the valuable role sociable robots can play in reducing social isolation and loneliness. A technology strategy that includes sociable robots carries distinct advantages. First, sociable robots can be sanitized and offer a safe means of interacting with older people during a pandemic (Armitage and Nellums 2020). Second, sociable robots that leverage recent advances in artificial intelligence (AI) are up to the challenge. They display increasingly sophisticated emotional intelligence; interact in ways that seem lifelike, such as recognizing voices, faces and emotions; interpret speech and gestures; respond appropriately to complex verbal and nonverbal cues; make eye contact; speak conversationally; and adapt to people's needs by learning from feedback, rewards, and criticisms. Increasingly sophisticated AI technologies make it possible for users to establish close rapport and meaningful connections with sociable robots, producing many of the same positive outcomes related to health and happiness that human social interaction affords (Abdi et al. 2018). Smartly designed sociable robots enable older individuals to "have a life" and to realize their continuing aspirations for social and emotional health (World Health Organization 2015). Finally, evidence suggests that older adults generally like and are prepared to form relationships with sociable robots (Pu et al. 2017). Research shows that when older adults form relationships with robots, they generally report better health and well-being (Broekens et al. 2009).
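To make the adaptive capability concrete, the following minimal sketch illustrates one way a sociable robot could learn which interaction styles a particular user rewards, using a simple epsilon-greedy bandit. The interaction styles, reward signal, and parameter values are illustrative assumptions for exposition, not a description of any deployed system.

```python
import random

# Hypothetical interaction styles a companion robot might choose among.
STYLES = ["small_talk", "reminiscence", "music", "news", "games"]

class FeedbackLearner:
    """Epsilon-greedy bandit: adapt to a user by learning from feedback."""

    def __init__(self, styles, epsilon=0.1):
        self.epsilon = epsilon
        self.value = {s: 0.0 for s in styles}  # running mean reward per style
        self.count = {s: 0 for s in styles}

    def choose(self):
        # Mostly exploit the best-known style; occasionally explore others.
        if random.random() < self.epsilon:
            return random.choice(list(self.value))
        return max(self.value, key=self.value.get)

    def update(self, style, reward):
        # Reward might be derived from smiles, verbal praise, or session length.
        self.count[style] += 1
        self.value[style] += (reward - self.value[style]) / self.count[style]

if __name__ == "__main__":
    learner = FeedbackLearner(STYLES)
    # Simulated user who responds best to reminiscence sessions.
    preference = {s: 0.2 for s in STYLES}
    preference["reminiscence"] = 0.9
    for _ in range(500):
        style = learner.choose()
        reward = 1.0 if random.random() < preference[style] else 0.0
        learner.update(style, reward)
    print(max(learner.value, key=learner.value.get))  # likely "reminiscence"
```

The point is not the particular algorithm but the design pattern: the robot's behavior is shaped over time by the user's own responses, which is what allows rapport to build.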

However, the proposal is bound to be met with controversy. Would deploying robots signal societal abandonment of older adults? Would it harm, rather than help, this population? Do robots offer counterfeit or subpar companionship? This paper addresses these questions and related concerns. “The proposal” section characterizes social isolation and loneliness as major threats to public health and sets forth a model public health strategy. The “Replies to objections” section defends this proposal against critics who raise concerns related to coercive design, replacement of humans with robots, privacy incursions, and counterfeit companionship. The “Conclusion” section holds that robots offer a promising avenue to improve the health of older age groups during pandemic outbreaks; more generally, they offer a promising means to improve the lives of socially isolated and lonely older adults in aging societies.

The proposal

Although social isolation and loneliness are closely entwined, they are conceptually distinct (National Institute on Aging 2019). Social isolation indicates an objective situation in which a person does not have a social network to turn to or does not interact frequently with others, while loneliness refers to a subjective feeling of being alone when one desires companionship. Although the two are often concurrent, they do not necessarily occur together. For example, a person can feel lonely while surrounded by others, or be socially isolated without desiring company.

Social isolation and loneliness spike during infectious disease outbreaks as the result of mandates to self-isolate for long stretches with no certain endpoint in order to prevent disease spread. While data on the mental health repercussions of the COVID-19 pandemic are not yet available, evidence from prior similar events, such as the 2003 severe acute respiratory syndrome (SARS) outbreak, demonstrates that depression, anxiety, panic, psychotic symptoms and suicide are common responses (Xiang et al. 2020). Research also shows that the psychological impacts of quarantine can be wide-ranging and long-lasting (Brooks et al. 2020). The COVID-19 pandemic exacerbated an already serious public health problem, with especially devastating consequences for older people, whose sole social contacts may lie outside the home, in adult daycare facilities, places of worship, and community centers, and who rely on voluntary programs and services that come to them in their homes, which may cease to be available (Armitage and Nellums 2020). Many nursing homes and other long-term care facilities serving older adults went into "lockdown" mode during the COVID-19 pandemic, prohibiting all visitors, including family members. To address social isolation and loneliness during pandemic emergencies, Eghtesadi, a frontline physician involved in the care of older adults in long-term care facilities in Canada, recommends palliating social isolation by integrating technological advances (Eghtesadi 2020). Present technologies utilize devices such as smart phones and iPads as conduits to connect older individuals to others, e.g., via social media and telehealth; future advances might include, for example, virtual reality (VR) headsets that allow interacting with loved ones, attending a musical performance, or going outdoors in simulated spaces. These more immersive VR experiences have shown promising preliminary results (Kemperman et al. 2019; Shimada et al. 2010; Appel et al. 2019).

Even absent a pandemic emergency, social isolation and loneliness impact older people at higher rates than the general population. Although aging does not cause social isolation or loneliness, it is associated with major factors that put people at heightened risk (National Academies of Sciences, Engineering, and Medicine 2020). For example, older age is correlated with higher rates of chronic diseases, such as cardiovascular disease and stroke; geriatric syndromes, such as frailty and incontinence; sensory impairments, such as hearing and vision loss; and disruptive life events, such as retirement, housing changes, and the loss of a partner, all of which raise the risk of social isolation and loneliness. For people sixty-five and over, living alone is strongly correlated with feeling subjectively lonely (University of Michigan, Institute for Healthcare Policy and Innovation 2019). Ironically, although older people living alone often report loneliness, they tend to have larger social networks and participate more frequently in social activities than older adults who live with others (Chatters et al. 2018). Perhaps the social connections available to older adults living alone do not generally reflect a preference to share company with someone, but instead the need for professional services. For example, connections instituted following the death of a spouse or deterioration of health might reflect a need for transportation or caregiving and might not involve the kinds of social or emotional ties that keep loneliness at bay.

While many societies express normative expectations that families will meet older people's affiliative and social needs, in fact, studies of people age fifty-five and over reveal that older age is a reliable indicator of social isolation from family (Chatters et al. 2018). According to social convoy theory, people maintain a network of social relationships that escorts them through life, like a convoy of fellow travelers on the road (Wrzus et al. 2013). By old age, individuals face heightened risk of losing key members of their social convoy to death, disease and disability. Rather than seeking to grow their social networks to compensate for bereavement and loss, older adults tend to regard the time remaining as brief and do the opposite: relinquish social ties to all but their closest associates. Thus, even when a pandemic is not occurring, maintaining a social network during later life proves challenging, and older adults might not be receptive to forming new ties.

During the COVID-19 pandemic, constraints due to quarantine and physical distancing make social relationships all the more challenging. Yet, as Cudjoe and Kotwal note, the pandemic simultaneously offers public health experts "a unique opportunity to envision, pilot or implement novel solutions that could have a lasting impact on the health and well-being of older adults" (Cudjoe and Kotwal 2020). One important response to the challenge of social isolation and loneliness is designing robots to afford social interactions that compensate for losses and safeguard health and well-being. A sociable robot is “an artificial agent (often embodied with anthropomorphic or zoomorphic features) that interacts with humans by following the social norms and behaviors attached to its role” (National Academies of Sciences, Engineering, and Medicine 2020, pp. 9–22). During a pandemic, sociable robots can be sanitized and afford a safe, infection-free form of social connection, engaging with older people during periods when family and friends are physically distancing or prohibited from visiting in person. With a global pandemic forecast to persist over an extended period of time, older adults will likely experience prolonged separation from family, which makes the need for practical tools to help them navigate their situation all the more urgent. Sociable robots could help.

A model public health strategy might draw on well-established evidence documenting placebo and nocebo responses. Evidence of these responses demonstrates that what a research subject believes in advance about an intervention shapes the intervention's subsequent outcomes. With placebos, positive expectations are assumed responsible for the beneficial effects of an intervention because these effects cannot be attributed to any properties of the intervention itself. By contrast, with nocebos, negative expectations are considered responsible for an intervention's negative effects, because these effects likewise cannot be attributed to the intervention itself. Researchers have demonstrated placebo and nocebo responses under diverse conditions, including pain and other physical sensations (Bartels et al. 2014). In the case of social isolation and loneliness, the pathway to some adverse health outcomes, such as depression and anxiety, appears to run through perceived isolation (Santini et al. 2020). By creating a perception that one is not isolated but in the company of others, sociable robots can potentially block this pathway.

A promising way of putting this suggestion into practice is leveraging robot design. In contrast to those who advocate designing robots to “remain iconic or cartoonish so that they are easily distinguished as synthetic even by unsophisticated users” (Sullins 2008, p. 156), a model public health policy takes the opposite tack. It designs life-like robots, because their resemblance to us helps to foster a sense of social rapport. An example of such an approach is the emerging field of soft robotics. Soft robotics designs robots to reflect the morphology and functionality of soft structures in nature, such as soft-bodied animals like inchworms and squid, and animal parts, like octopus arms and elephant trunks (Trivedi et al. 2008). In contrast to stiff robotic hands, which compute each finger’s movements, soft robotic hands deform around an object’s surface until they grab hold (Shen 2016). This enables closer contact with users, especially with older adults, who in general exhibit more frailty, less agility, worse balance, less strength, more bone porosity and less muscle mass than their younger counterparts. Making sociable robots that can touch, rub, hug, pat and hold hands without causing injury to older end users not only enables safe companionship but reinforces affiliation through touch. Touch does not just feel good, "[h]umans have brain pathways that are specifically dedicated to detecting affectionate touch…touch is how our biological systems communicate to one another that we are safe, that we are loved, and that we are not alone" (Eichstaedt 2020).
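The contrast drawn above, computing each finger's movements versus conforming on contact, can be expressed in control terms. The toy simulation below, with invented sensor readings and a gentle force limit chosen purely for illustration, closes each soft finger until its contact force reaches that limit, with no per-finger trajectory planning.

```python
# Toy sketch of compliant grasping: each finger closes until its contact
# force reaches a gentle threshold, rather than following a computed
# trajectory. The sensor model and all numbers are illustrative assumptions.

FORCE_LIMIT_N = 0.5      # gentle enough for frail skin (assumed value)
STEP_DEG = 2             # flexion increment per control tick

def contact_force(finger_angle, object_surface_angle):
    """Mock sensor: no force until the finger meets the surface,
    then force rises with further flexion."""
    overlap = finger_angle - object_surface_angle
    return max(0.0, 0.1 * overlap)

def soft_grasp(object_surface_angles):
    angles = [0] * len(object_surface_angles)
    done = [False] * len(object_surface_angles)
    while not all(done):
        for i, surface in enumerate(object_surface_angles):
            if done[i]:
                continue
            if contact_force(angles[i], surface) >= FORCE_LIMIT_N:
                done[i] = True          # finger has conformed; stop closing
            else:
                angles[i] += STEP_DEG   # keep closing
    return angles

# Each finger stops at a different angle, conforming to an irregular shape.
print(soft_grasp([30, 42, 55, 48]))
```

The stop-on-contact rule, rather than any planned trajectory, is what keeps the grip safe around a frail hand: the hand simply cannot squeeze past the force limit.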

Life-like design also benefits from robots capable of responding to touch. For example, the Zhenan Bao Research Group is developing an artificial nerve that, when used with a robotic “brain,” allows robots to react to external stimuli much as we do. According to Sprinkle, the concept is simple: “in our skin, we have sensors that can detect even the lightest touch, neurons that transmit that touch to other parts of the body, and synapses that take that information and translate it into the feelings that we recognize and respond to” (Sprinkle 2018). Mimetic robots with artificial sensors, neurons and synapses can perform similar functions. In fact, they are already being deployed in prosthetic limbs equipped with synthetic nerves that can sense Braille and perform delicate feats requiring constant careful sensing, such as moving a cockroach leg (Service 2018).
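As a loose illustration of the sensor-neuron-synapse chain Sprinkle describes, the toy pipeline below converts a pressure reading into a spike train, integrates the spikes at a synthetic "synapse," and triggers a response once a threshold is crossed. All parameters are invented for exposition and do not describe the Bao group's published circuit.

```python
# Conceptual sketch of an artificial afferent pathway:
# pressure -> spikes -> synaptic integration -> response.
# Parameters are illustrative assumptions only.

def to_spike_train(pressure_kpa, duration_ms=100):
    """'Neuron': firing rate grows with pressure (simple rate coding)."""
    rate_hz = min(200, 20 * pressure_kpa)          # saturating firing rate
    n_spikes = int(rate_hz * duration_ms / 1000)
    return [1] * n_spikes

def synapse(spikes, weight=0.08, leak=0.97):
    """'Synapse': leaky integration of incoming spikes."""
    potential = 0.0
    for s in spikes:
        potential = potential * leak + weight * s
    return potential

def respond(pressure_kpa, threshold=0.5):
    potential = synapse(to_spike_train(pressure_kpa))
    return "react" if potential >= threshold else "ignore"

# Light touch (0.2 kPa), moderate touch (1.0 kPa), firm press (5.0 kPa).
for p in [0.2, 1.0, 5.0]:
    print(p, respond(p))   # only the firm press crosses the threshold
```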

Finally, life-like robots should be humanoid in appearance. While robotic dogs and seals have been shown to mitigate social isolation and loneliness (McGlynn et al. 2016), they are designed to function as pets and their responses to users are pet-like. Humanoid robots promise more, mimicking sophisticated human social responses and relationships. Humanoid robots also encourage users to perceive robots as animate, which shapes expectations positively (Darling 2017). For example, we might be more apt to confide in, show consideration toward, and form close ties with robots perceived as animate. The alternative of presenting robots as nonliving tools might yield nocebo effects. For example, seeing robots as mere objects might foster the attitude that robots command no respect, reverence or love.

Replies to objections

Coercive design

Yet critics might worry that harnessing placebo and nocebo effects manipulates users, coaxing them to embark on relationships by appealing to emotions rather than to informed choice. In reply, evidence shows older adults prefer interacting with computers that present themselves as virtual humans, with human autobiographical memories (Reeves and Nass 1996) and with personalities and body shapes that match their own (van Vugt et al. 2006). Older adults maintain longer-term relationships with robots that mimic humans by displaying variability in speech and behavior and exhibiting socio-emotional behaviors, such as empathy and social chat (Bickmore et al. 2010). Collecting further information during early design phases about the preferences of the particular older populations for whom robots will be deployed can help ensure robot design reflects users' preferences.

However, it might be claimed that users can be manipulated even when their preferences are satisfied. One way this could occur is if users adapt their preferences and go along with robotic companions under pressure. Elster explains the phenomenon of adaptive preferences by comparing it to sour grapes: “[d]esperately hungry, but unable to reach the grapes that hang above him, the fox declares them sour. They are not sour, so the fox is making a mistake” (Elster 1983, p. vii). The sour grapes phenomenon highlights the concern that preferences may shift to comply with what is expected or available, with the result that authentic preferences are displaced by inauthentic ones. Yet, in response, adaptation is not inherently problematic. Elster himself distinguishes internally- from externally-driven adaptations. Internally-driven preference change is a positive transformation that exhibits users' resilience and ability to cope when new conditions present themselves; it occurs when a person's second-order desires and preferences approve of changes to first-order desires and preferences. By contrast, externally-driven adaptations take place when other people are the primary drivers of preference change and individuals comply with outside demands to alter their preferences without approving the change at a higher order. In the context of a pandemic disease emergency, a decision to seek friendship with robots might reflect the authentic choices of users who are coping effectively and resiliently with their situation. The onus should be on those who suppose otherwise and doubt the genuineness of users’ preferences.

Yet, critics may find this analysis naive. Surely, many preference adaptations are externally-driven, especially in a setting where technology companies are bent on selling products. Gaining access to user information through internet-connected robots would enable companies to manipulate users more effectively, extracting personal data and using it to increase sales by nudging users to want, or think they need, fancier robots with sleeker designs and upgrades. Over time, users may grow dependent on robots for social connection and acclimate to data sharing. An underlying concern is that sellers fail to regard older adults as ends in themselves, reducing them instead to data sets or means of increasing profit.

The reply to these concerns can only be to acknowledge them as serious. Yet even though it would be naive to think that users can avoid manipulation, it would be just as naive to think that users have an array of other options they prefer. During pandemics, opportunities for social relationships are narrowed. Rather than deprive older people of robot relationships, which would further disenfranchise them as agents, a better tack is to enact effective measures to balance power and protect against coercion (discussed further below, in the “Privacy incursions” section).

Replacing humans

Still, critics underscore uncertainty and predict adverse outcomes. Sparrow regards it as "perverse" to respond to the fact that older persons are increasingly socially isolated with the invention of "fancy robots to entertain and comfort the elderly" (Sparrow 2010, p. 308). Turkle expresses the worry that robots do not just do things for us; they do things to us (Turkle 2006). According to Turkle, we attach to what we nurture, and if we cease nurturing one another and assign these tasks to machines, we grow apart and less connected (Turkle 2011). Too often, ease and comfort lead us to downgrade social connections, e.g., preferring texting to talking, avatars to live images, "friending" to friends. Coeckelbergh argues that whenever robots function as companions, would-be human companions lose the experience of serving in this capacity themselves (Coeckelbergh 2009). Others concur that sociable robots will eventually reduce older people's contact with human family and friends (Sparrow and Sparrow 2006). According to all of these critics, robotic companionship undermines human social and emotional life.

However, the reality for growing numbers of older adults is that there is not a viable human alternative. During a pandemic emergency in particular, the alternative to robot companionship for many older people is social isolation and loneliness. Without support, older adults are left to languish. Under these conditions, sociable robots do not rob older adults of human companionship but afford companionship where it is lacking. They improve health and well-being, upgrading, rather than downgrading, the social lives of older adults.

Still, critics might express concern that if older adults grow accustomed to sociable robots during the COVID-19 pandemic, after it is over, they will prefer robot over human companionship. Especially if robots' roles expand to other domains, such as assisting with activities of daily living, e.g., walking, dressing, and bathing, users might be reluctant to let robots go and robots might occupy jobs that would have gone to humans.

In reply, there is little evidence for thinking that we will see a glut of caregivers anytime soon. Instead, evidence demonstrates the opposite: a shortage of aides for care-dependent older adults is on the horizon as nations around the globe go grey. For example, from 2018 to 2028, U.S. care jobs, such as personal-care aide and home-health aide positions, are projected to grow 36%, far faster than overall job growth (Bureau of Labor Statistics 2020), suggesting ample opportunities exist for both human and robotic caregiving. Perhaps some users will prefer robotic to human caregivers. Yet that is not necessarily a bad choice for them to make; it is certainly a choice that competent adults are entitled to make.

Privacy incursions

A further set of concerns critics might raise about the proposal to introduce sociable robots during pandemics relates to privacy. While many electronic devices obtain personal information about users, the type of information shared with sociable robots arguably differs. First, it is more intimate. For example, while we would not disclose our deepest, darkest secrets to a Roomba, we might to a robot friend. Second, unless users are willing to divulge some private information, close relationships with robots cannot develop. Third, although all social relationships create risks, robotic companions seem to pose greater risks than human ones. Robots record users’ voices and images and store vast troves of data about them, often in clouds that others could hack.

In response, it is helpful, first, to frame privacy concerns in the context of the digital information sharing and risk-taking already occurring. For example, smart phones store users' communications, photos, contacts, calendars and location. Electronic medical records, virtual voice assistants, social media and online banking also store and track personal data and expose users to risks. It is not clear that robotic friends and companions raise new risks or subject users to greater harms over and above what current technologies do. Second, existing data protections can be adapted to reduce many risks. For example, value sensitive design incorporates data protection proactively, at all phases of a product's life cycle, from initial design to operational use and disposal (Colesky et al. 2016); software tools for encryption and anonymizing data enable users to set limits on how information is shared beyond the robot-human encounter. While it is true that some direct information sharing with a robot is necessary for friendship, the degree of sharing is not fixed, but open to discretion (Lutz et al. 2019; Syrdal et al. 2007).
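As one concrete illustration of such protections, the sketch below encrypts a robot's interaction logs at rest and gates what leaves the device behind a user-chosen sharing level. It assumes the third-party Python cryptography package; the sharing levels and record categories are invented for illustration rather than drawn from any standard.

```python
from cryptography.fernet import Fernet

# User-chosen sharing level; everything above it stays on the device.
# Levels and categories are illustrative assumptions, not a standard.
SHARE_LEVELS = {"none": 0, "diagnostics_only": 1, "conversation": 2}

class PrivateLog:
    def __init__(self, share_level="diagnostics_only"):
        self.key = Fernet.generate_key()      # key stays on the robot
        self.fernet = Fernet(self.key)
        self.share_level = SHARE_LEVELS[share_level]
        self.records = []

    def log(self, category, text):
        # Encrypt at rest: a stolen log file is unreadable without the key.
        token = self.fernet.encrypt(text.encode())
        self.records.append((category, token))

    def export_for_cloud(self):
        # Only categories at or below the user's chosen level leave the device.
        return [(c, t) for c, t in self.records
                if SHARE_LEVELS[c] <= self.share_level]

log = PrivateLog(share_level="diagnostics_only")
log.log("diagnostics_only", "battery at 40%")
log.log("conversation", "user mentioned feeling lonely today")
print(len(log.export_for_cloud()))   # 1: the conversation never leaves
```

Designs of this kind keep the default private and make sharing an explicit, revocable choice, which is the discretion the preceding paragraph points to.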

Ironically, giving users more control over information sharing might result in increased sharing and greater risk-taking. For example, social networking sites that invite users to share information can lend themselves to "oversharing" (van den Hoven et al. 2019). As we share more information digitally, the degree of privacy we consider optimal may change (Boenink et al. 2010). The meaning of "private" itself may be transformed. If sharing previously private information becomes daily fare, no longer reserved for special people in our lives, we may have more "friends" and the designation of "friendship" itself may take on new meaning. In the final analysis, the greater risk to privacy might not be that others will intrude, but rather, that we will indiscriminately invite others in. By doing so, we risk losing a sense of ourselves and our relationships separate from public scrutiny. Philosophers, such as Rachels and Schoeman, argue that an important reason why we cherish privacy is that it enriches our lives by creating the conditions necessary for diverse social relationships to flourish (Rachels 1984; Schoeman 1992). We feel uneasy when people share too much or too little, because doing so forecloses certain kinds of relationship we may value, which are partly constituted by particular privacy and publicity practices. When we think about robots transmitting our data to for-profit companies, or when we are forced to view advertisements before interacting with robots, the effect is to commercialize and depersonalize our relationship with robots. We become in that moment a tool for a third party's gain. One reason it matters to protect privacy in human–robot relationships is that it enables a different kind of relationship to take shape, one that is less crass because it is less defined by money-making.

Social relationships with robots are undercut not just by transmission of personal data to companies, but by the dispersion of control associated with hybrid ownership, in which the networked character of robots results in the robot never belonging solely to the user, but instead remaining under the influence of the technology company (Keymolen and van der Hof 2019). In some instances, companies wield more influence than users over "their" robots, e.g., retaining the ability to substantively modify robots based on a vague permission, such as "Users will be given additional notice in the case of material changes," without specifying what form "additional notice" will take (Forbrukerrådet 2016, p. 13). Addressing this requires interrogating the power structures that our relationships with robots embed and asking whose interests are served by having more or less privacy (Young 1990). In some contexts, more privacy makes people more vulnerable, e.g., when it shields those who perpetrate moral atrocities (Jecker 1993); while in others, less privacy leads to endangerment, e.g., when it makes people targets of identity theft. A good balance is struck when we weigh the value of privacy and publicity with an eye to making possible social practices and relationships we have reason to value (Boiling 1996; Gavison 1992).

When sociable robots are deployed for older adults who are socially isolated and lonely, regulatory and legal frameworks that set clear and enforceable parameters to govern for-profit technology companies are needed to strike the right balance. The right balance must be context sensitive. Nissenbaum delineates a theory of contextual integrity that offers one way to specify context-sensitive regulatory frameworks.

The theory of contextual integrity presents contexts as social spheres, as constituents of a differentiated social space...Although contextual integrity relies on an intuitive notion of social sphere, covering such instances as education, healthcare, politics, commerce, religion, family and home life, recreation, marketplace, work and more...spheres generally comprise a number of constituencies, such as characteristic activities and practices, functions (or roles), aims, purposes, institutional structures, values and action-guiding norms (Nissenbaum 2018, p. 838).

Contextual analysis invites a nuanced regulatory framework tailored to diverse social domains. Privacy controls appropriate in the context of a pandemic will reasonably differ from those appropriate outside pandemic settings, because human–robot relationships will be more isolated from human social life, making them both more vulnerable to abuse and more integral to health.

In addition to helping guide privacy protections directed to for-profit technology companies, contextual integrity can help tailor protections suitable to particular types of personal relationships (Allen 2019). For instance, when adult offspring gain the ability to check on aging parents via technology without the parents' approval or knowledge, privacy protection sensitive to social context will aim to walk a fine line between protecting the dignity of older adults with decisional capacity and protecting trust in parent–child relationships. Contextual analysis leads us in an altogether different direction in social domains where physical security and property are at stake. For example, when internet-connected robots become vectors for hackers to gain access to home devices such as cameras, security systems, or door locks, this not only breaches privacy but jeopardizes safety, and calls for stronger safeguards. In all of these instances, contextual analysis helps with specification by shaping regulatory protections to suit different kinds of social situations.
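To indicate how contextual integrity could be operationalized in software, the sketch below encodes context-relative informational norms as tuples and checks proposed data flows against them. The norms listed are invented examples; in a real deployment they would be derived from the regulatory frameworks discussed above.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    context: str        # e.g., "healthcare", "family", "commerce"
    info_type: str      # e.g., "vital_signs", "conversation"
    sender: str
    recipient: str
    principle: str      # transmission principle, e.g., "with_consent"

# Illustrative context-relative norms (allowed flows); not a real policy.
NORMS = {
    Flow("healthcare", "vital_signs", "robot", "physician", "with_consent"),
    Flow("family", "wellness_summary", "robot", "adult_child", "with_consent"),
}

def permitted(flow: Flow) -> bool:
    """A flow preserves contextual integrity only if a norm matches it."""
    return flow in NORMS

# Sharing vitals with a physician, with consent: fits the healthcare norm.
print(permitted(Flow("healthcare", "vital_signs", "robot",
                     "physician", "with_consent")))        # True
# The same data flowing to an advertiser violates the context's norms.
print(permitted(Flow("commerce", "vital_signs", "robot",
                     "advertiser", "no_consent")))          # False
# Conversation logs to adult children without consent: also blocked.
print(permitted(Flow("family", "conversation", "robot",
                     "adult_child", "no_consent")))         # False
```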

Counterfeit companions

A final challenge to designing and deploying sociable robots for older adults during pandemic disease outbreaks is that even if we wage and win a war against social isolation and loneliness, we achieve only a Pyrrhic victory if befriending robots falls short of preserving the very thing of value, namely, genuine social connection. Matthias expresses this worry when he states that sociable robots feign human mental and emotional capabilities and thereby deceive users (Matthias 2015). Elder argues that even if robots seem to be genuine companions, they are counterfeit (Elder 2017). Others have taken swipes at sociable robots by arguing that they inevitably spoil, rather than replace, human social life (Turkle 2011).

To the extent that these objections claim that there is no redeeming value to human–robot relationships, they backfire. First, relationships with robots can be a lifeline during pandemic disease outbreaks, safeguarding the health and well-being of socially isolated older people, as well as acting as a buffer against loneliness. Second, outside pandemic settings, older people living in aging societies often find themselves alone and bereft of social connections. Third, even though they lack mental states, sociable robots can create a positive care environment, understood as an environment "formed by gestures, movements and articulations that express attentiveness and responsiveness to vulnerabilities within the relevant context" (Meacham and Studley 2017). They can also embody recognition of users when they are made to interact in ways that convey to users that they are being perceived, heard, understood and attended to (Brinck and Balkenius 2020). Clearly, the relationships we form with robots are not the same as those we have with humans. Yet they can protect our health and enrich our lives.

Conclusion

In conclusion, social isolation and loneliness pose serious threats to the health and well-being of older adults. These threats are made worse by the unprecedented physical distancing requirements put in place during the COVID-19 pandemic. During infectious disease outbreaks, when human carers are in short supply and family and friends are allowed to visit only remotely, sociable robots can fill a gaping hole in human social life. Experts forecast that older adults will continue to face heightened risk during future emerging infectious disease outbreaks, due to their aging immune systems and higher rates of chronic disease. Outside the context of a pandemic emergency, older people in aging societies also face serious risk of social isolation and loneliness, because so many older people live alone. Sociable robots offer a promising way to help older people, and we should design and deploy them to do so.