August 24, 2016

Acceptance (WP6)


Overview

A defining characteristic of social robots is the ability to exhibit socially acceptable behaviour and to follow social conventions and norms. Norms are cultural products (including values, customs, and traditions) that represent individuals’ basic knowledge of others’ behaviour and expectations, and may be recognised at different levels in society. Within the nations of Europe there are many cultural traditions, each with its own norms of behaviour, and a robot’s usefulness and acceptability may depend on how well it learns, predicts and conforms to these, and to people’s expectations of different types of robots in different circumstances. In WP6 we will continue our theme of real-world experimentation in three directions. ESR13 will investigate new methods for evaluating social robots from multiple perspectives, using a combination of qualitative and quantitative data. A specific study will investigate whether changes in Interaction Quality, caused by changes in the level of automation, affect the perceived feeling of security and trust, which in turn impact acceptance. ESR14 will explore the ethical consequences of the design decisions that may help users accept robots. Specifically, the effect of varying Interaction Quality controlled by emotional deception will be studied. ESR15 will develop a novel procedure for exploring the assimilation of robots into older adults’ homes, in particular how changing Interaction Quality caused by sliding autonomy affects users’ attitudes and experiences. The work in WP6 is relevant for the design and evaluation of all types of applications with robots that support activities of daily living (ADL) or act as social companions.


Tasks and Deliverables

Tasks

T6.1 Investigating the factors that determine users’ perceptions of robot sliding autonomy (ESR13)
T6.2 Exploration of the ethical consequences of the design decisions which may help users accept robots (ESR14)
T6.3 Studies on how to measure interaction experience by a combination of qualitative and quantitative methods (ESR15)

Deliverables

D6.1 Report on user study on how telepresence systems with adjustable autonomy affect perceived safety (ESR13) M23
D6.2 Report on user study on social acceptability and deception (ESR14) M23
D6.3 Report on a methodology for conducting longitudinal ethnographic studies in evaluations of robotics for older adults (ESR15) M23
D6.4 Quantitative measures to support existing qualitative evaluation tools (ESR13) M36
D6.5 Report on ethical concerns in close interaction systems (ESR14) M36
D6.6 Report on how Interaction Quality and levels of autonomy affect user experience (ESR15) M36
D6.7 Algorithmic description and report on adjusting levels of autonomy in a robotic telepresence system (ESR13) M40
D6.8 Guidelines for ethical investigations and development (ESR14) M40
D6.9 A toolbox based on users’ perceptions and experiences, with guidelines on how to develop social robots (ESR15) M40


Involved ESRs

ESR13 (ORU) Measuring interaction effectiveness

Safety and the feeling of security are perhaps the most important factors motivating an older adult to adopt robot technologies in order to continue living in their own accommodation. One such technology is the sensor network, which offers alarms, reminders, automatic switch-off of electrical appliances, door closing, and similar functions. User-centred evaluation is in this case often qualitative, with users filling out various questionnaires (e.g. SF-12, QOLS, and HADS). However, the advent of physically embedded sensors, ranging from environmental sensors to wearables, makes it possible to collect quantitative data on people’s behaviour in the home.

Our initial studies[1] suggest that evaluation using a combination of qualitative and quantitative data can lead to a greater understanding of safety and the perceived feeling of security. ESR13 will further test this hypothesis, and also extend the work to include autonomous robot systems operating at varying levels of autonomy and with varying levels of social behaviour (see ESR7). For such systems, the Interaction Quality depends on the level of autonomy. Three important questions to study are: 1) Can such systems, which support independent living, also lead to an increased sense of security? 2) Can we determine new quantitative measures to support the qualitative evaluation tools? 3) In what way does the Interaction Quality, or level of autonomy, affect the perceived feeling of security? To answer these questions, a number of field studies will be conducted in collaboration with ANSAP, who will manage installation and maintenance. A robot platform will be deployed in the homes of selected older adults, and a number of measures capturing the Interaction Quality with the robot will be collected over time. Such measures range in complexity; for example, they may include usage statistics of the robot, proxemics based on sensors embedded on the robot, and more. The key is to collect and assess these measures longitudinally, and to correlate them with the qualitative data, in order to determine how well a selected set of measures reflects the perceived safety and security of the associated system. All evaluations are expected to last between 3 and 12 months, depending on the preferences of the end-users. During a secondment to ESR15@BGU, ESR13 will compare and combine evaluation procedures. During an industrial secondment to FHG, alternative approaches to the evaluation of social robots will be investigated and compared, possibly leading to modified procedures (this is a tentative plan that may be adjusted to best fit actual research).
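As a minimal, purely illustrative sketch of how such longitudinal quantitative measures could be related to questionnaire-based ratings of perceived security (the record fields, the monthly sampling scheme and the choice of Spearman’s rank correlation are assumptions made for this example, not a prescribed analysis):

```python
# Illustrative sketch only: correlating longitudinal quantitative interaction
# measures with questionnaire-based perceived-security scores.
# All field names, the monthly sampling scheme and the use of Spearman's rank
# correlation are assumptions made for this example.
from dataclasses import dataclass
from typing import List
from scipy.stats import spearmanr  # non-parametric, suits small ordinal samples

@dataclass
class MonthlyRecord:
    month: int                       # month index within the 3-12 month deployment
    robot_sessions: int              # usage statistics logged by the robot
    mean_proxemic_distance_m: float  # average human-robot distance from on-board sensors
    perceived_security: float        # e.g. Likert-scale score from a questionnaire item

def correlate_usage_with_security(records: List[MonthlyRecord]):
    """Relate one quantitative measure to the qualitative security rating."""
    usage = [r.robot_sessions for r in records]
    security = [r.perceived_security for r in records]
    rho, p_value = spearmanr(usage, security)
    return rho, p_value

# Toy example with fabricated numbers (not project data):
records = [MonthlyRecord(m, 10 + 2 * m, 1.5 - 0.05 * m, 3.0 + 0.2 * m) for m in range(6)]
print(correlate_usage_with_security(records))
```

In practice, the set of measures, the sampling rate and the statistical treatment would be decided during the field studies themselves.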


ESR14 (UWE) Ethical robot interaction

While researchers have explored the ethical issues surrounding human-robot interaction, there is a need to develop a clearer understanding of what comprises ethically and socially acceptable behaviour in robots in different contexts. Prescott et al.[2] define a range of issues that need to be considered in developing companion robots for older adults, highlighting some key arguments regarding customisation and the personalisation of embodiments, personality and behaviour, which could result in broader appeal and acceptance. As early as 2002, Kiesler[3] and Breazeal[4] showed that if a robot has a compelling personality, people are more willing to establish a relationship with it and interact with it. However, instilling the robot with a personality and other human-like behaviours could lead to emotional bonding and attachment[5]. We have also addressed this matter in our earlier work[6].

Actively encouraging and promoting emotional bonding and attachment poses potential ethical issues[7],[8]. Some level of emotional deception, implying that the robot has a personality and can “understand” and respond empathetically to a person, may be implicit in any human-robot interaction, and may sometimes be regarded as benign or even beneficial[9]. The amount of deception is a design parameter that influences the Interaction Quality, and it is vital to ensure that the benefits of using the robot outweigh any potential costs due to the set level of deception. This is particularly important for applications in eldercare, where the human may be frail and more sensitive to emotional bonding with the robot.

ESR14 will study how emotional deception is perceived and how it affects acceptance of robots, depending on age, culture, expectations and emotional state. The work will start with Wizard-of-Oz experiments with older adult users in care homes in the UK, and also within the constrained environment of the ARPAL Studio. The robots’ emotional realism will be controlled, and experiences and acceptance will be analysed through longitudinal studies complemented by surveys of users and caregivers. Where deception is used, we will ensure that formalised debriefing takes place, in accordance with the ethical guidelines of the British Psychological Society.
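A purely hypothetical sketch of how a Wizard-of-Oz interface might expose emotional realism as a single controllable parameter is shown below; the response templates, the 0-1 scale and the CSV logging are illustrative assumptions, not the project’s actual experimental protocol:

```python
# Hypothetical Wizard-of-Oz sketch: the operator selects a response category,
# and the configured emotional-realism level decides whether the robot replies
# with a neutral or an empathetic phrasing. The templates and the 0-1 scale
# are illustrative assumptions, not the project's actual protocol.
import csv
import random
import time

RESPONSES = {
    "greeting": {"neutral": "Hello.",
                 "empathetic": "Hello! It is lovely to see you again."},
    "reminder": {"neutral": "It is time for your medication.",
                 "empathetic": "I noticed it is time for your medication. Shall I stay with you?"},
}

def choose_utterance(category: str, emotional_realism: float) -> str:
    """Pick a phrasing; higher emotional_realism makes empathetic wording more likely."""
    style = "empathetic" if random.random() < emotional_realism else "neutral"
    return RESPONSES[category][style]

def log_turn(writer, participant_id: str, category: str, emotional_realism: float, utterance: str):
    """Record the experimental condition for later analysis and debriefing."""
    writer.writerow([time.time(), participant_id, category, emotional_realism, utterance])

with open("woz_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    utterance = choose_utterance("reminder", emotional_realism=0.8)
    log_turn(writer, "P01", "reminder", 0.8, utterance)
    print(utterance)
```

Logging the condition alongside each utterance keeps the manipulation transparent for later analysis and for the formalised debriefing.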

As the cognitive aspects of HRI become more fundamental to a robot’s role, it is likely that users’ cultural norms and expectations will play a greater part in acceptance. It is therefore vital to assess how commonalities and differences across cultures affect acceptance of a robot as a seemingly sentient being. There is also clear value in determining how general cultural norms might be distinguished from individual preferences. ESR14 will therefore make a secondment to ESR13@ORU to carry out comparative studies at the Ängen testbed. ESR14 will also make a secondment to ESR8@CSIC to analyse ethical concerns regarding close interaction systems (this is a tentative plan that may be adjusted to best fit actual research). The end goal for ESR14 is to produce a generalised ethical framework to guide future investigation and development. During a secondment to ADELE, the conclusions reached will be applied to and related to their product, the virtual agent FIONA.


ESR15 (BGU) Older adults’ interaction with robots

Due to a prevalent focus on usability[10], many complex social and psychological concerns related to interactions between robots and older adults have been largely ignored in earlier research. As a result, key questions about the effects of interaction on daily living and well-being in later life are, to a large extent, still open. Possible explanations for this research gap include social scientists’ limited understanding of robotics and the absence of adequate training for social scientists in exploring human-robot interaction.

SOCRATES provides an opportunity for training social scientists in exploring the effects of interaction with robots on daily living and well-being in later life. ESR15 will develop procedures for exploring the processes by which new robotics technologies are assimilated into older adults’ homes. In contrast to studies that consider users only at the design stage or the final stage[11], we aim to create a method for assessing older adults’ behaviour and experiences over time and in real-life circumstances. Although much research has focused on the details of creating humanlike interactions for social robots[12], little attention has been paid to the development process itself, which is usually carried out by programmers[13].

ESR15 will ensure that the development process is a multidisciplinary one, integrating technical knowledge of hardware and software, socio-psychological knowledge of interaction dynamics, and domain-specific knowledge of the target application[14]. In particular, the influence of varying Interaction Quality caused by ageing effects will be studied. Inspired by our earlier work on the use of ICT in later life, which simultaneously explored behavioural and psychological aspects of technology use[15], the procedure will relate to three topics: 1) (Behaviour) How do older adults interact with robots, specifically with regard to processes of gaining control, shaping or attributing meaning, and making the robots an integral part of life? 2) (Benefits) What are the effects of varying Interaction Quality on older adults’ subjective well-being (e.g., on their satisfaction with life, social connectedness, self-efficacy, and attitudes towards ageing)? 3) (Constraints) Which social and psychological factors (e.g., technophobia, technostress, and attitudes) constrain older adults’ beneficial interaction with robots? The specific effects of varying Interaction Quality through sliding autonomy will be studied during a secondment to ESR7@ORU in an ethnographic longitudinal study at the Ängen testbed at ORU. The study will aim to replicate real-life circumstances as closely as possible, and will apply a mixed-methods approach with in-depth interviews of the residents and their main caregivers, as well as continuous observations. During an industrial secondment to FHG, connections between robot design (in particular the Care-O-bot) and the examined effects on users will be studied (this is a tentative plan that may be adjusted to best fit actual research).
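As a rough illustration of how data for the three topics might be organised per participant and study wave, consider the sketch below; the field names, scales and wave structure are assumptions for this example rather than a finalised instrument:

```python
# Illustrative sketch: one record per participant and study wave, combining
# behavioural logs, well-being scores and constraint indicators. The exact
# fields, scales and the notion of a "wave" are assumptions for this example.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class WaveRecord:
    participant_id: str
    wave: int                           # e.g. months since the robot was deployed
    # 1) Behaviour: how the older adult interacts with the robot
    daily_interaction_minutes: float
    features_used: List[str] = field(default_factory=list)
    # 2) Benefits: subjective well-being under the current Interaction Quality
    wellbeing_scores: Dict[str, float] = field(default_factory=dict)   # e.g. {"life_satisfaction": 4.2}
    # 3) Constraints: factors limiting beneficial interaction
    constraint_scores: Dict[str, float] = field(default_factory=dict)  # e.g. {"technophobia": 2.1}
    interview_notes: str = ""           # qualitative material from in-depth interviews

def wellbeing_trend(records: List[WaveRecord], measure: str) -> List[float]:
    """Trace one well-being measure across waves for a single participant."""
    return [r.wellbeing_scores.get(measure, float("nan"))
            for r in sorted(records, key=lambda r: r.wave)]

# Toy usage (fabricated numbers, not study data):
waves = [WaveRecord("P07", w, 25 + w, ["reminders"], {"life_satisfaction": 3.5 + 0.1 * w}) for w in range(3)]
print(wellbeing_trend(waves, "life_satisfaction"))
```

Keeping the behavioural, well-being and constraint data in one longitudinal record per participant is what allows the mixed-methods analysis to relate changes in Interaction Quality to changes in experience over time.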


References

[1] Kristoffersson, A., Coradeschi, S., & Loutfi, A. (2013). A Review of Mobile Robotic Telepresence. Advances in Human-Computer Interaction, 2013.

[2] Prescott, T. J., et al. (2012). Robot companions for citizens: Roadmapping the potential for future robots in empowering older people.

[3] Kiesler, S., & Goetz, J. (2002). Mental models of robotic assistants. In CHI ’02 Extended Abstracts on Human Factors in Computing Systems. ACM.

[4] Breazeal, C. (2003). Toward sociable robots. Robotics and Autonomous Systems, 42(3), 167-175.

[5] Riek, L. D., Hartzog, W., Howard, D., Moon, A., & Calo, R. (2015, March). The Emerging Policy and Ethics of Human Robot Interaction. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts (pp. 247-248). ACM.

[6] Rocks, C., Jenkins, S., Studley, M., & McGoran, D. (2009). Heart robot: A public engagement project. Interaction Studies, 10(3), 427-452.

[7] Sharkey, A. (2014). Robots and human dignity: A consideration of the effects of robot care on the dignity of older people. Ethics and Information Technology, 16(1), 63-75.

[8] Sharkey, A., & Sharkey, N. (2012). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27-40.

[9] Adar, E., Tan, D. S., & Teevan, J. (2013). Benevolent deception in human computer interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM.

[10] Broadbent E., Stafford. R. & MacDonald, B. (2009). Acceptance of Healthcare Robots for the Older Population: Review and Future Directions. International Journal of Social Robotics, 1, 319–330.

[11] Jayawardena, C., Kuo, I. H., Unger, U., Igic, A., Wong, R., Watson, C. I. et al. (2010). Deployment of a Service Robot to Help Older People. The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems. October 18-22, 2010, Taipei, Taiwan.

[12] Leite, I., Martinho, C., & Paiva, A. (2013). Social Robots for Long-Term Interaction: A Survey. International Journal of Social Robotics, 5, 291-308.

[13] Glas, D.F., Satake, S., Kanda, T. and Hagita, N. (2012), “An Interaction Design Framework for Social Robots”, Robotics: Science and Systems, pp. 89–96.