Source: University of Canterbury
People have similar automatic biases towards darker-coloured robots as they do toward people with darker skin colour, new research from the University of Canterbury’s Human Interface Technology Lab (HIT Lab NZ) and Psychology department shows.
The new research paper, Robots and racism, is being presented today in Chicago at an international conference on human-robot interaction.
UC human-robot interaction expert Associate Professor Christoph Bartneck, HIT Lab NZ, is presenting the paper today at HRI 2018, the annual ACM/IEEE International Conference on Human-Robot Interaction (5-8 Mar 2018).
An international collaboration between four universities, the paper’s research team consisted of members of different nationalities and ethnicities. One of the key authors, UC Psychology Senior Lecturer Dr Kumar Yogeeswaran, is a social psychologist with expertise in diversity, social identity, stereotyping and prejudice. Along with Assoc Prof Bartneck and Dr Yogeeswaran, the collaborators were UC Master of Human Interface Technology (MHIT) graduate Qi Min Ser; Dr Graeme Woodward, Research Leader of UC Engineering’s Wireless Research Centre; Siheng Wang (Guizhou University of Engineering Science, China); Robert Sparrow (Monash University, Australia); and Friederike Eyssel (University of Bielefeld, Germany).
Most robots currently being sold or developed are either stylised with white material or have a metallic appearance, according to the research paper.
“In this research, we examined if people automatically ascribe a race to robots such that we might say that some robots are ‘White’ while others are ‘Asian’ or ‘Black’,” the researchers wrote.
“To do so, we conducted an extended replication of the classic social psychological ‘shooter bias’ experiment which demonstrates that people from many backgrounds are quicker to shoot at armed Black people over armed White people, while also more quickly refraining from shooting unarmed White people over unarmed Black people.
“Using robot and human stimuli, we explored whether these effects would generalise to robots that were racialised as Black and White. Reaction-time measures revealed that participants demonstrated ‘shooter bias’ toward both Black people and robots racialised as Black. Participants were also willing to attribute a race to the robots depending on their colour, even when provided with the option to select ‘does not apply’.”
“Our research shows that people show automatic biases towards darker-coloured robots just as they do toward people with darker skin colour,” Assoc Prof Bartneck says.
“This result should be troubling for people working in social robotics given the profound lack of diversity in the robots available and under development today.”
A Google image search for “humanoid robots” returns predominantly robots with gleaming white surfaces or a metallic appearance. There are currently very few humanoid robots that might plausibly be identified as anything other than White or, sometimes, Asian, he says. Most of the main research platforms for social robotics, including Nao, Pepper, and PR2, are stylised with white materials and are presumed to be ‘White’.
There are some exceptions to this rule, Assoc Prof Bartneck says, including some of the robots produced by Hiroshi Ishiguro’s team, which are modelled on the faces of particular Japanese individuals and are thereby – if they have race at all – Asian. Another exception is the Bina 48 robot that is racialised as Black, although again, this robot was created to replicate the appearance and mannerisms of a particular person rather than to serve a more general role, he says.
“This lack of racial diversity amongst social robots may be anticipated to produce all of the problematic outcomes associated with a lack of racial diversity in other fields. An even larger concern is that this work suggests that people respond to robots according to societal stereotypes that are associated with people possessing the same skin colour,” Dr Yogeeswaran says.
The researchers believe their findings suggest that people carry over their negative stereotypes from humans to robots, which can have negative implications for how people react to robots of different colours when they potentially operate as teachers, carers, or police, or work alongside others in a factory.
“The development of an Arabic-looking robot, as well as the significant tradition of designing Asian robots in Japan, are encouraging steps in this direction, especially since these robots were not intentionally designed to increase diversity but rather emerged from a natural design process,” Assoc Prof Bartneck says.
“We hope that our paper might inspire reflection on the social and historical forces that have brought what is now quite a racially diverse community of engineers to – seemingly without recognising it – design and manufacture robots that are easily identified by those outside this community as being almost entirely ‘White’.”
Bartneck C., Yogeeswaran K., Ser Q.M., Woodward G., Sparrow R., Wang S. and Eyssel F. (2018) Robots and racism. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2018), Chicago, IL, USA, 5-8 Mar 2018. http://dx.doi.org/10.1145/3171221.3171260