
Frequently Asked Questions

Q1: Who do I contact for further information about the project?

Q2: Who funded this research project?

Q3: Who are the partners on this research project?

Q4: Who are the individual academics involved in the research?

Q5: What robots were used in the research?

Q6: Why is this research funded?  Is it a good use of public funds?

 

Q1: Who do I contact for further information about the project?

Please contact the Project Manager at the University of Exeter:

Freya Palmer

Email: hello@being-there.org.uk

Telephone: +44 (0)1392 724614

 

Q2: Who funded this research project?

Being There: Humans and Robots in Public Spaces is a £2 million, three-year project (1st October 2013 – 30th September 2016), funded by the Engineering and Physical Sciences Research Council (EPSRC) under its IDEAS Factory Sandpits call on Digital Personhood, grant ref: EP/L00416X/1.

http://www.epsrc.ac.uk/

 

Q3: Who are the partners on this research project?

The Being There project brought together researchers from the Universities of Exeter, Bath, Cambridge (formerly at Queen Mary University of London) and Oxford, the Bristol Robotics Laboratory (a collaborative partnership between the University of the West of England and the University of Bristol), and Watershed in Bristol, to look at the social and technological aspects of being able to appear in public in proxy forms, via a range of advanced robotics platforms.

 

Q4: Who are the individual academics involved in the research?

University of Exeter, Department of Psychology

Psychologists from the University of Exeter are looking at how cutting-edge robotics can enable people to participate in public spaces as places to meet and share ideas without being there in person.  Professor Mark Levine is Principal Investigator of the project and leads the Exeter team, which consists of Dr. Miriam Koschate-Reis and Dr. Huseyin Cakal.

The Exeter team will be using accurate tracking and remote emotion sensing to explore contact amongst groups of humans, and also between humans and robots, and the effect this can have on social cohesion and intergroup relations.

Professor Levine said: “Being able to interact with others in public space plays an important role in the well-being of individuals and societies. Sadly, many people are unable to do this – because they are ill, housebound or unable to travel. However, if a robot proxy can act for them – and can transmit back the full experience of being with others - we can help to reduce social isolation and increase civic participation.

“We are very excited by the opportunities that new technologies offer to help us extend our research on helping behaviour and social interactions in public spaces. We hope our work on human-robot interactions will contribute to the public spaces of the future.”

Professor Mark Levine, Dr. Miriam Koschate-Reis and Dr. Huseyin Cakal will conduct a series of experiments in laboratory and semi-public spaces to deepen understanding of the relationship between social identities, social interactions and the spread of emotion in groups.

University of Exeter, Psychology

Professor Mark Levine

Dr. Miriam Koschate-Reis

Dr. Huseyin Cakal

 

University of Bath, Department of Psychology

Psychologists from the University of Bath are examining how people, both in front of and behind robots, understand, trust and engage with robotics in real-world scenarios. Professor Danaë Stanton Fraser and Dr. Chris Bevan will be using their complementary expertise in social and cognitive psychology, Human-Computer Interaction (HCI), and robotics to analyse the ways in which people respond to robots.

Professor Stanton Fraser said: “We are excited about extending our research in trust and identity to the area of human robot interaction and drawing on our experience of carrying out studies in public spaces”.

The research team will create a ‘living laboratory’, using state-of-the-art technologies to measure how people respond to, and interact with, other people who are acting through a robot representative.  The project will be using an advanced programmable humanoid robot, NAO, which the team will take into public spaces around Bristol and Bath to measure human interaction with robots.  NAO will be controlled remotely, and its controllers will be able to see and speak through its eyes and mouth while directing where it looks and walks.

Dr. Bevan said: “To some degree, the Internet has changed where, when and how strangers meet in public. This has numerous benefits, not least in removing many physical barriers to participation such as geographical distance and infirmity.  However, a consequence of digital communication is that much of the richness of real world interpersonal interaction – the source of much of its societal power and benefit – is simply lost in translation.  By exploring robots as a means of allowing people to ‘be there’ in a real-world public space – regardless of where they actually are – we can bridge the gap between these physical and digital worlds and take full advantage of their respective capabilities”.

University of Bath, Psychology

Professor Danaë Stanton Fraser

Dr. Chris Bevan

 

University of Oxford, Department of Computer Science

University of Oxford, Oxford Internet Institute

Computer scientists from the University of Oxford will investigate the feasibility and accuracy of a positioning system based on low-frequency magnetic fields.  This system will enable the tracking of humans and robots in any space, indoor or outdoor.  The technology has already been shown to deliver accurate positioning in underground animal tracking.

Professor Niki Trigoni, Dr. Andrew Markham and Dr. Traian Abrudan cannot use the same sensors directly, as those used in animal tracking are too large and impractical to deploy in an indoor environment.  Instead they will design and use small magnetic beacons that generate low-frequency fields.  These fields will then be sensed by low-power miniature receivers carried by individuals and robots, enabling them to position and orient themselves relative to the beacons.
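For readers curious how such a system could work in principle, here is a minimal sketch assuming an ideal magnetic dipole beacon, whose field magnitude falls off roughly with the cube of distance, so a receiver can estimate its range from the measured field strength. The calibration constant, beacon layout and readings below are hypothetical, and the Oxford team's actual algorithms are certainly more sophisticated.

```python
# Illustrative sketch only (not the project's algorithm).

def estimate_range(measured_field, beacon_constant):
    """Distance to a beacon from the measured field magnitude.
    beacon_constant = field strength * distance**3, obtained by calibrating
    the beacon at a known distance (hypothetical value used below)."""
    return (beacon_constant / measured_field) ** (1.0 / 3.0)

def weighted_centroid(beacon_positions, ranges):
    """Crude 2-D position estimate: beacons weighted by 1/range.
    A real system would solve a least-squares trilateration problem."""
    weights = [1.0 / r for r in ranges]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _) in zip(weights, beacon_positions)) / total
    y = sum(w * by for w, (_, by) in zip(weights, beacon_positions)) / total
    return x, y

# Hypothetical deployment: three beacons at known positions, readings in tesla.
beacons = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
readings = [2.0e-7, 5.0e-8, 5.0e-8]
ranges = [estimate_range(b, beacon_constant=1.6e-6) for b in readings]
print(weighted_centroid(beacons, ranges))
```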

University of Oxford, Computer Science

Professor Niki Trigoni

Dr. Andrew Markham

Dr. Traian Abrudan

Researchers at the Oxford Internet Institute (OII) are exploring the privacy concerns around surrogate robots.  Professor Ian Brown Dr. Joss Wright and Piers O’Hanlon are working to embed privacy in the design of robots – while robots can record and transmit what they see and hear; their research is trying to find ways to prevent them from unnecessarily revealing the identities of the people they have captured.

Humanoid robotics is an emerging research field that will become increasingly important as robots start to assist people in their daily lives, for example becoming companions for older people in their homes. Professor Brown said: ‘When we begin to interact with friendly-looking humanoid robots, our expectations and assumptions shift. New questions arise about how much we trust these devices. Some people might develop an emotional attachment to them, particularly in situations where robots play the role of companions. It is important therefore that we design robots that have privacy embedded into their design, so their information gathering is restricted to what is needed to interact and carry out their tasks, and information about the identity of their human users is kept to a minimum. Otherwise, these robot “friends” could betray the trust of the people they come into contact with, passing on information to third parties.’

The team at the OII will consider the challenges of having robot proxies in public spaces.  They will conduct experiments exploring trust in shared social settings, and develop a framework for understanding the impact of privacy and anonymity in human-robot interactions.

“Humanoid robots have the potential to gather, store and analyse data about our movements and activities,” said Dr. Wright. “While they provide opportunities to make our lives easier, the potential loss of control over this information should concern us.  At Oxford we have been exploring how individuals can maintain control over information about themselves, while still enjoying the potential benefits of robotic technology.”

University of Oxford, Oxford Internet Institute

Professor Ian Brown

Dr. Joss Wright

 

University of Cambridge, Computer Laboratory
(previously Queen Mary University of London, School of Engineering & Computer Science)

Researchers from the University of Cambridge (who were formerly at Queen Mary University of London) will analyse human nonverbal behaviour and emotional expressions in human-robot interactions.  They will do this in a lab setting and in public via the ‘living laboratory’, using state-of-the-art technologies to measure how people respond to, and interact with, other people who are acting remotely through a robot representative.

Dr. Hatice Gunes, assistant professor in digital media, said: “We are excited about extending our research in individual emotions and nonverbal behaviour to public space settings where multiple people, multiple groups and even robots will be meeting and interacting.

“Interactions between individuals in a public space generate a rich set of explicit and implicit data, from gestures, visual cues, and body language to long-term patterns of interaction and group movement.  We are interested in obtaining fast and robust means for quantifying emotions and behaviour in collective settings.”

Dr. Gunes and her team will draw on their expertise in affective computing, the study of systems that are sensitive to human emotions and behaviour, and in multimodal information processing, combining human nonverbal behavioural cues from different sources.
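As a purely illustrative sketch of extracting one simple group-level nonverbal cue from video, one might detect faces with OpenCV and measure how tightly clustered they are. This is not the Cambridge team's pipeline; the detector parameters and the "spacing" proxy are assumptions made for the example.

```python
import cv2

# Illustrative only: count visible faces in a frame and compute a crude
# "group spacing" cue (mean pairwise distance between face centres, in pixels).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def group_cues(frame_bgr):
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return {"faces": 0, "mean_spacing_px": None}
    centres = [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in faces]
    dists = [((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
             for i, (ax, ay) in enumerate(centres)
             for (bx, by) in centres[i + 1:]]
    return {"faces": len(faces),
            "mean_spacing_px": sum(dists) / len(dists) if dists else 0.0}

# Hypothetical usage with a single webcam frame:
# ok, frame = cv2.VideoCapture(0).read()
# print(group_cues(frame))
```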

University of Cambridge, Computer Laboratory

Dr. Hatice Gunes

Dr. Oya Celiktutan

 

Bristol Robotics Laboratory

Researchers from the Bristol Robotics Laboratory (BRL) are joining other leading research institutions in a new project looking at how remotely operated robots could enable people to take part in public spaces – without actually being there.  Dr. Paul Bremner and Peter Gibbons will look at how remotely operated robots might enable people to participate in public spaces – a key aspect of developing successful citizenship and public cohesion – when accessibility or geography prevents them from being physically present in the space.

The BRL team will look at the social and technological aspects of being able to appear in public in proxy forms, via a range of advanced robotics platforms. The robots will be controlled remotely, a method called tele-operation, and a tele-operator will be able to see through the robot’s eyes and speak through its mouth, while directing where it looks and how it moves.

Dr. Bremner said, “Public spaces play a valuable role in providing shared understanding and common purpose, but if you are ill or disabled, or live too far away, this can be a barrier to participation.  The aim of our research is for the robot to act as an avatar for a remote person, so that it takes part in the same activities as those actually present in the venue.

“To investigate this we will use several robots, such as Engineered Arts’ RoboThespian, Aldebaran’s NAO, and MobileRobots' PeopleBot.  The robots will be tele-operated to produce speech, gestures and other non-verbal social behaviour so that we can look at the way robot avatars transmit social presence, first via motion capture (with a Microsoft Kinect) and later via desktop control (keyboard and mouse).  Over the course of the project some autonomy will be added to the robots to enable better social interaction and allow simple desktop control.  We will also investigate how different robot appearances and behaviours affect the social interaction.

“An example of how this could eventually be used might be a NAO robot in a museum, acting as an avatar – looking round at the exhibits and interacting with other visitors – on behalf of someone in another part of the city who is unable to visit because of disability or illness.  This research will allow us to develop the technology to make this possible, and to evaluate how people interact with the robot.  Developing robots that interact effectively with humans in different social situations is crucial if future robots are to be genuinely useful to society.”
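To give a flavour of the desktop-control idea described above, here is a minimal sketch using Aldebaran's NAOqi Python SDK (Python 2), in which typed commands are relayed to the robot as speech, head movements and a canned gesture. The robot's IP address and the command mapping are hypothetical, and the project's own tele-operation software may differ substantially.

```python
from naoqi import ALProxy  # Aldebaran's NAOqi Python SDK (Python 2)

ROBOT_IP = "192.168.1.10"  # hypothetical address of a NAO on the local network
PORT = 9559                # default NAOqi port

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
motion = ALProxy("ALMotion", ROBOT_IP, PORT)
motion.wakeUp()  # enable the motors

def relay_speech(text):
    """Speak the operator's words through the robot's mouth."""
    tts.say(text)

def look(yaw_radians, pitch_radians=0.0):
    """Point the robot's head where the operator wants to look."""
    motion.setAngles(["HeadYaw", "HeadPitch"],
                     [yaw_radians, pitch_radians], 0.2)

def wave():
    """A simple canned gesture: raise and lower the right arm."""
    motion.angleInterpolation("RShoulderPitch", [-1.0, 1.4], [1.0, 2.0], True)

# Hypothetical desktop-control loop: map typed commands to robot actions.
if __name__ == "__main__":
    while True:
        cmd = raw_input("say <text> | look <yaw> | wave | quit > ").strip()
        if cmd.startswith("say "):
            relay_speech(cmd[4:])
        elif cmd.startswith("look "):
            look(float(cmd[5:]))
        elif cmd == "wave":
            wave()
        elif cmd == "quit":
            break
```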

The Bristol Robotics Laboratory (BRL) is a collaborative partnership between the University of Bristol and the University of the West of England (UWE, Bristol).

Bristol Robotics Laboratory

Dr. Paul Bremner

Peter Gibbons

 

Watershed, iShed

The academic research teams are working with a diverse range of creative practitioners to add nuance and a disruptive element to the project.  A group of artists, designers and game makers, all of whom work with technology, will be collaborating throughout the three year project.

Watershed is a cross-artform venue and producer, sharing, developing and showcasing exemplary cultural ideas and talent. They are based in Bristol, but place no boundaries on their desire to connect with artists and audiences in the wider world.  iShed was established by Watershed in 2007 to produce creative technology collaborations.

The Being There project seeks to create live research trials in public spaces, engaging non-academic audiences with the themes at the heart of the project and delivering usable study results for the teams. To ensure these trials are meaningful, creative and engaging, iShed has brought together a group of creative practitioners who will collaborate with the researchers.

From composers and choreographers, to game makers and experience designers, the creative cohort have a broad range of backgrounds and specialisms.  Working across the project, iShed will facilitate a process of labs, workshops, sprints and commissions which will help shape the direction of the research, catalyse new partnerships and collaborations and deliver disruptive, creative and engaging work.

iShed

 

Q5: What robots were used in the research?

The project is using a selection of robots in the research. 

Aldebaran's NAO

NAO is a 58 cm tall humanoid robot that can move, see, touch and hear.  NAO has 25 degrees of freedom for movement; two cameras to see its surroundings; an inertial measurement unit that tells it whether it is upright or sitting down; touch sensors to detect contact; and four directional microphones so it can hear.

Engineered Arts' RoboThespian

RoboThespian is a life-sized humanoid robot designed for human interaction in a public environment.  RoboThespian is fully interactive, multilingual, and user-friendly, making it an ideal device with which to communicate and entertain.

MobileRobots’ PeopleBot

PeopleBot is a differential-drive robot built on a robust base, with a chest-level extension to facilitate interaction with people.  Using infrared sensors, PeopleBot can navigate autonomously and avoid obstacles with precision.  It has an onboard computer and a pan/tilt/zoom camera that can be used for object and person recognition, tracking, mapping and localization.  With an audio and speech package, PeopleBot can also speak.

INRIA’s Poppy

Poppy is designed for conducting scientific experiments, integrating several key abilities in an easy-to-use robotic platform.  It is an open-source humanoid platform based on robust, flexible, easy-to-use hardware and software.

  

Q6: Why is this research funded?  Is it a good use of public funds?

The EPSRC is the main UK government agency for funding research and training in engineering and the physical sciences, investing more than £800 million a year in a broad range of subjects, from mathematics to materials science and from information technology to structural engineering.

Financial support for projects such as ours is important because the findings influence research domains including social inclusion, interaction design, robotics, and privacy protocols.  Some specific examples of the expected impact of our research are given below.

One of the first challenges for our research is to maintain the right to be a stranger in public. Interactions of individuals in a public space generate a rich set of explicit and implicit data, from gestures, visual cues, and body language to long-term patterns of interaction and group movement. Data gathering and analysis technologies, combined with auxiliary information sources easily accessible via the internet, pose an increasing challenge to maintaining control over personal information, and the inferences that can be drawn from it. We can be tracked, surveilled, profiled and socially sorted as we move through public space. We will therefore explore the possibilities of reducing the need to reveal information when interacting in public spaces, through data minimization techniques and privacy-enhancing technologies.

At a broader level we wish to support protocols and modes of interaction that allow the coordination of individuals and groups, both in location and time. To achieve this whilst maximizing privacy, we require tools that allow such data to be shared in a way that minimizes centralized control and observability. As such, we will develop approaches combining data minimization, decentralized data storage, and the design of new distributed protocols for privacy-preserving coordination and synchronisation.
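As a toy illustration of the data minimization idea (not the project's actual protocol), a user's device might replace their identity with a keyed pseudonym and coarsen their location before anything is shared, so that coordination remains possible without revealing who was exactly where. The key, field names and rounding level below are assumptions made for the example.

```python
import hashlib
import hmac

SECRET_KEY = b"per-deployment secret"  # hypothetical key held by the user's own device

def pseudonym(identity):
    """Keyed hash so the same person is recognisable across events,
    but their real identity is never transmitted."""
    return hmac.new(SECRET_KEY, identity.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

def coarsen(lat, lon, decimals=2):
    """Round coordinates to roughly 1 km so precise movements are not revealed."""
    return round(lat, decimals), round(lon, decimals)

def minimised_record(identity, lat, lon):
    """The only record that leaves the device: a pseudonym and a coarse location."""
    return {"who": pseudonym(identity), "where": coarsen(lat, lon)}

print(minimised_record("alice@example.org", 51.4545, -2.5879))
```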

Existing barriers to participation in a thriving public space include the tendency towards more privatised or commercialised physical spaces, in which the participation of certain groups (for example the young, the poor, or the ethnically different) can be discouraged. Other barriers also exist: some people cannot be physically present in public spaces for reasons of illness, infirmity, fear, stigma, conflicting obligation or geographical distance.  Our research will draw on the developing capability to appear in public in proxy form. Technological advances are such that different forms of inhabiting public space will become more ubiquitous, even for those who might be physically able to be present in public. As part of establishing a model public space, our research will explore the challenges and consequences of different forms of interaction in public – between co-present humans, but also between human and robot proxy.

As a result of our research work, one of the foreseen outcomes is that we will engage with City Councils in the UK to share our research findings and encourage them to become early adopters of the privacy protocols that we develop, and the public space adaptations we design in order to promote contact and social cohesion.

More information on the EPSRC and its funding areas can be found on the EPSRC website.