In this video demonstration you will see an early test of interaction using an Aldebaran NAO robot as an avatar for a remote person. As well as relaying tracked body motion, audio is streamed through the robot, and sound and vision are relayed to the operator. We are now beginning to test the effects of robot mediated interaction.
A demonstration of our immersive telepresence system, which uses a Kinect and an Oculus Rift to control the body and head motion of a NAO robot. A stereo headset mounted on the NAO allows the operator to see from the robot's point of view.
Politicians and the public may have grown tired of shaking hands at the end of a long election campaign, but there’s new evidence that it really does matter when it comes to striking the best deal in negotiations.
Researchers from QMUL’s School of Electronic Engineering and Computer Science (EECS) presented some of their research to colleagues and visitors at the Mile End Campus during their Research Showcase on 22 April 2015.
Read how Professor Alan Winfield (UWE Bristol) found the experience of looking through the eyes of a robot.
Three project partners will be hosting members of the creative cohort for week-long micro-residencies in April.
Dr Chris Bevan and Professor Danaë Stanton Fraser attended the HRI 2015 conference held in Portland, USA, at the beginning of March. They presented their paper “Shaking Hands and Cooperation in Tele-present Human-Robot Negotiation” and were delighted to receive the Best Enabling Experimental Studies Award.