
Source: MIL-OSI Submissions
Source: University of Auckland

As the pandemic obliged many of us to retreat from the real world into a remote one, researchers at the ABI upped the pace of their research on how to bring more of the human dimension to Zoom and other teleconferencing platforms.

This includes numerous projects led by world-renowned augmented reality expert Professor Mark Billinghurst in the Empathic Computing Lab (ECL) at the ABI, which aims to develop software systems that put more empathy into technology.

Being locked out of their own building added a sense of urgency to the team’s research. In a paper published this month in the Journal of Multimodal User Interfaces, the team presented Mixed Reality (MR) technology showing how the sounds and sights of the real world can be reproduced in a virtual 3D replica, allowing for more of the implicit cues that are a crucial part of the way we talk to and understand each other.
“Most people still prefer direct face-to-face communication over current audio-video conferencing solutions, partly because the latter usually fail to convey the non-verbal cues that we instinctually interpret in face-to-face encounters,” says Amit Barde, Research Fellow at the Lab.
He and his team’s research showed how they could guide someone wearing a commercially available Virtual Reality (VR) headset around a virtual representation of Level 7 in ABI House, using visual as well as auditory cues, such as the sound of an object being knocked on a bench.
The auditory cues are directional – that is, the user can tell whether, in virtual reality, the sound is coming from their left, their right, or behind them, just as they would in the real world.
They demonstrated that a local user (someone who really was on Level 7) could be guided by a ‘virtual’ guide with such a high level of precision that, using auditory and visual cues, they could find a real 2 cm³ Lego brick in a real 90 m² space – all with help from a remote user inhabiting the same environment in virtual reality at the same time.
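As a rough illustration of how a directional cue like the knock on a bench might be placed in a headset’s audio scene, the sketch below uses the Web Audio API’s HRTF panner to position a sound relative to the listener. The positions, file name and function are illustrative assumptions, not the ECL’s actual implementation.

```typescript
// Hypothetical sketch: placing a cue sound in 3D space with the Web Audio API,
// so the listener hears it to their left, right, or behind them.
const ctx = new AudioContext();

async function playSpatialCue(url: string, x: number, y: number, z: number): Promise<void> {
  // Decode the cue sound (e.g. a knock on a bench).
  const data = await (await fetch(url)).arrayBuffer();
  const buffer = await ctx.decodeAudioData(data);

  // HRTF panning approximates how each ear receives the sound,
  // which is what lets the brain judge its direction.
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: x,
    positionY: y,
    positionZ: z,
  });

  const source = new AudioBufferSourceNode(ctx, { buffer });
  source.connect(panner).connect(ctx.destination);
  source.start();
}

// A knock two metres to the listener's right and slightly behind them
// (the listener sits at the origin, facing -Z by Web Audio convention).
playSpatialCue("knock.wav", 2, 0, 1).catch(console.error);
```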
“We found we could replicate natural auditory perception, as if the remote user was in the same space as the local user,” says Mr Barde. This is important, he says: VR platforms that offer the level of auditory and visual perception we experience in the real world will be crucial to the future of telecommunications.
He and his fellow researchers have developed, and are continuing to develop, a number of ways to bring more of the real world into the Zoom-like experience – to make current video-conferencing systems less like talking to two-dimensional tiles of other people’s faces on a flat screen, and more like sharing a place in which you feel you’re in the same room.
That includes the development of a platform that allows someone to share a real space with others and have a 360-degree view of that space in real time, and to rotate around in that online virtual space – to move forward or step back, as we would in the real world.
Moreover, their technology enables multiple participants to look in different directions independently of each other’s points of view – such as staring out the window during a meeting, as we might in the real world.
His colleague, Dr Huidong Bai, explains: “So we’ve shown how you can pan the 360 video by yourself.  So I might be the tour guide, but each person streaming in can decide where they want to look, independent of the tour guide. They can head off in their own direction.”
He compares current video-conferencing technologies to watching a screen through the director’s window, seeing things from his or her perspective. “But with our platform you have a choice about where you want to look.”
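A minimal sketch of the idea Dr Bai describes – one shared 360-degree stream, with each participant keeping their own view direction – might look like the following. The participant structure and the yaw/pitch convention are assumptions for illustration, not the lab’s actual platform.

```typescript
// Hypothetical sketch: every participant shares the same 360° frame,
// but each keeps an independent view direction inside the video sphere.
interface Participant {
  name: string;
  yaw: number;   // radians, rotation about the vertical axis
  pitch: number; // radians, looking up (+) or down (-)
}

// Convert a participant's yaw/pitch into the unit vector their
// virtual camera looks along inside the shared video sphere.
function lookDirection(p: Participant): [number, number, number] {
  const x = Math.cos(p.pitch) * Math.sin(p.yaw);
  const y = Math.sin(p.pitch);
  const z = Math.cos(p.pitch) * Math.cos(p.yaw);
  return [x, y, z];
}

// The guide and two remote viewers stream the same space
// but look in different directions at the same moment.
const guide: Participant   = { name: "guide", yaw: 0, pitch: 0 };
const viewerA: Participant = { name: "A", yaw: Math.PI / 2, pitch: 0 };  // looking right
const viewerB: Participant = { name: "B", yaw: Math.PI, pitch: 0.3 };    // behind and up

for (const p of [guide, viewerA, viewerB]) {
  console.log(p.name, lookDirection(p));
}
```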
Their technologies have the potential to allow for experiences redolent of the fictional Holodeck in Star Trek – to go where we have never been before, or might not be able to go in the real world. “It’s about enabling people to feel that they really are somewhere where they physically aren’t,” says Mr Barde.
“It allows for more of those implicit cues that don’t come through a video conference, because you’re staring at a screen, you’re only seeing yourself and someone else in two dimensions.”
He predicts that Covid-19 is likely to accelerate both research and the uptake of technology that addresses the shortcomings of the current telecommunications platforms on which many of us, at short notice, became so dependent.
“With Zoom we get face-to-face information, but a VR element can provide much richer 3D information.”
“But we’re not about foisting new technology onto people. What we’re trying to do is improve human-to-human interaction, mediated by these communication systems, so that they allow for more and easier human interaction.”

MIL OSI