How far are humans from “Avatar”?
The question remains unsettled to this day, but Cameron has pointed out a direction in his own way: in making the film “Avatar”, the concept of “injecting human intelligence into a remotely controlled biological body” (the so-called “avatar”) became the core of the whole story.
It also points to another possibility for how humans and robots might coexist in a future world.
Recently, the Avatar XPRIZE robot competition, sponsored by Japan’s All Nippon Airways (ANA), set out to find outstanding teams whose “avatar robots” can be remotely operated with human intelligence. Teams that complete dozens of challenges with their avatars stand a chance to win the $10,000,000 grand prize.
The prize money may be just a reward, but the teams’ attempts may become an important part of humanity’s road to “Avatar”.
A more “grounded” robot competition
Like DARPA’s Robotics Challenge, XPRIZE promotes technological development through global competitions, but its contests revolve around fields such as climate change, food crises, and tropical rainforest protection.
This robotics-focused Avatar XPRIZE also differs from DARPA’s robotics competition: its aim is to find an avatar system that can deploy human senses, actions, and presence to a distant location, creating a more connected world.
Clearly, such a goal is more grounded than that of more sci-fi AI robots.
As XPRIZE CEO Anousheh Ansari puts it: “When there is a crisis in one place, people in another country can help without leaving their homes.” The avatar approach targets real-world problems, and it is precisely this quality that keeps Avatar XPRIZE close to how the industry actually combines these technologies today.
As the competition’s name implies, it treats the robot as an avatar rather than a tool. Operators need to identify what their robot is touching, hear what it hears, see what it sees, and sense temperature through temperature sensors — an important reason why the teams all adopt VR headsets.
Second, XPRIZE uses a simulated “high-quality” network connection whose “reliability, bandwidth, latency and jitter will represent the best available public Internet service.” It is hard to say how much more efficient the DARPA Robotics Challenge teams would have been with access to the best available public internet, but for Avatar XPRIZE, which relies on VR equipment and seeks the highest possible synchronization rate, the network quality is better than in DARPA’s competition.
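Why latency and jitter matter so much for an avatar system can be illustrated with a small simulation of a teleoperation command stream. The parameters below (base latency, jitter range, control deadline) are illustrative assumptions, not XPRIZE’s actual network specification:

```python
import random

def simulate_link(n_packets, base_latency_ms, jitter_ms, deadline_ms, seed=0):
    """Simulate a teleoperation command stream over a jittery link.

    Each packet's one-way delay is the base latency plus random jitter;
    packets arriving after the control deadline count as 'late', which
    in a real avatar system would break operator-robot synchronization.
    """
    rng = random.Random(seed)
    late = 0
    for _ in range(n_packets):
        delay = base_latency_ms + rng.uniform(0, jitter_ms)
        if delay > deadline_ms:
            late += 1
    return late / n_packets  # fraction of commands missing the deadline

# Hypothetical "good public internet": 30 ms base latency, up to 20 ms
# jitter, against a 40 ms control deadline.
late_fraction = simulate_link(10_000, 30, 20, 40)
```

Even on a link that looks fast on average, jitter alone can push a large share of commands past the deadline, which is why the rules pin down jitter and not just bandwidth.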
It is also worth noting that teams must complete the set goals in 20 minutes or less, which effectively raises the difficulty.
Seen this way, Avatar XPRIZE tests not only the sensitivity and versatility of the robots the teams design, but also places high demands on the operators’ proficiency. All of this hints at how valuable the competition’s tasks would be once realized in practice, and the details of some teams’ entries show further what makes this competition special.
Finding the “way of the future” from the real world
As a human-machine collaboration event, the tasks Avatar XPRIZE sets seem relatively basic but diverse: they cover not only disaster scenarios but also many seemingly everyday chores, and additional task requirements are issued on the spot. Although these tasks are not difficult on the whole, completing them within a set time is considerably harder. The following were the target tasks of a previous round, which offer a glimpse of the competition’s character:
Each of these actions looks very ordinary, but together they reflect the complexity of a specific scene. Once a robot can be operated to complete them quickly, the team can make that robot play a practical role in certain special scenarios.
Take NimbRo, the team that won first place in the 2021 semi-finals: this entry from the Autonomous Intelligent Systems Laboratory at the University of Bonn, Germany, consists of an operator station and a mobile avatar robot.
NimbRo, from the Autonomous Intelligent Systems Laboratory at the University of Bonn, Germany, during a competition run | IEEE
The robot has a human-like upper body with two arms and two five-fingered hands, and its head carries a wide-angle camera, a microphone array, and a display that shows the operator’s facial expressions.
During remote operation, the robot synchronizes the operator’s facial expressions and voice to the scene in real time, a process achieved by multiple cameras and eye-tracking technology inside the VR headset. Head and hand movements are captured and transmitted to the robot, and the operator “avatars” into the robot, performing various actions through the VR controllers and a hand-manipulation device.
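A core piece of any such system is mapping the operator’s head motion onto the robot’s neck joints while respecting the robot’s mechanical limits. The sketch below is illustrative only — the function names and joint limits are assumptions, not NimbRo’s actual code:

```python
def clamp(value, lo, hi):
    """Limit a value to the closed range [lo, hi]."""
    return max(lo, min(hi, value))

def map_head_pose(headset_yaw_deg, headset_pitch_deg,
                  yaw_limit=90.0, pitch_limit=45.0):
    """Map a VR headset orientation onto robot neck joint targets.

    The operator's head motion is clamped to the robot's (hypothetical)
    mechanical joint limits before being sent as a command, so an
    extreme head turn cannot drive the neck past its range.
    """
    return (clamp(headset_yaw_deg, -yaw_limit, yaw_limit),
            clamp(headset_pitch_deg, -pitch_limit, pitch_limit))

# A sharp 120-degree head turn is limited to the robot's 90-degree range.
yaw_cmd, pitch_cmd = map_head_pose(120.0, -10.0)
```

Real systems run this mapping at a high rate with smoothing and velocity limits; the clamp is just the simplest safety layer.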
The NimbRo team controls the robot | IEEE
The NimbRo team notes that the operator can feel the grasping force of the avatar robot’s palm through force-torque sensors on the wrists, and can also sense contact through the motors’ electric current, picking up subtle fingertip touches.
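Current-based touch sensing works because a DC motor’s torque is roughly proportional to its current. A toy estimator shows the idea — the torque constant, lever arm, and no-load current below are illustrative values, not NimbRo’s calibration:

```python
def grip_force_from_current(current_amps, torque_constant_nm_per_a,
                            lever_arm_m, no_load_current_a=0.1):
    """Estimate fingertip contact force from motor current.

    Motor torque is approximately Kt * I above the no-load current;
    dividing torque by the finger's lever arm gives an approximate
    contact force. Real systems calibrate this per joint and add
    friction compensation.
    """
    effective = max(0.0, current_amps - no_load_current_a)
    torque = torque_constant_nm_per_a * effective
    return torque / lever_arm_m

# Example: 0.6 A through a motor with Kt = 0.05 N*m/A driving a finger
# link with a 2.5 cm lever arm.
force_n = grip_force_from_current(0.6, 0.05, 0.025)
```

An estimate like this can then be rendered back to the operator as haptic feedback, closing the touch loop without dedicated fingertip sensors.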
Using this equipment, the NimbRo robot completed semi-final operations including brewing coffee, playing chess, solving jigsaw puzzles, measuring blood pressure, body temperature, and blood oxygen saturation, and helping a person in need put on a jacket — all of which make the robot more practical in the post-pandemic era.
Similarly, the Avatrina robot developed at the University of Illinois at Urbana-Champaign (UIUC) integrates VR and robotics technology, mapping the operator’s movements in VR equipment onto the robot in real time.
Avatrina is equipped with visual and auditory systems that let the remote operator communicate with the person in front of the robot, who in turn can see the operator through the tablet that serves as the robot’s head (its “face”).
In the eyes of many teams, this combination of robotics and VR brings humans one step closer to the future. Compared with other approaches, the remaining problems in this field are not fatal ones: network delay, the limited resolution and field of view of VR devices, and the limited dexterity of avatar robots will all be gradually solved over time.
More importantly, this way of letting humans remotely control robots offers another answer to the growing anxiety about “AI replacing humans”.
The Avatrina Robot from the University of Illinois at Urbana-Champaign team | IEEE
Imagine a winning team combining state-of-the-art technologies so that any untrained operator, after a very short training period, could remotely control an avatar robot 100 kilometers away to complete tasks from simple to complex in a specific environment. That would upend some forms of work in the commercial sphere: shopping-mall guidance, hotel service, scenic-spot tourism, product sales, and more could be done without people leaving home, saving enormous resources.
As Kris Hauser, professor of computer science at the University of Illinois at Urbana-Champaign, said: “Virtual experiences won’t be a perfect substitute for all face-to-face interactions, but even if we could replace 50 percent of travel, it would still be a huge victory in time, cost, and energy consumption.”
“Avatar” shines into reality
In robotics research, AI robots with autonomous decision-making capabilities are often the hottest topic of the moment, constantly discussed alongside the anxiety of “AI replacing humans” — but this may not match reality.
Today, whether it is a robotic arm on a spacecraft in deep space or a robot assistant in the home, having humans complete specific tasks by remotely controlling robots is already reality; the goal of Avatar XPRIZE is to go a step further and bind machine and human more closely together.
“During the epidemic, some remotely controllable robots were used to provide telemedicine for patients isolated at home and let them communicate with relatives; they could also spray disinfectant or apply ultraviolet disinfection via remote control. Going a step further, they are also showing up in surgery, explosive ordnance disposal, and even space exploration.” In Professor Kris Hauser’s view, such robots will remain an important tool in responding to population aging, as he says:
“One of the biggest upcoming opportunities for avatar robots is home healthcare. The U.S. will face a shortage of more than 1 million home health aides by 2035 due to an aging population. A lot of the help seniors need is fairly routine, and if you could log into a robot avatar in the middle of the night and spend 10 minutes helping them without having to hire around-the-clock care, it would be a huge help for many families.”
A more practical application is combining the avatar robot with the commercial AI robots we commonly see today, achieving an effect where 1 + 1 > 2.
HALODI robot in Norway｜HALODI
Norway-based robotics company HALODI produces commercial humanoid robots that use AI to perform pre-programmed services such as greeting at doors and guiding shoppers in supermarkets.
When a robot needs to perform services beyond its pre-programmed range — opening a door for someone, picking up an item, and so on — a back-end operator can take over by connecting through a VR device, seeing what the robot sees and hearing what it hears, and so provide the corresponding service to the person in front of the robot.
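The takeover pattern described here — autonomy by default, with a human operator as the fallback — can be sketched as a simple dispatcher. The task names and task set below are illustrative assumptions, not HALODI’s actual API:

```python
# Hypothetical set of services the on-board AI can run autonomously.
PREPROGRAMMED_TASKS = {"greet_at_door", "guide_to_aisle"}

def dispatch(task, operator_available):
    """Route a task to the on-board AI or to a remote human operator.

    Tasks in the pre-programmed set run autonomously; anything outside
    that set is handed to a VR-connected operator, or declined if no
    operator is on call.
    """
    if task in PREPROGRAMMED_TASKS:
        return "autonomous"
    return "operator_takeover" if operator_available else "declined"

# An unprogrammed request (opening a door) goes to the human operator.
mode = dispatch("open_door", operator_available=True)
```

The design choice is that the robot never improvises outside its programmed repertoire — anything unfamiliar is escalated to a human rather than attempted autonomously.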
Far from putting people out of work, this combination can greatly reduce their workload. Based on this approach, HALODI’s robots have been commercialized in Norway and have become “special waiters” in some medical institutions.
In a sense, the development of robotics resembles the now-booming field of autonomous driving: the ultimate goal is a fully autonomous vehicle at the L4-L5 level, but there is still a long way to go before that goal is truly reached. Until then, the human driver remains the protagonist.
Before scientists can develop general-purpose AI robots like the “Terminator”, using VR and similar methods to place a distant human at the center of operation and decision-making may be an important variable in the future development of robots, and one very feasible path.
What the Avatar XPRIZE contest shows is humanity’s effort toward the “Avatar” robot.