CWRU-led team among 38 from 16 countries selected to compete for $10 million prize
A team led by Case Western Reserve University (CWRU) biomedical engineer Dustin Tyler has moved a step closer to the finals of a global competition to develop an "Avatar System that will transport a human's sense, actions and presence to a remote location in real time, leading to a more connected world."
On Monday, XPrize, the same organization behind $10 million of financial support in 2004 to make the first privately funded human spaceflight, announced the semifinalists: 38 teams from 16 countries.
Sponsored by All Nippon Airways (ANA), Japan's largest airline, the Avatar XPrize is a four-year global competition focused on the development of an avatar system with the ability to execute tasks across a variety of real-world scenarios and "convey a sense of presence for both the operator and the recipient in those interactions."
"We are rapidly approaching an era where it will be possible for a doctor in New York to export their skillset virtually to respond to a natural disaster across the globe," Anousheh Ansari, CEO of XPrize, said in a news release. "The task at hand for these incredible semifinalist teams ... is to break through physical limitations and expand the capacity of humankind itself through transformative robotic avatar technology."
A 'broad array of expertise'
The team, led by Tyler at the Case School of Engineering, includes more than two dozen researchers from five academic institutions: a dozen collaborators from the CWRU campus and several more from Carnegie Mellon, Cleveland State University, UCLA and the University of Wyoming.

"Our partners bring a broad array of expertise that includes neural engineering, haptic robotics, phenomenology, cognitive neuroscience and virtual reality to create a powerful team,鈥 Tyler said. The team was for the competition in 2020.
Next step: An in-person demo of the Human Fusions avatar in Miami in September.
Tyler also said he was impressed that the XPrize organization had the vision "to realize that travel is about the experience of being anywhere, and that this does not necessarily require physical travel to achieve. This aligns perfectly with the vision and mission of the Human Fusions Institute and our avatar."
The Human Fusions Institute was established by Tyler in 2019 to advance his research and is providing the underlying technology for the XPrize effort.
The institute is supported in part by grants and a portion of a gift for biomedical engineering that the late alumnus Robert Aiken and his wife, Brenda, committed in 2017.
Emily Graczyk, a research assistant professor of biomedical engineering who works at the institute and on the XPrize entry, said the robotic systems of this avatar, nicknamed "Sensa," were primarily developed at UCLA, while the human interfaces were primarily developed at CWRU.
Graczyk said two students "were truly critical to making this success possible": Luis Mesias Flores, a PhD student in electrical engineering at CWRU, and Lionel Zhang, a PhD student in mechanical and aerospace engineering at UCLA.

Flores and Zhang are the two students, 2,300 miles apart in Ohio and California, featured in a demonstration of the avatar's ability to maneuver, grasp and sense.
While the prototype is "focused first on providing an embodied remote experience with agency, touch sensation, visual feedback, and audio feedback," Tyler said, someday it could provide the basic technology for much more.
"We are developing the Human Fusions Sensa Avatar with the broader goal toward a health care avatar to extend care and medical expertise to all corners of the world and to all people,鈥 Tyler said.
In his 16 years at CWRU, Tyler has been on a mission to extend physical touch, the "essence of the human experience," across space and time.
In recent years, he and his team have brought the sensation of physical touch to a prosthesis so an amputee could feel and effectively slice a tomato, fundamentally changing the prosthesis from a sporadically used tool to a working "hand."
"For the past nine years, we have seen the impact of direct neural connection between a robotic system and individuals with limb loss,鈥 Tyler said. 鈥淭hey would describe their robotic prosthesis as 'their hand.'
He said that's when he and his team realized the much larger potential of this NeuroReality technology.
"We could put their prosthesis anywhere in the world and still allow a person to put 'their hand' anywhere in the world," he said. "By adding 3D-visualization and voice to the experience, we are creating a realistic sensation of actually being anywhere in the world at any time over the internet."
For more information, contact Mike Scott at mike.scott@case.edu.