Using cameras and AI to help exoskeletons adapt to their environment
Researchers at Canada’s University of Waterloo are showcasing work on prostheses and exoskeletons that uses cameras and AI to deliver more natural human movement. The ExoNet project feeds video from a wearable camera through deep-learning models so the devices can adapt and adjust movement the way humans do, based on their surroundings.
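At a high level, that kind of pipeline classifies each camera frame into a walking environment and switches the device's locomotion mode when the environment changes. The sketch below illustrates the idea only; the class labels, the MobileNetV2 backbone, and the mode-switching rule are illustrative assumptions, not details from ExoNet or the Waterloo team's actual system.

```python
# Illustrative sketch: camera-based environment recognition driving mode switches.
# Classes, model, and control logic are placeholders, not ExoNet's real design.
import cv2
import torch
import torchvision.transforms as T
from torchvision.models import mobilenet_v2

# Hypothetical walking-environment classes a controller might distinguish.
CLASSES = ["level_ground", "incline_stairs", "decline_stairs", "ramp", "uneven_terrain"]

# Lightweight CNN suited to wearable hardware; final layer resized to our classes.
model = mobilenet_v2(weights=None)
model.classifier[1] = torch.nn.Linear(model.last_channel, len(CLASSES))
model.eval()

preprocess = T.Compose([
    T.ToPILImage(),
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_frame(frame_bgr):
    """Return the predicted environment label for one camera frame."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0)
    with torch.no_grad():
        logits = model(batch)
    return CLASSES[int(logits.argmax(dim=1))]

def main():
    cap = cv2.VideoCapture(0)  # wearable camera stream (a webcam stands in here)
    current_mode = None
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            mode = classify_frame(frame)
            if mode != current_mode:
                current_mode = mode
                # Placeholder for commanding the exoskeleton's controller.
                print(f"Switching locomotion mode -> {current_mode}")
    finally:
        cap.release()

if __name__ == "__main__":
    main()
```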
The project aims to adjust locomotion on the fly, producing more natural movement than current systems that rely on connected smartphone apps or other external controllers.
“That can be inconvenient and cognitively demanding,” Waterloo PhD candidate Brokoslaw Laschowski said in a release tied to the research. “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”