Researchers at Duke University have developed SonicSense, a system that lets robots interact with their environment through a robotic hand equipped with microphones on its fingertips. The system captures vibrations and analyzes them to determine an object's material and shape. SonicSense uses AI to identify objects with complex geometries and multiple materials, and it learns from repeated interactions to improve its recognition. The system is designed to be affordable and easy to integrate into robots, potentially enhancing their ability to navigate unpredictable environments.