Researchers from Duke University have developed a system called SonicSense, which allows robots to interact with their environment through a robotic hand with microphones on its fingertips. The system captures vibrations and analyzes them to determine the material and shape of objects. SonicSense uses AI to identify objects with complex shapes and multiple materials, and it can learn from interactions to improve its recognition capabilities. The system is designed to be affordable and easy to integrate into robots, potentially enhancing their ability to navigate unpredictable environments.
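To make the vibration-to-material idea concrete, here is a minimal sketch of acoustic material classification: tap an object, take an FFT of the recorded ring, and match the dominant frequency against known material signatures. Everything here is an illustrative assumption (the sample rate, the material peak frequencies, the nearest-peak classifier); SonicSense's actual pipeline is more sophisticated and AI-driven.

```python
# Illustrative sketch only: signals, material signatures, and the
# classifier below are assumptions, not the SonicSense implementation.
import numpy as np

SAMPLE_RATE = 16_000  # Hz, assumed fingertip-microphone sample rate

# Hypothetical resonant signatures for a few materials (Hz).
MATERIAL_PEAKS = {"metal": 4000.0, "wood": 900.0, "plastic": 1800.0}

def synth_tap(freq_hz, duration=0.25, decay=30.0):
    """Simulate the decaying ring of a tapped object."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq_hz * t) * np.exp(-decay * t)

def dominant_frequency(signal):
    """Return the strongest frequency component via an FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def classify_material(signal):
    """Pick the material whose signature is nearest the dominant peak."""
    peak = dominant_frequency(signal)
    return min(MATERIAL_PEAKS, key=lambda m: abs(MATERIAL_PEAKS[m] - peak))

if __name__ == "__main__":
    tap = synth_tap(MATERIAL_PEAKS["wood"])
    print(classify_material(tap))  # prints "wood" for this synthetic tap
```

In practice, a learned model over richer spectral features would replace the nearest-peak rule, which is what allows a system like SonicSense to handle objects made of multiple materials and to improve with experience.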