Mind-controlled prosthetic arms are now becoming a reality
AI Summary
California startup Atom Bodies is revolutionizing prosthetic technology with its 'Atom Touch' mind-controlled robotic arm. Utilizing AI, machine learning, and EMG sensors, Atom Touch allows precise individual finger, wrist, and elbow control by thought. Designed for comfort and extended wear, it aims to make advanced prosthetics accessible to thousands of amputees by offering it at a significantly lower price point ($25,000) compared to current state-of-the-art options ($200,000).
- 2025-06-11: Article published.
- Within the next year: Atom Bodies plans to begin clinical trials for Atom Touch, pending FDA approval.
- Advanced bionic limbs could become accessible to a much larger population
- Improving quality of life for amputees by offering greater dexterity, comfort, and affordability
What: California startup Atom Bodies is developing 'Atom Touch,' a mind-controlled robotic arm that uses AI, machine learning, and EMG sensors to provide precise, intuitive control of individual fingers, wrist, and elbow. It aims to be affordable and comfortable for extended wear.
When: The article was published on 2025-06-11; clinical trials are planned to begin within the next year, pending FDA approval.
Where: California, United States.
Why: To address the challenges faced by over 2 million Americans with limb loss, including the high cost, discomfort, and limited capabilities of current prosthetic options.
How: By combining EMG sensors to detect muscle activity, machine-learning algorithms to interpret signals, and an AI neural interface (Atom A1) for intuitive control. The design includes a load-balanced harness and haptic feedback.
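The article does not describe Atom Bodies' actual software, so the sketch below is only a generic illustration of how an EMG-driven control pipeline of this kind typically works: multi-channel muscle signals are split into short windows, simple features are extracted, and a machine-learning classifier maps each window to a motor command. The channel count, sampling rate, gesture labels, and classifier choice here are all assumptions for illustration, not details from Atom Touch or the Atom A1 interface.

```python
# Illustrative only: a generic EMG-to-command pipeline, not Atom Bodies' software.
# Channel count, sampling rate, and gesture classes are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FS = 1000          # assumed EMG sampling rate (Hz)
WINDOW = 200       # 200 ms analysis window (in samples at 1 kHz)
CHANNELS = 8       # assumed number of EMG electrodes
GESTURES = ["rest", "index_flex", "wrist_rotate", "elbow_bend"]  # hypothetical commands

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel features commonly used for EMG: mean absolute value and RMS."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, rms])

# Synthetic stand-in data: (n_windows, WINDOW samples, CHANNELS) plus labels.
rng = np.random.default_rng(0)
raw = rng.normal(size=(400, WINDOW, CHANNELS))
labels = rng.integers(0, len(GESTURES), size=400)

X = np.array([extract_features(w) for w in raw])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

# Train a classifier that maps windowed muscle activity to a command class.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# At runtime, each new window would be classified and translated into a motor command.
predicted = GESTURES[clf.predict(extract_features(raw[0]).reshape(1, -1))[0]]
print(f"Predicted command for latest window: {predicted}")
```

In a real prosthesis the classifier's output would drive individual joint motors, with haptic feedback closing the loop, but the windowing-features-classification pattern shown above is the standard shape of such systems.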