ReachARM: Motion-Controlled Robotic Arm

ReachARM was developed during the WearHacks Montreal hackathon in October 2015. The project aimed to help people with disabilities regain mobility through advanced robotics, using muscle motion detection to control a robotic arm that simulates natural movement patterns. You can find more details about the project on our DevPost submission and check out our code on GitHub.

ReachARM Robotic Arm

The Sprint Begins

One of our team members had developed a robotic arm as part of his Master's thesis. Having a pre-existing robotic arm gave us a head start, but integrating it with wearable technology presented its own set of challenges. Our first few hours were spent understanding the Myo armband's capabilities and limitations. While the robotic arm was already capable of precise movements, we needed to create an interface that could interpret human motion and translate it into mechanical control signals.
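At its core, that interface boils down to mapping the armband's orientation readings onto commands the arm's servos understand. Here is a minimal sketch in Python of one way to do it, assuming a quaternion-based IMU reading and a 0–180° servo; the function names are illustrative, not part of the Myo SDK:

```python
import math

def quaternion_to_pitch(w, x, y, z):
    """Extract pitch (radians) from a unit quaternion, as an IMU would report it."""
    s = 2.0 * (w * y - z * x)
    s = max(-1.0, min(1.0, s))  # clamp to guard against numerical drift
    return math.asin(s)

def pitch_to_servo_angle(pitch, min_deg=0, max_deg=180):
    """Linearly map pitch in [-pi/2, pi/2] to a servo command in [min_deg, max_deg]."""
    t = (pitch + math.pi / 2) / math.pi  # normalize pitch to [0, 1]
    angle = min_deg + t * (max_deg - min_deg)
    return max(min_deg, min(max_deg, round(angle)))  # clamp and round to whole degrees

# A level forearm (pitch = 0) maps to the midpoint of the servo's range.
```

The same pattern extends to roll and yaw for the other joints; the clamping matters because raw sensor data routinely strays past the nominal range.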

Technical Deep Dive

We broke the project down into three main components: capturing the wearer's muscle and motion signals with the Myo armband, a translation layer that converts those signals into mechanical control commands, and the robotic arm itself, which executes the resulting movements.

Challenges

As with any hackathon project, we faced several significant challenges that forced us to adapt our plans on the fly. Our original design called for an independent camera system that would move separately from the robotic arm, adding an extra dimension of control and functionality. However, the lack of necessary hardware components meant we had to pare back this feature, focusing instead on perfecting the core arm control system.

The Myo armband was a hurdle of its own. Getting reliable data from the device proved harder than anticipated, and we spent considerable time fine-tuning the calibration process to interpret the user's movements correctly. One particularly tricky aspect was matching the system's response to the speed of the user's movements.
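One standard way to tame a jittery sensor stream like this is an exponential moving average, whose smoothing factor trades responsiveness (tracking fast movements) against stability (suppressing noise). This is a sketch of the idea, not the exact filter we shipped:

```python
class SmoothedSignal:
    """Exponential moving average filter for a noisy sensor stream.

    alpha close to 1.0 tracks fast movements; alpha close to 0.0 favours stability.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample  # seed the filter with the first reading
        else:
            # Blend the new sample with the running estimate.
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value
```

Tuning `alpha` per user is effectively a calibration step: a fast, deliberate mover can tolerate a higher value than someone with tremor, which is one way to match the system's response to movement speed.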

Key Takeaways

Our success in this hackathon largely came down to efficient team organization. We got our development environment up and running quickly, setting up our IDE, GitHub, and Slack workflows from the start. This early infrastructure allowed us to begin collecting device data almost immediately, which proved crucial for validating our approach.

Perhaps most importantly, we learned valuable lessons about working with new hardware under time constraints. The experience taught us how to quickly understand external APIs, analyze device capabilities, and identify the specific data points needed for our application.

Beyond the Hackathon

While ReachARM began as a hackathon project, it opened our eyes to the possibilities in assistive robotics. It also left us with clear directions for future development, including restoring the independent camera system we had to cut and further refining the motion calibration.

See It In Action

This demo video captures our first successful integration test, showing the real-time response and precision we achieved in just 48 hours: