Wearable Sensors Article
🎮 Overview
At the Shirley Ryan AbilityLab’s Center for Bionic Medicine, I had the chance to help solve a critical problem in spinal cord injury rehab: therapy was tedious, repetitive, and hard to quantify. Patients often struggled to stay motivated, and clinicians lacked the data they needed to personalize recovery.
I joined a multi-disciplinary team to design a new solution—one that transformed muscle rehabilitation into an interactive gaming experience, while collecting accurate, real-time EMG data from wearable sensors.
🎯 Objectives
We set out to build a system that would:
- Engage patients using mobile games controlled by their own muscle signals,
- Translate EMG input from wearable sensors into real-time game mechanics,
- Provide clinicians with clear data visualizations and analysis tools, and
- Enable long-term tracking of patient progress—both in clinic and at home.
I led major portions of the Android-side engineering and signal processing work, and helped connect clinical feedback to gameplay improvements.
⚙️ What I Built
- Android BLE Integration:
  - Extended the BLE pipeline to support multiple streaming EMG-IMU sensors.
  - Implemented real-time signal preprocessing (filtering, normalization, gain adjustment).
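The preprocessing stage above can be sketched in a few lines. This is a minimal illustration, not the project's actual pipeline: the function name `preprocess_emg`, the 1 kHz sample rate, and the 20–450 Hz band are assumptions (typical values for surface EMG).

```python
import numpy as np
from scipy.signal import butter, lfilter

def preprocess_emg(raw, fs=1000.0, band=(20.0, 450.0), gain=1.0):
    """Band-pass filter, rectify, and normalize one channel of raw EMG."""
    nyq = 0.5 * fs
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = lfilter(b, a, raw)       # causal filter: motion artifact + high-freq noise
    envelope = np.abs(filtered) * gain  # full-wave rectification with gain adjustment
    peak = envelope.max()
    return envelope / peak if peak > 0 else envelope  # normalize to [0, 1]
```

A causal filter (`lfilter`) is used rather than a zero-phase one so the same code shape works on a live stream, where future samples are unavailable.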
- Game Development in Unity:
  - Designed and programmed two new EMG-controlled games:
    - Space Flight: a 2D side-scrolling dodger with dynamic EMG-based thrust.
    - Muscle Racer: a 3D racing game where difficulty scaled with muscle engagement.
  - Integrated procedural level generation and difficulty curves driven by real-time EMG input.
  - Extended the existing Power Hammer game.
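The core control mapping behind these mechanics is engine-agnostic. The games themselves were built in Unity, but the idea can be sketched in Python; `emg_to_thrust`, `scale_difficulty`, the deadzone, and the curve exponent are illustrative assumptions, not the project's actual tuning.

```python
def emg_to_thrust(envelope, deadzone=0.08, curve=1.5, max_thrust=10.0):
    """Map a normalized EMG envelope sample (0..1) to a thrust value.

    A small deadzone ignores resting noise; the power curve gives finer
    control at low effort while still rewarding strong contractions.
    """
    if envelope <= deadzone:
        return 0.0
    scaled = (envelope - deadzone) / (1.0 - deadzone)
    return max_thrust * scaled ** curve

def scale_difficulty(recent_envelopes, base_speed=1.0):
    """Raise obstacle speed when the player sustains high engagement."""
    avg = sum(recent_envelopes) / len(recent_envelopes)
    return base_speed * (1.0 + avg)  # 1x at rest, up to 2x at sustained max effort
```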
- Clinical Visualization Tools:
  - Built real-time signal visualization to help clinicians choose viable muscles for training.
  - Added baseline calibration flows and visual feedback for signal quality.
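A baseline calibration flow like the one above typically records the muscle at rest and at maximum effort, then normalizes live samples between those two anchors. A minimal sketch, assuming that structure (`calibrate` and `signal_quality` are hypothetical names, and the top-10% MVC estimate is one common choice, not necessarily the one used here):

```python
import statistics

def calibrate(rest_samples, mvc_samples):
    """Derive (baseline, span) from a rest recording and a max-effort recording.

    Live samples can then be normalized as (x - baseline) / span, clipped to [0, 1].
    """
    baseline = statistics.mean(rest_samples)
    top = sorted(mvc_samples)[-max(1, len(mvc_samples) // 10):]  # top 10% of effort
    return baseline, max(statistics.mean(top) - baseline, 1e-9)

def signal_quality(rest_samples, mvc_samples):
    """Crude SNR-style score: how cleanly effort separates from resting noise."""
    baseline, span = calibrate(rest_samples, mvc_samples)
    noise = statistics.pstdev(rest_samples) or 1e-9
    return span / noise  # higher = a more viable muscle for game control
```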
- Data Logging & Analysis:
  - Logged sensor data, game states, and patient events to a Firebase + GCP backend.
  - Wrote Python scripts to analyze engagement trends, dropout zones, and noisy signal artifacts—guiding both gameplay and UX improvements.
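The dropout-zone analysis can be illustrated with a small script. The session schema below (`levels_completed`, `finished`) is a hypothetical stand-in for the project's actual Firebase log layout:

```python
from collections import Counter

def dropout_zones(sessions):
    """Count the level at which each unfinished session ended.

    Spikes in the result flag levels where difficulty or fatigue
    drives players to quit—candidates for gameplay tuning.
    """
    zones = Counter()
    for s in sessions:
        if not s["finished"]:
            zones[s["levels_completed"] + 1] += 1  # the level the player quit on
    return zones
```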
- Research & Patient Studies:
  - Supported in-lab testing with patients, gathering feedback and operating the system in clinical sessions.
  - Delivered over a dozen feature updates based directly on study outcomes and patient interviews.
- Multi-Modal Sync Support:
  - Helped resolve Android camera timestamp issues to enable synchronized gait video + sensor logging, expanding the platform’s utility into spasticity and gait research.
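The alignment idea behind that sync work can be sketched abstractly. The actual fix lived in Android camera timestamping; this hypothetical `align_camera_to_sensor` just shows the offline step of mapping one clock onto another from shared reference events:

```python
def align_camera_to_sensor(camera_ts, sensor_ts):
    """Estimate a constant clock offset between camera frame timestamps and
    sensor timestamps from pairs captured at the same physical event
    (e.g. a sync flash), returning a function camera time -> sensor time.
    """
    offsets = [s - c for c, s in zip(camera_ts, sensor_ts)]
    offset = sorted(offsets)[len(offsets) // 2]  # median is robust to outlier pairs
    return lambda t: t + offset
```

A median offset assumes negligible clock drift over a session; longer recordings would need a drift (slope) term as well.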
✅ Outcome
We transformed rehab from a passive chore into an active experience. The system improved patient engagement and gave clinicians access to rich, continuous data.
It’s now being used in clinical research—and was developed far enough to support in-home trials, with modular features for longitudinal tracking and remote monitoring.
For me, it was a crash course in building mobile software that had to work across hardware, game engines, clinicians, and real patients. I learned how to balance clinical precision with human-centered design—and how to turn raw biosignals into something joyful and motivating.