Assisted development of embedded Linux systems and CAN bus middleware for intelligent prosthetic limbs.
At Shirley Ryan AbilityLab, the Robotics Engineering team was developing next-generation prosthetic limbs—devices that used pattern recognition algorithms to infer a user’s intended movement and actuate motors accordingly. These weren’t passive limbs—they were intelligent systems, designed to interpret human intent in real time.
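To give a sense of what "interpreting intent in real time" means in practice, here is a minimal, hypothetical sketch of the sense, classify, actuate loop such a system implies. None of this is the lab's code: the feature source (often EMG in pattern-recognition prostheses, though the signal isn't specified here), the threshold "classifier", and the printed motor command are all placeholders for the real sensor pipeline, trained model, and motor drive.

```c
/*
 * Hypothetical sketch of the "sense -> classify -> actuate" loop described
 * above. This is NOT the lab's code: the feature source, threshold
 * classifier, and printed motor command are placeholders for the real
 * sensor pipeline, trained pattern-recognition model, and motor drive.
 */
#include <stdio.h>

typedef enum { MOTION_REST, MOTION_FLEX, MOTION_EXTEND } motion_t;

/* Stand-in for a windowed signal feature (e.g. mean absolute value). */
static double sample_feature(int tick)
{
    static const double fake[] = { 0.05, 0.40, 0.90, 0.35, 0.02 };
    return fake[tick % 5];
}

/* Stand-in for the pattern-recognition step: a crude threshold rule. */
static motion_t classify_intent(double feature)
{
    if (feature > 0.7) return MOTION_EXTEND;
    if (feature > 0.2) return MOTION_FLEX;
    return MOTION_REST;
}

/* Stand-in for actuation (the real system drives motors, e.g. over CAN). */
static void command_motor(motion_t intent)
{
    static const char *names[] = { "rest", "flex", "extend" };
    printf("motor command: %s\n", names[intent]);
}

int main(void)
{
    /* A few iterations of the control loop. */
    for (int tick = 0; tick < 5; tick++)
        command_motor(classify_intent(sample_feature(tick)));
    return 0;
}
```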
While I wasn’t the primary engineer on the project, I joined the Electrical Engineering team during a critical phase to support embedded Linux development and help keep the project on track. My contributions helped maintain build consistency across prototype boards and streamline the development process during high-stakes testing and integration.
The overall goal of the project was to build prosthetic limbs, both an arm and a leg, that could:

- Infer the user's intended movement with pattern recognition algorithms
- Actuate their motors in real time in response to that inferred intent
- Coordinate onboard electronics over a CAN bus running on embedded Linux
My job was to support the embedded side of the stack and help the core team accelerate their integration efforts.
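On embedded Linux, CAN middleware typically sits on top of the kernel's SocketCAN interface. The sketch below is a minimal illustration of that layer rather than the project's actual code: it opens a raw CAN socket, binds it to an interface, sends one frame, and waits for a reply. The interface name `can0`, the CAN IDs, and the payload bytes are hypothetical placeholders.

```c
/*
 * Minimal SocketCAN sketch of the kind of raw CAN I/O a middleware layer
 * handles on embedded Linux. Illustration only, not the project's code:
 * the interface name "can0", the CAN IDs, and the payload bytes are
 * hypothetical placeholders.
 */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main(void)
{
    /* Open a raw CAN socket. */
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return 1; }

    /* Look up the interface index for can0 and bind the socket to it. */
    struct ifreq ifr;
    memset(&ifr, 0, sizeof(ifr));
    strncpy(ifr.ifr_name, "can0", IFNAMSIZ - 1);
    if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

    struct sockaddr_can addr;
    memset(&addr, 0, sizeof(addr));
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) { perror("bind"); return 1; }

    /* Send a hypothetical two-byte command frame on ID 0x101. */
    struct can_frame tx;
    memset(&tx, 0, sizeof(tx));
    tx.can_id = 0x101;
    tx.can_dlc = 2;
    tx.data[0] = 0x01;  /* e.g. command opcode */
    tx.data[1] = 0x7F;  /* e.g. setpoint */
    if (write(s, &tx, sizeof(tx)) < 0) { perror("write"); return 1; }

    /* Block until one frame arrives, e.g. a status reply from another node. */
    struct can_frame rx;
    if (read(s, &rx, sizeof(rx)) < 0) { perror("read"); return 1; }
    printf("rx id=0x%03X dlc=%d data[0]=0x%02X\n", rx.can_id, rx.can_dlc, rx.data[0]);

    close(s);
    return 0;
}
```

In practice, a middleware layer wraps this kind of raw frame handling so application code can work with higher-level messages instead of hand-packed byte payloads.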
The project culminated in a working prosthetic leg system capable of responding to user input in real time. My contributions, while scoped to infrastructure and integration, helped keep the engineering workflow stable and efficient during crucial phases of development.
Through this project, I gained hands-on experience with embedded Linux, real-time communication protocols, and prosthetic systems that blend AI, hardware, and human physiology. It also helped solidify my transition from mechanical engineering, through embedded systems, to full software engineering, and it deepened my appreciation for the complexity of building biomedical robotics systems that people rely on every day.