Contract: Part-time
We Are Avelo Running
We’re on a mission to help every athlete run faster, stronger, and more resiliently by building the world’s smartest running shoe. Our team includes ex-Nike execs, former Garmin engineers, experienced software/firmware developers, and leading PhDs in biomechanics and AI.
The Role
As our Algorithm Engineer, you'll own the systems that ensure Avelo's running insights are accurate, reliable, and continuously improving. You'll contribute to our embedded algorithms, regression testing frameworks, and data infrastructure that underpin our algorithm development.
By adding new features and maintaining our innovative technology, you'll help Avelo define (and lead) the smart running shoe category, enhancing running health and performance.
This is a cross-functional role where you'll collaborate closely with data science, firmware, and software engineering teams. Your work will touch everything from defining data collection protocols to writing production code.
Key Responsibilities
- Design and implement automated regression testing frameworks for new and existing algorithms.
- Define data collection protocols and coordinate collection efforts to build high-quality reference datasets.
- Develop data integrity checks to ensure incoming data is clean, consistent, and trustworthy.
- Collaborate with technical leadership to shape the internal research database, defining schemas, access patterns, and supporting tooling.
- Contribute to production firmware (C) to improve on-device algorithms for running metrics.
- Build and expand analysis tools to identify areas for algorithm improvement.
- Work cross-functionally with data science, firmware, and software teams to drive measurable improvements.
Who You Are
Experience:
- 3+ years of software engineering experience, with exposure to embedded systems, data infrastructure, or algorithm development.
- Proficiency in C and at least one scripting/analysis language (Python preferred).
- Experience building test automation frameworks or CI/CD pipelines.
- Familiarity with data pipelines, data quality practices, or data lake architectures.