Imagine a robot that listens to a piece of music, identifies the notes, and then plays them on a piano and trumpet—just like a human musician! That was the challenge in eYRC’18 (e-Yantra Robotics Competition) under the Mocking Bot theme, where we built an intelligent music-playing robot that combined audio processing, machine learning, and embedded systems to bring music to life.
This project was a fusion of multiple domains, including Python for signal processing, machine learning for classification, embedded systems for hardware control, and mechanical design for instrument construction.
Some of the major challenges we tackled included:
✅ Accurately detecting notes despite noise – fine-tuning FFT thresholds and filtering techniques (see the sketch after this list).
✅ Real-time playback synchronization – optimizing actuation timing for solenoids.
✅ Material selection – switching from MDF to acrylic for precision and portability.
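To make the first challenge concrete, here is a minimal sketch of the kind of FFT-based note detection described later in this post. It is illustrative rather than our exact code: the Hann window, the threshold value, and the `freq_to_note` helper are assumptions.

```python
import numpy as np

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq):
    # MIDI convention: note 69 is A4 = 440 Hz; 12 semitones per octave.
    midi = int(round(69 + 12 * np.log2(freq / 440.0)))
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

def detect_note(samples, sample_rate, threshold=0.1):
    """Return the dominant note in an audio chunk, or None if the
    strongest FFT bin is too weak to be a real note (noise rejection)."""
    windowed = samples * np.hanning(len(samples))  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    spectrum[0] = 0.0                              # ignore the DC component
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak = int(spectrum.argmax())
    # Heuristic noise gate: scale the cutoff with the chunk length.
    if spectrum[peak] < threshold * len(samples):
        return None
    return freq_to_note(freqs[peak])
```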
The competition required teams to:
Analyze an audio file to extract musical notes and their onsets using Python (an onset-detection sketch follows below).
Identify the instrument (piano, trumpet, flute, or violin) using machine learning.
Design a robotic system that could play the identified notes in real time.
This involved a deep dive into signal processing, classification algorithms, hardware actuation, and mechanical design.
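For onset extraction, a simple energy-based detector already goes a long way. The sketch below flags frames whose short-time energy jumps well above the previous frame; `frame_size`, `hop`, and the ratio `k` are illustrative values, not our tuned parameters.

```python
import numpy as np

def detect_onsets(samples, sample_rate, frame_size=1024, hop=512, k=2.5):
    """Return onset times (in seconds) where the short-time energy
    jumps by more than a factor of k over the previous frame."""
    onsets = []
    prev_energy = None
    for start in range(0, len(samples) - frame_size, hop):
        frame = samples[start:start + frame_size].astype(np.float64)
        energy = float(np.sum(frame ** 2))
        # A sudden rise in energy marks the attack of a new note.
        if prev_energy is not None and energy > k * max(prev_energy, 1e-12):
            onsets.append(start / sample_rate)
        prev_energy = energy
    return onsets
```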
To decode the audio file, we applied the Fast Fourier Transform (FFT) to extract frequency components and detect musical notes, along the lines of the sketch shown earlier. Since different instruments have unique sound patterns, we used supervised machine learning algorithms, including:
SVM (Support Vector Machines)
k-NN (k-Nearest Neighbors)
MLP (Multi-Layer Perceptron Neural Networks)
By training on different instrument samples, our system could classify whether a note belonged to a piano, trumpet, flute, or violin with high accuracy.
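Assuming scikit-learn as the toolkit (a reasonable but unconfirmed choice here), a minimal comparison of the three classifiers might look like the sketch below; the feature vectors `X` (e.g. normalized FFT magnitudes per note) and the hyperparameters are assumptions.

```python
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def compare_classifiers(X, y):
    """X: one feature vector per note (e.g. normalized FFT magnitudes);
    y: instrument labels such as piano, trumpet, flute, violin."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)
    models = {
        "SVM": SVC(kernel="rbf"),
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: {accuracy_score(y_test, model.predict(X_test)):.2%}")
    return models
```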
Once the notes were identified, the next step was to play them on a custom-built piano and trumpet.
The piano keys and trumpet rods were struck by an ATmega2560-driven actuation mechanism: push-pull solenoids simulated the key presses, ensuring precise note execution (a playback-scheduling sketch follows below).
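The split between the Python side and the ATmega2560 firmware suggests a simple playback scheduler. The sketch below is hypothetical: the one-byte serial protocol, the port name, and the note-to-solenoid mapping are all assumptions, not the actual firmware interface.

```python
import time
import serial  # pyserial

# Hypothetical note-to-solenoid mapping; a real table would cover
# every playable piano key and trumpet rod.
NOTE_TO_SOLENOID = {"C4": 0, "D4": 1, "E4": 2}

def play_score(score, port="/dev/ttyUSB0", baud=9600):
    """score: list of (note_name, onset_seconds) pairs sorted by onset.
    Sends one byte per note; the microcontroller firmware is assumed
    to pulse the matching solenoid on receipt."""
    with serial.Serial(port, baud) as link:
        start = time.time()
        for note, onset in score:
            delay = onset - (time.time() - start)  # wait for this onset
            if delay > 0:
                time.sleep(delay)
            link.write(bytes([NOTE_TO_SOLENOID[note]]))
```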
The mechanical structure was initially built from medium-density fibreboard (MDF) for prototyping. For the finals at IIT Bombay, however, we upgraded to laser-cut white acrylic to improve portability and dimensional accuracy.
Our robot playing the musical instruments at the finals held at IIT Bombay.

An interview about my experience at the competition.

The Mocking Bot project was an incredible journey of multidisciplinary engineering. From converting raw audio into playable notes to building a robot that could perform music, it showcased how machine learning, robotics, and creativity can come together to build something truly fascinating. This project not only deepened our understanding of audio processing, embedded systems, and mechanics but also demonstrated the power of AI-driven automation in the world of music.