
Srujan R et al.
International Journal of Engineering, Basic Sciences, Management & Social Studies
Volume 1, Issue 1, May 2017

Musical Note Processing

Srujan R, Charuhasa S R, Pavan M S, Karthik C, Nisarga R
Dept. of Electronics and Communication
Vidyavardhaka College of Engineering, Mysuru, India

Arjun K R
Assistant Professor
Dept. of Electronics and Communication
Vidyavardhaka College of Engineering, Mysuru, India
[email protected]

ABSTRACT
This project produces musical notes by detecting and recognizing the tones in an audio file given by the user. Detecting and recognizing the tones involves audio processing, while rendering them requires communication between the robots and motion planning. Two robots render the tones detected from the audio and communicate using XBee technology. Depending on the number of tones detected, multiple bots can be used, communicating with each other to play the tones on pipes of different sizes. Multiple robots collaborating together complete the task quickly and efficiently.

Keywords—Firebird V, AVR ATmega2560 microcontroller, line sensors, Sharp sensors, XBee communication module.
I. INTRODUCTION

Music touches the soul and helps us to relax and feel good. Music has the power of uniting people from different
backgrounds and cultural heritages. In fact, it can be best described as a wonderful force that is capable of bonding people
together. In Indian Music, the following seven notes form the basis for creating music: 
Sa Re Ga Ma Pa Dha Ni.
How about a robot singing or playing music for you? In this technological era, robots are not only machines that reduce human work; they can also entertain by playing music. Inspired by this idea, we developed an experiment in which robots play music. In this project, two robots coordinate with each other to play notes in a given sequence.
Challenges in this theme include audio processing, motion planning and communication. The music file is audio-processed to extract the notes, which are then communicated to the robots. The arena consists of metallic pipes of different sizes, placed randomly, that produce sounds of different frequencies. Both robots traverse the arena, coordinating with each other to play the notes in the given order.

II. METHODOLOGY

A. Input
The input given to our program will consist of a series of pure notes, each of some finite duration,
separated by silence. The width of silence between any two notes may be different. The duration of each note
may be different.

B. Process Flow
In order to identify the notes that are present in the file, we will implement the following steps in the
same sequence:
1. Reading the sound file
2. Detecting silence in the file
3. Detecting the location of notes using data obtained from (2)
4. Calculating the frequency of each detected note by using DFT
5. Matching the calculated frequency to the standard frequencies of notes to identify the note that is being
played.

©2016 IJEBMS www.ijejournal.org Page No.



The audio file provided is processed to extract the musical notes it contains. The file is broken into windows for note detection, and the silences in the file must be detected and discarded.

Fig1: Part of an audio file containing the notes and silences.

Fig2: Example of a complete audio file.

C. Detecting Silence
For the detection of silence, one approach uses a window of some fixed size. Let us assume that our window is 0.05 seconds long. In terms of the number of samples, this window has length 0.05 * f_s. Taking f_s = 44,100 Hz, the window is 2205 samples long.
We can slide this window over the input signal and, for each position of the window, record the sum of squared input samples falling within it. This is roughly proportional to the mean square amplitude of the signal under the window. If this value falls below a particular threshold, the input within that window is classified as silence.
Implementing this with a for loop works, but may be inefficient for large audio files. The interested reader might also implement it using the convolve function from numpy.
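The windowed-energy approach above can be sketched as follows; the threshold value 1e-3 is an assumed choice and would need tuning for real recordings:

```python
import numpy as np

fs = 44100
win = int(0.05 * fs)  # the 2205-sample window from the text

def silence_mask(signal, win, threshold=1e-3):
    """True where the local mean square amplitude falls below threshold."""
    # Sliding sum of squared samples via convolve, instead of an
    # explicit for loop over window positions.
    energy = np.convolve(signal ** 2, np.ones(win), mode="same") / win
    return energy < threshold

# Quick check: one second of a 440 Hz sine followed by one second of silence.
t = np.arange(fs) / fs
signal = np.concatenate([np.sin(2 * np.pi * 440.0 * t), np.zeros(fs)])
mask = silence_mask(signal, win)
print(mask[fs // 2], mask[fs + fs // 2])  # middle of note, middle of silence
```

Near the note/silence boundary the mask is less reliable, since the window mixes both regions; deep inside each region it is unambiguous.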

D. Detecting location of notes


Once the silence has been detected, it is easy to infer the locations of the notes by keeping track of the start and end indices of the non-silent regions. Basically, everything that is not silence can be considered a note.
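One way to recover the note locations from a silence mask is to find the contiguous runs where the mask is False. A sketch:

```python
import numpy as np

def note_locations(mask):
    """Return (start, end) sample-index pairs of runs where mask is False
    (i.e. everything that is not silence is a note)."""
    notes = ~np.asarray(mask)
    # +1 at a sample where a note starts, -1 where a note ends.
    edges = np.diff(notes.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if notes[0]:                  # a note is already in progress at index 0
        starts = np.r_[0, starts]
    if notes[-1]:                 # a note is still in progress at the end
        ends = np.r_[ends, len(notes)]
    return list(zip(starts, ends))

# Tiny example mask: silence, note, silence, note, silence (True = silence).
mask = np.array([True, True, False, False, False,
                 True, True, False, False, True])
print(note_locations(mask))  # → [(2, 5), (7, 9)]
```

Each (start, end) pair can then be used to slice out the corresponding portion of the signal for frequency analysis.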
Ideally, a musical note is a pure sine wave with a predefined frequency. If we compute the DFT of a pure sine wave, we get two peaks (maximum values) in the DFT output. The first of these peaks is at the index corresponding to the frequency of the sine wave. Hence, given the DFT of a pure sine wave of unknown frequency, we look for the peaks and find the index of the first peak (maximum value). Let the index of this peak be i_max; then the corresponding frequency in Hertz is given by the following formula:

f = (i_max * f_s) / l -----------------------------------------------------------------(1)


Equation (1) calculates the frequency from the DFT output. Here l refers to the length (i.e., the number of samples) of the vector corresponding to the audio signal of the sine wave.
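Equation (1) can be checked on a synthetic note; with a 440 Hz sine sampled at 44,100 Hz for 0.5 s, the peak lands exactly on bin 220:

```python
import numpy as np

fs = 44100                              # sampling rate f_s
l = 22050                               # length of the note segment (0.5 s)
t = np.arange(l) / fs
note = np.sin(2 * np.pi * 440.0 * t)    # a pure 440 Hz sine (A4)

spectrum = np.abs(np.fft.fft(note))
# The first of the two peaks lies in the positive-frequency half
# of the DFT output, so we search only the first l/2 bins.
i_max = int(np.argmax(spectrum[: l // 2]))
f = i_max * fs / l                      # equation (1)
print(i_max, f)                         # → 220 440.0
```

Note that the frequency resolution is f_s / l (here 2 Hz), so shorter note segments give coarser frequency estimates.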

III. CALCULATING DFT

Once we have the location of each note, we can use equation (1) to calculate the frequency of the note. To do so, we find i_max by calculating the DFT of the portion of the signal at the identified location of the note. The DFT of a signal can be calculated in Python using the fft function from numpy's fft module.
numpy.fft.fft(signal) - This function computes the one-dimensional discrete Fourier transform of the specified signal and returns a complex numpy ndarray.
You might use numpy.argmax() (or numpy.argsort()) to find the peak index from the calculated DFT.

With proper spacing and windowing we obtain a set of notes and silences. From these, the frequency of each note is found and compared with the standard set of frequencies to identify which note is being played.
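The final matching step reduces to a nearest-neighbour lookup against a table of standard frequencies. The table below uses just-intonation values with Sa fixed at 240 Hz purely for illustration; the actual values depend on the reference pitch chosen for the pipes:

```python
# Illustrative frequency table for the seven notes (hypothetical
# just-intonation values with Sa = 240 Hz; not the paper's actual table).
NOTE_FREQS = {
    "Sa": 240.0, "Re": 270.0, "Ga": 300.0, "Ma": 320.0,
    "Pa": 360.0, "Dha": 400.0, "Ni": 450.0,
}

def nearest_note(f):
    """Match a measured frequency to the closest standard note."""
    return min(NOTE_FREQS, key=lambda name: abs(NOTE_FREQS[name] - f))

print(nearest_note(358.2), nearest_note(243.1))  # → Pa Sa
```

Because the DFT estimate is quantized to multiples of f_s / l, nearest matching also absorbs small frequency-estimation errors.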

Fig3: Output on Python IDLE Console


The musical notes obtained from the Python console output are sent to the Firebird V robot over serial communication. The arena consists of a set of musical node points where the aluminium pipes are placed. The arena is as shown in the figure below.


Fig4: Arena with musical notes


The arena also contains obstacles placed along the traversal path. The main objective of the robots is to traverse the shortest path and play music by striking the pipes. Motion planning for the Firebird V robots is programmed using AVR Studio. The robots traverse using the line-follower methodology with their line sensors, and obstacles are avoided using the Sharp sensors. If an obstacle is detected, the robot retraces its path and makes sure it reaches its corresponding MNP (musical node point). Two robots are coordinated to play the music so that the task is accomplished quickly and efficiently. This is done using the XBee communication module mounted on each robot. The module ensures that the musical notes are played in synchronization with one another: if robot 1 plays the first note, robot 2 waits for robot 1 to accomplish its task and then plays its corresponding note. The music is generated by an aluminium rod mounted on a servo motor (0 to 180 degrees) that provides the striking mechanism. When a robot reaches its corresponding MNP, the servo motor aligns the mounted rod to strike the hollow aluminium pipe at that MNP and generate the music.
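The turn-taking the XBee link enforces can be sketched abstractly: notes are dispatched in sequence order, and each robot waits for the previous strike to finish before playing its own. The robot assignments and note names below are invented for the example:

```python
from collections import deque

def coordinate(sequence):
    """sequence: list of (robot_id, note) pairs in playing order.
    Returns the strike order as human-readable strings."""
    pending = deque(sequence)
    played = []
    while pending:
        robot, note = pending.popleft()
        # On hardware, the idle robot blocks here until it receives the
        # XBee acknowledgement that the current strike has finished.
        played.append(f"robot {robot} strikes {note}")
    return played

for line in coordinate([(1, "Sa"), (2, "Re"), (1, "Ga")]):
    print(line)
```

The essential property is that the global note order is preserved even though the notes are split between two independently moving robots.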

IV. APPLICATIONS

The main applications of this project are:

A. Performing live concerts.
B. Teaching music.
C. General audio-processing applications.
D. Detecting faulty notes in music competitions.

V. CONCLUSION
In this paper, we presented musical note processing using robots. Robots are generally used as machines to reduce human work; this project demonstrates that they can also be used for entertainment, playing music, and points to the future scope of robotics in the field of music. The implementation of the project is shown in the video linked below.
https://www.youtube.com/watch?v=w8RsqEkZuZM&feature=youtu.be


