
STAGE I: Single Note

Recording Musical Time-samples

For Stage I of our project, we recorded a single note in isolation, in this case D#.  This is the time-domain graph of the signal, which can be seen to be made up of a combination of sinusoids of different frequencies.
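The recording itself is not embedded here; as a rough illustration of what such a time-domain signal looks like, the sketch below synthesizes a plucked-note-like sum of decaying harmonics in numpy. The sample rate, harmonic weights, and decay are assumptions for this stand-in; the 151.3 Hz fundamental is the out-of-tune value measured later on this page.

```python
import numpy as np

fs = 44100                        # sample rate in Hz (assumed)
t = np.arange(fs) / fs            # one second of time samples

# A plucked string is roughly a sum of decaying harmonics of a fundamental;
# we use the out-of-tune fundamental (~151.3 Hz) measured later in the text.
f0 = 151.3
note = sum((0.6 ** k) * np.sin(2 * np.pi * (k + 1) * f0 * t)
           for k in range(4)) * np.exp(-3 * t)
note /= np.max(np.abs(note))      # normalize to [-1, 1]
```

Plotting `note` against `t` gives a waveform qualitatively similar to the recorded one: several superimposed sinusoids under a decaying envelope.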

Transforming to the Frequency Domain and Note Identification

We then used the Fast Fourier Transform (FFT) to transform our data into the frequency domain.  This graph shows the first 700 Hz of the spectrum of the note D# played on a guitar.  We identified the dominant frequency of each note in MATLAB, along with the closest in-tune note. A true D# has a dominant frequency of 155.6 Hz; as can be seen in this graph, the note we intended to be D# has a dominant frequency of 151.3 Hz, so it is out of tune.
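In outline, the peak-picking and closest-note lookup can be sketched as follows. This is a numpy stand-in for the MATLAB steps, not the project's actual code; the pure 151.3 Hz test tone and the A4 = 440 Hz equal-temperament reference are assumptions.

```python
import numpy as np

fs = 44100
t = np.arange(10 * fs) / fs                 # 10 s of samples: 0.1 Hz bins
x = np.sin(2 * np.pi * 151.3 * t)           # stand-in for the recorded D#

# FFT magnitude spectrum, positive frequencies only.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
dominant = freqs[np.argmax(spectrum)]       # dominant frequency of the note

# Closest in-tune note on the equal-tempered scale (A4 = 440 Hz):
semitones = round(12 * np.log2(dominant / 440.0))
in_tune = 440.0 * 2 ** (semitones / 12)     # D#3 = 155.56 Hz for this input
```

For the 151.3 Hz input this lands 18 semitones below A4, i.e. D#3 at 155.56 Hz (rounded to 155.6 Hz in the text above), confirming the note is flat.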

Shifting the Frequency Response

Using MATLAB, we calculated the ratio between the frequency of the intended note and that of the recorded note.  We then resampled the original note by that ratio to obtain a version of our note shifted in frequency. The blue graph on the right is the frequency spectrum of the original D# note; the red graph is the autotuned version, shifted in frequency.
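The resampling step can be sketched like so. MATLAB's `resample` performs proper polyphase filtering; this numpy illustration uses simple linear interpolation on a pure test tone, which is an assumption made to keep the sketch self-contained.

```python
import numpy as np

fs = 44100
t = np.arange(2 * fs) / fs
x = np.sin(2 * np.pi * 151.3 * t)       # stand-in for the recorded D#

measured, target = 151.3, 155.56        # Hz, from the spectra above
ratio = target / measured               # ~1.028: how far the note is flat

# Resampling by `ratio` (then playing back at the original sample rate)
# scales every frequency component by `ratio`, shifting the whole spectrum.
positions = np.arange(0, len(x), ratio)
y = np.interp(positions, np.arange(len(x)), x)

# The peak of the shifted spectrum now sits at the in-tune frequency.
freqs = np.fft.rfftfreq(len(y), d=1 / fs)
shifted = freqs[np.argmax(np.abs(np.fft.rfft(y)))]
```

Because the resampled signal is slightly shorter but played back at the same rate, its dominant peak moves from 151.3 Hz up to approximately 155.6 Hz.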

This graph is a magnified view of the graph above, showing the change in the peak that represents the dominant frequency of the note.  The dominant peak in the blue graph, which represents the untuned D#, is at 151.3 Hz; after autotuning, as the red graph shows, this peak is shifted to the correct position for a D# note, 155.6 Hz.

Listen to our results!

Note D# - Untuned
Note D# - Tuned