The new system identifies a note according to its 'spectral pattern'. This pattern is determined by the distribution of sound energy across the fundamental frequency (the frequency that determines the pitch of the note) and its harmonics (frequencies that are multiples of the fundamental frequency). The spectral pattern of a specific note may vary according to the instrument played, the musician and the recording environment.
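
For readers curious what such a pattern looks like in practice, the short Python sketch below is purely illustrative rather than the researchers' own code: it measures the relative energy at an assumed, already-known fundamental frequency and its first few harmonics. The function name, its parameters and the synthetic test note are all assumptions made for the example.

```python
# Illustrative sketch only: one way to describe a note's 'spectral pattern'
# as the share of energy at an assumed fundamental and its harmonics.
import numpy as np

def spectral_pattern(samples, sample_rate, fundamental_hz, n_harmonics=8):
    """Return normalised energies at the fundamental and its harmonics."""
    spectrum = np.abs(np.fft.rfft(samples))               # magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    energies = []
    for k in range(1, n_harmonics + 1):
        target = k * fundamental_hz                        # k-th harmonic frequency
        nearest_bin = np.argmin(np.abs(freqs - target))    # closest FFT bin
        energies.append(spectrum[nearest_bin])
    energies = np.array(energies)
    return energies / energies.sum()                       # energy distribution

# Example: a synthetic A4 note (440 Hz) with decaying harmonics
sr = 44100
t = np.arange(sr) / sr
note = sum((0.5 ** k) * np.sin(2 * np.pi * 440 * k * t) for k in range(1, 6))
print(spectral_pattern(note, sr, 440.0))
```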

Recorded notes are compared with a 'harmonic dictionary', a database of the typical spectral patterns of every musical note, and each note is identified as its closest match. The notes in a 'wav' file, a common format for audio recording, can be analysed to produce a 'midi' file, which can not only be played back but also be used to generate sheet music.
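
The 'closest match' step can be pictured with another small, purely hypothetical sketch: the dictionary entries and the simple distance comparison below are assumptions made for illustration, not the method described in the paper, and the final step of writing the matched notes out as a 'midi' file is omitted.

```python
# Illustrative sketch only: pick the dictionary note whose stored spectral
# pattern is closest (by Euclidean distance) to the observed pattern.
import numpy as np

# Hypothetical dictionary: note name -> typical harmonic energy distribution
harmonic_dictionary = {
    "A4": np.array([0.55, 0.25, 0.12, 0.08]),
    "C5": np.array([0.60, 0.20, 0.15, 0.05]),
    "E5": np.array([0.50, 0.30, 0.10, 0.10]),
}

def identify_note(observed_pattern):
    """Return the dictionary note whose pattern best matches the observation."""
    return min(
        harmonic_dictionary,
        key=lambda note: np.linalg.norm(harmonic_dictionary[note] - observed_pattern),
    )

observed = np.array([0.57, 0.23, 0.13, 0.07])  # pattern measured from a recording
print(identify_note(observed))                 # prints "A4", the closest entry
```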

Whilst the system can currently only analyse music played by a solo instrument, research continues into its use for more complex compositions. Julio Jose Carabias-Orti, a co-author of the paper, explains that there are important practical applications: "Automatic music transcription is of enormous assistance in recovering musical content, separating audio sources and codifying or converting audio files."

Written by Ian Fyfe