Graph-based feature extraction: A new proposal to study the classification of music signals outside the time-frequency domain.

Most feature extraction algorithms for music audio signals use Fourier transforms to obtain coefficients that describe specific aspects of music information within the sound spectrum, such as the timbral texture, tonal texture and rhythmic activity. In this paper, we introduce a new method for extracting features related to the rhythmic activity of music signals using the topological properties of a graph constructed from an audio signal. We map the local standard deviation of a music signal to a visibility graph and calculate the modularity (Q), the number of communities (Nc), the average degree (〈k〉), and the density (Δ) of this graph. By applying this procedure to each signal in a database of various musical genres, we detected the existence of a hierarchy of rhythmic self-similarities between musical styles given by these four network properties. Using Q, Nc, 〈k〉 and Δ as input attributes in a classification experiment based on supervised artificial neural networks, we obtained an accuracy higher than or equal to the beat histogram in 70% of the musical genre pairs, using only four features from the networks. Finally, when performing the attribute selection test with Q, Nc, 〈k〉 and Δ, along with the main signal processing field descriptors, we found that the four network properties were among the top-ranking positions given by this test.
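The abstract describes a concrete pipeline: compute the local standard deviation of the audio signal, map it to a visibility graph, and read four topological features (Q, Nc, 〈k〉, Δ) off that graph. A minimal sketch of that pipeline, assuming NetworkX for the graph metrics; the frame size, the toy signal, and the greedy community-detection step are illustrative choices, not details taken from the paper:

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

def local_std(signal, frame=1024):
    """Frame-wise standard deviation of the signal (frame size is illustrative)."""
    n = len(signal) // frame
    return np.array([signal[i * frame:(i + 1) * frame].std() for i in range(n)])

def natural_visibility_graph(series):
    """Natural visibility graph (Lacasa et al., 2008): nodes i < j are linked
    if the straight segment from (i, y_i) to (j, y_j) passes strictly above
    every intermediate sample."""
    g = nx.Graph()
    g.add_nodes_from(range(len(series)))
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            yi, yj = series[i], series[j]
            if all(series[k] < yj + (yi - yj) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

rng = np.random.default_rng(0)
sig = rng.standard_normal(16 * 1024)          # toy stand-in for an audio signal
g = natural_visibility_graph(local_std(sig))  # 16-node visibility graph

comms = community.greedy_modularity_communities(g)
Q = community.modularity(g, comms)                     # modularity Q
Nc = len(comms)                                        # number of communities Nc
k_avg = 2 * g.number_of_edges() / g.number_of_nodes()  # average degree <k>
density = nx.density(g)                                # density (Delta)
```

The quadratic-time visibility construction above is the simplest correct version; for full-length audio one would use much longer signals and a faster divide-and-conquer visibility algorithm.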


Bibliographic Details
Main Authors: Dirceu de Freitas Piedade Melo, Inacio de Sousa Fadigas, Hernane Borges de Barros Pereira
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2020-01-01
Series: PLoS ONE 15(11): e0240915
ISSN: 1932-6203
Online Access: https://doi.org/10.1371/journal.pone.0240915