Please use this identifier to cite or link to this item:
https://hdl.handle.net/10316/95810
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Pinto, António | - |
dc.contributor.author | Böck, Sebastian | - |
dc.contributor.author | Cardoso, Jaime | - |
dc.contributor.author | Davies, Matthew | - |
dc.date.accessioned | 2021-09-24T16:13:53Z | - |
dc.date.available | 2021-09-24T16:13:53Z | - |
dc.date.issued | 2021 | - |
dc.identifier.issn | 2079-9292 | pt |
dc.identifier.uri | https://hdl.handle.net/10316/95810 | - |
dc.description.abstract | The extraction of the beat from musical audio signals represents a foundational task in the field of music information retrieval. While great advances in performance have been achieved due to the use of deep neural networks, significant shortcomings still remain. In particular, performance is generally much lower on musical content that differs from that which is contained in existing annotated datasets used for neural network training, as well as in the presence of challenging musical conditions such as rubato. In this paper, we positioned our approach to beat tracking from a real-world perspective where an end-user targets very high accuracy on specific music pieces and for which the current state of the art is not effective. To this end, we explored the use of targeted fine-tuning of a state-of-the-art deep neural network based on a very limited temporal region of annotated beat locations. We demonstrated the success of our approach via improved performance across existing annotated datasets and a new annotation-correction approach for evaluation. Furthermore, we highlighted the ability of content-specific fine-tuning to learn both what is and what is not the beat in challenging musical conditions. | pt |
dc.language.iso | eng | pt |
dc.publisher | MDPI | pt |
dc.relation | IF/01566/2015 | pt |
dc.relation | SFRH/BD/120383/2016 | pt |
dc.relation | CISUC/UID/CEC/00326/2020 | pt |
dc.rights | openAccess | pt |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | pt |
dc.subject | Beat tracking | pt |
dc.subject | Transfer learning | pt |
dc.subject | User adaptation | pt |
dc.title | User-Driven Fine-Tuning for Beat Tracking | pt |
dc.type | article | - |
degois.publication.firstPage | 1518 | pt |
degois.publication.issue | 13 | pt |
degois.publication.title | Electronics (Switzerland) | pt |
dc.peerreviewed | yes | pt |
dc.identifier.doi | 10.3390/electronics10131518 | pt |
degois.publication.volume | 10 | pt |
dc.date.embargo | 2021-01-01 | * |
uc.date.periodoEmbargo | 0 | pt |
item.grantfulltext | open | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.fulltext | With full text | - |
item.openairetype | article | - |
item.cerifentitytype | Publications | - |
item.languageiso639-1 | en | - |
crisitem.author.researchunit | CISUC - Centre for Informatics and Systems of the University of Coimbra | - |
crisitem.author.parentresearchunit | Faculty of Sciences and Technology | - |
crisitem.author.orcid | 0000-0002-1315-3992 | - |
Appears in Collections: | I&D CISUC - Artigos em Revistas Internacionais |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
electronics-10-01518 (1).pdf | | 1.5 MB | Adobe PDF |
This item is licensed under a Creative Commons License