A Robust Music Auto-Tagging Technique Using Audio Fingerprinting and Deep Convolutional Neural Networks
Degree: Master's (碩士)
Institution: National Chung Hsing University (國立中興大學)
Department: Department of Computer Science and Engineering (資訊科學與工程學系)
Year: 106
Abstract: Music tags are a set of descriptive keywords that convey high-level information about a music clip, such as emotions (sadness, happiness), genres (jazz, classical), and instruments (guitar, vocal). Since tags provide high-level information from the listener’s per...
Main Authors: Jia-Hong Yang, 楊佳虹
Other Authors: 吳俊霖
Format: Others
Language: zh-TW
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/vagbse
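The abstract frames auto-tagging as multi-label prediction of descriptive keywords (emotions, genres, instruments) from an audio clip. As a rough illustration only, the sketch below shows what such a tagger can look like; it assumes PyTorch, a toy tag vocabulary, and log-mel spectrogram input, and is not the architecture or the fingerprinting step proposed in the thesis.

```python
# Minimal sketch (illustrative, not the thesis's method): a small CNN that
# maps a log-mel spectrogram to independent per-tag probabilities.
# Tag list, layer sizes, and input shape are assumptions for the example.
import torch
import torch.nn as nn

TAGS = ["sad", "happy", "jazz", "classical", "guitar", "vocal"]  # assumed tag set

class TaggerCNN(nn.Module):
    def __init__(self, n_tags: int = len(TAGS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),           # global pooling -> (batch, 64, 1, 1)
        )
        self.classifier = nn.Linear(64, n_tags)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_mels, n_frames), e.g. a 96-band log-mel spectrogram
        h = self.features(x).flatten(1)
        return self.classifier(h)              # raw logits, one per tag

model = TaggerCNN()
spec = torch.randn(4, 1, 96, 256)              # dummy batch of spectrograms
logits = model(spec)
probs = torch.sigmoid(logits)                  # independent per-tag probabilities

# Multi-label training uses binary cross-entropy over all tags at once.
targets = torch.randint(0, 2, (4, len(TAGS))).float()
loss = nn.BCEWithLogitsLoss()(logits, targets)
```

Because several tags can apply to one clip, each output uses its own sigmoid rather than a softmax over the tag set, and binary cross-entropy is computed per tag.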
Similar Items
- A Method of Music Auto-tagging Based on Audio and Lyric
  by: Sheng-Wei Syu, et al.
  Published: (2019)
- Deep Convolutional Neural Network for Passive RFID Tag Localization Via Joint RSSI and PDOA Fingerprint Features
  by: Chao Peng, et al.
  Published: (2021-01-01)
- Improving Audio Fingerprinting for Music Retrieval
  by: Liao, Pei-Yu, et al.
  Published: (2013)
- Exploring convolutional, recurrent, and hybrid deep neural networks for speech and music detection in a large audio dataset
  by: Diego de Benito-Gorron, et al.
  Published: (2019-06-01)
- Robust Authentication of Consumables With Extrinsic Tags and Chemical Fingerprinting
  by: Naren Vikram Raj Masna, et al.
  Published: (2019-01-01)