To address the difficulty of data feature selection and the low classification accuracy in music emotion classification, this study proposes a music emotion classification algorithm based on a deep belief network (DBN). The traditional DBN is improved by adding fine-tuning nodes to enhance the adjustability of the model. Then, two music data features, pitch frequency and band energy distribution, are fused as the input of the model. Finally, a support vector machine (SVM) is used as the classifier to realize music emotion classification. The fusion algorithm is tested on real datasets. The results show that the fused pitch-frequency and band-energy-distribution features can effectively represent music emotion. The improved DBN combined with the SVM classifier achieves a music emotion classification accuracy of 88.31%, which compares favorably with existing classification methods.
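The pipeline described in the abstract (fused acoustic features, DBN-style unsupervised feature learning, then an SVM classifier) can be sketched roughly as follows. This is a minimal illustration, not the authors' exact implementation: the feature values are synthetic stand-ins, a single `BernoulliRBM` layer stands in for the improved DBN, and the number of emotion classes is assumed.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_clips = 200
pitch = rng.random((n_clips, 8))      # hypothetical pitch-frequency features
band = rng.random((n_clips, 16))      # hypothetical band-energy-distribution features
X = np.hstack([pitch, band])          # feature fusion by concatenation
y = rng.integers(0, 4, size=n_clips)  # four emotion classes (an assumption)

model = Pipeline([
    ("scale", MinMaxScaler()),        # RBM expects inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                         n_iter=20, random_state=0)),  # DBN layer stand-in
    ("svm", SVC(kernel="rbf")),       # SVM as the final classifier
])
model.fit(X, y)
preds = model.predict(X[:5])
```

In the paper's setting, the synthetic `pitch` and `band` arrays would be replaced by features extracted from real audio, and the single RBM layer by the improved multi-layer DBN with fine-tuning nodes.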
Keywords
Music Emotion Classification, Deep Belief Network, Feature Fusion
Institute(s)
College of Music Huizhou University
Year
2022
Author(s)
Guiying Tong