Citation: YU Hanyu, HUANG Jin, ZHU Jia. Fea2Lab: A Feature-to-Label Generation Model Based on Multi-Label Learning[J]. Journal of South China Normal University (Natural Science Edition), 2020, 52(3): 111-119. DOI: 10.6054/j.jscnun.2020052
[1] MITCHELL T M. Machine learning[M]. New York: McGraw-Hill, 1997.
[2] TSOUMAKAS G, KATAKIS I. Multi-label classification: an overview[J]. International Journal of Data Warehousing and Mining, 2007, 3(3): 1-13.
[3] ZHANG M L, ZHOU Z H. A review on multi-label learning algorithms[J]. IEEE Transactions on Knowledge and Data Engineering, 2014, 26(8): 1819-1837. doi: 10.1109/TKDE.2013.39
[4] BOUTELL M R, LUO J, SHEN X, et al. Learning multi-label scene classification[J]. Pattern Recognition, 2004, 37(9): 1757-1771. doi: 10.1016/j.patcog.2004.03.009
[5] FÜRNKRANZ J, HÜLLERMEIER E, MENCÍA E L, et al. Multilabel classification via calibrated label ranking[J]. Machine Learning, 2008, 73(2): 133-153.
[6] TSOUMAKAS G, VLAHAVAS I. Random k-labelsets: an ensemble method for multilabel classification[C]//Proceedings of the 18th European Conference on Machine Learning. Berlin: Springer, 2007: 406-417.
[7] ELISSEEFF A, WESTON J. A kernel method for multi-labelled classification[C]//Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic. Vancouver: MIT Press, 2001: 681-687.
[8] ZHANG M L, ZHOU Z H. ML-KNN: a lazy learning approach to multi-label learning[J]. Pattern Recognition, 2007, 40(7): 2038-2048. doi: 10.1016/j.patcog.2006.12.019
[9] ZHANG M L, ZHANG K. Multi-label learning by exploiting label dependency[C]//Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2010: 999-1008.
[10] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[C/OL]//Proceedings of the 3rd International Conference on Learning Representations, (2014-09-01)[2019-10-10]. https://arxiv.org/abs/1409.0473.
[11] SUN X, WEI B, REN X, et al. Label embedding network: learning label representation for soft training of deep networks[J/OL]. Computing Research Repository, (2017-10-28)[2019-10-10]. https://arxiv.org/abs/1710.10393.
[12] YANG P, SUN X, LI W, et al. SGM: sequence generation model for multi-label classification[C]//Proceedings of the 27th International Conference on Computational Linguistics. Santa Fe: ACL, 2018: 3915-3926.
[13] NAM J, MENCÍA E L, KIM H J, et al. Maximizing subset accuracy with recurrent neural networks in multi-label classification[C]//Advances in Neural Information Processing Systems. California: Curran Associates, 2017: 5413-5423.
[14] HUANG Y, WANG W, WANG L, et al. Multi-task deep neural network for multi-label learning[C]//Proceedings of 2013 IEEE International Conference on Image Processing. Melbourne: IEEE, 2013: 2897-2900.
[15] ZHANG M L, ZHOU Z H. Multilabel neural networks with applications to functional genomics and text categorization[J]. IEEE Transactions on Knowledge and Data Engineering, 2006, 18(10): 1338-1351. doi: 10.1109/TKDE.2006.162
[16] CHO K, VAN MERRIËNBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha: ACL, 2014: 1724-1734.
[17] VAN DER HEIJDEN F, DUIN R P, DE RIDDER D, et al. Classification, parameter estimation and state estimation: an engineering approach using MATLAB[M]. New Jersey: John Wiley & Sons, 2005.
[18] LIU Y, XIONG Y L. Algorithm for searching nearest-neighbor based on the bounded k-d tree[J]. Journal of Huazhong University of Science and Technology (Natural Science Edition), 2008, 36(7): 73-76. doi: 10.3321/j.issn:1671-4512.2008.07.020 (in Chinese)
[19] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780. doi: 10.1162/neco.1997.9.8.1735
[20] LUONG M T, PHAM H, MANNING C D. Effective approaches to attention-based neural machine translation[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon: ACL, 2015: 1412-1421.
[21] YU L, ZHANG W, WANG J, et al. SeqGAN: sequence generative adversarial nets with policy gradient[C]//Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence. California: AAAI Press, 2017: 2852-2858.
[22] FREITAG M, AL-ONAIZAN Y. Beam search strategies for neural machine translation[C]//Proceedings of the First Workshop on Neural Machine Translation. Vancouver: Association for Computational Linguistics, 2017: 56-60.
[23] TROHIDIS K, TSOUMAKAS G, KALLIRIS G, et al. Multi-label classification of music into emotions[C]//Proceedings of the 9th International Conference on Music Information Retrieval. Philadelphia: [s.n.], 2008: 325-330.
[24] READ J, PFAHRINGER B, HOLMES G, et al. Classifier chains for multi-label classification[J]. Machine Learning, 2011, 85(3): 333-359.
[25] ZHANG M L, WU L. LIFT: multi-label learning with label-specific features[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 37(1): 107-120.
[26] KINGMA D P, BA J. Adam: a method for stochastic optimization[C/OL]//Proceedings of the 3rd International Conference on Learning Representations, (2014-12-22)[2019-10-10]. https://arxiv.org/abs/1412.6980.
[27] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. The Journal of Machine Learning Research, 2014, 15(1): 1929-1958.