[1] HENDRICKX I, KIM S N, KOZAREVA Z, et al. SemEval-2010 task 8: multi-way classification of semantic relations between pairs of nominals[C]//Proceedings of the NAACL HLT Workshop on Semantic Evaluations: Recent Achievements and Future Directions. ACL, 2009: 94-99.

[2] SOCHER R, HUVAL B, MANNING C D, et al. Semantic compositionality through recursive matrix-vector spaces[C]//Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. ACL, 2012: 1201-1211.

[3] ZENG D, LIU K, LAI S, et al. Relation classification via convolutional deep neural network[C]//Proceedings of the 25th International Conference on Computational Linguistics. Stroudsburg: ACL, 2014: 2335-2344.

[4] YU M, GORMLEY M, DREDZE M. Factor-based compositional embedding models[C]//Proceedings of the NIPS Workshop on Learning Semantics. Montreal: NIPS, 2014: 95-101.

[5] SANTOS C N, XIANG B, ZHOU B. Classifying relations by ranking with convolutional neural networks[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. Beijing: ACL, 2015: 626-634.

[6] SHEN Y, HUANG X J. Attention-based convolutional neural network for semantic relation extraction[C]//Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers. Osaka: ICCL, 2016: 2526-2536.

[7] LEE J, SEO S, CHOI Y S. Semantic relation classification via bidirectional LSTM networks with entity-aware attention using latent entity typing[J]. Symmetry, 2019, 11(6): 785-796. doi: 10.3390/sym11060785.

[8] LI Q, JI H. Incremental joint extraction of entity mentions and relations[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore: ACL, 2014: 402-412.

[9] BAI T, GUAN H, WANG S, et al. Traditional Chinese medicine entity relation extraction based on CNN with segment attention[J]. Neural Computing and Applications, 2022: 1-10.

[10] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: ACL, 2016: 1105-1116.

[11] ZHANG Y, ZHONG V, CHEN D, et al. Position-aware attention and supervised data improve slot filling[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: ACL, 2017: 35-45.

[12] DAI A M, LE Q V. Semi-supervised sequence learning[C]//Proceedings of the International Conference on Neural Information Processing Systems. MIT Press, 2015: 3079-3087.

[13] PETERS M E, AMMAR W, BHAGAVATULA C, et al. Semi-supervised sequence tagging with bidirectional language models[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. ACL, 2017: 1756-1765.

[14] HOWARD J, RUDER S. Universal language model fine-tuning for text classification[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. ACL, 2018: 328-339.

[15] ZHANG Y, QI P, MANNING C D. Graph convolution over pruned dependency trees improves relation extraction[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels: ACL, 2018: 2205-2215.

[16] DEVLIN J, CHANG M, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2019: 4171-4186.

[17] WU S, HE Y. Enriching pre-trained language model with entity information for relation classification[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management. ACM, 2019: 2361-2364.

[18] GAO S, DU J, ZHANG X. Research on relation extraction method of Chinese electronic medical records based on BERT[C]//Proceedings of the 2020 6th International Conference on Computing and Artificial Intelligence. ACM, 2020: 487-490.