Joint Extraction of Entities and Relations Based on Multi-feature Fusion


  • Shoubin Li, University of Chinese Academy of Sciences, Beijing, China and The Institute of Software, Chinese Academy of Sciences, Beijing, China
  • Zhiyuan Chang, The Institute of Software, Chinese Academy of Sciences, Beijing, China
  • Yangyang Liu, University of Auckland, Auckland, New Zealand



Keywords: Joint Entity and Relation Extraction, Span-based Method, Entity Redundancy, Pre-trained Model, Part-of-speech


Joint extraction of entities and relations is essential for understanding massive text corpora. In recent years, span-based joint models have achieved excellent results on the entity and relation extraction task. However, previous literature and experimental results suggest that, although the span-based method solves the entity overlapping problem, it may produce more redundant entities. To address this entity redundancy problem, this paper proposes a joint extraction model based on multi-feature fusion. The overall network follows the same framework as SpERT, a state-of-the-art model for joint entity and relation extraction. In addition to the word embedding features used in SpERT, the proposed model also incorporates part-of-speech features. We believe that the part-of-speech features of entities are helpful for entity recognition and can effectively alleviate the entity redundancy problem. The proposed model is evaluated on two public data sets, CoNLL04 and ADE. The experimental results show that the proposed joint extraction model based on multi-feature fusion significantly outperforms current state-of-the-art methods.
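The abstract does not give implementation details, but the fusion idea it describes can be sketched as follows. This is an illustrative sketch only, not the paper's exact architecture: the max-pooling over the span mirrors how SpERT pools word embeddings, while the concatenation of a part-of-speech feature stream, the dimensions, and the variable names are all assumptions made for the example.

```python
import numpy as np

# Hypothetical dimensions for illustration
HIDDEN_DIM = 8   # contextual word-embedding size (768 for BERT-base in practice)
POS_DIM = 4      # assumed size of a learned part-of-speech embedding

def fuse_span_features(token_embeddings, pos_embeddings, start, end):
    """Build a fused representation for the candidate span [start, end).

    Each feature stream is max-pooled over the span (SpERT pools word
    embeddings this way); the pooled vectors are then concatenated, so a
    downstream entity classifier sees both lexical and part-of-speech
    evidence for the span.
    """
    word_feat = token_embeddings[start:end].max(axis=0)  # (HIDDEN_DIM,)
    pos_feat = pos_embeddings[start:end].max(axis=0)     # (POS_DIM,)
    return np.concatenate([word_feat, pos_feat])         # (HIDDEN_DIM + POS_DIM,)

# Toy sentence of 5 tokens
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, HIDDEN_DIM))  # would come from a pre-trained encoder
pos = rng.normal(size=(5, POS_DIM))        # rows of a learned POS-tag embedding table
span_repr = fuse_span_features(tokens, pos, 1, 3)
print(span_repr.shape)  # (12,)
```

A span classifier would then score `span_repr` against the entity types; the intuition from the abstract is that the extra part-of-speech evidence helps the classifier reject spans that are unlikely to be entities, reducing redundancy.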









How to Cite

S. Li, Z. Chang, and Y. Liu, “Joint Extraction of Entities and Relations Based on Multi-feature Fusion”, ijetaa, vol. 1, no. 1, Feb. 2024, doi: 10.62677/IJETAA.2401101.
