
Relation extraction is widely used to identify semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency-tree-based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to effectively capture syntactic and semantic features, respectively. Specifically, we first propose a novel representation learning scheme to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to compensate for its semantic deficits. We further design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves extraction accuracy. We conduct experimental studies on three real-world datasets. Compared with traditional methods, our method improves the F1 score by 0.3, 0.1, and 1.6 on the three datasets, respectively.
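To make the GCN-over-dependency-tree idea concrete, the sketch below shows a minimal graph convolution layer that propagates token features along dependency edges. It is an illustrative assumption, not the paper's DDT-REM architecture: the class name DepGCNLayer, the dimensions, and the degree-normalized aggregation are all hypothetical choices standing in for the GCN extension the abstract mentions.

```python
# Minimal sketch of a GCN layer over a dependency tree's adjacency matrix.
# Hypothetical names/shapes; not the actual DDT-REM implementation.
import torch
import torch.nn as nn

class DepGCNLayer(nn.Module):
    """One GCN layer that propagates token features along dependency edges."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (batch, seq, seq) adjacency of the dependency tree;
        # add self-loops so each token keeps its own features.
        adj = adj + torch.eye(adj.size(-1), device=adj.device)
        # Normalize by node degree to keep feature magnitudes stable.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        h = adj @ x / deg          # average each token with its neighbors
        return torch.relu(self.linear(h))

# Toy usage: one sentence of 4 tokens with 8-dim features and a
# chain-shaped dependency tree (edges 0-1, 1-2, 2-3).
x = torch.randn(1, 4, 8)
adj = torch.zeros(1, 4, 4)
for i, j in [(0, 1), (1, 2), (2, 3)]:   # undirected dependency edges
    adj[0, i, j] = adj[0, j, i] = 1.0
layer = DepGCNLayer(8)
print(layer(x, adj).shape)  # torch.Size([1, 4, 8])
```

In a relation extraction model of this kind, the resulting token representations would typically be pooled over the entity spans and fed to a classifier over relation labels; the dual-tree and local-global attention components described in the abstract would operate on top of representations like these.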

Publication history

Received: 15 April 2022
Accepted: 16 November 2022
Published: 30 November 2022
Issue date: November 2022

Copyright

© Institute of Computing Technology, Chinese Academy of Sciences 2022