Journal Home > Volume 38, Issue 3

Due to the small size of annotated corpora and the sparsity of event trigger words, event coreference resolvers cannot capture enough event semantics, especially trigger semantics, to identify coreferential event mentions. To address these issues, this paper proposes a trigger-semantics augmentation mechanism to boost event coreference resolution. First, the mechanism applies a trigger-oriented masking strategy to pre-train a BERT (Bidirectional Encoder Representations from Transformers)-based encoder (Trigger-BERT), which is fine-tuned on Gigaword, a large-scale unlabeled corpus. Second, it combines the event semantic relations from the Trigger-BERT encoder with the event interactions from a soft-attention mechanism to resolve event coreference. Experimental results on the KBP2016 and KBP2017 datasets show that the proposed model outperforms several state-of-the-art baselines.
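The abstract does not spell out the masking strategy, but a plausible minimal sketch is the following: trigger tokens are always masked (so the encoder must recover trigger semantics from context), while non-trigger tokens are masked only at a small rate, as in ordinary masked language modeling. The function name, the `extra_rate` parameter, and the example sentence are all illustrative assumptions, not details from the paper.

```python
import random

MASK = "[MASK]"

def trigger_oriented_mask(tokens, trigger_idx, extra_rate=0.05, seed=0):
    """Hypothetical trigger-oriented masking for MLM pre-training.

    Every token whose index is in ``trigger_idx`` is replaced with
    [MASK]; other tokens are masked with probability ``extra_rate``.
    Returns the masked sequence and per-position MLM targets
    (``None`` where no loss is computed).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for i, tok in enumerate(tokens):
        if i in trigger_idx or rng.random() < extra_rate:
            masked.append(MASK)
            labels.append(tok)   # predict the original token here
        else:
            masked.append(tok)
            labels.append(None)  # position excluded from the loss
    return masked, labels

# "fired" is the event trigger in this toy sentence
tokens = ["the", "company", "fired", "its", "ceo", "yesterday"]
masked, labels = trigger_oriented_mask(tokens, trigger_idx={2})
```

Biasing the mask toward triggers forces the encoder to model trigger words from their surrounding context, which is the intuition behind augmenting trigger semantics for coreference.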


Publication history

Received: 09 November 2020
Accepted: 11 April 2022
Published: 30 May 2023
Issue date: May 2023

Copyright

© Institute of Computing Technology, Chinese Academy of Sciences 2023