Open Access

Rotary Position Embedding-Based Transformer Hawkes Process for Event-Type Big Data

Shan Dai1, Anningzhe Gao2, Zhuo Li3, Yuhao Du3
Shenzhen Research Institute of Big Data, Shenzhen 518172, China, and also with School of Data Science, The Chinese University of Hong Kong (Shenzhen), Shenzhen 518172, China
Shenzhen Research Institute of Big Data, Shenzhen 518172, China
School of Science and Engineering, The Chinese University of Hong Kong (Shenzhen), Shenzhen 518172, China

Shan Dai and Anningzhe Gao contributed equally to this paper.


Abstract

Temporal Point Processes (TPPs), especially the Hawkes process, are widely used to model asynchronous event-type big data such as financial transactions and user behaviors in social networks. Owing to the strong fitting ability of neural networks, various neural point processes have been proposed, among which self-attention-based neural Hawkes processes, such as the Transformer Hawkes Process (THP), achieve notable performance improvements. Although the THP has gained wide application, it still suffers from low accuracy and unstable performance in sequence prediction tasks when trained on historical sequences and used to infer future events, a prevalent paradigm in realistic sequence analysis. The conventional THP and its variants generally adopt sinusoidal position embedding in transformers, which our empirical study shows is severely sensitive to temporal shifts and noise. To address these problems, we propose a new Rotary position embedding-based THP (RoTHP), which for the first time encodes the temporal information of the Hawkes process with rotary embedding and then constructs the intensity function adaptively. Notably, we theoretically establish the translation-invariance property that the relative time encoding induces in RoTHP when coupled with the Hawkes process, and illustrate its flexibility in sequence prediction. Extensive experiments demonstrate that the proposed RoTHP generalizes better on sequence data with timestamp translations or noise and achieves superior performance in sequence prediction tasks.
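The translation-invariance claim in the abstract can be illustrated with a minimal sketch of rotary position embedding applied to event timestamps. This is not the authors' implementation; the function names and dimensions are illustrative, and the sketch only demonstrates the standard RoPE property that the attention score between two rotated vectors depends on the timestamps only through their difference, so shifting every timestamp by a constant leaves the scores unchanged.

```python
import numpy as np

def rotary_embed(x, t, base=10000.0):
    """Rotate consecutive feature pairs of x by angles proportional to t.

    x: (d,) vector with even d; t: scalar timestamp.
    Pair (x[2i], x[2i+1]) is rotated by angle t * base**(-2i/d).
    """
    d = x.shape[0]
    assert d % 2 == 0
    freqs = base ** (-np.arange(0, d, 2) / d)   # (d/2,) rotation frequencies
    angles = t * freqs
    cos, sin = np.cos(angles), np.sin(angles)
    out = np.empty_like(x)
    out[0::2] = x[0::2] * cos - x[1::2] * sin   # 2-D rotation per pair
    out[1::2] = x[0::2] * sin + x[1::2] * cos
    return out

def score(q, k, t_q, t_k):
    """Attention score between a query at time t_q and a key at time t_k."""
    return rotary_embed(q, t_q) @ rotary_embed(k, t_k)

rng = np.random.default_rng(0)
q, k = rng.normal(size=8), rng.normal(size=8)

# Translation invariance: shifting both timestamps by the same offset
# leaves the score unchanged, because it depends only on t_k - t_q.
s1 = score(q, k, t_q=2.0, t_k=5.0)
s2 = score(q, k, t_q=2.0 + 100.0, t_k=5.0 + 100.0)
print(np.isclose(s1, s2))  # True
```

Because each pair-wise rotation is orthogonal, the inner product of the rotated query and key equals the inner product of the original query with the key rotated by the time difference, which is why timestamp translations or measurement offsets do not perturb the attention pattern.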

Big Data Mining and Analytics, Pages 23-38

Cite this article:
Dai S, Gao A, Li Z, et al. Rotary Position Embedding-Based Transformer Hawkes Process for Event-Type Big Data. Big Data Mining and Analytics, 2026, 9(1): 23-38. https://doi.org/10.26599/BDMA.2025.9020029

865 Views | 43 Downloads | 1 Crossref | 0 Web of Science | 1 Scopus | 0 CSCD

Received: 06 November 2024
Revised: 27 February 2025
Accepted: 10 March 2025
Published: 10 December 2025
© The author(s) 2026.

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).