Temporal Point Processes (TPPs), especially the Hawkes process, are commonly used for modeling large-scale asynchronous event data, such as financial transactions and user behaviors in social networks. Owing to the strong fitting ability of neural networks, various neural point processes have been proposed, among which neural Hawkes processes based on self-attention, such as the Transformer Hawkes Process (THP), achieve notable performance improvements. Although the THP has been widely applied, it still suffers from low accuracy and unstable performance in sequence prediction tasks where the model is trained on historical sequences and used to infer future events, a prevalent paradigm in realistic sequence analysis. Conventional THP and its variants generally adopt the sinusoidal position embedding of the Transformer, which our empirical study shows to be severely sensitive to timestamp shifts and noise. To address these problems, we propose a Rotary position embedding-based THP (RoTHP), which for the first time encodes the temporal information of the Hawkes process with rotary embeddings and then constructs the intensity function adaptively. Notably, we theoretically show the translation-invariance property that the relative time encoding induces when coupled with the Hawkes process, and illustrate the flexibility this brings to sequence prediction. Extensive experiments demonstrate that RoTHP generalizes better on sequence data with timestamp translations or noise, and achieves superior performance in sequence prediction tasks.
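To make the translation-invariance claim concrete, the sketch below illustrates the core idea of a rotary embedding applied to continuous timestamps rather than integer positions. This is a minimal illustration under our own assumptions, not the paper's exact implementation: the function name `rotary_time_embedding`, the dimension, and the frequency base are all hypothetical choices for demonstration. Because each query/key pair is rotated by an angle proportional to its timestamp, the attention score between two events depends only on their time difference, so shifting every timestamp by a constant leaves the scores unchanged.

```python
import numpy as np

def rotary_time_embedding(x, t, base=10000.0):
    """Rotate a feature vector by angles proportional to a continuous timestamp t.

    x: (d,) array with d even; each pair (x[2i], x[2i+1]) is rotated
    by the angle t * base**(-2i/d), following the rotary-embedding scheme.
    (Illustrative sketch, not the authors' implementation.)
    """
    d = x.shape[-1]
    assert d % 2 == 0
    freqs = base ** (-np.arange(0, d, 2) / d)   # one rotation frequency per 2-D pair
    angles = t * freqs
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin             # standard 2-D rotation of each pair
    out[1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(0)
q, k = rng.normal(size=8), rng.normal(size=8)
t_i, t_j, shift = 2.3, 5.1, 100.0

# Attention score between events at times t_i and t_j ...
s = rotary_time_embedding(q, t_i) @ rotary_time_embedding(k, t_j)
# ... is unchanged when every timestamp is translated by the same amount,
# since the rotation angles enter only through the difference t_j - t_i.
s_shifted = rotary_time_embedding(q, t_i + shift) @ rotary_time_embedding(k, t_j + shift)
print(np.isclose(s, s_shifted))  # True: translation invariance
```

This invariance is what lets a model trained on historical sequences be applied to future windows whose absolute timestamps lie outside the training range, which a sinusoidal embedding of absolute time does not guarantee.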
The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).