TY - GEN
T1 - Text-enhanced Multi-Granularity Temporal Graph Learning for Event Prediction
AU - Han, Xiaoxue
AU - Ning, Yue
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Forecasting the future is fundamentally about learning from the past. However, modeling the past is non-trivial due to the scale and complexity of the available data. Recently, Graph Neural Networks (GNNs) have shown flexibility in processing different forms of data and learning interactions among entities, giving them advantages in real-life applications. More and more researchers have started to apply GNNs and temporal models to event forecasting because events are formalized in knowledge graphs. However, most of these models rely on the Markov assumption that the probability of an event is influenced only by the state of its last time step (or recent history). We claim that the occurrence of an event has not only short-term but also long-term dependencies. In this work, we propose a temporal knowledge graph (KG)-based model that considers different granularities of history when forecasting an event; this method also integrates news texts as auxiliary features during the graph learning process. Extensive experiments on multiple datasets are conducted to examine the effectiveness of the proposed method. Code is available at: https://github.com/yuening-lab/MTG.
AB - Forecasting the future is fundamentally about learning from the past. However, modeling the past is non-trivial due to the scale and complexity of the available data. Recently, Graph Neural Networks (GNNs) have shown flexibility in processing different forms of data and learning interactions among entities, giving them advantages in real-life applications. More and more researchers have started to apply GNNs and temporal models to event forecasting because events are formalized in knowledge graphs. However, most of these models rely on the Markov assumption that the probability of an event is influenced only by the state of its last time step (or recent history). We claim that the occurrence of an event has not only short-term but also long-term dependencies. In this work, we propose a temporal knowledge graph (KG)-based model that considers different granularities of history when forecasting an event; this method also integrates news texts as auxiliary features during the graph learning process. Extensive experiments on multiple datasets are conducted to examine the effectiveness of the proposed method. Code is available at: https://github.com/yuening-lab/MTG.
KW - Dynamic Graph Neural Networks
KW - Multiple Temporal Granularities
KW - Text-enriched Knowledge Graphs
UR - http://www.scopus.com/inward/record.url?scp=85147732926&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85147732926&partnerID=8YFLogxK
U2 - 10.1109/ICDM54844.2022.00027
DO - 10.1109/ICDM54844.2022.00027
M3 - Conference contribution
AN - SCOPUS:85147732926
T3 - Proceedings - IEEE International Conference on Data Mining, ICDM
SP - 171
EP - 180
BT - Proceedings - 22nd IEEE International Conference on Data Mining, ICDM 2022
A2 - Zhu, Xingquan
A2 - Ranka, Sanjay
A2 - Thai, My T.
A2 - Washio, Takashi
A2 - Wu, Xindong
T2 - 22nd IEEE International Conference on Data Mining, ICDM 2022
Y2 - 28 November 2022 through 1 December 2022
ER -