Lookup NU author(s): John Brennan,
Dr Stephen McGough
This is the authors' accepted manuscript of a conference proceedings (inc. abstract) that has been published in its final definitive form by IEEE, 2018.
For re-use rights please refer to the publisher's terms and conditions.
Graphs are a commonly used construct for representing relationships between elements in complex high-dimensional datasets. Many real-world phenomena are dynamic in nature, meaning that any graph used to represent them is inherently temporal. However, many of the machine learning models designed to capture knowledge about the structure of these graphs ignore this rich temporal information when creating representations of the graph. This results in models which do not perform well when used to make predictions about a later point in a graph's time series when the delta between time steps is not small. In this work, we explore a novel training procedure and an associated unsupervised model which creates graph representations optimised to predict the future state of the graph. We make use of graph convolutional neural networks to encode the graph into a latent representation, which we then use to train our temporal offset reconstruction method, inspired by auto-encoders, to predict a later time point. Using our method, we demonstrate superior performance for the task of future link prediction compared with non-temporal state-of-the-art baselines. We show our approach to be capable of outperforming non-temporal baselines by 38% on a real world dataset.
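The pipeline the abstract describes (a GCN encoder producing latent node representations, decoded to predict links at a later time step) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names are hypothetical, a single GCN layer with an inner-product decoder stands in for the full model, and the training loop that fits the reconstruction against the graph at t+1 is omitted.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalisation with self-loops: D^{-1/2} (A + I) D^{-1/2},
    # the standard GCN propagation matrix.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_encode(A, X, W):
    # One GCN layer: ReLU(A_norm @ X @ W) -> latent node embeddings Z.
    return np.maximum(0.0, normalize_adj(A) @ X @ W)

def decode_links(Z):
    # Inner-product decoder with a sigmoid: predicted edge probabilities.
    return 1.0 / (1.0 + np.exp(-(Z @ Z.T)))

rng = np.random.default_rng(0)
n, f, d = 6, 4, 3
A_t = (rng.random((n, n)) < 0.3).astype(float)
A_t = np.triu(A_t, 1)
A_t = A_t + A_t.T                       # undirected graph observed at time t
X = rng.standard_normal((n, f))         # node features
W = rng.standard_normal((f, d)) * 0.1   # encoder weights (untrained here)

Z = gcn_encode(A_t, X, W)   # latent representation of the graph at time t
A_pred = decode_links(Z)    # reconstruction targeted at the graph at t+1
```

In the temporal-offset setting, `A_pred` would be scored against the adjacency matrix at a later time step (e.g. with a cross-entropy loss) rather than against the input graph itself, which is what distinguishes the approach from a conventional graph auto-encoder.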
Author(s): Bonner S, Brennan J, Kureshi I, Theodoropoulos G, McGough AS, Obara B
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: IEEE International Conference on Big Data
Year of Conference: 2018
Online publication date: 10/12/2018
Acceptance date: 10/11/2018
Date deposited: 21/11/2018