Advancements in Graph
Neural Networks:
PGNNs, Pretraining, and OGB
Jure Leskovec
Includes joint work with W. Hu, J. You, M. Fey, Y. Dong, B. Liu,
M. Catasta, K. Xu, S. Jegelka, M. Zitnik, P. Liang, V. Pande
Modern ML Toolbox
Images
Text/Speech
The modern deep learning toolbox is
designed for simple sequences & grids
Jure Leskovec, Stanford University
But not everything
can be represented as
a sequence or a grid
How can we develop neural
networks that are much more
broadly applicable?
New frontiers beyond classic neural
networks that learn on images and
sequences
Representation Learning in Graphs
[Figure: input network mapped to node embeddings z]
Input: Network
Predictions: Node labels,
New links, Generated
graphs and subgraphs
Networks of Interactions
Social networks
Knowledge graphs
Biological networks
Complex Systems
Molecules
Why is it Hard?
Networks are complex!
§ Arbitrary size and complex topological structure (i.e., no spatial locality like grids)
§ No fixed node ordering or reference point
§ Often dynamic, with multimodal features
[Figure: text sequences and image grids vs. arbitrary networks]
Graph Neural Networks
[Figure: input graph over nodes A–F; target node A's two-hop neighborhood unrolled into a tree-structured computation graph]
Each node defines a computation graph
§ Each edge in this graph is a
transformation/aggregation function
Scarselli et al. 2009. The Graph Neural Network Model. IEEE Transactions on Neural Networks.
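One round of this per-node computation can be sketched in a few lines. This is a minimal pure-Python toy, not the model from the cited paper: the mean aggregator, the scalar features, and the fixed weights `w_self` / `w_neigh` are illustrative assumptions.

```python
# Minimal sketch of one message-passing round in a GNN:
# each node combines its own feature with an aggregate
# (here a mean) of its neighbors' features.

def gnn_layer(adj, features, w_self=0.5, w_neigh=0.5):
    """adj: node -> list of neighbor nodes; features: node -> scalar."""
    new_features = {}
    for node, neighbors in adj.items():
        if neighbors:
            neigh_mean = sum(features[n] for n in neighbors) / len(neighbors)
        else:
            neigh_mean = 0.0
        new_features[node] = w_self * features[node] + w_neigh * neigh_mean
    return new_features

# Toy graph mirroring the slide: target node A with neighbors B, C, D.
adj = {"A": ["B", "C", "D"], "B": ["A", "C"], "C": ["A", "B", "E", "F"],
       "D": ["A"], "E": ["C"], "F": ["C"]}
feats = {"A": 1.0, "B": 2.0, "C": 3.0, "D": 4.0, "E": 5.0, "F": 6.0}

h1 = gnn_layer(adj, feats)  # depth-1: each node sees its 1-hop neighbors
h2 = gnn_layer(adj, h1)     # depth-2: information now flows from 2-hop nodes
```

Stacking two rounds is what produces the tree-shaped, two-level computation graph on the slide: after the second call, node A's value depends on E and F even though they are not its direct neighbors.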
[Figure: the same computation graph for target node A, with each aggregation step implemented by a neural network]
Intuition: Nodes aggregate information from
their neighbors using neural networks
Inductive Representation Learning on Large Graphs. W. Hamilton, R. Ying, J. Leskovec. NIPS, 2017.
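The "aggregate with a neural network" intuition can be sketched in the style of the cited GraphSAGE paper: concatenate a node's own vector with its neighbors' mean, then apply a learned linear map and a nonlinearity. This is a hedged toy sketch, not the paper's implementation; the weight matrix `W` and the features below are made-up constants.

```python
# GraphSAGE-style aggregation sketch: out_v = ReLU(W @ [h_v ; mean(h_u)]).
# Pure Python; W is a list of rows acting on the concatenated vector.

def sage_layer(adj, h, W):
    """adj: node -> nonempty neighbor list; h: node -> feature vector."""
    out = {}
    for v, nbrs in adj.items():
        d = len(h[v])
        mean = [sum(h[u][i] for u in nbrs) / len(nbrs) for i in range(d)]
        z = h[v] + mean  # list concatenation: [h_v ; mean], length 2*d
        out[v] = [max(0.0, sum(W[r][c] * z[c] for c in range(2 * d)))
                  for r in range(len(W))]  # ReLU(W @ z)
    return out

# Tiny two-node example with a made-up 1x2 weight matrix.
adj = {"A": ["B"], "B": ["A"]}
h0 = {"A": [1.0], "B": [3.0]}
W = [[1.0, 1.0]]
out = sage_layer(adj, h0, W)
```

In the real model, `W` is trained end-to-end by backpropagation, and the aggregator can also be a max-pooling network or an LSTM rather than a mean.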