Title: Graph Neural Networks and Embedding Geometry
Abstract: Recent work on graph neural networks (GNNs) has achieved state-of-the-art results on many challenging tasks, and GNNs are now applied to a variety of use cases in scientific domains, large-scale networks, and NP-hard problems. Here I will present the use of embedding geometry to further improve the expressive power of GNNs, using two examples: Hyperbolic GCN and Neural Subgraph Matching.
Traditional Graph Convolutional Networks (GCNs) embed the nodes of a graph into Euclidean space, which has been shown to incur large distortion when embedding real-world graphs with scale-free or hierarchical structure. Hyperbolic geometry offers an exciting alternative, as it enables embeddings with much smaller distortion. We propose Hyperbolic GCN (HGCN), the first inductive hyperbolic GCN, which leverages both the expressiveness of GCNs and hyperbolic geometry to learn inductive node representations for hierarchical and scale-free graphs. We derive GCN operations in the hyperboloid model of hyperbolic space and map Euclidean input features to hyperbolic embeddings, with a different trainable curvature at each layer.
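As a minimal sketch of the mapping step described above, the code below implements the exponential map at the origin of the hyperboloid (Lorentz) model, which lifts a Euclidean feature vector onto the hyperboloid of curvature -c. The function names and the choice of NumPy are illustrative assumptions, not the talk's actual implementation.

```python
import numpy as np

def minkowski(x, y):
    # Minkowski (Lorentzian) inner product: -x0*y0 + <x_1:, y_1:>.
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def expmap_origin(v, c=1.0):
    """Map a Euclidean feature v onto the hyperboloid model of
    curvature -c, treating (0, v) as a tangent vector at the origin
    o = (1/sqrt(c), 0, ..., 0).

    Returns a point x satisfying minkowski(x, x) == -1/c.
    (Hypothetical helper for illustration only.)
    """
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v)
    time_coord = np.cosh(sqrt_c * norm) / sqrt_c
    if norm == 0.0:
        return np.concatenate([[time_coord], np.zeros_like(v)])
    spatial = np.sinh(sqrt_c * norm) * v / (sqrt_c * norm)
    return np.concatenate([[time_coord], spatial])
```

A quick sanity check: for any input feature, the mapped point satisfies the hyperboloid constraint `minkowski(x, x) = -1/c`, so downstream layers can operate on valid hyperbolic points while `c` itself remains a trainable parameter.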
Subgraph matching is the problem of determining the presence and location(s) of a given query graph in a large target graph. We propose NeuroMatch, an accurate, efficient, and robust neural approach to subgraph matching. NeuroMatch decomposes query and target graphs into small subgraphs and embeds them using graph neural networks. Trained to capture the geometric constraints that correspond to subgraph relations, NeuroMatch then efficiently performs subgraph matching directly in the order-embedding space. Experiments demonstrate that NeuroMatch is 100x faster than existing combinatorial approaches and 18% more accurate than existing approximate subgraph matching methods.
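To illustrate the geometric constraint mentioned above: in an order embedding, "query is a subgraph of target" corresponds to the query embedding lying coordinate-wise below the target embedding. The sketch below, with hypothetical function names and a max-margin-style violation score as one common formulation, shows how a subgraph decision reduces to a cheap elementwise comparison once embeddings are computed.

```python
import numpy as np

def order_violation(z_query, z_target):
    """Order-embedding violation score: zero iff z_query <= z_target
    in every coordinate, i.e. the geometric encoding of the
    subgraph relation. (Illustrative sketch, not the exact
    NeuroMatch objective.)"""
    return float(np.sum(np.maximum(0.0, z_query - z_target) ** 2))

def is_subgraph(z_query, z_target, threshold=1e-6):
    # Predict the subgraph relation by thresholding the violation.
    return order_violation(z_query, z_target) < threshold
```

Because this test involves only elementwise operations on fixed-size vectors, matching in the embedding space avoids the combinatorial search over node correspondences that makes exact subgraph matching expensive.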
Time: June 24, 2020, 3:30pm – 4:00pm