**Title:** A framework for studying the power of graph neural networks

**Abstract:** Graph Neural Networks (GNNs) have recently received attention for their success on many tasks involving graph-structured data, and there has been growing interest in understanding their representation power theoretically. One line of work proposes to evaluate the power of GNNs by applying them to distinguish non-isomorphic graphs. Another line of work studies the ability of GNNs to approximate functions on graph data that are invariant or equivariant to permutations. We will first introduce the background on both of these approaches, and then show the equivalence between the two perspectives. Moreover, we will propose a framework for studying the representation power of GNNs based on the concept of sigma-algebras, which unifies both perspectives. Next, under this framework, we focus on a particular type of GNN – the order-2 Graph G-invariant Network – and prove that it is unable to distinguish between non-isomorphic regular graphs with the same degree. As a consequence, it cannot universally approximate permutation-invariant functions on graph data. In light of this, we propose a new GNN architecture, which we call Ring-GNN, as an augmentation of the order-2 Graph G-invariant Network based on the polynomial ring generated by matrix addition and multiplication. We demonstrate that it succeeds in distinguishing the aforementioned non-isomorphic regular graphs, as well as on tasks with real-world datasets. Based on joint work with Soledad Villar, Lei Chen and Joan Bruna.
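As a minimal illustration (not from the talk itself) of why augmenting with matrix multiplication helps, consider two 2-regular graphs on six nodes: a 6-cycle versus two disjoint triangles. Their degree sequences agree, so degree-based features cannot separate them, but the ring operation does: `trace(A^3)` equals six times the number of triangles.

```python
import numpy as np

def cycle_adj(n):
    """Adjacency matrix of the n-cycle."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
    return A

# Two non-isomorphic 2-regular graphs on 6 nodes.
C6 = cycle_adj(6)                                   # one 6-cycle
Z = np.zeros((3, 3), dtype=int)
two_C3 = np.block([[cycle_adj(3), Z],
                   [Z, cycle_adj(3)]])              # two disjoint triangles

# Same degree sequence (all degrees 2), so degree statistics cannot tell them apart.
assert (C6.sum(axis=1) == 2).all() and (two_C3.sum(axis=1) == 2).all()

# Matrix powers separate them: trace(A^3) = 6 * (number of triangles).
print(np.trace(np.linalg.matrix_power(C6, 3)))      # 0  (no triangles)
print(np.trace(np.linalg.matrix_power(two_C3, 3)))  # 12 (two triangles)
```

Intuitively, this is the kind of invariant that becomes expressible once layers may multiply, and not only linearly combine, order-2 tensors.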

**Time:** June 24, 2020, 2:30pm – 3:00pm