
Expressiveness and Decidability of Graph Neural Networks

Chia-Hsuan Lu (Oxford)

In this talk, I will present our recent work on the expressiveness and decidability of graph neural networks (GNNs), using logics with Presburger quantifiers: quantifiers that can express linear constraints on the number of elements satisfying certain properties. The first part of the talk focuses on GNNs whose activation functions are eventually constant as the input approaches positive or negative infinity; the truncated ReLU is a standard example of such an activation function. I will show that these GNNs have the same expressiveness as two-variable modal logic with Presburger quantifiers (MP2). Combined with prior results about MP2, this gives decidability of static analysis problems for these GNNs, as long as only "local aggregation" -- aggregation over the neighbors of a given node -- is allowed. We show that these analysis problems become undecidable if "global readout" -- aggregation over all nodes in the graph -- is permitted. I will then present a complexity result for one special case, GNNs with truncated ReLU activations, for which our main analysis problem is PSPACE-complete.
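To make the notions of truncated ReLU, local aggregation, and global readout concrete, here is a minimal sketch of one aggregate-combine GNN layer. It is not the construction from the talk; the function names, the sum-based update, and the weight matrices are illustrative assumptions.

```python
import numpy as np

def truncated_relu(x):
    # Truncated ReLU: clamps values to [0, 1]; it is constant once the
    # input is large in absolute value, i.e. "eventually constant".
    return np.clip(x, 0.0, 1.0)

def gnn_layer(features, adjacency, W_self, W_neigh, W_read=None):
    """One illustrative GNN layer (hypothetical names and update rule).

    features:  (n, d) node feature matrix
    adjacency: (n, n) 0/1 adjacency matrix of an undirected graph
    Local aggregation sums the features of each node's neighbors;
    if W_read is given, a global readout (sum over all nodes) is added.
    """
    local = adjacency @ features                 # sum over neighbors only
    out = features @ W_self + local @ W_neigh
    if W_read is not None:                       # global readout variant
        out = out + features.sum(axis=0) @ W_read  # same global term for every node
    return truncated_relu(out)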

The last part of the talk concerns GNNs with non-truncated activations, such as the standard ReLU function. We show that GNNs with ReLU activations are strictly more expressive than GNNs with eventually constant activations. Finally, I will present a partial decidability result for a variant of the model over directed graphs, in which each node aggregates only over its outgoing edges.
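For contrast with the sketch above, the following hypothetical snippet shows the standard (unbounded) ReLU and a layer that aggregates only over a node's out-neighbors on a directed graph; again the names and the simple sum-based update are assumptions, not the construction from the talk.

```python
import numpy as np

def relu(x):
    # Standard ReLU: unbounded above, hence not eventually constant.
    return np.maximum(x, 0.0)

def directed_gnn_layer(features, adjacency, W_self, W_out):
    # Directed variant: adjacency[u, v] = 1 iff there is an edge u -> v,
    # so row u of (adjacency @ features) sums only over u's OUT-neighbors.
    return relu(features @ W_self + (adjacency @ features) @ W_out)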
