THEORY

CORE MACHINE LEARNING

Learning on Random Balls is Sufficient for Estimating (Some) Graph Parameters

December 06, 2021

Abstract

Theoretical analyses of graph learning methods often assume complete observation of the input graph. In practice, such an assumption is untenable for graphs of arbitrary size because of scalability constraints. In this work, we develop a theoretical framework for graph classification in the partial-observation setting (i.e., subgraph sampling). Drawing on insights from graph limit theory, we propose a new graph classification model that operates on a randomly sampled subgraph, along with a novel topology that characterizes the model's representability. Our framework provides a theoretical validation of mini-batch learning on graphs and yields new learning-theoretic results on generalization bounds and on size generalizability without assumptions on the input.
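The model described above operates on randomly sampled subgraphs ("random balls") rather than on the whole graph. As a rough illustration of this sampling scheme only, and not the authors' method, the sketch below draws r-hop balls around uniformly random root vertices and averages a user-supplied ball-level classifier over them; the helper names (sample_random_ball, classify_ball, predict), the radius, and the sample count are hypothetical, and the networkx dependency is an assumption.

```python
# A minimal sketch (not the authors' implementation) of prediction from
# randomly sampled balls, assuming the `networkx` library. All names here
# are hypothetical placeholders for illustration.
import random
import networkx as nx

def sample_random_ball(G: nx.Graph, radius: int) -> nx.Graph:
    """Return the subgraph induced by all vertices within `radius` hops
    of a uniformly random root vertex (an "r-ball")."""
    root = random.choice(list(G.nodes))
    return nx.ego_graph(G, root, radius=radius)

def predict(G: nx.Graph, classify_ball, radius: int = 2,
            num_samples: int = 32) -> float:
    """Average a ball-level classifier's scores over independently
    sampled balls, in the spirit of mini-batch learning on a partially
    observed graph."""
    scores = [classify_ball(sample_random_ball(G, radius))
              for _ in range(num_samples)]
    return sum(scores) / len(scores)
```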

AUTHORS

Takanori Maehara

Hoang NT

Publisher

NeurIPS

Research Topics

Theory

Core Machine Learning

Related Publications

December 06, 2021

COMPUTER VISION

CORE MACHINE LEARNING

Debugging the Internals of Convolutional Networks

Bilal Alsallakh, Narine Kokhlikyan, Vivek Miglani, Shubham Muttepawar, Edward Wang (AI Infra), Sara Zhang, David Adkins, Orion Reblitz-Richardson

December 06, 2021

CORE MACHINE LEARNING

Revisiting Graph Neural Networks for Link Prediction

Yinglong Xia

December 06, 2021

INTEGRITY

CORE MACHINE LEARNING

BulletTrain: Accelerating Robust Neural Network Training via Boundary Example Mining

Weizhe Hua, Yichi Zhang, Chuan Guo, Zhiru Zhang, Edward Suh

November 12, 2021

THEORY

REINFORCEMENT LEARNING

Bandits with Knapsacks beyond the Worst-Case Analysis

Karthik Abinav Sankararaman, Aleksandrs Slivkins
