CombOptNet: Fit the Right NP-Hard Problem by Learning Integer Programming Constraints

July 18, 2021

Abstract

Bridging logical and algorithmic reasoning with modern machine learning techniques is a fundamental challenge with potentially transformative impact. On the algorithmic side, many NP-hard problems can be expressed as integer programs, in which the constraints play the role of their "combinatorial specification". In this work, we aim to integrate integer programming solvers into neural network architectures as layers capable of learning both the cost terms and the constraints. The resulting end-to-end trainable architectures jointly extract features from raw data and solve a suitable (learned) combinatorial problem with state-of-the-art integer programming solvers. We demonstrate the potential of such layers with an extensive performance analysis on synthetic data and with a demonstration on a competitive computer vision keypoint matching benchmark.
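To make the idea concrete, below is a minimal, self-contained sketch of what the forward pass of such an integer-programming layer could look like. It is not the authors' implementation: the brute-force solver, the small integer search box, and the names `IntegerProgramLayer` and `solve_ilp_bruteforce` are illustrative assumptions, and the paper's key contribution, a gradient construction that makes the constraints themselves learnable, is not reproduced here.

```python
# Conceptual sketch (not the authors' implementation) of an integer-programming
# layer: given a cost vector c and constraints (A, b), return the integer point
# y minimizing c^T y subject to A y <= b. The solver is brute-force enumeration
# over a small box for illustration only; CombOptNet uses a state-of-the-art
# ILP solver and a dedicated gradient scheme described in the paper.
# All names here are hypothetical.

import itertools
import numpy as np


def solve_ilp_bruteforce(c, A, b, lo=-2, hi=2):
    """Return argmin_y c^T y s.t. A y <= b, with y integer in [lo, hi]^n."""
    n = c.shape[0]
    best_y, best_val = None, np.inf
    for y in itertools.product(range(lo, hi + 1), repeat=n):
        y = np.asarray(y, dtype=float)
        if np.all(A @ y <= b + 1e-9):          # feasibility check
            val = float(c @ y)
            if val < best_val:
                best_y, best_val = y, val
    return best_y


class IntegerProgramLayer:
    """Forward pass of a layer whose parameters are the constraints (A, b)."""

    def __init__(self, A, b):
        self.A = A      # constraint matrix (learnable in CombOptNet)
        self.b = b      # right-hand side (learnable in CombOptNet)

    def forward(self, c):
        # c would typically be produced by an upstream feature extractor.
        return solve_ilp_bruteforce(c, self.A, self.b)


if __name__ == "__main__":
    # Tiny 2-variable example: the constraints carve out a feasible region,
    # the cost selects an integer point of it.
    A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
    b = np.array([3.0, 0.0, 0.0])              # y1 + y2 <= 3, y >= 0
    layer = IntegerProgramLayer(A, b)
    print(layer.forward(np.array([-1.0, -2.0])))   # -> [1. 2.], maximizing y1 + 2*y2
```

In CombOptNet itself, the brute-force call would be replaced by a state-of-the-art ILP solver, and the layer would sit inside an end-to-end trainable network in which upstream layers produce the cost vector from raw data while the constraint parameters are updated by backpropagation.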

AUTHORS

Anselm Paulus

Michal Rolínek

Vít Musil

Brandon Amos

Georg Martius

Publisher

ICML 2021

Research Topics

Core Machine Learning
