
PyTorch: An Imperative Style, High-Performance Deep Learning Library

December 02, 2019

Abstract

Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it provides an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance. We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several common benchmarks.
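For readers unfamiliar with the programming style the abstract describes, the following is a minimal, illustrative sketch, not taken from the paper: it defines a model as ordinary Python code, trains it with an imperative loop, and opts into GPU acceleration when available. The model, sizes, and hyperparameters here are invented for illustration.

```python
import torch
import torch.nn as nn

# A model is ordinary Python code: plain classes, plain control flow.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 32)
        self.fc2 = nn.Linear(32, 1)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

# Hardware acceleration is opt-in: the same code runs on CPU or GPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyNet().to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(64, 10, device=device)  # toy inputs
y = torch.randn(64, 1, device=device)   # toy targets

# Training is an imperative loop; intermediate values can be printed
# or inspected with standard Python debugging tools at any point.
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # reverse-mode automatic differentiation
    optimizer.step()
```

Because execution is eager, each line runs immediately, which is what makes standard Python tooling (debuggers, profilers, print statements) work on models without any framework-specific machinery.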


Authors

Soumith Chintala

Adam Lerer

Benoit Steiner

Edward Yang

Francisco Massa

Gregory Chanan

Junjie Bai

Lu Fang

Sam Gross

Zachary DeVito

Zeming Lin

Adam Paszke

Alban Desmaison

Alykhan Tejani

Andreas Köpf

James Bradbury

Luca Antiga

Martin Raison

Natalia Gimelshein

Sasank Chilamkurthy

Trevor Killeen

Publisher

NeurIPS

Research Topics

Computer Vision

