NLP

The Framework Tax: Disparities Between Inference Efficiency in NLP Research and Deployment

November 07, 2023

Abstract

Increased focus on the computational efficiency of NLP systems has motivated the design of efficient model architectures and improvements to underlying hardware accelerators. However, the resulting increases in computational throughput and reductions in floating point operations have not directly translated to improvements in wall-clock inference latency. We demonstrate that these discrepancies can be largely attributed to bottlenecks introduced by deep learning frameworks. We denote this phenomenon as the framework tax, and observe that the disparity is growing as hardware speed increases over time. In this work, we examine this phenomenon through a series of case studies analyzing the effects of model design decisions, framework paradigms, and hardware platforms on total model latency.
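The framework-tax effect described above can be illustrated with a simple additive latency model (a sketch with hypothetical numbers, not taken from the paper): total latency is fixed per-call framework overhead plus ideal compute time, so as hardware throughput grows, the fixed overhead dominates and FLOP reductions stop translating into wall-clock speedups.

```python
# Illustrative sketch with hypothetical numbers (not from the paper):
# latency = fixed framework overhead + FLOPs / hardware throughput.

def latency_ms(flops, throughput_flops_per_s, framework_overhead_ms=5.0):
    """Total latency = fixed framework overhead + ideal compute time."""
    compute_ms = flops / throughput_flops_per_s * 1e3
    return framework_overhead_ms + compute_ms

# Compare a 10 GFLOP model against a 2x smaller 5 GFLOP variant
# on a slower and a faster accelerator.
slow_hw = 1e12   # 1 TFLOP/s
fast_hw = 1e13   # 10 TFLOP/s

for hw in (slow_hw, fast_hw):
    full = latency_ms(10e9, hw)
    half = latency_ms(5e9, hw)
    print(f"throughput={hw:.0e}: full={full:.2f} ms, "
          f"half={half:.2f} ms, speedup={full / half:.2f}x")
```

Under these assumed numbers, halving FLOPs yields a 1.5x speedup on the slower accelerator but only about 1.09x on the faster one, mirroring the abstract's observation that the disparity grows as hardware speed increases.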

AUTHORS

Jared Fernandez

Jacob Kahn

Clara Na

Yonatan Bisk

Emma Strubell

Publisher

EMNLP
