We are pleased to congratulate FAIR’s Léon Bottou and Google AI’s Olivier Bousquet on receiving the NeurIPS 2018 Test of Time award. Their winning paper, “The Tradeoffs of Large Scale Learning,” was presented at NIPS 2007, while Bottou was a researcher at NEC Labs. For more about the event, visit the NeurIPS Facebook page.
Bottou joined FAIR in 2015 and is best known for his work on deep neural networks in the 1990s, large-scale learning in the 2000s, and his more recent work on causal inference in learning systems. He is also known for the DjVu document compression technology.
Facebook’s Research blog recently caught up with Bottou to learn more about the award-winning paper, which he says “explains why a deceptively simple optimization algorithm, Stochastic Gradient Descent (SGD), proposed by Robbins in 1951, gives superior performance for large-scale machine learning problems, decisively beating apparently more sophisticated optimization algorithms.”
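For readers unfamiliar with the algorithm Bottou mentions, SGD updates the model after each training example rather than after a pass over the full dataset, which is what makes it so cheap at large scale. The following is a minimal illustrative sketch of plain SGD for one-dimensional least squares; it is not the paper’s implementation, and the function name, learning rate, and data are our own choices for the example.

```python
import random

def sgd(data, lr=0.01, epochs=50, seed=0):
    """Minimal stochastic gradient descent for 1-D least squares.

    Illustrative sketch only. `data` is a list of (x, y) pairs and the
    model is y ~ w * x + b; each update uses a single example.
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)  # visit examples in random order each epoch
        for x, y in data:
            # Gradient of the squared error on this one example.
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Recover y = 2x + 1 from noise-free samples.
pairs = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]
w, b = sgd(pairs)
```

Each update costs O(1) regardless of dataset size, which is the property the paper analyzes: when data is abundant, many cheap noisy steps beat fewer expensive exact ones.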
You can read the full interview at research.fb.com.