Federated learning (FL) is an important privacy-preserving method for training AI models. We believe this is the first asynchronous FL system running at scale, training a model on 100 million Android devices. Our results show that asynchronous FL is five times faster and nearly eight times more communication-efficient than existing synchronous FL. This will enable FL-trained models to adapt quickly, improving the product experience while preserving privacy.
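To make the synchronous/asynchronous distinction concrete, the sketch below simulates a server that applies client updates as they arrive, flushing an averaged update once a small buffer fills, rather than waiting for an entire cohort as synchronous FL does. This is an illustrative toy, not the system described here: the `local_update` objective, the buffer size, and the absence of staleness weighting and secure aggregation are all simplifying assumptions.

```python
import random

def local_update(weights, lr=0.5):
    # Hypothetical client step: nudge each weight toward a target of 1.0
    # on the client's (simulated) local data and return the resulting delta.
    return [lr * (1.0 - w) + random.uniform(-0.01, 0.01) for w in weights]

def async_fl(num_updates=50, num_weights=3, buffer_size=10):
    """Server loop: accept client deltas one at a time and apply an
    averaged update every `buffer_size` contributions (buffered
    asynchronous aggregation), instead of blocking on a full cohort."""
    weights = [0.0] * num_weights
    buffer = []
    for _ in range(num_updates):
        # In a real deployment this delta arrives from an arbitrary device,
        # possibly computed against stale weights; a production server
        # would down-weight stale deltas before buffering them.
        buffer.append(local_update(weights))
        if len(buffer) >= buffer_size:
            for i in range(num_weights):
                weights[i] += sum(d[i] for d in buffer) / len(buffer)
            buffer.clear()
    return weights

random.seed(0)
print(async_fl())  # weights drift toward 1.0 as buffered updates are applied
```

Because the server never waits for stragglers, device idle time drops and each flush carries the information of many client rounds, which is the intuition behind the speed and communication-efficiency gains reported above.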