Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model
This article, co-authored by Alexandra Sasha Luccioni, Sylvain Viguier, and Anne-Laure Ligozat, estimates that BLOOM, a large language model with 176 billion parameters, emitted 24.7 tonnes of CO2 during a single training run.

Abstract: Progress in machine learning (ML) comes with a cost to the […]