Training large language models consumes a significant amount of energy (Strubell et al., 2019). In the spirit of carbon neutrality and minimizing the carbon footprint of model training, all models are trained on Google Cloud, which operates with net zero emissions and aims to run entirely on renewable energy by 2030. Models are trained on Google Cloud TPUs, which are significantly more energy efficient than traditional GPUs.
While training models with net zero emissions is a good first step, we acknowledge the controversy of carbon neutrality. In the long run, we aim to reduce our carbon footprint by using more efficient training processes, designing smaller models, and training on energy-efficient devices.
For transparency and accountability, carbon emissions per model training run are reported below, estimated with the ML CO2 Impact tool, which follows the methodology described in Lacoste et al. (2019). These estimates are approximate upper bounds and apply to both Generation and Representation models; estimates for custom model training are currently excluded.
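The upper-bound estimate from this methodology reduces to a simple product of runtime, hardware power draw, datacenter overhead (PUE), and the grid's carbon intensity. The sketch below illustrates that calculation; all numbers in it are hypothetical placeholders, not actual training figures for these models.

```python
def co2_kg(hours: float, device_kw: float, n_devices: int,
           pue: float, grid_kg_per_kwh: float) -> float:
    """Upper-bound CO2 estimate (kg) for one training run:
    energy used (kWh) scaled by datacenter overhead and grid carbon intensity."""
    energy_kwh = hours * device_kw * n_devices * pue
    return energy_kwh * grid_kg_per_kwh

# Illustrative only: 100 h on 8 accelerators drawing 0.3 kW each,
# PUE of 1.1, grid intensity of 0.4 kg CO2 per kWh.
print(round(co2_kg(100, 0.3, 8, 1.1, 0.4), 1))  # → 105.6
```

In practice the tool looks up the PUE and carbon intensity for the chosen cloud region, which is why estimates vary with where a model is trained.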
| Model | Kg CO2 eq. | Transatlantic flight eq. |
| --- | --- | --- |
Transatlantic flight eq. is calculated based on a total CO2 emission of 41,284.1 kg per flight (Toronto to London).