🌱 Carbontracker
Seamlessly measure the carbon footprint of your machine learning models.
Install now using pip
pip install carbontracker
Carbontracker tracks hardware power consumption and local energy carbon intensity during training to provide accurate measurements and predictions of the operational carbon footprint.
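At its core, the reported footprint is the measured energy draw multiplied by the local grid's carbon intensity. A minimal sketch of that arithmetic, with illustrative numbers rather than carbontracker's internals:

```python
def co2eq_grams(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Return grams of CO2-equivalent for a given energy draw.

    Illustrative only: carbontracker measures energy_kwh from hardware
    sensors and fetches intensity_g_per_kwh for the local grid.
    """
    return energy_kwh * intensity_g_per_kwh

# E.g. one 0.000038 kWh epoch on a low-carbon grid (~82.4 gCO2eq/kWh):
print(co2eq_grams(0.000038, 82.4))  # roughly 0.00313 g
```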
Multi-platform
Supports Intel CPUs, NVIDIA GPUs and Apple silicon.
Localized
Fetches carbon intensity estimates based on geographic position.
Plug-and-play
Available as both CLI and Python bindings for easy integration into existing solutions.
Provides intelligent predictions
Estimates total emissions after first epoch.
Live updates
Power consumption and carbon intensity are not static, so the estimates are not either: they are updated continuously during training.
Minimal overhead
Runs in separate threads and only adds a minor computation cost.
Easily integrated
Contains extensive tooling for parsing log files for easy integration with other tools.
(Upcoming) HPC Ready
Deploy on SLURM to track carbon emissions across the entire cluster.
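The after-one-epoch prediction above can be pictured as scaling the first epoch's measurements to the full run. A hedged sketch of that idea (carbontracker's real predictor also accounts for forecast carbon intensity, so pure linear scaling is only an approximation):

```python
def predict_total(first_epoch_energy_kwh: float,
                  first_epoch_seconds: float,
                  total_epochs: int) -> tuple[float, float]:
    """Linearly extrapolate energy (kWh) and time (s) over all epochs.

    Simplified illustration; not carbontracker's actual predictor.
    """
    return (first_epoch_energy_kwh * total_epochs,
            first_epoch_seconds * total_epochs)

energy, seconds = predict_total(0.000038, 10.0, 1000)
print(f"{energy:.6f} kWh over {seconds / 3600:.1f} h")  # 0.038000 kWh over 2.8 h
```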
Example usage
Use the CLI:
$ carbontracker python script.py
...or embed directly into your Python program:
from carbontracker.tracker import CarbonTracker

tracker = CarbonTracker(epochs=max_epochs)

# Training loop.
for epoch in range(max_epochs):
    tracker.epoch_start()
    # Your model training.
    tracker.epoch_end()

# Optional: add a stop in case of early termination before all monitor_epochs have
# been monitored, to ensure that the actual consumption is reported.
tracker.stop()
Example output:
CarbonTracker:
Actual consumption for 1 epoch(s):
        Time:   0:00:10
        Energy: 0.000038 kWh
        CO2eq:  0.003130 g
        This is equivalent to:
        0.000026 km travelled by car
CarbonTracker:
Predicted consumption for 1000 epoch(s):
        Time:   2:52:22
        Energy: 0.038168 kWh
        CO2eq:  4.096665 g
        This is equivalent to:
        0.034025 km travelled by car
CarbonTracker: Finished monitoring.
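For post-hoc analysis, the report format above is easy to pull numbers out of. A small standard-library sketch (carbontracker ships its own parsing tooling for its log files, which should be preferred in practice):

```python
import re

# Sample report in the format shown above.
REPORT = """CarbonTracker:
Actual consumption for 1 epoch(s):
\tTime:\t0:00:10
\tEnergy:\t0.000038 kWh
\tCO2eq:\t0.003130 g"""

def parse_report(text: str) -> dict:
    """Extract energy (kWh) and CO2eq (g) from a consumption report."""
    energy = float(re.search(r"Energy:\s*([\d.]+)\s*kWh", text).group(1))
    co2eq = float(re.search(r"CO2eq:\s*([\d.]+)\s*g", text).group(1))
    return {"energy_kwh": energy, "co2eq_g": co2eq}

print(parse_report(REPORT))  # {'energy_kwh': 3.8e-05, 'co2eq_g': 0.00313}
```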
Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models
Abstract from Anthony et al. (2020)
The popularity of solving problems using deep learning (DL) has rapidly increased and with it the need for ever more powerful models. These models achieve impressive results across a wide variety of tasks such as gameplay, where AlphaStar reached the highest rank in the strategy game Starcraft II (Vinyals et al., 2019) and Agent57 surpassed human performance in all 57 Atari 2600 games (Badia et al., 2020). This comes at the cost of training the model for thousands of hours on specialized hardware accelerators such as graphics processing units (GPUs). From 2012 to 2018 the compute needed for DL grew 300000-fold (Amodei & Hernandez, 2018). This immense growth in required compute has a high energy demand, which in turn increases the demand for energy production. In 2010 energy production was responsible for approximately 35% of total anthropogenic greenhouse gas (GHG) emissions (Bruckner et al., 2014). Should this exponential trend in DL compute continue then machine learning (ML) may become a significant contributor to climate change.