Reducing Carbon Footprint of Machine Learning Through Model Compression and Pruning

May, Stow and Stewart, Ashley Ajumoke (2025) Reducing Carbon Footprint of Machine Learning Through Model Compression and Pruning. International Journal of Innovative Science and Research Technology, 10(8): 25aug970, pp. 1479-1503. ISSN 2456-2165

Abstract

The exponential growth in machine learning model complexity has led to substantial increases in computational requirements and associated carbon emissions, raising concerns about the environmental sustainability of artificial intelligence systems. While previous research has primarily focused on neural network compression for GPU-accelerated environments, the environmental impact of classical machine learning algorithms deployed on CPU infrastructure remains underexplored. This study investigates the application of pruning and aggressive pruning techniques to Random Forest and Gradient Boosting classifiers, evaluating their effectiveness in reducing carbon emissions while maintaining acceptable predictive performance. The research employs structural compression methods, including tree pruning and estimator reduction, across three UCI benchmark datasets (Adult Income, Wine Quality, Heart Disease) with varying size and class-distribution characteristics. Comprehensive evaluation encompasses performance metrics, computational efficiency, and lifecycle carbon footprint analysis. Results demonstrate that combined pruning achieves a 97.6% reduction in carbon emissions while maintaining 94.5% of baseline accuracy. Notably, compressed Random Forest models exhibited improved F1 scores on imbalanced datasets, with up to 137% improvement on Wine Quality data, suggesting that compression serves as implicit regularization. Model size reductions reached 54%, with inference time improvements of 38%. These findings establish that aggressive compression of tree-based ensembles can simultaneously address environmental concerns and computational constraints without prohibitive performance degradation, making sustainable machine learning accessible for resource-constrained deployments.
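The structural compression the abstract describes, estimator reduction on a tree ensemble, can be sketched in a few lines. This is an illustrative pure-Python toy (not the authors' implementation; the stump definitions and function names here are hypothetical): a majority-vote ensemble of depth-1 decision stumps is "compressed" by keeping only a subset of its estimators, which shrinks model size and inference cost roughly in proportion to the estimators removed.

```python
# Toy sketch of estimator reduction (NOT the paper's code):
# an ensemble of depth-1 decision stumps with majority voting.

def stump(feature_idx, threshold):
    """Return a depth-1 tree: predicts 1 if x[feature_idx] > threshold, else 0."""
    return lambda x: 1 if x[feature_idx] > threshold else 0

def predict(ensemble, x):
    """Majority vote over all stumps in the ensemble."""
    votes = sum(s(x) for s in ensemble)
    return 1 if votes * 2 >= len(ensemble) else 0

def reduce_estimators(ensemble, keep):
    """Structural compression: keep only the first `keep` estimators.
    Model size and per-inference work drop proportionally."""
    return ensemble[:keep]

# A 4-stump "full" model and a 2-stump compressed model.
full = [stump(0, 0.5), stump(1, 0.3), stump(0, 0.7), stump(1, 0.9)]
small = reduce_estimators(full, 2)

x = [0.6, 0.4]
print(predict(full, x), predict(small, x))  # both vote class 1 here
```

In a real scikit-learn pipeline the same idea would correspond to lowering `n_estimators` (and `max_depth` for tree pruning) on a `RandomForestClassifier` or `GradientBoostingClassifier`, then re-measuring accuracy, model size, and emissions.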

IJISRT25AUG970.pdf - Published Version