Abstract: With the exponential growth of computational power and the availability of large-scale datasets in recent years, remarkable advancements have been made in the field of artificial intelligence (AI), leading to complex models and innovative applications. However, these models consume an unprecedented amount of energy, contributing to greenhouse gas emissions and a growing carbon footprint in the AI industry. In response, the concept of green AI has emerged, prioritizing energy efficiency and sustainability alongside accuracy and related measures. To this end, data-centric approaches offer a promising means of reducing the energy consumption of AI algorithms. This article presents a comprehensive overview of data-centric technologies and their impact on the energy efficiency of AI algorithms. Specifically, it focuses on methods that use training data efficiently to improve the energy efficiency of AI algorithms. We have identified multiple data-centric approaches, such as active learning, knowledge transfer/sharing, dataset distillation, data augmentation, and curriculum learning, that can contribute to the development of environmentally friendly implementations of machine learning algorithms. Finally, the practical applications of these approaches are highlighted, and the challenges and future directions in the field are discussed.
External IDs: dblp:journals/tai/SalehiS24