Researchers from the University of Michigan have created an open-source optimization framework called Zeus that addresses the energy consumption issue in deep learning models. As the trend of using larger models with more parameters grows, the demand for energy to train these models is also increasing. Zeus seeks to solve this issue by identifying the optimal balance between the consumption of energy and training speed during the training process without requiring any hardware changes or new infrastructure.
Zeus accomplishes this by using two software knobs: the GPU power limit and the batch size parameter of the deep learning model. The GPU power limit caps how much power the GPU may draw, and the batch size determines how many training samples are processed before the model's weights are updated. By tuning these two parameters in real time, Zeus seeks to minimize energy usage while keeping the impact on training time as small as possible.
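Conceptually, this two-knob tuning can be framed as minimizing a weighted cost that blends energy and time. The sketch below illustrates the idea with a toy grid search; the profile numbers, the MAX_POWER constant, and the helper names are hypothetical illustrations, not Zeus's actual API.

```python
# Toy sketch of a Zeus-style two-knob search (all numbers are made up).
# Each (power_limit, batch_size) configuration is scored with a weighted
# cost that trades off energy against training time:
#   cost = eta * energy + (1 - eta) * MAX_POWER * time
# where eta in [0, 1] expresses how strongly the user favors energy savings.

# Hypothetical measured profiles:
# (power_limit_W, batch_size) -> (energy_J, time_s) for one training epoch
profiles = {
    (300, 32): (9000, 30),
    (300, 64): (8400, 28),
    (250, 32): (7500, 33),
    (250, 64): (7000, 31),
    (200, 64): (6600, 36),
}

MAX_POWER = 300  # assumed board power cap in watts


def cost(energy, time, eta):
    """Weighted energy/time cost for one configuration."""
    return eta * energy + (1 - eta) * MAX_POWER * time


def best_config(eta):
    """Return the (power_limit, batch_size) pair minimizing the cost."""
    return min(profiles, key=lambda key: cost(*profiles[key], eta))


print(best_config(1.0))  # eta=1: pure energy minimization
print(best_config(0.0))  # eta=0: pure time minimization
print(best_config(0.5))  # a balanced tradeoff
```

With eta at the extremes the search collapses to "lowest energy" or "fastest run"; intermediate values pick configurations that save energy at a modest cost in training time, which is the balance Zeus aims for.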
Zeus is designed to work with a variety of machine learning tasks and GPUs and can be used without changes to the hardware or infrastructure. Furthermore, the research team has also developed complementary software called Chase, which can reduce the carbon footprint of DNN training by prioritizing speed when low-carbon energy is available and efficiency during peak times.
The research team aims to develop realistic solutions that reduce the carbon footprint of DNN training while respecting real-world constraints, such as large dataset sizes and data regulations. Deferring training jobs to greener time frames is not always an option when models must be trained on the most up-to-date data, but Zeus and Chase can still deliver significant energy savings without sacrificing accuracy.
The development of Zeus and complementary software like Chase is a crucial step in addressing the energy consumption of deep learning models. By reducing the energy demand of training, the researchers can help mitigate the environmental impact of artificial intelligence and promote sustainable practices in the field. The research team has demonstrated that these energy savings come without sacrificing model accuracy and with minimal impact on training time.
In summary, Zeus is an open-source optimization framework that aims to reduce the energy consumption of deep learning models by identifying the optimal balance between energy consumption and training speed. By adjusting the GPU power limit and batch size parameter, Zeus minimizes energy usage without impacting accuracy. Zeus can be used with a variety of machine learning tasks and GPUs, and the complementary software Chase can reduce the carbon footprint of DNN training. The development of Zeus and Chase promotes sustainable practices in the field of artificial intelligence and mitigates its impact on the environment.
Check out Study 1, Study 2, and the Github.
The post Cutting Carbon Footprint in AI Training by Optimization appeared first on MarkTechPost.