Zeus, a new open-source optimization framework developed at the University of Michigan, can reduce the energy demands of deep learning models by up to 75% without any new hardware and with only a minor impact on training time. The framework profiles deep learning models as they train, pinpointing the best tradeoff between energy consumption and training speed. Mainstream uses for deep learning models have exploded over the past three years, ranging from image-generation models and expressive chatbots to the recommender systems powering TikTok and Amazon.
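To make the energy-versus-speed tradeoff concrete, the sketch below scores candidate GPU power limits with a blended cost of the form η·energy + (1 − η)·max_power·time, the kind of metric the Zeus paper describes. This is an illustration only, not Zeus's actual API; the power limits, energy, and time figures are invented for the example.

```python
# Illustrative sketch of an energy/time tradeoff (NOT Zeus's real API).
# A weight eta in [0, 1] balances energy savings against training slowdown:
# eta = 1 cares only about energy, eta = 0 only about time.

def tradeoff_cost(energy_j: float, time_s: float,
                  eta: float, max_power_w: float) -> float:
    """Blend energy use and training time into one scalar cost."""
    return eta * energy_j + (1 - eta) * max_power_w * time_s

# Hypothetical profiling results for three GPU power limits:
# (power limit in watts, energy in joules, training time in seconds).
candidates = [
    (300, 90_000, 400),  # fastest, most energy-hungry
    (250, 70_000, 430),  # moderate slowdown, large energy savings
    (200, 60_000, 520),  # lowest energy, noticeably slower
]

# With eta = 0.5, pick the power limit that minimizes the blended cost.
best = min(candidates,
           key=lambda c: tradeoff_cost(c[1], c[2], eta=0.5, max_power_w=300))
print(best[0])  # -> 250: the middle setting wins the tradeoff here
```

Lowering the power limit saves energy but stretches training time, so the minimum of the blended cost typically sits at an intermediate setting rather than at either extreme.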
