L-Mul: Energy-Efficient AI Training Approach
Finally, trimming AI energy consumption and its carbon footprint may soon be a reality.
This was originally published on Databracket’s Substack page. If you don’t have a Medium subscription, please read it there.
Large language models have shown us that, given enough data and sound ML algorithms, human-like text can be generated.
However, the cost of training and inference is steep from both energy-consumption and data-collection standpoints.
Looking at how the big players in the LLM field have accumulated the data to train their proprietary solutions, it is clear that user privacy is not a priority.
That doesn’t concern or scare me, but the energy consumption and carbon emissions send a chill down my spine.
Did you know that in early 2023, the electricity required to power ChatGPT was around 564 MWh? That is enough energy to sustain a small town of nearly 18,000 families.
Other big players, such as Google Gemini and Mistral, consume energy at a comparable scale.
It is now more important than ever to optimize model training for efficiency.