The Generative AI Race Has a Dirty Secret
The race to build high-performance, AI-powered search engines is likely to require a dramatic rise in computing power, and with it a massive increase in the amount of energy that tech companies require and the amount of carbon they emit.
While neither OpenAI nor Google has said what the computing cost of their products is, third-party analysis by researchers estimates that the training of GPT-3, which ChatGPT is partly based on, consumed 1,287 MWh and led to emissions of more than 550 tons of carbon dioxide equivalent—the same amount as a single person taking 550 round trips between New York and San Francisco.
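A quick back-of-the-envelope check of the quoted figures, sketched in Python. The derived quantities (implied grid carbon intensity, implied footprint per flight) are my own arithmetic on the article's numbers, not figures from the source:

```python
# Figures quoted in the article's third-party estimate for GPT-3 training.
training_energy_mwh = 1287   # estimated energy to train GPT-3, in MWh
training_emissions_t = 550   # estimated emissions, in tonnes of CO2e
roundtrips = 550             # NY–SF round trips said to have the same footprint

# Implied carbon intensity of the electricity used (tonnes CO2e per MWh).
# Derived, not from the source.
intensity = training_emissions_t / training_energy_mwh

# Implied footprint of one person's NY–SF round trip (tonnes CO2e).
# Derived, not from the source.
per_roundtrip = training_emissions_t / roundtrips

print(f"{intensity:.3f} tCO2e/MWh")       # roughly 0.427
print(f"{per_roundtrip:.1f} tCO2e/trip")  # exactly 1.0 by construction
```

The ~0.43 tCO2e/MWh figure is in the ballpark of a fossil-heavy grid mix, which is consistent with the estimate being plausible rather than a precise measurement.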
“We have to work on how to reduce the inference time required for such big models,” says Nafise Sadat Moosavi, a lecturer in natural language processing at the University of Sheffield, who works on sustainability in natural language processing. “Now is a good time to focus on the efficiency aspect.”
^ Or, you know. Just avoid using them.
1. Elsewhere
1.1. In my garden
Notes that link to this note (AKA backlinks).