Generative artificial intelligence is now impossible to ignore online. Every time you run a Google search, an AI-generated summary may randomly appear at the top of the results. Or, while browsing Facebook, you may be nudged toward Meta's AI tools, marked by their ever-present sparkle emoji.

This push to add AI to as many online interactions as possible dates back to OpenAI’s boundary-pushing release of ChatGPT in late 2022. Silicon Valley quickly fell in love with generative AI, and nearly two years later, AI tools powered by large language models are permeating online user experiences.

One unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are far more resource intensive, ushering in an era of internet hyper-consumption defined by new kinds of computing that demand huge amounts of electricity and water to build and operate.

“On the back end, the algorithms required to run generative AI models are fundamentally very different from those behind a traditional Google search or email,” said Sajjad Moazeni, a computer engineering researcher at the University of Washington. “Those basic services were very lightweight in terms of the amount of data that needed to be passed between processors.” By comparison, he estimates, generative AI applications are around 100 to 1,000 times more computationally intensive.

The energy demands of training and deploying the technology are no longer generative AI’s dirty secret: experts predicted last year that data centers would see huge increases in energy demand as companies build more and more AI applications. As if on cue, Google recently stopped considering itself carbon neutral, and Microsoft may trample its sustainability goals underfoot in the ongoing race to build the biggest and best AI tools.

“These data centers are essentially powered by how much computation they do, so their carbon footprint and energy consumption are proportional to the amount of computation they perform,” said Junchen Jiang, a networked systems researcher at the University of Chicago. The larger an AI model gets, the more computation it requires, and these cutting-edge models are becoming extremely large.

Google’s total energy consumption doubled between 2019 and 2023, though company spokesperson Corinna Standiford pushed back on the idea that Google’s energy consumption surged during the AI race. “Emissions from our suppliers are extremely hard to reduce, and they make up 75% of our footprint,” she said in an email. The suppliers Google refers to include the makers of servers, networking equipment, and other technical infrastructure for data centers, an energy-intensive manufacturing process required to create the physical parts of cutting-edge AI models.


