AI Doesn’t Float! Data centers are anchored in physical infrastructure with real-world trade-offs. In our collective pursuit of AI acceleration, one question often goes unasked: where will all the power come from, and who will bear the cost of delivering it?

While headlines warn that data center emissions may hit 3–4% of global CO₂ by 2030, the IEA (https://lnkd.in/eKe89HjZ) puts the figure closer to 1%. So why the wide gap? Because location, timing, and grid readiness matter more than global averages. In places like Dublin, data centers already consume nearly 20% of available power. In the U.S., data centers may surpass the electricity demand of all domestic heavy industry combined by 2030. This is a planning crisis already underway.

Through my recent advisory work, I’ve seen how forward-thinking infrastructure planning, decarbonization strategy, and locational modeling can either unlock or bottleneck entire regions. Data center clusters need grid flexibility, clean firm power, and local engagement, yet too often we see a race to build without a strategy to share or sustain.

The IEA’s new report (“Energy and AI”, https://lnkd.in/eKe89HjZ) lays out three imperatives:
1. Diversify clean power supply (yes, including geothermal, SMRs, and batteries).
2. Accelerate grid build-out, not just generation.
3. Foster collaboration between tech and energy planners upstream, not after permits are filed.

A sustainable AI future is about cross-stakeholder coordination, not solely carbon emissions. Cities, utilities, and communities deserve a seat at the table before another megawatt is claimed.

#EnergyTransition #DataCenters #GridPlanning #SustainableAI #IEA #Infrastructure #ClimateTech #Resilience
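To see why an estimate near 1% is plausible, here is a rough back-of-envelope sketch. The demand figure, grid carbon intensity, and global emissions total are ballpark assumptions for illustration, not numbers taken from the IEA report linked above.

```python
# Back-of-envelope check on the ~1% vs 3-4% debate in the post above.
# All constants are rough, commonly cited ballpark figures, not IEA outputs.

DC_DEMAND_TWH_2030 = 945        # assumed 2030 data center electricity demand, TWh/yr
GRID_INTENSITY_G_PER_KWH = 480  # rough global average grid carbon intensity, gCO2/kWh
GLOBAL_CO2_MT = 37_000          # approximate global energy-related CO2, Mt/yr

dc_emissions_mt = DC_DEMAND_TWH_2030 * 1e9 * GRID_INTENSITY_G_PER_KWH / 1e12  # Mt CO2
share = dc_emissions_mt / GLOBAL_CO2_MT

print(f"Data center emissions: ~{dc_emissions_mt:.0f} Mt CO2/yr")
print(f"Share of global energy-related CO2: ~{share:.1%}")
# ~450 Mt and ~1.2% at today's average grid mix; the 3-4% headlines assume far
# faster demand growth and a dirtier power mix than this sketch does.
```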
AI Energy Consumption Insights
Explore top LinkedIn content from expert professionals.
-
Can you believe it has already been a year since #ChatGPT launched? Since its emergence, #artificialintelligence has captured global dialogue, from its potential #workforce impact to implications for education and art. But we’re missing a critical angle: AI’s #carbonfootprint.

Examining ChatGPT’s usage can help us gain insight into its environmental impact. As of February 2024, the platform’s 100 million+ weekly active users are each posing an average of 10 queries… That’s ONE BILLION queries per week, each generating an estimated 4.32 g of CO2. Plugging these estimates into an emissions calculator, I found that EVERY WEEK the platform produces emissions roughly equivalent to 10,800 roundtrip flights between San Francisco and New York City (enough to melt 523,000 square feet of Arctic sea ice).

Scientists have already warned the Arctic could be free of sea ice in summer as soon as the 2030s. And something tells me they weren’t factoring ChatGPT and other energy-demanding AI models into those projections. Further, this is based on estimated *current* ChatGPT use, which will only grow as society gets accustomed to the tool and as AI becomes more a part of everyday life. Some analyses indicate that by 2027, ChatGPT’s electricity consumption could rival that of entire nations like Sweden, Argentina, or the Netherlands.

OpenAI is taking precautions, however, such as using Microsoft’s carbon-neutral #Azure cloud system and working to develop more #energyefficient chips, so it could certainly be worse. But it could also be better. So let’s hold OpenAI accountable to mitigate the damage before it gets out of control. Join me in letting them know the public is watching their environmental impact and that they must responsibly manage the platform’s rapidly growing carbon footprint.

(Pictured: Microsoft GPU server network to power OpenAI's supercomputer language model. Image courtesy of Microsoft/Bloomberg.)
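For readers who want to retrace the arithmetic, the short sketch below reproduces the post's weekly-emissions estimate and the per-flight factor it implies. The 4.32 g-per-query figure and the flight comparison are the post's own assumptions, not measured values.

```python
# Reproducing the arithmetic in the post above.

QUERIES_PER_WEEK = 100_000_000 * 10        # 100M weekly users x 10 queries each
G_CO2_PER_QUERY = 4.32                     # post's estimate, grams CO2 per query

weekly_t_co2 = QUERIES_PER_WEEK * G_CO2_PER_QUERY / 1e6   # grams -> tonnes
print(f"~{weekly_t_co2:,.0f} t CO2 per week")             # ~4,320 t

# The post equates this to 10,800 SF-NYC round trips, which implies:
implied_kg_per_roundtrip = weekly_t_co2 * 1000 / 10_800
print(f"Implied ~{implied_kg_per_roundtrip:.0f} kg CO2 per round-trip passenger")  # ~400 kg
```

The implied ~400 kg per round-trip passenger is in the range of published flight-emissions estimates, so the weekly total and the flight comparison are at least internally consistent.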
-
As NVIDIA Ascends with Mightier GPUs, Who Holds the Reins on Data Center Efficiency?

As NVIDIA continues to push the boundaries with more powerful GPUs, the demand for extensive data center infrastructure skyrockets. But amidst this surge in computational power, the critical dialogue on data center efficiency seems overshadowed.

In the era of digital transformation, managing energy efficiency in data centers has become a critical challenge. The use of state-of-the-art machine learning models, particularly neural networks, is revolutionizing how we optimize these complex systems. By integrating AI to analyze a variety of key operational metrics, data centers can achieve unprecedented levels of energy efficiency and operational excellence.

Consider the power of AI in predicting Power Usage Effectiveness (PUE), a vital measure of a data center's energy efficiency. Neural networks utilize real-time data from multiple sources, including:

📌 Total server IT load and total Campus Core Network Room (CCNR) IT load, which reflect the direct energy consumption of critical data processing equipment.
📌 Operational metrics of the cooling infrastructure, such as the total number of process water pumps (PWP) running, their variable frequency drive (VFD) speeds, the condenser water pumps (CWP), and the cooling towers in operation. Each of these components plays a vital role in the cooling efficiency of the center.
📌 Temperature setpoints, like the mean cooling tower leaving water temperature (LWT) and the mean chilled water injection pump setpoint temperature, which directly influence the cooling system's response to internal heat loads.

By analyzing the interactions and efficiencies of these components, Plutoshift AI's models provide actionable insights that lead to smarter operational decisions, reduce energy consumption, and lower operational costs. This approach not only helps in achieving sustainability goals but also enhances the reliability and performance of data centers.

As we move forward, the integration of advanced #AI into data center operations is not just an option but a necessity. Let's embrace these technological advancements to foster innovation and sustainability in our industries!

#AI #DataCenters #Sustainability #MachineLearning #Innovation #EnergyEfficiency #DataCenterEvolution #NextGenDataCenters #EfficiencyFirst #ResponsibleAI
Plutoshift AI Iron Mountain NVIDIA Top Corner Capital
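As a concrete illustration of the idea (not Plutoshift's or any vendor's actual model), the sketch below fits a small scikit-learn neural network to synthetic data whose features mirror the signals listed above, then predicts PUE for one operating point.

```python
# Illustrative only: a tiny neural-net PUE regressor trained on synthetic data,
# with feature columns named after the signals in the post (IT load, pump counts,
# VFD speeds, setpoints). The "ground truth" relationship is invented.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 2_000
X = np.column_stack([
    rng.uniform(5, 25, n),      # total server IT load (MW)
    rng.uniform(0.5, 2, n),     # CCNR IT load (MW)
    rng.integers(2, 8, n),      # process water pumps (PWP) running
    rng.uniform(30, 60, n),     # mean PWP VFD speed (Hz)
    rng.integers(2, 6, n),      # condenser water pumps (CWP) running
    rng.integers(2, 10, n),     # cooling towers in operation
    rng.uniform(18, 28, n),     # mean tower leaving water temperature (C)
    rng.uniform(8, 14, n),      # chilled water injection setpoint (C)
])
# Synthetic target: overhead falls with IT load and rises with cooling effort.
pue = 1.1 + 2.0 / X[:, 0] + 0.01 * X[:, 2] + 0.005 * X[:, 5] + rng.normal(0, 0.01, n)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X, pue)
print("Predicted PUE for one operating point:", round(float(model.predict(X[:1])[0]), 3))
```

A production system would train on historical telemetry instead of synthetic data and use the fitted model to search setpoints that minimize predicted PUE, but the structure is the same: operational features in, a single efficiency metric out.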
-
Energy is the key. Over the weekend I got a lot of messages about articles and stories talking about the links between energy-hungry AI models and the path to net zero. On one hand, the computational power required to train and run AI models is soaring, placing increasing demands on our energy grids. On the other, AI itself holds remarkable potential to drive innovations in climate technology, potentially aiding our quest for net zero emissions. This dichotomy presents both a formidable challenge and a beacon of hope in our journey toward a sustainable future.

Advanced AI models, particularly those involved in machine learning and deep learning, require substantial computational resources. Training a single AI model can consume as much electricity as several hundred homes use in a month. As AI becomes more integrated into our daily lives, from autonomous vehicles to personalized medicine, the demand on energy grids will inevitably rise. This surge complicates our path to achieving net zero emissions, as increased energy demand generally translates to higher carbon footprints unless met entirely by renewable sources.

However, the same technology that poses such a challenge also harbors solutions to some of the most pressing environmental issues. AI can optimize energy consumption in industries and homes, create more efficient renewable energy systems, and improve waste management practices. For example, AI algorithms can predict energy demand more accurately, enabling smarter grid management and reducing reliance on fossil fuel-powered peaker plants. In renewable energy, AI can enhance the efficiency of solar panels and wind turbines by optimizing their placement and operation based on weather predictions. Moreover, AI-driven innovations in materials science are paving the way for more efficient batteries and renewable energy storage solutions, addressing one of the significant hurdles in the transition to green energy.

AI also plays a crucial role in monitoring and combating climate change. Through the analysis of satellite imagery and environmental data, AI can track deforestation, ocean health, and the melting of polar ice caps with unprecedented precision and speed. This capability not only informs better policy and conservation efforts but also helps in quantifying the impact of climate action, making it a potent tool in the global effort to mitigate climate change.

The dual role of AI as both a contributor to and a solver of the energy and climate crises underscores the need for a balanced approach in its development and deployment. By prioritizing energy-efficient AI models and leveraging AI to accelerate the transition to renewable energy, we can harness the power of AI to move closer to our net zero goals, turning a formidable challenge into a formidable ally in the fight against climate change.

At Nadia Partners we are building companies on both sides to help achieve both goals. Anybody who can help us, please tag or share!
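A minimal sketch of the demand-prediction idea mentioned above, using synthetic hourly load and temperature data. A real utility forecaster would add calendar features, weather forecasts, and years of history; this only shows the shape of the approach.

```python
# Forecast next-hour electricity load from lagged load and temperature.
# The data here is synthetic and the model choice is illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
hours = np.arange(24 * 365)
temp = 15 + 10 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 2, hours.size)
load = (500 + 30 * np.sin(2 * np.pi * hours / 24)
        + 5 * np.maximum(temp - 20, 0) + rng.normal(0, 5, hours.size))

# Features for hour t: load at t-1, load at t-24, temperature at t.
X = np.column_stack([load[23:-1], load[:-24], temp[24:]])
y = load[24:]

model = GradientBoostingRegressor().fit(X[:-168], y[:-168])   # hold out the last week
mae = np.mean(np.abs(model.predict(X[-168:]) - y[-168:]))
print(f"Last-week mean absolute error: {mae:.1f} MW")
```

Better hour-ahead forecasts are what let grid operators commit less fossil-fired reserve capacity, which is the peaker-plant point the post makes.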
-
AI’s future depends on energy systems we haven’t built yet.

I spent the first half of my career in the #energyefficiency and #renewableenergy field, working with electric utilities and in massive facilities including manufacturing plants, government sites, retail, hospitals, and resorts. In that work, my teams and I designed and installed energy systems for data centers, solar on rooftops and utility-scale farms, worked in geothermal power plants, built the world’s first utility-owned on-site co-generation system, and delivered the first fuel-cell energy solutions for the Department of Defense. The second half of my career has focused on IT and digital transformation.

Today at AI Squared, I am seeing AI’s energy appetite firsthand. It is real, and it is growing faster than our infrastructure can handle. By 2030, data center electricity demand is projected to hit 945 terawatt-hours annually, up from 415 TWh today. That's almost as much electricity as the entire country of Japan uses in a year. AI workloads are the primary driver, with demand set to increase by 160 percent in just a few years. With quantum computing on the horizon, the ceiling could rise even higher.

Yet only about 30 percent of global electricity today comes from clean sources. Most of the power fueling AI still comes from natural gas plants, not solar or wind.

I had the opportunity to speak with Nicole Willing at Techopedia about this growing tension between AI innovation and energy reality. You can check out the full article here:
🔗 https://lnkd.in/eS7ERHH3

#AI #EnergyTransition #NetZero #SustainableAI #DataCenters #AISquared #InfrastructureMatters
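A quick bit of arithmetic on those demand figures, treating a roughly six-year horizon and today's ~30 percent clean share as given. This is an illustration of scale, not a grid-dispatch model.

```python
# Growth implied by the 415 TWh -> 945 TWh projection quoted in the post,
# and a naive clean/non-clean split at today's ~30% clean mix.
today_twh, by_2030_twh, years = 415, 945, 6   # horizon length is an assumption

growth = by_2030_twh / today_twh - 1
cagr = (by_2030_twh / today_twh) ** (1 / years) - 1
print(f"Total growth: {growth:.0%}, implied ~{cagr:.1%} per year")

clean_share = 0.30
print(f"At today's mix, ~{by_2030_twh * (1 - clean_share):.0f} TWh of the 2030 "
      "demand would still come from non-clean sources")
```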
-
Nobody talks about this, but AI is actually destroying the environment. (A single AI model can emit more carbon than 5 cars do over their lifetimes.)

The amount of energy and CO2 released to train AI models is concerning. The GPUs used for AI training are significantly more power-hungry than regular CPUs. For instance, training the GPT-3 model consumed as much energy as powering 126 single-family homes for a year. The more I get into AI, the more I realize the environmental costs involved. Another example? The University of Massachusetts pointed out that it takes more than 626,000 pounds of CO2 to train a generic neural network.

Good news is: there are solutions. Huge companies like Microsoft, IBM, and Google are taking steps to mitigate AI's environmental impact.

→ Microsoft: aims to power all data centers with 100% renewable energy and is working on ways to return energy to the grid during high demand.
→ IBM: is focusing on "recycling" AI models and making them more efficient over time rather than training new ones from scratch.
→ Google Cloud: is optimizing data center operations by using liquid cooling and ensuring high utilization rates to minimize energy waste.

I love AI, but we can’t pretend these issues don’t exist. I’m glad to see that big companies are taking steps toward mitigating the risks, but there’s still a long way to go. A sustainable future isn’t possible without sustainable AI.

P.S. I have a whole article on the environmental impacts of AI published in Forbes, link in the comments.
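The post's comparisons can be roughly cross-checked with public ballpark figures. The average-household consumption and lifetime-car emissions used below are my assumptions for illustration, not numbers from the post.

```python
# Rough cross-check of the comparisons above.
# Assumed baselines: ~10,600 kWh/yr for an average US home, ~57 t CO2 over an
# average car's lifetime including fuel (the basis of the "5 cars" comparison).
LBS_TO_T = 0.000453592

training_co2_t = 626_000 * LBS_TO_T
print(f"626,000 lbs CO2 ~= {training_co2_t:.0f} t")                 # ~284 t
print(f"... ~= {training_co2_t / 57:.1f} average-car lifetimes")    # ~5 cars, as claimed

homes_equiv_mwh = 126 * 10_600 / 1000
print(f"126 homes for a year ~= {homes_equiv_mwh:.0f} MWh "
      "(close to the ~1,287 MWh often quoted for GPT-3 training)")
```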
-
As we enter a global AI arms race, the data economy is ‘returning’ from distributed processing to an industrial-like era of large ‘factories’ with massive local resource footprints. We’ve all heard the statistic about ChatGPT using a bottle of #water per conversation, but these filings are bringing new insights to light about the very localized interplay between AI and water/energy.

It turns out that the environmental impact of developing AI is highly contingent on the placement of relatively few massive compute facilities, as the training of these products generally needs to be localized due to the massive flux of data. This highlights the need to better understand #datacenter cooling practices and alternatives, as well as the seasonality of cooling demand in the face of a changing climate. If the statistics below are correct for Iowa, we can only imagine the summertime cooling demand for ‘AI Factories’ in desert latitudes (see the quote below on Las Vegas).

Fortunately, there ARE alternatives. We can integrate data centers with local water and wastewater infrastructure to leverage water for its value as a heat sink, but without evaporative losses. Since Ufuk Erdal, Ph.D., P.E., and I started presenting on this issue roughly two years ago, awareness of the problem has grown tremendously, but relatively few of us are discussing novel cooling solutions. To learn more, check out our upcoming presentation at #Weftec2023…

“Google reported a 20% growth in water use in the same period, which Ren also largely attributes to its AI work. Google’s spike wasn’t uniform -- it was steady in Oregon where its water use has attracted public attention, while doubling outside Las Vegas. It was also thirsty in Iowa, drawing more potable water to its Council Bluffs data centers than anywhere else… In July 2022, the month before OpenAI says it completed its training of GPT-4, Microsoft pumped in about 11.5 million gallons of water to its cluster of Iowa data centers, according to the West Des Moines Water Works. That amounted to about 6% of all the water used in the district, which also supplies drinking water to the city’s residents. In 2022, a document from the West Des Moines Water Works said it and the city government “will only consider future data center projects” from Microsoft if those projects can “demonstrate and implement technology to significantly reduce peak water usage from the current levels” to preserve the water supply for residential and other commercial needs.”
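One way to reason about cooling water at this scale is the Green Grid's Water Usage Effectiveness (WUE) metric, litres of water consumed per kWh of IT energy. The sketch below applies an assumed evaporative-cooling WUE of 1.8 L/kWh to the Iowa figure quoted above; the WUE value is my assumption, not something reported in the filings.

```python
# Rough illustration of how consumptive cooling water scales with IT load.
GAL_TO_L = 3.785
WUE_L_PER_KWH = 1.8          # assumed evaporative-cooling water usage effectiveness
HOURS_PER_MONTH = 730

monthly_gallons = 11_500_000  # Iowa figure quoted for July 2022
it_energy_kwh = monthly_gallons * GAL_TO_L / WUE_L_PER_KWH
avg_it_power_mw = it_energy_kwh / HOURS_PER_MONTH / 1000

print(f"{monthly_gallons:,} gal/month at {WUE_L_PER_KWH} L/kWh implies "
      f"~{it_energy_kwh / 1e6:.0f} GWh of IT energy, ~{avg_it_power_mw:.0f} MW average load")
# Non-evaporative designs (e.g. closed-loop heat exchange with local water or
# wastewater infrastructure, as the post suggests) trade this consumptive use
# for higher electricity use and PUE.
```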
-
Artificial intelligence (AI) is revolutionizing the tech industry by enabling more sustainable system designs for complex applications. Through generative design, AI can explore a multitude of design alternatives to find the most efficient and environmentally friendly options. This not only enhances the performance of systems but also ensures they are built with sustainability in mind from the outset. AI’s impact on sustainability extends to reducing energy consumption, particularly in data centers. By optimizing operations such as cooling systems, AI has demonstrated the potential to significantly lower energy usage and carbon emissions. This optimization is crucial as the tech industry seeks to mitigate its environmental footprint. However, the development of AI models themselves can be resource-intensive. To address this, the industry is moving towards more targeted, domain-specific AI models that require less data and energy to train. This shift is essential for creating AI solutions that are not only powerful but also sustainable, paving the way for a future where technology advances hand in hand with environmental responsibility.
-
At a recent conference, someone asked me about AI’s environmental impact, a question we all need to sit with more seriously.

Training large AI models consumes enormous resources. GPT-3 alone used over 1,200 MWh of electricity and approximately 700,000 liters of water. As demand for compute grows, so does AI’s energy footprint, which is expected to reach 20 to 50 percent of global data center usage by 2025. This is not sustainable unless we act deliberately.

Fortunately, researchers and industry leaders are innovating: optimizing architectures, capping compute, and powering data centers with renewables. Small changes in model design and infrastructure can reduce energy usage by up to 90 percent without sacrificing performance.

“AI’s true power lies in advancing both innovation and sustainability. Progress that drains the planet isn’t progress at all.” - Asha Saxena

#SustainableAI #ResponsibleTech #LeadershipInAI #AIandClimate #TheAIFactor
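For context, training energy is often estimated with a simple GPU-count x power x runtime x PUE formula. The sketch below uses made-up inputs chosen only so the total lands near the ~1,200 MWh cited above; the actual GPT-3 hardware mix and runtime are not given in the post.

```python
# Common back-of-envelope for training energy; every input here is an assumption.
def training_energy_mwh(gpus, avg_power_kw_per_gpu, hours, pue):
    """Total facility energy (MWh) for a training run."""
    return gpus * avg_power_kw_per_gpu * hours * pue / 1000

baseline = training_energy_mwh(gpus=10_000, avg_power_kw_per_gpu=0.30, hours=360, pue=1.1)
print(f"Baseline estimate: ~{baseline:.0f} MWh")          # ~1,190 MWh

# The "up to 90% reduction" claim, applied to the same hypothetical run:
print(f"With a 90% efficiency gain: ~{baseline * 0.1:.0f} MWh")
```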
-
Research continues to show the high environmental cost of GenAI tool development and deployment. We’ve created this classroom guide to help educators get a better understanding and engage their students in thoughtful discussions on the potential impacts of GenAI on the planet.

Researchers estimate that creating ChatGPT used 1,287 megawatt-hours of electricity and produced carbon emissions equivalent to 123 gas-powered vehicles driven for one year. Its development created substantial heat that required a significant amount of water to cool the data centers, and for every 5-50 prompts it requires about 16 oz of water. Generating an image can be especially energy-intensive, similar to fully charging your smartphone: creating 1,000 images with Stable Diffusion is responsible for as much CO2 as driving 4.1 miles in a gas-powered car. Some researchers estimate the carbon footprint of an AI prompt to be 4-5 times that of a normal search query. And the escalating use predicted by 2027 could mean AI servers will use as much electricity as a small country.

Check out the carousel for more, including discussion questions and further reading. Or download a PDF version for your classroom here: https://lnkd.in/eaCtnN3n

AI for Education #aiforeducation #aieducation #AI #GenAI #ChatGPT #environment #sustainability
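For classroom use, the guide's ratios can be turned into small worked numbers. The search-query energy and per-mile emissions baselines below are rough public figures I've assumed, not values from the guide itself.

```python
# Worked numbers for discussion, built from the ratios in the guide above.
SEARCH_WH = 0.3                             # assumed energy per ordinary search query
AI_MULTIPLIER = (4, 5)                      # guide: AI prompt is 4-5x a search query
print(f"One AI prompt: ~{SEARCH_WH * AI_MULTIPLIER[0]:.1f}-"
      f"{SEARCH_WH * AI_MULTIPLIER[1]:.1f} Wh")

G_CO2_PER_MILE = 400                        # assumed average gasoline car
per_image_g = 4.1 * G_CO2_PER_MILE / 1000   # guide: 1,000 images ~= 4.1 miles driven
print(f"One generated image: ~{per_image_g:.1f} g CO2")
```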