March 2026
Artificial intelligence has given us some of the most transformative technologies in human history. It diagnoses disease, writes code, translates languages, and powers systems we interact with dozens of times each day. But all of that intelligence needs somewhere to live and somewhere to run. And right now, those places are consuming energy at a rate that should concern anyone paying attention.
Global data centre electricity consumption reached approximately 415 terawatt-hours in 2024. That represents roughly 1.5 percent of all electricity generated worldwide. By 2030, the figure is expected to more than double to 945 terawatt-hours, comparable to the entire electricity consumption of Japan. In the United States, the picture is even more striking. AI-specific servers consumed an estimated 53 to 76 terawatt-hours in 2024 alone. Deloitte projects that by 2035, power demand from US AI data centres could grow more than thirtyfold, from 4 gigawatts to 123 gigawatts.
The question is not whether data centres will continue growing. They will. The real question is whether that growth will run on clean energy or further entrench dependence on fossil fuels. The answer carries enormous consequences, not just for climate commitments, but for the businesses operating at the frontier of AI.
By some projections, AI inference alone could account for up to 20 percent of data centre energy consumption by 2030. With roughly 60 percent of data centre power still sourced from fossil fuels, the implications are difficult to ignore.
The Scale of the Problem
Training a large language model such as GPT-3 required roughly 1,287 megawatt-hours of electricity and produced approximately 552 tons of carbon dioxide. That figure represents a single training run for a single model. Today’s frontier models are significantly larger and require many more training cycles. Add to that the ongoing energy cost of inference, required each time a user submits a query, and the numbers escalate quickly.
A generative AI training cluster consumes seven to eight times more energy than a standard computing workload. The hardware operates at higher temperatures, requires more intensive cooling, and demands stable, uninterrupted power. Conventional data centre infrastructure was not originally designed for this level of intensity.
Even companies with strong environmental track records are feeling the pressure. Google has matched 100 percent of its electricity use with renewable energy purchases since 2017. Yet in 2024, its data centre electricity consumption increased 27 percent year over year as AI workloads expanded. Microsoft, Meta, and Amazon face similar tensions between ambitious climate commitments and rapid infrastructure growth.
Financial markets have taken notice. The global green data centre market was valued at approximately 48 to 49 billion dollars in 2025. Analysts project it will reach between 155 and 235 billion dollars by 2030, growing at a compound annual rate of 19 to 28 percent. This is no longer a niche sustainability segment. It is fast becoming one of the largest infrastructure investment categories in the world.
What Makes a Data Centre Green
The term is often used loosely, but a genuinely green data centre addresses energy efficiency, carbon emissions, water consumption, and waste heat in a systematic way. Established metrics exist for measuring performance across each of these areas, and regulators are increasingly requiring public disclosure.
The Core Metrics
Power Usage Effectiveness, or PUE, remains the most widely recognised metric. It measures the ratio of total facility power to the power used by IT equipment. A PUE of 1.0 would indicate perfect efficiency. Traditional data centres typically operate between 1.8 and 2.2, meaning nearly as much energy is used for cooling and overhead as for computing itself. Best-in-class hyperscale facilities are achieving figures close to 1.1.

Water Usage Effectiveness, or WUE, measures water consumed per kilowatt-hour of IT energy. Traditional cooling systems can place significant strain on local water supplies, particularly in water-stressed regions. Singapore's updated certification standards now require a WUE of 2.0 litres per kilowatt-hour or better.

Carbon Usage Effectiveness, or CUE, measures total carbon dioxide emissions relative to IT energy consumption. A facility powered entirely by renewable energy achieves a CUE of zero. This metric has become central to environmental reporting and benchmarking.
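All three metrics are simple ratios that can be computed directly from metered facility data. The sketch below shows how; the function names and sample readings are illustrative, not drawn from any real facility or standard API.

```python
# Sketch: computing the three core metrics from metered facility data.
# The sample readings below are hypothetical, for illustration only.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_kwh

def wue(water_litres: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: litres of water per kWh of IT energy."""
    return water_litres / it_kwh

def cue(co2_kg: float, it_kwh: float) -> float:
    """Carbon Usage Effectiveness: kg of CO2 per kWh of IT energy."""
    return co2_kg / it_kwh

# One month of hypothetical metered readings.
it_energy = 1_000_000        # kWh consumed by IT equipment
facility_energy = 1_150_000  # kWh consumed by the whole facility
water_used = 1_800_000       # litres of cooling water
emissions = 260_000          # kg CO2 attributable to purchased electricity

print(f"PUE: {pue(facility_energy, it_energy):.2f}")   # 1.15, hyperscale territory
print(f"WUE: {wue(water_used, it_energy):.2f} L/kWh")  # 1.80, inside Singapore's 2.0 limit
print(f"CUE: {cue(emissions, it_energy):.2f} kg/kWh")
```

A facility with a dashboard already reporting these three numbers is also most of the way to the EU disclosure obligations discussed later in this article.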
Cooling: The Heart of the Challenge
Cooling has always been a defining engineering challenge in data centre design, and AI has intensified it. Modern GPU-dense racks can generate 40 to 100 kilowatts or more of heat, compared to 5 to 10 kilowatts for traditional server racks. Air cooling, long the industry standard, struggles at these densities.
Direct-to-chip liquid cooling circulates coolant directly over processors through cold plates. It delivers precise thermal management, reduces power consumption by roughly 10 percent compared to air-cooled systems, and can be retrofitted into existing facilities. It has quickly become the preferred approach for advanced AI infrastructure.
Immersion cooling goes further, submerging servers in non-conductive dielectric fluid. Heat dissipation is highly efficient, mechanical fans are eliminated, and rack density can increase significantly. Importantly, the resulting waste heat reaches temperatures high enough to be reused effectively. The technology is moving from specialised use cases toward broader adoption.
Rear door heat exchangers provide a practical retrofit option. Liquid-cooled doors capture exhaust heat before it enters the room, easing the burden on conventional cooling systems. While not as efficient as full liquid cooling, they extend the life of existing infrastructure.
Liquid cooling is projected to account for 36 percent of data centre thermal management revenue by 2028.
Renewable Energy: Beyond Certificates
Renewable Energy Certificates, or RECs, were once considered sufficient proof of sustainability. Today, they are widely viewed as incomplete. Purchasing a certificate does not necessarily mean the facility itself runs on clean energy at the time of consumption.
Power Purchase Agreements provide a more credible approach. Under long-term contracts, often spanning 20 to 25 years, operators buy electricity directly from renewable energy producers. This model secures financing for generators while providing predictable costs for buyers. Hyperscale operators have become the world’s largest corporate PPA investors.
On-site generation, through solar installations or wind systems, improves resilience and reduces transmission losses. When paired with battery storage, on-site renewables can offset a significant portion of baseload demand. Solar plus battery installations can reduce annual electricity costs by up to 30 percent, with US federal tax credits lowering project costs by as much as 40 percent.
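As a back-of-envelope illustration of how those two levers interact, consider a hypothetical facility. Every absolute figure below is an invented assumption; only the 30 percent savings and 40 percent credit rates echo the ranges cited above.

```python
# Back-of-envelope sketch of on-site solar-plus-storage economics.
# All dollar figures are hypothetical assumptions for illustration.

annual_electricity_cost = 10_000_000  # USD/year before on-site generation (assumed)
solar_savings_rate = 0.30             # "up to 30 percent" annual cost reduction
project_cost = 40_000_000             # USD installed cost of solar + battery (assumed)
tax_credit_rate = 0.40                # "as much as 40 percent" via federal credits

annual_savings = annual_electricity_cost * solar_savings_rate
net_project_cost = project_cost * (1 - tax_credit_rate)
simple_payback_years = net_project_cost / annual_savings

print(f"Annual savings: ${annual_savings:,.0f}")            # $3,000,000
print(f"Net cost after credits: ${net_project_cost:,.0f}")  # $24,000,000
print(f"Simple payback: {simple_payback_years:.1f} years")  # 8.0 years
```

The point of the sketch is the interaction: the tax credit shortens the payback period by the same proportion it cuts the project cost, which is why incentive timing often drives investment decisions.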
Nuclear energy is also re-entering the discussion. Small Modular Reactors, or SMRs, provide 5 to 300 megawatts of continuous, carbon-free baseload power within compact footprints. With 22 gigawatts of SMR projects currently in development, the first SMR-powered data centres are expected by 2030. Major technology companies are already securing agreements to access this capacity.
The Regulatory Landscape
For much of the past decade, green data centre practices were voluntary. That era is ending. Regulators in Europe, Asia, and increasingly elsewhere are moving from guidelines to requirements.

The EU Energy Efficiency Directive, which entered into force in 2023, requires any data centre with an IT power demand of 500 kilowatts or more to report its PUE, WUE, Energy Reuse Factor, and Renewable Energy Factor annually to a European database. The European Commission is preparing a new Data Centre Energy Efficiency Package for the first quarter of 2026, with a stated goal of making data centres carbon-neutral by 2030.

Germany's national implementation is particularly demanding. Data centres beginning operations from July 2026 onward must use at least 10 percent reused energy, including waste heat, rising to 20 percent by 2028. The waste heat provision is notable because it moves green data centre obligations beyond the facility boundary, requiring operators to think about how their thermal output integrates with surrounding communities and infrastructure.

Singapore has updated its Green Mark certification to require PUE at or below 1.39 for Platinum ratings and has introduced Water Usage Effectiveness requirements for the first time. These standards shape access to limited land and grid capacity in one of Asia's most important data centre markets.

The EU Taxonomy regulation also matters here. Data centre activities are subject to climate change mitigation criteria, which means that investors applying the Taxonomy framework to their portfolios must evaluate whether a facility meets technical screening thresholds. As sustainable finance regulations mature across Europe and other markets, this will increasingly influence the cost and availability of capital for data centre development.
Waste Heat: An Overlooked Asset
Every data centre produces heat. For most of the industry's history, that heat has been regarded as a problem to be solved. Cooling it away requires energy, and managing it safely requires careful engineering. But for AI-dense facilities, the waste heat produced is not low-grade warmth. It is high-temperature output that can be converted into something genuinely useful.

High-temperature heat pumps can capture data centre waste heat and boost it to 120 degrees Celsius, suitable for district heating networks that serve homes, offices, and industrial processes. Research from Oak Ridge National Laboratory suggests this approach can reduce carbon dioxide emissions by up to 85 percent compared to natural gas boilers serving the same demand.
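A rough arithmetic check shows the scale of that claim. Both emission factors in the sketch are illustrative assumptions, not figures from the ORNL research.

```python
# Sketch: CO2 avoided by serving district heat from recovered waste heat
# instead of a gas boiler. Both emission factors are assumed, for illustration.

heat_demand_mwh = 10_000     # MWh of district heat per year, hypothetical network
gas_boiler_kg_per_mwh = 220  # kg CO2 per MWh of heat from a gas boiler (assumed)
heat_pump_kg_per_mwh = 33    # kg CO2 per MWh via heat pump on grid power (assumed)

baseline_kg = heat_demand_mwh * gas_boiler_kg_per_mwh
reuse_kg = heat_demand_mwh * heat_pump_kg_per_mwh
reduction = 1 - reuse_kg / baseline_kg

print(f"Emissions reduction: {reduction:.0%}")  # 85% at these assumed factors
```

The reduction depends heavily on the carbon intensity of the electricity driving the heat pump, which is why waste heat reuse and renewable sourcing reinforce each other.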
This is already happening at scale. Microsoft is supporting district heating projects in Denmark. Equinix is providing heat for thousands of homes in Paris. Google has a major heat recovery programme at its facility in Hamina, Finland. Amazon has announced plans to use waste heat from its Irish data centres to supply a district heating network.

The EU mandate for data centres to quantify and report their Energy Reuse Factor is designed to accelerate this. As the requirement for reused energy rises, particularly in Germany's market, waste heat integration will shift from a voluntary premium to a baseline expectation.
The Business Case: Green is Profitable
The perception that sustainability comes at a cost to financial performance does not survive contact with the actual numbers. The Uptime Institute has found that green data centres outperform traditional ones in five-year total cost of ownership by more than 18 percent. Cooling costs, which typically represent around 40 percent of operating expenses in a traditional facility, drop to around 20 percent in a well-designed green one. Energy cost savings can reach 45 percent.

Capital recovery improves too. CapEx recovery in traditional data centres typically takes 8 to 10 years. In green facilities, the timeframe contracts to 4 to 6 years. Carbon offset revenue can add up to a million dollars per year for well-positioned facilities in active carbon markets.
| Metric | Traditional DC | Green DC |
| --- | --- | --- |
| PUE (average) | 1.8 to 2.2 | 1.1 to 1.3 |
| Cooling cost (% of OpEx) | ~40% | ~20% |
| Energy cost savings | Baseline | Up to 45% |
| Carbon offset revenue | None | Up to $1M/year |
| CapEx recovery period | 8 to 10 years | 4 to 6 years |
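The cost lines in the table compound into the shorter recovery period. A minimal sketch of that arithmetic, where the absolute dollar figures are hypothetical and only the resulting payback ranges follow the table:

```python
# Sketch: how lower operating costs shorten capital recovery.
# All dollar figures are hypothetical assumptions for illustration.

def payback_years(capex: float, annual_net_cash_flow: float) -> float:
    """Simple (undiscounted) payback period."""
    return capex / annual_net_cash_flow

capex = 300_000_000            # USD build cost, assumed similar for both designs
traditional_cash = 33_000_000  # USD/year net cash flow, hypothetical baseline
green_cash = 60_000_000        # higher margin: cheaper energy and cooling, offset revenue

print(f"Traditional: {payback_years(capex, traditional_cash):.1f} years")  # ~9.1, in the 8-10 range
print(f"Green:       {payback_years(capex, green_cash):.1f} years")        # 5.0, in the 4-6 range
```

A simple payback model ignores discounting and the modest green CapEx premium, but it makes the mechanism visible: recovery time halves because the annual margin roughly doubles, not because the building costs less.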
Goldman Sachs has estimated that the premium for powering data centres with renewables plus storage amounts to roughly 4 percent of forecast 2026 EBITDA for major hyperscalers. In exchange, operators eliminate 87 to 100 percent of emissions compared to natural gas baselines. For most serious operators, that is a straightforward trade.
How to Build One: A Practical Roadmap
The path from conventional to green data centre is well understood, even if executing it takes sustained effort. The sequence matters. Starting with efficiency optimisation before investing in renewables delivers faster returns and reduces the overall scale of renewable capacity required.
Site selection is foundational. Cooler climates enable free-air cooling for parts of the year, reducing mechanical cooling needs by up to 40 percent. Proximity to renewable energy sources reduces transmission losses. Grid carbon intensity, water availability, and natural disaster risk all feed into the long-term operational profile of a facility. Low-carbon building materials, including concrete produced with reduced clinker and recycled steel, reduce embodied carbon in the construction itself.
Baseline measurement comes before any investment decisions. Without accurate data on current PUE, WUE, CUE, and carbon emissions from a DCIM system, it is impossible to prioritise interventions or track progress. The discipline of measuring before acting also surfaces quick wins that might otherwise be missed.
Efficiency optimisation follows. Virtualisation, hot and cold aisle containment, cooling system upgrades, and power supply efficiency improvements all reduce waste before a single renewable energy contract is signed. Titanium-rated power supplies operate at 96 percent efficiency compared to 92 percent for Gold-rated ones. These differences compound at scale.
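The power supply figures above compound at scale. A short sketch of the annual difference between the two efficiency ratings, where the IT load and electricity price are assumed values:

```python
# Sketch: wasted energy from PSU conversion losses at two efficiency ratings.
# The IT load and electricity price are hypothetical assumptions; the 92% and
# 96% efficiencies are the Gold and Titanium figures cited in the text.

it_load_kw = 5_000    # kW of downstream IT load, hypothetical facility
hours_per_year = 8_760
price_per_kwh = 0.10  # USD per kWh, assumed

def grid_draw_kwh(load_kw: float, efficiency: float) -> float:
    """Energy drawn from the grid to deliver the load through a PSU."""
    return load_kw * hours_per_year / efficiency

gold = grid_draw_kwh(it_load_kw, 0.92)      # Gold-rated supplies
titanium = grid_draw_kwh(it_load_kw, 0.96)  # Titanium-rated supplies

saved_kwh = gold - titanium
print(f"Annual energy saved: {saved_kwh:,.0f} kWh")            # roughly 2 GWh
print(f"Annual cost saved:  ${saved_kwh * price_per_kwh:,.0f}")  # roughly $200k at $0.10/kWh
```

A four-point efficiency gap that looks negligible on a datasheet turns into gigawatt-hours per year at facility scale, which is why component-level ratings belong in the efficiency audit.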
For new AI-focused builds, liquid cooling should be designed in from the outset rather than retrofitted. The architecture of a facility optimised for 40 to 100 kilowatt racks is fundamentally different from one built for 10 kilowatt racks, and trying to adapt legacy infrastructure is expensive and limiting. For existing facilities, direct-to-chip solutions or rear door heat exchangers are the practical path.
Renewable energy strategy needs to move beyond RECs. Long-term PPAs provide cost certainty and credibility. On-site generation plus battery storage adds resilience. SMR partnerships are worth evaluating for any facility with a 10 to 15 year planning horizon.
Waste heat recovery should be built into site planning from the start, particularly in European markets. The infrastructure required to connect to a district heating network is most economically installed during construction, not retrofitted afterward.
Certification rounds out the process. LEED, BREEAM, ISO 50001, ISO 14001, ENERGY STAR, and the Uptime Institute Sustainability Certification validate performance, unlock incentives, and demonstrate credibility to customers, investors, and regulators. In markets like Singapore, Green Mark certification is directly tied to planning approval and grid access.
Where This is Heading
The major technology companies have moved past the point of debating whether to invest in green data centre infrastructure. They are racing to secure it. Meta has announced a 600 billion dollar AI data centre investment plan that prioritises renewable energy, targeting 15 gigawatts of new clean capacity. Google, Microsoft, and Amazon are competing for SMR contracts, signing large PPAs, and building the supply chains needed to manufacture liquid cooling equipment at scale.

For the rest of the market, the competitive and regulatory environment is tightening. EU reporting requirements are already in force. Stricter mandates are coming. The economics of green infrastructure continue to improve as cooling technology matures and renewable energy costs decline. The gap between the cost of green and conventional data centre infrastructure is narrowing, and in many markets it has already closed.
The organisations that move now will have lower energy costs, more reliable access to grid capacity, stronger positions with customers and investors who scrutinise supply chain emissions, and less exposure to regulatory risk as mandates tighten. Those that wait will face a compressed timeline to comply while managing infrastructure that was built for a different era.
The AI revolution is real. Its energy demands are real. The good news is that the solutions exist, the economics make sense, and the roadmap is clear. What remains is the decision to act.
This article was prepared by Sustainnovate.ai, an AI-native ESG compliance and sustainability intelligence platform serving SMEs across the UAE and GCC markets.







