
Green Software Engineering: Writing Code That Doesn't Cost the Earth


I used to think software was inherently clean. No smokestacks, no waste barrels, no tailpipe emissions. Code runs on electricity, electricity comes from the wall, and that's where my mental model stopped. Turns out that wall socket connects to a grid, and that grid is powered by a mix of renewables, gas, coal, and nuclear -- and the mix changes depending on where you are, what time it is, and how windy it's been lately.

Data centers now consume somewhere between 1% and 2% of global electricity. That number is climbing. The IEA has estimated that data center electricity use could double by 2028, driven largely by AI workloads. For context, that doubled figure is roughly the annual electricity consumption of Japan. Every API call, every database query, every CI pipeline run -- they all translate to electrons flowing through servers that generate heat that requires cooling that draws more electricity.

This isn't a guilt trip. It's just physics.

What green software engineering actually is

The Green Software Foundation defines it as "an emerging discipline at the intersection of climate science, software design, electricity markets, hardware, and data center design." Which is a mouthful. In practice, it means writing software that uses less energy, is aware of the carbon intensity of its electricity source, and makes better use of the hardware it runs on.

Three ideas show up repeatedly:

Energy efficiency -- do the same work with fewer compute cycles. This is the one most developers already practice without calling it green. Optimizing a slow query, caching a response, reducing bundle size -- all of these lower the energy your software demands.

Carbon awareness -- not all electricity is equal. A server running at 2 AM in Norway (mostly hydro) has a different carbon footprint than the same server running at 5 PM in Poland (still heavy on coal). Carbon-aware software can shift work to times or regions where the grid is cleaner.

Hardware efficiency -- use the silicon you've already manufactured. Extending the life of servers, maximizing utilization instead of letting machines idle at 10% capacity, writing software that runs well on older hardware. The embodied carbon in manufacturing a server is substantial -- somewhere between 20% and 50% of its lifetime emissions, depending on the study.

The stuff you can actually do

I'll be honest: some green software practices require organizational decisions that individual developers don't control. You can't unilaterally move your company's workloads to a different cloud region. But there's a surprising amount that lives at the code level.

Reduce compute. This is the big one. Every cycle your code doesn't execute is energy it doesn't consume. Caching is the obvious win -- if you computed something once and it hasn't changed, don't compute it again. Algorithm choice matters too. An O(n^2) sort on a dataset that's growing 10x per year isn't just a performance problem; it's an energy problem. Profile your hot paths. The bottleneck you find is also the place where your software wastes the most electricity.
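The caching point is the easiest to show concretely. Here's a minimal sketch using Python's built-in memoization -- the function and its arguments are placeholders, not tied to any particular stack:

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def expensive_report(region: str, month: str) -> int:
    # Stand-in for a costly computation (a slow query, an aggregation,
    # a render). With lru_cache, a repeat call with the same arguments
    # returns the stored result instead of burning CPU cycles again.
    return sum(ord(c) for c in region + month)

expensive_report("eu-west", "2024-01")     # computed once
expensive_report("eu-west", "2024-01")     # served from cache
print(expensive_report.cache_info().hits)  # 1
```

The same principle scales up: HTTP caches, CDN caches, and materialized views are all the same trade of a little memory for a lot of repeated compute.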

Reduce data transfer. Network hops cost energy -- at the origin, at every router along the path, at the destination. Compress payloads. Use CDNs to move data closer to users. Send only what the client needs instead of over-fetching. GraphQL gets a lot of things wrong, but "ask for exactly the fields you need" is genuinely useful here.

Reduce storage. Every byte you persist sits on a disk that draws power 24/7. Implement data lifecycle policies. Archive what's cold. Delete what's dead. I've worked at companies sitting on terabytes of logs that nobody has looked at in years. That's not "we might need it someday" -- that's electricity burning to maintain data you've forgotten about.
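A data lifecycle policy can be as simple as a scheduled purge. A minimal sketch, assuming a hypothetical log directory and a 90-day retention rule -- adjust the path, glob, and cutoff to your own policy:

```python
import time
from pathlib import Path

def purge_old_logs(log_dir: Path, retention_days: int) -> int:
    """Delete *.log files older than the retention window.
    Returns the number of files removed."""
    cutoff = time.time() - retention_days * 86_400
    deleted = 0
    for f in log_dir.glob("*.log"):
        if f.stat().st_mtime < cutoff:
            f.unlink()  # stop paying storage (and power) for it
            deleted += 1
    return deleted

# Hypothetical usage -- path and window are assumptions, not defaults:
# purge_old_logs(Path("/var/log/myapp"), retention_days=90)
```

In a cloud setup the same idea is usually expressed declaratively (e.g. object-storage lifecycle rules) rather than as a cron job, but the policy decision -- what's cold, what's dead -- is the part that matters.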

Pick the right rendering strategy. This one is web-specific, but it matters. A statically generated page (SSG) serves from a CDN with near-zero compute per request. A server-rendered page (SSR) spins up a function, hits a database, renders HTML, and sends it back -- every single time. If the content doesn't change per-user, static wins on every axis: performance, cost, and carbon. Edge rendering sits somewhere in the middle -- closer to the user, but still executing code per request.

Lazy load everything you can. Images below the fold, components the user hasn't interacted with, third-party scripts that aren't needed on initial paint. Every byte you defer is energy saved for users who never scroll that far.

Carbon-aware computing

This is the most interesting part to me, because it's counterintuitive. The same computation has different carbon costs depending on when and where it runs.

The idea is straightforward: if you have work that doesn't need to happen right now, wait until the grid is cleaner. Batch jobs, ML training runs, CI pipelines for non-urgent branches, report generation -- all of these can be shifted by hours without anyone noticing. Tools like the Carbon Aware SDK from the Green Software Foundation can query real-time grid carbon intensity and help you schedule accordingly.

Some cloud providers are starting to bake this in. Google has been doing carbon-aware load balancing internally since 2021, shifting work between data centers based on which ones have the cleanest power at any given moment. AWS published a methodology for accounting for carbon in their regions. It's early, but the infrastructure is forming.

The tricky part: latency-sensitive workloads can't wait. Your API response to a user in Mumbai can't be deferred until the Swedish grid has spare wind capacity. Carbon awareness works best for batch and background work.

AI's carbon elephant

We should talk about this because it's the fastest-growing segment of compute, and it's not close.

Training a large language model consumes a staggering amount of energy. The estimates vary, but training GPT-4-class models is often cited in the range of gigawatt-hours -- comparable to the annual electricity consumption of small towns. And that's a one-time cost. Inference -- the cost of actually running the model for each query -- is much cheaper per request, but multiplied across billions of queries it adds up fast.

The thing I find worth thinking about: AI can also reduce carbon. An AI model that optimizes building HVAC systems can save more energy than it took to train. Route optimization for delivery fleets, materials science for better solar cells, predictive maintenance that prevents wasteful emergency replacements -- the applications are real.

It's not a simple "AI bad" or "AI good" equation. It's a question of whether the downstream savings outweigh the upstream costs, and that calculation is different for every use case. What's less defensible is training massive models for trivial applications, or re-training when fine-tuning would do, or running inference on problems that a lookup table could solve.

What requires organizational change

I want to be realistic about what's in your hands and what isn't.

Individual developers can write efficient code, choose lighter frameworks, implement caching, set up proper data retention policies, and pick appropriate rendering strategies. That's real and it adds up.

But the decisions with the biggest carbon impact tend to be organizational. Which cloud provider and region to use. Whether to invest in carbon-aware scheduling infrastructure. Whether to factor energy efficiency into architectural reviews. How aggressively to pursue hardware utilization. Whether to measure carbon at all.

The Green Software Foundation publishes a Software Carbon Intensity (SCI) specification -- a way to express the carbon cost of software as a rate (carbon per unit of work). It's useful because it gives teams a number to optimize against. But adopting it requires buy-in beyond any single engineer.
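The SCI score has the shape SCI = ((E x I) + M) / R: operational energy times grid intensity, plus apportioned embodied emissions, divided by a functional unit you choose. A sketch of the arithmetic with illustrative numbers (the inputs below are invented, not measured):

```python
def sci(energy_kwh: float, intensity_g_per_kwh: float,
        embodied_g: float, units_of_work: float) -> float:
    """Software Carbon Intensity: ((E * I) + M) / R.
    E: operational energy (kWh)
    I: grid carbon intensity (gCO2eq/kWh)
    M: embodied emissions apportioned to this workload (gCO2eq)
    R: functional unit (requests, users, jobs -- your choice)"""
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / units_of_work

# Illustrative only: 50 kWh at 300 g/kWh, plus 5 kg of apportioned
# embodied carbon, spread over one million API requests.
print(sci(50, 300, 5_000, 1_000_000))  # 0.02 gCO2eq per request
```

The hard part isn't the division; it's measuring E and apportioning M honestly, which is exactly why the spec exists.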

If you're in a position to advocate for this stuff, the strongest argument isn't environmental -- it's financial. Energy efficiency and cost efficiency are nearly the same thing. Less compute means lower cloud bills. Better utilization means fewer servers. Shorter data retention means less storage spend. The ROI case is easy to make; the carbon reduction comes along for free.

Where to start

If you haven't thought about this before, here's what I'd suggest:

Measure something. Even rough estimates help. The Cloud Carbon Footprint tool can pull emissions data from AWS, GCP, and Azure billing data. Once you see the numbers, the opportunities become obvious.

Pick one thing. Don't try to green your entire stack at once. Find the most wasteful pattern in your codebase -- the N+1 query, the uncompressed images, the nightly cron job that processes the full dataset instead of deltas -- and fix that.
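The N+1 pattern is worth seeing side by side. A self-contained illustration with an in-memory SQLite database -- the schema is invented, but the shape of the fix is the same in any ORM or query layer:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO posts VALUES (1, 1, 'First'), (2, 1, 'Second'), (3, 2, 'Third');
""")

# N+1 pattern: one query for the authors, then one query per author.
# That's N+1 round trips of parsing, planning, and I/O.
authors = conn.execute("SELECT id, name FROM authors").fetchall()
for author_id, _name in authors:
    conn.execute("SELECT title FROM posts WHERE author_id = ?",
                 (author_id,)).fetchall()

# Fix: a single join fetches the same data in one pass.
rows = conn.execute("""
    SELECT a.name, p.title
    FROM authors a JOIN posts p ON p.author_id = a.id
""").fetchall()
print(len(rows))  # 3
```

Over a real network, with a real dataset, the difference between N+1 round trips and one is measured in orders of magnitude -- in latency and in energy alike.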

Think in terms of "waste." Most green software practices are also just good engineering. If you're computing something you don't need to compute, transferring data nobody asked for, or storing files no one reads, you've got waste. Eliminating it is good practice regardless of your feelings about climate change.

The framing I keep coming back to: every efficiency gain you make helps the planet and helps your wallet. That kind of alignment is rare. Might as well take advantage of it.