

Smaller, faster, greener

Leah Burrows

SEAS Communications

Examining the environmental impact of computation and the future of green computing

When you think about your carbon footprint, what comes to mind? Driving and flying, probably. Perhaps home energy consumption or those daily Amazon deliveries. But what about watching Netflix or having Zoom meetings? Have you ever thought about the carbon footprint of the silicon chips inside your phone, smartwatch, or the countless other devices in your home?

Every aspect of modern computing, from the smallest chip to the largest data center, comes with a carbon price tag. For the better part of a century, the tech industry and the field of computation as a whole have focused on building smaller, faster, more powerful devices — but few have considered their overall environmental impact.

Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) are trying to change that.

“Over the next decade, the demand, number and types of devices are only going to grow,” said Udit Gupta, a Ph.D. candidate in Computer Science at SEAS. “We want to know what impact that will have on the environment and how we, as a field, should be thinking about how we adopt more sustainable practices.”

Gupta, along with Gu-Yeon Wei, the Robert and Suzanne Case Professor of Electrical Engineering and Computer Science, and David Brooks, the Haley Family Professor of Computer Science, will present a paper on the environmental footprint of computing at the IEEE International Symposium on High-Performance Computer Architecture on March 3, 2021.

The SEAS research is part of a collaboration with Facebook, where Gupta is an intern, and Arizona State University.

The team not only explored every aspect of computing, from chip architecture to data center design, but also mapped the entire lifetime of a device, from manufacturing to recycling, to identify the stages where the most emissions occur.

They found that most emissions related to modern mobile and data-center equipment come from hardware manufacturing and infrastructure.

“A lot of the focus has been on how we reduce the amount of energy used by computers, but we found that it’s also really important to think about the emissions from just building these processors,” said Brooks.  “If manufacturing is really important to emissions, can we design better processors? Can we reduce the complexity of our devices so that manufacturing emissions are lower?”

Take chip design, for example.

Today’s chips are optimized for size, performance and battery life. The typical chip is about 100 square millimeters of silicon and houses billions of transistors. But at any given time, only a portion of that silicon is being used. In fact, if all the transistors were fired up at the same time, the device would exhaust its battery life and overheat. This so-called dark silicon improves a device’s performance and battery life, but it’s wildly inefficient if you consider the carbon footprint that goes into manufacturing the chip.

“You have to ask yourself, what is the carbon impact of that added performance,” said Wei. “Dark silicon offers a boost in energy efficiency but what’s the cost in terms of manufacturing? Is there a way to design a smaller and smarter chip that uses all of the silicon available? That is a really intricate, interesting, and exciting problem.”

The same issues face data centers. Today, data centers, some of which span many millions of square feet, account for 1 percent of global energy consumption, a number that is expected to grow.

As cloud computing continues to grow, decisions about where to run applications — on a device or in a data center — are being made based on performance and battery life, not carbon footprint.

“We need to be asking what’s greener, running applications on the device or in a data center,” said Gupta. “These decisions must optimize for global carbon emissions by taking into account application characteristics, efficiency of each hardware device, and varying power grids over the day.”
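Gupta's point can be made concrete with a toy comparison; every function name and number below is illustrative, not from the paper:

```python
# Toy sketch: compare operational CO2 of running a workload on-device vs.
# in a data center. Energy figures (kWh per task) and grid intensities
# (gCO2 per kWh) are invented for illustration.

def grams_co2(energy_kwh: float, grid_g_per_kwh: float) -> float:
    """Operational emissions for one run of the workload."""
    return energy_kwh * grid_g_per_kwh

def greener_location(device_kwh, device_grid, dc_kwh, dc_grid):
    """Return 'device' or 'datacenter', whichever emits less CO2."""
    if grams_co2(device_kwh, device_grid) <= grams_co2(dc_kwh, dc_grid):
        return "device"
    return "datacenter"

# An efficient server may beat the phone, unless the data center sits on a
# dirtier grid at that hour, which is why the grid terms matter.
print(greener_location(0.002, 400, 0.0005, 700))  # datacenter
```

Note how the answer depends on both the hardware's efficiency and the grid feeding it, which is exactly the coupling the researchers describe.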

The researchers are also challenging industry to look at the chemicals used in manufacturing.  

Adding environmental impact to the parameters of computational design requires a massive cultural shift in every level of the field, from undergraduate CS students to CEOs.

To that end, Brooks has partnered with Embedded EthiCS, a Harvard program that embeds philosophers directly into computer science courses to teach students how to think through the ethical and social implications of their work. Brooks is including an Embedded EthiCS module on computational sustainability in “COMPSCI 146: Computer Architecture” this spring.

The researchers also hope to partner with faculty from Environmental Science and Engineering at SEAS and the Harvard University Center for the Environment to explore how to enact change at the policy level.

“The goal of this paper is to raise awareness of the carbon footprint associated with computing and to challenge the field to add carbon footprint to the list of metrics we consider when designing new processes, new computing systems, new hardware, and new ways to use devices. We need this to be a primary objective in the development of computing overall,” said Wei.

The paper was co-authored by Sylvia Lee, Jordan Tse, Hsien-Hsin S. Lee and Carole-Jean Wu from Facebook and Young Geun Kim from Arizona State University.

MIT Technology Review


The power of green computing

Sustainable computing practices have the power to both infuse operational efficiencies and greatly reduce energy consumption, says Jen Huffstetler, chief product sustainability officer at Intel.


In partnership with Intel

When performing radiation therapy treatment, accuracy is key. Typically, the process of targeting cancer-affected areas for treatment is painstakingly done by hand. However, integrating a sustainably optimized AI tool into this process can improve accuracy in targeting cancerous regions, save health care workers time, and consume 20% less power to achieve these improved results. This is just one application of sustainable-forward computing that can offer immense improvements to operations across industries while also lowering carbon footprints.

Investments now in green computing can offer innovative outcomes for the future, says chief product sustainability officer and vice president and general manager for Future Platform Strategy and Sustainability at Intel, Jen Huffstetler. But transitioning to sustainable practices can be a formidable challenge for many enterprises. The key, Huffstetler says, is to start small and conduct an audit to understand your energy consumption and identify which areas require the greatest attention. Achieving sustainable computing requires company-wide focus from CIOs to product and manufacturing departments to IT teams.

"It really is going to take every single part of an enterprise to achieve sustainable computing for the future," says Huffstetler.

Emerging AI tools are on the cutting edge of innovation but often require significant computing power and energy. "As AI technology matures, we're seeing a clear view of some of its limitations," says Huffstetler. "These gains have near limitless potential to solve large-scale problems, but they come at a very high price."

Mitigating this energy consumption while still enabling the potential of AI means carefully optimizing the models, software, and hardware of these AI tools. This optimization comes down to focusing on data quality over quantity when training models, using evolved programming languages, and turning to carbon-aware software.

As AI applications arise in unpredictable real-world environments with energy, cost, and time constraints, new approaches to computing are necessary.

This episode of Business Lab is produced in partnership with Intel.

Full Transcript

Laurel Ruma: From MIT Technology Review, I'm Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. Our topic today is building an AI strategy that's sustainable, from supercomputers to supply chains to silicon chips. The choices made now for green computing and innovation will make a difference for today and the future. Two words for you: sustainable computing. My guest is Jen Huffstetler. Jen is the chief product sustainability officer and vice president and general manager for Future Platform Strategy and Sustainability at Intel. This podcast is produced in partnership with Intel.

Welcome, Jen.

Jen Huffstetler: Thanks for having me, Laurel.

Laurel: Well, Jen, a little bit of a welcome back. You studied chemical engineering at MIT and continue to be involved in the community. So, as an engineer, what led you to Intel and how has that experience helped you see the world as it is now?

Jen: Well, as I was studying chemical engineering, we had lab class requirements, and it so happened that my third lab class was microelectronics processing. That really interested me, both the intricacy and the integration of engineering challenges in building computer chips. It led to an internship at Intel. And I've been here ever since.

And what I really love about it is we are always working on the future of compute. This has shaped how I see the world, because it really brings to life how the technology that engineers invent can help to advance society: bringing access to education globally, improving healthcare outcomes, and helping to shape work overall. As we moved into this pandemic world, it was technology infrastructure that enabled the world to continue moving forward while we were facing this pandemic.

Laurel: That's really great context, Jen. So, energy consumption from data infrastructure is outpacing the overall global energy demand. As a result, IT infrastructure needs to become more energy efficient. So, what are the major challenges that large-scale enterprises are facing when developing sustainability strategies?

Jen: Yeah, when we survey IT leaders, we find that 76% believe that there is a challenge in meeting their energy efficiency goals while increasing performance to meet the needs of the business. In fact, 70% state that sustainability and compute performance are in direct conflict. We don't believe they have to be in conflict if you're truly utilizing the right software, the right hardware, and the right infrastructure design. Making operations more sustainable can seem daunting, but what we advise enterprises embarking on this journey is to do an audit to survey where the biggest area of impact could be and start there. Don't try to solve everything at once; look at the measurement of energy consumption, for example in a data center today, and then identify what's contributing the most to it, so that you can build projects and work to reduce in one area at a time.

And what we like to say is that sustainability, it's not the domain of any one group at a company. It really is going to take every single part of an enterprise to achieve sustainable computing for the future. That includes of course, the CIOs with these projects to focus on reducing the footprint of their computing profile, but also in design for product and manufacturing companies, making sure that they're designing and architecting for sustainability, and throughout the overall operations to ensure that everyone is reducing consumption of materials, whether it's in the factory, the number of flights that a marketing or sales team is taking, and beyond.

Laurel: That's definitely helpful context. So technologies like AI require significant computing power and energy. So, there's a couple questions around that. What strategies can be deployed to mitigate AI's energy consumption while also enabling its potential? And then how can smart investment in hardware help with this?

Jen: This is a great question. Technologies like you mentioned, like AI, can consume so much energy. It's estimated that training the GPT-3 model consumed 1.28 gigawatt-hours of electricity, and that's the same as the consumption of 120 US homes for a year. So, this is mind-boggling.
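A quick back-of-envelope check of those figures (the roughly 10 to 11 MWh per year used below is the commonly cited US average household electricity consumption, not a number from the transcript):

```python
# Sanity check of the quoted figures: 1.28 GWh spread over 120 homes
# implies about 10.7 MWh per home per year, consistent with the commonly
# cited US average of roughly 10-11 MWh.
training_gwh = 1.28
homes = 120
mwh_per_home_per_year = training_gwh * 1000 / homes  # GWh -> MWh
print(round(mwh_per_home_per_year, 1))  # 10.7
```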

But one of the things that we think about for AI is there's the training component and the inference component. You think about a self-driving car: you train the model once and then it's running on up to a hundred million cars, and that's the inference. And so what we're actually seeing is that 70% to 80% of the energy consumption, or two to three times the amount of power, is going to be used running inference compared to training the model. So, when we think about what strategies can be employed for reducing the energy consumption, we think about model optimization, software optimization, and hardware optimization, and you can even extend it to data center design.

They're all important, but starting with model optimization, the first thing that we encourage folks to think about is the data quality versus data quantity. And using smaller data sets to train the model will use significantly less energy. In fact, some studies show that many parameters within a trained neural network can be pruned by as much as 99% to yield a much smaller, a sparser network, and that will lower your energy consumption.

Another thing to consider is tuning the models for lower numerical precision. An example of this is something we call quantization, a technique to reduce your computational and memory costs of running inference by representing the weights and the activations with lower-precision data types, like an 8-bit integer instead of a 32-bit floating point. Those are some of the ways that you can improve the model, but you can also lower energy costs by looking at domain-specific models. Instead of reinventing the wheel and running these large language models again and again, if you, for example, have already trained a large model to understand language semantics, you can build a smaller one that taps into that larger model's knowledge base, and it will result in similar outputs with much greater energy efficiency. We think about this as orchestrating an ensemble of models. Those are just a couple of the examples. We can get more into the software and hardware optimization as well.
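The INT8-versus-FP32 trade Huffstetler describes can be shown with a minimal symmetric quantization sketch; production schemes add zero points, per-channel scales, and calibration data, so treat this only as the core arithmetic:

```python
import numpy as np

# Symmetric INT8 quantization: map FP32 weights onto 8-bit integers
# using a single scale factor derived from the largest magnitude.
def quantize_int8(x: np.ndarray):
    scale = float(np.abs(x).max()) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover approximate FP32 values; the quantization error is bounded
    # by half a step, i.e. scale / 2.
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.2, 0.03, 1.19], dtype=np.float32)
q, s = quantize_int8(w)
print(q.dtype, float(np.abs(dequantize(q, s) - w).max()))  # int8, error < 0.01
```

The memory saving is the point: each weight drops from 4 bytes to 1, at the cost of a small, bounded rounding error.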

Laurel: Yeah, actually maybe we should stay on that a bit, especially considering how energy intensive AI is. Is there also a significant opportunity for digital optimization with software, as you mentioned? And then you work specifically with product sustainability, so then how can that AI be optimized across product lines for efficiency for software and hardware? Because you're going to have to think about the entire ecosystem, correct?

Jen: Yeah, that's right. This is really an area where I think in the beginning of computing technology, you think about the very limited resources that were available and how tightly integrated the coding had to be to the hardware. You think about the older programming languages, assembly languages, they really focused on using the limited resources available in both memory and compute.

Today we've evolved to these programming languages that are much more abstracted and less tightly coupled, and so what leaves is a lot of opportunity to improve the software optimization to get better use out of the hardware that you already have that you're deploying today. This can provide tremendous energy savings, and sometimes it can be just through a single line of code. One example is Modin, an open source library which accelerates Pandas applications, which is a tool that data scientists and engineers utilize in their work. This can accelerate the application by up to 90x and has near infinite scaling from a PC to a cloud. And all of that is just through a single line of code change.

There's many more optimizations within open source code for Python, Pandas, PyTorch, TensorFlow, and Scikit. This is really important that the data scientists and engineers are ensuring that they're utilizing the most tightly coupled solution. Another example for machine learning on Scikit is through a patch, or through an Anaconda distribution, you can achieve up to an 8x acceleration in the compute time while consuming eight and a half times less energy and 7x less energy for the memory portions. So, all of this really works together in one system. Computing is a system of hardware and software.

There's other use cases where when running inference on a CPU, there are accelerators inside that help to accelerate AI workloads directly. We estimate that 65% to 70% of inference is run today on CPUs, so it's critical to make sure that they're matching that hardware workload, or the hardware to the workload that you want to run, and make sure that you're making the most energy-efficient choice in the processor. The last area around software that we think about is carbon-aware computing or carbon-aware software, and this is a notion that you can run your workload where the grid is the least carbon-intensive. To help enable that, we've been partnering with the Green Software Foundation to build something called the Carbon Aware SDK, and this helps you to use the greenest energy solutions and run your workload at the greenest time, or in the greenest locations, or both. So, that's for example, it's choosing to run when the wind is blowing or when the sun is shining, and having tools so that you are providing the insights to these software innovators to make greener software decisions. All of these examples are ways to help reduce the carbon emissions of computing when running AI.

Laurel: That's certainly helpful considering AI has emerged across industries and supply chains as this extremely powerful tool for large-scale business operations. So, you can see why you would need to consider all aspects of this. Could you explain, though, how AI is being used to improve those kinds of business and manufacturing productivity investments for a large-scale enterprise like Intel?

Jen: Yeah. I think Intel is probably not alone in utilizing AI across the entirety of our enterprise. We're almost two companies: we have a very large global manufacturing operation that builds the Intel products, which is sort of that second business, and is also a foundry for the world's semiconductor designers to build on our solutions. When we think of chip design, our teams use AI to do things like IP block placement. They are looking at grouping the logic, the different types of IP. When you place those cells closer together, you're not only lowering cost and the area of silicon manufacturing, which lowers your embodied carbon for a chip, but it also enables a 30% to 50% decrease in the timing, or the latency, of the communication between those logic blocks, and that accelerates processing. That'll lower your energy costs as well.

We also utilize AI in our chip testing. We've built AI models to help us to optimize what used to be thousands of tests and reducing them by up to 70%. It saves time, cost, and compute resources, which as we've talked about, that will also save energy.

In our manufacturing world we use AI and image processing to help us test 100% of the wafer and detect up to 90% of the failures or more. And we're doing this in a way that scales across our global network, and it helps us to detect patterns that might become future issues. All of this work was previously done with manual methods, and it was slow and less precise. So, we're able to improve our factory output by employing AI and image processing techniques, decreasing defects, lowering the waste, and improving overall factory output.

We as well as many partners that we work with are also employing AI in sales techniques where you can train models to significantly scale your sales activity. We're able to collect and interpret customer and ecosystem data and translate that into meaningful and actionable insights. One example is autonomous sales motions where we're able to offer a customer or partner the access to information, and serving that up as they're considering their next decisions through digital techniques, no human interventions needed. And this can have significant business savings and deliver business value to both Intel and our customers. So, we expect even more use at Intel, touching almost every aspect of our business through the deployment of AI technologies.

Laurel: As you mentioned, there are lots of opportunities here for efficiencies. So, with AI and emerging technologies, we can see these efficiencies from large data centers to the edge, to where people are using this data for real-time decision making. So, how are you seeing these efficiencies actually in play?

Jen: Yeah, when I look at the many use cases from the edge, to an on-prem enterprise data center, as well as to the hyperscale cloud, you're going to employ different techniques, right? You've got different constraints at the edge, both with latency, often power, and space constraints. Within an enterprise you might be limited by rack power. And the hyperscale, they're managing a lot of workloads all at once.

So, starting first with the AI workload itself, we talked about some of those solutions to really make sure that you're optimizing the model for the use case. There's a lot of talk about these very large language models, over a hundred billion parameters. Not every enterprise use case is going to need models of that size. In fact, we expect a large number of enterprise models to be around 7 billion parameters, using those techniques we talked about to focus them on answering the questions that your enterprise needs. When you bring those domain-specific models into play, they can run on even a single CPU versus these very large-scale dedicated accelerator clusters. So, that's something to think about: what's the size of the problem I'm trying to solve, where do I need to train it, how do I need to run the inference, and what's the exact use case? That's the first thing I would take into account.

The second thing is, as energy becomes ever more of a constraint across all of those domains, we are looking at new techniques and tools to get the most out of the energy that's available to that data center or that edge location. Something where we are seeing increasing growth, and expecting it to grow ever more over time, is liquid cooling. Liquid cooling is useful in edge use cases because it provides a contained solution where sometimes you've got more dust, debris, and particles; think about telco base stations that are out in very remote locations. So, how can you protect the compute and make it more efficient with the energy that's available there?

We see the scaling both through enterprise data centers all the way up to large hyperscale deployments because you can reduce the energy consumption by up to 30%, and that's important when today up to 40% of the energy in a data center is used to keep it cool. So, it's kind of mind boggling the amount of energy or inefficiency that's going into driving the compute. And what we'd love to see is a greater ratio of energy to compute, actually delivering compute output versus cooling it. And that's where liquid cooling comes in.

There's a couple of techniques there, and they have different applications, as I mentioned. Immersion's actually one that would be really useful in those environments where it's very dusty or there's a lot of pollution at the edge where you've got a contained system. We're also seeing cold plate or direct to chip. It's already been in use for well over a decade in high performance computing applications, but we're seeing that scale more significantly in these AI cluster buildouts because many data centers are running into a challenge with the amount of energy they're able to get from their local utilities. So, to be able to utilize what they have and more efficiently, everyone is considering how am I going to deploy liquid cooling?

Laurel: That's really interesting. It certainly shows the type of innovation that people are thinking about constantly. So, one of those other parts of innovation is how do you think about this from a leadership perspective? So, what are some of those best practices that can help an enterprise accelerate sustainability with AI?

Jen: Yeah, I think just to summarize what we've covered, it's emphasizing that data quality over quantity, right? The smaller dataset will require less energy. Considering the level of accuracy that you really need for your use case. And again, where can you utilize that INT8 versus those compute intensive FP32 calculations. Leveraging domain-specific models so that you're really right sizing the model for the task. Balancing your hardware and software from edge to cloud, and within a more heterogeneous AI infrastructure. Making sure that you're using the computing chip set that's necessary to meet your specific application needs. And utilizing hardware accelerators where you can to save energy both in the CPU as well. Utilizing open source solutions where there's these libraries that we've talked about, and toolkits, and frameworks that have optimizations to ensure you're getting the greatest performance from your hardware. And integrating those concepts of carbon-aware software.

Laurel: So, when we think about how to actually do this, Intel is actually a really great example, right? So, Intel's committed to reaching net zero emissions in its global operations by 2040. And the company's cumulative emissions over the last decade are nearly 75% lower than what they would've been without interim sustainability investments. So, then how can Intel's tools and products help other enterprises then meet their own sustainability goals? I'm sure you have some use case examples.

Jen: Yeah, this is really the mission I'm on, is how can we help our customers lower their footprint? One of the first things I'll just touch upon is, because you mentioned our 2040 goals, is that our data center processors are built with 93% renewable electricity. That immediately helps a customer lower their Scope 3 emissions. And that's part of our journey to get to sustainable compute.

There's also embedded accelerators within the Xeon processors that can deliver up to 14x better energy efficiency. That's going to lower your energy consumption in data center no matter where you've deployed that compute. And of course, we have newer AI accelerators like Intel Gaudi, and they really are built to maximize the training and inference throughput and efficiency up to 2x over competing solutions. Our oneAPI software helps customers to take advantage of those built-in accelerators with solutions like an analytics toolkit and deep learning neural network software with optimized code.

We take all those assets, and just to give you a couple of customer examples, the first would be SK Telecom. This is the largest mobile operator in South Korea, 27 million subscribers. They were looking to analyze the massive amount of data that they have and really to optimize their end-to-end network AI pipeline. So, we partnered with them, utilizing the hardware and software solutions that we've talked about. And by utilizing these techniques, they were able to optimize their legacy GPU based implementation by up to four times, and six times for the deep learning training and inference. And they moved it to just a processor-based cluster. So, this really, it's just an example where when you start to employ the hardware and the software techniques, and you utilize everything that's inside the solution in the entire pipeline, how you can tightly couple the solution. And it doesn't need to be this scaled out dedicated accelerator cluster. So, anyway, that's one example. We have case studies.

Another one that I really love is with Siemens Healthineers. So, this is a healthcare use case. And you can envision for radiation therapy, you need to really be targeted where you're going to put the radiation in the body, that it's just hitting the organs that are being affected by the cancer. This contouring of the organs to target the solution was previously done by hand. And when you bring AI into the workflow, you're not only saving healthcare workers' time, of which we know that's at a premium since there's labor shortages throughout this industry, that they were able to improve the accuracy, improve the image generation 35 times faster, utilizing 20% less power, and enabling those healthcare workers to attend to the patients.

The last example is with a global telecommunication system provider, KDDI, which is Japan's number one telecom provider. They did a proof of concept on their 5G network using AI to predict the network traffic. By looking at their solutions, they were able to scale back the frequency of the CPUs that were used and even idle them when not needed, and they achieved significant power savings by employing those solutions. These are just ways where you can look at your own use cases, making sure that you're meeting your customer SLAs, or service level agreements, which is very critical in any mobile network, as all of us being consumers of that mobile network agree. We don't like it when that network's down. And these customers of ours were able to deploy AI and lower the energy consumption of their compute while meeting their end use case needs.

Laurel: So Jen, this has been a great conversation, but looking forward, what are some product and technology innovations you're excited to see emerge in the next three to five years?

Jen: Yeah, outside of the greater adoption of liquid cooling, which we think is foundational for the future of compute, in the field of AI I'm thinking about new architectures that are being pioneered. There are some at MIT, as I was talking to some of the professors there, but we also have some in our own labs and pathfinding organizations.

One example is neuromorphic computing. As AI technology matures, we're seeing a clear view of some of its limitations. These gains have near limitless potential to solve large-scale problems, but they come at a very high price, as we talked about, with the computational power, the amount of data that gets pre-collected, pre-processed, et cetera. Some of these emerging AI applications arise in that unpredictable real-world environment, as you talked about with some of those edge use cases. There can be power, latency, or data constraints, and that requires fundamentally new approaches. Neuromorphic computing is one of those, and it represents a fundamental rethinking of computer architecture down to the transistor level. It is inspired by the form and the function of the biological neural networks in our brains. It departs from the familiar algorithms and programming abstractions of conventional computing to unlock orders-of-magnitude gains in efficiency and performance, up to 1,000x; I've even seen use cases of 2,500x energy efficiency over traditional compute architectures.

We have the Loihi research processor that incorporates these self-learning capabilities, novel neuro models, and asynchronous spike-based communication. And there is a software community that is working to evolve the use cases together on this processor. It consumes less than a watt of power for a variety of applications. So, it's that type of innovation that really gets me excited for the future.

Laurel: That's fantastic, Jen. Thank you so much for joining us on the Business Lab.

Jen: Thank you for having me. It was an honor to be here and share a little bit about what we're seeing in the world of AI and sustainability.

Laurel: Thank you.

That was Jen Huffstetler, the chief product sustainability officer and vice president and general manager for Future Platform Strategy and Sustainability at Intel, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review. That's it for this episode of Business Lab. I'm your host, Laurel Ruma. I'm the Global Director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology. And you can find us in print on the web and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com. This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you'll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.


Twenty years ago, coding had boundaries. Bandwidth restrictions and limited processing power forced developers to always be mindful of the length and complexity of their code. But as technology enabled greater innovation, programmers were no longer constrained by size.

For example, greater computing power allowed faster processing of large files and applications. Open-source libraries and frameworks allowed software engineers to reuse pieces of code in their projects, creating greater possibilities. This also led to programs with more lines of code—and more processing power required to parse it. The unintended consequence was greater energy usage and a higher global electricity demand.

As companies look to transform business and implement more sustainable practices, they’re digging deep into established processes to find new efficiencies. This includes evaluating the building blocks of their business operations, from storing data more efficiently to examining how code is written.

In this post, we’ll explore how green coding helps organizations find innovative ways to prioritize sustainability and reach their energy reduction goals.

Green coding is an environmentally sustainable computing practice that seeks to minimize the energy involved in processing lines of code and, in turn, help organizations reduce overall energy consumption. Many organizations have set greenhouse emission reduction goals to respond to the climate change crisis and global regulations; green coding is one way to support these sustainability goals.

Green coding is a segment of green computing, a practice that seeks to limit technology’s environmental impact, including reducing the carbon footprint in high-intensity operations, such as in manufacturing lines,  data centers  and even the day-to-day operations of business teams. This larger green computing umbrella also includes green software—applications that have been built using green coding practices.

Advances in technology—from big data to  data mining —have contributed to a massive increase in energy consumption in the information and communications technology sector. According to the  Association for Computing Machinery  (link resides outside ibm.com), annual energy consumption at data centers has doubled over the past decade. Today, computing and IT are responsible for between 1.8% and 3.9% of global greenhouse gas emissions.

To fully understand how green coding can reduce energy consumption and greenhouse gas emissions, it helps to dive into the energy consumption of software:

  • Infrastructure: The physical hardware, networks and other elements of an IT infrastructure all require energy to run. Within any organization, there are likely areas where the computing infrastructure is overly complicated or overprovisioned, which results in inefficient energy use.
  • Processing: Software consumes energy as it runs. The more complicated the software or the larger the file, the more processing time it takes and the more energy it consumes.
  • DevOps: In the typical coding process, developers write lines of code, which are parsed and processed through a device. The device requires energy, which, unless powered by 100% renewable energy, creates carbon emissions. The more code to process, the more energy the device consumes and the higher the level of emissions.

Recent research into the speed and energy use of different programming languages found that C was the most efficient (link resides outside ibm.com), reducing energy and memory usage and offering another potential opportunity for energy savings. However, there is still some debate about how such savings are realized and which metrics should be used to evaluate them.
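The same measurement mindset applies within a single language: comparing two implementations of the same task with Python's standard-library `timeit` gives a rough proxy for energy use, since less CPU time generally means less energy consumed. The following is a minimal, illustrative sketch; the workload and sizes are invented for demonstration:

```python
import timeit

def concat_naive(n):
    # Quadratic behavior: repeated string concatenation re-copies the buffer
    s = ""
    for i in range(n):
        s += str(i)
    return s

def concat_join(n):
    # Linear behavior: build the result once with join
    return "".join(str(i) for i in range(n))

if __name__ == "__main__":
    for fn in (concat_naive, concat_join):
        t = timeit.timeit(lambda: fn(10_000), number=20)
        print(f"{fn.__name__}: {t:.3f}s for 20 runs")
```

Both functions produce identical output, so any timing difference reflects processing cost alone, which is the kind of comparison green-coding teams can run before picking an approach.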

Green coding begins with the same principles that are used in traditional coding. To reduce the amount of energy needed to process code, developers can adopt less energy-intensive coding principles into their  DevOps  lifecycle.

The “lean coding” approach focuses on using the minimal amount of processing needed to deliver a final application. For example, website developers can prioritize reducing file size (e.g., switching high-quality media with smaller files). This not only accelerates website load times, but also improves the user experience.
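As a rough illustration of how much payload size can matter, the sketch below uses only Python's standard-library gzip to shrink a repetitive text asset before it is stored or transferred. The asset here is invented for demonstration; real savings depend entirely on the content:

```python
import gzip

# A toy "asset": repetitive markup, standing in for an uncompressed
# SVG or JSON payload served by a website
asset = ("<circle cx='10' cy='10' r='5'/>" * 500).encode("utf-8")

compressed = gzip.compress(asset)
ratio = len(compressed) / len(asset)
print(f"original: {len(asset)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.1%} of original)")
```

Fewer bytes on the wire means less network, storage and processing energy downstream, which is the core intuition behind prioritizing smaller files.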

Lean coding also aims to reduce code bloat, a term used to refer to unnecessarily long or slow code that is wasteful of resources. Open-source code can be a contributing factor to this software bloat. Because open-source code is designed to serve a wide range of applications, it contains a significant amount of code that goes unutilized for the specific software. For example, a developer may pull an entire library into an image, yet only need a fraction of the functionality. This redundant code uses additional processing power and leads to excess carbon emissions.

By adopting lean coding practices, developers are more likely to design code that uses the minimal amount of processing, while still delivering desired results.

The principles of green coding are typically designed to complement existing IT sustainability standards and practices used throughout the organization. Much like implementing sustainability initiatives in other areas of the organization, green coding requires both structural and cultural changes.

Structural changes

  • Improving energy use at the core: Multi-core processor-based applications can be coded to increase energy efficiency. For example, code can directly instruct processors to shut down and restart within microseconds instead of using default energy saving settings that might not be as efficient.
  • Efficiency in IT: Sometimes referred to as green IT or green computing, this methodology aims for resource optimization and workload consolidation to reduce energy use. By optimizing IT infrastructure through use of modern tools like virtual machines (VMs) and containers, organizations can reduce the number of physical servers needed for operations, which, in turn, reduces energy consumption and carbon intensity.
  • Microservices: Microservices are an increasingly popular approach to building applications that break down complicated software into smaller elements, called services. These smaller services are called upon only when needed, instead of running a large monolithic program as a whole. The result is that applications run more efficiently.
  • Cloud-based DevOps: Applications running on distributed cloud infrastructure cut the amount of data transported over the network and the network's overall energy use.

Cultural changes

  • Empower management and employees:  Change is only effective when employees and management are on board. Encouraging adoption with consistent messaging to the entire  DevOps  team helps support the sustainability agenda and makes people feel like they are part of the solution.
  • Encourage innovation:  DevOps teams are often driven by the desire to innovate and create solutions to big problems. Encourage teams to look for new ways to use data insights, collaborate with partners and take advantage of other energy-saving opportunities.
  • Stay focused on outcomes:  Problems will arise when implementing new initiatives like green coding. By anticipating challenges, companies can deal with problems that arise more easily.

Beyond the energy-saving benefits, companies may also find there are additional advantages to green coding practices, including the following:

  • Reduced energy costs:  It’s the simple principle of use less, spend less. With the increasingly volatile price of energy, organizations want to reduce the amount they spend on power not just for environmental sustainability, but also to maintain the sustainability of the business.
  • Accelerated progress toward sustainability goals:  Most organizations today have net zero emission goals or strategic initiatives to reduce emissions to increase sustainability. Green coding moves organizations closer to reaching this goal.
  • Higher earnings:  CEOs that implement sustainability and digital transformation initiatives, such as green coding, report a higher average operating margin than their peers, according to the  IBM 2022 CEO Study .
  • Better development discipline:  Using green coding empowers programmers to simplify elaborate infrastructures and can ultimately save time, reducing the amount of code software engineers write.

To find out more about IBM and green coding, start with the white paper from the Institute for Business Value:  IT sustainability beyond the data center .

This white paper investigates how software developers can play a pivotal role in promoting responsible computing and green IT, discusses four major sources of emissions from IT infrastructure, and looks at how to fulfill the promise of green IT with hybrid cloud.

Infrastructure optimization is an important way to reduce your carbon footprint through better resource utilization. One of the fastest ways to make an impact on energy efficiency is to configure resources automatically to reduce energy waste and carbon emissions.  IBM Turbonomic Application Resource Management  is an IBM software platform that can automate critical actions that proactively deliver the most efficient use of compute, storage and network resources to your apps at every layer of the stack continuously—in real-time—without risking application performance.

When applications consume only what they need to perform, you can increase utilization, reduce energy costs and carbon emissions, and achieve continuously efficient operations. Customers today are seeing up to 70% reduction in growth spend avoidance by leveraging IBM Turbonomic to better understand application demand. Read the latest  Forrester TEI study  and learn how IT can impact your organization’s commitment to a sustainable IT operation while assuring application performance in the data center and in the cloud.

A final critical way to promote green computing is to choose energy-efficient IT infrastructure for on-prem and cloud data centers. For example, IBM LinuxONE Emperor 4 servers can reduce energy consumption by 75% and space by 50% compared with running the same workloads on x86 servers. Containerization, interpreter/compiler optimization and hardware accelerators can then reduce energy needs further through green coding.

Learn more about LinuxONE and sustainability.

Google’s Green Computing, Efficiency at Scale

August 2011

What Is Green Computing?

Cutting the carbon emissions and energy consumption of data centers and computing technologies is key to reducing tech’s environmental footprint.

Alexandria Jacobson

Green computing, also known as greentech or green IT, is the environmentally conscious design, manufacturing, use and disposal of computers and related technologies in a manner that minimizes environmental impact.

Green Computing Definition

Green computing refers to the conscious design, manufacturing, use and disposal of computers and related technologies in a way that reduces carbon emissions and energy consumption.

Thanks to the passage of the Inflation Reduction Act and the CHIPS and Science Act in 2022, and the introduction of the Artificial Intelligence Environmental Impacts Act in 2024, green computing stands to become one of the more pressing topics for companies. The United States looks to invest several billion dollars in zero-carbon technologies, clean energy research and reducing greenhouse gas emissions.

Even so, “there’s no silver bullet,” said Soudip Roy Chowdhury, founder and CEO of Eugenie.ai, an AI-powered platform supporting industrial companies in achieving sustainable operations. “Multiple initiatives together will help the world to achieve net zero.”


Green computing is a set of practices that involve using computers and their components in an environmentally friendly way.

It includes initiatives like a company reducing its climate impact and minimizing its carbon footprint, the specific actions of which can range considerably.

“It’s not just what temperature we keep the data center or if there’s enough diesel to backup the generators if the power goes out,” said George Burns, senior consultant of cloud operations at SPR, a technology modernization firm. “It’s so many other things, all the way down to the code that we feed in.”

According to Sreejit Roy, green IT offering leader and senior partner at IBM, green computing is all about the reduction of power consumption for compute, network and storage.

“If we are able to reduce the power consumption of these three components that comprise the whole — from the data center to the software we write — [that] is what we call green computing,” Roy said. 

Why Is Green Computing Important? 

Green computing is important for the environment — and the companies that practice it — for a few reasons:

Reduces Technology’s Ecological Footprint

Green computing stands to reduce technology’s impact on climate change and can aid in net-zero efforts. It emphasizes lowering technologies’ emissions and their reliance on fossil fuels, as well as recycling electronics and e-waste properly.

Reduces Energy Costs

Implementing green-computing solutions and energy-efficient technologies can lower electricity usage over time. In turn, this will also reduce energy costs for businesses and users in the long run.

Boosts Company Reputation 

Companies that embrace green IT stand to have better reputational success. Advocating for environmental sustainability can help brand a company as socially responsible and empathetic toward global issues. This can also lead to increased interest from job candidates and retention of existing employees.

Prepares Companies for Regulations and Compliance

Companies that already implement green computing technologies in their work may have an easier time complying with regulations around technology emissions in the future. Plus, it could save companies money on having to set up new technologies and processes if additional regulations arise.


Green Computing Examples 

A green computing strategy may involve all or some of the following actions:

Updating Tech and Architecture

Much of green computing looks like updating technologies to be more efficient and consume less energy.

“I can change from the old servers to the new servers, or I can change the architecture of the servers, or I can change what goes into the server from what we’re calling as monolithic to serverless, which are different kinds of architectures, which will significantly reduce the carbon footprint,” Roy said. “If nothing else, companies can simply take their workloads and move it to cloud.”

Using Cloud Storage

Roy estimates that transitioning operations to the cloud can help some tech companies improve their carbon footprint by as much as 60 to 80 percent. Yet, definitively measuring the environmental impact of cloud servers, especially private ones, is challenging. 

Storing Data Locally

For companies with employees across the globe, using networks that store data locally can save energy. For example, Roy, who is based in Bangalore, India, expects that when accessing data in the United States, it takes two or three hops across network devices to get the data. 

“The moment I start designing my architecture, designing my IT systems in a way that I do not go to those multiple hops, I’m consuming less network, and hence again, consuming less power, and hence again, consuming less energy, and hence less CO2,” Roy said. 

Creating Digital Twins to Identify Inefficiencies

Companies can create a digital twin, which is a real-time virtual representation of a system used for simulation and decision-making. The digital twin can help identify areas where businesses can reduce waste and improve production.

“We simulated a scenario like what if you change this parameter by X with different numbers,” Chowdhury said. “Think of it like you are seeing the future. Whatever you change any of these parameters … that helps them to really understand at real time what the right setup for set points [should be].”
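A digital twin can be as elaborate as a plant-wide simulator or as small as a calibrated model of one subsystem. The sketch below is a deliberately toy version of the "change this parameter by X" idea Chowdhury describes: a made-up power model for a cooling unit, swept across candidate temperature set points to find the cheapest one. All coefficients and values are invented for illustration; a real twin would be calibrated against sensor data.

```python
def power_kw(setpoint_c: float) -> float:
    # Invented model: a colder set point demands more cooling power,
    # while an overly warm one incurs a downstream reheating cost
    cooling = 40.0 / max(setpoint_c - 5.0, 0.5)
    reheat = 0.3 * max(setpoint_c - 24.0, 0.0)
    return cooling + reheat

def best_setpoint(candidates):
    # "What if we change this parameter?" -- sweep candidates in simulation
    # instead of experimenting on the live system
    return min(candidates, key=power_kw)

if __name__ == "__main__":
    candidates = [18.0, 20.0, 22.0, 24.0, 26.0]
    best = best_setpoint(candidates)
    print(f"lowest simulated draw at {best} degC: {power_kw(best):.2f} kW")
```

The value of the pattern is that "seeing the future" happens in the model, so operators can test set points without risking waste in production.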

Using Virtual Machines

The creation of multiple virtual machines on a single computer can reduce energy consumption: the more densely an infrastructure is virtualized, the lower the resulting carbon emissions. The IBM research team offers a proprietary tool that can provide such insight to companies.

“The moment I’m able to get at the virtual machine level, then I know by mapping it, which application is consuming more energy, and I can then focus on those applications, or I can focus on those unused VMs to reduce my power consumption,” Roy said. “That is an actionable insight. And unfortunately, not too many of the companies are able to provide that actionable insight.”
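The mapping Roy describes can be sketched in a few lines: given per-VM power readings tagged with the application they serve, aggregate energy by application and flag VMs with no mapped application as candidates to power down. The readings and field names here are hypothetical, invented purely to show the shape of the analysis:

```python
# Hypothetical per-VM power readings (watts) with an application label;
# None marks a VM with no mapped application -- a retirement candidate
readings = [
    ("vm-01", "billing", 120.0),
    ("vm-02", "billing", 95.0),
    ("vm-03", "analytics", 210.0),
    ("vm-04", None, 60.0),
]

def energy_by_app(readings):
    # Sum power draw per application so the heaviest consumers stand out
    totals = {}
    for vm, app, watts in readings:
        key = app or "unmapped"
        totals[key] = totals.get(key, 0.0) + watts
    return totals

def unused_vms(readings):
    # VMs drawing power with no application mapped to them
    return [vm for vm, app, _ in readings if app is None]

if __name__ == "__main__":
    print(energy_by_app(readings))
    print(unused_vms(readings))
```

The point is the actionable insight: once consumption is attributed to applications rather than hosts, teams know exactly where to optimize or what to switch off.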

Analyzing Application Performance

Burns encourages companies to analyze their applications’ performance and not stick with a piece of software just because the company has invested in or even built it. Maintaining outdated or inefficient applications is wasteful and a prime example of the sunk cost fallacy.

“Once you start breaking the code down, you break down the compute. You break down the storage. That’s a big part of it. Application modernization will have a huge effect on resource consumption for computing,” Burns said. “It’s time to decouple them. If you are looking for what is the low hanging fruit that I can grab onto easiest to make a green investment and green return for my organization? It starts with application modernization.”

Applying Edge Computing

Edge computing, an IT architecture that brings computing and data storage closer to the data source, is a potential green computing practice that can contribute to a reduction in energy consumption.

“Locality of what we do becomes a big part of how we consume computing resources,” Burns said. “We consume the network differently. Those consumers are where a lot of our resource consumption occurs still, so we move things to the edge so that they consume less power.”

Related Looking to Go Green? Start With Modernizing Your Company’s Systems.

Challenges of Green Computing

While green computing can sustain the environment and save long-term business costs, it can also present challenges, including:

Costs of Replacing Equipment

Replacing existing technologies with green computing solutions can be costly. Companies may have to make a substantial investment upfront to make the switch to green equipment, and likely won’t see a return on investment right away. Overhauling existing systems can also disrupt company processes and require training for new technologies that are implemented.

Lack of Green Computing Awareness and Expertise

Computing and the IT industry’s impact on the environment aren’t often considered in comparison to industries like energy or transportation. As such, tech companies may not be aware of the role their products and operations play in affecting the planet, and can lack the expertise needed to prioritize and implement green computing practices.

Environmental Impact Is Hard to Measure

Much of the challenge around green computing is in measuring and reporting a company’s environmental impact, Chowdhury said. The industry needs a “consolidated data approach,” where not just one, but all companies within a sector, know where the industry benchmark is. It’s important so that companies can measure and work toward a goal in a sustainable way.

Public cloud services offer clearer measurements of how much power is being consumed, but Roy estimates that very few companies operate entirely on a public cloud. 

“The challenge is identifying, first of all, how much data, how much carbon is being pulled, what the carbon footprint of that nonpublic cloud portion is,” Roy said. “The measurement of that is not matured enough at all.”

Companies Can Be Resistant to Change

Ultimately, green computing requires a cultural shift to become a part of tech companies’ business practices.

“That cultural change is what we think is extremely important going forward for all those organizations,” Roy said. “They have to put sustainability as what we’re calling NFR, or a nonfunctional requirement, into everything they do, from architecture, to design, to develop, to test and deploy. If they do not do that, they are going to be in big trouble going forward.”

History of Green Computing

Green computing entered the mainstream in 1992 when the U.S. Environmental Protection Agency (EPA) launched the Energy Star program. Directed in partnership with the U.S. Department of Energy (DOE), the Energy Star program uses set standards to determine if consumer electronics and commercial equipment are energy-efficient , and recognize such products by labeling them as Energy Star-certified. Following the enactment of the Energy Policy Act in 2005 by the U.S. Congress, the EPA and DOE were directed to promote Energy Star and implement a program that identified energy-efficient products and buildings.

In 2006, the Global Electronics Council debuted the Electronic Product Environmental Assessment Tool (EPEAT), which became an influential tool in green computing regulation. EPEAT acts as an online registry for identifying electronic products that meet specific environmental assessment criteria, including those addressing materials used, product longevity design and supply chain greenhouse gas emissions. 

Today, green computing standards and benchmarking metrics have been proposed by various organizations to help guide green computing initiatives. For example, the International Organization for Standardization (ISO) established ISO 14000 and ISO 50001, standards for environmental management systems and energy management systems.

Frequently Asked Questions

What is green computing?

Green computing is the practice of using computers and technologies in an environmentally conscious and energy-efficient manner.

Green computing aims to reduce carbon emissions, save energy and minimize the use of hazardous material in manufacturing to protect the planet.

What are some examples of green computing?

Some examples of green computing include:

  • Updating technologies and architecture to consume less energy 
  • Using cloud or local data storage 
  • Utilizing virtualization and virtual machine technology
  • Analyzing and identifying gaps in application performance


Case study: Coca-Cola embarks on green IT strategy

Coca-Cola is aiming to reduce the carbon footprint of its IT. Kevin Sirjuesingh, director of IT strategic initiatives, discusses the plan.

Cliff Saran, Managing Editor

Coca-Cola Enterprises is aiming to reduce the carbon footprint of its IT. Kevin Sirjuesingh, the company's director of IT strategic initiatives, speaks to Computer Weekly about the company's energy reduction strategy.

In September, Coca-Cola Enterprises released a report, co-authored by the Economist Intelligence Unit, discussing the benefits of sustainability.

At the time, John Brock, chairman and CEO of Coca-Cola Enterprises, said: "We are proud of the progress we have made on our own sustainability journey, but the next era of sustainable business will be led by more meaningful collaboration."

Kevin Sirjuesingh, director of IT strategic initiatives at Coca-Cola Enterprises (CCE), is looking at IT’s role in the company’s sustainability strategy. 

He says CCE already works across its supply chain to focus on energy efficiency and sustainability, but believes IT needs to step up and focus on how it can contribute to the overall sustainability agenda.

Targeting technology's energy consumption

With 15% of the UK's total energy bill related to office equipment such as PCs, photocopiers, multifunction devices, servers and racks, IT equipment is a major contributor to CCE's carbon footprint.

"We need to monitor and understand our current energy consumption and figure out the top initiatives to reduce that," says Sirjuesingh.


From an IT perspective, he says energy efficiency involves everything from the front office to the back office.

And the small things matter, he says: "People are now turning off their computer monitors when they take the laptop out of the docking station."

Given the strong focus to make datacentres more efficient, in Sirjuesingh’s experience finding out actual power consumption data can be extremely difficult in a hosted environment. 

"Datacentres are somewhat streamlined in terms of their consumption," he says. "But when I look at the figures for datacentre hosting with our supplier, CCE's energy footprint is based on square footage in the datacentre, which is a nonsense figure."

Sirjuesingh says some of CCE’s suppliers have a floor space charge and do not measure the energy consumption by device. "They look at the equipment in the square footage and build that into the hosting cost, which does not seem accurate in terms of the power usage."

CCE plans to work out the energy consumption of its in-house systems, and will also work with its hosting providers to drill down to get energy data, which is important for the company to meet its Carbon Reduction Commitment.

Energy metering and management

Once CCE's IT department understands the energy required to provide IT services, Sirjuesingh says the company will focus on targeting energy reduction. 

"We will also look at further staff training and awareness, server consolidation/virtualisation and retirement of legacy applications," he says.

But CCE plans to go further, according to Sirjuesingh. 

"For our workstation strategy we will look at the people in our salesforce and ask whether they need a laptop or a low-powered device like an iPad," he says. Trials are already underway.

Some of the basics to manage power on the desktop include configuring power management to turn off laptops and desktops after a period of inactivity and using "wake on LAN" to switch on the PC if software needs updating.
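The "wake on LAN" step is a small, well-defined protocol: a UDP broadcast of a "magic packet" made of six 0xFF bytes followed by the target machine's MAC address repeated 16 times. A minimal sketch using only Python's standard library (the MAC address and broadcast settings would come from the real estate's inventory):

```python
import socket

def magic_packet(mac: str) -> bytes:
    # A Wake-on-LAN magic packet: 6 bytes of 0xFF followed by the
    # target MAC address repeated 16 times (102 bytes total)
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    # Send the packet as a UDP broadcast; port 9 (discard) is conventional
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))
```

This is why the pattern saves energy: machines can be powered off aggressively overnight, then woken only when a software update actually needs to run.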

Reducing technology's carbon footprint

Coca-Cola Enterprises is taking a two-pronged approach to improving IT's carbon footprint. 

"We will put [energy] metering in our IT environment to start pulling energy consumption data from devices," says Sirjuesingh. "The software works a bit like a network sniffer. It will be able to look for and identify devices connected to the company's IP network by interrogating the device's firmware." He says the tool will work both on endpoint devices and within CCE's datacentres where required.

At the back end, he says CCE will use a hardware asset database that will contain data such as manufacturer specifications for the device, which will include the manufacturer’s stated energy data. "We could identify that a particular iPad 3, which is connected to the company's network, has been powered on for X number of hours and it will have a [known] energy consumption rate."
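The arithmetic behind that estimate is simple: hours powered on multiplied by the manufacturer's rated draw, converted to kilowatt-hours. A small sketch, where the 10 W figure is an invented placeholder rather than a measured rating for any specific device:

```python
def estimated_kwh(hours_on: float, rated_watts: float) -> float:
    # Back-of-the-envelope estimate: rated draw (from the manufacturer's
    # spec sheet in the asset database) times hours powered on, in kWh
    return hours_on * rated_watts / 1000.0

# Hypothetical example: a tablet drawing roughly 10 W, powered on for
# 8 hours a day, over a 250-day working year
daily = estimated_kwh(8, 10)
print(f"{daily:.2f} kWh/day, {daily * 250:.0f} kWh/year")
```

Multiplied across every networked device in the asset database, estimates like this give the consumption baseline CCE needs for its Carbon Reduction Commitment reporting.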

The Energy Manager Association (EMA) will also provide data to CCE to help it determine the energy consumption of devices that are not on the network. 


Raising energy efficiency awareness among staff

CCE is starting a programme this quarter to train its IT staff in energy efficiency. It is the first IT department to sign up to the Low Energy Company initiative, which was launched at the start of October.

"Broadly speaking we want to achieve an awareness of energy consumption," says Sirjuesingh. "In the first quarter of 2014 we will expand training further. We at least want to be in a position where we can train the trainer and get it right throughout the IT organisation."



The green IT revolution: A blueprint for CIOs to combat climate change

Companies and governments looking to combat climate change are turning to tech for help. AI, new technologies, and some promising tech-driven business models have raised hopes for dramatic progress.

About the authors

This article is a collaborative effort by Gerrit Becker, Luca Bennici, Anamika Bhargava, Andrea Del Miglio, Jeffrey Lewis, and Pankaj Sachdeva, representing views from McKinsey Technology.

While many organizations’ climate goals are lofty, enterprise technology leaders—CIOs, chief digital innovation officers (CDIOs), and chief technology officers (CTOs), among others—have not always succeeded at turning climate ambitions into reality. One of the biggest reasons is that hard facts and clear paths of action are scarce. Misconceptions and misinformation have clouded the picture of what CIOs and tech leaders should do.

We have done extensive analysis of where technology can have the biggest impact on reducing emissions. To start, we divided technology’s role into two primary types of activities:

  • offense—the use of technology and analytics to cut emissions by reducing (improving operational efficiency), replacing (shifting emission-generating activities to cleaner alternatives), and reusing (recycling material)
  • defense—the actions IT can take to reduce emissions from the enterprise’s technology estate

Scope of the McKinsey analysis

McKinsey’s emissions analysis for this report focuses on enterprise technology emissions: the business IT emissions from the hardware, software, IT services, enterprise communications equipment, mobile devices, fixed and mobile network services, and internal technology teams that a company uses for its own operations and that a CIO has control over. These include the emissions related to the full life cycles of the products and services that an enterprise IT function uses, including their development, delivery, usage, and end of life (exhibit). Our internal-services emissions analysis assumes around 40 percent of IT workers are working from home.

The analysis does not include the emissions from the technology products and services that a company is selling (such as data center capacity sold by hyperscalers), operational technology devices (such as sensors and point-of-sale systems), and cryptocurrency mining.

The defense activities are where the CIO, as the head of IT, can act independently and quickly. This article focuses on defense, specifically the IT elements over which a CIO has direct control. We examined emissions from the use of electricity for owned enterprise IT operations, such as the running of on-premises data centers and devices (classified as scope 2 by the Greenhouse Gas Protocol), and indirect emissions from technology devices that the CIO buys and disposes of (scope 3). Under the Greenhouse Gas Protocol (Technical Guidance for Calculating Scope 3 Emissions: Supplement to the Corporate Value Chain (Scope 3) Accounting & Reporting Standard, World Resources Institute & World Business Council for Sustainable Development, 2013), scope 1 emissions are direct emissions from the activities of an organization or under its control, including on-site fuel combustion such as gas boilers, fleet vehicles, and air-conditioning leaks; scope 2 emissions come from electricity purchased and used by the organization; and scope 3 emissions are all other indirect emissions that occur in the value chain of the reporting company, both upstream and downstream. These calculations do not include emissions from technology-driven services sold, such as cloud capacity. (See sidebar, "Scope of the McKinsey analysis.")

What the facts say

Our analysis has uncovered several facts that contravene some commonly held views about enterprise technology emissions. These facts involve the significant amount of tech-related emissions, the share of emissions from end-user devices, the variety of mitigation options available, and the favorable impact of shifting to cloud computing.

Enterprise technology generates significant emissions

Enterprise technology is responsible for emitting about 350 to 400 megatons of carbon dioxide equivalent gases (CO₂e), accounting for about 1 percent of total global greenhouse gas (GHG) emissions. At first blush, this might not seem like a lot, but it equals about half of the emissions from aviation or shipping and is the equivalent of the total carbon emitted by the United Kingdom.

The industry sector that contributes the largest share of technology-related scope 2 and scope 3 GHG emissions is communications, media, and services (Exhibit 1). Enterprise technology’s contribution to total emissions is especially high for insurance (45 percent of total scope 2 emissions) and for banking and investment services (36 percent).

This amount of carbon dioxide and equivalent gases is a significant prize for companies under increasing pressure to cut emissions. Progress on climate change requires action on many fronts, and enterprise technology offers an important option that CIOs and companies can act on quickly.


The biggest carbon culprit is end-user devices, not on-premises data centers

End-user devices (laptops, tablets, smartphones, and printers) generate 1.5 to 2.0 times more carbon globally than data centers (Exhibit 2); this counts on-premises and co-located data centers used by enterprises, not data center capacity sold by hyperscalers. One reason is that companies have significantly more end-user devices than servers in on-premises data centers. In addition, the devices are typically replaced much more often: smartphones have an average refresh cycle of two years, laptops four years, and printers five years. Servers, on average, are replaced every five years, though 19 percent of organizations wait longer (Rhona Ascierto and Andy Lawrence, Uptime Institute global data center survey 2020, Uptime Institute, July 2020).

More worrisome, emissions from end-user devices are on track to grow at a compound annual rate of 12.8 percent (End-user computing market: Growth, trends, COVID-19 impact, and forecasts (2022–2027), Mordor Intelligence, January 2022). Efforts to address this could target the major causes of emissions from these devices: about three-fourths of the emissions come from manufacturing, upstream transportation, and disposal, and a significant share of that traces back to the semiconductors that power the devices.
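A 12.8 percent compound annual growth rate adds up quickly; a minimal sketch of the projection (the 100 Mt baseline is illustrative, not a figure from the report):

```python
def project_emissions(baseline_mt: float, cagr: float, years: int) -> float:
    """Project emissions (megatons CO2e) forward at a compound annual growth rate."""
    return baseline_mt * (1 + cagr) ** years

# At 12.8 percent a year, emissions roughly double within six years
print(project_emissions(100, 0.128, 6))  # ~206 Mt from a 100 Mt baseline
```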

Plenty of low-cost/high-impact options exist, starting with improved sourcing

We have found that when it comes to going green, many CIOs think in terms of investments needed to replace items or upgrade facilities. Our analysis, however, finds that CIOs can capture significant carbon benefits without making a significant investment—and in some cases can even save money (Exhibit 3).

For example, 50 to 60 percent of emissions related to end-user devices can be addressed through sourcing changes, primarily by procuring fewer devices per person and extending the life cycle of each device through recycling. These options require no investment and will lower costs, though companies may want to evaluate the impact on employee experience.

In addition, companies can more aggressively recycle their devices: 89 percent of organizations recycle less than 10 percent of their hardware overall (Sustainable IT: Why it's time for a green revolution for your organization's IT, Capgemini Research Institute, 2021). CIOs can put pressure on suppliers to use greener devices, especially as companies in the semiconductor sector are already increasing their commitments to emission reduction. Further low-cost, high-impact actions include optimizing business travel and data center computing needs, as well as increasing the use of cloud to manage workloads.

Moving to cloud has more impact than optimizing data centers

Optimizing an on-premises data center's power usage effectiveness (PUE), the ratio of total facility energy to IT equipment energy, is expensive and yields limited carbon abatement. If a company were to double what it spends on infrastructure and cloud to reduce PUE, it would cut carbon emissions by only 15 to 20 percent. Structural improvements in data centers and optimized layout can help, but the impact is limited, and many companies have already implemented them. More aggressive measures, such as moving data centers to cooler locations or investing in new cooling technology, are prohibitively expensive.
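PUE itself is a simple ratio; a minimal sketch using the on-premises and hyperscale figures cited in this section:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,570 kWh to deliver 1,000 kWh of IT load has the
# average on-premises PUE of 1.57; hyperscale facilities run at 1.10 or less
print(pue(1570, 1000))  # 1.57
print(pue(1100, 1000))  # 1.1
```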

A more effective approach is to migrate workloads to the cloud. Hyperscalers (also known as cloud service providers) and co-locators are investing significantly to become greener through measures such as buying green energy themselves and building ultra-efficient data centers with a PUE of 1.10 or less, compared with the average of 1.57 for an on-premises data center ("Uptime Institute 11th annual Global Data Center Survey shows sustainability, outage, and efficiency challenges amid capacity growth," Uptime Institute, September 14, 2021). (We estimate that a company would have to spend nearly 250 percent more, on average, on its data centers and cloud presence just to bring its own data center to a PUE of 1.3.)

With thoughtful migration to and optimized usage of the cloud, companies could reduce the carbon emissions from their data centers by more than 55 percent, about 40 megatons of CO₂e worldwide, the equivalent of the total carbon emissions from Switzerland.

Three steps to take now

With companies and governments under intensifying pressure to cut carbon emissions and with technology playing a key role in delivering on those goals, CIOs will find themselves on the front lines. The challenge will be to reduce IT’s carbon footprint while delivering high-quality, low-cost technology services to customers and employees.

On average, completion of the defensive steps might take three to four years. However, CIOs who act decisively and precisely can achieve 15 to 20 percent of carbon reduction potential in the first year with minimal investment.

CIOs can choose from among a wide array of responses, particularly in conjunction with the CEO and the board. However, three measures they can take right now will prepare the organization for longer-term efforts. These measures involve sourcing strategies, key metrics, and a performance management system.


Move now on sourcing strategies.

Far and away the fastest and most effective defensive measure for reducing IT carbon emissions is to revise policies for technology sourcing. Bringing the number of devices per person in line with top-quartile standards (derived from McKinsey Digital's Ignite solutions and 2020 data) would eliminate about 30 percent of end-user-device emissions, the amount of carbon emitted by Hong Kong. For example, top-quartile companies have one printer for every 16 people in the workplace; the overall average is one printer per eight people.
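The printer example translates directly into fleet arithmetic; a sketch with a hypothetical 4,000-person workforce:

```python
import math

def devices_needed(headcount: int, people_per_device: int) -> int:
    """Devices required at a given people-per-device ratio, rounded up."""
    return math.ceil(headcount / people_per_device)

staff = 4_000
average_fleet = devices_needed(staff, 8)   # one printer per 8 people  -> 500
top_quartile = devices_needed(staff, 16)   # one printer per 16 people -> 250
print(average_fleet, top_quartile)         # the top-quartile ratio halves the fleet
```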

This sourcing shift does not necessarily lead to a degradation in user experience, because the rollout of 5G and increasingly advanced processing and compute power allow the main processing function to happen at the server. Therefore, devices can be less powerful and consume much less energy. Essentially, this is a software-as-a-service (SaaS) model where high-end and user-friendly experiences happen on the server, not the device. The effectiveness of this approach will depend on having stable networks, less resource-intensive coding at the device level, edge computing capabilities, and shifts of offerings to more efficient platforms (for example, cloud).

As part of this effort, the CIO and the business’s head of procurement will need to collaborate on reviewing and adjusting device refresh timelines and device-to-person ratios, as well as adjusting the basis for purchasing decisions. Procurement generally relies on cost/benefit calculations, and rightly so. That approach will need to expand to account for carbon dioxide emissions. The spirit of collaboration should extend to suppliers as well, with the parties working together to formulate plans that provide the greatest benefits for all.

A more thoughtful sourcing strategy extends beyond end-user devices. CIOs, for example, should look for green sources of the electricity IT uses. When these sources are unavailable, CIOs can direct procurement to power purchase agreements to offset carbon use. CIOs can also set green standards for their vendors and suppliers, requiring GHG emissions disclosures and incorporating them into their criteria for purchase decisions.

Establish a green ROI metric for technology costs

Any real progress on green technology can happen only when companies measure their “green returns.” Yet most green metrics today omit costs and savings, which makes them impractical guides for decision making. A better metric focuses on the cost per ton of carbon saved, accounting for costs saved as well. Sophisticated models calculate emissions throughout the full life cycle, including production, transportation, and disposal.
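The metric described here, net cost per ton of CO₂e abated, can be sketched as follows; the refresh-cycle scenario and its dollar figures are hypothetical:

```python
def green_roi(investment: float, cost_savings: float, tons_co2e_saved: float) -> float:
    """Net cost per ton of CO2e abated; a negative value means the measure saves money."""
    return (investment - cost_savings) / tons_co2e_saved

# Hypothetical: extending the laptop refresh cycle costs $20k in extra
# support but avoids $120k of purchases and 150 tCO2e of embodied carbon
print(round(green_roi(20_000, 120_000, 150), 1))  # -666.7 dollars per ton: a net saving
```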

CIOs can further assess suppliers, manufacturers, and service providers based on how advanced they are in recycling and refurbishing electronics; designing circular components; extending product life cycles with better design, higher-quality manufacturing, and more robust materials; offering repair services; and reselling to consumers.

Decisions about IT spending need to consider a range of factors, including technical debt abatement and business strategy. Along with these factors, companies should institutionalize a green ROI metric that is transparent to everybody in the business as an element in IT decision making, including in requests for proposals (RFPs). Doing so will enable companies to better understand the true impact their technology is having on carbon emissions.

Put in place green measurement systems

Establishing a green ROI metric is only a start. CIOs need to establish a baseline of performance, measure progress against the baseline, and track impact in near real time, much as companies track real-time computer and network usage for applications in the cloud. This kind of measuring system ensures that CIOs know what’s working and what isn’t, so they can adjust quickly.

In practice, implementing green measurement can be challenging. Some companies have spent a year measuring their carbon footprint, ending up with an outdated analysis. This tends to happen when companies are determined to measure every bit of carbon emitted, a praiseworthy but time-consuming effort. CIOs can make substantial progress by instead prioritizing measurement where the impact is highest, such as tracking the number of end-user devices purchased and in use, the current duration of use for each device, and the ratio of devices per user. Another way CIOs can make quick progress is to embed emissions- and power-monitoring capabilities into large technology assets and work with external providers, such as electricity companies, to track usage in real time.

Effectively combating climate change won’t happen through one or two big wins; those don’t exist yet. To have real impact, companies and governments will need to act in many areas. Technology has a huge role to play in many of these areas, but CIOs and tech leaders need to act quickly and decisively.

This article is the first in a series about how CIOs can reduce emissions. The next article will explore how CIOs can drive the business’s sustainability agenda by playing offense and implementing reduce, replace, and reuse levers to decarbonize.

Gerrit Becker is an associate partner in McKinsey’s Frankfurt office, Luca Bennici is an associate partner in the Dubai office, Anamika Bhargava is a consultant in the Toronto office, Andrea Del Miglio is a senior partner in the Milan office, Jeffrey Lewis is a senior partner in the New Jersey office, and Pankaj Sachdeva is a partner in the Philadelphia office.

The authors wish to thank Bernardo Betley, Arjita Bhan, Raghuvar Choppakatla, Sebastian Hoffmann, Abdelrahman Mahfouz, Tom Pütz, Jürgen Sailer, Tim Vroman, Alice Yu, and Gisella Zapata for their contributions to this article.



E3S Web of Conferences (Jan 2021)

Green Computing case study: calls for proposing solutions for the Arabian Gulf Oil Company

  • Benamer Wisam H.,
  • Elberkawi Ebitisam K.,
  • Neihum Nebras A.,
  • Anwiji Anwiji S.,
  • Youns Moatz A.


Technology is now part of everyday life, easing the running of households and institutions alike. In Libya, the number of newly established institutions is rising in both the private and public sectors, and with that growth comes a corresponding increase in the use of electronic devices. These devices eventually become obsolete, however, at which point a suitable green computing strategy must be considered. This study aims to identify ways to reduce electricity consumption, a key element of green computing, and to promote the understanding of green computing in institutions. It also seeks to raise awareness of the importance of green computing by introducing the concept to companies through a field study. The study was conducted as a trial at one of the largest companies in Libya, the Arabian Gulf Oil Company (AGOCO). From it we derived a set of recommendations, classified as general and specific. General recommendations address the concept of green computing and the steps needed to apply it correctly. Specific recommendations reflect the company's plans to improve power saving through green computing practices suited to its daily business.

  • green computing
  • environment
  • sustainable energy
  • energy-aware
  • eco-friendly computing


The impact of green transformation in data centers on corporate carbon emission reduction: empirical evidence from China

  • Published: 02 September 2024


  • Zhixiang Yin &
  • Haisen Wang (ORCID: orcid.org/0000-0002-9912-8439)

Carbon emission reduction is a critical objective for enhancing ecological and environmental quality. The shift toward green and sustainable practices is becoming increasingly central to the future development of data centers. Despite its importance, few studies have examined the impact of green data centers on carbon emissions. Using the district-county-level green data center pilots in China as a natural experiment, this paper explores how the green transformation of data centers affects corporate carbon emission reduction, and through what mechanisms, by applying a high-dimensional fixed-effects model to panel data on A-share listed companies in Shanghai and Shenzhen, China, from 2008 to 2021. The findings reveal: (1) The green transformation of data centers significantly promotes corporate carbon emission reduction. This result is robust, persisting even after adjusting for the influence of other policies and benchmark variables. (2) Green transformation substantially enhances the level of breakthrough innovation within enterprises, which in turn significantly reduces their carbon emissions. Additionally, the level of green concern within a company positively moderates the relationship between green transformation and carbon emission reduction. (3) Heterogeneity analysis indicates that the effects of data center green transformation on carbon emissions vary significantly between central and western regions and non-interprovincial border areas. This research provides empirical evidence and policy recommendations to help developing countries balance economic development with carbon emission reduction objectives.

This is a preview of subscription content, log in via an institution to check access.

Access this article

Subscribe and save.

  • Get 10 units per month
  • Download Article/Chapter or eBook
  • 1 Unit = 1 Article or 1 Chapter
  • Cancel anytime

Price includes VAT (Russian Federation)

Instant access to the full article PDF.

Rent this article via DeepDyve

Institutional subscriptions

Similar content being viewed by others

case study on green computing

Environmental protection tax and corporate carbon emissions in China: a perspective of green innovation

case study on green computing

Can carbon emission trading improve corporate sustainability? An analysis of green path and value transformation effect of pilot policy

case study on green computing

What drives the green transformation of enterprises? A case of carbon emissions trading pilot policy in China

Data availability.

Some or all data, models, and/or code generated or used during the study are available from the corresponding author by request.

Abbreviations

The carbon emissions of enterprises

The degree of green transformation in data centers

The scale of enterprises

The asset liability ratio of enterprises

The growth rate of enterprises’ operating revenue

The profitability of enterprises

The liquidity of enterprises

The cash level of enterprises

The age of enterprises

The investment proportion of enterprises

The nature of enterprises

The consistency between the registered location and office location of enterprises

The capital intensity of enterprises

The concentration of equity in enterprises

The breakthrough innovation level of enterprises

The green concern of enterprises

A set of policy dummy binary variables

A set of urban benchmark virtual binary variables

District-county fixed effects

Industry fixed effects

Year fixed effects

Random error terms


Zhu, H. Y., Zhang, D. D., Goh, H. H., et al. (2023). Future data center energy-conservation and emission-reduction technologies in the context of smart and low-carbon city construction[J]. Sustainable Cities and Society , 89 , 104322.

Zhu, Y. P., Hu, Y., & Zhu, Y. (2024). Can China’s energy policies achieve the dual carbon goal? A multi-dimensional analysis based on policy text tools[J] . Environment, Development and Sustainability. https://doi.org/10.1007/s10668-024-05190-4

Download references

Acknowledgements

We would like to thank the editor and the anonymous referees for their valuable comments on this paper.

This work was supported by the Major Program of National Fund of Philosophy and Social Science of China (CN) (grant number 23&ZD222).

Author information

Authors and affiliations

School of Law, Humanities and Sociology, Wuhan University of Technology, Wuhan, 430070, China

Zhixiang Yin

School of Information Management, Wuhan University, Wuhan, 430072, China

Haisen Wang

Centre for Studies of Information Resources, Wuhan University, Wuhan, 430072, China


Corresponding author

Correspondence to Haisen Wang.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could appear to have influenced the work reported in this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Yin, Z., Wang, H. The impact of green transformation in data centers on corporate carbon emission reduction: empirical evidence from China. Environ Dev Sustain (2024). https://doi.org/10.1007/s10668-024-05372-0


Received : 30 April 2024

Accepted : 27 August 2024

Published : 02 September 2024

DOI : https://doi.org/10.1007/s10668-024-05372-0


Keywords

  • Green data centers
  • Corporate carbon reduction
  • Breakthrough innovations
  • Green concern

