Research shows that workloads run on AWS can be up to 4.1 times more efficient than the same workloads run on-premises.
Artificial intelligence (AI) is rapidly transforming how we use technology to address some of the world's biggest challenges, from health care to climate change. As we scale our use of AI, it's important to also minimize its environmental footprint. A new study commissioned by Amazon Web Services (AWS) and conducted by Accenture shows that an effective way to do that is by moving IT workloads from on-premises infrastructure to AWS data centres around the globe.
The report, "Moving onto The AWS Cloud Reduces Carbon Emissions," estimates that AWS's infrastructure is up to 4.1 times more efficient than on-premises infrastructure, and that when workloads are optimized on AWS, the associated carbon footprint can be reduced by up to 99%. On-premises refers to organizations running hardware and software within their own physical space; 85% of global IT spend remains on-premises.
"AWS's holistic approach to efficiency helps to minimize both energy and water consumption in our data centre operations, contributing to our ability to better serve our customers," said Chris Walker, director of sustainability at AWS. "We are constantly working on ways to increase the energy efficiency of our facilities - optimizing our data centre design, investing in purpose-built chips, and innovating with new cooling technologies.
As AWS takes steps toward reaching Amazon's net-zero carbon by 2040, as part of The Climate Pledge, we will continuously innovate and implement ways to increase the energy efficiency across our facilities in an effort to build a brighter future for our planet."
AWS customers have experienced the efficiency benefits of moving to, and developing solutions on, AWS for several years. For example, global genomics and human-health company Illumina saw an 89% reduction in carbon emissions by moving to AWS. This type of efficiency gain from using AWS rather than maintaining on-premises IT infrastructure is expected to become more pronounced as the world increasingly adopts AI.
That is because as AI workloads become more complex and data-intensive, they require new levels of performance from systems that complete millions of calculations every second, along with supporting memory, storage, and networking infrastructure. All of this consumes energy and has a corresponding carbon footprint.
While on-premises data centres struggle to keep pace due to their inherent limitations in driving scalability and energy efficiency, AWS is continuously innovating to make the cloud the most efficient way to run our customers' infrastructure and businesses.
"This research shows that AWS's focus on hardware and cooling efficiency, carbon-free energy, purpose-built silicon, and optimized storage can help organizations reduce the carbon footprint of AI and machine learning workloads," said Sanjay Podder, global lead for Technology Sustainability Innovation at Accenture. "As the demand for AI continues to grow, sustainability through technology can play a crucial role in helping businesses meet environmental goals while driving innovation."
Industry-leading standard used to quantify efficiency and estimated carbon reduction
The research quantified the energy efficiency and carbon reduction opportunity of moving customer workloads from on-premises to AWS by simulating and analyzing the differences between them. A workload is a collection of resources and code to accomplish tasks such as running a retail website or managing inventory databases.
Accenture used the International Organization for Standardization (ISO) Software Carbon Intensity (SCI) standard to analyze the carbon footprint of representative storage-heavy and compute-heavy workloads, and went beyond the baseline methodology by considering the effect of carbon-free energy for both on-premises and AWS environments. This is one of the first times a hyperscale cloud provider has used the SCI specification to perform this type of analysis.
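To give a sense of what an SCI-based comparison involves, the sketch below works through the published SCI formula, SCI = (E × I + M) per R, where E is the energy a workload consumes, I is the carbon intensity of the electricity used, M is the share of embodied hardware emissions attributed to the workload, and R is the functional unit (for example, a request served). The workload figures are hypothetical illustrations only, not values from the Accenture study.

```python
# Illustrative sketch of a Software Carbon Intensity (SCI) calculation,
# SCI = (E * I + M) / R, per the ISO SCI specification.
# All numbers below are hypothetical and chosen only to show the mechanics.

def sci(energy_kwh: float,
        carbon_intensity_g_per_kwh: float,
        embodied_gco2e: float,
        functional_units: float) -> float:
    """Return grams of CO2-equivalent per functional unit (e.g. per request)."""
    operational = energy_kwh * carbon_intensity_g_per_kwh      # E * I
    return (operational + embodied_gco2e) / functional_units   # (E*I + M) / R

# Hypothetical storage-heavy workload serving 1 million requests:
on_prem = sci(energy_kwh=120.0, carbon_intensity_g_per_kwh=400.0,
              embodied_gco2e=15_000.0, functional_units=1_000_000)
cloud = sci(energy_kwh=30.0, carbon_intensity_g_per_kwh=50.0,
            embodied_gco2e=4_000.0, functional_units=1_000_000)

print(f"on-premises: {on_prem:.4f} gCO2e per request")
print(f"cloud:       {cloud:.4f} gCO2e per request")
print(f"reduction:   {100 * (1 - cloud / on_prem):.1f}%")
```

The example shows why both energy efficiency and carbon-free energy matter in the comparison: lowering E reduces the operational term, while a cleaner grid or carbon-free energy lowers I, and both shrink the emissions attributed to each unit of useful work.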