IT Sustainability Think Tank: How CIOs can navigate the green IT landscape

Regulations in place or pending require companies to report their Scope 1, 2 and 3 greenhouse gas (GHG) emissions. To achieve this, organisations can adopt an approach built on three broad avenues: data collection, measurement standards and data analytics.
How data is collected will depend on the hosting options employed. Typically, enterprises have applications hosted both in on-premises datacentres and in hyperscale cloud facilities – and these two environments have very different reporting demands.
In simple terms, reporting electricity consumption from datacentres requires only a sub-meter to distinguish server room consumption from that occurring in attached offices. 
Going further, installing smart power distribution units (PDUs) provides an understanding of emissions hotspots, which is vital when targeting emissions reduction activities.
Once this data is collected, converting electricity consumption into carbon emissions requires only knowledge of the local power generation mix and any power purchase agreements the company holds. Some datacentre infrastructure management (DCIM) systems have these capabilities built in.
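To make that conversion concrete, here is a minimal sketch in Python. The grid emission factor and the share of consumption covered by a power purchase agreement (PPA) are illustrative assumptions, not published figures – use your grid operator's or supplier's data.

```python
# Minimal sketch: converting metered electricity (kWh) into carbon
# emissions (kgCO2e) using a grid emission factor. Both constants
# below are assumptions for illustration only.

GRID_FACTOR_KG_PER_KWH = 0.21   # national grid average (assumed)
PPA_COVERED_FRACTION = 0.40     # share of consumption matched by a PPA (assumed)

def electricity_to_co2e(kwh: float) -> float:
    """Market-based estimate: PPA-covered energy counts as zero-carbon."""
    residual_kwh = kwh * (1 - PPA_COVERED_FRACTION)
    return residual_kwh * GRID_FACTOR_KG_PER_KWH

if __name__ == "__main__":
    monthly_kwh = 120_000  # sub-metered server room consumption
    print(f"{electricity_to_co2e(monthly_kwh):,.0f} kgCO2e")  # 15,120 kgCO2e
```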
A commonly used measure of datacentre efficiency is Power Usage Effectiveness (PUE): the ratio of total datacentre power consumption to the power consumed by computing equipment. It is intended to expose the overhead of air conditioning and other supporting systems – a PUE of 1.0 would mean every watt goes to IT.
It should be acknowledged and accepted that reducing consumption from computing alone will worsen (increase) the PUE figure, because the fixed facility overhead is spread over a smaller IT load – but that is not necessarily a bad thing, since total energy use still falls.
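The arithmetic is easy to see in a short sketch (all figures are illustrative assumptions):

```python
# Sketch of the PUE calculation and why cutting IT load alone can
# raise it. Numbers are illustrative assumptions.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it, overhead = 800.0, 400.0          # kW; overhead = cooling, UPS losses, etc.
print(pue(it + overhead, it))        # 1.50

# Consolidating servers cuts IT load by 25%, but fixed overhead barely moves:
it2, overhead2 = 600.0, 380.0
print(pue(it2 + overhead2, it2))     # ~1.63 - worse PUE, yet total
                                     # draw fell from 1,200 kW to 980 kW
```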
Of course, dedicated datacentres mean purchasing dedicated hardware, so the emissions from manufacturing that hardware – its embodied carbon – also need to be accounted for. For that, check the supplier's product carbon footprint specifications.
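As a rough illustration, embodied emissions are typically amortised over the hardware's service life. The per-server figure in the sketch below is a hypothetical placeholder for the value a supplier's documentation would provide.

```python
# Sketch: spreading a server's manufacturing ("embodied") emissions
# over its service life. The embodied figure is an assumption for
# illustration - take the real value from the supplier's product
# carbon footprint documentation.

EMBODIED_KG_CO2E = 1_300.0   # assumed manufacturing emissions per server
SERVICE_LIFE_YEARS = 5

annual_embodied = EMBODIED_KG_CO2E / SERVICE_LIFE_YEARS
print(f"{annual_embodied:.0f} kgCO2e/year per server")  # 260
```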
The hyperscale clouds – Microsoft Azure, Amazon Web Services (AWS) and Google Cloud – all offer carbon footprint tools that can paint a picture of the emissions generated by an enterprise's cloud footprint.
Indeed, Google go a step further by letting users compare the energy mix of their regions on a region-by-region basis. Using this information, organisations can reduce their emissions simply by hosting in a region with a greener grid.
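A minimal sketch of that comparison is shown below; the region names and carbon-intensity figures are hypothetical placeholders for the per-region data a provider publishes.

```python
# Sketch: picking a hosting region by grid carbon intensity.
# The regions and gCO2e/kWh figures are hypothetical placeholders;
# substitute the per-region data your cloud provider publishes.

regions = {
    "europe-north": 60,    # hydro/wind-heavy grid (assumed)
    "europe-west": 220,
    "us-east": 380,
}

greenest = min(regions, key=regions.get)
print(f"Lowest-carbon region: {greenest} ({regions[greenest]} gCO2e/kWh)")
```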
In parallel with identifying which tools to use, enterprises also need to be clear about which measurement standards to report against. The GHG Protocol is the de facto standard for emissions reporting and is what regulatory reporting requires.
It uses Global Warming Potential (GWP) values for different gases (methane, refrigerants from air conditioning, and so on) to calculate carbon dioxide equivalence (CO2e). This allows comparisons between, for example, an on-premises, hardware-based offering and Software-as-a-Service (SaaS).
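A worked example of that conversion, using 100-year GWP values in the style of those published by the IPCC (check which factors your reporting framework mandates):

```python
# Worked example: converting different gases to CO2-equivalent using
# 100-year GWP factors. Values here follow commonly cited IPCC AR5
# figures; the emission amounts are illustrative.

GWP_100 = {"CO2": 1, "CH4": 28, "R134a": 1300}

emissions_kg = {"CO2": 10_000, "CH4": 5, "R134a": 2}

co2e = sum(kg * GWP_100[gas] for gas, kg in emissions_kg.items())
print(f"{co2e:,.0f} kgCO2e")  # 10,000 + 140 + 2,600 = 12,740
```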
While regulatory reporting leaves little room for imaginative formatting, it is vital to bring the information to life for team discussions.
This needs the same tools and skills used to visualise data elsewhere in the business. Dashboards of graphs, diagrams and tables help everyone see both what the emissions are and the context they originate from, making mitigation strategies easier to formulate: relative sizes can be compared, and emissions from less mission-critical services can be addressed more ruthlessly.
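Even a minimal aggregation makes those comparisons possible. The sketch below uses hypothetical service names and figures that would, in practice, feed a proper dashboard tool.

```python
# Minimal sketch: ranking emissions by service so relative sizes
# can be compared at a glance. Names and figures are hypothetical.

per_service = [
    ("payments", "critical", 4_200),
    ("analytics", "internal", 6_900),
    ("archive", "internal", 1_100),
]

for name, tier, kg in sorted(per_service, key=lambda r: -r[2]):
    print(f"{name:<10} {tier:<9} {kg:>6,} kgCO2e")
```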
Quantifying emissions is necessary for regulatory reporting, as described above, but it can also deliver material benefits: it drives emissions reduction activities that lead to improved customer perception and cost avoidance.