How much do data centers charge per kWh?
Industrial electricity rates vary considerably from state to state; California, for example, averages 14.47 cents per kWh, which ranks among the highest in the country.
State Rankings based on Industrial Electricity Rates
| Rank | State | Average Industrial Electricity Rate (cents per kWh) |
|---|---|---|
| 47 | Connecticut | 13.76 |
| 48 | California | 14.47 |
| 49 | Rhode Island | 14.85 |
| 50 | Alaska | 16.94 |
How many kWh does a data center use?
Keeping data centers running continuously and without interruption requires a great deal of electricity. According to one report, the data center industry as a whole uses over 90 billion kilowatt-hours of electricity annually, roughly the output of 34 coal-fired power plants.
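As a rough sanity check on that equivalence, here is a short Python sketch; the 500 MW plant size matches the figure quoted later in this article, while the ~60% capacity factor is an assumption for illustration, not a number from the report.

```python
# Rough sanity check: how many 500 MW coal plants would it take to
# generate ~90 billion kWh per year? (The ~60% capacity factor is an
# illustrative assumption, not a figure from this article.)

annual_use_kwh = 90e9          # ~90 billion kWh per year for the industry
plant_capacity_mw = 500        # plant size quoted later in this article
capacity_factor = 0.60         # assumed average utilization

plant_output_kwh = plant_capacity_mw * 1000 * 8760 * capacity_factor
plants_needed = annual_use_kwh / plant_output_kwh

print(f"Equivalent plants: {plants_needed:.1f}")  # roughly 34
```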
How much does a rack in a data center cost?
Rack space generally works out to about $1,000 to $1,500 per month per rack, though it can be cheaper or more expensive depending on power density.
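One way to see how power density moves that figure is a per-kW pricing sketch; the $/kW/month rate below is hypothetical, not a quoted price.

```python
# Hypothetical per-kW colocation pricing: monthly rack cost scales
# with the power provisioned to the rack, not just the floor space.
price_per_kw_month = 250.0     # assumed $/kW/month, illustrative only

for rack_kw in (4, 6, 8):      # low, medium, and high density racks
    monthly_cost = rack_kw * price_per_kw_month
    print(f"{rack_kw} kW rack: ~${monthly_cost:,.0f}/month")
```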
How much electricity does a data center use per day?
Globally, data centers were estimated to use between 196 terawatt-hours (TWh) (Masanet et al., 2020) and 400 TWh (Hintemann, 2020) in 2020, which works out to roughly 0.5 to 1.1 TWh per day. That is roughly 1-2% of global electricity demand.
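A quick sketch of how those annual figures translate into daily usage and a share of global demand; the ~25,000 TWh figure for worldwide electricity consumption in 2020 is an assumed round number, not one cited above.

```python
# Convert the annual estimates into per-day usage and a share of
# global demand. The ~25,000 TWh world total is an assumed round
# figure for 2020, not a number cited in this article.
low_twh, high_twh = 196, 400
world_demand_twh = 25_000      # assumed global electricity use, 2020

print(f"Per day: {low_twh / 365:.2f} to {high_twh / 365:.2f} TWh")
print(f"Share of demand: {low_twh / world_demand_twh:.1%} "
      f"to {high_twh / world_demand_twh:.1%}")
```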
How much does it cost to build a data center per square foot?
The average-powered base building of a data center facility (defined here as the foundation, four walls, and roof, along with a transformer and common areas such as security, loading dock, restrooms, and corridors) typically costs from $125 per square foot to upwards of $200 per square foot.
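Scaled to a hypothetical footprint, the range looks like this (the 100,000 sq ft facility size is purely illustrative):

```python
# Scale the quoted shell-cost range to a hypothetical footprint.
# The 100,000 sq ft facility size is an assumption for illustration.
low_cost_per_sqft = 125
high_cost_per_sqft = 200
facility_sqft = 100_000        # hypothetical footprint

print(f"Base building cost: ${low_cost_per_sqft * facility_sqft:,} "
      f"to ${high_cost_per_sqft * facility_sqft:,}")
# -> $12,500,000 to $20,000,000 for the shell alone (no IT fit-out)
```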
How much power does a 42U rack consume?
As many as 60 blade servers can be placed in a standard-height 42U rack. However, this condensed computing comes at a power price: the typical power demand (power and cooling) for this configuration is more than 4,000 watts, compared to about 2,500 watts for a full rack of 1U servers.
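To put those wattages in dollar terms, here is a small sketch that converts a rack's continuous draw into a monthly energy cost, using the $0.0733/kWh U.S. average rate quoted in the next section and an approximate 730 hours per month.

```python
# Translate a rack's continuous power draw into monthly energy cost.
# The $0.0733/kWh rate is the U.S. industrial average cited later in
# this article; 730 hours/month is a standard approximation.

def monthly_energy_cost(watts: float, rate_per_kwh: float = 0.0733) -> float:
    """Monthly cost in dollars of a load drawn continuously."""
    kwh_per_month = watts / 1000 * 730   # ~730 hours in a month
    return kwh_per_month * rate_per_kwh

print(f"Blade rack (4,000 W): ${monthly_energy_cost(4000):,.2f}/month")
print(f"1U rack   (2,500 W): ${monthly_energy_cost(2500):,.2f}/month")
```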
How much does it cost to run a data center?
The U.S. average rate is $0.0733/kWh, and the EIA report tracks rates state-by-state rather than by utility. For a large data center user in the continental U.S., that significant delta between the highest and lowest rates for the same power translates to a difference of roughly $878,000 annually.
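The report's state-by-state breakdown is not reproduced here, but a sketch with illustrative inputs shows how a rate spread compounds into a six-figure annual difference; the 1 MW continuous load is a hypothetical assumption, and the two rates are the U.S. average and the Alaska figure from the table above.

```python
# Illustrative only: how a gap in $/kWh rates compounds over a year
# for a continuously running load. The 1 MW load is an assumption,
# not the exact scenario from the EIA comparison the article cites.
it_load_kw = 1_000                      # hypothetical 1 MW continuous load
hours_per_year = 8_760
low_rate, high_rate = 0.0733, 0.1694    # U.S. average vs. Alaska, per the table above

annual_kwh = it_load_kw * hours_per_year
delta = annual_kwh * (high_rate - low_rate)
print(f"Annual cost difference: ${delta:,.0f}")   # on the order of $800k or more
```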
How much electricity does it take to power a data center?
According to Computerworld, “It takes 34 power plants, each capable of generating 500 megawatts of electricity, to power all of the data centers in operation today.” Knowing how to calculate and forecast power utilization and costs is important.
What are the different pricing models for data center power?
There are two primary pricing models for data center power – metered and unmetered. The most common pricing model is referred to as metered power. In this model, you are charged for the power you use similar to how a utility company charges a residential customer.
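Here is a minimal sketch of what a metered-power line item might look like, assuming a simple pass-through rate plus a provider markup; the usage, rate, and 20% markup below are all hypothetical.

```python
# Minimal sketch of metered-power billing under stated assumptions:
# the provider meters the customer's actual kWh and applies a
# per-kWh rate plus a markup. All figures below are hypothetical.
metered_kwh = 25_000          # kWh drawn by the customer this month
utility_rate = 0.0733         # $/kWh passed through from the utility
provider_markup = 0.20        # assumed 20% markup for cooling/infrastructure

bill = metered_kwh * utility_rate * (1 + provider_markup)
print(f"Metered power charge: ${bill:,.2f}")
```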
What is a megawatt data center?
In the data center industry, megawatts are reserved for wholesale colocation customers that require enough power for thousands of servers and related IT hardware. For large wholesale colocation opportunities, pricing is based on wholesale electricity rates plus a markup to recapture buildout, infrastructure, and cooling costs.