Why do data centers use so much energy?




    To start, computers use energy even when they are idle: circuits must stay powered just to be able to respond. The more work a computer does, the more electricity it uses.

    But data centers have additional power requirements that home users typically do not have.

    1) Data centers need to respond quickly to high loads. At home, if your computer is taking too long, you can just take a sip of coffee — or make a fresh pot. But data centers don’t have that luxury: their users expect consistent response times regardless of the load other users are placing on the system. That means data centers need a substantial amount of computing power that, much of the time, isn’t being used.

    2) Data centers need to have high reliability. Generally that means that there is “hot swappable” equipment that can be brought online in an instant — with all the same data as the equipment that failed. All that standby equipment takes power.

    3) Air conditioning. Nearly all the electricity a computer draws ends up as heat, so data centers need to run major air conditioning. And the ideal temperature for running computers is lower than most people find comfortable.

    4) Power conditioning. Most utility power has all kinds of minor problems, constantly: brown-outs, surges, spikes, phase problems, and so on. A major data center cannot afford to be affected by these, so it typically uses power-hungry line conditioners, uninterruptible power supplies, and the like. All of these use energy — energy that the air conditioners then have to get rid of, too!
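    To see how these overheads compound, here is a minimal back-of-envelope sketch. All figures (a 40% cooling overhead, a 10% conditioning loss, a 1 MW IT load) are illustrative assumptions, not measurements from any real facility; the key point is that conditioning losses also become heat the cooling must remove.

    ```python
    def facility_power_kw(it_load_kw: float,
                          cooling_overhead: float = 0.40,
                          conditioning_overhead: float = 0.10) -> float:
        """Total facility draw: IT load plus conditioning plus cooling.

        Overheads are fractions of the load they act on. Conditioning
        losses turn into heat, so cooling is applied to the IT load
        *and* the conditioning losses.
        """
        conditioning_kw = it_load_kw * conditioning_overhead
        heat_to_remove_kw = it_load_kw + conditioning_kw
        cooling_kw = heat_to_remove_kw * cooling_overhead
        return it_load_kw + conditioning_kw + cooling_kw

    if __name__ == "__main__":
        it = 1000.0  # hypothetical 1 MW of servers
        total = facility_power_kw(it)
        print(f"IT load: {it:.0f} kW, facility total: {total:.0f} kW")
        # The ratio of total to IT power is the industry's PUE metric.
        print(f"Effective PUE: {total / it:.2f}")
    ```

    With these assumed numbers, a 1,000 kW server load drives roughly 1,540 kW of total draw — the overheads add about half again on top of the computing itself.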
