15 Measures and Tips That Can Help You Improve Server Room and Data Centre Energy Efficiency and Consumption
Energy and power consumption in the data centre or server room can often be reduced through straightforward remediation. Start by examining the following 15 factors.
Energy Efficiency and Consumption Metric
The most commonly used metric for the energy efficiency of a data centre is Power Usage Effectiveness (PUE). This simple ratio is the total power entering the data centre divided by the power used by the IT equipment.
PUE = Total Facility Power / IT Equipment Power
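The ratio above can be expressed as a small helper; a minimal sketch (the 150 kW and 100 kW figures are illustrative, not from any real facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the facility reaches the
    IT equipment; real facilities are always above 1.0.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Example: 150 kW enters the facility, 100 kW reaches the IT load.
print(round(pue(150.0, 100.0), 2))  # 1.5
```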
Green data centres
Data centres use a lot of power, split between two main uses: the power required to run the actual equipment and the power required to cool it. The first category is addressed by designing computers and storage systems that are increasingly power-efficient.
To bring down cooling costs, data centre designers try to use natural ways to cool the equipment. Many data centres are located near good fibre connectivity, power grid connections, and concentrations of people who can manage the equipment; but there are also cases where a data centre can be miles away from its users and needs little local management. Examples are the ‘mass’ data centres run by Google or Facebook: these are built around many standardised servers and storage arrays, while the actual users of the systems are located all around the world. After the initial build, the staff required to keep a data centre running is often relatively small, so facilities that provide mass storage or computing power do not need to be near population centres. Data centres in arctic locations, where outside air provides all of the cooling, are becoming more popular, since cooling and electricity are the two main variable cost components.
How to Improve the Energy Efficiency of Server Rooms and Data Centres
Is there a ghost in your IT closet? If your building has one or more IT rooms or closets containing between 5 and 50 servers, chances are that they account for a significant share of the building’s energy use (in some cases, over half!). Servers, data storage arrays, networking equipment, and the cooling and power conditioning that support them tend to draw large amounts of energy 24/7, in many cases using more energy annually than traditional building loads such as HVAC and lighting. The good news is that there are many cost-effective actions, ranging from simple to advanced, that can dramatically reduce that energy use, helping you to save money and reduce pollution.
1- Unused Servers in Data Centre
Determine computational functions and turn off any unused servers. An Uptime Institute survey suggests that close to 30% of servers in data centres are consuming power but not doing any useful work. To better manage server usage and utilization, create and regularly update a server hardware and application inventory that lets you track the number of applications running on each server. Mapping applications to the physical servers on which they run helps identify unused servers and opportunities for consolidation. Just make sure to migrate any remaining data or workloads before shutting down.
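The application-to-server mapping described above can start as something very simple. A minimal sketch, in which the server names and application lists are hypothetical examples:

```python
# Hypothetical server -> applications inventory. In practice this would be
# populated from a CMDB or asset-management export.
inventory = {
    "srv-01": ["payroll", "hr-portal"],
    "srv-02": ["intranet"],
    "srv-03": [],  # no applications mapped: shutdown/consolidation candidate
    "srv-04": [],
}

# Servers with no mapped applications are candidates for shutdown
# (after confirming no remaining data or workloads on them).
unused = [name for name, apps in inventory.items() if not apps]
print(unused)  # ['srv-03', 'srv-04']
```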
2- Raise Temperature Setpoints
Increase temperature setpoints to the high end of ASHRAE’s recommended limit. ASHRAE temperature guidelines allow much broader operating ranges than those commonly used, allowing the air temperature at the IT equipment inlet to be raised—up to 80ºF or higher—which considerably reduces cooling energy usage.
3- Power Backup Requirements
Examine power backup requirements (do you really need UPS equipment, and if so, how much is enough?). Many IT applications are not so critical that they cannot be shut down during a power disturbance and restarted without adverse effects. Analyzing your power backup requirements can help you eliminate capital costs for unnecessary or oversized redundant power supplies or Uninterruptible Power Supply (UPS) equipment. It can also help you save the energy lost in power conversion in those devices, as well as the energy needed to cool them. Anything that needs high reliability should be a candidate for moving to a true data centre or cloud solution.
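As a rough illustration of what right-sizing can avoid, the sketch below estimates the annual overhead of carrying a full IT load through a UPS; the load, efficiency, and cooling figures are assumptions for illustration, not measurements:

```python
# Illustrative UPS conversion-loss estimate (all figures assumed).
it_load_kw = 20.0         # IT load carried through the UPS
ups_efficiency = 0.92     # assumed double-conversion UPS efficiency
cooling_factor = 1.5      # each watt of loss also needs cooling (assumed PUE-like factor)

ups_loss_kw = it_load_kw / ups_efficiency - it_load_kw  # ~1.74 kW of conversion loss
total_overhead_kw = ups_loss_kw * cooling_factor
hours_per_year = 24 * 365

annual_kwh = total_overhead_kw * hours_per_year
print(f"~{annual_kwh:.0f} kWh/year overhead from UPS losses")
```

Moving non-critical loads off the UPS, or specifying a smaller, higher-efficiency unit, shrinks that overhead directly.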
4- Cleaning of Servers and Data Centre
Monitor the dust and contamination level in your server room and provide adequate cleaning when required. A clean server runs more efficiently and has a longer lifespan. Server room cleaning is an important part of care and maintenance. Hire a specialist server room cleaning company to perform a deep clean to avoid overheating of the hardware and potential failure.
5- Airflow Management
Install blanking panels and block holes between servers in racks. Airflow management is conceptually simple and surprisingly easy to implement. Your challenge: ensure that the cool air from your cooling equipment reaches the inlet of your IT gear without mixing with the hot air coming from the back, and that hot air returning to the cooling equipment does not mix with the cold air. This can be done by clearing clutter from the airflow path and blanking openings within and between the racks, as well as openings in the floor if the gear sits on a raised floor. Containment of cold or hot aisles is a more effective approach. When good airflow management is in place, further savings can be realized through additional measures, such as raising temperature setpoints.
6 – Hardware Equipment Refresh
Refresh the oldest equipment with high-efficiency models. Establish server refresh policies that account for generation-on-generation improvements in computational ability, energy efficiency, and power manageability. The savings in energy and software costs will often justify a faster refresh than expected. Consider Energy Star, the Climate Savers Computing Initiative Server Catalog, high-temperature-tolerant servers, and high-efficiency power supplies (80 PLUS). When purchasing new equipment, consider servers with solid-state drives (SSDs) rather than hard disk drives: they are faster, generally considered more reliable, and consume less power.
7 – Move To More Energy-efficient Data Centre
Move to a more energy-efficient internal or external data centre space, or to cloud solutions. Distributed server rooms are typically not very energy-efficient. If a central data centre is available, you may be able to save energy and reduce your utility bill by moving your servers to that location. When a data centre is not available, many organisations move their equipment to co-location or cloud facilities; public and private cloud facilities both typically provide much better efficiency than on-premise server rooms. Data centres, co-location, and cloud facilities also typically offer better security and redundancy than server rooms.
8 – Improve Awareness Training
Provide energy-efficiency awareness training for IT, custodial, and facility staff. Have your IT and facilities staff attend server room energy-efficiency awareness classes offered by utility companies, ASHRAE, and other efficiency advocates, to take full advantage of best practices in that area.
9- Server Power Management
Implement server power management. Check for power management options that come with your server models and enable power management if possible. Power management saves energy, especially for applications that do not run continuously or are accessed infrequently. Power cycling can also be implemented to put servers that are unused for long periods of time in a light sleep mode. Lastly, consider built-in or add-in cards that enable servers to be powered on or off remotely when they are not in use.
10 – Consolidate and Virtualize
Consolidate and virtualize applications. Typical servers in server rooms and closets run at very low utilization levels (5-15% on average), while drawing 60-90% of their peak power. Consolidating multiple applications on a smaller number of servers accomplishes the same amount of computational work, with the same level of performance, with much lower energy consumption. Virtualization is a proven method for consolidating applications, allowing multiple applications to run in their own environments on shared servers. By increasing server utilization, this reduces both the number of servers required to run a given number of applications and overall server energy use.
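The utilization and power figures above imply large savings from consolidation; a minimal back-of-the-envelope sketch, using assumed illustrative numbers (per-server peak draw, server counts, and load fractions are all hypothetical):

```python
# Illustrative consolidation estimate: lightly loaded servers still draw a
# large fraction of peak power, so fewer, busier hosts use far less energy.
peak_w = 400          # assumed peak draw per server, watts
idle_fraction = 0.75  # fraction of peak drawn at ~10% utilization (per the 60-90% range above)
busy_fraction = 0.9   # consolidated hosts run closer to peak

n_before = 10         # physical servers before consolidation
n_after = 2           # virtualization hosts after consolidation

power_before = n_before * peak_w * idle_fraction   # 3000 W
power_after = n_after * peak_w * busy_fraction     # 720 W
saving_pct = 100 * (1 - power_after / power_before)
print(f"{power_before:.0f} W -> {power_after:.0f} W ({saving_pct:.0f}% saving)")
```

Even with the consolidated hosts drawing nearly full power, the same workload runs on roughly a quarter of the energy in this scenario.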
11- Improve Power Monitoring
Implement rack/infrastructure power monitoring. Power monitoring identifies the energy use and efficiencies of the various components in an electrical distribution system. Power meters can be installed at the panels serving the cooling units, or directly on IT and HVAC equipment. Another alternative is to read IT power from the UPS display and to estimate cooling power from the nameplate, taking into account unit efficiency and operating hours. Power distribution products often have built-in monitoring capability. A key metric is the Power Usage Effectiveness (PUE), which is the ratio of total power to IT input power (with the “overhead” being electrical distribution losses plus cooling power usage). Monitor and strive to lower your PUE: over 2 shows significant room for improvement; 1.5 is good; 1.1 is excellent.
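The thresholds above can be folded into the monitoring loop as a small classifier; a minimal sketch (the 128 kW / 80 kW readings are hypothetical examples of the UPS-plus-panel method):

```python
def rate_pue(pue: float) -> str:
    """Classify a measured PUE using the thresholds above."""
    if pue <= 1.1:
        return "excellent"
    if pue <= 1.5:
        return "good"
    if pue <= 2.0:
        return "room for improvement"
    return "significant room for improvement"

# Example: the UPS display shows 80 kW of IT load, and metering at the
# panels puts the whole room at 128 kW, giving a PUE of 1.6.
print(rate_pue(128 / 80))
```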
12 – Improve Cooling Units for Better Energy Consumption
Install variable frequency drives on cooling units. If your server room is cooled with a Computer-Room Air Handler (CRAH) or Computer-Room Air Conditioner (CRAC) unit, it is highly likely that the unit has a single-speed fan and provides more airflow than your IT equipment needs. Units with variable frequency drives (VFDs) can provide only the amount of air that the IT equipment requires. To maximize potential energy savings, coordinate the implementation of airflow management measures and airflow isolation systems with the installation of a VFD on the cooling unit fan. See item 5 for airflow management suggestions. Ideally, the fan speed should be dynamically controlled to maintain IT inlet temperature within the recommended range.
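The reason VFDs pay off so well is the fan affinity laws: fan power scales roughly with the cube of speed, so even a modest slowdown cuts power dramatically. A minimal sketch of that relationship (an idealized approximation, ignoring motor and drive losses):

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power scales roughly with the cube of speed.

    speed_fraction is the new speed as a fraction of full speed (0..1).
    Returns the approximate fraction of full-speed power drawn.
    """
    return speed_fraction ** 3

# Slowing a fan to 70% of full speed cuts its power to roughly a third.
print(round(fan_power_fraction(0.7), 2))  # 0.34
```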
13 – Rack and Row Level
Install rack- and row-level cooling. If you are installing a new server room or buying new racks, consider local cooling; in-rack and in-row cooling refer to a cooling system located in that rack or row. Another highly efficient option is a Rear Door Heat Exchanger (RDHX), in which a coil is installed directly on the rear (exhaust) section of the server rack. Condenser (Tower) water, chilled water, or refrigerant is run through the coils to passively absorb the exhaust heat and provide the needed cooling. Air circulation through the cooling coil is provided by the internal server fans.
14- Use air-side economizers
An economizer simply draws in outside air for cooling when conditions are suitable. For a server closet with exterior walls or a roof, there is a good possibility that an air-side economizer could be installed. This could take the form of an exhaust fan removing heat in one portion of the room with an opening in another location allowing cool outside air to enter, or a fan coil or CRAC/CRAH unit with air-side economizer capability. Depending on the climate zone in which the server closet is located, this strategy can save a significant amount of energy by reducing compressor cooling energy use.
15 – Dedicated Cooling System
Install dedicated cooling for the room rather than depending on building cooling, so that the building system does not have to operate around the clock just to serve it. If a retrofit is in order, installing dedicated cooling equipment (such as a packaged air-conditioning unit) for your server room(s) can result in significant energy savings. Specify a high-efficiency unit with a high SEER rating.