temperature@lert blog

  • Dawn of Solar Data Centers?


    Projects by major players point to the readiness, costs, and benefits of solar power for data centers.


    Water, water everywhere,

    And all the boards did shrink.

    Water, water everywhere,

    Nor any drop to drink.

    The Rime of the Ancient Mariner - Samuel Taylor Coleridge


    Data center managers must feel a lot like Coleridge’s Ancient Mariner when they look out the window (assuming their offices have any windows). Like the sailors on Coleridge’s voyage, data center professionals are surrounded by free power from the wind, sun, water, the earth’s heat, and biofuel, but none of it is usable in its raw form to meet the insatiable power demands of the equipment inside the vessel. Despite this challenge, there have been several interesting projects involving green energy sources. This piece in the data center energy series explores solar photovoltaics to help determine whether the technology can provide cost-effective, reliable power to data centers.


    Left: "The Albatross," an engraving by Gustave Doré for an 1876 edition of the poem, depicts 17 sailors on the deck of a wooden ship facing the albatross. Right: A statue of the Ancient Mariner, with the albatross around his neck, at Watchet, Somerset, in south west England, where the poem was written. (Link to Source - Wikipedia)


    Solar powered data centers have been in the news recently, primarily due to projects by Apple and Google. In an effort to build a green data center, Apple’s 500,000 sq. ft. site in Maiden, North Carolina is powered in part by a nearby 20-acre, 20-megawatt (MW) solar array. The site also has a 10-MW fuel cell array that uses “directed biogas” credits as the energy source. (Link to Apple Source) The remainder of the power needed for the site is purchased from the local utility, with Apple buying renewable energy credits to offset the largely coal- and nuclear-generated Duke Energy electricity. Apple sells the power from the fuel cells to the local utility in the form of Renewable Energy Credits, which are used to pay electric utility bills. Apple expects that the combination of solar photovoltaic panels and biogas fuel cells will allow the Maiden data center to run on 100% renewable energy or energy credits by the end of the year. Several lesser-known companies have also implemented solar initiatives, but the news coverage is not so widespread.


    Left: Apple’s Maiden, NC data center site plan shows the solar array in green (Link to Source - Apple); Right: Aerial photo of the site with the solar array in the foreground (Link to Source - Apple Insider)


    It will be instructive to follow reports from Apple to determine the cost-effectiveness of the company’s green approach. That being said, many if not most companies do not have the luxury of building a 20-acre solar farm next to the data center. And most have neither the cash to invest in such projects nor the corporate cachet of Apple to get them approved, so initiatives like Maiden may be few and far between. Still, there is a lot of desert land ripe for solar farms in the US Southwest. Telecommunications infrastructure may be one limitation, but California already buys a lot of its electrical power from neighboring states, so anything is possible.

    What about solar power for data centers built in more developed areas; is there any hope? Colocation provider Lifeline Data Centers announced that its existing 60,000 sq. ft. Indianapolis, Indiana site will be “largely powered by solar energy”. (Link to Source - Data Center Dynamics) In a piece titled Solar Data Center NOT “Largely Solar Powered”, author Mark Monroe, drawing on experience with his own solar panel installation, took a hard look at the numbers behind this claim. Lifeline plans to install a 4-MW utility-grade solar array on the roof and in the campus parking lot by mid-2014, and Monroe estimates how much of the data center’s power needs the array will actually meet.

    Assuming the site’s PUE is equal to the Uptime Institute’s average of 1.64, and taking into account the photovoltaic array’s operating characteristics (tilt angle, non-tracking), site factors (sun angle, cloud cover), etc., Monroe calculates that the solar installation will supply 4.7% of the site’s total energy and 12% of its overhead energy. At an industry-leading PUE of 1.1, the installation would provide 7% of the total energy and 77% of the overhead energy. Monroe notes that while these numbers are a step in the right direction, Lifeline’s claim of a data center “largely powered by solar energy” is largely not supported by the facts. His piece notes that even Apple’s Maiden site, with 20 acres of panels, generates only about 60% of the total energy needed by the site’s overhead and IT gear. Lifeline would need to add an extra 6 MW of solar capacity and operate at a PUE of 1.2 to reach Net Zero Overhead.
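
    This arithmetic is easy to reproduce. Below is a minimal sketch of it in Python; the 16% capacity factor and the 8.3 MW IT load are my own assumptions, chosen so the output lands near Monroe’s published figures, not numbers taken from his article.

        # Sketch: what share of a data center's total and overhead energy
        # a solar array covers, given its PUE. Inputs are assumptions.

        HOURS_PER_YEAR = 8760

        def solar_shares(solar_mw, capacity_factor, it_load_mw, pue):
            """Return (share of total energy, share of overhead energy)."""
            solar_mwh = solar_mw * capacity_factor * HOURS_PER_YEAR
            total_mwh = it_load_mw * pue * HOURS_PER_YEAR        # IT + overhead
            overhead_mwh = total_mwh - it_load_mw * HOURS_PER_YEAR
            return solar_mwh / total_mwh, solar_mwh / overhead_mwh

        # 4 MW array, ~16% capacity factor (fixed tilt, Midwest sun),
        # assumed 8.3 MW IT load, PUE 1.64:
        total_share, overhead_share = solar_shares(4.0, 0.16, 8.3, 1.64)
        print(f"{total_share:.1%} of total, {overhead_share:.1%} of overhead")
        # -> roughly 4.7% of total, 12.0% of overhead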

    I am curious to see hard data from these and other solar photovoltaic projects for data centers: costs, performance, and financial incentives (tax considerations, power contracts, etc.) that the industry can review to determine whether solar is the right approach for its electrical power needs. Although such disclosure is unlikely due to competitive considerations, it would greatly assist the industry in promoting green initiatives and help take the spotlight off headlines criticizing the “power hungry monster”.

    All efforts to improve industry efficiency and reduce energy consumption are steps in the right direction. Companies like Lifeline Data Centers that don’t have the deep pockets of Apple or Google are taking steps toward the goal of Net Zero Overhead. The challenge for data center operators that initiate green energy or efficiency projects will be to promote these efforts without making headline-grabbing claims that are not well supported by the data. As Launcelot Gobbo tells Old Gobbo in Shakespeare’s The Merchant of Venice, “at the length truth will out.” “Green powered” and “energy independent” are claims that need to be examined carefully to maintain industry credibility and goodwill, or “truth will out.”



  • Tips for Energy Efficiency and Optimization: Data Centers

    In the quest to become more energy efficient, or "green" as the buzzword goes, data center operators have many considerations and variables to keep on their radar. Implementing these processes can be a multi-faceted project and typically requires both engineering and management insight to accomplish a variety of goals. According to the 2011 ASHRAE Thermal Guidelines for Data Processing Environments, there are six separate classes of equipment environments. Each class is defined by two factors: the equipment present in the environment and the degree of environmental control required.

    ______________________________________________________________________________

    Class  Environment     Equipment                                     Environmental Control
    A1     Data center     Enterprise servers and storage                Tightly controlled
    A2     Data center     Volume servers, storage, PCs, workstations    Some control
    A3     Data center     Volume servers, storage, PCs, workstations    Some control
    A4     Data center     Volume servers, storage, PCs, workstations    Some control
    B      Office          PCs, workstations, laptops, printers          Minimal control
    C      Point of sale   POS equipment, computers, PDAs                No control

    ______________________________________________________________________________

     

    One important note: while classes A1-A4 may utilize the same types of equipment, the differentiation matters because the environmental specifications, including dry-bulb temperature, humidity range, dew point, and elevation, differ between classes. All of these specifications vary within classes A1-A4, as well as for B and C (Table 4, ASHRAE Guidelines).
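
    As an illustration of how these class limits might be used in practice, here is a small Python lookup. The allowable dry-bulb ranges below are the figures commonly quoted for the 2011 guidelines, but treat them as assumptions and verify them against Table 4 before relying on them operationally.

        # Commonly quoted allowable dry-bulb ranges (degrees C) for the
        # 2011 ASHRAE classes. Illustrative values, not the official table.
        ALLOWABLE_DRY_BULB_C = {
            "A1": (15, 32),   # tightly controlled data center
            "A2": (10, 35),
            "A3": (5, 40),
            "A4": (5, 45),
            "B":  (5, 35),    # office environments
            "C":  (5, 40),    # point-of-sale / uncontrolled
        }

        def in_allowable_range(ashrae_class: str, temp_c: float) -> bool:
            low, high = ALLOWABLE_DRY_BULB_C[ashrae_class]
            return low <= temp_c <= high

        print(in_allowable_range("A1", 29.5))  # True: within A1's band
        print(in_allowable_range("A1", 33.0))  # False: above A1's 32 C limit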

    After environmental conditions and hardware have been specified, operators must pay strict attention to a long list of considerations for energy optimization. The cliché "attention to detail" is relevant in this case, given the variety of options for optimization. Operators need to consider the big picture first and foremost, but cannot ignore the incremental choices that can also provide value. The following is an abridged version of the list, but provides some of the basic planning considerations for an energy efficient data center. 

    Architecture: 
    • Layout and Arrangement of the Data Center
    • Economizer Airflow Path
    • Synchronization of newer buildings with older sections
    • Economizer Choice: Water-side, Air-Side, None
    • Cooling Routines (in racks or alongside equipment, sensor location)
    Type of Data Center:
    • High Performance Computing (HPC)
    • Internet/Web Applications
    • Enterprise storage/servers
    • Financial
    Temperature and Humidity Ratings:
    • Power Distribution Equipment
    • Switches
    • Network Gear and Hardware
    • Cooling units
    • Personnel health

    As a general rule of thumb, verifying the ratings for all equipment is necessary for the larger picture, as equipment capabilities may vary significantly under different environmental conditions. If possible, establish a baseline rating for existing equipment (and future purchases) to simplify management and planning.

    As an aside, the concept of "waste heat" is particularly interesting. The idea is to apply hot server exhaust air to other processes that require a certain amount of heat, thereby making use of the supposed "waste heat". It is the same idea as recycling: building ventilation air, heating water, and a number of other internal processes require some amount of heat. By reusing waste heat, hot server air becomes an asset to other processes rather than a drawback. For further information on energy reuse, or "Energy Reuse Effectiveness" (ERE), visit TheGreenGrid.org.
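
    To make the ERE idea concrete, here is a one-line calculation following The Green Grid's definition: ERE = (total facility energy - reused energy) / IT equipment energy. The kWh figures below are invented for illustration.

        # Energy Reuse Effectiveness (ERE), per The Green Grid:
        # ERE = (total facility energy - reused energy) / IT energy.

        def ere(total_kwh: float, reused_kwh: float, it_kwh: float) -> float:
            return (total_kwh - reused_kwh) / it_kwh

        # A site drawing 1600 kWh per 1000 kWh of IT load (PUE 1.6) that
        # reuses 300 kWh of waste heat behaves like a PUE-1.3 site:
        print(ere(1600, 300, 1000))  # -> 1.3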

     For additional information on 'Green Data Centers', energy efficiency, and ASHRAE guidelines, refer to the compiled Thermal Guidelines or the ASHRAE homepage.




  • Essential Tech Check List: Building & Retrofitting Your Server Room

    Whether you're building a server room, adding on, or moving equipment, there are many considerations to mull over. From the basics to alarm systems, it is important to ensure your server room is efficient and to protect your mission critical equipment. Previously in our blog, we have addressed the issues surrounding the microclimate present in your server room; however, it is also critical to understand how a server room should be laid out and managed. Use our check list as a guide for promoting security, efficiency, and productivity:

    Our Essential Tech Check List

    (1) Your Basics of Space

    • Examine the layout of the space and how many units of space you have to work with.

    • The walls (including the ceiling) and doors should isolate the sounds your equipment creates.

    • Check which way the door opens. There should be no windows or entry points other than the doors.

    • Consider the floor and whether your equipment will need raised flooring. Aim for an anti-static floor finish to prevent unwanted static charge.

    • Make sure there is enough clearance for racks and that they are stable enough to hold your equipment.

    • Check for aisle clearance too; make sure you have enough room for exhaust to escape without over-heating nearby equipment.

    • Think about whether you need ladder racks, cabinets, shelves, patch panels, or rack mounts.

    • Take the weight and size of each piece of equipment into consideration when designing the layout.


    (2) Keeping Your Cool

    • Check what type of centralized cooling is available, whether under-floor air distribution or an air duct system.

    • If there is no centralized system available, get an air conditioner or cooling unit that can keep your equipment working productively while minimizing energy consumption and costs.

    • If at all possible, fresh air vents are great and save on energy costs and consumption!

    • Remove any and all radiators or other heating equipment currently present in the room. You don't need to add heat at all!

    • Monitor your cooling systems to make sure they are working properly, especially when no one is there (a minimal monitoring sketch follows this list).

    • Make sure your cooling units are not too close to your electrical equipment; think condensation and flooding. Do not place air conditioning units over your servers.

    • Monitor the humidity to prevent static charge and electrical shorts.

    • See if a chilled water system is in the budget, or find something within budget constraints to ensure that the hot air has somewhere to go.
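
    Here is the minimal monitoring sketch mentioned above: compare a reading against an acceptable band and flag excursions. The band and read_temperature_f() are placeholders; a real deployment would query an actual sensor and send an alert rather than print.

        # Sketch: flag temperature excursions outside an acceptable band.
        LOW_F, HIGH_F = 60.0, 80.0  # assumed acceptable band for this room

        def read_temperature_f() -> float:
            """Hypothetical stub; replace with a real sensor query."""
            return 72.4

        def check_room() -> None:
            temp = read_temperature_f()
            if temp < LOW_F or temp > HIGH_F:
                # In practice, send email/SMS here instead of printing.
                print(f"ALERT: {temp:.1f}F outside {LOW_F}-{HIGH_F}F")
            else:
                print(f"OK: {temp:.1f}F")

        check_room()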

     

    (3) Using Your Power

    • Check that you have enough outlets to power all your equipment without overloading them.

    • Get backup power, preferably a UPS, to prevent data loss from power blips or outages.

    • Don't surpass the maximum electrical intensity per unit of space.

    • Consider shutdown capabilities of equipment (SNMP traps, for example).

    • Make sure your equipment is grounded.

    • Monitor for power outages if you are not using backup power systems.

    • Monitor your backup power systems to make sure your mission critical equipment is not failing due to power loss.

     

    (4) Keeping Secure & Safe

    • Have at least one phone present in the room in case of emergencies.

    • Check for a preexisting fire alarm system and install one if there isn't one.

    • Get a fire suppression system if there is not one there. Consider whether you will have a wet or dry suppression system and the effects each will have on your equipment. (A dry, clean-agent system is the usual choice around electronics.)

    • Have reliable contacts to help resolve issues immediately, or form a system of escalation.

    • Monitor for flooding, especially if it has happened in the past.

    • Secure entrances/exits; this is expensive equipment with critical data, and you don't want just anyone in there messing around!

     

    (5) Other Considerations

    • Get the best cabling/wiring available within budget constraints.

    • Keep extra cabling/wiring around, because you never know when you may need it.

    • Consider color-coding wires/cables; a little more work now but definitely a time-saver in the future!

    • Think about lighting: location and heat produced.

    • If someone is sharing the space, get them some earplugs! It's going to be loud in there when the equipment is running.

    • Consider the networking/phone lines being run in there and how much space you have left after that.

    • Plan for future expansion or retrofitting (again).

    • Leave the service loops in the ceilings.

    • Label outlets.

    • Get rid of dust; your equipment hates it!

    • Check whether you have a rodent/pest problem.

    • Cover emergency shutoff switches so they can't be accidentally triggered.

    • Try to centralize the room in the building to avoid using more cabling/wiring than you need.

    • Meet OSHA and ASHRAE guidelines as well as local codes.


    Is your server room, or a server room you know of, not being monitored for temperature? Are you concerned about energy consumption, the ability to monitor off-hours, and/or preventing mission critical equipment from failing? If you or someone you know is experiencing such issues, we want to hear from YOU!

    We will be giving away ONE FREE USB DEVICE per month to the server room with the most need! Valued at $129.99, the Temperature@lert USB Edition is a low-cost, high-performance device that monitors the ambient temperature in your server room and alerts you via email when the temperature rises or falls outside your acceptable range.

    Please send a brief description, pictures, and/or videos to diane@temperaturealert.com for consideration! Our team will select one winner each month based on description and need, because we firmly believe that companies in every industry deserve reliable monitoring.



  • How many temperature sensors do I need?

    What is the temperature in the room you’re in right now?  Take a guess; you’ll be correct within a few degrees.  Now, what is the temperature of the room?  Don’t bother answering that; it’s a trick question, because in fact there is no “temperature of the room”.  The temperature of the room is a 3D matrix that likely varies by up to 3°C (5.4°F) from one point to another.

    In spaces as different as commercial refrigerators and data centers, temperature differences can be even greater.  Computer modeling demonstrates how, in a data center, server racks can be cool at the bottom and hot near the top.  Commercial refrigerators can have very cold areas near the chilled-air outlets.  Whether the temperature variations are meaningful depends on what they impact.  Consider the last time you turned your refrigerator down a little and noticed the next morning that the milk container in the direct blast from the cooled-air outlet was partially frozen.

    Temperature@lert’s White Paper Library has an entry titled “Why isn’t the sensor reading the same as my thermostat?”  (Link to White Paper)  The paper shows that, over a twenty-four hour cycle in a sunny second-floor bedroom, temperature differences between the floor and a point 6 feet above it can be as much as 5°F, and the two readings are never equal.  MIT’s Building Technology Group explores the design, technology, and implementation of environmentally responsive urban housing in China.  Figure 1 shows temperature variations from room to room in a sustainably designed apartment.  This single-plane model shows a 1°C (1.8°F) temperature difference in rooms with heat sources.



    Figure 1: Modeling temperature variations in an environmentally responsive urban home shows an average of 24°C and a high of 25°C.   Source: MIT Chinahousing Research  (Link to MIT China House)

    To make informed decisions about how many sensors to deploy, consider whether the heating and cooling sources are in direct line with sensitive materials.  Enough sensors will be needed to ensure that the warmest and coolest locations stay within established parameters.  Too many sensors, though, can lead to “sensor data fatigue”: simply too much data to act on.  If you’re unsure, experimenting with a few sensors in different locations is a good start; a short sketch of that experiment follows this paragraph.  Balancing the protection of valuable materials, cost, and the variability within the space being monitored will ensure that when problems occur, they are noticed.
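
    As a rough illustration of that experiment, the sketch below takes readings from a few trial locations and checks the spread between the warmest and coolest spots. The location names, readings, and the 3°C margin are invented for illustration.

        # Sketch: use the spread across trial sensor locations to decide
        # whether more coverage is needed. Readings are invented.
        readings_c = {
            "rack_top":    27.8,
            "rack_bottom": 24.1,
            "cold_aisle":  22.5,
            "near_outlet": 21.0,
        }

        spread = max(readings_c.values()) - min(readings_c.values())
        print(f"Spread across trial locations: {spread:.1f} C")

        if spread > 3.0:  # the variation cited above
            print("Keep sensors at the warmest and coolest points at minimum.")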

    For questions or additional information, contact Temperature@lert at info@temperaturealert.com.
