temperature@lert blog

  • Data Center Monitoring: Raised Temperatures, Riskier Management

    Data Center Temperature Monitoring: Raised Temperatures, Riskier Management

    In 2008, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) published new environmental guidelines for datacom equipment, raising the recommended high-end temperature from 77°F to 80.6°F.

    The guideline chart below shows the changes in more detail:

    data center guideline chart

    The 2008 guideline cautions that operating within the recommended envelope does not by itself ensure optimum energy efficiency: there are varying degrees of energy efficiency within the recommended zone, depending on the outdoor temperature and the cooling system design. Thus, the guideline suggests, “it is incumbent upon each data center operator to review and determine, with appropriate engineering expertise, the ideal point for their system”.

    Patrick Thibodeau, a reporter at Computerworld, interviewed Roger Schmidt, IBM's chief engineer for data center energy efficiency, about how the new temperature parameters would influence energy savings and data center cooling. When asked “how much heat can servers handle before they run into trouble”, Schmidt replied:

    “The previous guidelines for inlet conditions into server and storage racks was recommended at 68 degrees Fahrenheit to 77 Fahrenheit. This is where the IT industry feels that if you run at those conditions you will have reliable equipment for long periods of time. There is an allowable limit that is much bigger, from 59 degrees Fahrenheit to 89 degrees. That means that IT equipment will operate in that range, but if you run at the extremes of that range for long periods of time you may have some fails. We changed the recommended level -- the allowable levels remained the same -- to 64F to 81F. That means at the inlet of your server rack you can go to 81 degrees -- that's pretty warm. [The standard also sets recommendation on humidity levels as well.]”

    He also noted that 81°F is a point where the increase in server power draw is still minimal, because “raising it higher than that [the recommended limit] may end up diminishing returns for saving power at the whole data center level.” In fact, according to the GSA, each degree of increase in the server inlet temperature can save about 4% to 5% in energy costs.
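    As a back-of-the-envelope illustration of what that rule of thumb implies, here is a minimal sketch; the function and the compounding assumption are ours, not GSA methodology:

        # A rough sketch of the GSA's "4% to 5% per degree F" figure, assuming
        # the savings compound with each degree the inlet setpoint is raised.
        def cooling_savings(current_f: float, target_f: float, pct_per_degree: float = 0.04) -> float:
            """Fraction of cooling energy cost saved by raising the inlet setpoint."""
            degrees = max(0.0, target_f - current_f)
            return 1.0 - (1.0 - pct_per_degree) ** degrees

        # Example: raising the inlet setpoint from 72°F to 80°F at 4% per degree
        print(f"{cooling_savings(72, 80):.1%}")  # -> 27.9%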

    Too much humidity will result in condensation, which leads to electrical shorts. According to the GSA, “based on extensive reliability testing of Printed Circuit Board (PCB) laminate materials, it has been shown that conductive anodic filament (CAF) growth is strongly related to relative humidity. As humidity increases, time to failure rapidly decreases. Extended periods of relative humidity exceeding 60% can result in failures, especially given the reduced conductor to conductor spacing common in many designs today.” The upper humidity limit also matters for protecting disk and tape media from corrosion: excessive humidity forms monolayers of water on device surfaces, providing an electrolyte for corrosion. On the other hand, too little humidity will leave the room electrostatically charged.
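    The 60% figure quoted above suggests a simple alerting rule for humidity monitoring. The sketch below is a hypothetical illustration; the 20% low-humidity floor and the persistence window are our assumptions, not GSA or ASHRAE limits:

        # Hypothetical humidity alerting sketch: flag sustained excursions above
        # the 60% RH failure-risk level cited by the GSA, or below an assumed 20%
        # floor where electrostatic charge buildup becomes a concern.
        def humidity_alert_index(rh_readings, low=20.0, high=60.0, window=6):
            """Return the index where `window` consecutive readings fall outside
            [low, high], or None if no sustained excursion occurs."""
            run = 0
            for i, rh in enumerate(rh_readings):
                run = run + 1 if (rh < low or rh > high) else 0
                if run == window:
                    return i
            return None

        readings = [55, 58, 62, 63, 65, 61, 66, 64, 59, 57]  # hourly %RH samples
        print(humidity_alert_index(readings, window=4))       # -> 5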

    After the new standards were published, it took time for data centers to update their operating environments. According to Schmidt, IBM began using the new guidelines internally in 2008, and other data centers would likely step temperatures up a degree or two at a time. Running near the new ASHRAE temperature limits means a higher-risk environment for staff to manage and requires more operational expertise. According to 2013 Uptime Institute survey data, nearly half of all data centers reported that their systems ran at 71°F to 75°F. Another 37% of data centers, the next largest segment, reported temperatures from 65°F to 70°F. The trend toward warmer data centers is better revealed by the fact that 7% of data centers operated at 75°F or above, compared with 3% the year before.



    References:

    ASHRAE, “2008 ASHRAE Environmental Guidelines for Datacom Equipment” http://tc99.ashraetcs.org/documents/ASHRAE_Extended_Environmental_Envelope_Final_Aug_1_2008.pdf

    Patrick Thibodeau, “It's getting warmer in some data centers”, 07/15/2013. http://www.computerworld.com/s/article/9240803/It_s_getting_warmer_in_some_data_centers

    Patrick Thibodeau, “Q&A: The man who helped raise server operating temperatures”, 07/06/2009. http://www.computerworld.com/s/article/9135139/Q_A_The_man_who_helped_raise_server_operating_temperatures_



    Written by:

    Ivory Wu, Sharp Semantic Scribe

    Traveling from Beijing to Massachusetts, Ivory recently graduated with a BA from Wellesley College in Sociology and Economics. Scholastic Ivory has also studied at NYU Stern School of Business as well as MIT. She joins Temperature@lert as the Sharp Semantic Scribe, where she creates weekly blog posts and assists with marketing team projects. When Ivory is not working on her posts and her studies, she enjoys cooking and eating sweets, traveling and couch surfing (12 countries and counting), and fencing (she was the Women's Foil Champion in Beijing at 15!). For this active blogger, Ivory's favorite temperature is 72°F because it's the perfect temperature for outdoor jogging.



  • Dawn of Solar Data Centers?


    Projects by major players point to the readiness, costs, and benefits of solar power for data centers.


    Water, water everywhere,

    And all the boards did shrink.

    Water, water everywhere,

    Nor any drop to drink.

    The Rime of the Ancient Mariner - Samuel Taylor Coleridge


    Data center managers must feel a lot like Coleridge’s Ancient Mariner when they look out the window (assuming their offices have any windows). Like the sailors on Coleridge’s journey, data center professionals are surrounded by free power from the wind, sun, water, the earth’s heat, and biofuel, but none of it, as it exists, can power the insatiable demands of the equipment inside the vessel. Despite this challenge, there have been several interesting projects involving green energy sources. This piece in the data center energy series will explore solar photovoltaics to help determine whether the technology is suitable to provide cost-effective, reliable power to data centers.


    Temperature@lert Blog: Dawn of Solar Data Centers?
    Left: Engraving by Gustave Doré for an 1876 edition of the poem, "The Albatross," depicting 17 sailors on the deck of a wooden ship facing an albatross. Right: A statue of the Ancient Mariner, with the albatross around his neck, at Watchet, Somerset in south west England, where the poem was written. (Link to Source - Wikipedia)


    Solar powered data centers have been in the news recently, primarily due to projects by Apple and Google. In an effort to build a green data center, Apple’s 500,000 sq. ft. site in Maiden, North Carolina is powered in part by a nearby 20-acre, 20-megawatt (MW) solar array. The site also has a 10-MW fuel cell array that uses “directed biogas” credits as the energy source. (Link to Apple Source) The remainder of the power needed for the site is purchased from the local utility, with Apple buying renewable energy credits to offset the largely coal and nuclear generated Duke Energy electricity. Apple sells the power from the fuel cells to the local utility in the form of Renewable Energy Credits used to pay electric utility bills. Apple expects that the combination of solar photovoltaic panels and biogas fuel cells will allow the Maiden data center to use 100% renewable energy or energy credits by the end of the year. Several lesser-known companies have also implemented solar initiatives, but the news is not so widespread.


    Temperature@lert Blog: Dawn of Solar Data Centers?
    Left: Apple Maiden, NC data center site shows solar array in green (Link to Source - Apple); Right: Aerial photo of site with solar array in foreground (Link to Source - Apple Insider)


    It will be instructive to follow reports from Apple to determine the cost-effectiveness of the company’s green approach. That being said, many if not most companies do not have the luxury of being able to build a 20-acre solar farm next to the data center. And most have neither the cash to invest in such projects nor the corporate cachet of Apple to get them approved, so initiatives such as Maiden may be few and far between. Still, there’s a lot of desert land ripe for solar farms in the US Southwest. Telecommunication infrastructure may be one limitation, but California buys a lot of its electrical power from neighboring states, so anything is possible.

    What about solar power for sites where the data center is built in more developed areas: is there any hope? Colocation provider Lifeline Data Centers announced that their existing 60,000 sq. ft. Indianapolis, Indiana site will be “largely powered by solar energy”. (Link to Source - Data Center Dynamics) In a piece titled Solar Data Center NOT “Largely Solar Powered”, author Mark Monroe drew on his experience with his own solar panel installation and took a look at the numbers behind this claim. Lifeline is planning to install a 4-MW utility-grade solar array on the roof and in the campus parking lot by mid-2014. Monroe takes a swag at determining how much of the data center’s power needs will be filled by the solar array.

    Assuming the site’s PUE is equal to the Uptime Institute’s average of 1.64 and taking into account the photovoltaic array’s operating characteristics (tilt angle, non-tracking), site factors (sun angle, cloud cover), etc., Monroe calculates that 4.7% of the site’s total energy and 12% of the overhead energy will be available from the solar installation. At an industry leading PUE of 1.1, the installation would provide 7% of the total energy and 77% of the overhead energy. Monroe notes that while these numbers are a step in the right direction, Lifeline’s claim of a data center “largely powered by solar energy” is largely not based on the facts. His piece notes that even Apple’s Maiden site with 20 acres of panels only generates about 60% of the total energy needed by the site overhead and IT gear. Lifeline would need to add an extra 6 MW of solar capacity and operate at a PUE of 1.2 to reach Net Zero Overhead.
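    Monroe's arithmetic is easy to reproduce. The sketch below is our own back-of-the-envelope version with assumed inputs; the array's annual yield and the site's total energy are illustrative figures, not Lifeline's actual numbers:

        # How much of a site's energy can a solar array cover at a given PUE?
        # Overhead (non-IT) energy is the (PUE - 1)/PUE share of total energy.
        def solar_shares(solar_mwh: float, total_mwh: float, pue: float):
            overhead_mwh = total_mwh * (pue - 1.0) / pue
            return solar_mwh / total_mwh, solar_mwh / overhead_mwh

        # Assumptions: a 4-MW array at a ~15% capacity factor yields about
        # 4 MW * 8760 h * 0.15 = 5,256 MWh/yr; site total assumed at 110,000 MWh/yr.
        total_share, overhead_share = solar_shares(5256, 110000, pue=1.64)
        print(f"{total_share:.1%} of total, {overhead_share:.1%} of overhead")
        # -> roughly 4.8% of total and 12.2% of overhead, in line with Monroe's numbers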

    I am curious to see hard data from these and other solar photovoltaic projects for data centers, showing costs, performance, and financial incentives (tax considerations, power contracts, etc.) that the industry can review to determine if solar is the right approach for its electrical power needs. Although such disclosure is unlikely due to competitive considerations, it would greatly assist the industry in promoting green initiatives and in taking the spotlight off headlines criticizing the “power hungry monster”.

    All efforts to improve industry efficiency and reduce energy consumption are steps in the right direction. Companies like Lifeline Data Centers that don’t have the deep pockets of Apple or Google are taking steps toward the goal of Net Zero Overhead. The challenge for data center operators that initiate green energy or efficiency based projects will be to promote these efforts without making headline-grabbing claims that are not well supported by the data. As Launcelot Gobbo tells Old Gobbo in Shakespeare’s The Merchant of Venice, “but at the length truth will out.” “Green powered” and “energy independent” are claims that need to be examined carefully to maintain industry credibility and good will, or “truth will out.”



  • Does Cogeneration Yield a Suitable RoI in Data Centers?


    What does the data say?

    This is the second of two pieces on cogeneration, or CHP. The first explored the technology; this one examines the RoI, for data centers, of a technology proven in other industries.

    As the data center industry continues to consolidate and competition becomes more intense, IT professionals understand the pressure on both capital and operating budgets.  They are torn by two competing forces: faster and more reliable vs. low cost and now.  IT equipment improves continuously, and the desire to update always calls.  Reliability has become the mantra of hosted application and cloud customers, and although electrical grid failures are not considered “failures against uptime guarantees” for some, businesses affected by outages feel the pain all the same.  And when solutions exist, management pressure to implement them quickly and at low cost is always a factor.

    Cogeneration is typically neither fast nor cheap, but it does offer an alternate path to reliability and uptime.   As with all major investments that require sizable capital and space, the best time to consider cogeneration is during data center construction.  That being said, data centers operating today are not going anywhere soon, so retrofit upgrade paths are also a consideration, especially in areas where power from the local utility has become less reliable over time.  So when should data center professionals consider cogeneration or CHP?  Fortunately there are studies available on public websites that help provide answers.

    Temperature@lert: Does Cogeneration Yield a Suitable RoI in Data Centers?

    Syracuse University data center exterior; Microturbines in utility area (Link to Source)

    One such study is an installation at Syracuse University.  Opened in 2009, the 12,000 sq. ft. (1,100 m²) data center with a peak load of 780 kW employs cogeneration and other green technologies to squeeze every ounce of energy out of the system. (Link to Source)  The site’s 12 natural gas fueled microturbines generate electricity, and their hot exhaust is piped to the chiller room, where it is used to generate cooling for the servers and both heat and cooling for an adjacent office building.  Technologies such as adsorption chillers that turn heat into cooling, reuse of waste heat in nearby buildings, and rear door server rack cooling that eliminates the need for server fans complete what IBM calls its greenest data center yet.

    Temperature@lert: Does Cogeneration Yield a Suitable RoI in Data Centers?

    Left: Heat exchanger used in winter months to capture waste microturbine heat for use in nearby buildings; Right: IBM “Cool Blue” server rack heat exchangers employ chilled water piped under floor.

    This is certainly an aggressive project, but can the cost be justified with a reasonable Return on Investment?  Fortunately, data has recently been released to quantify the energy conservation benefits.  PUE (Power Usage Effectiveness, the ratio of total facility energy to IT equipment energy) measured during 2012 was presented at an October 2013 conference and shows a steady value between 1.25 and 1.30 during the period, which compares very favorably with typical figures: the often-cited industry average PUE is 2.0, the Uptime Institute’s self-reported average is 1.65 (with qualifications), and a Digital Realty Trust survey of 300 IT professionals at companies with annual revenues of at least $1 billion and 5,000 employees revealed an average PUE of 2.9.  (Link to Sources: Uptime Institute, Digital Realty Trust)

    Temperature@lert: Does Cogeneration Yield a Suitable RoI in Data Centers?      

    IBM/SU Green Data Center 2009 Goals (Link to Source); 2012 Actual Performance (Link to Source)

    So how can we calculate the actual RoI and compare it to the projected goals?  First, the goals stated in the table on the left show savings of $500,000+ per year.  Another presentation, by the microturbine supplier, shows a $300,000 per year goal, quite a bit different.  So how do we know what the savings is?  We don’t, since there is no reference site where an identical data center in an identical location operates without the CHP.  We can use the 2.0 average PUE and calculate the energy savings, but that’s not a real answer.  We also need to take into account tax incentives and grants, such as the $5 million for the Syracuse University project, to determine the cost to non-subsidized projects.  Hopefully project managers will provide more information to help data center operators better understand the actual savings as the project matures.
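    The PUE-based comparison described above looks like this in practice. This is a hedged sketch with illustrative inputs; the average utilization and electricity price are our assumptions, not published Syracuse figures:

        # Estimate annual energy-cost savings by comparing a measured PUE against
        # an assumed 2.0 industry-average baseline for the same IT load.
        def annual_savings_usd(it_load_kw, pue_actual, pue_baseline, usd_per_kwh):
            hours_per_year = 8760
            it_kwh = it_load_kw * hours_per_year
            return it_kwh * (pue_baseline - pue_actual) * usd_per_kwh

        # 780 kW peak load cited above; assume ~60% average utilization and $0.10/kWh
        print(f"${annual_savings_usd(780 * 0.6, 1.27, 2.0, 0.10):,.0f} per year")
        # -> about $299,000 per year under these assumptions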

    CHP for data centers is presented with an array of benefits, including improved reliability through less dependence on grid power, lower power costs, and a reduced carbon footprint.  NetApp installed CHP in their Silicon Valley data center to reduce their reliance on grid power due to frequent rolling brownouts and the uncertainties of power market costs.  Their experience is less instructive because the site’s use of direct air cooling reduces its need for mechanical cooling; as a result, the CHP system is used only when the utility is strained.  It is difficult to find quantitative data for modern installations.   While the data seems encouraging, actual energy cost savings are not provided.  We will watch the progress at this and other projects over the next several months to see if CHP costs yield an acceptable RoI via reduced energy costs.  Stay tuned.


  • Does Cogeneration Have a Role in Data Centers?

    Operators have many options to consider.


    An earlier piece in this series titled Data Centers as Utilities explored the idea that emergency backup power systems in data centers could be used to supply the utility with peak demand power when the grid is running near capacity and the data center’s emergency generators are not needed.  But what about the idea of data centers generating their own power to rely less on the grid?  There are several approaches, particularly in the green energy space, that will be explored in future pieces.  One that is readily available and may make sense for data centers to consider is called cogeneration, or Combined Heat and Power, CHP for short.

    CHP is not new; it has been used in more traditional industries for decades, primarily heavy industries with large energy needs, steel and paper mills for example.  Cogeneration for data centers has been in the news for quite some time but has had a relatively low adoption rate.  After all, data center operators try to put their capital into IT infrastructure; the utility and facility sides are often looked at as necessary added cost.  But with reports that grid capacity and reliability may not be able to address the growth or reliability needs of the industry, operators are taking a fresh look at options such as self-generation.   Low natural gas prices are also a factor, since operators may be able to secure the fuel for their own operations more cheaply than through electric utilities.

    As early as 2007, the US Environmental Protection Agency highlighted the potential of cogeneration in the future of data centers in a piece titled The Role of Distributed Generation and Combined Heat and Power (CHP) Systems in Data Centers. (Link to Source)  With advances in the technology, changes in energy costs, and greater emphasis on grid capacity and reliability as it pertains to data centers, cogeneration has received a significant boost with sponsorship from companies such as IBM.

    Temperature@lert Does Cogeneration Have a Role in Data Centers?

    US sponsored report table showing various technology applications, all under the CHP or Cogeneration name. (Link to Source)

    There are several approaches to cogeneration or CHP.  The EPA report shows the application of several technologies that fall under the sphere of CHP or cogeneration.  Recent installations include five gas engine cogeneration units in a Beijing data center. According to one report, “Powered by five of GE’s 3.34-megawatt (MW) cogeneration units, the 16.7-MW combined cooling and heating power plant (CCHP) will offer a total efficiency of up to 85 percent to minimize the data center’s energy costs.” (Link to Source) The project is sponsored by the China National Petroleum Corporation and represents the trend toward distributed energy production in high usage industries.  eBay’s natural gas powered Salt Lake City site plans to deploy a geothermal heat recovery system to produce electricity from waste heat. (Link to Source)

    Temperature@lert: Does Cogeneration Have a Role in Data Centers?

    Example of Micro Turbine or Fuel Cell CHP layout (Link to Source)

    Data from projects at the University of Syracuse and University of Toledo data centers will be examined in a companion piece to demonstrate the potential RoI for CHP.

    Temperature@lert: Does Cogeneration Have a Role in Data Centers?

    University of Toledo Natural Gas Fired Micro Turbine Cogeneration Plant. (Link to Source)


  • Who exactly is ushering in ASHRAE’s Temperature Guidelines?



    Dave Ruede, VP of Marketing at Temperature@lert, says:

    "Is raising data center temperature like a game of “you blinked first”, only with your job on the line?"

    While no global standard exists for data center temperature recommendations, many refer to the white paper from the ASHRAE Technical Committee (TC 9.9) for Mission Critical Facilities, Technology Spaces, and Electronic Equipment.  As many know, the committee published a 2011 update titled 2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance.  (Link to Whitepaper)  With this document, ASHRAE’s TC 9.9 raised the recommended high end temperature from 25°C (77°F) to 27°C (80.6°F) for Class 1 data centers (the most tightly controlled class).  More importantly, the allowable high end was set at a warm 32°C (89.6°F), perfect for growing succulents like cacti.

    And yet, recent posts on IT professional social media sites have produced questions like, “What gloves are recommended for data centers to help protect from cold temperatures?”  So it appears not everyone is following ASHRAE’s guidelines.  Yet many IT professional media discussions are about energy savings. And if I remember living through the 1973 OPEC oil embargo correctly, raising home air conditioning temperatures during the summer and lowering home heating temperatures during the winter saves energy and money.  The U.S. Department of Energy’s website estimates a 1% energy saving for each degree the AC temperature is raised.  Some sites claim 2%, 3% and even 4% savings, but even 1% of a data center’s energy budget is very significant.

    What are data centers really doing?  In a July 15, 2013 piece posted on the Computerworld UK website titled It’s getting warmer in some data centers, author Patrick Thibodeau notes that, “The U.S. General Services Administration, as part of data center consolidation and efficiency efforts, has recommended raising data center temperatures from 72 degrees Fahrenheit (22.2°C) to as high as 80 degrees (26.7°C). Based on industry best practices, the GSA said it can save 4% to 5% in energy costs for every one degree increase in the server inlet temperature.”  (Link to Article)  A 5% energy savings is something that makes IT managers really salivate.

    eBay’s newest data center in Phoenix, AZ employs open-air cooling technology to reduce energy used for cooling as a percentage of total site power consumption.  (Link to Image)
    So where is the industry?  The article continues that in the 2013 Uptime Institute survey, which included 1,000 data centers globally, almost 50% were operating between 71°F (21.6°C) and 75°F (23.9°C).  The Uptime Institute noted that the survey did not show much change from the previous year.  Incredibly, 37% of data centers were operating at a frigid 65°F (18.3°C) to 70°F (21.1°C).  Some good news was that data centers operating at less than 65°F (18.3°C) had decreased from 15% to 6% of those surveyed.  This is a self-selected survey, so the data has to be looked at somewhat cautiously since some data center personnel may elect not to participate, but the data is sobering.

    So what’s the problem?  Server and other electronic equipment suppliers have participated fully in the TC 9.9 guidelines; they are certain that their equipment will operate within specification at the higher temperatures.  Their warranties reflect this.  And yet, other issues exist.

    One may be the issue of poorly controlled buildings.   Older, poorly insulated facilities with dated, less efficient HVAC equipment may be forced to lower the temperature to withstand elevated summer temperatures, especially if they have significant air leakage.  Indeed, in the Boston area, July 2013 was on average 4°F (2.2°C) hotter than normal, a load that will tax even newer cooling systems.  Finally, the elevated temperatures may only apply to the newer equipment in any given data center.  Many data centers have a collection of equipment in which some of the newest, state of the art servers share space with vintage electronics that need the cooler temperatures to operate without problems.  And changing out equipment to allow a site to raise the temperature will mean assessing all electronic systems, including building facilities.

    So the industry has a dilemma: save energy and operating cost by raising data center temperatures, which could require building, HVAC, and electronic equipment upgrades, or continue to pay higher operating costs.  The flip side is the price to retrofit buildings, systems and electronic equipment; a cost that would be paid by “Facilities” or “Operations”, not “IT”.

    Image from Slate.com piece about Google’s data center (Link to Image)

    Data center professionals are no different from those in other industries in that making change is hard; it can come with risks.  And changes to operating protocols are not made lightly when many data centers base their business strategy on reliability guarantees to their customers.  Who among us is willing to stake their professional reputation and possibly their job on a major undertaking that contains variables that may be out of our control?  So a studied approach is called for.   But in the end, the cost of energy will inevitably increase, and the need to implement more powerful servers will be irresistible. When that time comes, raising temperature limits will be examined closely as part of an overall business strategy.  In the meantime, data center personnel may want to check out a recent Slate website post titled “The Internet Wears Shorts”, wherein the author describes Google technicians who work in summer clothes.  The thrust is that Google has achieved significant energy efficiency, partially by running their data centers at “a balmy 80°F” (26.7°C).

    Author: Dave Ruede is VP Marketing at Boston based Temperature@lert (www.temperaturealert.com), a leading developer and provider of low-cost, high-performance temperature monitoring products.  Professional interests include environmental and energy issues as they relate to data centers, clean rooms, and electronics.  Contact: dave@temperaturealert.com


  • What's the Best Temperature Sensor?

    "What's the Best Temperature Sensor?"

    The Classic Top 10 List: We've all seen (and heard) top 10 lists, and these lists can be both informative and misleading. The attempt to categorize any product or collection of people into a "top" list can be useful for outsiders or uninformed audiences, but overall, these lists tend to be rash generalizations shaped by one particular perspective.
     

    "The Best Temperature Sensor" In the world of Temperature Sensors and environmental monitors, many people look for a sacred 'list' of Best Temperature Sensors, or some sort of "top 5" reference point that leads to an informed decision. In truth, many prospective buyers are unfamiliar with the sensor market, and are depending on this type of neutral resource to guide them on a path to some sort of "purchaser enlightenment". Unfortunately, this resource does not exist in the natural internet environment, and users rely on technical forums (SpiceWorks) and raw customer feedback (Amazon Reviews, etc). If you're using these resources, there is one important detail to keep in mind when scanning these scattered sources of information.
     

    What is that Detail?  Application or industry. This is the most important indicator to consider when browsing for 'best temperature sensors'. Not all sensors are created equal, and we can use one of our own products as a simple example. Our USB device is primarily utilized by IT clients looking to monitor the temperature of their server rooms, and for this purpose, the USB edition is our first recommendation for any prospect with this exact need. With that said, the USB edition has little or no value to the commercial refrigeration industry (for instance), wherein a computer is not typically in close proximity to a walk-in fridge or freezer. Even when it is, the device is primarily designed for the IT industry, and suggestions to use the USB product for applications outside of IT are, well, misleading. In the larger picture, there are specialized sensors designated for particular industries, and it's very rare to find a product that is infinitely flexible or can be used in every industry. Take recommendations from peers in your industry, and don't rely on reviews or outsider suggestions if you aren't sure of their application.

    Or take this example. In restaurants and commercial refrigeration, power outages can lead to serious issues. The loss of cooling (from refrigerators or freezers) can cause temperatures to rise dramatically, and as we've discussed in other blog posts, exposure to these high temperatures can lead to bacterial growth. Put another way, in the case of ice cream, the rising temperatures can lead to thousands of dollars in melted deliciousness. In this particular instance, suggesting any kind of temperature sensor or monitoring solution without a backup battery would be pointless. If the device relies solely on AC power, a power outage would disable the reading capability of the device. Even if a vendor boasts reliability, dependability, or any other relevant buzzword, the absence of a backup power source is a serious hole. The purchase of such a device would be a major oversight and, in fact, would not meet the complete set of needs of a commercial refrigeration client.

     If you're looking for a list of best temperature sensors, remember that the 'best' is in the eye of the beholder, and in this type of situation, the beholder will make suggestions from their industry perspective, which may or may not be relevant to your search. Search with an open ear, and remember that the voice of reason in "temperature sensors" is highly dependent on the specific application. There are companies that specialize in sensors for the IT industry, and while they may be reputable vendors, they may not have the best solution for your needs. Listen to your industry voices!


  • How much is my Server Room worth?

    How much is your Server Room Worth?

    Redundancy and the value of your in-house data

    Way in the back of your office--beyond the marketing mavens and chipper CEOs--is a room of servers. There might be a few 4U servers in a closet, a handful of database servers in a larger space, or even an entire room with “racks on racks”. Most businesses will have a dedicated space for server equipment, and no matter the size, the overall value of the information can far outweigh the actual cost of the server hardware. Think of it in these terms: how valuable do you consider your “big data”, and what precautions are you undertaking to protect the information?

    Redundancy is one common method and is typically associated with the concept of Disaster Recovery (DR). In fact, a slew of cloud and hosting providers now tout DRaaS (Disaster Recovery as a Service) as a selling point for their solutions. But for smaller-scale SMBs that utilize an in-house data closet, in-house redundancy can be difficult to produce. It may involve the use of vacant servers and equipment that receives copies of all data transmissions and related information. While this is an important concept to consider for your servers, keep in mind that the duplicate purchases (of identical equipment for failover) can be costly. Remember, though, that the costs of data loss/leakage (depending on your business size) can be astronomical. Check out these words about data losses from David M. Smith of Pepperdine’s Graziadio School of Business and Management:

    “The final cost to be accounted for in a data loss episode is the value of the lost data if the data cannot be retrieved. As noted earlier, this outcome occurs in approximately 17 percent of data loss incidents. The value of the lost data varies widely depending on the incident and, most critically, on the amount of data lost. In some cases the data may be re-keyed in a short period of time, a result that would translate to a relatively low cost of the lost data. In other cases, the value of the lost data may take hundreds of man-hours over several weeks to recover or reconstruct. Such prolonged effort could cost a company thousands, even potentially millions, of dollars.[12] Although it is difficult to precisely measure the intrinsic value of data, and the value of different types of data varies, several sources in the computer literature suggest that the value of 100 megabytes of data is valued at approximately $1 million, translating to $10,000 for each MB of lost data.[13] Using this figure, and assuming the average data loss incident results in 2 megabytes of lost data, one can calculate that such a loss would cost $20,000. Factoring in the 17 percent probability that the incident would result in permanent data loss, one can further predict that each such data loss would result in a $3,400 expected cost.”
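    Smith's expected-cost arithmetic is worth making explicit; the short sketch below simply reproduces the numbers from the quote:

        # Expected cost of a data loss incident, per the figures quoted above.
        value_per_mb = 10_000    # $1M per 100 MB of data => $10,000 per MB
        avg_loss_mb = 2          # assumed average incident size, in MB
        p_permanent = 0.17       # ~17% of incidents end in unrecoverable data

        incident_cost = value_per_mb * avg_loss_mb   # $20,000 if the data is gone
        expected_cost = p_permanent * incident_cost  # $3,400 expected per incident
        print(f"${incident_cost:,} worst case, ${expected_cost:,.0f} expected")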


    And with that said, the cost of redundancy (or of rescuing data after a failure or disaster) can be difficult to calculate when you consider the man-hours associated with the recovery. As an aside, big data has come under the spotlight recently as an overused buzzword, and the divide between useful data and, well, just data is a difficult line to draw. Marketers in particular are faced with this problem, and sorting through the mountains of data can be both cumbersome and useful. Some consider "big data" to be overrated, in that finding useful information within piles of useless data can be time intensive and wasteful. Regardless, the value of an endless data pool is difficult to calculate (depending on business size and application), but it is significant, and ultimately the consequences of lost data aren’t to be ignored. Protect your data, use redundancy, and seek out other methods of reliability and sustainability for your priceless server rooms and closets.



  • Particle Filter Series: What You Can't Remove

    Particle Filter Series: What You Can't Remove

    Particle filters may not be enough to protect IT equipment in all environments.

    In this series, we’ve compared data center particle filters to those found in a home HVAC system, mainly to highlight the similarities for filter consideration. In either application, needless to say, particle filters remove particles.  Some remove more particles or smaller particles, some less.  The more particles a filter removes, the less efficiency-robbing dust there will be on heat exchangers, fans, and electronic circuitry.  Unless we use HEPA or ULPA filters (as is done in semiconductor manufacturing to remove very small particles), some will get through.  What is needed is a balance of cost, efficiency, and pressure drop, informed by an understanding of the cleanliness of the local environment and the OEM’s requirements.

    But particle filters cannot and do not remove gases such as oxygen and nitrogen.  Of concern here is the fact that particle filters do not remove corrosive gases, those that can corrode exposed metal in electronic assemblies.  Equipment manufacturers often take precautions to prevent corrosion.  Coatings are sometimes used, as well as materials that specifically inhibit corrosion.  But since the implementation of RoHS lead-free initiatives, the materials used for electronics are now less resistant to the effects of corrosive gases than they were before the regulation updates.  Nobody wants to go back to using lead and other environmentally hazardous materials, so additional precautions may be needed.

    What are the corrosive gases of concern?  The most corrosive are two byproducts of combustion: sulfur dioxide (SO2) from automobile exhaust and home heating systems, and hydrogen sulfide (H2S) from burning coal, pulp mills, landfills, and waste treatment plants.  Needless to say, in large urban areas known for poor air quality, especially communities situated "downwind" of coal-fired electrical generators, the air can have elevated concentrations of these corrosive gases.  Other corrosive agents include chemicals containing chlorine or chlorides.  As seaside residents know, rust is an issue on metallic surfaces.  Facilities near coastal areas, as well as those near water treatment plants or pulp and paper mills, are also areas of concern for chloride-based gases.

    The effects of corrosive gases are everywhere: remember your grandmother’s silverware? What needs to be understood is the rate of corrosion.  To help with guidelines, ASHRAE’s Technical Committee 9.9 has issued a report to help IT professionals with the issue of corrosive gases. (Link to ASHRAE guidelines)

    Table 2 from the ASHRAE report titled 2011 Gaseous and Particulate Contamination Guidelines For Data Centers references the International Society of Automation’s standard for electronic materials corrosion, ISA-71.04 (1985).  The table describes copper corrosion activity per month and classifies it into four categories; the greater the amount of corrosive gases, the greater the amount of corrosion per unit time.  Table 4 describes ASHRAE’s latest recommendations for Acceptable Limits of Gaseous Contamination, adding silver corrosion activity to the guidelines.  This is due to the use of silver as a replacement for lead in RoHS compliant electronics, and the relatively high reactivity of silver to sulfur containing gases.  ISA is considering adding silver to the ISA-71.04 standard. Electronic device manufacturers often reference these documents in their equipment warranty policies.
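    For reference, the ISA-71.04 copper reactivity classes can be expressed as a simple lookup. The breakpoints below are the commonly cited angstroms-per-month thresholds; treat them as an illustration and verify against the standard itself before relying on them:

        # Classify monthly copper corrosion film growth (angstroms/month) into
        # ISA-71.04 severity levels. Thresholds are the commonly cited values.
        def isa_severity(angstroms_per_month: float) -> str:
            if angstroms_per_month < 300:
                return "G1 (Mild)"
            if angstroms_per_month < 1000:
                return "G2 (Moderate)"
            if angstroms_per_month < 2000:
                return "G3 (Harsh)"
            return "GX (Severe)"

        print(isa_severity(250))  # -> G1 (Mild): generally within OEM specification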

     

    Too Small?

    Particle filters, no matter how efficient they may be, cannot filter out these gaseous “particles”. Gas molecules are simply too small, and therefore cannot be removed with this type of filtration.

    Several companies specialize in determining the level of corrosion on copper and silver strips of metal in data centers and other environments.  Typically they employ metal strips referred to as “coupons” that are placed in the IT space for thirty days and then sent out for analysis.  The result is a report describing the level of corrosion per month as well as the types of gases present.  In many locations, the levels of corrosive gases in the data center are low enough to be considered safe and within OEM specification.


    Figures 3a, 3b: Corrosion Classification Coupons, also known as Reactivity Monitoring Coupons, from two suppliers for use in Data Centers, based on methods described in ASHRAE TC 9.9 and ISA-71.04 guidelines.

    (Link to Image)


    Large urban centers in Asia and South Asia can experience high levels of corrosive gases in the local atmosphere. Keep in mind, urban centers in Europe and North America have also experienced high levels, especially during the home heating season. Figures 4a and 4b show results from the US EPA Acid Rain Program.  The program was established under the 1990 Clean Air Act, and the data show dramatic decreases in SO2 (4a) and sulfate (4b) concentrations. (Link to Source)  Data centers immediately downwind of sources will need to monitor local atmospheric corrosive gas levels to understand whether they are in areas of concern.  Those considering air side economizers will want to place some coupons outside, specifically near the make-up air intake.




    Figure 4a (Above): SO2 levels in Eastern USA in 1989-1991 and 2007-2009

    Figure 4b (Above): Sulfate levels in Eastern USA in 1989-1991 and 2007-2009

    For IT managers in locations with high levels of corrosive gases, what's the best plan of action?  Fear not, there are solutions.  The next and final post in this series will provide some guidance.


  • The Do's and Don'ts of Particle Filters and the effect on the IT space

    What dust? I don’t see no stinking dust!



    Dust Accumulation in a Dell Laptop (Source)


    What particle filters do, what they don’t do, and how it affects filter selection for IT spaces.

    Up to now, this series has discussed particle filtration in data centers. All IT spaces, including computer, telecom, server, and instrument rooms, require particle filtration in their air handling systems for optimum performance.  But why is this so?

    Earlier it was mentioned that when I personally replaced a standard hot air heating system filter with one rated to reduce allergens, the high efficiency filter removed a remarkable amount of dust in comparison to the standard one. Nothing in the house changed; the dust particles had been there all along.  They simply passed through the standard filter without slowing down.  Luckily, as my family doesn’t suffer from allergies, our health was not affected by the particles, and for all intents and purposes the furnace performed well with the standard filter.  However, the same cannot be said for CRAC units, where a little dust on the cooling coils can degrade efficiency.  Note to self: vacuum the refrigerator coils, summer is arriving in two weeks.

    Dust Accumulation in Laptop Fan (Source)

    More importantly, dust on electrical components can insulate them and keep them from dissipating heat as designed.  This can overheat and stress components, leading to reduced or intermittent performance and in some cases, premature failure, especially toward the end of the device’s life.  That’s why all of the IT equipment manufacturers specify particle filtration as a condition in their warranty language.  Images on the web are more likely to show extreme examples in laptops and home computers, but the same can be seen in poorly maintained servers.

    But what is this dust, and how does the filter keep it out?  Dust is a buildup of fibers (natural and synthetic textiles, hair, fur, microscopic plastic, wood and metal scrapings from flooring, shoes, desks, etc.); plant, animal, soil, sand, pavement, and building material particles from the ambient environment; and the residues of combustion such as soot and smoke particles. Keep in mind that bacteria, mold spores, and viruses are also airborne particles that can be trapped in the filter.  They can be visible or invisible, and some (viruses, for example) are too small to see even with the most powerful optical microscopes.  Here’s an example of the range of particles that may be found in the air at any given time.

    Particle                                               Particle Size (microns)
    one inch (25.4 mm)                                     25,400
    dot (.)                                                615
    Eye of a Needle                                        1,230
    Beach Sand                                             100 - 10,000
    Mist                                                   70 - 350
    Pollens                                                10 - 1,000
    Textile Fibers                                         10 - 1,000
    Human Hair                                             40 - 300
    Dust Mites                                             100 - 300
    Saw Dust                                               30 - 600
    Mold Spores                                            10 - 30
    Red Blood Cells                                        5 - 10
    Spider Web                                             2 - 3
    Combustion-related (motor vehicles, wood
    burning, industrial processes)                         up to 2.5
    Milled Flour, Milled Corn                              1 - 100
    Coal Dust                                              1 - 100
    Talcum Dust                                            0.5 - 50
    Copier Toner                                           0.5 - 15
    Liquid Droplets                                        0.5 - 5
    Anthrax                                                1 - 5
    Smoldering or Flaming Cooking Oil                      0.03 - 0.9
    Bacteria                                               0.3 - 60
    Combustion                                             0.01 - 0.1
    Burning Wood                                           0.2 - 3
    Tobacco Smoke                                          0.01 - 4
    Viruses                                                0.005 - 0.3
    Typical Atmospheric Dust                               0.001 - 30
    Carbon Dioxide                                         0.00065
    Oxygen                                                 0.0005

    Table edited for length. (Source)

    Different geographical locations have unique variables to consider, including the local environment (arid, tropical, arctic, etc.), local culture (cooking and heating methods, popular modes of transportation, hygiene practices, etc.), the site’s policies and practices in the selection of materials used and allowed in IT spaces, access practices (gowning, etc.), and HVAC system maintenance.  In clean rooms, such as those used for integrated circuit production at companies like Intel, Samsung, and TSMC, the removal of particles is especially critical.  The cleanliness requirements for microprocessors and flash memory devices are significantly higher than those of a server closet, as the finished chips must operate within their defined specifications without obstruction or minuscule imperfections. After assessing the particle loading potential at any given site, the cost of filtration, cleaning, and other measures can be weighed against the OEM’s requirements.

    Typical Particles found in the Environment (Source)

    Local HVAC service companies are very familiar with the particle filtration needs of their areas.  By matching this knowledge with the site’s policies and practices, one can easily determine the optimum particle filtration for any given location. From data closets to server rooms, and even microprocessor factories, staying informed on best practices in filtration is always a wise move.

    IC Wafer Lab (Source)

     

    If you have other suggestions, tips, or insights on this issue, feel free to chime in on the comments section of this page!




  • What Air Filter(s) Do I Need?

    What Air Filter(s) Do I Need?

    How to choose the correct air filters for CRAC units within a Data Center

    As previously discussed, air filters used in data center CRAC units are specialized for particle removal efficiency and resistance to air flow (pressure drop).  As with the filters in home heating and air conditioning systems, proper maintenance is required for problem-free operation.  But when it’s time to change the filter, how do you know which to select from the myriad of choices?

    To reiterate, removing particles is important to ensure the heat exchanger is kept clean and working at optimal efficiency.  It's therefore important to choose filters that meet the manufacturer’s particle removal specifications.  These specifications are often given in terms that are not commonly used or understood, so what do they mean in simple terms?  They typically reference an industry standard for testing the particle removal efficiency of air filters; which standard applies is largely a matter of geography. The terms relevant for most applications are the following:

    • MERV: Minimum Efficiency Reporting Value from ASHRAE Std. 52.2 - 2007 Testing Method
      • Filters rated from MERV 1 to MERV 16, commonly used in USA
    • CEN: European Committee for Standardization from EN779:2012 Testing Method
      • Filters are rated G, M, F, and U (clean room applications), commonly used in EU
    • ISO: International Organization for Standardization from ISO 14644-1
      • Filters are rated ISO 1 to ISO 9, International Standard often used in clean rooms, frequently referenced by electronic equipment manufacturers

    When selecting an air filter to replace the existing CRAC unit filters, use of the OEM filter will ensure that the filter meets both particle removal and pressure drop specifications as required.  However, because the particle types and concentrations in the air vary from location to location, the life of the filter will need to be determined by monitoring the pressure instrumentation of the CRAC unit.  For example, arid climates may have a greater loading of sandy dust, whereas wetter environments may see more pollen and plant fibers.  The number of individuals that enter or leave the IT space, the leak integrity of the room, and the particle removal effectiveness of the makeup air handler all play an important role in the actual life of the filters.  CRAC OEMs can provide guidance about when to change filters and options for extended filter life (if needed).  Third party suppliers can also offer a wealth of information. One can select the optimum filter for any particular location by using comparative filter testing data and gathering information on vendors and available filter types.

    If an alternate filter is selected, it's important to monitor the operating parameters of the CRAC unit.  Filters with greater pressure drop will require more fan capacity (energy).  CRAC units with VFD fans will automatically accommodate the change within the control range; non-VFD units may require manual adjustment.  The ultimate goal is to ensure the fan is not operating at 100% capacity with a new filter, since there would be no additional capacity to overcome the increase in pressure drop as the filter accumulates dust.
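    One simple way to act on that advice is to check fan headroom whenever a filter is replaced. This is a hypothetical sketch; the 20% headroom margin and the unit names are our assumptions, not an OEM specification:

        # Flag CRAC fans that are already near full capacity with a fresh filter,
        # since no margin would remain for the pressure drop added by dust loading.
        def has_fan_headroom(fan_output_pct: float, margin_pct: float = 20.0) -> bool:
            """True if at least `margin_pct` of fan capacity remains unused."""
            return (100.0 - fan_output_pct) >= margin_pct

        for unit, output in {"CRAC-1": 72.0, "CRAC-2": 95.0}.items():
            ok = has_fan_headroom(output)
            print(unit, "OK" if ok else "WARN: no margin for filter dust loading")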

    The following images are not necessarily from CRAC filter suppliers.  They are intended to assist the reader in discussions about replacement air filters.

     

    Different types of air filters that may be employed. 

    Source: Link to Image 1




     Various particle capture mechanisms that may be employed in air filters. Generally smaller particles are removed more efficiently by mechanisms at the bottom of the chart.

    Source: Link to Image 2



    Graphical visualization of the types of particles found in households compared to various filter types.  CRAC air filters are generally in the Standard Filter category.  The source of this image markets Electrostatic Air Purifiers to control allergens. 

    Source: Link to Image 3


    References: Generally from air filter, equipment or cleaning services suppliers (Standards can be ordered from the respective organizations.)

     

    ASHRAE (https://www.ashrae.org/)

     

    1. Flanders Filters: ASHRAE Std. 52.1 Comparison to Std. 52.2 http://www.flanderscorp.com/files/Technical_Data/ASHRAE+MERV+CROSS+REFERENCE.pdf
    2. Camfil Farr Technical Services Bulletin: ASHRAE Testing for HVAC Air Filtration, A Review of Standards 52.1-1992 & 52.2-1999 http://www.camfil.no/FileArchive/Quality%20certificates%20and%20awards/ASHRAE52.pdf
    3. Camfil Farr presentation describing ASHRAE 52.2, with good technical background information on particle sizes and particle removal mechanisms http://www.cshe.org/1-10-13%20OC%20ASHRAE%20Ind%20Stds%2052.2.pdf

     

    CEN (http://www.cen.eu/cen/Sectors/Sectors/HVACetc/Pages/default.aspx)

    1. AAF International: EN779:2012 New European Standard for General Ventilation Filters http://www.aafeurope.com/en/148/en779-2012
    2. Filtrair B.V. (The Netherlands) presentation European Air Filter Test Standard EN779:2012, describing the background for filter testing and rating, with a good technical description of test methods for both CEN and ASHRAE tests http://www.airah.org.au/imis15_prod/Content_Files/Divisionmeetingpresentations/QLD/PPQLD_13-06-2012-GW.pdf
    3. CEN and ASHRAE cross reference charts from:
      1. Camfil Farr http://www.camfil.com/FileArchive/Brochures/Gas%20turbines%20and%20other%20power%20systems/Filter%20brochures/Filter_class_chart_ASHRAE_EN_Moved.pdf
      2. Flanders Filters http://www.flanderscorp.com/files/FlandersFFI_literature/PB3001_FilterEff.pdf

     

    ISO (http://www.iso.org/iso/home.html)

    1. Fujitsu FTS Specification “Gaseous and Particulate Contamination Guidelines for Data Centers” http://globalsp.ts.fujitsu.com/dmsp/Publications/public/FTS-04230-Specification-for-DataCenter.pdf
    2. IBM Systems Hardware information, Environmental design criteria http://pic.dhe.ibm.com/infocenter/powersys/v3r1m5/index.jsp?topic=/p7ebe/p7ebetempandhumiditydesign.htm
    3. Camfil Farr, Clean Room Design Standards and Energy Optimization: describes ISO, ASHRAE and CEN particle specifications for pharmaceutical manufacturers, with good technical discussions of particle sources and control in clean room environments, with lessons for data center operators http://www.camfil.us/Global/Documents/US/Literature%20Library/CREO.pdf

     


