
Re: agribusiness72 post# 11482

Thursday, June 06, 2013 1:55:48 PM

In this week's Tips & Tricks blog post, TelVue's Director of Systems Engineering Chris Perry spells out the do's and don'ts of creating the right environment for the best server performance:

Having visited and worked in countless Data Centers and Master Control Suites across the country, I've seen just about everything – the good, the bad, and the ugly. Whether you are planning a new building, mapping out a server upgrade path, or just looking to clean things up a bit – here are a few tips, tricks, and thoughts on Data Centers.


1- Servers make noise. While some older platforms were quieter and could be kept in a room where people work, the newer platforms are louder and more shrill. This is due in large part to the form factor of the newer units: 1 RU (Rack Unit) servers make more noise than their 3 RU counterparts because the fans have to move a greater volume of air to keep the units cool, and thus run at a higher velocity. It's a good idea to have a dedicated server room, if for nothing more than the noise factor. If that isn't an option, there are special sound-dampening racks that can be installed; however, these tend to be very expensive. For anyone like me who works in server rooms daily, it's a good idea to travel with earplugs. No joke. If someone is going to be working in a server room for any length of time (more than an hour), earplugs are a must, as you can lose the higher registers of your hearing with prolonged exposure to server fan noise. Most hardware stores carry boxes of 100 single-use packages of 3M earplugs. If nothing else, they will keep you sane!

2- Servers generate heat. I've worked in a couple of enterprise-class data centers – where hundreds of servers, computers, and other pieces of equipment are all running 24×7. If the power hiccups (or goes down before the generators spin up), the HVAC system will also hiccup. This will cause the temperature in these rooms, normally kept at 65 to 68 degrees Fahrenheit, to rise rapidly. I'm talking 20 to 30 degrees in less than a minute. While your operation is probably not that big, cooling and ventilation still play an important part in maintaining a good data center/master control.

So how much AC do you really need? More than you'd think. AC units are typically rated by BTU output – but to some this number seems arbitrary. A higher BTU rating means the AC unit can push out more cool air in less time; however, the rating should be matched to how much equipment you have. Fortunately, BTU is an easily calculated number; some manufacturers put it right in their documentation. If it's not there, then the wattage or the current draw (Amps) is surely listed. So how does this translate to BTU?

Voltage x Amperage = Wattage (in the USA we run off a 120 volt system, thus we can use that as a constant)

1 Watt = 3.41 BTU/hour.

So a 100 watt server generates 341 BTU/hour. But a note about using the manufacturer's listed wattage, especially on servers: this is often the MAXIMUM wattage the server will draw based on the power supply ratings, NOT the actual usage. For example, TelVue's B1000 HyperCaster shows 650 watts on the cut sheet – but this is how the power supplies are rated. In actual use the server averages closer to 300 watts. If you are working to calculate how much heat is being generated, make sure you find the actual wattage. Sometimes you may have to reach out to the manufacturer to find this out; or, for you do-it-yourselfers, while you're buying earplugs at the hardware store, look for a Kill-A-Watt. This is a special meter that you put in line between the power source (plug) and the unit you are trying to measure. It will accurately show how much power a given unit is consuming.
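
To put the arithmetic above in one place, here is a minimal sketch in Python. The 120-volt constant, the 3.41 BTU/hour-per-watt factor, and the 650 W nameplate versus roughly 300 W measured figures for the B1000 HyperCaster come from the text; the 2.5 A Kill-A-Watt reading used in the example is an illustrative assumption.

```python
# Sketch of the watts-to-BTU/hour conversion described above.
# The 3.41 BTU/hour-per-watt factor and the ~300 W measured draw for the
# B1000 HyperCaster come from the text; the 2.5 A reading is an example.

US_VOLTAGE = 120          # volts; typical US circuit
BTU_PER_WATT_HOUR = 3.41  # 1 watt of load dissipates ~3.41 BTU per hour


def watts_from_amps(amps: float, volts: float = US_VOLTAGE) -> float:
    """Voltage x Amperage = Wattage."""
    return volts * amps


def btu_per_hour(watts: float) -> float:
    """Convert a continuous electrical load in watts to heat in BTU/hour."""
    return watts * BTU_PER_WATT_HOUR


if __name__ == "__main__":
    # A server whose Kill-A-Watt reading shows ~2.5 A at 120 V:
    measured_watts = watts_from_amps(2.5)  # 300 W actual draw
    print(f"{measured_watts:.0f} W -> {btu_per_hour(measured_watts):.0f} BTU/hour")

    # Nameplate rating (power supply maximum) vs. measured usage:
    print(f"Nameplate 650 W -> {btu_per_hour(650):.0f} BTU/hour (worst case)")
    print(f"Measured  300 W -> {btu_per_hour(300):.0f} BTU/hour (typical)")
```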

So what does all of that mean for you? First, work closely with your HVAC technician to ensure that you select an AC system that is designed to run 24x7x365. Yes, even in the dead of winter you will be running this system. Second, take the time to calculate approximately how many BTU/hour your equipment is pushing out. You'll want to keep the room cooler than 72 degrees Fahrenheit. Third, plan for the future – you're only going to be adding more equipment. Don't be afraid to look at a bigger unit, or two smaller units, to do the job. If you are not doing a build-out but are looking for a supplemental AC system, check out split-ductless AC systems; these are the wall cassettes that have refrigerant lines running to a condenser unit outside. Finally, ask questions. If you're planning on using an existing HVAC system, or the same system that cools the rest of the building, make sure that it's zoned correctly so you can set the thermostat in the equipment room to 65 and NOT freeze out your editing suites!
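
As a rough sizing aid for that second step, the sketch below totals a room's heat load and compares it against a candidate AC unit. The equipment list, the 30% growth headroom, and the 12,000 BTU/hour unit rating are illustrative assumptions, not figures from the post.

```python
# Rough AC sizing sketch: total the measured wattage of the equipment in the
# room, convert to BTU/hour, and add headroom for future growth.
# The equipment list, growth factor, and AC rating below are assumptions.

BTU_PER_WATT_HOUR = 3.41

equipment_watts = {
    "playback server": 300,
    "encoder": 150,
    "network switch": 90,
    "modulator": 60,
}

GROWTH_HEADROOM = 1.30          # plan for ~30% more equipment later
AC_UNIT_BTU_PER_HOUR = 12_000   # candidate split-ductless unit rating

total_watts = sum(equipment_watts.values())
heat_load = total_watts * BTU_PER_WATT_HOUR
planned_load = heat_load * GROWTH_HEADROOM

print(f"Current heat load : {heat_load:,.0f} BTU/hour")
print(f"With growth margin: {planned_load:,.0f} BTU/hour")
print("AC unit is", "adequate" if AC_UNIT_BTU_PER_HOUR >= planned_load else "undersized")
```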

3- Plan for power. Plan on, at minimum, one 20 amp circuit for each rack you have. If you are going to be adding equipment to an equipment room, run some rough calculations on the current power load. Since you've already done most of the work for the BTU calculations, this should be relatively easy. Know where the power panel for your equipment room is located, know which racks are on which circuits, and have this labeled or posted somewhere. You don't want to trip a breaker, spend an hour finding it, and only later discover that some random outlet is on the same circuit – and that someone plugging in a vacuum wiped out your server room.
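
As an example of that rough calculation, here is a small sketch that checks whether one rack's equipment fits on a single 20 amp circuit. The per-device current draws and the 80% continuous-load rule of thumb are assumptions for illustration, not values from the post.

```python
# Quick check of a rack's load against a single 20 A circuit.
# The device current draws and the 80% continuous-load rule of thumb used
# here are assumptions for illustration.

CIRCUIT_AMPS = 20.0
CONTINUOUS_LOAD_FACTOR = 0.80  # keep continuous load at or below 80% of the breaker

rack_amps = {
    "playback server": 2.5,
    "encoder": 1.3,
    "switch": 0.8,
    "monitor": 0.5,
}

total_amps = sum(rack_amps.values())
usable_amps = CIRCUIT_AMPS * CONTINUOUS_LOAD_FACTOR

print(f"Rack load: {total_amps:.1f} A of {usable_amps:.1f} A usable on a {CIRCUIT_AMPS:.0f} A circuit")
if total_amps > usable_amps:
    print("Over budget -- split the rack across additional circuits.")
else:
    print("Within budget.")
```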

4- UPS! Not the shipper UPS, but the "Uninterruptible Power Supply." Again, based on your equipment load, you can select the correct size UPS for a given system. Most UPS units are rated in VA (volt-amperes) of output, which is related to, but not the same as, watts. Do yourself a favor and buy the rack-mountable versions for installation in a rack and the tower versions for office use. I work with a number of APC models – 1500 VA, 2200 VA, and 3000 VA – depending on the power needs. Some of these units require special outlets, so pay attention to that too. You'll want to read all the instructions that come with these units, and pay close attention to the loading. UPS units aren't designed to run for more than a few minutes – basically enough to get you through a short blackout, brownout (sag), or other unexpected power bump, or until a frequency-stabilized backup generator can pick up the load after the grid goes down. Also remember that the lead-acid batteries used by UPS units will need to be changed out and recycled about every three years, and should be tested every six months. Daylight saving time is a good benchmark for testing your UPS units – just like changing the batteries in your smoke detectors :)
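
Since UPS capacity is quoted in VA while your load calculations are in watts, the sketch below shows one rough way to pick among the 1500/2200/3000 VA sizes mentioned above. The 0.9 power factor, the 25% sizing margin, and the 1200 W example load are assumptions for illustration.

```python
# Rough UPS sizing sketch: convert a measured load in watts to the VA rating
# a UPS is sold by, then pick the smallest standard size that covers it with
# some spare capacity. The 0.9 power factor and 25% margin are assumptions.
from typing import Optional

STANDARD_UPS_SIZES_VA = [1500, 2200, 3000]
ASSUMED_POWER_FACTOR = 0.9   # watts ~= VA * power factor for typical IT loads
SIZING_MARGIN = 1.25         # leave ~25% spare capacity


def required_va(load_watts: float) -> float:
    """VA capacity needed to carry the load with margin."""
    return load_watts / ASSUMED_POWER_FACTOR * SIZING_MARGIN


def pick_ups(load_watts: float) -> Optional[int]:
    """Return the smallest standard UPS size that covers the load, if any."""
    need = required_va(load_watts)
    for size in STANDARD_UPS_SIZES_VA:
        if size >= need:
            return size
    return None  # too large for a single unit from this list; split the load


if __name__ == "__main__":
    load = 1200  # watts of measured rack load (example value)
    print(f"{load} W needs ~{required_va(load):.0f} VA -> suggested UPS: {pick_ups(load)} VA")
```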

Always remember: the three things that kill servers and other electronic equipment are Heat, Humidity, and Dust. Although I did not talk much about dust, conventional wisdom applies. Make sure you change out or clean the filters in your AC/HVAC systems, keep servers 18″ off the ground, and take the time to sweep the room. Over time servers can build up dust internally, and depending on the manufacturer's recommendations it's not a bad idea to take the unit outside and blow it out with some compressed air.

Other Considerations and Musings:

Water – If you are in a low-lying area, plan for the server room to flood. It may never happen, but thinking ahead about how your equipment is placed can save it from an accidental drowning… and insurance may not cover the claim.

Earthquakes – If you live in a seismic zone, make sure to work with a contractor to properly secure the seismic rated racks to the floor.

Fire – Some facilities don’t have the capital to install a data center approved fire suppression system, but if you are doing a full build out it may be worth asking. If nothing more, make sure that you don’t have standpipe connections or sprinkler heads above the racks.

Grounding – Electrical code mandates racks be grounded. Make sure to consult your electrician about proper grounding for racks. If racks are placed in a carpeted room make sure that the humidity is controlled and, if possible, the carpet has a static drain. I could write an entire article about the importance of grounding equipment – but I digress.

Computer Floors – Floating floors are awesome as they give you space to run cable from rack to rack under the removable floor tiles. If you’re doing a build-out or a retrofit, check out computer flooring as an option.

Rack Placement & ADA Requirements – ADA mandates a minimum of 3' behind any rack. When planning space in front of a rack, make sure that you allow room to slide equipment all the way out while leaving room for you to stand. Most servers these days are 30″ deep or longer, which means you need a minimum of 40″+ in front of a given rack.

Buy Deep Racks – As mentioned above, new chassis are deeper. Buy racks that are a minimum of 36″ deep. I normally spec the Middle Atlantic VRK-4436-LRD. It has threaded rack rails, cable lacing bars, and is 36″ deep, making it easy to work with. Don't be afraid to recycle the 25-year-old racks and update to the newest models. You could even put the money from scrapping the old ones toward the new racks.

Ladders and Trays – As you add cable and expand, cable management becomes very important. If possible, don't run cables from rack to rack without exiting the top or bottom of the rack first; i.e., don't pull cables wherever it's most convenient. Plan on running them as bundles and in a logical manner, even if it means an extra foot of cable.

Rack Screws – Don’t be cheap and keep using the same screws over and over. Buy good screws that already have the plastic washers on them. Who knows how many times a screw has been put in and taken out? Trust me… There’s nothing worse than having to drill out screws because they were stripped from overuse.

Power Strips – Instead of using small power strips, most rack manufacturers offer special power strips that run the vertical length of the rack and install easily in the back. You can put one on either side, giving you way more outlets than you THINK you would ever need.

Rack Tops – A lot of people buy the tops with the fans. I’m not a huge fan of the fans. Most equipment vents from front to back these days, so leaving both the front and rear of the rack accessible and open will take care of 90% of your ventilation problems.

Get a Labeler – The Brady IDXpert is a great hand-held labeler. Dymo and others have similar cable-labeling products. Labeled cables make troubleshooting easier, especially if you are trying to talk someone through the process.
