[rescue] Deciding how to place machines in a rack...

Skeezics Boondoggle skeezicsb at gmail.com
Fri Mar 7 13:53:41 CST 2008


Lots of interesting discussion about rack placement and cooling!
Guess I'll toss in my $0.02...

The last shop I worked in had well over 100 Sun V20z's (i.e., rebadged
Newisys 1U Opteron boxes) that they had to space 1U apart because
putting them together in groups cooked them. (We called those racks
the "pizza ovens".)

As I recall, the big problem with the V20z was that it had some
ventilation holes at the back on the top of the case - so the machine
stacked directly above would block the holes of the one beneath it!
Fully
configured, they were very power hungry, too. They'd only mount them
in groups of 8 (and even that was pushing it, on a 20A circuit!) and
get 16 machines in a 40+U rack. Ugh. Since they didn't have the power
to support the higher density anyway, they ended up wasting that rack
space and - while still far from optimal - that "solved" the cooling
problem.
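
For anyone doing the math at home, here's a quick back-of-the-envelope
Python sketch of why 8 per circuit was pushing it - the ~450W per box
and the 208V feed are my assumptions for illustration, not measured
V20z numbers:

    # Rough servers-per-circuit arithmetic.  All of these figures are
    # assumptions for illustration, not measured numbers.
    BREAKER_AMPS = 20        # branch circuit rating
    VOLTS = 208              # assumed feed; a 120V circuit halves the budget
    DERATE = 0.80            # the usual 80% rule for continuous loads
    WATTS_PER_SERVER = 450   # guess for a fully loaded 1U Opteron box

    budget_watts = BREAKER_AMPS * VOLTS * DERATE
    print(f"{budget_watts:.0f}W usable -> about "
          f"{int(budget_watts // WATTS_PER_SERVER)} servers per circuit")

Run the numbers and you get roughly 7 boxes of budget, so 8 on a 20A
circuit left essentially no headroom.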

But I have a lot more experience with SPARC hardware, in particular
the 220R/420R/280R boxes. They do proper front-to-back cooling and
I've never known those cases to even get warm to the touch in most
environments.

I've never been a great fan of doors on my racks, since they tend to
just be annoying and get in the way. They also, IMHO, tend to make it
difficult to maintain cooling for all the machines. Even in a raised
floor environment with sufficiently sized A/C, the machines at the
bottom tend to stay significantly cooler than the ones mounted higher
up if you don't properly space your perforated tiles; adding solid
plexiglass doors on the front of the racks can exacerbate that.
Fooling around with fans inside the racks to pull more air up so the
servers at the top can breathe is a pain... But obviously you have to
work in the space that you have with the budget and equipment you
have. Given the preference, though, I prefer open racks.

Besides, a bank of closed doors is borrrrring. Maybe some people like
the "mystique" of a monolithic bank of cabinets, but I've found that
when you're spending piles of someone else's $$s to put together a
server room, the folks that write the checks want to see the "blinky
lights." Seriously. Never underestimate the value of good lighting,
aesthetic placement of the hardware, and quality cable management! The
guys in suits probably don't understand what they're looking at, but
if it's organized, professional and a little bit flashy, you win.

In my home environment, and in the last machine room I built at work,
I mounted a couple of dozen 420R type machines along with a trio of
NetApps using the same strategy. I tend to start at the center of the
rack and work toward the top and bottom. I'd group machines largely by
function, careful to spread them across racks to keep them on
different power circuits - i.e., you don't want the entire web server
farm to go down at once if a single UPS or breaker faults. So in a 40U
rack, I'd mount the first machine from 21-24, then from 17-20, then
from 25-28, 13-16, etc. I'd reserve the bottom few spaces for UPS (if not
centralized) and other power management stuff like transfer switches,
distribution, etc. At the top, leave space for network gear, patch
panels and cable management. With the storage arrays, put the head in
the middle, then build separate loops with disk trays above and below,
and so on; makes it easy to add more shelves later on. If it's
financially feasible, having (e.g.) 8 racks that are 2/3rds full is
hugely preferable to 6 racks that are crammed top-to-bottom. (Trust
me, you do NOT want to replace a 420R memory riser card perched on a
ladder to reach a machine mounted 6' off the floor...)
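
If it helps to see that fill pattern spelled out, here's a little
Python sketch of the center-out ordering for 4U machines in a 40U
rack - the 8U reserved at the top and bottom are just placeholder
figures, not a recommendation:

    # Sketch of the "start in the middle, alternate down and up" fill
    # order.  Reserved ranges for power gear (bottom) and network gear
    # (top) are placeholder choices.
    def fill_order(rack_u=40, machine_u=4, reserve_bottom=8, reserve_top=8):
        mid = rack_u // 2
        up, down, slots = mid + 1, mid, []
        while True:
            placed = False
            if up + machine_u - 1 <= rack_u - reserve_top:
                slots.append((up, up + machine_u - 1))      # e.g. 21-24
                up += machine_u
                placed = True
            if down - machine_u + 1 > reserve_bottom:
                slots.append((down - machine_u + 1, down))  # e.g. 17-20
                down -= machine_u
                placed = True
            if not placed:
                return slots

    for lo, hi in fill_order():
        print(f"next machine goes at U{lo}-U{hi}")

That prints 21-24, 17-20, 25-28, 13-16, and so on, which is exactly
the order I'd rack them in.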

In many cases, careful placement of supply air and exhaust or return
air - even when just moving it around with fans - is sufficient,
avoiding the expense of an A/C unit. My house is situated on a narrow
lot between two neighboring structures, so it gets very little direct
sunlight on either the east or west sides, and the foundation gets no
southern exposure. It stays naturally cool in the basement year round.
So I took two old DEC rack fans, bought a custom grille and built a
box to contain them, mounted it on hinges under the ceiling joists
beneath a custom plenum with a slot to contain a filter. That brings
in outside air and builds positive air pressure in the basement
outside the server room.

Inside the server room I have a 14" variable-speed 1800cfm fan that
just happens to fit perfectly in a round portal through the foundation
(that may have at one time been a coal chute?), and it exhausts the
hot air in the room to the outside. This works out perfectly, since
the room is largely below grade, and the fan is near the ceiling. The
space is around 900 cubic feet, and I don't need to change the air
twice a minute, so I run the fan at about 1/2 speed or lower most of
the time. :-)
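
The arithmetic, if you want to sanity-check it (this assumes airflow
scales roughly linearly with fan speed, which is a simplification):

    # Air-change arithmetic for the exhaust fan.
    ROOM_CUFT = 900        # approximate room volume
    FAN_CFM = 1800         # rated flow at full speed
    speed = 0.5            # running at about half speed

    cfm = FAN_CFM * speed
    per_min = cfm / ROOM_CUFT
    print(f"{cfm:.0f} CFM -> {per_min:.1f} air changes/min "
          f"({per_min * 60:.0f}/hour)")

Even at half speed that's about one full air change per minute, which
is plenty for a room this size.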

The "membrane" between the two rooms are three filtered grilles
mounted at the opposite corners from the exhaust fan. A small 4"x10"
plenum brings in outside air directly, and two large 14"x14" grilles
(which fit perfectly between the wall studs) are situated between the
computer room and the side of basement where the intake fans are
mounted. With negative air pressure in the server room and positive
pressure outside it, and filters at each stage to keep down the dust,
the naturally cool air from the basement circulates through the room
from NW to SE, flowing through the racks and keeping my servers at a
very comfortable 72F ambient (a fairly constant 28-30degC shelf
temperature as reported by the NetApp, square in the center of its
rated range). The fans use a *lot* less power than running even a
small A/C unit, so I can even keep them on a UPS to maintain cooling
during outages.

Of course, with some better planning, I'd have situated things
slightly differently. We have very mild winters, but I still wish I'd
put in some diverters to give me the option to recapture the heat in
winter rather than blow it out the side of the house. And with summers
in Portland getting hotter earlier and staying hotter longer, this
system doesn't quite hold up when we get 3 or more days in a row above
90F, so I do now have a supplemental A/C unit that helps manage
things. Alas.

I'm actually looking for the one person in a million who'd walk into
this house and say "Holy crap, you have 120A of power wired up in
here, and this crazy cooling system? And 3 44U racks _included_?
Sold!!" (So, I need a computer geek or a pot grower who wants a 3,100
sq ft duplex w/6BR, 3 baths...) I'm planning to sell, and if some
Rescuer wanted to pick up a lovely pile of old HP, DEC, NeXT,
Tektronix, and Sun hardware, I'd throw it all in for free! But we're
straying off-topic. :-)

-- Chris


