Not at home, but homebrewed...

Here at work, we've recently upgraded the HDD's in the library's
computers.  In doing that, our computer department has on its hands
about twenty-seven 40GB serial ATA drives, which sparked the idea for a
backup machine.  We're planning to build it from scratch, and while I
have experience building PCs and servers with IDE and SCSI drives, the
technical challenges presented by this project are a bit beyond what
I'm used to, namely building a machine that will require space,
connectivity, and power for at least six of these drives.

What I'm curious about is this:

1. Can anyone recommend a decent SATA RAID card for Linux?  (I'm hoping
for Ubuntu, but at this stage we haven't gotten that far.)

2. Can anyone suggest a place to buy a case with room for two power
supplies?  (Ideally we can do all this with one, but I'd rather be
safe than sorry.)

3. For this kind of project, would it be better to divide the drives up
over a couple of motherboards, essentially putting a cluster in one box?

Any suggestions would be appreciated.  Thanks in advance!

Andrew Burton - A Guide to Esoteric Technology in Paragon City

Re: Not at home, but homebrewed...

Andrew Burton wrote:

Maybe I can attack this question in reverse order.

Two brand new 750GB disks would have more storage capacity than
your 27 disks (1.5TB versus about 1.08TB). They use less electricity,
and don't need a Herculean power supply. And there are probably two
SATA ports on just about any computer you grab, to do the job. Linux
should have no problems in such a simple environment. Disk cost
totals $540.
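A quick sketch of that arithmetic, using only the figures quoted above (the pair of new drives actually comes out somewhat ahead on capacity):

```python
# Compare the 27 old drives against two modern 750 GB drives,
# using the numbers from the post above.
old_total_gb = 27 * 40          # twenty-seven 40 GB drives
new_total_gb = 2 * 750          # two 750 GB drives
new_cost = 540                  # quoted total for the pair

print(f"old farm: {old_total_gb} GB")
print(f"new pair: {new_total_gb} GB at ${new_cost} "
      f"(${new_cost / new_total_gb:.2f}/GB)")
```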

If you really want to build a server with that many disks, a $200
power supply might be used. A Silencer 750W has 12V @ 60A, for example,
which would be enough to spin up 30 disks at 2 amps apiece. Per-disk
12V draw drops to about 0.6A at idle, after spinup, so 27 * 0.6A = 16.2
amps at 12V, or close to 200W of power just for the +12V rail. And on
the +5V rail, the controller boards on the disks use 5V @ 1 amp, 27
amps total for 27 disks, or 135W there.
All that heat must be efficiently exhausted, via multiple case fans.
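The power budget above can be checked in a few lines (same per-drive figures as quoted: ~2 A spinup and ~0.6 A idle on +12 V, ~1 A on +5 V):

```python
# Rough power budget for a 27-drive farm, per the figures above.
n = 27
spinup_12v_a = n * 2.0          # 54 A surge; a 60 A rail covers up to 30 drives
idle_12v_w   = n * 0.6 * 12     # ~194 W steady on the +12 V rail
logic_5v_w   = n * 1.0 * 5      # 135 W on the +5 V rail

print(f"spinup: {spinup_12v_a:.0f} A on +12 V")
print(f"idle:   {idle_12v_w:.0f} W (+12 V) + {logic_5v_w:.0f} W (+5 V)")
```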

There are RAID racks, which might have good volumetric efficiency.
I don't know what the electrical interface looks like on those,
or what it takes to cable them up. A tower case has limits to how much
it can hold, and a server case costs a fortune for a good one (less
if dumpster diving or from Ebay).

In terms of controller cards, the cheapest route I can see is
SATA controller cards based on the SIL3114. That chip has four ports.
You'd need 7 cards, and would run out of PCI slots before you got
them all installed.
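The card count follows directly from the port count (a quick sketch):

```python
import math

drives = 27
ports_per_card = 4                          # SIL3114 is a four-port chip
cards = math.ceil(drives / ports_per_card)  # round up: partial cards don't exist
print(cards)                                # more PCI slots than most boards have
```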

Marvell makes an eight port SATA chip, which would be ideal for
making a competitor to the SIL3114, but AFAIK, nobody makes cheap
non-RAID controller cards with their chips. At least I haven't run
into any along the way. It could be that Marvell priced the chips
for usage only in RAID controllers, preventing companies like Syba
from making a cheap controller. With an eight port card, you'd only
need four cards. And an eight port controller chip like that probably
has the wrong bus interface anyway.

In the Cadillac class, there is Areca. They make RAID controllers.
This is the most dense card I can find, and has 24 SATA ports on it.
They make cards for different busses, and perhaps two 16 port
PCI Express cards would be a better fit for your average desktop
("SLI" configuration) motherboard. $1170. Dunno about Linux drivers,
as I expect the price is repellent enough to make checking for
drivers irrelevant.

So, really, you'd have to have deep pockets, and be doing it
for the fun of having a monster disk farm.

There are other ways to build storage controllers. For example,
a two port RAID storage controller based on SIL3132, can be
used with SATA port multiplier boxes. The port multiplier
box converts one SATA port to five ports. Thus, with one
controller card, you get 10 ports total. Still waiting
to see someone combine the three chips here, on a single
controller card (with resultant price drop).

    SIL3132           <-----> 5 port, port multiplier box $100
    2 Port Controller <-----> 5 port, port multiplier box $100

The SIL3132 is pretty cheap (cheaper than the boxes). Getting
enough ports, and a power supply, might cost you $1000 or so,
sans computer case.

So far, the SIL3114 is about the best I could do, and maybe
with a 6 slot PCI bus machine, you could host six $20 cards, for
24 ports. So your cost is $200 for power and $120 for controllers.
And hope that all six cards will co-exist peacefully.

SYBA SD-SATA-4P PCI SATA Controller Card (SIL3114 based) - $20 each.

There is a forum that has participants with large storage servers.
You might register on that site, and use their search engine, for
more ideas.

I'm not saying it is impossible, merely wasteful of electricity.
And perhaps space, depending on what you can find for a case
for the 27 disks.

Maybe you have a local computer recycler, and the 27 disks can
be used in some computers saved from the scrap heap ? Or sell the
lot on Ebay and see what they will fetch. You might get enough
to make a down payment on the 2x750GB solution.


Re: Not at home, but homebrewed...


...not to mention the fact that these drives are probably what, 4-5 years
old, with what, a 3 year warranty? So for a "backup" you will be using
disks that are more likely to fail than the disks storing the original
data. The poster mentioned using perhaps only 6 of the 27 disks. That's
less than 240 gigs of space, presumably cut in half if you are using RAID
with mirroring, or down under 200 if you are using RAID 5. I just bought
a few new 400 gig Seagate drives for about $119 each. So about twice the
space for $119? I can think of many other things to fiddle with that
would be more interesting and useful. If I had those disks, I would clone
the operating systems from the machines they were pulled from onto them
using Ghost or Acronis, set them aside for use in an emergency, and buy a
new drive or two for backup purposes.
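The capacity figures above work out like this (a quick sketch of the standard RAID math for six 40 GB drives):

```python
# Usable capacity from six 40 GB drives under the schemes mentioned.
n, size_gb = 6, 40
raw      = n * size_gb          # 240 GB raw
mirrored = raw // 2             # RAID 1: half goes to redundancy -> 120 GB
raid5    = (n - 1) * size_gb    # RAID 5: one drive's worth of parity -> 200 GB

print(raw, mirrored, raid5)
```

Formatted capacities come in a bit lower still, which is why the post says "under 200".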

