SAS Expanders, Build Your Own JBOD DAS Enclosure and Save – Iteration 1
Oftentimes, users running file servers such as Windows Server 2008 R2, Windows Home Server, Linux variants (including Openfiler), OpenSolaris, FreeBSD (including FreeNAS), and so forth will require more storage than their server can physically hold. One option is to add more servers to the SAN. Another is to add more storage to an existing server, and adding a second (or third) enclosure for additional disks is a great way to do that. It allows a server administrator to build a massive DAS storage system very inexpensively for applications like iSCSI, backup storage, media storage, and virtual machine storage. The ensuing research often leads IT professionals to JBOD DAS enclosures with SAS expanders built in.
While a common solution, large JBOD enclosures with built-in SAS expanders can be quite pricey. For those unfamiliar with them, these are direct attached storage (DAS) enclosures that connect directly to a server or system via one or more cables and hold numerous hot-swap drives. The included SAS expanders are not meant to run RAID natively, but rather to allow numerous disks to connect to a server through fewer cables. In essence, for this 20-drive enclosure, there will be one SFF-8088 cable connecting the DAS SAS expander enclosure to the main server for twenty JBOD drives.
This does not mean that the drives behind the SAS expander cannot be used in RAID. Quite the contrary: oftentimes a SAS expander's disks will be controlled by a host RAID controller. The cost savings with this setup come from the fact that SAS expanders are the cheaper way to reach high port counts.
Many will note that one can easily purchase a SAS expander enclosure from a variety of sources. These solutions often range from $2,000 to over $3,000 per 3U or 4U enclosure. My goal is to show you how to build a DIY JBOD DAS enclosure with a SAS expander for half that (potentially less). The unit will be able to support over 40TB of data today, and by the end of 2010, when 3TB and 4TB disks are out, it will scale to 60TB to 80TB in a single 4U enclosure.
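The capacity math above is simple to sketch. The 20-bay count comes from the Norco RPC-4220 chassis and the drive sizes are the ones named in the text; decimal terabytes are assumed:

```python
# Raw capacity scaling for a fully populated 20-bay enclosure.
# Bay count and drive sizes from the article; decimal TB assumed.
BAYS = 20

def enclosure_capacity_tb(drive_size_tb, bays=BAYS):
    """Raw (pre-RAID) capacity of a fully populated enclosure, in TB."""
    return drive_size_tb * bays

print(enclosure_capacity_tb(2))  # 2TB drives -> 40TB today
print(enclosure_capacity_tb(3))  # 3TB drives -> 60TB
print(enclosure_capacity_tb(4))  # 4TB drives -> 80TB
```

Keep in mind these are raw figures; any RAID level you run on the host controller will reduce usable capacity below these numbers.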
In the near future, I will have some updates to this article showing some improvements I have made. So stay tuned!
In this guide I will use:
- Norco RPC-4220 which is a 4U case
- HP SAS Expander
- 550W power supply (some suggest a higher-rated power supply, but without a CPU the power consumption is very low)
- A cheap $35 motherboard
- Miscellaneous SFF-8087 and SFF-8088 cables
Total cost was under $800. A great place to spend a bit more money (and a requirement for many) is on a redundant power supply. One thing to note: I tried this with a few different low-cost mATX motherboards and did have an issue with the MSI socket AM2 board I used in the Sempron 140 in a box review. It is also the one pictured, since I used it for mock-up purposes.
Step 1: Ready the chassis for installation. The first step is to get the Norco RPC-4220 opened up and ready for installation. Most users will want to uninstall the fan bracket as it makes installing power and data cables much easier.
Step 2: Install the power supply. For this guide I used a low-cost 550W power supply. Since there is no CPU or memory drawing power at startup, this PSU actually worked fine. Of course, doing this again, I would likely choose a PSU with more power output and built-in redundancy.
Step 3: Install internal power cables and internal SFF-8087 cables. This is a really simple step. Again, you will likely want to uninstall the fan bracket, as that makes cable installation much easier in the Norco RPC-4220.
Step 4: Install the motherboard. It should be noted that the MSI motherboard depicted below had an SMBus issue that prevented the HP SAS Expander from operating properly when no CPU was installed. I ended up swapping it for an Intel mATX H55 board, which worked perfectly, as did old EVGA and ASUS Socket 775 motherboards. Those boards all worked flawlessly without the CPU and memory installed.
I decided not to use a CPU and memory in the DAS enclosure because airflow is less restricted without the additional components. In addition, there are fewer pieces to fail and fewer parts consuming power, both of which are good things for this enclosure.
Step 5: Install the HP SAS Expander by simply plugging it into an open PCIe slot. The HP SAS Expander does not use the PCIe bus for data transfer; the slot only supplies power.
Step 6: Connect the internal SFF-8087 cables and power cables to the HP SAS Expander and motherboard (respectively).
Step 7: Install disks such as Hitachi Deskstar 1TB and 2TB drives (they work great in RAID and are CHEAP!). Personally, I would completely install only one drive at first. You want to verify everything is working before plugging in $2,500 worth of drives.
Step 8: Connect the DAS box to the server using an SFF-8088 cable between the HP SAS Expander's external port and the main server/workstation's SFF-8088 port (usually found on the server's RAID controller). One huge advantage of the HP SAS Expander is that this external port makes external cabling easy. Converting SFF-8087 to SFF-8088 through a PCB backplane sitting in an expansion slot will cost $40 or more once all costs are added up. Using the HP SAS Expander with a supported RAID card that includes external ports avoids that unnecessary expense and the extra slot.
Step 9: Power the unit on with one low-cost drive and make sure everything is working properly, including the fans, HP SAS Expander, disk, and power supply.
Step 10: Add more drives and start testing/configuring.
I have even tested this configuration out with SSDs:
I would advise against attaching too many SSDs to the DAS box because the HP SAS Expander uses only an aggregate 12.0Gbps uplink to the Areca 1680LP. For small reads/writes this may make sense, but adding 22 SSDs to the Norco RPC-4220 will obviously create a bit of a bottleneck at the HP SAS Expander.
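A rough back-of-the-envelope calculation makes that bottleneck concrete. The four-lane wide port and 3.0Gbps per-lane rate come from the text; the 250MB/s sequential figure is an illustrative assumption for a circa-2010 SATA SSD, not a measurement:

```python
# Estimate per-SSD bandwidth behind the expander's 4-lane SATA II uplink.
# Lane count and rate from the article; SSD speed is an assumed figure.
LANES = 4                # SFF-8088 wide port = 4 lanes, 4 x 3.0Gbps = 12.0Gbps
LANE_GBPS = 3.0          # SATA II signaling rate per lane
ENCODING = 0.8           # 8b/10b line code: 8 data bits per 10 bits on the wire
SSD_COUNT = 22
SSD_SEQ_MBPS = 250.0     # assumed sequential throughput of one 2010-era SSD

uplink_mbps = LANES * LANE_GBPS * 1000 * ENCODING / 8  # usable MB/s
per_drive_mbps = uplink_mbps / SSD_COUNT

print(round(uplink_mbps))     # ~1200 MB/s aggregate across the uplink
print(round(per_drive_mbps))  # ~55 MB/s per SSD if all stream at once
```

With 22 SSDs streaming simultaneously, each gets a small fraction of what a single drive can deliver, which is why this layout only makes sense for small, random workloads.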
That’s it! While you are doing the above, consider that each of the 10 steps saves you $100 to $200 over pre-built solutions. There are very few DIY hardware projects that can save so much money so easily.
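A quick sanity check of that claim against the numbers earlier in the article (the $2,000 to $3,000 prebuilt range and the sub-$800 build cost are from the text):

```python
# Total savings vs. a prebuilt expander enclosure, spread across the 10 steps.
# Price figures from the article.
PREBUILT_LOW, PREBUILT_HIGH = 2000, 3000  # typical 3U/4U prebuilt enclosure
DIY_COST = 800                            # this build's approximate total
STEPS = 10

savings_low = PREBUILT_LOW - DIY_COST     # 1200 total
savings_high = PREBUILT_HIGH - DIY_COST   # 2200 total
print(savings_low / STEPS, savings_high / STEPS)  # 120.0 to 220.0 per step
```

The result lands right around the $100 to $200 per step figure, a little above it at the high end of the prebuilt range.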
One important note is that the HP SAS Expander does support 6.0Gbps SAS (you want the newest firmware, v2.02, for this, although support was introduced in the v1.5x firmware) but only 3.0Gbps SATA II. Also, the HP SAS Expander does NOT work with all RAID controllers. For some quick information, please see this post. You also want to consider a card with staggered disk spin-up capability. Spinning up 20+ hard drives can be strenuous on the power supply if they all come online together. Once spinning, the amount of power required is much lower.
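A sketch of why staggered spin-up matters for this build. The per-drive wattages here are rough illustrative assumptions for 3.5-inch 7,200RPM disks, not datasheet values:

```python
# Startup vs. steady-state power draw for a 20-drive enclosure.
# Per-drive wattages are assumed round numbers, not measured values.
DRIVES = 20
SPINUP_W = 30.0   # assumed spin-up surge per drive (~2.5A on the 12V rail)
ACTIVE_W = 8.0    # assumed per-drive draw once spinning
PSU_W = 550       # the power supply used in this build

all_at_once_w = DRIVES * SPINUP_W                  # every drive surging together
staggered_w = SPINUP_W + (DRIVES - 1) * ACTIVE_W   # worst moment with staggering

print(all_at_once_w)  # 600W: more than the 550W PSU is rated for
print(staggered_w)    # 182W: comfortable headroom when drives start one at a time
```

Under these assumptions, simultaneous spin-up overshoots the 550W supply while a staggered start never comes close, which is exactly why a controller with staggered spin-up support is worth seeking out.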
Next steps for this series: a better power solution (I have this running in the current DAS box attached to the Big WHS), using the 4U of space more efficiently, configuration, and other options. I will post Iteration 2 of this guide shortly with those changes.