
Roundup: Three 16-Port Enterprise SAS Controllers

Hardware Comparison Table and Test Setup

Manufacturer: Adaptec | Areca | Promise
Model: RAID 51645 | ARC-1680ix-16 | SuperTrak EX16650
Internal Connectors: 4x SFF-8087 | 4x SFF-8087 | 4x SFF-8087
External Connectors: 1x SFF-8088 | 1x SFF-8088, 1x LAN, COM | N/A
Cache: 512 MB DDR2-400 ECC, on board | 512 MB DDR2-533, DIMM | 512 MB DDR2 ECC, on board
Profile: Full height, half length | Full height | Full height
Interface: PCI Express x8 | PCI Express x8 | PCI Express x8
XOR Engine: 1.2 GHz dual-core RAID on Chip (ROC) | Intel IOP348, 1,200 MHz | Intel IOP348, 1,200 MHz
RAID Level Migration: Yes | Yes | Yes
Online Capacity Expansion: Yes | Yes | Yes
2+ TB Volumes (64-bit LBA): Yes | Yes | Yes
Multiple RAID Arrays: Yes | Yes | Yes
Command Line Interface: Yes | Yes | No
Hot Spare Support: Yes | Yes | Yes
Battery Backup Unit: Optional | Optional | Optional
RAID 5 Init: 45 min | 25 min | 1 h 16 min
RAID 5 Rebuild: 33 min | 50 min | 55 min
RAID 6 Init: 55 min | 25 min | 1 h 30 min
RAID 6 Rebuild, Drive 1: 40 min | 55 min | 1 h 4 min
RAID 6 Rebuild, Drive 2: 32 min | rebuilt simultaneously | 55 min
RAID 6 Rebuild, Total: 1 h 12 min | 57 min | 1 h 59 min
Spin Down Idle Drives: Yes | Yes | No
Power Consumption (Power Saving): 298 W | 296 W | N/A
Power Consumption (Idle): 368 W | 365 W | 364 W
Power Consumption (Peak): 412 W | 409 W | 402 W
Supported RAID Modes: 0, 1, 1E, 5, 5EE, 6, 10, 50, 60, JBOD | 0, 1, 10 (1E), 3, 5, 6, 30, 50, 60, single disk or JBOD | 0, 1, 1E, 5, 6, 10, 50, 60
Fan: No | Yes | No
Supported OS: Windows XP, Server 2003/2008, Vista, Red Hat Enterprise Linux (RHEL), SUSE Linux Enterprise Server (SLES), SCO OpenServer, UnixWare, Sun Solaris 10 x86, FreeBSD | Windows 2000/XP/Server 2003/Vista, Linux, FreeBSD, Novell NetWare 6.5, Solaris 10 x86/x86_64, SCO UnixWare 7.x.x, Mac OS X 10.x (EFI BIOS support) | Windows 2000, XP, Vista, Server 2003, Server 2008, Red Hat Linux, SuSE Linux, Miracle Linux, Fedora Core, Linux open-source driver (32/64-bit), FreeBSD, VMware 3.02, 3.5
Other Features: Copy-back hot spare | Integrated web server | N/A
Warranty: 3 years | 3 years | 3 years
Price: $999 | $999 | $800
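
The spin-down figures above translate directly into energy savings over a year of 24/7 operation. A rough back-of-the-envelope sketch in Python (the electricity rate is an assumed value for illustration, not part of our testing):

```python
# Back-of-the-envelope: what spin-down ("power saving") mode is worth per
# year of 24/7 operation, using the system-level figures from the table.
# The electricity rate is an assumed value, not from the article.

HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.12  # assumption for illustration

cards = {                       # (idle W, power-saving W) from the table
    "Adaptec RAID 51645": (368, 298),
    "Areca ARC-1680ix-16": (365, 296),
}

for name, (idle_w, saving_w) in cards.items():
    saved_w = idle_w - saving_w
    kwh = saved_w * HOURS_PER_YEAR / 1000
    print(f"{name}: {saved_w} W saved -> {kwh:.0f} kWh/yr "
          f"(~${kwh * RATE_USD_PER_KWH:.0f}/yr)")
```
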
System Hardware
Processor(s)
2x Intel Xeon Processor (Nocona core); 3.6 GHz, FSB800, 1 MB L2 Cache
Platform
Asus NCL-DS (Socket 604)
Intel E7520 Chipset, BIOS 1005
RAM
Corsair CM72DD512AR-400 (DDR2-400 ECC, reg.)
2x 512 MB, CL3-3-3-10 Timings
System Hard Drive
Western Digital Caviar WD1200JB
120 GB, 7,200 RPM, 8 MB Cache, Ultra ATA/100
Test Drives
16x Fujitsu MBA3147RC
147 GB, 15,000 RPM, 16 MB Cache, SAS
Mass Storage Controller(s)
Adaptec RAID 51645
Areca ARC-1680ix-16
Promise SuperTrak EX16650
Networking
Broadcom BCM5721 On-Board Gigabit Ethernet NIC
Graphics Subsystem
On-Board Graphics
ATI RageXL, 8 MB
I/O Performance
IOMeter 2003.05.10
File Server Benchmark
Web Server Benchmark
Database Benchmark
Workstation Benchmark
Streaming Reads
Streaming Writes
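
For readers unfamiliar with these IOMeter patterns: each one drives the array with a weighted mix of block sizes, read/write ratios, and random versus sequential accesses. The sketch below is a rough stand-in for a database-style pattern (8 KB transfers, fully random, roughly two-thirds reads); the file name, size, and operation count are placeholders, and it is no substitute for IOMeter's own access specifications.

```python
# Rough stand-in for a database-style IOMeter pattern: 8 KB transfers,
# 100% random, ~67% reads. PATH, FILE_SIZE, and OPS are placeholders.
# Uses buffered file I/O, so treat the output as illustrative only.
import os
import random
import time

PATH, FILE_SIZE, BLOCK, OPS = "testfile.bin", 256 * 2**20, 8192, 10_000

with open(PATH, "wb") as f:     # allocate a 256 MB target file
    f.truncate(FILE_SIZE)

buf = os.urandom(BLOCK)
start = time.perf_counter()
with open(PATH, "r+b") as f:
    for _ in range(OPS):
        f.seek(random.randrange(FILE_SIZE // BLOCK) * BLOCK)
        if random.random() < 0.67:      # ~67% of operations are reads
            f.read(BLOCK)
        else:                           # ~33% are writes
            f.write(buf)
elapsed = time.perf_counter() - start
print(f"{OPS / elapsed:.0f} ops/s (buffered I/O, not a calibrated result)")
```
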
System Software & Drivers
OS
Microsoft Windows Server 2003 Enterprise Edition, Service Pack 1
Platform Driver
Intel Chipset Installation Utility 7.0.0.1025
Graphics Driver
Default Windows Graphics Driver


Test Drives: Fujitsu MBA3147RC (15,000 RPM)

We used 16 Fujitsu MBA3147RC 15,000 RPM SAS drives to make sure that the controllers, not the drives, would be saturated during our tests. These Fujitsu drives are state-of-the-art server models with a 16 MB cache and sequential throughput of over 150 MB/s.
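
As a quick sanity check of that saturation claim, consider the cards' own links. A back-of-the-envelope sketch, assuming 3 Gb/s SAS lanes and a PCIe 1.1 x8 host interface, both 8b/10b encoded:

```python
# Sanity check: can 16 drives at ~150 MB/s saturate these cards?
# Assumes 3 Gb/s SAS lanes and a PCIe 1.1 x8 host link, both 8b/10b
# encoded (10 bits on the wire per data byte).

DRIVES, DRIVE_MBPS = 16, 150            # drive count and streaming rate
SAS_LANE_MBPS = 3000 / 10               # ~300 MB/s usable per 3 Gb/s lane
PCIE_X8_MBPS = 8 * 2500 / 10            # ~2000 MB/s for PCIe 1.1 x8

aggregate_mbps = DRIVES * DRIVE_MBPS    # 2400 MB/s offered by the disks
sas_mbps = 16 * SAS_LANE_MBPS           # 4800 MB/s across four x4 ports

print(f"disks {aggregate_mbps} MB/s | SAS links {sas_mbps:.0f} MB/s | "
      f"PCIe x8 {PCIE_X8_MBPS:.0f} MB/s")
# The host link (~2 GB/s) is the lowest ceiling, so 16 of these drives
# are more than fast enough to expose each controller's limits.
```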

Comments

Anonymous, 24 April 2009 21:58
What firmware did you test on the Areca card? I heard there were some performance issues with earlier firmware versions (the current release is 1.46).

jwoollis, 14 May 2009 17:26
I am disappointed in this review; it does not go into enough detail to do justice to the topic.

Firstly, most companies these days would build systems using more than 16 drives or a single 4U rackmount enclosure, and would use iSCSI to distribute storage rather than mounting disks in each server. Many companies will have several such enclosures with 32, 64, 128, 256 or more drives, and will also use this storage in parallel to avoid the risk of controller or hardware (other than disk) failure.

You neglect to mention that each port on the controller card, which supports up to four drives with a fanout cable (or more through edge and fanout expanders), runs in full duplex: 4x 3 Gb/s, or 12 Gb/s, in each direction, and this will of course double with the next generation of 6 Gb/s connections. With conventional SATA/SAS drives peaking at between 100 and 200 MB/s, a single port can support far more than four drives; in practice, it would take between 6 and 12 drives transferring data continuously in one direction at full speed to use up the bandwidth of one port. Allowing for the fact that drives are rarely used in this manner for sustained periods, and that drives may be separated into groups rather than used as one huge RAID array, it would be possible to run between 16 and 64 drives off a single port. This might be seen as bad practice when a controller has enough ports to separate the drives into smaller groups, but the point is that these controllers are far more versatile than this article lets on.
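
A rough sketch of that arithmetic, assuming 3 Gb/s lanes with 8b/10b encoding (about 300 MB/s usable per lane):

```python
# The per-port arithmetic: one x4 wide SAS port, 3 Gb/s per lane,
# 8b/10b encoded, against typical drive speeds. The 100-200 MB/s
# drive figures are the range quoted above, not measured values.

LANES_PER_PORT = 4
USABLE_MBPS_PER_LANE = 3000 / 10        # ~300 MB/s per 3 Gb/s lane

port_mbps = LANES_PER_PORT * USABLE_MBPS_PER_LANE   # ~1200 MB/s each way

for drive_mbps in (100, 150, 200):
    print(f"at {drive_mbps} MB/s per drive: ~{port_mbps / drive_mbps:.0f} "
          f"drives to saturate one port (per direction)")
# -> 12, 8, and 6 drives respectively, matching the 6-12 range above.
```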

You neglect to mention that edge and fanout expanders are available as 3.5" or 5.25" drive-bay devices, or as bare circuit boards that can be installed in free-standing enclosures or any PC/server case, to allow more drives per port per controller than would normally be possible with fanout cables alone.

There are some who would rather build custom solutions than pay the extraordinarily large sums required for a 16-bay rackmount storage product, which is often prohibitively expensive.

You also neglect to test these controller cards to their limits when used with multiple edge and fanout expanders.

Please, when investigating such a topic, do us the service of covering all aspects properly and in detail so that we can make an informed decision. The controller card is only one part of the solution, and the cost of such add-ons may range from £8,000 to £24,000 per rackmount bay, depending on the number of disks supported and the number and size of disks preinstalled. Perhaps you might offer examples of these add-ons with specifications and a cost per GB. That would certainly put things into perspective!