Aplicata Quad M.2 NVMe SSD PCIe x8 Adapter Review

Conclusion

The Aplicata Quad M.2 NVMe SSD PCIe x8 Adapter has obvious advantages for software-defined servers in data centers, but that's not our target audience. The adapter does have some nice features for mainstream users, but not for the obvious reasons.

The adapter allows you to fit four M.2 SSDs in your system. With most motherboards, you can mount only two M.2 SSDs before you have to use the PCIe slots. Unfortunately, the onboard M.2 slots usually route through the PCH, which is shared with many other devices. The DMI link between the PCH and the CPU is only PCIe 3.0 x4 (the same bandwidth as a single NVMe SSD). That means your two NVMe SSDs share the same link with nearly every other device connected to your system.
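As a rough back-of-the-envelope check of that bottleneck, here is a short sketch using nominal PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding); these are theoretical ceilings that ignore packet and protocol overhead, so real-world throughput is lower:

```python
# PCIe 3.0 signals at 8 GT/s per lane and uses 128b/130b encoding,
# so usable bits per second per lane = 8e9 * 128/130.
GT_PER_S = 8e9
ENCODING = 128 / 130

lane_bw = GT_PER_S * ENCODING / 8 / 1e6  # MB/s per lane (8 bits per byte)
dmi_bw = lane_bw * 4   # DMI 3.0 is electrically equivalent to PCIe 3.0 x4
card_bw = lane_bw * 8  # the Aplicata adapter's x8 edge connector

print(f"per lane:     {lane_bw:7.0f} MB/s")
print(f"DMI 3.0 (x4): {dmi_bw:7.0f} MB/s")
print(f"x8 slot:      {card_bw:7.0f} MB/s")
```

The DMI ceiling of roughly 3.9 GB/s is what every PCH-attached device shares, while the x8 slot gives the adapter about twice that to itself.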

The Aplicata Quad M.2 NVMe SSD PCIe x8 Adapter moves your high-bandwidth storage to the PCI Express bus and doesn't route through the PCH. There are significant performance advantages that don't always show up in our canned testing, particularly if your applications crave extreme storage bandwidth.

We shouldn't always put performance first, though. The adapter allows you to build a high-capacity NVMe volume out of smaller, less expensive drives. You can cram up to 8TB of high-speed flash into the adapter if you use 2TB SSDs, and you can get there one step at a time, buying drives as your needs increase.

Thermal throttling is not an issue for most users, but if you need a product that can write at up to 7,000 MB/s, you are moving a lot of data. Extended heavy sequential write workloads are the most common catalyst for thermal throttling, but the adapter's full-height, half-length design leaves room for large heat sinks to cool the M.2 SSDs. With moderate airflow, the adapter is far less likely to throttle than a similar product without heat sinks. You can still heat-soak the coolers over time, but that pushes the condition out to hours of use rather than the minutes, or even seconds, you get with bare drives.

There are a number of low-cost M.2-to-PCIe "dummy" adapters on the market, but you shouldn't lump this product into the same category. This adapter has a PLX bridge and, more importantly, provides additional protections. Most SSDs cache some user data in volatile memory, so if your drive loses power, you can lose data. The capacitors on the Aplicata adapter provide another layer of security during an unexpected power loss.

The adapter also brings more connectivity to consumer-focused chipsets. On Z97 through Z270 platforms, the CPU can't provide sixteen lanes to the second primary PCIe slot if you already have a video card installed. With only an x8 connection available, many of the other M.2-to-PCIe adapters can use just two of the four M.2 slots on the card; the PLX switch lets this adapter drive all four.
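The difference between a passive (bifurcation-based) adapter and a switched card like this one can be sketched as simple lane arithmetic. This is an illustrative toy function, not a real API; the name and signature are ours:

```python
def usable_m2_slots(host_lanes: int, has_switch: bool, card_slots: int = 4) -> int:
    """How many x4 M.2 slots a four-slot adapter can actually drive."""
    if has_switch:
        # A PCIe packet switch (like the PLX bridge on the Aplicata card)
        # fans out to every M.2 slot regardless of upstream link width;
        # the drives simply share the host link's bandwidth.
        return card_slots
    # A passive adapter relies on the host bifurcating its lanes into
    # x4 groups, mapped 1:1 to the M.2 slots.
    return min(card_slots, host_lanes // 4)

print(usable_m2_slots(8, has_switch=False))  # passive card in an x8 slot
print(usable_m2_slots(8, has_switch=True))   # switched card in an x8 slot
```

In an x8 slot, a passive card strands two of its four M.2 connectors, while the switched card keeps all four usable at the cost of sharing the x8 uplink.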

A majority of users will not see a performance benefit from a product like this. Most of us would just be happy to own a single NVMe SSD, much less four. The Aplicata Quad M.2 NVMe SSD PCIe x8 Adapter costs $449 at the time of writing, so it's expensive and overbuilt for home or even some workstation users.

Most of the BOM cost stems from the PLX chip. Aplicata also sells a less expensive version without the PLX chip, but we expect less widespread compatibility. We have a similar bridgeless HighPoint 4x M.2 to PCIe x16 adapter, and it has compatibility issues in some of our older systems. We hope to test the newer Aplicata design with full x16 bandwidth soon.

Right now, all these products are limited by the platform. A bootable array is far more interesting than a secondary storage device, so we'll take another look at the Aplicata x8 adapter when Intel releases dongle keys for vROC.

MORE: Best SSDs

MORE: How We Test HDDs And SSDs

MORE: All SSD Content

29 comments
  • dudmont
    Ram, when the dongle shows, will you be doing a test with this and similar products, with 32gb Optanes?
  • daglesj
    I hope the 3 or so of you that can actually exploit this performance have fun using it.
  • dudmont
    daglesj said:
    I hope the 3 or so of you that can actually exploit this performance have fun using it.


    While I wholeheartedly agree with you, there's a certain kid in a candy store kind of thing about articles like this.
  • takeshi7
    I would have loved to see this with 4 of the Intel Optane 32GB drives installed. That would be the fastest 128GB SSD ever.
  • AnimeMania
    I was too stupid to understand anything the article said, but not too stupid to have questions. Are you allowed to mix and match the four M.2 SSDs with different brands and capacities? Do the four M.2 SSDs appear as 4 different drives (with different drive letters) or does that depend on if they are RAIDed?
  • PancakePuppy
    AnimeMania said:
    I was too stupid to understand anything the article said, but not too stupid to have questions. Are you allowed to mix and match the four M.2 SSDs with different brands and capacities? Do the four M.2 SSDs appear as 4 different drives (with different drive letters) or does that depend on if they are RAIDed?


    Functionally, the card is just a carrier for the PCIe packet switch, associated support components, and M.2 connectors. It should be completely unaware of NVMe, so you could plug in 4 of the same SSDs, or 4 completely different ones, or 2 SSDs and 2 M.2 to PCIe edge connector adapters, all fair game.
  • DerekA_C
    Curious as to why this isn't added to the backside of EATX or ATX boards, or even mATX boards, with some kind of heatsink plate, particularly on the X299 and X399 boards that support enough PCIe lanes.
  • bit_user
    Running a RAID-0 of 4 drives mostly makes sense if you're using it for caching or scratch space. I wouldn't use this to hold the primary copy of any data I really care about.

    Now, if they included a RAID-5 controller that could keep up with these drives, that would be very interesting.
  • bit_user
    DerekA_C said:
    Curious as to why this isn't added to the backside of EATX or ATX boards, or even mATX boards, with some kind of heatsink plate, particularly on the X299 and X399 boards that support enough PCIe lanes.

    Hmmm...

    Need we go on?

    IMO, this is the best option: easily accessible, likely to have good airflow, and can be paired with many different motherboards. You could even install multiple, if you're doing something particularly crazy. Like trying to host big files over 100 Gbps Ethernet.

    BTW, if a motherboard did add something like this, then it would make more sense to place the M.2 boards perpendicular to the motherboard and add a bracket to hold the other ends. This could take the place of one of the expansion card slots, so you'd have some airflow moving across them.
  • alan.campbell99
    I'm interested in trying this but it seems it won't ship to New Zealand, bugger.
  • photonboy
    GAMES LOADING much faster than what an SSD can already do is highly unlikely, considering an SSD that is 4x faster than another SSD shows little difference in loading times.

    Games still have CPU and GPU tasks that take up a big bulk of that loading time which a faster drive will not help with.
  • deenie1219
    Alan, there are middleman shippers that could work for you, depending on how badly you want the item(s).
  • alan.campbell99
    Thanks for the tip, I'll give it some consideration. I have US relatives I was thinking about asking.
  • AgentLozen
    Say I buy one of these things and stick 4 NVMe SSDs in it. Will it still take 15 minutes for Windows to update when I try and shut it down?
  • TheOtherOne
    Most of that time is spent uploading stuff back to MS for advertising purposes ....

    So yeah, Windows will still take time to spy on you.
  • AnarchoPrimitiv
    Actually, the FIRST one of these cards brought to market is the Highpoint Technologies SSD7101A-1, which has ALREADY been on Newegg for a while. Come on, Tom, get your stuff together.
  • dudmont
    AnarchoPrimitiv said:
    Actually, the FIRST one of these cards brought to market is the Highpoint Technologies SSD7101A-1, which has ALREADY been on Newegg for a while. Come on, Tom, get your stuff together.


    http://www.tomshardware.com/news/highpoint-ssd7100-ssd7101-vroc-ssd,35004.html
    ;)
    If I was standing next to you, I'd offer you a rag to wipe the egg off your face.....
  • MRFS
    > BTW, if a motherboard did add something like this, then it would make more sense to place the M.2 boards perpendicular to the motherboard and add a bracket to hold the other ends.

    See the ASUS DIMM.2 slot.
  • James Mason
    Damn, if this thing was like $50-70, it'd be a real nice way to get more M.2 slots in a desktop PC. But at that price point, it's really only going to work for workstation-class systems where you use them to make money.
  • bit_user
    James Mason said:
    Damn, if this thing was like $50-70, it'd be a real nice way to get more M.2 slots in a desktop PC.

    There are plenty of products like what you're describing.

    https://www.newegg.com/Product/Product.aspx?Item=9SIAA6W4827476
  • MRFS
    > There are plenty of products like what you're describing.

    Yes, indeed: the market is flourishing with NVMe solutions.

    Our focus here has been to promote RAID controllers with x16 edge connectors and 4 x M.2 ports. The main reason for this preference is the MAX upstream bandwidth that is imposed by Intel's DMI 3.0 link (exact same bandwidth as a single NVMe M.2 port).

    There is an engineering elegance that obtains from 4 x NVMe M.2 SSDs @ x4 PCIe 3.0 lanes = x16 PCIe 3.0 lanes.

    What many prosumers would prefer is a bootable RAID controller that supports all modern RAID modes with 4 x NVMe M.2 SSDs.

    One user at another forum has reported success getting the driver software for the Highpoint SSD7110 to work with the Highpoint SSD7101. And Highpoint has said they are working on making the SSD7101 "officially" bootable. I'm assuming, without proof, that Highpoint is also working on making their SSD7120 bootable.
  • jn77
    I am a little concerned about this type of product in respect to the fact that I primarily use laptops these days that are maxed out i7's with 32 or more gb of ram.

    I have been considering building a new desktop (the last one I built was a socket 775 with a Q6600 in it).

    If I don't really "need" a desktop right now and I started from scratch, I would love to put one of these in it, but I also need to consider that PCIe 3.x is on its way out for PCIe 4, and PCIe 5 is maybe 4 years out, so:

    Do I build a PCIe 3.x based system now with DDR-4, massive i9's or thread ripper processors and put one of these in, or do I wait for the speed advantage of PCIe 4 or 5? Considering the laptops are working just fine for now.
  • MRFS
    > "We're testing the older Quad M.2 NVMe SSD PCIe x8 Adapter. Aplicata released the x16 card a few days ago for systems that support bifurcation, but the Quad x8 has broader compatibility."

    The x8 edge connector explains why the sequential READ speeds are hovering around 6,000 MB/second.

    With an x16 edge connector, sequential READ speeds are exceeding 10,000 MB/second with similar add-in cards.