So you have heard all the talk...  Infiniband is super fast but super expensive.  But what if that were only half true?

 

Read on to find out how, with a bit of luck and time, you can get DDR 20Gb/s Infiniband for under US$200.

 

DDR you what now?

First it would be good to get a grasp of a few basic Infiniband terms.

 

For the purpose of this post, there are three main speed grades in Infiniband (there are others, but they are way out of budget):

  • SDR - Single Data Rate = 10Gb/s
  • DDR - Double Data Rate = 20Gb/s
  • QDR - Quad Data Rate = 40Gb/s

These are the signalling rates of a 4X (four lane) link.  SDR, DDR and QDR all use 8b/10b encoding, so the usable data rate is 80% of the headline figure: 8, 16 and 32Gb/s respectively.  Worth remembering for later.

 

Infiniband is like an electrical circuit (known as a fabric).  It provides connectivity between two or more points, but you also need a protocol to pass information around that circuit in a format both producer and consumer understand.

 

Two of the most common are:

  • IPoIB - IP over Infiniband.  Allows IP traffic (and so most things that run on a normal network) to flow over the Infiniband fabric (see the sketch after this list).
  • SRP - SCSI RDMA Protocol.  Allows the transmission of SCSI commands over the Infiniband fabric at up to full link speed.
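As a taster of what IPoIB looks like in practice, here is a minimal sketch of bringing up an IPoIB interface on a Linux host with the OFED stack installed.  The interface name ib0 and the 192.168.10.0/24 subnet are assumptions for illustration; your distribution may differ:

    # Load the IPoIB kernel module (OFED usually loads this automatically)
    modprobe ib_ipoib

    # The fabric-facing interface typically appears as ib0;
    # give it an address and bring it up
    ip addr add 192.168.10.1/24 dev ib0
    ip link set ib0 up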

 

Plugs, ports, cables and calamity.

The basic concept for wiring is the same as for a standard network.

 

You need adapters in the communicating machines (HCAs - Host Channel Adapters) and you may need a switch in between if you are connecting more than two computers.  The cabling is, however, much thicker than GbE network cabling and can be much more expensive.

 

One other major difference is that an Infiniband network needs a Subnet Manager.  The subnet manager maintains the routing tables that allow the packets of data to get to the right destinations.  Luckily there is one available from OpenFabrics (OpenSM) which is packaged to work on either Windows or Linux.  Some, but not all, switches have a hardware subnet manager built in.  If you do not have one of these switches then you will need to run a subnet manager on one of the computers on the Infiniband network.  Only one subnet manager is needed per Infiniband network.
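For example, on a Linux machine this can be as simple as the following (the package name opensm matches Debian/Ubuntu; other distributions and the Windows package will differ):

    # Install the OpenFabrics subnet manager (Debian/Ubuntu package name)
    apt-get install opensm

    # Run it as a daemon; with no options it manages the first active port
    opensm -B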

 

So:

Computer (HCA) -> cable -> (HCA) Computer - With one of the two computers running a Subnet Manager.

or

Computer 1..n (HCA) -> Infiniband Switch -> (HCA) Computer 1..n -  With one of the computers or the switch running a Subnet Manager.
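Once everything is cabled up and a subnet manager is running, the diagnostic tools that ship with OFED will show whether the fabric has actually come up.  A quick sketch (exact output varies by card and driver version):

    # Show the state of the local HCA ports - you are looking for State: Active
    ibstat

    # List the hosts visible on the fabric
    ibhosts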

 

There are also a variety of connectors.  Three of the more common varieties are:

CX2 - used with SDR cables, they have a squarish plug with 2 lanes wired.

CX4 - used with DDR cables, they are the same shape as CX2 cables but with 4 lanes wired.

QSFP - A long narrow connector much like the SAS SFF-8088 connector.  Used mainly for QDR.

 

There are also a variety of adapters and converters out there, but for this post the above is a good place to start.

 

Wow, but where to get all this from?

Well, if like me you are not super rich and are doing this as a home project, then eBay can be your friend.  If you contact them directly, some sellers will also allow international sales and shipping.

 

I would advise looking for deals on ConnectX cards as they are a newer generation and seem to avoid some of the pitfalls of the older InfiniHost cards.  Note that ConnectX cards with a model number ending in EN are Ethernet-only cards.
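Once a card arrives, you can check exactly what it is and what firmware it carries.  A sketch using the verbs utilities and the Mellanox firmware tools (the /dev/mst device name below is an example; mst status will show yours):

    # Show device, firmware version and port details
    ibv_devinfo

    # Query the card via the Mellanox firmware tools
    mst start
    flint -d /dev/mst/mt25418_pci_cr0 query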

 

The following are reasonable contenders within budget:

  • MHGH28-XTC Dual 4X 20Gb/s InfiniBand PCIe 2.0 2.5GT/s 11.0W 13.6cm x 6.4cm (try to get A2 or newer revisions)
  • MHGH29-XTC Dual 4X 20Gb/s InfiniBand PCIe 2.0 5.0GT/s 11.6W 13.6cm x 6.4cm

 

Currently the MHGH28-XTC cards are going for US$75 each on eBay.

 

Cables are silly expensive compared to standard Ethernet cables, starting at around US$60 for 0.5m.  On top of this, a lot of sellers on eBay do not list the version (SDR / DDR), and as the CX2 and CX4 connectors look the same there is no easy way to tell which one you are getting.  Luckily some sellers pop up now and then listing the cables correctly, and one such seller is currently listing 8m DDR CX4 cables for US$15.

 

Switches are expensive but generally cheaper than 10GbE versions.  If you are lucky and wait around a bit then you may be able to pick up something like the Flextronics F-X430046 24-Port 4X DDR Infiniband Switch for under US$500.  Check the model before you buy though, as the Flextronics F-X430060 is only an SDR switch.

 

So using current eBay pricing...

A basic peer to peer Infiniband DDR setup would cost:

2x MHGH28-XTC HCA = 2x US$75 = US$150.

1x CX4 to CX4 DDR cable = US$15 (8m).

Total: US$165 + shipping.

 

Expand with a switch

Flextronics F-X430046: US$500

2x MHGH28-XTC HCA = 2x US$75 = US$150.

2x CX4 to CX4 DDR cable = 2x US$15 = US$30 (8m).

DDR Infiniband through a switch for: US$680 + shipping.

Connectivity for each extra computer (one HCA plus one cable) could be as low as US$90 + shipping.

 

So plug them in and away I go?

If only it were that simple.

 

Whether it is that simple really depends on the firmware on the card and on whether there is a driver available for the operating system you wish to use.

 

With the MHGH28-XTC I found the following:

ESXi 5.1: The Mellanox-supplied vib needs to be installed in an SSH session to the ESXi server (see the commands below).  The card is then recognised.

Windows Server 2012: A firmware update is needed; check out the thread here for details.
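For the ESXi case, the vib install is roughly as follows - the datastore path and bundle filename are placeholders, so substitute the Mellanox package that matches your ESXi build:

    # From an SSH session on the ESXi host (path and filename are examples)
    esxcli software vib install -d /vmfs/volumes/datastore1/MLNX-OFED-ESX-bundle.zip

    # Reboot so the driver loads
    reboot

The firmware update can be done with the Mellanox firmware tools (available for both Windows and Linux), along these lines - the firmware image name is a placeholder for whichever image the thread points you at:

    # Burn a new firmware image onto the card
    mst start
    flint -d /dev/mst/mt25418_pci_cr0 -i fw-25408.bin burn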

 

But why so slowwww?

One thing commonly overlooked when setting up infrastructure like this is that whilst the connection between two computers may be huge, the computers at each end have to be fast enough to fill it.  Most consumer hard drives top out at around 130MB/s, and SSDs at around 500MB/s.  Copying to or from one of these is not going to get you the roughly 2GB/s of usable data (16Gb/s after 8b/10b encoding) that a DDR link can carry.  It is like trying to fill up a storm drain with a teacup.  Both ends need to be able to support that speed, usually with large hard drive or SSD arrays.  There are some benchmarks that run from RAM which can get up to this level though (sketched below).
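The perftest package that ships with OFED includes such RAM-to-RAM benchmarks, which take storage out of the equation entirely.  A minimal sketch, reusing the example IPoIB address from earlier:

    # On the first machine, start the benchmark server
    ib_send_bw

    # On the second machine, point the client at the first
    ib_send_bw 192.168.10.1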

 

So, if your transfers seem slow, check that the storage at both ends is fast enough to fill the connection's bandwidth.

 

I shall add and amend as time goes by, but in the meantime I would suggest a look at David Hunt's post "Infiniband on the cheap", which seems to be the starting point for many people's Infiniband-at-home dreams.