Adventures in 10Gb Networking – Part 1
June 5, 2020 | by bgarmon
I blame Apple for my long, frustrating, and expensive path to having a 10Gb home network. They just had to include a 10GbE NIC in the Mac Mini (2018) model that I’m writing this from. And I paid extra for the NIC, so I might as well use it. Right!? Right. My road to 10Gb is paved with a number of self-inflicted wounds. I share my mistakes here in the hopes that you can avoid them.
Understand the basics
I’m a home theater guy. Getting to 10Gb is similar to moving your home theater from Blu-ray to 4K UHD. You have to throw out your current Blu-ray player and buy a new one. You have to buy new discs in the 4K UHD format. And yes, you likely have to throw out your TV, and possibly even your home theater receiver, and buy a new TV that supports 4K UHD. You even (most likely) have to throw out your current HDMI cables and buy ones that support the higher bandwidth. It’s a lot of effort and a lot of money. The benefit, though, is a stunning visual delight. At least until you see 8K, and then you’re back on the upgrade train. I digress. The point is that moving from 1Gb to 10Gb networking is exactly the same thing, only it’s all your computer gear you have to toss out (or modify).
In home theater design, the path has two forks in the road: use a TV or use a projector. Each path takes you down a different rabbit hole in terms of technical capabilities and cost. Back on the networking side, your pivot point is the type of cabling you are going to use: copper or fiber. The team over at Cablify has a simple explanation of the main differences between fiber and copper here if you want to learn a little more. What this tends to boil down to is that you will likely end up with a mixture of both. If your home network has enterprise-grade servers and networking gear, fiber is your path forward for those devices, using technologies like SFP+. On the desktop/laptop side, fiber still doesn’t make much sense financially, so sticking with copper is good enough.
Pick the gear
Starting with what I’ll describe as a top-down approach to equipment, my cable modem is the first link in the networking chain. I knew off the bat that 10Gb was going to be a LAN-only affair for me, due primarily to cost. My business internet service from Comcast tops out at 76Mbps down and 15Mbps up and already costs me an arm and a leg each month. I could bump that up to 1Gb if I hit the lottery, which wasn’t happening, so my use cases for 10Gb were going to be limited to the LAN. The existing cable modem could stay.
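To put some rough numbers behind that LAN-only decision, here’s a back-of-envelope sketch of how long a large file transfer takes at each link speed. The 50GB file size is just a hypothetical example, and real-world throughput will be lower due to protocol overhead, disk speed, and so on:

```python
# Rough transfer-time math showing why 10Gb only pays off on the LAN.
# Numbers are ideal-case and illustrative only.

def transfer_seconds(size_gb: float, link_mbps: float) -> float:
    """Time to move size_gb gigabytes over a link_mbps link (no overhead)."""
    size_megabits = size_gb * 8 * 1000  # 1 GB = 8000 megabits (decimal units)
    return size_megabits / link_mbps

file_gb = 50  # hypothetical backup image
for label, mbps in [("76Mb WAN", 76), ("1Gb LAN", 1000), ("10Gb LAN", 10000)]:
    print(f"{label}: {transfer_seconds(file_gb, mbps) / 60:.1f} minutes")
```

Roughly an hour and a half over the WAN versus under a minute over a 10Gb LAN, which is why spending on the LAN side made sense while the internet uplink stayed put.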
Moving down the networking stack, the pfSense appliance from Netgate that I use as my primary edge router/firewall was good to go as well. Now to look at my network switches.
I’d been using a pair of 24-port HP ProCurve 10/100/1000 fanless switches that I bought 13 years ago. It was going to be sad to part with these: they had survived three house moves since the initial purchase and had gone this long without a single issue. But the path to 10Gb requires a blood sacrifice, and this was going to be the first of many. Running the numbers on the ports I would require, I wasn’t going to need 24 10Gb connections, so I decided to keep one of them in the mix and replace the other with a smaller device. Cost was the motivating factor in my hardware choices, so I opted for the 12-port Netgear XS512EM, which offers ten 10Gb Ethernet ports and two SFP+ uplinks. It also supports what I thought would be a useful feature: 802.3bz-2016, better known as 2.5G/5GBASE-T.
It’s a mouthful, but 802.3bz-2016 (2.5G/5GBASE-T, often marketed as NBASE-T or multi-gig) replaces the 10Mb/100Mb/1000Mb ports we typically find on networking equipment with copper ports that can run at 2.5Gb, 5Gb, or 10Gb (and they still support 1Gb).
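The practical upside is the same autonegotiation behavior you already know: each end advertises the rates it supports and the link trains at the fastest rate both ends share. Here’s a toy sketch of that selection logic; this is not real driver code, just an illustration of why a multi-gig switch port still talks happily to a plain gigabit NIC:

```python
# Toy model of link-speed autonegotiation: the link comes up at the
# highest rate advertised by BOTH ends, or not at all if they share none.

def negotiated_speed_mbps(nic_rates: set, port_rates: set) -> int:
    """Return the fastest rate (in Mb/s) both ends advertise, or 0 for no link."""
    common = nic_rates & port_rates
    return max(common) if common else 0

NBASE_T_PORT = {100, 1000, 2500, 5000, 10000}  # multi-gig copper switch port
GIGABIT_NIC = {10, 100, 1000}                  # typical onboard 1Gb NIC
USB_5G_NIC = {1000, 2500, 5000}                # e.g. a 5GbE USB adapter

print(negotiated_speed_mbps(GIGABIT_NIC, NBASE_T_PORT))  # 1000
print(negotiated_speed_mbps(USB_5G_NIC, NBASE_T_PORT))   # 5000
```

That fallback behavior is what makes multi-gig gear a safe incremental purchase: anything slower you plug in simply links at its own best speed.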
With a core switch picked out and ordered (thanks, Amazon), it was time to move on to the actual devices in use, starting with my server gear.
The Network Interface Card (NIC) is the only thing that needs to be replaced on the computing devices themselves. Depending on the device, this can be simple or it can be a PITA. On the simple side, the Dell PowerEdge T640 server I run came with dual 10GbE adapters on the motherboard, so no changes were necessary there. I also run several 10th-gen Intel NUCs, and those come with only a single 1Gb NIC, which meant I would need an external NIC option. Several vendors offer bus-powered USB 3.1 or USB-C NICs. These run at 1Gb/2.5Gb/5Gb, sacrificing a little speed for a smaller form factor. I was already using dual 1Gb USB-C adapters with my NUCs, so moving them up to 5GbE with the Sonnet Solo5G NICs seemed like a worthy upgrade. It’s not 10Gb, but it’s a speed bump.
On the storage side, I run QNAP NAS devices, which ship with a single 5GbE NIC and multiple 1GbE NICs. The devices do have PCIe expansion slots, though, so I opted for a 10GbE upgrade as follows:
- For QNAP 1, I went with the QNAP QXG-10G1T single-port expansion card.
- For QNAP 2, I picked the QNAP QM2-2S10G1TA, as I needed its M.2 drive slots for another project.
I run a number of Alienware gaming desktops, and for those I went with the PCIe x4 ASUS XG-C100C 10Gb card to replace the NICs they shipped with.
As my primary desire for 10Gb was limited to my main office and the server rack in it, I was initially going to be fine without 10Gb in the other rooms of the house. But the more I thought about it, the more I decided to go big or go home. The problem: I had run Cat7 Ethernet to the living room and to my son’s bedroom, but I had failed to run more than a single cable to each room, so link aggregation wasn’t an option. For a variety of reasons I won’t go into here, running more (or different) physical cabling wasn’t an option either, so I would have to add a 10Gb switch in each room. The Netgear 10-port GS110EMX Smart Managed Plus switch did the trick, offering one 10GbE port for the uplink and a second 10Gb port for a single device in each room. That was fine, since the gaming machines were the only devices there that I wanted to give the speed bump.
In summary: with three new switches, six new NICs, and a bunch of new Cat7 cabling, it was time to string it all together and live happily ever after, right? Well, it went that way for about a month. Then the real fun began, which I’ll cover in Part 2.