A funny thing happened in network cabling sometime in the last, oh heck, I can’t recall exactly when. We had a marketing genius take over for the fine folks at the Telecommunications Industry Association (TIA). Apparently these fine, highly qualified and educated folks (some rumored to be actual electrical engineers) were not making new standards fast enough for our intrepid marketer! In case you detect an edge in my voice, you would be correct. What, you may ask, has gotten my glasses all foggy?
To be honest I am pretty tired of hearing, “The other guy’s cable says 350 MHz; why doesn’t yours?”
Sometimes, in my more sarcastic moments, I find myself thinking: “To heck with it, let’s put fuzzy bunnies on the darn boxes and cables and call it good.” Yes, I am being extremely sarcastic. Extra sauce today, if you will. However, there is merit to this rant. A Category 5e cable labelled 350 MHz is worth no more than one labelled 100 MHz. And so long as our fuzzy bunny cable is tested through 100 MHz, it might be worth more than both, because it costs more to print fuzzy bunnies on boxes and cables.
Hopefully by now I have a little bit of your attention. Above I mention the TIA, those funny people who decide what exactly a category cable is. These fine folks came together, probably argued and fought, and eventually settled on a standard and published it. The standard is TIA/EIA-568, with the most current revision being C. In truth the standards define a lot of physical properties. Some are of the overtly physical variety, like dimensions and materials; others are electrical properties. In there, one finds specifications about these megahertz thingies for the different categories:
- Cat5e – 100 MHz
- Cat6 – 250 MHz
- Cat6a – 500 MHz
When the folks at the TIA release such standards, they define acceptable parameters across a frequency range that begins near zero and extends through the numbers above for a given category of cable. They very clearly list acceptable loss values, in decibels, for the entire frequency range. Companies like Fluke use these standards to design and program their network testers. Other companies like Intel, HP or Broadcom use these standards to design their network hardware.
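In case the certification logic sounds abstract, here is a minimal sketch of what a tester is effectively doing: comparing measured loss samples against a published limit curve, and only inside the defined frequency range. The shape of the limit function mimics the kind of curve the standard publishes, but the coefficients below are illustrative placeholders, not the actual TIA/EIA-568 figures.

```python
import math

def insertion_loss_limit_db(freq_mhz: float) -> float:
    """Maximum allowed insertion loss at a given frequency.

    The a*sqrt(f) + b*f + c/sqrt(f) shape mirrors the style of curve
    the standard defines; these coefficients are ILLUSTRATIVE ONLY,
    not the real TIA/EIA-568 values.
    """
    return 1.967 * math.sqrt(freq_mhz) + 0.023 * freq_mhz + 0.050 / math.sqrt(freq_mhz)

def certify(measurements, max_freq_mhz=100.0):
    """Pass/fail a cable run against a category's defined range.

    measurements: iterable of (frequency_MHz, measured_loss_dB) pairs.
    Samples beyond max_freq_mhz are ignored -- the standard defines no
    limit up there, so there is literally nothing to test against.
    """
    for freq, loss_db in measurements:
        if freq > max_freq_mhz:
            continue  # outside the category's defined range
        if loss_db > insertion_loss_limit_db(freq):
            return False  # exceeds the allowed loss at this frequency
    return True
```

Notice what the sketch makes obvious: a sample at 350 MHz never even gets compared to anything for a 100 MHz category, which is exactly why a "350 MHz Cat5e" claim is untestable.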
So, the standards that we are supposed to follow in order to be allowed to call a cable Category 5e, 6 or 6a are the same standards to which the hardware makers adhere. With that in mind, if you were Fluke, would you spend money making sure a network tester tested outside of industry standards? If all the hardware vendors build to these specifications, how exactly does performing outside of them benefit you? For that matter, how on Earth does one certify a cable as good to an imaginary standard that no real tester even entertains? Cat5e’s electrical properties are defined up to 100 MHz. End of discussion. To see a higher frequency defined, one must change categories to Cat6, at which point why not just call it a Cat6 cable? Cat6 cables are worth more money. The same can be said for Cat6 vs Cat6a.
If you were in business, why would you sell an entire product category for significantly less than it was worth?