A long time ago, the highest advertised Wi-Fi data rates were 2Mbps or 11Mbps (back in the days of 802.11b).
Then came 802.11g and 802.11a, and we started seeing 54Mbps.
However, with the advent of 802.11n, this changed. Instead of a single number, all of a sudden you would see products advertising anywhere from 144Mbps to 300Mbps to 450Mbps...
With 802.11ac this changed again to 433Mbps, 867Mbps and so on, and now with 802.11ac Wave2 we once again see lots of numbers. So what changed? What is behind these varied numbers for one particular standard?
All these data rate numbers are a function of three things, whose different combinations produce different maximum data rates:
- The Modulation (MCS Rate) that is supported.
- The Number of Spatial Streams that the Wi-Fi device supports.
- The channel width that the device can use.
To demystify the 300Mbps number we see with a lot of 802.11n radios: it comes from 2 spatial streams, at the highest 802.11n modulation (64-QAM with 5/6 coding), on a 40MHz channel.
So a 2x2 802.11n radio has a maximum data rate of 300Mbps. How are some 802.11n devices claiming 450Mbps? It's because they support 3 spatial streams.
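The arithmetic behind these numbers is the standard OFDM rate formula: data subcarriers × bits per subcarrier × coding rate × spatial streams, divided by the symbol duration (3.6µs with the short guard interval). Here is a minimal sketch of that calculation; the subcarrier counts and symbol time come from the 802.11n/ac specs, while the function and variable names are just illustrative:

```python
# Sketch of how 802.11n/ac maximum PHY data rates are derived.
# Constants are from the 802.11 OFDM specs; the helper itself is illustrative.

SYMBOL_TIME_US = 3.6  # OFDM symbol duration with short guard interval (microseconds)

# Data subcarriers per channel width in 802.11n/ac
DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}

def phy_rate_mbps(streams, bits_per_subcarrier, coding_rate, width_mhz):
    """Maximum data rate in Mbps for a given stream count, modulation and width."""
    bits_per_symbol = (DATA_SUBCARRIERS[width_mhz]
                       * bits_per_subcarrier * coding_rate * streams)
    return bits_per_symbol / SYMBOL_TIME_US

# 2x2 802.11n, 64-QAM (6 bits/subcarrier) with 5/6 coding, 40MHz channel:
print(round(phy_rate_mbps(2, 6, 5/6, 40)))  # → 300

# Same modulation and width with 3 spatial streams:
print(round(phy_rate_mbps(3, 6, 5/6, 40)))  # → 450
```

The same formula at 20MHz (52 data subcarriers) gives the 144Mbps figure mentioned earlier.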
Let's come to 802.11ac: we see typical 2x2 devices claim 867Mbps. What is behind that number (and how do we get there from the 300Mbps of 2x2 802.11n)? Two enhancements of 802.11ac over 802.11n boost that number:
- 802.11ac supports 256-QAM data rates (while 802.11n did just 64-QAM), so that takes us from 300Mbps to 400Mbps.
- 802.11ac supports 80MHz wide channels (while 802.11n did just 40MHz), so that takes us from the 400Mbps to the 867Mbps number.
This continues with 802.11ac Wave2, where a 4x4 radio, by virtue of having more spatial streams, can support 1.7Gbps data rates at 80MHz (where a 2x2 radio could only do 867Mbps).
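The two 802.11ac steps and the Wave2 case can be checked with the same per-symbol arithmetic (data subcarriers × bits per subcarrier × coding rate × streams ÷ 3.6µs symbol time). A self-contained sketch, with constants from the 802.11 specs and an illustrative helper name:

```python
# Check the 802.11ac progression from the 2x2 802.11n baseline.
# Constants are from the 802.11 OFDM specs; the helper is illustrative.

SYMBOL_TIME_US = 3.6              # short-guard-interval symbol duration (µs)
SUBCARRIERS = {40: 108, 80: 234}  # data subcarriers per channel width

def rate(streams, bits_per_subcarrier, coding_rate, width_mhz):
    return (SUBCARRIERS[width_mhz] * bits_per_subcarrier
            * coding_rate * streams) / SYMBOL_TIME_US

print(round(rate(2, 6, 5/6, 40)))  # 2x2 802.11n, 64-QAM, 40MHz    → 300
print(round(rate(2, 8, 5/6, 40)))  # step 1: 256-QAM (8 bits)      → 400
print(round(rate(2, 8, 5/6, 80)))  # step 2: 80MHz channel         → 867
print(round(rate(4, 8, 5/6, 80)))  # Wave2 4x4 at 80MHz            → 1733
```

Note that 867 and 1733 are roundings of 866.7 and 1733.3, which is why marketing sometimes shows 866Mbps or 1.7Gbps for the same configurations.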
A good table of data rates is in this Keith Parsons tweet: