Most access points available on the market today are dual radio. On paper, this is great! One radio keeps 2.4GHz clients happy, one radio keeps 5.0GHz clients happy, we’re all happy, and we just need to throw an AP up every 40 – 60 feet. Pretty simple, right?
Fortunately for my job security, not quite.
First, let’s get one unfortunate truth out of the way. While 2.4GHz may not be “dead,” it’s certainly limping a little. The poor ISM band has woefully few channels to work with and has to contend with many different sources of outside interference. In isolated areas without competing systems it can be used with decent results, but I can almost guarantee you that if you pull up your trusty SSID mapping tool that you purchased after reading my previous blog post and look at 2.4GHz in a crowded environment, you’re going to see an ugly mess.
With the exception of isolated warehouses, most designs I put together today are built with 5GHz in mind. It’s more reliable and less prone to CCI. But each AP that is placed also contains a 2.4GHz radio. Considering that the two radios strapped into the same hardware behave very differently, what should we do with that 2.4GHz radio?
To illustrate the problem, here’s a snapshot of the predicted coverage of a standard 5GHz radio:
And here’s the predicted propagation from the 2.4GHz radio on that same AP:
Do you see the problem? More green is not always good. If we install the APs and leave both radios at the same transmit power, we’re going to have significant co-channel interference in our 2.4GHz network, since the 2.4GHz signal travels considerably farther than the 5.0GHz signal. To make matters worse, there are only three non-overlapping channels in the 2.4GHz band available for us to play with, so CCI is more or less a given. (Don’t be fooled by the numbers 1 – 13 at the bottom: they represent 5MHz jumps, while our channel widths are 20MHz minimum. An AP on channel 3 will interfere with APs on channels 1, 2, 3, 4, 5, and 6. Best practice in the US is to place APs on channels 1, 6, and 11.)
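The channel math above can be sketched in a few lines. (This is a simplification that treats each channel as a hard 20MHz rectangle; real 802.11 spectral masks taper off more gradually, but the overlap pattern comes out the same.)

```python
# Why channels 1, 6, and 11 are the only non-overlapping set in 2.4GHz.
# Channel centers sit 5MHz apart starting at 2412MHz (channel 1); a
# 20MHz-wide transmission occupies +/-10MHz around its center.

def center_mhz(channel):
    """Center frequency of a 2.4GHz channel (channels 1-13)."""
    return 2412 + 5 * (channel - 1)

def overlaps(ch_a, ch_b, width_mhz=20):
    """True if the two channels' occupied bandwidths overlap."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

# An AP on channel 3 steps on every channel within 15MHz of its center:
print([ch for ch in range(1, 14) if overlaps(3, ch)])  # [1, 2, 3, 4, 5, 6]

# Channels 1, 6, and 11 are 25MHz apart, so none of them collide:
print(overlaps(1, 6), overlaps(6, 11))  # False False
```

Run the list comprehension with any other channel and you’ll see why a four-channel reuse plan in 2.4GHz never works out.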
For a more detailed deep-dive on why having only three non-overlapping channels creates design issues with CCI, here’s a great blog by Keith Parsons.
Compare this to the 5GHz band, which has a lot more room to play in and is usually much less crowded. These two scans were taken from the same location. Much cleaner, isn’t it? (Even with 80MHz channels in use! Each of those larger channels takes up four separate 20MHz “lanes” in the RF.)
While CCI in the 2.4GHz band is not the “end of the world” if present, it can greatly reduce throughput and drive up retransmissions — unless we dampen throughput even further with RTS/CTS mechanisms. Also, with both radios set to the same transmit power, our clients will generally perceive the 2.4GHz signal at a higher RSSI than the 5.0GHz signal, which can negatively influence where they choose to associate.
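One way to see why 2.4GHz reads “louder” at equal transmit power: free-space path loss grows with frequency. Here’s a rough sketch (free space only — real walls and furniture attenuate each band differently, and the channel center frequencies below are just my example picks):

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB: 20log10(d_m) + 20log10(f_MHz) - 27.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

for d in (10, 30):
    loss_24 = fspl_db(d, 2437)  # 2.4GHz channel 6
    loss_50 = fspl_db(d, 5180)  # 5GHz channel 36
    print(f"{d}m: 2.4GHz loses {loss_24:.1f}dB, 5GHz loses {loss_50:.1f}dB "
          f"(delta {loss_50 - loss_24:.1f}dB)")
```

The delta works out to roughly 6.5dB at any distance, so with equal transmit power a client always hears the 2.4GHz radio noticeably hotter — before building attenuation widens the gap further.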
So what do we do with that pesky 2.4GHz radio that’s included on most of our APs?
There are two common ways around the problem. First, you can turn the 2.4GHz radio’s transmit power down so the cell doesn’t sprawl as far past the 5.0GHz coverage boundary. For example, if you set the 5.0GHz radio to 25mW, set the 2.4GHz radio to ~7mW.
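To put that example in dB terms, here’s a quick sketch (the function is mine; the 25mW/7mW values are from the example above):

```python
import math

def mw_to_dbm(milliwatts):
    """Convert transmit power in mW to dBm."""
    return 10 * math.log10(milliwatts)

print(round(mw_to_dbm(25), 1))  # 14.0 dBm
print(round(mw_to_dbm(7), 1))   # 8.5 dBm
# That's roughly a 5.5dB cut. In free space, every ~6dB of reduction
# halves the cell radius, pulling the 2.4GHz edge back toward 5GHz's.
```

Dropping from 25mW to ~7mW is close to one halving of the free-space cell radius, which is why it does a decent job of reining in that sprawling 2.4GHz footprint.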
Second, you can disable some 2.4GHz radios, converting them into sensors that scan the RF environment for threats and anomalies.
How do you determine which method to use on your network?
Let’s run through a few questions:
What type of clients do you need to support?
Clients, ever the wild card in 802.11, can greatly impact your wireless design. If you have clients that transmit with a powerful signal, you will be at greater risk for CCI in the 2.4GHz band, as the client transmissions will potentially travel farther than the AP transmissions. There’s not much point in cranking down the 2.4GHz radio when the client is still blasting happily away at full strength. Thankfully, most “BYOD blend” tablets and smartphones today talk fairly quietly (for example, iPads and iPhones transmit around 10mW on average). Properly determining client capabilities will be discussed further in an upcoming blog post.
What type of applications do you need to support? Are users mobile?
If you need to support voice or real-time video applications for users who walk through your building, you will want to minimize the number of times each user roams. While lowering 2.4GHz transmit power allows for more coverage cells in the same area, it also results in more roaming events.
What’s your RF environment?
In my experience, the more crowded the area, the less reliable the 2.4GHz band. If you’re in a crowded area, chances are that you’re really going to want to maximize the usage of 5GHz.
Do you support sensitive data transmissions across your WLAN?
Dedicated sensors greatly improve the responsiveness of WIDS/WIPS. Most manufacturers offer some form of “time slicing” for their access points, where the AP spends most of its time servicing clients but occasionally scans for threats. So dedicated sensors aren’t a hard requirement – but they can respond to threats much faster and scan channels more efficiently, as they don’t have to park on a set channel and service clients for the majority of the time.
Bonus Question – Can’t I just use ARM and let the system sort itself out?
ARM (Adaptive Radio Management) can be used as a band-aid fix, but it’s much better to design your RF up front than to rely on ARM to sort things out. Conventional ARM can often make bad decisions. For example, consider an open office floor plan: the ceiling area is tall and wide open, with lots of modular walls and funky office furniture down below. APs mounted on the ceiling in this scenario will hear each other plainly… but clients down on the ground will see a fair amount of attenuation. If you let ARM make decisions for you, you run the risk of the APs hearing each other too strongly and turning their power down, hurting the experience for your users.
So, now that we have some more details, here’s my take on the Pros and Cons of each design choice.
Shrinking 2.4GHz Coverage:
Pros: The smaller the cell size, the smaller the BSS, and the fewer clients that have to share the airwaves… IF your clients also transmit at a low power level, anyway. The configuration is easier as well, since you can set a general transmit strength instead of turning individual radios on and off.
Cons: If your workforce is constantly on the move, greatly lowering the transmit power can mean lots of roaming events for 2.4GHz clients. You probably don’t want to be maximizing the usage of 2.4GHz anyway. You also don’t get the visibility that dedicated WIDS/WIPS sensors provide.
Turning Off 2.4GHz Radios:
Pros: Turning off radios and converting them into sensors can greatly improve your WIDS/WIPS responsiveness and give you better visibility into the health of your RF. Larger 2.4GHz coverage areas can also mean fewer roams for your clients.
Cons: This takes a bit more planning to properly spec out and introduces some complexity. It should be well documented by the IT team in case changes are made to the environment down the road.
Personally, I use a mixture of the two in my designs. 2.4GHz transmit power is lowered slightly, but not to the point where the coverage mirrors the 5.0GHz coverage. I then use Ekahau to determine which radios can be turned off without impacting clients. This gives me more control over the RF — for example, if a hospital cart needs to roam up and down the halls while maintaining its connection (and healthcare gear can be notoriously stubborn), it’s nice to have one long stretch of coverage for that device. The additional visibility from the dedicated AMs/SMs is useful as well.
How do you approach 2.4GHz design in your environment? Let me know in the comments!