Wi-Fi survey volatility has many factors
Keith Parsons at WLANPros posted this article on Wi-Fi Client Device Sensitivity, i.e., how well the 802.11 Wi-Fi radio on a typical device like an iPhone can pick up a Wi-Fi signal. Keith asserts that various client devices have very volatile sensitivities and will see a Wi-Fi Access Point fluctuate 10 to 25 dB in signal strength. I believe Keith is capturing 10 SSID beacons per second. He posts this graph to illustrate his point.
Measuring the RSSI of a fast UDP stream
I posted my rebuttal on Twitter using a different capture method: I used tcpdump to capture a PCAP trace and graphed the RSSI levels. You can see from the histogram that -40 and -41 dBm account for nearly 94% of the RSSI readings. I captured 1000 UDP packets per second using one Wi-Fi device to blast UDP broadcasts and another Linux computer with an Atheros AR9271 802.11n USB adapter and a standard 6-inch omnidirectional whip antenna. This Atheros adapter supports promiscuous mode, monitor mode, and packet injection in Linux with the default drivers. The capture computer was roughly 10 feet from the transmit device.
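Once the per-packet RSSI readings are extracted from the PCAP, tallying how often each level appears is a few lines of Python. This is a minimal sketch using a small hypothetical sample (the real capture held thousands of packets); only the tallying logic is the point.

```python
from collections import Counter

# Hypothetical per-packet RSSI readings (dBm) as the radiotap header would
# report them; a real run would load one value per captured frame.
rssi_samples = [-40, -40, -40, -40, -40, -41, -41, -41, -42, -38]

counts = Counter(rssi_samples)
total = len(rssi_samples)

# Print each level's share of the capture, strongest level first.
for level in sorted(counts, reverse=True):
    pct = 100 * counts[level] / total
    print(f"{level} dBm: {counts[level]} packets ({pct:.0f}%)")

# Combined share of the two dominant levels (-40 and -41 dBm).
dominant_pct = 100 * (counts[-40] + counts[-41]) / total
```

On the actual capture, the same tally is what shows -40 and -41 dBm dominating the distribution.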
Now had I simply computed the maximum range variation of the results, I would have graphed an 8 dB variation, which is in line with Keith’s data. But from a wireless survey perspective, which is what WLANPros care about, the relevant range is -40 to -42 dBm; everything else is mostly irrelevant because those readings are so rare. -41 dBm would also correlate to the fastest modulation (MCS) Wi-Fi transmission speeds.
Measuring the RSSI of SSID beacons
Since Keith also had other data suggesting a much higher degree of variation in the RSSI when measuring SSID levels, I went ahead and captured around 5000 SSID beacons from my home 2.4 GHz Wi-Fi AP. I used the same Atheros radio on my desktop computer running Ubuntu Linux with a 10-foot line-of-sight to the Access Point. The SSID beacons are sent at the default 1 Mbps at roughly 102-millisecond intervals (the 802.11 default beacon interval is 100 time units, or 102.4 ms).
The purpose of a Wi-Fi survey is to predict the speed (MCS) that users can realistically expect on the network. We know that the lower RSSI values dominate the Wi-Fi MCS because a mere 1% packet error rate will cause Wi-Fi to slow down, but we also know that Wi-Fi has error correction built in, which reduces our losses. So a reasonable number to use is the 10th percentile RSSI level, which is -66 dBm in our data. 10th percentile means that 90% of the RSSI measurements are higher than -66 dBm. This gives a confident prediction of the MCS speed customers can expect.
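The 10th percentile statistic is easy to reproduce. The sketch below computes the median, the 10th percentile, and the resulting fade margin from a hypothetical distribution shaped like the beacon data; a real analysis would feed in the ~5000 captured values instead.

```python
import statistics

# Hypothetical beacon RSSI samples (dBm), 100 values shaped roughly like
# the captured distribution; real analysis would use the full capture.
rssi = [-63] * 5 + [-64] * 60 + [-65] * 20 + [-66] * 10 + [-67] * 5

median = statistics.median(rssi)

# 10th percentile: 90% of readings are at or above this level.
p10 = sorted(rssi)[int(0.10 * len(rssi))]

# A difference between two dBm levels is expressed in dB.
fade_margin_db = median - p10
```

With this data the median comes out at -64 dBm, the 10th percentile at -66 dBm, and the fade margin at 2 dB, matching the figures discussed above.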
So even though our median signal level is -64 dBm, we have only a 2 dB fade margin under the conditions I measured, which are close to ideal because I have a 10-foot line-of-sight. Note that the signal is a low -64 dBm at short range because I am using low transmit power.
So this whole concern over this perceived client-side volatility is almost meaningless. Even though it looks like we have an 8 dB swing, the 10th percentile RSSI is only 2 dB below the median RSSI level.
Blocking the signal path will lower RSSI
Real-world conditions can result in even higher losses when people start blocking the path of the signal. In the graph above I cut off the last 30 samples because of an anomalous dip when I walked in front of the antenna. My body, and maybe even my skeletal bones, can cause temporary drops of 20 dB.
What I typically find is a 10 dB drop when I position my body in the path of the Wi-Fi line-of-sight. A 20 dB drop can occasionally happen when certain large bones align just right. Declaring a 10 dB fade margin is probably the reasonable standard practice.
In the process of capturing this data, tcpdump also captured 15 other Wi-Fi APs in my neighborhood. Here’s one of the distant APs across the street, going through multiple walls, moving people, and moving cars. I used Wireshark to filter on the transmitter address and exported to CSV so that I could analyze it in Excel.
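If you would rather skip Excel, the exported CSV can be analyzed directly in Python. This sketch assumes a hypothetical export with the signal strength in a column named "Signal"; the actual column name depends on how the Wireshark export was configured.

```python
import csv
import io
import statistics

# Hypothetical snippet of a Wireshark CSV export; the real file has one row
# per beacon from the filtered transmitter address.
csv_text = """No.,Time,Signal
1,0.000,-84
2,0.102,-83
3,0.205,-93
4,0.307,-84
5,0.410,-85
"""

# Read the RSSI column into a list of integers (dBm).
rows = csv.DictReader(io.StringIO(csv_text))
rssi = [int(row["Signal"]) for row in rows]

median = statistics.median(rssi)
p10 = sorted(rssi)[int(0.10 * len(rssi))]  # 10th percentile level
```

For a real file, replace the `io.StringIO` wrapper with `open("export.csv")`.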
This is actually not due to volatility in client-side sensitivity. This is environmental volatility due to people and cars wandering in and out of the radio path. The median RSSI is -84 dBm but the 10th percentile RSSI level is -93 dBm, so the fade margin is a substantially larger 9 dB due to the environment.
Obviously this AP would be extremely unreliable if I tried to use it, but that’s to be expected given the extreme distance. Even -84 dBm is below the official acceptable threshold of -82 dBm, though most good 802.11 Wi-Fi adapters have much better sensitivity than the official thresholds.
Note: Some client devices are more resistant to this volatility if they implement diversity antennas. It’s kind of like how a catamaran is steadier in choppy water than a single-hull boat.
The tools you use can be flawed
Strangely, the same computer and Atheros adapter running NirSoft WiFiInfoView in Windows 10 bounces back and forth with a huge 18 dB swing. It’s the identical hardware setup with the same 10-foot clear line-of-sight.
This looks to be a bug in WiFiInfoView for this particular hardware, since I have 5 other computers running WiFiInfoView that show extremely consistent SSID RSSI levels within a 3 dB range. It’s also clearly contradicted by the raw PCAP capture above, so it goes to show that certain tools on some hardware may be flawed in how they analyze and present data.