(OR, WHAT’S THE DEAL WITH THROUGHPUT TESTS?)
In the previous article, I elaborated on what wireless site surveys are, what they are for, and what they can show us. In this post, I'd like to go a bit deeper into Active Surveys: what they are and what they are used for.
As I stated in the previous article, an Active Survey implies that you are conducting the survey while the adapter(s) you are using are CONNECTED to the WLAN (Wireless LAN). A wireless device can only connect, or "associate", to one access point (AP) at a time; it is not possible for a wireless NIC to connect to more than one AP at once.
This is inherently limiting. If you are connected to only one AP, then all you see is data coming to and from that AP. And not only that, but you can only see YOUR OWN data. You cannot see another device's data. So, if the purpose of your "survey" is to determine why the service manager's tablet can't connect to the WLAN, you'll be out of luck. You can't see the frames between that device and the AP that would help you troubleshoot the issue, because all you can see are your own frames. So, why would I want to perform an Active Survey, and what can I do with it?
Throughput tests.
A throughput test allows you to determine the maximum data throughput on a connection. This seems like a useful test. Who wouldn't want to know if they are talking fast, or slow, on their Wi-Fi? But Wi-Fi (802.11) is a peculiar beast. Unlike wired networks, there is no guarantee of speed. A device connected to a WLAN at a 1.3Gbps data rate will never, ever get 1.3Gbps of throughput. At BEST, in a clean test environment, you MIGHT get 70-75% of the data rate in real throughput. You may even peak at around 80%. But, due to the inherent overhead in the 802.11 protocol, we start at a deficit. The real throughput numbers will be significantly less than the connected data rate. In addition, 802.11 is half-duplex: only one device can talk at a time on a channel, and every additional device passing traffic on that channel eats away at your throughput, since whenever another device is transmitting, yours cannot. We call this CONTENTION. As in, each device contends for a slice of time on the channel to transmit data.
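As a rough back-of-the-napkin illustration, here is what those percentages work out to for that 1.3Gbps example. This is just the approximate 70-80% figures above applied as arithmetic, not real 802.11 protocol math:

```python
# Back-of-the-napkin estimate: connected data rate times the rough
# efficiency figures quoted above. Not real 802.11 protocol math.

phy_rate_mbps = 1300  # example: a 1.3Gbps connected data rate

for label, efficiency in [("typical best case", 0.70),
                          ("very good", 0.75),
                          ("peak, clean channel", 0.80)]:
    print(f"{label}: ~{phy_rate_mbps * efficiency:.0f} mbps real throughput")

# typical best case: ~910 mbps real throughput
# very good: ~975 mbps real throughput
# peak, clean channel: ~1040 mbps real throughput
```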
For these reasons, I never do throughput tests while I am performing a survey, because the tests only tell me what the device I was surveying with, at that specific time and location, was able to achieve on that one specific AP it was connected to. It does NOT tell me what the end users' devices will be able to achieve.
Performing throughput tests without context is a waste of time that yields results of negligible value. When performing throughput tests, here are some things to consider:
802.11 Overhead - No matter what it says on the box (2300mbps!), it's a lie. Or, at the very least, it's marketing pablum. Real throughput will always be much less than your connection rate, and it gets lower as the network comes under load, because each device shares the AIRTIME available on the channel and can only transmit when no other device is (a crude airtime-sharing sketch follows this list).
Device capabilities - Is your device 1-stream, 2-stream, or 3-stream? 802.11a/b/g/n/ac/ax? A newer, more capable device will out-perform an older one. A laptop has larger antennas than a mobile phone, so it hears the network better, and thus will potentially be able to achieve better results.
Network Load - Testing after-hours, when there is no load, will yield better results than in the middle of the day, when the office/warehouse/school is full.
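To make the airtime point concrete, here is a deliberately naive sketch of how contention eats into per-device throughput. The 400mbps starting figure is purely hypothetical, and real WLANs degrade faster than an even split because of collisions and backoff, so treat this as an optimistic ceiling:

```python
# Naive airtime-sharing model: only one device can transmit on the channel
# at a time, so N devices pushing traffic roughly split the airtime.
# Real WLANs do worse as N grows (collisions, backoff), so this is an
# optimistic ceiling, not a prediction.

best_case_mbps = 400  # hypothetical single-client throughput on a clean channel

for n_devices in (1, 2, 5, 10, 25):
    per_device = best_case_mbps / n_devices
    print(f"{n_devices:>2} active devices -> at best ~{per_device:.0f} mbps each")
```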
So, do they have any value? Is there a use-case for throughput tests? Sure, but you need to understand what it is exactly that you are looking for in the data you collect. Let's go through one use-case for throughput tests.
We have a customer that is converting a production and office facility to primarily wireless. One of the main issues is that they don't use laptops. With few exceptions, all of their computers are desktops. This means we have to install wireless adapters on them. Because of the cost, time, and labor it takes to install wireless PCI cards, we've decided to use USB wireless adapters. First, they are quick to install, especially if you get ones whose drivers Windows can auto-install. Second, with a simple USB extension cable you can mount the adapter in an optimal location with better line-of-sight to the WLAN. PCI cards usually have the antennas connected directly to the card, behind the computer. Not a very good place for antennas.
I wanted to make sure that whatever adapter we settled on would work well in their environment. So, we purchased ten different 802.11ac USB adapters, and I first tested each adapter to see how well it heard the WLAN signal at various distances and through materials like walls, bookshelves, cubicles, etc. I was looking for an adapter with good "ears". I then ran throughput tests on each of them to see how they performed.
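If you want a feel for how a "listen test" like this might be scripted, here is a minimal sketch that logs signal strength once a second while you carry the adapter around. It assumes a Linux host with the `iw` utility installed, and the `wlan0` interface name is my placeholder, not a given:

```python
# Minimal sketch: log the associated adapter's signal strength once a
# second while walking it around. Assumes Linux with the `iw` utility;
# the interface name below is a placeholder.
import re
import subprocess
import time

INTERFACE = "wlan0"  # hypothetical interface name, adjust for your adapter

def current_signal_dbm(interface):
    """Parse the 'signal: -XX dBm' line out of `iw dev <if> link`."""
    out = subprocess.run(["iw", "dev", interface, "link"],
                         capture_output=True, text=True).stdout
    match = re.search(r"signal:\s*(-?\d+)\s*dBm", out)
    return int(match.group(1)) if match else None

while True:
    dbm = current_signal_dbm(INTERFACE)
    print(f"{time.strftime('%H:%M:%S')}  signal: {dbm} dBm")
    time.sleep(1)
```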
I performed the throughput tests using a tool called iPerf. I used a channel that was not being used by other devices, so each adapter could get the best performance possible. I ran 12 tests per adapter, throwing out the highest and lowest results to get an average throughput. I also changed the orientation from horizontal to vertical, to make sure I knew which was optimal when installed. This means I ran a total of 24 tests per adapter - 12 in each orientation.
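For the curious, here is roughly what that harness could look like in script form: a sketch that runs iperf3 (the modern iPerf variant; adjust if you use iperf2) a fixed number of times, drops the single highest and lowest runs, and averages the rest. It assumes iperf3 is installed on the client, a server is already listening (`iperf3 -s`) on the wired side, and the server address below is a placeholder:

```python
# Sketch of the test harness described above: run iperf3 repeatedly,
# throw out the highest and lowest runs, average the rest. Assumes an
# iperf3 server is listening at the (hypothetical) address below.
import json
import subprocess

SERVER = "192.168.1.10"  # hypothetical iperf3 server on the wired LAN
RUNS = 12                # per orientation, as in the test above

def run_once(server):
    """Run one 10-second iperf3 test and return Mbps received."""
    out = subprocess.run(["iperf3", "-c", server, "-t", "10", "-J"],
                         capture_output=True, text=True, check=True).stdout
    bps = json.loads(out)["end"]["sum_received"]["bits_per_second"]
    return bps / 1_000_000

results = sorted(run_once(SERVER) for _ in range(RUNS))
trimmed = results[1:-1]  # drop the single highest and lowest runs
print(f"all runs (Mbps): {[round(r) for r in results]}")
print(f"trimmed average: {sum(trimmed) / len(trimmed):.0f} Mbps")
```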
Throughput itself doesn't tell the whole story, but it's a start. Out of the ten adapters, five were consistently above 200mbps of throughput, and four were consistently under 200mbps. So, I knew the top performers would be among the ones I would choose.
While a throughput test is not an all-inclusive indicator of "this is the best adapter", it can show which ones are the more consistent high-performers. Also, the data from the bottom three was revealing. Unlike the top-performing NICs, where the throughput rates were fairly consistent, the low-performing NICs were inconsistent in their throughput numbers. Sometimes they would get high results, other times they would get poor results. Sometimes they would stop passing data altogether! This inconsistency leads me to believe that these adapters, or their drivers, do not perform well under load, and it revealed adapters I'd probably want to avoid. I would rather have an adapter that performs CONSISTENTLY and RELIABLY than one that does not. TRUST is more important to me than raw speed.
Looking into the matter some more, using a wireless packet capture, I was able to see that the low-performers dropped frames at a much higher rate than the high-performers. All adapters were tested one at a time, at the same location, on the same clean channel that was not being used by other devices. So, while a throughput test, or a "drag race" as some like to call it, may not be overly helpful in determining the health of the WLAN, it can reveal weaknesses, or driver issues, in wireless devices. IF you understand what you are looking for and how to interpret it.
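If you want to try this check yourself, here is a sketch of how a retry rate can be pulled out of a monitor-mode capture with tshark. A frame with the retry bit set means the original transmission went unacknowledged, so a high retry percentage is a decent proxy for dropped frames. It assumes tshark is installed and you already have a capture file; the filename is a placeholder:

```python
# Sketch: count 802.11 data frames and retries in a monitor-mode capture
# using tshark display filters. The capture filename is hypothetical.
import subprocess

CAPTURE = "adapter_test.pcap"  # hypothetical capture taken during a throughput run

def frame_count(capture, display_filter=""):
    """Count frames in a capture matching a tshark display filter."""
    cmd = ["tshark", "-r", capture]
    if display_filter:
        cmd += ["-Y", display_filter]
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    return len(out.splitlines())

total = frame_count(CAPTURE, "wlan.fc.type == 2")                      # data frames
retries = frame_count(CAPTURE, "wlan.fc.type == 2 && wlan.fc.retry == 1")
print(f"data frames: {total}, retries: {retries} "
      f"({100 * retries / max(total, 1):.1f}% retry rate)")
```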
Pinging.
Another type of Active Survey could simply be the use of ping/ICMP over the WLAN. Unfortunately, ping is also not a good tool for determining the health of a WLAN. First, the overhead and shortcomings of the 802.11 protocol mean you will rarely get consistent pings. As the load increases on the network, your pings may become more erratic. What you are seeing is not necessarily bad Wi-Fi. It's contention. Every ping you send must contend for airtime with every other device on the WLAN you are connected to. Only one device can talk at a time. Ping may be useful for determining if you have connectivity to a particular server, or IP address, but it does not reveal much about the health of the WLAN. As with throughput tests, it could perhaps indicate SOMETHING is wrong, but not what, where, or why.
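If you do reach for ping, at least look at the spread of the results rather than a single number. Here is a sketch that sends a burst of pings and reports the variation; a wide, erratic spread under load usually points to contention rather than a broken WLAN. It assumes a Unix-style ping, and the gateway address is a placeholder:

```python
# Sketch: send a burst of pings and report the spread, not just the
# average. Assumes a Unix-style `ping -c`; the target is hypothetical.
import re
import statistics
import subprocess

TARGET = "192.168.1.1"  # hypothetical default gateway

out = subprocess.run(["ping", "-c", "30", TARGET],
                     capture_output=True, text=True).stdout
times = [float(m) for m in re.findall(r"time=([\d.]+)", out)]

print(f"replies: {len(times)}/30")
print(f"min/avg/max: {min(times):.1f}/{statistics.mean(times):.1f}/{max(times):.1f} ms")
print(f"std dev (jitter proxy): {statistics.stdev(times):.1f} ms")
```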
Device testing/requirement validation.
Another type of Active Survey I do perform often is what I call "device testing". Let's say the customer's requirement for the WLAN survey was "Voice Grade WiFi". Well, I would absolutely perform a Passive Survey to get ALL the data about the WLAN: neighboring WLANs, possible misconfigurations, Primary Coverage, Secondary Coverage (for roaming), etc. I would then review my collected data and analyze it to see if the WLAN does, or does not, meet the requirements for Voice Grade WiFi. If the answer is yes, I follow that up with an Active Survey.
For this "survey" however, I will not use my survey tools. I will use the customer's devices. If voice is the requirement, while the network is under load, I may give a phone to one person and call them with another. We'll then walk in different directions. If roaming is a requirement, I might walk down hallways and turn corners. I'll go in & out of rooms and walk through areas where previously the customer has intimated that calls dropped. If I can do successfully what the customers "needs" to do - keep uninterrupted calls while walking - then I can say that I have confirmed we have met the customer's requirement.
The important thing here is that I used the CUSTOMER'S devices to validate that the requirement is fully met. Testing with my device, one that may have greater capabilities than the end user's, may not truly reveal deficits in the WLAN design. If my device works, but the customer's does not, I have not fulfilled my goal of designing the WLAN per the customer's requirements.
Conclusion.
So, should you perform Active Surveys? Sure, but first UNDERSTAND what it is exactly that you want to test. Take the time to understand what the data is showing you. What device did you use, and what are its capabilities? When did you perform the test - under load, or on an empty network? Will the end users' devices see similar results? Do you want the FASTEST device, or the most consistent, reliable device? Knowing why you are doing it, and understanding what the data is telling you, is important.