
Why Active Surveys?

(OR, WHAT’S THE DEAL WITH THROUGHPUT TESTS?)

In the previous article, I elaborated on what wireless site surveys are, what they are for, and what they can show us. In this post, I'd like to go a bit deeper into Active Surveys: what they are and what they are used for.

As I stated in the previous article, an Active Survey implies that you are conducting a survey while the adapter(s) you are using are CONNECTED to the WLAN (Wireless LAN). A wireless device can only connect, or "associate", to one access point (AP) at a time. It is not possible for a wireless NIC to connect to more than one AP at a time.

This is inherently limiting. If you are connected to only one AP, then all you see is data coming to and from that AP. And not only that, but you can only see YOUR OWN data. You cannot see another device's data. So, if the purpose of your "survey" is to determine why the service manager's tablet can't connect to the WLAN, you'll be out of luck. You can't see the frames between that device and the AP that would help you troubleshoot the issue, because all you can see are your own frames. So, why would I want to perform an Active Survey, and what can I do with it?

Throughput tests.

A throughput test allows you to determine the maximum data throughput on a connection. This seems like a useful test. Who wouldn't want to know if they are talking fast, or slow, on their Wi-Fi? But Wi-Fi (802.11) is a peculiar beast. Unlike wired networks, there is no guarantee of speed. A device connected to a WLAN at a 1.3Gbps data rate will never, ever get 1.3Gbps of throughput. At BEST, in a clean test environment, you MIGHT get 70-75% of the data rate in real throughput. You may even peak at around 80%. But, due to the inherent overhead in the 802.11 protocol, we start at a deficit. The real throughput numbers will be significantly less than the connected data rate. In addition, 802.11 is half-duplex: only one device can talk at a time on a channel, and every additional device passing traffic on that channel eats away at your throughput, since whenever another device is transmitting, yours cannot. We call this CONTENTION. As in, each device contends for a slice of time on the channel to transmit data.
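
To put rough numbers on that deficit, here's a minimal sketch of the math. The efficiency figures are the same ballpark percentages mentioned above, not guarantees:

```python
# Rough estimate of real-world throughput from the connected PHY data rate.
# The efficiency factors are ballpark figures (protocol overhead, half-duplex
# operation, contention); actual results depend entirely on your environment.

def estimated_throughput_mbps(data_rate_mbps: float, efficiency: float) -> float:
    """Return a rough throughput estimate for a given 802.11 data rate."""
    return data_rate_mbps * efficiency

# A device connected at 1300 Mbps (1.3 Gbps) in a clean test environment:
for eff in (0.50, 0.65, 0.75):
    print(f"{eff:.0%} efficiency -> ~{estimated_throughput_mbps(1300, eff):.0f} Mbps")
```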

For these reasons, I never do throughput tests while I am performing a survey, because the tests only tell me what the device I was surveying with, at that specific time and location, was able to achieve on that one specific AP it was connected to. It does NOT tell me what the end users' devices will be able to achieve.

Performing throughput tests without context is a waste of time that yields negligible results. When performing throughput tests, here are some things to consider:

  • 802.11 Overhead - No matter what it says on the box (2300Mbps!), it's a lie. Or, at the very least, it's marketing pablum. Real throughput will always be much less than your connection rate, and it gets lower as the network comes under load, as each device shares the AIRTIME available on the channel and can only transmit when no other device is transmitting.

  • Device capabilities - Is your device 1-stream, 2-stream, or 3-stream? 802.11a/b/g/n/ac/ax? Etc. A newer, more capable device will outperform an older one. A laptop has larger antennas than a mobile phone, so it hears the network better and thus will potentially be able to achieve better results.

  • Network Load - Testing after-hours, when there is no load, will yield better results than in the middle of the day, when the office/warehouse/school is full.

So, do they have any value? Is there a use case for throughput tests? Sure, but you need to understand what it is exactly that you are looking for in the data that you collect. Let's go through one use case for throughput tests.

We have a customer that is converting a production and office facility to primarily wireless. One of the main issues is they don't use laptops. With few exceptions, all of their computers are desktops. This means we have to install wireless adapters on them. Because of the cost, time, and labor it takes to install wireless PCI cards, we've decided to use USB wireless adapters. First, they are quick to install, especially if you get ones for which Windows can auto-install the drivers. Second, with a simple USB extension cable you can mount the adapter in an optimal location with better line-of-sight to the WLAN. PCI cards usually have the antennas connected directly to the card, behind the computer. Not a very good place for antennas.

I wanted to make sure that whatever adapter we settled on would work well in their environment. So, we purchased ten different 802.11ac USB adapters, and I first tested each adapter to see how well it heard the WLAN signal strength at various distances, through materials like walls, bookshelves, cubicles, etc. I was looking for adapters with good "ears". I then ran throughput tests on each of them to see how they performed.

I performed the throughput tests using a tool called iPerf. I used a channel that was not being used by other devices, so each adapter could get the best performance possible. I ran 12 tests on each adapter, throwing out the highest and lowest to get an average throughput per adapter. I also changed the orientation from horizontal to vertical to make sure I knew what was optimal when installed. This means I ran a total of 24 tests per adapter - 12 in each orientation.
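
If you want to replicate this kind of test, here's a minimal sketch of the procedure. It assumes iperf3 with JSON output and an iperf3 server already running on the wired side; the server address, run count, and duration are placeholders, not necessarily what I used:

```python
# Minimal sketch: run repeated iperf3 TCP tests against a wired server and
# report a trimmed average (highest and lowest runs thrown out).
# Assumes iperf3 is installed and a server is already running: iperf3 -s
import json
import statistics
import subprocess

SERVER = "192.168.1.10"   # placeholder: your wired iperf3 server
RUNS = 12                 # runs per adapter/orientation
DURATION = 10             # seconds per run

results_mbps = []
for i in range(RUNS):
    proc = subprocess.run(
        ["iperf3", "-c", SERVER, "-t", str(DURATION), "-J"],
        capture_output=True, text=True, check=True,
    )
    report = json.loads(proc.stdout)
    bps = report["end"]["sum_received"]["bits_per_second"]
    results_mbps.append(bps / 1_000_000)
    print(f"run {i + 1}: {results_mbps[-1]:.1f} Mbps")

# Throw out the single highest and lowest run, then average the rest.
trimmed = sorted(results_mbps)[1:-1]
print(f"trimmed average: {statistics.mean(trimmed):.1f} Mbps")
```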

Throughput itself doesn't tell the whole story, but it's a start. Out of the ten adapters, five were consistently above 200Mbps of throughput. Four were consistently under 200Mbps. So, I knew the top four would be among the ones I would choose.

While the throughput test is not an all-inclusive indicator of "this is the best adapter", it can show which ones are the more consistent high performers. Also, the data from the bottom three was revealing. Unlike the top-performing NICs, where the throughput rates were fairly consistent, the low-performing NICs were inconsistent in their throughput numbers. Sometimes they would get high results; other times they would get poor results. Sometimes they would stop passing data altogether! This inconsistency leads me to believe that these adapters, or their drivers, do not perform well under load. And their inconsistency revealed adapters I'd probably want to avoid. I would rather have an adapter that performs CONSISTENTLY and RELIABLY than one that does not. TRUST is more important to me than raw speed.

Looking into the matter some more, using a wireless packet capture, I was able to see that the low performers dropped frames at a much higher rate than the high performers. All adapters were tested one at a time, at the same location, on the same clean channel that was not being used by other devices. So, while a throughput test, or a "drag race" as some like to call it, may not be overly helpful in determining the health of the WLAN, it can reveal weaknesses, or driver issues, in wireless devices - IF you understand what you are looking for and how to interpret it.
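
If you want to hunt for the same symptom in your own captures, the 802.11 Retry bit is a decent proxy: a transmitter with a high retry rate is losing frames and re-sending them. Here's a minimal sketch using Scapy against a monitor-mode capture file (the filename is a placeholder, and I'm not claiming this is the exact analysis I ran):

```python
# Minimal sketch: estimate per-transmitter retry rates from a monitor-mode pcap.
# A frame with the Retry bit set is a retransmission, which is a reasonable
# proxy for frames that were lost the first time around.
from collections import defaultdict
from scapy.all import rdpcap, Dot11  # pip install scapy

frames = rdpcap("adapter_test.pcap")  # placeholder capture file

totals = defaultdict(int)
retries = defaultdict(int)

for pkt in frames:
    if not pkt.haslayer(Dot11):
        continue
    dot11 = pkt[Dot11]
    tx = dot11.addr2  # transmitter address (absent on some control frames)
    if tx is None:
        continue
    totals[tx] += 1
    if dot11.FCfield & 0x08:  # Retry bit in the Frame Control field
        retries[tx] += 1

for tx, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{tx}: {total} frames, {retries[tx] / total * 100:.1f}% retries")
```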

Pinging.

Another type of Active Survey could simply be the use of ping/ICMP over the WLAN. Unfortunately, ping is also not a good tool for determining the health of a WLAN. First, the overhead and shortcomings of the 802.11 protocol mean you will rarely get consistent pings. As the load increases on the network, your ping may become more erratic. What you are seeing is not necessarily bad Wi-Fi. It's contention. Every ping you send has to contend with the other devices on the WLAN you are connected to. Only one device can talk at a time. Ping may be useful for determining if you have connectivity to a particular server, or IP address, but it does not reveal much about the health of the WLAN. As with throughput tests, it could perhaps indicate SOMETHING is wrong, but not what, where, or why.
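
If you do reach for ping, at least look at the variation, not just the average. Here's a minimal sketch that runs the system ping and summarizes the spread of the round-trip times; it assumes Linux/macOS-style ping output, and the target address is a placeholder:

```python
# Minimal sketch: run the system ping and summarize RTT variation.
# Assumes Linux/macOS ping output containing "time=XX.X ms" per reply.
import re
import statistics
import subprocess

TARGET = "192.168.1.1"  # placeholder: gateway or a wired server
COUNT = 50

proc = subprocess.run(
    ["ping", "-c", str(COUNT), TARGET],
    capture_output=True, text=True,
)
rtts = [float(m) for m in re.findall(r"time=([\d.]+)", proc.stdout)]

if len(rtts) >= 2:
    print(f"replies: {len(rtts)}/{COUNT}")
    print(f"min/avg/max: {min(rtts):.1f}/{statistics.mean(rtts):.1f}/{max(rtts):.1f} ms")
    print(f"std dev (a rough jitter proxy): {statistics.stdev(rtts):.1f} ms")
```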

Device testing/requirement validation.

Another type of Active Survey I do perform often is what I call "device testing". Let's say the customer's requirement for the WLAN survey was "Voice Grade WiFi". Well, I would absolutely perform a Passive Survey to get ALL the data about the WLAN, neighboring WLANs, possible misconfigurations, Primary Coverage, Secondary Coverage (for roaming), etc. I would then review my collected data and analyze it to see if the WLAN does, or does not, meet the requirements for Voice Grade WiFi. If the answer is yes, then I follow that up with an Active Survey.

For this "survey", however, I will not use my survey tools. I will use the customer's devices. If voice is the requirement, while the network is under load, I may give a phone to one person and call them with another. We'll then walk in different directions. If roaming is a requirement, I might walk down hallways and turn corners. I'll go in and out of rooms and walk through areas where the customer has previously intimated that calls dropped. If I can successfully do what the customer "needs" to do - keep uninterrupted calls while walking - then I can say that I have confirmed we have met the customer's requirement.

The important thing here is that I used the CUSTOMER'S device to validate that the requirement is fully met. Testing with my device, one that may have greater capabilities than the end user's, may not truly reveal deficits in the WLAN design. If my device works, but the customer's does not, I have not fulfilled my goal of designing the WLAN per the customer's requirements.

Conclusion.

So, should you perform Active Surveys? Sure, but first UNDERSTAND what it is exactly you want to test. Take the time to understand what the data is showing you. What device did you use, and what are its capabilities? When did you perform the test - under load, or on an empty network? Will the end users' devices see similar results? Do you want the FASTEST device, or the most consistent, reliable device? Knowing why you are doing it, and understanding what the data is telling you, is important.

Wireless Site Surveys Explained

(This blog post was swiped from my company website.) 😎

What is a wireless site survey? Seems like a pretty straightforward question until you hear someone ask for a "predictive" survey. How does that work? How do you "predict" a survey? The truth is, there is no such thing as a "predictive" site survey. We can make a PLAN, or a Predictive DESIGN/MODEL. And better yet, we can collect data before we start to better inform our predictive model.

Webster’s Dictionary defines the word survey as:

survey (verb)

sur·​vey | \ sər-ˈvā , ˈsər-ˌvā \
surveyed; surveying

transitive verb

1a : to examine as to condition, situation, or value : APPRAISE
1b : to query (someone) in order to collect data for the analysis of some aspect of a group or area
2 : to determine and delineate the form, extent, and position of (such as a tract of land) by taking linear and angular measurements and by applying the principles of geometry and trigonometry
3 : to view or consider comprehensively
4 : INSPECT, SCRUTINIZE
intransitive verb
: to make a survey

So, to survey is to examine, query, inspect, and scrutinize. What is the data we collect? It depends on what it is you want to analyze. What's the percentage of people that are OK with clubbing baby seals? Will you be voting for expanding rights to indignant penguins? Or, for us, can the Wireless LAN (WLAN) provide what the end users need?

REQUIREMENTS GATHERING

First, we start by determining requirements. I consider this "surveying". It doesn't necessarily involve walking around with your survey gear, measuring the Wi-Fi. It's conversation. It's taking notes and pictures of the wall types, ceiling heights, and any other oddities that can impact your potential design. It's asking questions: "What types of devices are the most critical?" "What applications does your organization rely on?" "How many devices will be connected at peak, in the morning, on the 2nd shift?" "What are the areas of highest user density?" "What will they be doing on those devices?" Real-time services/applications such as voice, video conferencing, etc. have different requirements than, say, web browsing, e-mail, and accessing a database. Are you looking to perform large data transfers? That's different than needing to open a file from a shared network folder, or printing.

These questions may seem inconsequential unless you understand the limitations of 802.11. Not every organization has the same needs. And those needs can differ based on location and can change over time. A cafeteria may be a "high-density" area, but it will have a vastly different design requirement than, say, a large auditorium/lecture/training facility. Wireless Voice-over-IP has different requirements than a straight data-only design. Do you need seamless roaming, where you can stay on an audio or video call without dropping as you walk from place to place? That is a different design from one where roaming is not required. A supervisor may have a different idea of wireless use cases than the employees on the floor, who use the wireless day-to-day with specific devices. So, I start ALL my surveys with a requirements-gathering, data-collection meeting.

I do this whether I am preparing for a WLAN design, a Validation Survey, or a Troubleshooting Survey. How do I know if things are "good" or "bad" if I don't know what good or bad is for this particular deployment? This is a critical step that many fail to do, and therefore they fail with their WLAN deployment, because it cannot support whatever it is the customer needs.

A survey, is a survey, is a survey. Or, is it?

WALL ATTENUATION MEASUREMENT

Once I've determined the customer requirements I can get started on the "wireless" data collection piece. Often, I will perform what I call Wall Attenuation Measurements. This typically involves placing an access point in a room, and taking measurements on both sides of a wall/obstruction to determine just how much "attenuation", or signal loss, is seen in 2.4 and/or 5GHz. This is important if the purpose of the site visit is to gather pre-design data, to help make our Predictive Model more accurate.

In this scenario, I am collecting requirements and RF data to get as much detail as possible BEFORE I begin the design, so I have greater confidence that my prediction, my mathematical model, will be as accurate as I can get it. I would much rather KNOW that the walls in the offices have 5dB of attenuation than simply choose the default "Dry Wall" in my planning software and hope for the best. What if there was a wall that looked like drywall, but in actuality is brick covered with drywall for aesthetic reasons? If you didn't ask, or measure, you would have no idea, and that could cause your design to fail. You can view a detailed explanation of how to perform a Wall Attenuation Measurement Survey here.
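
The math behind the measurement is simple: the difference between what you measure with clear line-of-sight to the AP and what you measure behind the wall, at the same distance, is the wall's attenuation in dB. A minimal sketch, with made-up readings:

```python
# Minimal sketch: estimate wall attenuation from paired RSSI readings.
# Readings are illustrative values in dBm, taken at the same distance from the
# AP, once with clear line-of-sight and once through the wall being measured.
import statistics

line_of_sight_dbm = [-48, -47, -49, -48, -47]   # open-air readings
through_wall_dbm = [-55, -54, -56, -55, -54]    # readings behind the wall

attenuation_db = statistics.median(line_of_sight_dbm) - statistics.median(through_wall_dbm)
print(f"Estimated wall attenuation: {attenuation_db:.1f} dB")
# -48 dBm vs -55 dBm -> roughly 7 dB of loss through this wall
```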

So, Wall Attenuation Measurements are a type of "survey".

AP-ON-A-STICK SURVEYS

AP-on-a-Stick, or APoaS, simply means I have a pole, or mounting system of some kind, and place an access point (AP) temporarily at a location and height where I would like to see how the RF from this specific AP propagates in the environment. APoaS surveys are helpful in complex environments like warehouses, production facilities, or other environments where modeling in planning software can be difficult due to the complexity of the 3D environment you're in. So, by placing an AP at a specific height, on a specific channel, at a specific power level, I can measure in the real world and know exactly how this AP will cover the area we are interested in. This removes guessing, and we are using REAL data, not PREDICTED data, to evaluate coverage and inform our WLAN design.

There are many use cases for APoaS. I won't explain them all here, but common ones are:

  • Pre-Design: to validate what AP, or antennas, work best for your design

  • Post-Design: to validate a portion of your predicted AP locations before you deploy

  • Wall Attenuation Measurements: to confirm RF loss through an obstruction

There are others, but these are the ones I use the most when I do APoaS surveys.

VALIDATION SURVEYS

Like the name implies, this type of survey is used to "validate" that a particular WLAN implementation meets the requirements of the project. The WLAN is already in place, configured, and running as intended. With your survey software and adapters, you capture data throughout the entirety of the facility where Wi-Fi is a requirement. Once you've collected all the data, on all the channels that you care about (I typically survey ALL Wi-Fi channels), you can analyze the data and compare it to your requirements to see if it "passes". Things you might look at include: Primary Signal Strength, Secondary Signal Strength (for roaming), Co-Channel and Adjacent Channel Interference, interfering networks, Rogue APs, misconfigurations, channel width, channel usage, and more.
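
As a trivial illustration of the "compare it to your requirements" step, here's a minimal sketch that checks a few measurement points against pass/fail thresholds. The thresholds and readings are placeholders; use the requirements you actually gathered for the project:

```python
# Minimal sketch: check survey measurement points against pass/fail thresholds.
# The thresholds and measurements below are illustrative only; substitute the
# requirements you gathered for the project and your real survey export.

REQUIREMENTS = {
    "primary_dbm": -67,     # example: minimum primary signal strength
    "secondary_dbm": -75,   # example: minimum secondary signal (for roaming)
}

# (location, strongest AP dBm, second-strongest AP dBm)
measurements = [
    ("Lobby",        -61, -70),
    ("Conference B", -66, -78),
    ("Warehouse 3",  -72, -80),
]

for location, primary, secondary in measurements:
    ok = (primary >= REQUIREMENTS["primary_dbm"]
          and secondary >= REQUIREMENTS["secondary_dbm"])
    status = "PASS" if ok else "FAIL"
    print(f"{location:15} primary {primary} dBm, secondary {secondary} dBm -> {status}")
```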

When something is found that does not meet the requirements, you can then resolve the issue(s), and perform the survey again, to confirm the changes made now allow the WLAN to meet the requirements you have set. I may not always do a pre-design wireless survey, but I ALWAYS do a Validation Survey.

ACTIVE VS. PASSIVE SURVEYS

"Active Surveys" - Ooh, that SOUNDS important. As in, "we need an Active Wireless Survey, STAT!" I want to be Active, not Passive, don't I? Well, in the case of wireless surveys, Passive is where it's at. Let me clarify.

An ACTIVE survey implies there is activity. In this case, we mean actual data being transmitted and received. In order to perform an Active Survey, you MUST be connected to an AP. How many APs can a device connect to simultaneously? "There can be only one, Neo". This means that in order to perform an Active Survey you MUST be connected to one AP, and pass traffic. What DON'T you see when you are connected to a WLAN? Prepare to be shocked: you do not see ANY wireless frames! Zip. Zero. None. You can only see your own traffic, not that of any other devices, and what you see is upper-layer traffic - things like DHCP, DNS, IP addresses, webpages, SMTP, etc. Now, that may be great if all you are interested in is your own traffic, but you won't see the stuff that MATTERS when monitoring/surveying Wi-Fi - that is, 802.11 frames. That's where the magic is - in those Management, Control, and Data Frames that are completely invisible when you are connected to a WLAN.
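
To make that concrete: those Management, Control, and Data frames are only visible to an adapter in monitor mode, never to one that is associated. A minimal sketch of counting them with Scapy, assuming you already have an interface in monitor mode (the interface name is a placeholder):

```python
# Minimal sketch: count 802.11 frame types seen by a monitor-mode interface.
# Assumes "wlan0mon" is already in monitor mode on the channel of interest and
# that you run with sufficient privileges; an associated (connected) adapter
# never hands these frames to you.
from collections import Counter
from scapy.all import sniff, Dot11  # pip install scapy

FRAME_TYPES = {0: "Management", 1: "Control", 2: "Data"}
counts = Counter()

def tally(pkt):
    if pkt.haslayer(Dot11):
        counts[FRAME_TYPES.get(pkt[Dot11].type, "Reserved")] += 1

sniff(iface="wlan0mon", prn=tally, timeout=30)  # placeholder interface name
print(dict(counts))
```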

Also, when you are CONNECTED to a WLAN you only see your own traffic, at that time, at that location, with that specific device. You don’t see the ENTIRETY of the WLAN and how neighboring WLANs interact with it. Your view is extremely limited and tells you nothing about the health of the WLAN. Suffice it to say that, for me, Active Surveys are rare, if I do them at all.

"PASSIVE Survey" sounds weak. Who wants to be passive? The Terminator's not passive, Sarah Connor ain't passive - I don't wanna be lame! The truth is Passive Surveys is what you NEED if you want to see what matters in understanding the health of a WLAN. By “Passive” we mean, you are NOT connected to the network. You can only monitor if you disconnected and listening ONLY. Passive Surveys allow you see ALL the channels and networks around you, not just the one you are connected to for an Active Survey. AND you see all the 802.11 frames! The magic of how Wi-Fi actually works! Only a passive survey will reveal how bad your Co-Channel, or Adjacent Channel Interference is, or if you have coverage in all the areas you care about, or SECONDARY coverage for seamless fast roaming. Passive Surveys can even reveal details about the configuration of the WLAN, if you have the right security, are your Basic, or Supported channels a potential problem, is there a rogue AP that was brought into your environment that shouldn't be there? These are things you cannot see with Active Surveys.

You can skip an Active Survey, but ALWAYS do a Passive Survey.

TROUBLESHOOTING/OPTIMIZATION SURVEYS

These types of surveys are essentially the same as a Validation Survey. The main difference is that a Validation Survey simply confirms that an implementation meets a set of requirements, usually as the final part of the planning, design, and implementation of a new WLAN, whereas these surveys are used to TROUBLESHOOT the cause of a wireless issue, or to determine if the existing WLAN design can be improved upon, or OPTIMIZED, for a new, changed, or updated use case. The process is the same - a passive survey, on all the channels that matter, in all the areas that matter to the customer. You then analyze the data and determine if it meets the REQUIREMENTS you established by talking to the customer about the project.

That’s all for now…

As you can see, the word SURVEY is a loaded term. It can mean many things to different people. So, it's important to understand what it is exactly you are looking for, and how you need to collect that data. It should always start with determining the REQUIREMENTS for the WLAN. Gather as much data as you can BEFORE you begin a WLAN design. Collect data with a Passive Survey on all the channels you care about and in all the areas that matter. Finally, analyze the data and hold it up against the requirements you have determined, to confirm whether or not the WLAN meets those requirements.

SURVEY - not as simple as it seems.

Download this as a white paper.

Bad Design at Your Request


What does one do when presented with a highly questionable request from a customer? Nothing immoral here, just a potential customer asking you to do something you know won’t work. I had this exact scenario happen this summer with a resort that wanted to do a Wi-Fi refresh.

Originally, the customer just wanted to do a rip-and-replace - swap out their existing 7-year-old, 2.4GHz APs with new APs. After some discussion we convinced them that a simple swap-out was not the best solution. We agreed to do a predictive design using data collected from our site visit.

One of the caveats was that the APs could not be in the rooms. For aesthetic reasons, and others, management wanted no APs in rooms. We knew this would not be an ideal solution, and we let this be known on several occasions. After explaining our reasoning, IT was in agreement with us on this matter. However, the management was not convinced and decided to take a chance on the hallway “design”.

We did our best collecting data on-site and used that data in our predictive model. In the end, the initial deployment was a hallway placement. We adjusted some AP locations and added some, but I knew this solution would not yield the desired results. Directional antennas brought the budget to more than they wanted to spend, so those were not an option. They also wanted this done on a tight timeline. I had made my concerns known on multiple occasions, but there was no budging from management.

So, I could choose not to do this project and walk away, or sell a solution in which I was not confident. Well, I have a business to run, bills to pay, and employees to… employ. I chose to “design” a solution within the constraints - both AP placement and budgetary. With this in mind, I drafted the cover letter below for the design I presented to the customer:


Thank you for this opportunity to allow us to present you with a wireless access solution for your wonderful property. Before you proceed to the information in the document, please indulge us and read this overview in its entirety. It will clarify the purpose and scope of this document.

This report is a “Predictive Survey”. The term “predictive” is used deliberately to denote the fact that the process used in the creation of this report is a best guess and will most likely not be 100% accurate. In a structurally complicated deployment such as yours we can probably assume a 75-80% accuracy rate.

Predictive surveys are a very useful tool for the Wireless LAN Professional when a full on-site survey is cost-prohibitive or otherwise unavailable. There are two methods to perform a predictive survey:

OPTION 1. Using only the floor plans and a questionnaire, we can use the survey software to automatically place the APs and then manually adjust to our specifications. We can alternatively manually add the APs to specification. This type of survey is best for a traditional, modern, open-floor-plan office environment where the loss and performance characteristics are well known. This is also the most cost-effective (up front) solution and very commonly used. In the end, you get the best data out when you put the best data in, so this option should be used sparingly and mainly for budgetary purposes.

OPTION 2. Perform a physical site survey to become aware of the building materials in use, current AP locations and limitations, the physical build of the rooms, furniture materials and locations, etc. Also, using the same APs and antennas that are being proposed, take as many readings as possible to determine the true signal loss at varying distances from the AP. We are also able to look at various types of RF interference that may be present, and possibly mitigate that interference before the deployment begins. This allows us to use that data to better build our predictive model. This is less cost-effective than Option 1, but more cost-effective than a full site survey, and allows the model to go in with the best possible data that the circumstances allow.

Option 2 is what we have done. We spent time on site taking readings and noting material types as we could. Obviously, in an environment as busy as yours it was not possible to have access to every location, so we did the best we could, taking readings in multiple rooms and room types and through the various material types at your location. This is not perfect, nor definitive, but it will at least give us valuable information to use when building our predictive model.

We would also like to take this moment to state that locating all the APs in the hallways is the least effective model in a multi-room, multi-tenant environment such as yours. The best-case scenario here is allowing the APs to be located within the rooms. This allows us to use the structure of the building itself to provide separation between the APs and help mitigate interference, as well as getting the RF signal closer to the clients. We understand that this is not always possible for a variety of reasons, but we felt the need to make you - the customer - aware of the limitations to which this design has been restricted.

In conclusion, we ask that you look at the following report with the information in this overview in mind, and with the understanding that after deployment we highly recommend that we (or a 3rd party) perform a validation survey to confirm where the predictive model falls short. Depending on the results of this verification, there may be several things that need to be done. It may be adding, relocating, or even removing some APs, or we may simply need to disable certain AP radios to reduce co-channel interference. Either way, the design is not complete in our view if it has not been validated after implementation.

With this in mind please proceed to review the report. Thank you.

Eddie Forero, Principal CommunicaONE Inc.


The report essentially showed what we had been saying - that APs in the hallways would not provide the in-room coverage they desired. We also provided an alternative design with APs in the rooms. In the end, management went ahead with the hallway solution despite IT's misgivings.

The end result was not much better than what they had. I fully expected to bear the wrath of the customer. I was not happy that I installed a solution I didn’t believe in. And I was not expecting what happened next.

The Director of IT flew out to the head office to present the results of our post-installation validation survey. He showed that the hallway APs were providing great “coverage” in the hallways, but not in the guest rooms. He explained how we had predicted that this design would not give them the results they were after, and that the gamble did not pay off.

Because we had been very clear about our concerns, and because we had clearly stated, then validated, those concerns, management decided to foot the bill for a complete in-room redesign (using different APs). And not only that, but also to light up another property next door!

Maybe we should have walked away. But instead, I stated clearly why the solution would not work and made sure they were aware of the drawbacks. I don’t know if I would do this again, but I will definitely make even more of an effort in the future to have the customer deploy the right solution the first time around. It’s more cost-effective and less stressful.

I don’t know if this is helpful to anyone, but I figured I’m not the only one who has had projects where your hands were tied. The moral of this story is: stand your ground. In this case it worked out because the customer realized the error and stepped up to do it right. But make sure you fully lay out the issues. Be respectful of the customer, but respect your skills and knowledge as well.

What’s been impossible on iOS, but easy on Android for years, has finally come (back) in iOS 8.

Wi-Fi scanning can now be performed. You can see SSIDs, even hidden ones, and view RSSI. For now, it’s only available via the Apple Airport Utility and it needs to be manually enabled in settings.

Sorry, no API access for 3rd party developers (yet), but at least WLAN aficionados can finally scan wi-fi on iOS devices!

Download Apple Airport Utility:

https://appsto.re/us/YJ7Dz.i