Which Meeting & Calling Solutions Objectively Deliver the Best Quality Experience?
Tain Barzso

January 13, 2021 · 5 min read

Zoom is on a mission to make video communications frictionless and secure to get more done. While some people may have only started using Zoom in the last year, we have worked tirelessly over the last decade to deliver exceptional audio and video within an intuitive user experience. Customers tell us they choose Zoom because of our high-quality meetings and calls, but how do you quantify that?

To answer that question analytically, we engaged an independent third party, Wainhouse Research, to evaluate the Zoom experience against other cloud meeting and phone providers. Wainhouse designed a methodology for evaluating solutions based on current industry standards for measuring audio and video quality. While Zoom commissioned the study, we had no role in designing the Wainhouse methodology or scoring system.

This research started with a single question for Wainhouse Research: how can we objectively evaluate the audio, video, and overall performance quality of meeting and calling solutions?

Evaluation focus

Wainhouse Research evaluated meeting and phone quality across six vendors and 10 solutions, conducting more than 2,000 tests.

Source: Quality Evaluation Enterprise Meetings & Calling report, Wainhouse Research, January 2021

Enterprise Meetings - Video Quality

One key set of tests focused on the VMAF score, a standard built to evaluate the performance and effectiveness of video encoding systems by comparing sent (reference) and received (processed) video files. VMAF (Video Multimethod Assessment Fusion) uses a machine-learning model trained on human ratings to predict how a user would score each test video on a scale of 0 (bad) to 100 (excellent).
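
A minimal sketch of this kind of measurement, assuming ffmpeg built with libvmaf support, is shown below. It scores one received (processed) clip against its sent (reference) clip; the file names are placeholders, the report does not describe Wainhouse's exact tooling, and the JSON log layout varies by libvmaf version.

```python
# Hedged sketch: score a processed clip against its reference with VMAF via ffmpeg's
# libvmaf filter. Assumes ffmpeg is on PATH and was built with libvmaf support.
import json
import subprocess

def vmaf_score(processed: str, reference: str, log_path: str = "vmaf.json") -> float:
    """Return the pooled mean VMAF score (0 = bad, 100 = excellent) for one clip."""
    cmd = [
        "ffmpeg", "-i", processed,   # first input: the received/processed clip
        "-i", reference,             # second input: the sent/reference clip
        "-lavfi", f"libvmaf=log_fmt=json:log_path={log_path}",
        "-f", "null", "-",
    ]
    subprocess.run(cmd, check=True, capture_output=True)
    with open(log_path) as f:
        result = json.load(f)
    # Key layout depends on the libvmaf version; recent versions pool per-frame scores here.
    return result["pooled_metrics"]["vmaf"]["mean"]

# Example (placeholder file names):
# score = vmaf_score("received_20pct_loss.mp4", "reference_light.mp4")
```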

Source: Quality Evaluation Enterprise Meetings & Calling report, Wainhouse Research, January 2021

This chart depicts average VMAF scores (Y-axis): each value is the average across three reference videos (light, dark, and motion) at each of the baseline (normal), 5%, 20%, 40%, and 70% packet-loss steps (X-axis).
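
The report does not say what network-impairment tooling Wainhouse used; purely as an illustration, the sketch below steps a Linux test client through packet-loss levels like those on the X-axis using tc/netem.

```python
# Hedged sketch: emulate the packet-loss steps from the chart on a Linux test client
# using tc/netem. Requires root privileges; the interface name is a placeholder, and
# this is not necessarily how Wainhouse introduced impairments.
import subprocess

IFACE = "eth0"                      # placeholder network interface
LOSS_STEPS = [0, 5, 20, 40, 70]     # percent packet loss, matching the X-axis

def set_packet_loss(loss_pct: int) -> None:
    """Apply (or clear, when 0) a random packet-loss rate on the interface."""
    # Remove any existing qdisc; ignore the error if none is present.
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=False)
    if loss_pct > 0:
        subprocess.run(
            ["tc", "qdisc", "add", "dev", IFACE, "root", "netem",
             "loss", f"{loss_pct}%"],
            check=True,
        )

# for loss in LOSS_STEPS:
#     set_packet_loss(loss)
#     ...run the meeting client, capture the received video, score it with VMAF...
```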

Average of Raw VMAF Scores from Wainhouse evaluation:

Source: Quality Evaluation Enterprise Meetings & Calling report, Wainhouse Research, January 2021

This table matches the average VMAF scores chart: the average across three reference videos (light, dark, and motion) at the baseline (normal), 5%, 20%, 40%, and 70% packet-loss steps.
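
Purely to illustrate how such an average is formed, the sketch below averages raw per-video VMAF scores (light, dark, motion) for each provider and packet-loss step; the provider name and numbers are placeholders, not the report's data.

```python
# Hedged sketch: average raw VMAF scores across the three reference videos for each
# provider and packet-loss step. All values below are placeholders.
from statistics import mean

raw_scores = {
    # provider -> packet-loss % -> reference video -> raw VMAF score
    "Provider A": {0: {"light": 95.0, "dark": 93.0, "motion": 91.0}},
}

averaged = {
    provider: {loss: mean(by_video.values()) for loss, by_video in by_loss.items()}
    for provider, by_loss in raw_scores.items()
}
# averaged["Provider A"][0] -> 93.0
```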

Highlights:

  • Zoom starts with the highest measured video quality based on the standard services offered by each provider.
  • At 20% packet loss, Zoom is still in the EXCELLENT range.
  • At 40% packet loss, Zoom is in the GOOD range, while all others drop to FAIR or below.
Enterprise Calling - Noise Suppression

Wainhouse also conducted tests to evaluate the noise suppression performance of enterprise calling solutions. Noise suppression has become more important as more people are working outside a corporate office setting. Many providers have improved their noise suppression capabilities, so this study compares those side by side.

Source: Quality Evaluation Enterprise Meetings & Calling report, Wainhouse Research, January 2021

This test starts with a reference file of heavy brown noise, which sounds like a loud air conditioner running at full speed. The reference noise is sent through each solution with suppression enabled (when available), recorded on the other side, and the amount of recorded noise is measured against the original reference file. The orange waveforms represent the brown noise present in each test: the unfiltered noise file appears above, and each solution's recording is compared to the original noise file (shown as a grey waveform) below.
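
For readers who want to build a similar test signal, the sketch below shows one common way to approximate a brown-noise reference file (white noise integrated over time, which concentrates energy at low frequencies); the duration, level, and file name are assumptions, not Wainhouse's actual test material.

```python
# Hedged sketch: generate an approximate brown-noise reference WAV file.
# Duration, output level, and file name are assumptions.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 48_000
DURATION_S = 20

white = np.random.default_rng(0).standard_normal(SAMPLE_RATE * DURATION_S)
brown = np.cumsum(white)        # integrating white noise yields brown noise
brown -= brown.mean()           # remove DC drift from the integration
brown /= np.abs(brown).max()    # normalize to [-1, 1]

wavfile.write("brown_noise_reference.wav", SAMPLE_RATE,
              (brown * 0.8 * 32767).astype(np.int16))
```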

While a real-life example would focus on how background noise suppression can make a voice more audible, this test aims to push each calling provider's noise suppression to the extreme to understand its effectiveness and the time required to begin suppressing noise.
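
The sketch below illustrates the kind of measurement described here: comparing each recorded output with the reference in short windows to estimate how much noise was removed and roughly when suppression began. The window size, the 10 dB threshold, and the file names are assumptions, and the recording is assumed to be mono, 16-bit, and time-aligned with the reference.

```python
# Hedged sketch: estimate noise reduction and suppression onset from a recording.
# Window size, threshold, and file names are assumptions.
import numpy as np
from scipy.io import wavfile

WINDOW_S = 0.25  # analysis window length in seconds

def windowed_rms_db(signal: np.ndarray, rate: int) -> np.ndarray:
    """RMS level in dBFS per window for a float signal scaled to [-1, 1]."""
    win = int(rate * WINDOW_S)
    n = len(signal) // win
    frames = signal[: n * win].reshape(n, win)
    rms = np.sqrt(np.mean(frames ** 2, axis=1)) + 1e-12
    return 20 * np.log10(rms)

rate, ref = wavfile.read("brown_noise_reference.wav")
_, rec = wavfile.read("provider_recording.wav")   # placeholder recording, assumed aligned
ref = ref.astype(np.float32) / 32768.0
rec = rec.astype(np.float32) / 32768.0

ref_db = windowed_rms_db(ref, rate)
rec_db = windowed_rms_db(rec, rate)
reduction_db = ref_db[: len(rec_db)] - rec_db[: len(ref_db)]  # positive = noise removed

# First window with at least 10 dB of reduction (argmax returns 0 if never reached).
onset_window = int(np.argmax(reduction_db > 10))
print(f"Suppression onset ~ {onset_window * WINDOW_S:.2f} s, "
      f"average reduction ~ {reduction_db.mean():.1f} dB")
```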

The following visual shows how each provider's calling service suppressed the brown noise from the same reference file.

Source: Quality Evaluation Enterprise Meetings & Calling report, Wainhouse Research, January 2021

  • Webex Calling - begins to recognize the background noise around 11 seconds into the audio and only slightly reduces the noise.
  • 8×8 - recognizes background noise before the first second and reduces it slightly, but then increases the suppression around 12 seconds.
  • RingCentral - also recognizes the noise before the first second, but increases the suppression earlier at around 9 seconds.
  • Microsoft Teams - recognizes the noise immediately and reduces the background noise more than the previous providers, but still shows greater variability in how much noise is reduced over this period of time. Note that this test used Microsoft's latest noise suppression technology update.
  • Webex Teams - also recognizes the noise immediately, but does a better job of maintaining a more consistent noise suppression.
  • Zoom - also recognizes the noise immediately, reduces the background noise the most, and maintains the strongest noise suppression of this sample set. It even continues to reduce the noise further as time progresses.

These two evaluation findings represent only a small portion of the thousands of tests conducted by Wainhouse Research to help answer the question of which solution provider objectively delivers the best quality meeting and calling experiences.

To learn more about how Wainhouse Research designed their testing methodology, controlled variables, weighted scoring, and calculated their answer, please join us for a fascinating Zoom webinar on Jan. 28. This webinar will also include an overview of additional evaluation results such as Enterprise Meeting Audio Quality, Enterprise Meeting System Performance, Comparisons of Subjective to Objective Testing, and Enterprise VoIP Audio Quality.

Register for the Jan. 28 webinar 'Service Quality Matters: Wainhouse Research Discusses Recent Findings on Meetings and Phone' today!


