ASP Best Support Websites Replaces the Top Ten


Al Hahn

Executive Director

For the past twenty-two years, ASP has held an annual competition for the Top Ten Best Support Websites. Each year, independent judges have evaluated many different attributes of the sites; currently, 25 different areas are judged and scored. Each entrant receives a confidential site report detailing its average scores in all 25 areas, along with judges’ comments, screenshots, and recommendations for improving the site. The winners get trophies and the use of a special award winners’ logo. We usually do webinars with some of the winners, and we issue joint press releases to promote the winners’ accomplishments. These bragging rights are hotly contested. After this long, however, it is time to freshen things up.

Starting this year, we will no longer honor the top ten best support websites. We will still evaluate and score the 25 different areas of the sites and still provide the confidential site reports to those who enter. We will change how we determine the winners, however. The very top scorers will still win an award, but not ten of them. We will look for outstanding areas to honor, such as Best Community, Best Search, and Best Personalization. It will be more like the Oscars, where there is a Best Picture, but they also honor Best Screenplay, Best Documentary, and Best Soundtrack. This will make the awards and the annual report on the winning sites much more interesting, as we can delve more deeply into specific features, processes, and technology. The awards will be more diverse and more useful to support organizations looking to improve their customer experiences. We will look for specific areas to honor among all the entries, not just the top ten: a site with one or two really outstanding features can be honored even if its overall scores are not that high. Because great features are not restricted to a simple numerical count, more sites have a chance to be honored this year. The nominations will come from our judges, who actually use the sites’ features. We will also ask them to be alert to features that are not represented in our scoring but may be an early use of a new technique or technology. This will increase our visibility into innovations. I am really excited to see what these changes bring to our competition and reports.

We are also making it a bit easier to enter the competition. Entrants will no longer be required to submit an essay. They can still send us an essay if they choose, but they now have the option of simply filling out an online form. The form essentially asks the same questions we want answered in the essays, but it will be a bit easier for many people to complete. After all, how many of us liked essay questions when we were in school?

We start taking entries April 27th, and the deadline is June 22nd. For $900 to $1,500, depending on your company’s size, you get your site evaluated by five independent judges who have been vetted for relevant expertise and screened for conflicts of interest. Some of them are noted support consultants, such as David Kay and Françoise Tourniaire. Many have judged sites for us for years, such as Jennifer Macintosh, who has judged for 20 years. Getting your site reviewed by consultants can easily cost $20,000. With ASP, you get five independent reviews and can also win an award. What do you have to lose? Nothing! Any critical judging will not be shared with anyone: only you see your confidential site report, and we do not publish the names of entrants, so there is no black eye if you do not win an award. Go here for more information.
