Scoring Criteria

The Ten Best Web Support Sites of 2018: Site Scoring System

Scoring Instructions for Judges

For each of the 25 criteria described on the following pages, please assign a numerical score that reflects your judgment about this item. Most of the questions deal with your subjective impression of the site, supplemented by information supplied by the entry essays, so please draw the best conclusion you can from the available information.

Each item is worth 0-10 points. We’ve offered brief text descriptions that correspond to scoring levels, but there is no single blueprint for a perfect support site. If the site developers have found an innovative way to deliver a quality support experience, feel free to assign a higher score than our text descriptions might suggest (not above 10.0, please). Similarly, you should downgrade a poor implementation.

Note that we have three categories of entries—Small (under $100 million), Medium ($100 million to $999 million), and Large ($1 billion and up) Company Divisions. Although the scoring questions are the same, each company’s entry competes only against others in its Division. This approach is intended to “level the playing field” for companies that don’t have the resources to support complex, full-featured sites.

For each item, you may assign any score between 0 and 10.0, to a tenth of a point (e.g., 3.5). Since a site’s overall standing will be based on a cumulative score (maximum 250 points), it’s important that you assign a score for every item; answers like “n/a” or “how would I know?” are the same as giving a score of zero points. However, if you can’t locate a feature (because of password problems or difficult navigation), or if an entry essay doesn’t provide essential information, feel free to assign a low score for that scoring question. Note that we are encouraging companies that have onboarding or site overview videos to include links in their essays to help familiarize judges with their sites. Please view any video links provided and reflect them in your scores as appropriate.

Finally, the comments: We’d like you to describe the three most important enhancements you’d recommend for each site. Length is up to you—anywhere from a few notes to a mini-essay, if that’s what you feel like writing. Comments are no longer optional, because we’ve found that most people who enter are very eager to hear feedback from the judges about how to improve their sites. Your comments are the single most important benefit of the competition to those who enter.

Site Scoring System

■ Overall Usability, Design, and Navigation
Based on your first impressions of the site and your subsequent exploration of key features, please assign scores for the following five criteria:

1. What’s your impression of the support site’s overall navigation?

  • Strong, consistent navigation model helps users quickly find all post-sale support- and service-related resources (10.0)
  • Tech support resources are easy to find, but other post-sale content (e.g., forums, training, downloads, phone support, videos, and other support channels) is not well integrated into the main support site (5.0)
  • All elements of the site are poorly organized and hard to find (0.0)

2. What’s your impression of the support site’s pathways to relevant content?

  • Link-based pathways are clearly defined; usually, fewer than three clicks are needed to reach popular services and high-traffic articles (10.0)
  • Pathways are logical, but some obstacles (e.g., excessive clicks, multiple logins) interfere with access to content (5.0)
  • Redundant, confusing pathways to content (0.0)

3. Does the site support a personalized “My Support” view of content?

  • Yes, users can set up profiles based on product ownership and other preferences that automatically pre-configure most of the content and services they access (10.0)
  • Yes, but personalization is mostly limited to product ownership (5.0)
  • No personalization (0.0)

4. How would you describe the clarity and accuracy of navigational links and menus?

  • Links and menus are always clear and used consistently to describe content; no jargon; no need for trial-and-error navigation (10.0)
  • Some links are vague or don’t lead to expected destinations (5.0)
  • Much inconsistent, misleading link and menu terminology (0.0)

5. How does the site help new customers and first-time visitors?

  • Site includes general orientation feature (welcome message, site tours) (10.0)
  • No general orientation feature, but instructions on individual features are clear (5.0)
  • Site is difficult for new visitors to understand (0.0)

■ Knowledgebase and Search Implementation
After exploring the site’s content and trying a few knowledgebase queries,* please assign scores for the following five criteria:

1. What’s your impression of the site’s overall knowledgebase integration?

  • Search engine displays relevant content from multiple sources of information, including third-party/community resources (10.0)
  • Search engine displays relevant content from multiple sources within the company, including forums (5.0)
  • Search engine deals primarily with knowledgebase content (0.0)

2. What’s your impression of the KB search tool?

  • Incorporates advanced features, such as autocomplete, synonym recognition, spell checking, Boolean filters (10.0)
  • Standard features, with optional filtering by product and other user-defined relevancy criteria (5.0)
  • Keyword searches based on text matching; many off-topic results (0.0)

3. How well does the search engine identify highly relevant articles?

  • On-topic articles tend to appear in the first page of search results, and there are rarely any near-duplicate results (10.0)
  • On-topic articles tend to appear in the first two or three pages of results, often mixed in with near-duplicate results (5.0)
  • Most searches generate an excessive number of off-topic and near-duplicate results (0.0)

4. What’s your impression of the FAQ (or similar high-value articles) sections?

  • Answers to “frequent” questions are well-written and illustrated, often with more breadth than occurs with single-topic tech notes (10.0)
  • FAQ exists, but answers are mostly just links to standard tech notes (5.0)
  • No FAQ, or hard to find (0.0)

5. How does the site gather feedback from knowledgebase users?

  • Asks for document quality scores and suggestions about how to improve each tech note (10.0)
  • Asks only if the document solved the user’s problem (5.0)
  • No knowledgebase feedback method (0.0)

* Suggestion: To create a test query, select a question from the middle of the FAQ list and reword it.

■ Interactive Features
Please assign scores for the following five criteria:

1. How would you describe the site’s use of technology to interact with customers?

  • Developers make good use of advanced tools (e.g., remote diagnostics, configuration wizards, wikis, webinars, automatic alerts) (10.0)
  • Competent implementation of standard Web technology (5.0)
  • Significant technical glitches (e.g., sluggish performance, multiple logins) (0.0)

2. What options are available for online case management (including repair and parts orders)?

  • Case tracking lets qualified users manage categories of incidents (e.g., all users in a company, all incidents of a specific type). (10.0)
  • Qualified users can easily open single cases online and track pending cases. (5.0)
  • No way to submit cases online (0.0)

3. If the site fails to resolve a problem, what escalation options (free or fee-based) do users have?

  • Easy access to telephone or chat support (10.0)
  • Escalation to telephone or chat support is available but hard to find (5.0)
  • No telephone or chat escalation options (0.0)

4. How does the site support downloads of patches, drivers, and upgrades?

  • Site helps users determine specific download needs with auto-detect tools, advanced filtering, or product profiles (For Web-based/SaaS products, upgrades should occur automatically.) (10.0)
  • User selects downloads from menus or lists; process is fairly intuitive (5.0)
  • No downloads available, or download process is difficult to use (0.0)

5. How does the site deliver support content and services to users with smartphones and tablets?

  • All support content and features are accessible and usable via a mobile-specific app or a responsively designed Web site. (10.0)
  • Partial implementation: Some features have been adapted for mobile devices, but not the entire support site. (5.0)
  • No specific adaptation for mobile devices; content displays just as it would on a standard PC. (0.0)

■ Community Engagement
Please assign scores for the following five criteria:

1. How broad is the interaction between the site and its users?

  • Site actively solicits “voice of the customer” engagement in creating new content, brainstorming, taking part in user groups and forums, and other customer/community activities (10.0)
  • Customer engagement is largely limited to online forums (5.0)
  • There are no forums or other customer engagement channels (0.0)

2. How do the site managers measure the success of their online forums?

  • Key metrics emphasize forum participation, specifically numbers of active members and postings (10.0)
  • Key metrics emphasize call deflection (5.0)
  • No formal metrics for communities (0.0)

3. What kind of participation do the forums attract?

  • Forums provide a platform for discussions of success-oriented business tactics as well as break-fix technical problems (10.0)
  • Forums are mostly limited to solving product-specific support issues (5.0)
  • Forums are largely inactive, with few recent postings (0.0)

4. How do forums attract outside expert contributors?

  • Forums showcase experts with well-crafted VIP programs, including awards, leaderboards, and/or special access to resources and information (10.0)
  • Top contributors are acknowledged, but in-house support staff handle many questions in order to deliver faster responses (5.0)
  • No special efforts to identify or reward top contributors (0.0)

5. Does the support organization invite users to create personal profiles?

  • Yes, and the profiles allow users to advertise their skills and interests for networking purposes (10.0)
  • Profiles are available, but rarely contain much information (5.0)
  • No profiles are available (0.0)

■ Site Development Strategy
Based on your review of the site-challenge essay and your experience with the site, please assign scores for the following five questions:

1. How would you describe the company’s overall site development strategy?

  • Ongoing, multi-year initiative to deliver state-of-the-art customer experience (10.0)
  • Catch-up project to remedy major deficiencies (5.0)
  • Ongoing site maintenance focused on relatively minor issues (0.0)

2. How did the site developers gather information to guide their strategy?

  • Developers used formal user-experience methodologies (e.g., customer experience journey mapping, persona/task design, user observation, A/B testing, traffic analysis). (10.0)
  • Research emphasized technical data about specific infrastructure components and tools (5.0)
  • Decisions seemed to be based on site developers’ personal preferences and anecdotal customer feedback (0.0)

3. How did the site developers measure the success of their efforts?

  • Measured site performance and customer satisfaction against long-term trend lines (10.0)
  • Measured success mostly by subjective, anecdotal feedback (5.0)
  • Little evidence that success was measured in any way (0.0)

4. How did the site developers try to build support for their strategy?

  • Actively sought buy-in from internal and external stakeholders for the site improvements; involved users in beta testing (10.0)
  • Conducted a pre-launch marketing campaign to raise user awareness of the site enhancements (5.0)
  • Little effort to generate support (0.0)

5. How would you describe the “lessons learned” from implementing this strategy?

  • Insightful, strategic, applicable to broad range of situations (10.0)
  • Lessons are mostly technical and apply only to the company itself (5.0)
  • Few insights gained from site development experience (0.0)

Eligibility and Entry Procedures
Entries may be submitted by individuals, companies, or outside firms that participated in the site development process. Each entry must be accompanied by a completed application form, along with an essay that describes the major challenge the site developers faced and three key site features. (The essay should be submitted in electronic form.) We will also accept video links, such as an onboarding or site overview video, to help familiarize the judges with your site. Do not attach video files; they are too large. Only links will be accepted, and providing them will probably help your scores.

Deadline for all entries is Friday, March 30, 2018.

The judges will use a scoring system that addresses five general areas:

  • Overall usability, design, and navigation
  • Knowledgebase and search implementation
  • Interactive features (forums, e-mail, chat, etc.)
  • Community engagement
  • Site development strategy

The ten winning sites will be profiled in an annual ASP report that will include screen shots, essays, and related material. All entrants will receive a complimentary PDF of this report after publication.

Entries should be submitted to The Association of Support Professionals in electronic form (see application for details). Entry fee is $950 for large company entries, $750 for medium company entries, and $550 for small company entries.
