
website evaluation

introduction

After onboarding to a large federal government project protected by security clearances, one of my first tasks was to perform a heuristic evaluation of one of the organization’s public-facing sites. My project’s definition of a heuristic evaluation:

“a comprehensive, task-focused assessment of a product’s usability issues, along with recommendations, conducted by one or more evaluators with extensive knowledge of usability principles, also known as ‘heuristics.’”

The purpose of this heuristic evaluation, as defined for our stakeholders, was:

“… to assess the interaction design of a product and identify usability issues through well-researched usability principles. The issues outlined throughout this evaluation are not opinion-based. They are grounded in these well-established usability principles.”

Based on these two definitions, my goal was to compile all the findings into an easily digestible, stand-alone slide deck for stakeholders to review and use in their decision-making. This content-heavy slide deck included:

  • The 14 heuristic principles used in the evaluation

  • Background knowledge on how I conducted the evaluation

  • 5 recommendations for site improvement based on the heuristics 

  • Next steps in the process

  • The complete heuristic evaluation, with imagery, the relevant heuristics, and information on how I would fix the site

full heuristic evaluation

The review used the following format for every page and sub-page on the site. Below is an example slide-deck page:

Each heuristic evaluation slide contains:

  • An image of the webpage for reference

  • A number indicating which part of the page has an issue

  • A grey sidebar on the right, which includes:

    • Project Name

    • Date the heuristic was reviewed

    • The name of the screen/image shown (e.g., Homepage)

    • A number on the image (left) indicating the area of the page being referenced, the severity of the issue, an explanation of how the heuristic is broken, and a recommendation for improvement (a sketch of how such findings could be captured as structured data follows this list)
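As a purely illustrative aside, each finding in this format maps naturally onto a small structured record. The sketch below is hypothetical; it was not part of the deliverable, and all names and values are invented.

```python
# Hypothetical record for one finding; the fields mirror the slide format above.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

@dataclass
class Finding:
    screen: str           # name of the screen/image shown, e.g. "Homepage"
    callout_number: int   # number on the image marking the issue area
    heuristic: str        # which of the 14 heuristics is violated
    severity: Severity    # low / medium / high
    explanation: str      # how the heuristic is broken
    recommendation: str   # suggested improvement

# Invented example, not a real finding from the evaluation.
example = Finding(
    screen="Homepage",
    callout_number=1,
    heuristic="Consistency and Standards",
    severity=Severity.LOW,
    explanation="Link styling differs between the header and the body.",
    recommendation="Apply one link style across all page templates.",
)
print(f"[{example.severity.value}] {example.screen} #{example.callout_number}: {example.heuristic}")
```

Structuring findings this way also makes severity tallies, like those in the next section, a one-line aggregation.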

 

Findings

As the evaluation I performed cannot be made public, here are some high-level data points from the review:

  • 50 low-severity violations

  • 4 medium-severity violations

  • 5 high-severity violations

the background knowledge

The background knowledge provided to our stakeholders, intended to give a holistic view of this process, included the following 14 heuristics:

  • Sense of Place: From each screen, users should understand where they are within the system, the actions available to them, and the appropriate method for continuing along, or abandoning, their current path.

  • Accessibility: Make sure text and images are legible by using appropriate font sizes and contrast ratios. Also ensure that green and red are not used as the primary means of differentiating important information, for people who are colorblind (the WCAG contrast-ratio formula behind these checks is sketched just after this list).

  • Match between system and the real world: The system should speak the users’ language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

  • Cross Pollination: Feed related information across the application to minimize the number of times users need to enter information.

  • Consistency and Standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

  • Error Prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

  • Value Proposition: Communicate the purpose and benefits of your product. Ideally, users should recognize the things they want and need right away.

  • Recognition rather than recall: Minimize the user’s memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

  • Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

  • Hick’s Law: As the number of options increases, so does the time it takes to make a decision.

  • Visual Momentum: Make transitions between pages and windows smooth so users know where they are and where they came from.

  • Help and Documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.

  • Affordances and Negative Affordances: Affordances are properties of an element that implicitly tell a user how to use it. Negative Affordances are properties that lead people to use a product incorrectly. First, remove negative affordances and then add positive affordances. 

  • Transparency: Make the system’s operations that will affect the user’s actions, and the success of those actions, clear to the user. Users should always know the system status.
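For the Accessibility heuristic above, the contrast ratios evaluators check against are defined by a published WCAG formula. Below is a minimal sketch of that calculation; it is illustrative only and was not part of the stakeholder deck.

```python
# WCAG 2.x relative luminance and contrast ratio, per the published spec.

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance of an sRGB color given 0-255 channels."""
    def linearize(channel: int) -> float:
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """Contrast ratio between two colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# WCAG AA requires at least 4.5:1 for normal body text.
print(round(contrast_ratio((119, 119, 119), (255, 255, 255)), 2))  # ~4.48, just below AA
```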

The web browser used in the evaluation was Google Chrome; when I performed this evaluation in 2020, 48% of people in the United States used this browser. Following this general background on heuristic evaluations, I presented stakeholders with 5 critical recommendations for overall site improvement. They were:

  1. Design Accessibility

  2. Menu

  3. Page Layout

  4. Site Name versus Organization Name’s Website

  5. Organization’s Language versus Target Audience

 

Slide deck images

After presenting the main recommendations to stakeholders and reviewing the next steps, the slide deck showcased the entire heuristic evaluation I performed on the site.

secondary information

Following the heuristic evaluation, I performed additional site analysis to give the client a complete view of how the site was performing for their users. The further analysis I conducted included:

  • A Review of Accessibility Coding Violations

  • A Readability Analysis

  • Recommendations on Metrics and Measurement 

  • A Competitor Analysis

Accessibility Coding Violations

To give the client additional accessibility information, I used a browser extension to compare the website’s code against the Web Content Accessibility Guidelines (WCAG). According to this extension, I found on the client’s website (a scripted equivalent of this kind of scan is sketched after these findings):

 

  • 9 distinct types of accessibility coding violations

  • 94% of the site had no violations or items to review for accessibility.
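The original review used a browser extension, which I don’t name here. As a hedged illustration only, here is how a roughly equivalent scripted scan could be run with the open-source axe-core engine via the axe-selenium-python package; the URL and driver setup are placeholders, not the client’s environment.

```python
# Scripted WCAG scan using axe-core through axe-selenium-python.
# Assumes Chrome plus a matching chromedriver; the URL is a placeholder.
from selenium import webdriver
from axe_selenium_python import Axe

driver = webdriver.Chrome()
try:
    driver.get("https://example.gov")  # placeholder, not the client's site
    axe = Axe(driver)
    axe.inject()         # inject the axe-core script into the page
    results = axe.run()  # evaluate the DOM against axe's WCAG rule set
    # Group violations by rule id to get a count of distinct violation types.
    by_rule = {v["id"]: len(v["nodes"]) for v in results["violations"]}
    print(f"{len(by_rule)} distinct violation types")
    for rule, node_count in sorted(by_rule.items()):
        print(f"  {rule}: {node_count} affected element(s)")
finally:
    driver.quit()
```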

Readability Analysis
As language plays a significant role in this client’s website, I researched metrics to find one that would clearly show how the site’s content compares to the intended audience. As the target audience includes both technical users and the general public, the guidance from the Nielsen Norman Group is:


“Aim at an 8th-grade reading level if targeting a broad consumer audience. If writing for an educated or specialized B2B audience, still target a reading level several steps below the audience’s formal-education level. A 12th-grade reading level is often a good target to make text easy for readers with college degrees.”

  • All of this is based on the U.S. education system, with 12th grade corresponding to the final year of high school.


I used another website to automatically scan the client’s site and assign each page a reading level based on the U.S. public school system. After scanning every page on the site, the findings were as follows (a common grade-level formula is sketched after these results):

  • 25% of the webpages scanned used language scored at a 10th-grade reading level or above

  • 59% of the webpages scanned used language scored at a 9th-grade reading level

  • 16% of the webpages scanned used language scored at an 8th-grade reading level or below
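Automated readability scanners commonly map text to a U.S. grade with a formula such as Flesch-Kincaid. I cannot confirm which formula the tool I used applies, so the sketch below is an illustrative approximation only.

```python
# Flesch-Kincaid grade level:
#   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
# The syllable counter here is a rough heuristic; real tools are more careful.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as runs of vowels, minus a common silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(round(flesch_kincaid_grade("The quick brown fox jumps over the lazy dog."), 1))
```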

Recommended Metrics
During the evaluation phase, I asked the product owners whether customer metrics were available. In response, I presented the client with a list of recommended customer metrics; how these metrics would be implemented, and which to gather, would be determined in future meetings.

Competitor Analysis
Usually, a competitor analysis is completed against direct and indirect competitors in business-to-business spaces. However, I determined that the client’s federal site is the first source of legal data, meaning it has very few direct competitors. Instead, my creative solution was to compare the client’s site against state-level sites to see how the data is visually displayed to end users. For context, in UX design a competitor analysis is defined by the Nielsen Norman Group as:


“Competitive usability evaluations are a method to determine how your site performs in relation to your competitors’ sites. The comparison can be holistic, ranking sites by some overall site-usability metrics, or it can be more focused, comparing features, content, or design elements across sites.”


In my research, I found that the data from our client’s site can be displayed at varying levels of detail across the local, state, and federal levels. Because of this variation, many states create their own versions of the federal client’s website. Using six examples, I created a summary of each for our product owners, covering:

  • The name of the site or company

  • Whether they are a direct competitor, part of the same organization, or a state- or county-level example of the site

  • A summary of who they are

  • What they do differently

  • Their estimated target audience

  • Whether they use the data from the client’s site

  • The link to the website

  • Why it was used as a showcase example

Given the depth of this research and the sheer number of variations I found during this analysis, I created a visual representation of the findings using a U.S. map.
