Research: Heuristic Review

Research • Analysis • Report • Presentation


The UX team had been asked by Frontline’s VP of Business Solutions to conduct a heuristic review of the newly acquired TEAMS software. Heuristic evaluations are established as a quick, cost-effective way to assess a new property. Additionally, Frontline’s long-term strategy is to seamlessly integrate any acquired “solution” into its platform. Thus, this type of review is a way to gauge the scope of such an integration.

My Role

I headed up the heuristic review. I was responsible for researching the 10 primary user tasks by interviewing 6 internal stakeholders, and I conducted 4 one-hour heuristic review sessions (“rumbles”), each enlisting 2–3 fellow designers. In total, we documented 165 observations, each assigned a corresponding heuristic and a severity rating.

Data from this study was then compiled into a report listing 18 recommendations, as well as a top-line report listing the top 5.


This evaluation allowed us to:

  • Quickly become familiar with the new software and its primary workflows
  • Establish where the software aligns and doesn’t align with the established Frontline design system
  • Assess and document usability issues
  • Make recommendations to improve the usability
  • Provide optional approaches that give product management and development flexibility in implementation, e.g., enhancements vs. complete redesigns


I met with several stakeholders seeking a consensus on what were considered the 10 main task flows. During this discovery phase, I took notes and was particularly interested in the pain points stakeholders were observing. Additionally, I was seeking an in-depth review not only of the software but of the domains, which I was completely new to: finance, recruiting and hiring, payroll, and human resources. To understand a very complex system, I also used schematics/models that I showed to stakeholders during meetings. If they disagreed with the model, I made adjustments as we chatted. This proved to be an effective way to collaborate with stakeholders who are remote and don’t have access to a particular online tool (in this case, Miro).

Notes, documented task flows, and system Model

Conducting the Heuristic Rumble

The Heuristic Rumble is based on Nielsen Norman best practices. I recruited 2–3 reviewers (UX designers) per session. I walked the reviewers through each flow and then allowed them to perform the task independently. Time was allotted for the reviewers to complete an online Excel chart where they entered each observation/heuristic violation and ascribed a severity rating to it. In all, 10 flows were reviewed and a total of 165 observed violations were recorded. With such a robust set of data, I was able to draw conclusions with a high level of confidence.

Monitor showing TEAMS software and Excel sheet of documented heuristic observations.

Analysis and Report

From the data, we saw two big heuristic offenders: Heuristic #4, Consistency and Standards, and Heuristic #8, Aesthetic & Minimalist Design. These findings agreed with the general comments I heard from the reviewers. The results also aren’t surprising considering that we were dealing with enterprise software that was over 10 years old.

Showing part of the heuristic report published to Confluence, as well as aggregated observations and severity ratings.
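The aggregation behind these findings is straightforward to sketch. The snippet below is a minimal illustration, not the actual analysis: the column names and sample rows are hypothetical, since the real 165 observations lived in a shared Excel sheet.

```python
from collections import Counter

# Hypothetical sample rows standing in for the shared Excel log
# (each real row held an observation, its heuristic, and a 1-4 severity).
observations = [
    {"heuristic": "4. Consistency and Standards", "severity": 3},
    {"heuristic": "8. Aesthetic & Minimalist Design", "severity": 2},
    {"heuristic": "4. Consistency and Standards", "severity": 4},
    {"heuristic": "1. Visibility of System Status", "severity": 1},
]

# Count violations per heuristic to surface the biggest offenders.
counts = Counter(row["heuristic"] for row in observations)

for heuristic, n in counts.most_common():
    print(f"{heuristic}: {n} violation(s)")
```

Ranking by raw count is only a first pass; weighting by severity (e.g., summing severities per heuristic instead of counting rows) gives a similar picture when, as here, the most frequent offenders also carry the higher ratings.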


The presentation is designed for an executive-level audience as well as stakeholders, so the deck is broken into two sections. The first half reflects the larger picture: I start with the summary, or what might be considered the “top-line” report (see slide 5), and then substantiate those findings with the aggregated data.

In the second half, I show examples of heuristic violations as well as some best practices. Though these are part of the Recommendations section, my intent is not to prescribe solutions but to start a conversation about where the team might go.


For a newly acquired solution, the TEAMS heuristic review proved to be a comprehensive way to assess and clearly define usability issues that are actionable.

This report offers us tremendous value! . . . In fact, we are going to use this as part of our 2020 initiative planning.

—TEAMS Product Manager

Certainly, not all stakeholders were as enthusiastic, and understandably the review met with some pushback. And yet, this is where I saw the real strength of conducting such a study: because the heuristics derive from industry standards, usability issues were clearly defined, and recommendations were data-driven, based on the observations of several experts.

Peter Sawchuk
Aligning all aspects of UX


