Simpplr

A personalized home dashboard for an AI-powered employee experience platform

Research Question

How does the home dashboard fit into End users’ overall work process?

Method(s)

Survey
Contextual Inquiry
Card Sorting
Heuristic Evaluation
Usability Testing

Impact

Improved the SUS score by 73%
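For context on this metric: the System Usability Scale (SUS) is a standard 10-item questionnaire scored on a 0–100 scale. The sketch below shows the conventional scoring formula (odd-numbered items are positively worded, even-numbered items negatively worded); it is illustrative only and not the study's actual analysis script.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from one participant's ratings.

    responses: list of 10 Likert ratings (1-5), in questionnaire order.
    Returns a score on the 0-100 scale.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires 10 ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, 7, 9 (index 0, 2, ...) are positively worded: contribute r - 1.
        # Items 2, 4, 6, 8, 10 are negatively worded: contribute 5 - r.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 item sum to 0-100
```

A study-level score is typically the mean of per-participant scores, which is how before/after improvements like the one above are compared.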

Background

Simpplr is a company founded in 2014 whose primary product is their eponymous employee experience platform.

An employee experience platform is an intranet with social capabilities, i.e., a private internal network where employees can share files and host internal communications.

Simpplr has 1,000+ clients across multiple industries, including DocuSign, Nutanix, and the World Economic Forum.

Our team was paired with Simpplr through the course PSYC 6023: Psychological Research Methods for HCI at the Georgia Institute of Technology.

Role: UX Researcher

  • Improved the SUS score by 73% through moderating 8 research sessions and identifying 35+ user needs with stakeholder buy-in
  • Facilitated 3 workshops to brainstorm 15+ research questions in collaboration with cross-functional partners

Role: UX Designer

  • Actively engaged in design critique for multiple stages of 35+ screens
  • Developed 3 iterations of the “Social” tab
  • Participated in 4 rounds of ideation

Team

UX Researchers (2), UX Designers (2)

Duration

Aug 2024 - Dec 2024 (5 Months)

Target Users

End Users

  • Primarily consume content on the home dashboard
  • Have limited customization abilities across the system

Other Roles

  • App Managers
  • System Administrators
  • Content Writers
Problem Statement

How might we tailor the home dashboard to better support the work tasks of End users?

Survey & Contextual Inquiry

  • We conducted a survey with Qualtrics as well as complementary contextual inquiries via Zoom.

We chose to disseminate a survey to gain insight into current usage of and preferences for the home dashboard, in a way that maximized validity while minimizing participant effort and the potential for bias.

I led contextual inquiries so we could (1) develop an understanding of End users’ overall work process and (2) identify the role the home dashboard plays within this process. Additionally, this method allowed us to directly observe participants in action and produce a rich level of qualitative data.

Our participants were a range of Simpplr employees who utilize Simpplr for work tasks.

  • Survey: 36 participants
  • Contextual Inquiries: 3 participants

We reached out to 40+ prospective participants but unfortunately received only 3 responses; our target users weren't willing to commit the extended amount of time this method requires.

🌱 Learning Moment

Our contextual inquiries often devolved into more traditional interviews. We should have communicated the nature of the method more effectively to keep these sessions focused on participants demonstrating their work process.

User Persona

I supported creating a user persona because I wanted to give our team and internal stakeholders an approachable way to digest our research data and therefore empathize with target users.

Our user persona was directly based on the findings from our survey and contextual inquiries.

Turning Point

“It’s not truly a dashboard if everything’s there.”

Inspired by our findings, I realized our product isn't truly a dashboard if it's showing the user everything—that’s just a webpage. On the other hand, a dashboard surfaces content that’s most relevant to the user. Therefore, we utilized our research to determine what was most relevant to the End user and placed this content prominently within our design.

Card Sorting

  • I spearheaded card sorting with 15 participants utilizing Useberry.

I chose card sorting because our turning point necessitated a reimagination of Simpplr’s information architecture and we wanted to ensure our new design corresponded with End users’ mental models. Furthermore, we carried out this method in a remote, unmoderated environment so we could gather data as quickly as possible while minimizing the possibility of influencing participants.

Our participants were End users who were employed by either Simpplr or Simpplr’s clients.

  • Simpplr employees: 10 participants
  • Simpplr’s clients: 5 participants
    • Nutanix, UKG, Ivanti, Silicon Labs

I included a disproportionate number of Simpplr employees because they use the platform themselves and were more receptive to participating in research (I sent 200+ outreach messages to employees of Simpplr's clients).
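Unmoderated card-sort results are commonly analyzed with a co-occurrence (similarity) matrix: the percentage of participants who placed each pair of cards in the same group, which then informs the new information architecture. A minimal sketch of that analysis follows; the card names are hypothetical, not the actual study data.

```python
from collections import defaultdict
from itertools import combinations

def co_occurrence(sorts):
    """Compute pairwise card similarity from open card-sort results.

    sorts: one entry per participant; each entry is a list of groups,
    where a group is a set of card names the participant placed together.
    Returns {(card_a, card_b): % of participants who grouped them together}.
    """
    counts = defaultdict(int)
    for groups in sorts:
        for group in groups:
            # Count every unordered pair within the same group once.
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    n = len(sorts)
    return {pair: count / n * 100 for pair, count in counts.items()}

# Illustrative data for two participants (hypothetical card names):
sorts = [
    [{"Feed", "Mentions"}, {"Files", "Tasks"}],
    [{"Feed", "Mentions", "Tasks"}, {"Files"}],
]
matrix = co_occurrence(sorts)
```

Pairs with high co-occurrence (here, "Feed" and "Mentions" at 100%) suggest cards that End users' mental models treat as belonging together.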

Ideation

Heuristic Evaluation & Usability Testing

  • We performed 5 heuristic evaluations and 5 moderated usability testing sessions via Zoom and UserTesting.

We chose heuristic evaluations because we wanted to quickly determine larger usability issues and integrate common heuristics into addressing them.

I advocated for usability testing because I wanted to (1) validate previous design decisions, (2) uncover pain points, and (3) gain a holistic understanding of the system's usability. Moreover, this method allowed us to observe participants directly and made them feel more comfortable sharing, given the more personal nature of moderated sessions.

Our participants were experts and End users who fulfilled our recruitment criteria.

  • Heuristic Evaluation: 5 participants (experts)
    • 1+ years of experience with UX design
  • Usability Testing: 5 participants (End users)
    • Utilized company intranet for 1+ months

Outcomes

  • Improved the SUS score by 73%
  • Contributed to the overhaul of Simpplr’s information architecture
  • Validated previous research findings such as End users’ desire for personalization

Challenges

  1. Understanding internal stakeholders’ definition of success
    • I asked follow-up questions to dig deeper into how internal stakeholders perceived and measured key indicators such as adoption; this allowed us to clarify that they viewed productivity as the main driver for adoption
  2. Not falling prey to feature creep
    • I urged our team to categorize features into must-have vs. nice-to-have (initially informed by our impact effort matrix) so we didn’t continue to bloat Simpplr’s UI and frustrate End users
  3. Determining how much to consider business constraints
    • I advocated for following internal stakeholders’ directive to design without constraints while keeping an alternative direction in mind
  4. Reconciling multiple work styles
    • I committed myself to understanding and adapting to my teammates so we could achieve more as a group

Reflection

  1. Various Perspectives
    • It is crucial to recruit participants with varied perspectives so you can gain a broader and more accurate picture of how users interact with a system.
  2. Importance of Lo-Fi
    • While it’s tempting to move straight into high fidelity, we gained valuable insights by receiving feedback on lower-fidelity wireframes first (and ultimately saved time and resources).
  3. Thorough Planning
    • Crafting a robust plan as a team before each stage can help streamline task delegation and alleviate stress.

Thank you for reading! 😁
Contact me at me@uxeli.com
