
UMich M-LEAD

Inform students about leadership and campus involvement opportunities to build leadership competencies

Overview

Role

User Experience Researcher, Needs Assessment and Usability Evaluation

Team

Team Project

Client

M-LEAD, University of Michigan

Methods

User interviews, Prototype testing, Usability testing

Duration

January 2020 - May 2020

Client Goal

Inform students about leadership and campus involvement opportunities to build leadership competencies

Target Population

 Current UM students

Task For Team

Evaluate the current website and provide an analysis of how efficiently and effectively it reaches students across campus. In addition, deliver recommendations for the visual design of the website, along with other recommendations for how the site can be improved.

Research Questions
  1. What are students’ opinions toward leadership skill-building and development opportunities?

  2. How effectively does the M-LEAD website reach students across campus?

  3. How usable are the website’s overall workflow and design?

Research Process

We used a wide variety of research methods, each of which produced its own findings and recommendations. The methods are summarized briefly below.

  • Interaction Map – mapped the current structure of the website

  • Interviews – 4 user interviews, coded for common themes

  • Comparative Analysis – 8 comparable systems evaluated against shared criteria

  • Surveys – 33 complete responses analyzed in Qualtrics

  • Heuristic Evaluation – Nielsen’s 10 usability heuristics with severity ratings

  • Usability Testing – 5 moderated sessions with student participants

Interaction Map

Our first task was to create an interaction map to understand the current structure of the website. We each went through the site to note the interactions a user would experience, then aggregated this information into a map of the current website structure.


Interviews

We started by preparing an interview protocol and then recruited four participants. We conducted interviews in pairs, with one team member moderating and the other taking notes.


Afterwards, team members individually coded the interview transcripts, and we then came together as a team to find overarching themes in our coding and generate our main findings and recommendations. We concluded the interview phase by creating personas and scenarios to better understand our users.

Comparative Analysis


The team selected 8 comparable systems. We created evaluation criteria based on what we felt would best help our stakeholders reach their goals, then divided the systems among teammates to evaluate. Afterwards, as a team, we created and used a comparative matrix to assist in our analysis of the systems.


Surveys

A pilot survey was sent to a few students who had expressed interest in helping with the website redesign; it helped us see where the survey could be improved before we distributed the official version. The official survey was posted on the website and sent to the all-student email list. We received 33 complete responses, which we analyzed in Qualtrics to produce our findings and recommendations.


Heuristic Evaluation

We first interpreted Nielsen’s 10 Usability Heuristics in terms of the M-LEAD website. Team members then individually evaluated the website based on these interpretations and assigned each finding a severity rating using Nielsen’s severity scale. We aggregated our findings and prioritized the violations by average severity rating. For the heuristic violations with severities of 3 or 4, we discussed which ones would have the greatest impact on a user’s experience and chose those for our findings and corresponding recommendations.
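To illustrate the aggregation step, here is a minimal sketch in Python of how individual severity ratings could be averaged and filtered down to the high-severity (3–4) range. The issue names and scores below are hypothetical examples, not our actual findings.

    from statistics import mean

    # Hypothetical severity ratings (0-4, per Nielsen's scale) assigned by
    # four evaluators to each heuristic violation.
    ratings = {
        "Unclear filter labels": [4, 3, 4, 3],
        "Broken 'More about the Event' link": [3, 4, 3, 3],
        "Low-contrast footer text": [1, 2, 1, 2],
    }

    # Keep only violations whose average severity reaches 3, then sort
    # from most to least severe for prioritization.
    prioritized = {
        issue: mean(scores)
        for issue, scores in ratings.items()
        if mean(scores) >= 3
    }

    for issue, avg in sorted(prioritized.items(), key=lambda kv: -kv[1]):
        print(f"{avg:.1f}  {issue}")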

Usability Test

To recruit participants for our usability test, we reached out to survey respondents who had volunteered to participate in future project interviews (the last question in our survey). Seventeen people expressed interest in helping, so we sent each of them a recruitment email (see Appendix A) and asked them to fill out a Google form with their availability; 5 people responded. We then scheduled a one-hour session on each participant’s calendar, with an invitation that gave a short overview of the usability test and a link to the digital consent form.


Final Findings & Recommendations

FINDING: Users expected a clearer explanation of the filters section on the Opportunity page, and specific filters they expected are currently missing.

RECOMMENDATIONS:
  1. Add additional explanation to the commitment filters

  2. Add logic to disable some filters under certain circumstances

  3. Add the additional filters users expect to see

FINDING: Users found the program descriptions shown after a search insufficient and expected them to be presented in a more organized way.

RECOMMENDATIONS:
  1. Show a detailed description for each displayed program and add a “read more” button

  2. Make the competency tags under displayed programs more label-like

FINDING: The “More about the Event” button on the speaker series page did not lead to the page users expected.

RECOMMENDATION: Relink the “More about the Event” button to information about the event

FINDING: Users expected social media icons to be more prominent and easier to find, and would prefer additional social media platforms.

RECOMMENDATIONS:
  1. Present social media icons in the footer of each page

  2. Consider marketing through additional social media channels

A combined report with more details about each method and its full list of findings and recommendations is available upon request; just send me an email!

© 2022 by Anviksha Agrawal
