Turning Pain Points into Progress:
58% More Satisfied with the Indiana University Mobile App Experience

EdTech

Mixed Methods Research

B2C

TLDR

My Role

Conducted a mixed-methods study involving a UX audit, analytics review, surveys (50 students), interviews (6 students), and card sorting (10 users) to uncover usability issues, map user mental models, and inform an intuitive re-architecture of the app.

Core Experience Gap

Poor navigation and scattered features made key tools hard to access

The Solutions

Information Architecture Updates Illustrated Through Wireframes

HIGHLIGHTS

Successfully delivered recommendations & redesigns that led to

58%

increase in User Satisfaction with the app

3x

increase in Task Completion Rates

Now, let’s dive into the full story

PRODUCT OVERVIEW

IU (Indiana University) Mobile is an app for students, faculty, and staff that streamlines campus life by providing easy access to essential tools like maps, grades, schedules, and events, all in one place.

PROBLEM STATEMENT

Challenges with the App’s Usability

Users struggled to navigate the app and access key features efficiently, leading to frustration and reduced engagement. Accumulated usability issues and a poor user experience made interactions difficult, ultimately compromising the app's ability to fulfill its purpose as a reliable campus companion.

Reviews from the Play Store

The call was clear: the IU Mobile App needed to be more intuitive and user-friendly.



How might we improve the IU Mobile App’s usability for seamless access and a better experience?

RESEARCH QUESTION IDENTIFICATION

Process

Research Workshop with UX Design, Development, and University Stakeholders

Goals

  • Identify key research questions

  • Establish Usability audit objectives

  • Define focus areas for evaluation

  • Align stakeholder priorities

Research Question Examples

  1. What are the key pain points users experience when navigating the IU Mobile App?

  2. How effectively does the app enable users to access core features like bus tracking and grades?

  3. What specific elements of the interface contribute to user confusion or frustration?

METHODOLOGY SELECTION

My Hypothesis

The app’s usability is causing frustration and inefficiencies for users. Navigation does not align with the mental models of students, faculty, and staff. The design lacks intuitiveness, making key features hard to discover and use.


Method 1: Usability Audit

To establish a baseline understanding of usability issues, I conducted a UX Audit of the app using heuristic evaluation. The goal was to uncover general usability issues and establish a foundation for further research. This helped identify core problems before diving into user behavior analysis.

I evaluated both light mode and dark mode to ensure consistency across themes and address potential challenges specific to each. The design was evaluated on two levels:

UX

  • The usability audit uncovered structural issues, navigation challenges, and unintuitive features misaligned with user mental models, causing frustration in both light and dark modes.

UI

  • The audit also focused on inconsistencies in scannability and usability across light and dark modes.


  • Heuristic violations and visual design issues were addressed to ensure an intuitive and cohesive experience for users across themes.

Some of the issues found are:

Accessibility concerns

1.0

Low Contrast Search Bar in Dark Theme

The search bar in dark mode has a low contrast ratio, making it difficult for users to locate, violating Heuristic #4: Consistency and Standards.

2.0

Low Contrast in Shortcuts Menu

In dark mode, the shortcuts menu uses dark green text on a black background, resulting in low contrast that makes it hard to read and violates Heuristic #4: Consistency and Standards.
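To make findings like these concrete, here is a minimal sketch of the WCAG 2.1 contrast-ratio check that such issues are measured against. The hex values are hypothetical stand-ins for the dark-green-on-black shortcuts text, not the app's actual palette.

```python
# Minimal WCAG 2.1 contrast-ratio check. The hex values used below are
# hypothetical stand-ins for the dark-green-on-black shortcuts text.

def linearize(channel: int) -> float:
    """Linearize an 8-bit sRGB channel per the WCAG 2.1 definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color: str) -> float:
    """Relative luminance of a hex color like '1B5E20'."""
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Hypothetical dark-green text on a near-black background:
print(f"{contrast_ratio('1B5E20', '121212'):.2f}:1")  # ~2.4:1, below WCAG AA's 4.5:1
```

Anything below 4.5:1 for normal text fails WCAG AA, which is the kind of threshold these audit findings point to.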

Touch Target & Shortcuts Menu


Inconsistent Categorization & Information Overload

1.0

Disorganized Content Grouping

Despite having a "Campus" section in the navigation, related information (such as Evening Transportation, Student organizations, Student newspapers, Laundry, and First Year Experience) is buried under "More," contributing to poor categorization and overwhelming content. This violates Heuristic #2: Match Between System and the Real World.

These findings suggested significant navigation and visibility issues, prompting me to investigate user behavior patterns through analytics review.

Method 2: Analytics Review

To validate usability issues identified in the audit, I examined user engagement metrics to determine how users interacted with the app and where they dropped off. Due to confidentiality, I cannot specify the features reviewed, but special attention was given to both high-traffic pages, to understand why engagement dropped off, and low-traffic pages, to identify potential discoverability issues.

Observations

High-traffic pages showed engagement times of just 4–18 seconds, indicating possible usability challenges or incomplete tasks.

Low-traffic pages highlighted poor discoverability of features that users could potentially benefit from.

The data confirmed that users were either struggling with navigation or abandoning tasks prematurely, reinforcing findings from the usability audit.
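Because the real dashboards are confidential, the sketch below only illustrates the shape of this triage, assuming a hypothetical page_metrics.csv export with page, views, and avg_engagement_s columns; the thresholds are illustrative, not the ones actually used.

```python
# Illustrative triage of page-level analytics. The file name, columns,
# and thresholds are hypothetical; the real review used confidential data.
import csv

with open("page_metrics.csv", newline="") as f:
    pages = list(csv.DictReader(f))  # columns: page, views, avg_engagement_s

for p in pages:
    views = int(p["views"])
    engagement = float(p["avg_engagement_s"])
    if views >= 1000 and engagement < 20:
        print(f"{p['page']}: high traffic but short visits -> possible abandoned tasks")
    elif views < 50:
        print(f"{p['page']}: low traffic -> possible discoverability issue")
```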

Method 3: Surveys

I conducted surveys to quantify how widespread these issues were and to identify patterns in user frustration. The aim was to gather quantitative insight into which features were most used, how navigation confusion impacted the experience, and whether usability concerns affected engagement.

Tools Used

Google Forms was used to design and distribute the survey due to its accessibility and ease of data collection.

Screening Criteria

Participants were recruited exclusively from Indiana University’s active student population to gather insights from the app’s primary user base.

Participant Demographic

A total of 50 students participated in the survey, recruited to ensure representation across class years and disciplines, reflecting diverse user behaviors and challenges.

Results

Feature Usage Trends

Over 70% of participants reported frequent use of core features like Kuali Time and bus tracking.

Usability Concerns

2/5

On average, users rated the app’s ease of use 2 on a 1–5 scale, where 1 is very difficult and 5 is very easy, highlighting major usability challenges.

Navigation Issues

Students reported difficulty locating features, highlighting gaps in the app’s organization.

These insights revealed that students frequently used key features but found them difficult to access. This reinforced the need to understand how users conceptualized and organized app content, leading me to conduct user interviews.

Method 4: User Interviews


Protocol

Conducted one-on-one interviews with 6 students who were active users of the app.

Focused on understanding pain points, unmet needs, and their mental models of navigation.

Methodology

Semi-structured interviews with open-ended questions to allow participants to freely express their experiences.

Key questions included:

  • What challenges do you face when using the app?

  • How do you typically locate frequently used features?

  • What improvements would make the app easier to use?

  • Are there features you wish the app had or did better?

Insights

Complex Navigation

Students found the app's navigation confusing and cumbersome, struggling to locate specific features.

Feature Organization

Scattered feature placement forced users to “hunt” for frequently used tools.

Lack of Tutorials and Onboarding

The absence of onboarding tutorials made it harder for new users to adapt to the app.

Need for Enhanced Features

Many participants requested enhancements to existing features, such as real-time updates for bus tracking and clearer grade postings.

The most consistent problem across all users was the difficulty in locating frequently used features, which reinforced the findings from the surveys. Additionally, multiple participants expressed frustration over having to "hunt" for tools, validating the need for a clearer information architecture.

Method 5: Card Sorting

Insights from the user interviews revealed a fundamental issue with the app’s navigation. To address this, I needed to understand how students mentally grouped and categorized app features to create a navigation structure that better aligned with their expectations. This led me to conduct card sorting to uncover natural patterns in feature categorization.

Study Procedure

  • Participants: 10 participants were recruited for the study.

  • Methodology: An open card sort was conducted as diagnostic research to explore user mental models and improve the app's discoverability.

  • Tool Used: Optimal Workshop was used to facilitate the card sort study.

  • Number of Cards: 32 cards representing key app features and content areas were included in the study.

  • Analysis: Dendrograms were used to identify natural groupings and validate top-level categories (a clustering sketch follows the dendrogram below).

Study Observations

  • Participation

    All 10 participants completed the study, providing diverse input on how they organize features.

  • Time Spent

    The average completion time for the study was 11 minutes, with the fastest participant finishing in 5 minutes.

  • Category Creation

    Participants generated 45 categories, with a median of 5 categories per participant.

Dendrogram with best merge method
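For readers unfamiliar with the technique, here is a small sketch of how a dendrogram can be derived from card-sort data. The co-occurrence matrix (the share of participants who grouped each pair of cards together) is hypothetical and covers only five of the 32 cards, and SciPy's average linkage stands in for Optimal Workshop's best-merge view.

```python
# Sketch: dendrogram from open card-sort results. The co-occurrence values
# are hypothetical; average linkage approximates a "best merge" style view.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

cards = ["Class Schedule", "Grades", "Canvas", "Bus Routes", "Parking Permit"]

# co_occurrence[i][j] = % of participants who grouped card i with card j.
co_occurrence = np.array([
    [100,  90,  80,  10,   5],
    [ 90, 100,  85,   5,  10],
    [ 80,  85, 100,  10,   5],
    [ 10,   5,  10, 100,  70],
    [  5,  10,   5,  70, 100],
])

# Convert similarity to distance, then cluster hierarchically.
distances = squareform(100 - co_occurrence)
tree = linkage(distances, method="average")
dendrogram(tree, labels=cards)
plt.tight_layout()
plt.show()
```

In this toy data, the academic cards merge early and the transportation cards merge early, which is exactly the kind of separation that suggested the top-level categories below.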

Categories Sorted

Based on the card sort and dendrogram analysis, the following categories were identified:

Core Academic Tools

Features like Class Schedule, Grades, and Canvas were strongly associated with academic tasks.

Campus Navigation and Logistics

Features such as Bus Routes, Parking Permit, and Buildings were consistently grouped under campus navigation.

Student Support & Resources

Features like Health and Wellness, Sports Notifications, and Student Organizations formed a cluster related to student activities.

Help & Settings

Features like Tech Help, Notifications, and General Settings showed overlapping clusters with Help.

Information Architecture

With the categories defined, I created a hierarchical structure that aligned with user workflows. The information architecture grouped related features under intuitive headings and streamlined navigation:

  • Home: Central dashboard for quick access to essential tools like bus routes and news.

  • My IU: Academic tools and personal resources.

  • Maps: Campus navigation features.

  • Discover: Opportunities and resources for educational and extracurricular exploration.

  • More: Additional tools like parking management, tech help, and feedback.

  • Profile: User-specific settings and preferences.
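One compact way to read this hierarchy is as a nested structure, sketched below; the feature placements are illustrative, drawn from the card-sort categories rather than the final sitemap.

```python
# Proposed IA as a nested structure. Feature placements are illustrative,
# drawn from the card-sort categories rather than the final sitemap.
SITEMAP: dict[str, list[str]] = {
    "Home":     ["Bus tracking widget", "News", "Quick shortcuts"],
    "My IU":    ["Class Schedule", "Grades", "Canvas", "Kuali Time"],
    "Maps":     ["Buildings", "Food destinations", "Bus Routes"],
    "Discover": ["Student Organizations", "Events", "Health and Wellness"],
    "More":     ["Parking Permit", "Tech Help", "Feedback"],
    "Profile":  ["Notifications", "General Settings"],
}

# Design goal: each feature should have exactly one obvious top-level home.
assert sum("Grades" in items for items in SITEMAP.values()) == 1
```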

Creating Wireframes for Feedback

To validate the new information architecture, I created low-fidelity wireframes for key screens, such as the Home page, My IU, Maps, Discover, and More. The wireframes were shared with stakeholders and users to gather feedback on the categorization and navigation.

Usability Testing

The redesigned IU Mobile App aimed to address user frustrations with poor navigation and feature discoverability. After restructuring the app’s information architecture and creating a new sitemap, usability testing was conducted to assess whether the redesigned navigation and UI improvements enhanced usability.

Objective

To evaluate the impact of the redesigned information architecture and navigation system on:
1. Task completion success.
2. User satisfaction.
3. Navigation efficiency (time-on-task).

Methodology

Participants

  • 7 students from Indiana University, representing frequent and occasional app users.

  • Recruited to reflect diverse user behaviors and needs.

Tasks

Participants completed 4 key tasks:

Task 1

Prompt: Track a bus using the widget
Target: Home screen

Task 2

Prompt: Check class schedule
Target: Classes tab

Task 3

Prompt: Look for food destinations
Target: Maps

Task 4

Prompt: Find student organizations
Target: Discover tab

Metrics

  • Task Completion Rate: Percentage of successfully completed tasks.

  • Time-on-Task: Average time taken to complete each task.

  • User Satisfaction: Rated on a scale of 1 (very difficult) to 5 (very easy) for each task.

  • Qualitative Feedback: Comments and suggestions from participants about their experience.
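To show how these metrics roll up, here is a minimal sketch computing them from per-task session records; the field names and sample values are hypothetical.

```python
# Sketch: rolling up usability-test metrics from per-task session records.
# Field names and sample values are hypothetical.
from statistics import mean

sessions = [
    {"task": "Track a bus",          "completed": True,  "seconds": 62,  "rating": 5},
    {"task": "Track a bus",          "completed": True,  "seconds": 88,  "rating": 4},
    {"task": "Check class schedule", "completed": False, "seconds": 140, "rating": 3},
]

completion_rate = mean(s["completed"] for s in sessions) * 100
time_on_task = mean(s["seconds"] for s in sessions if s["completed"])
satisfaction = mean(s["rating"] for s in sessions)

print(f"Task completion rate: {completion_rate:.0f}%")
print(f"Avg time-on-task (successful runs): {time_on_task:.0f}s")
print(f"Satisfaction: {satisfaction:.1f}/5")
```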

Testing Environment

  • Conducted with a clickable prototype of the redesigned app.

  • Tasks were observed, and user interactions were recorded for analysis.

Findings

Task Completion Rates

3x increase, from 31% to 94%, indicating improved navigation and feature accessibility.

User Satisfaction

58% improvement, with ratings increasing from 3/5 to 4.5/5 due to clearer categorization.

Navigation Efficiency

67% faster task completion, reducing average time from 3m 45s to 1m 15s, reflecting better flow.

Reflections
What I would've done differently

Include Faculty and Staff Perspectives

The study focused entirely on students, which made sense given that they are the primary users of the app. However, faculty and staff also interact with features like class schedules, academic tools, and campus navigation. Including their point of view would have provided a more holistic understanding of user needs, especially regarding administrative and teaching-related functionalities.

Longitudinal Testing for Real-World Validation

The usability test was conducted in a controlled environment with a prototype, which may not fully reflect how users interact with the app in real-life scenarios. Running a longitudinal study, where users test the redesigned app over several weeks, would reveal long-term usage patterns, adoption rates, and unforeseen usability challenges.

Personal Learnings

Users Don’t Always Think Like Designers

Our assumptions about navigation were completely overturned during card sorting. Users grouped features differently, reminding me that design should be research-driven, not assumption-driven.

Numbers Alone Don’t Tell the Full Story

While quantitative results showed success, the most powerful validation came from user quotes. Hearing ‘I found what I needed so much faster’ was just as meaningful as the 3x improvement in task completion.

Let's get in Touch!

uxrsneha@gmail.com

Interviews, analysis, synthesis... and a latte