ACC Course Evaluation Guide (hero image)

Client

Austin Community College's Teaching & Learning Excellence Division

Challenge

Post-COVID course evaluation completion rate dropped from 94% to 34%

Solution

Google Calendar integration
Student testimonials
Mobile-first design

Impact

Reduced task time by 22%
Reduced errors by 15%
Reduced clicks by 29%


Team

3 Designers

My Roles

Lead UX Design / Strategy

Timeline

Jan – Apr 2022

Tools

Figma, Miro, Google Suite

Final prototype screens (Desktop Screen 1 and Desktop Screen 2)

PROBLEM

Student participation in course evaluations dropped sharply after they moved online due to COVID-19 regulations

When ACC shifted from in-class paper evaluations to digital-only formats in 2020 due to COVID-19, participation plummeted from 94% to 34%, even though 80% of participants rated the experience positively, suggesting the barriers were about engagement rather than quality. Our team partnered with the Teaching & Learning Excellence Division to investigate these deeper participation barriers, aiming to increase completion rates to 49% within one semester through a comprehensive plan that would address the disconnect between student satisfaction and participation.

View Project Timeline →

STRATEGIC CONTEXT

Our design outcomes directly supported ACC's enrollment goals

Our UX outcomes directly supported ACC's Strategic Enrollment Planning initiative, aligning with the college's equity and access mission while contributing to their goal of reaching 85,000 annual students by 2030. By enhancing communication about evaluations, creating effective notification systems, and designing a more user-friendly guide, we helped address the participation gap. This improved feedback mechanism strengthened students' sense of being heard and valued—directly advancing ACC's strategic focus on building the trust and belonging that encourages continued registration and sustainable enrollment growth.

"User research is essential for designing products that meet real needs. Assumptions are the death of good design."
- Irene Au, former head of the design teams at Google and Yahoo, currently a design partner at Khosla Ventures

RESEARCH

We used surveys, interviews, and competitive analysis to uncover participation barriers

I led competitive analysis, targeted surveys, and in-depth interviews to uncover participation barriers. Here are the key findings.

  • Surveys showed 80% prefer mobile access and 70% simply forget to complete evaluations
  • Our competitive review confirmed clear access instructions but revealed a lack of engagement hooks
  • Interviews showed students wanted to know "how it helps" and needed timely reminders to follow through

Key User Interview Quotes

"I have to kind of take initiative to put on my calendar, or I have to do it as soon as the notification comes through otherwise, I'm going to mean to do it and then forget to do it."

Participant 1

"I guess if I knew directly how it helps, in what way it's helped. Like if I know us doing it, how is it going to impact us?"

Participant 2

Overview

  • Verification: Conducted stakeholder sessions with Teaching & Learning Excellence Division
  • Screener criteria: Must have been enrolled for at least one full semester
  • Mixed-methods approach: Balanced depth with project constraints
  • Timeline: Four-month project duration
  • Sample size: 10 surveys, 7 interviews

User Interviews

  • Participants: 7 students across diverse programs, ages (18-53), and course formats
  • Environment: Comfortable, low-pressure setting for candid responses
  • Focus: Pain points, motivations, and suggestions for improvement
  • Format: Remote, one-on-one, 30-minute sessions

Surveys

  • Topics: Participation frequency, time investment, and perceived value
  • Format: In-person surveys at ACC Highland Campus
  • Participants: 10 students with diverse backgrounds
  • Benefits: Maximized accessibility and convenience

Main Research Questions

  1. What are students' perceptions of the impact and value of their course evaluation feedback?
  2. What are the biggest barriers preventing students from participating in course evaluations?
  3. How do students prefer to receive and manage academic communications and reminders?
  4. What are students' motivations for providing feedback on their courses and instructors?

Competitive Analysis

Here's what stood out when we evaluated three Texas-based community college course evaluation pages:

  1. Lone Star College
  2. Houston Community College
  3. San Jacinto College

What They Do Well

  • Each page clearly explains what course evaluations are and why they exist.
  • Students are told when evaluations open and how to access them.

What They're Missing

  • None of the pages directly show students how their feedback has made a real difference.
  • Most interfaces appear built for compliance, not conversion or engagement.
Combined competitor screenshot

Additional Survey Findings

  • Qualitative answers showed us that students described evaluations as "too long" with "too many questions" that "repeat too much".
  • 80% prefer completing evaluations on mobile devices, reinforcing the need for a mobile-first approach.
  • 70% of students simply forget to fill out evaluations, indicating an opportunity for better reminders.
  • 60% of students are motivated by bonus points added to their lowest grade.

Research Constraints

Constraint | Impact | Adaptation
No baseline analytics provided (page views, click-through rates, etc.) | Couldn't quantify page visits or drop-off rates | Relied on interviews & surveys
Screener criteria | Excluded students with under one semester's experience; new-student pain points may differ | Planned post-launch follow-up sessions with first-semester students to validate and broaden findings
Limited sample size & diversity | Insights from 10 surveys and 7 interviews may not generalize across all student segments | Applied purposive screening for key segments; laid groundwork for a larger post-launch survey
Project timeline | Four-month scope restricted the number of sessions and prototype iterations | Prioritized core research questions; used lean artifacts to maximize insights
No instructor & staff interviews | Lacked understanding of how and when instructors would promote the guide | Planned post-launch stakeholder interviews to optimize sharing workflows
No email-engagement metrics | Couldn't gauge open or click-through rates on reminder messages | Requested mailing-list analytics for post-launch performance tuning

User Personas

Based on our research findings, I developed three strategic user personas:

  • Bryan, who represents students who need to understand the feedback's purpose
  • Michelle, who represents students who need a reminder mechanism they control
  • Taylor, who represents students who expect polished, mobile-first experiences
Bryan avatar

Bryan

"I'm always too busy with assignments and work to fill out long evaluation forms."

Biography

Bryan is a 22-year-old student balancing a part-time job with a full course load. He values efficiency and quick processes.

Goals

  • Complete evaluations quickly
  • Provide meaningful feedback without too much effort
  • See that his feedback makes a difference

Frustrations

  • Too many evaluation forms to fill out
  • Unclear questions that take time to understand
  • No evidence that his feedback is considered
Michelle avatar

Michelle

"Everything I do goes into my calendar, its runs my entire life."

Biography

Michelle is a 24-year-old student. She's a double major who is very busy with her classes, family life, and social obligations. She values being able to put important dates on her calendar.

Goals

  • Maintain a clear plan and structure for tasks
  • Give back to ACC and help those coming after her
  • Treat time as her most valuable resource

Frustrations

  • Occasionally forgets to add things to her calendar or to-do list
  • Has a tendency to multitask, leading to task triage
  • At times needs extra input to understand how long a task will take
Taylor avatar

Taylor

"If I can't do it on my phone then it probably takes too long."

Biography

Taylor is a 21-year-old first-generation student who spends most of their time on the go and values mobile-first interactions.

Goals

  • Experiences with as little friction as possible
  • To stay aware of the latest events within their community
  • Representation is a paramount concern

Frustrations

  • May miss deadlines due to a constantly on-the-go schedule
  • Impatient with tasks and likes to move quickly
  • Critical of experiences and expects them to be well crafted

User Journey Map

Right after defining our personas, we mapped the student's end-to-end experience—from the moment they learn an evaluation is due (via professor announcement or email) through selecting an access method (Blackboard vs. email link) to finally completing the form.

"True innovation happens when you're willing to explore ideas that feel almost too ambitious or too different."
- Jony Ive, former Apple Chief Design Officer and founder of LoveFrom

SYNTHESIS

I identified three key opportunity areas from the research insights

Awareness & Communication


INSIGHT

70% of students forget to complete evaluations.

OPPORTUNITY

Add reminder mechanisms and clarify timelines.

Motivation & Incentives


INSIGHT

Students felt their feedback didn't matter.

OPPORTUNITY

Show how evaluations lead to real changes.

Design & User Experience


INSIGHT

80% preferred navigating online information with their mobile phone

OPPORTUNITY

Build a seamless, mobile-first experience.

Prioritizing What to Build

I created and facilitated an Effort–Impact Matrix workshop with my team to prioritize ideas based on user value and development feasibility, placing my calendar integration in the high-impact, low-effort quadrant.

Effort/Impact Matrix

Design Concepts

I created three distinct design concepts for the calendar integration feature, exploring different interaction patterns before settling on a dropdown-based approach that usability testing later showed reduced cognitive load. While my teammates focused on other areas, I championed integrating testimonials with calendar functions to address both engagement barriers simultaneously.

The Campus Voice

This idea leveraged visual segmentation of evaluation periods (addressing confusion noted in interviews) to improve awareness for students like Michelle, who struggled with timeline clarity.

Mobile mockup of the Campus Voice screen

The Direct Impact Portal

Here I explored whether directly showing completion rates could tap into the social-proof motivations identified for students like Bryan, potentially addressing the lack of perceived impact.

Mobile mockup of the Direct Impact screen, with headline, stats, and calendar button.

Project Constraints & Design Adaptations

Constraint | Impact | Design Adaptation
Privacy requirements | We couldn't display peer progress or public submission data | Swapped in student and teacher testimonials and used a student tone for messaging
Calendar integration | ACC couldn't natively support every platform (Outlook, Apple, Yahoo, etc.) | Defaulted to Google Calendar and offered a manual .ics export for others (sketched below)
Team design direction | Divided opinions on the emphasis of student testimonials vs. other features | I facilitated a structured critique session linking design preferences to research findings, establishing an evidence-based approach for resolving design disagreements
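For the manual .ics fallback noted in the table, here is a minimal sketch of what that export could look like, written in TypeScript purely for illustration. The dates, identifier, and wording are hypothetical placeholders rather than ACC's actual values; the VALARM block is what would carry the 72-hour reminder.

  // Minimal sketch of the manual .ics fallback (hypothetical values throughout).
  // The VALARM block fires a display reminder 72 hours before the window closes.
  function buildEvaluationIcs(opts: {
    uid: string;       // unique event identifier, e.g. per evaluation window
    summary: string;   // event title shown in the student's calendar
    startUtc: string;  // window opens,  e.g. "20220425T000000Z"
    endUtc: string;    // window closes, e.g. "20220509T000000Z"
  }): string {
    return [
      "BEGIN:VCALENDAR",
      "VERSION:2.0",
      "PRODID:-//ACC Course Evaluation Guide//EN",
      "BEGIN:VEVENT",
      `UID:${opts.uid}`,
      `DTSTAMP:${opts.startUtc}`,
      `DTSTART:${opts.startUtc}`,
      `DTEND:${opts.endUtc}`,
      `SUMMARY:${opts.summary}`,
      "BEGIN:VALARM",
      "ACTION:DISPLAY",
      "DESCRIPTION:Your course evaluation window closes soon",
      "TRIGGER;RELATED=END:-PT72H", // 72 hours before DTEND (the deadline)
      "END:VALARM",
      "END:VEVENT",
      "END:VCALENDAR",
    ].join("\r\n");
  }

  // Example: a hypothetical window for 16-week courses.
  const ics = buildEvaluationIcs({
    uid: "eval-16wk-spring2022@acc.example",
    summary: "Complete your ACC course evaluation",
    startUtc: "20220425T000000Z",
    endUtc: "20220509T000000Z",
  });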

The Concept That Was Selected For Usability Testing

Referring back to our matrix and incorporating stakeholder feedback, I proposed a hybrid concept combining Campus Voice's simplicity with real-world impact examples from the Direct Impact Portal, solving both the "forgetting" and "why it matters" barriers with a mobile-first approach that meets Bryan's, Michelle's, and Taylor's needs.

Usability Test UI
"The goal is not to confirm that your designs are good; it is to find every single thing wrong with your designs."
- Hillman Curtis, Chief Creative Officer at Hillman Curtis

VALIDATION

Usability testing showed improvements in efficiency, accuracy, and task completion

We employed Jakob Nielsen's Usability Metrics Framework to measure improvements in efficiency, accuracy, and completion. Based on our first round of testing, I implemented several targeted enhancements to my features:

Error Rate
40.19% Test 1
25% Test 2

15.2% improvement

Users completed evaluation guide tasks with significantly fewer errors, improving accuracy.

Time on Tasks
1:17 Test 1
1:00 Test 2

22.08% improvement

The streamlined interface reduced the average time needed to complete tasks from test one to test two.

Completion Rate
73% Test 1
96% Test 2

23% improvement

Users relied less on the "Contact Us" section to complete tasks, showing improvement in how information is conveyed.

Number of Clicks
102 Test 1
72 Test 2

29.41% improvement

The redesigned navigation reduced the number of interactions needed to complete tasks.
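A note on how these figures fit together: the time-on-task and click numbers are relative reductions, while the error-rate and completion figures are absolute percentage-point changes. Below is a quick illustrative recomputation (not the team's original analysis script) using the Test 1/Test 2 values above, with time converted to seconds.

  // Relative reduction, used for time on task and number of clicks.
  const relativeReduction = (before: number, after: number) =>
    ((before - after) / before) * 100;

  console.log(relativeReduction(77, 60).toFixed(2));  // time on task: "22.08" (1:17 -> 1:00)
  console.log(relativeReduction(102, 72).toFixed(2)); // clicks: "29.41" (102 -> 72)

  // Absolute percentage-point changes, used for error rate and completion rate.
  console.log((40.19 - 25).toFixed(1));               // error rate: "15.2" points lower
  console.log((96 - 73).toFixed(0));                  // completion: "23" points higher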

Usability Test Setup Overview

  • Participants: 10 students (5 per round) from diverse programs, ages 25-53
  • Format: Mix of remote and in-person moderated sessions
  • Documentation: All sessions recorded for observation
  • Duration: 15-30 minutes per session
  • Gender: 60% female, 40% male

Tools & Tasks

  • Core tasks: Navigation to evaluation pages, using "Add to Calendar" feature, exploring content structure
  • Capture method: Zoom for screen sharing and audio recording
  • Documentation: Notes logged in Notion
  • Platform: Interactive Figma prototypes

What We Measured

  • Task completion rate: % of participants who successfully completed tasks
  • Number of clicks: Number of clicks needed to complete actions
  • Error rate: Frequency of mistakes or incorrect paths
  • Time on task: Efficiency of task completion

Observations

  • Test 1: Participants frequently went to the FAQ sections to resolve the tasks.
  • Test 2: Overall participants were able to navigate the interface with more ease.

Participant Quotes

  • "I thought I could just click add to calendar and the it would know my dates."
    Participant 3, Test 1
  • "I love this, If I could just have a way to do this for assignments, that would be awesome."
    Participant 2, Test 2

Key Metrics

Metric | Round 1 | Round 2 | Change
Avg. calendar-task errors | 2.4 errors/user | 0.6 errors/user | Reduced by 75%
Success rate | 4/5 passed | 5/5 passed | Increased by 20%

Insight To-Feature Mapping

Research Insight | Design Response | Success Metric
70% forget to complete evaluations | Calendar integration with customized reminders | Reduced no-shows (measured by the 23% completion-rate increase)
Students don't see the impact of their feedback | Faculty testimonials and a "changes made" showcase | Improved perceived value (measured by positive sentiment in testing)
80% prefer mobile completion | Mobile-first design with thumb-friendly UI patterns | Faster completion time (22% reduction in task time)
Confusion about the evaluation timeline | Course-specific evaluation-window indicator | Lower error rate (15.2% fewer errors in timeline-related tasks)
"Design is not art. It is about crafting solutions to real issues."
- Mark Boulton, Founder of Mark Boulton Design

DESIGN SOLUTIONS

We created two core mobile flows to improve evaluation participation

Flow 1 - "Start My Evaluation"

  1. User lands on the welcome page and clicks the prominent "Start My Evaluation" button
  2. System presents a secure login interface
  3. User enters their student credentials (ID and password) and selects "Sign In"
  4. Google Calendar opens showing the evaluation deadline with a 72-hour reminder (customizable notification)
  5. User begins the interactive evaluation process
  6. After submitting all responses, the user receives a confirmation screen with options to either return to their Class Climate dashboard or exit the system

My Design Decision: Calendar Integration

When designing the calendar feature, I made the critical decision to implement a dropdown selection rather than a date picker. This choice was driven primarily by mobile space constraints: research showed 80% of students preferred mobile completion, so a compact control would help students like Taylor. The dropdown not only saved valuable screen real estate but also simplified how students identified their evaluation windows, by course length rather than arbitrary dates.

Flow 2 - "Add to Google Calendar"

  1. User taps the 'View Dates' button to see available evaluation periods
  2. User selects the date range that matches their specific course length and schedule; after selection, the dropdown automatically closes and displays the chosen evaluation period
  3. User taps the 'Add to Google Calendar' button, which immediately adds the event (a sketch of this step follows the list)
  4. Google Calendar opens showing the evaluation deadline with a 72-hour reminder (customizable notification)
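To make step 3 concrete, here is a minimal TypeScript sketch of how the 'Add to Google Calendar' button could build its link using Google Calendar's public event-template URL. The window label and dates are hypothetical; note that this URL format cannot carry a custom 72-hour reminder, so that notification would rely on the user's calendar defaults or a server-side Calendar API call.

  // Builds an "add event" link using Google Calendar's event-template URL.
  // All values below are placeholders; real windows would come from ACC's schedule.
  interface EvaluationWindow {
    label: string; // shown in the 'View Dates' dropdown, keyed by course length
    start: string; // YYYYMMDD
    end: string;   // YYYYMMDD (exclusive for all-day events)
  }

  function addToGoogleCalendarUrl(win: EvaluationWindow): string {
    const params = new URLSearchParams({
      action: "TEMPLATE",
      text: `ACC course evaluation (${win.label})`,
      dates: `${win.start}/${win.end}`,
      details: "Complete your course evaluation before the window closes.",
    });
    return `https://calendar.google.com/calendar/render?${params.toString()}`;
  }

  // Example: the option a student might pick for 16-week courses.
  const url = addToGoogleCalendarUrl({
    label: "16-week courses",
    start: "20220425",
    end: "20220510",
  });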

BEFORE & AFTER SCREENS

The screens below compare ACC’s original interface to the final design after two rounds of usability testing

What students see above the fold

Before

  • Promotes gift card incentives
  • Confusing user journey with no guidance or contextual framing
  • Lacks clear explanation of how and why to complete evaluations

After

  • Opens with a focused message on the value of evaluations
  • Features a carousel with how-to videos and student testimonials
  • Includes a prominent CTA button that links directly to the evaluation portal

Navigating the feedback window

REFLECTIONS

This project fundamentally shaped how I approach the UX process when designing for motivation and engagement, in EdTech and in design more broadly.

Reducing Cognitive Load as a Design Strategy

Working on the calendar feature transformed my approach from chasing aesthetic perfection to systematically reducing decision complexity for users, particularly after seeing how students struggled with identifying their evaluation windows rather than with the interface itself. This shift has made me a more strategic designer who prioritizes cognitive simplicity in all my work. The calendar integration also revealed how system boundaries (like connecting evaluation platforms with personal calendars) are critical trust-building opportunities where users form lasting impressions about reliability, fundamentally changing how I approach integration points in my designs.

The Impact of Continuous & Open Communication

The most challenging aspect of this project was changing my mindset about how to communicate with my teammates and stakeholders. Initially I thought small updates here and there would work, but during research planning I quickly found we needed to "over-communicate" ideas, questions, concerns, and even small wins. I also learned the importance of speaking stakeholders' language, particularly when translating user research into compelling narratives.

Balancing Academic & Real-World Contexts

Working on a capstone project that wouldn't be immediately implemented taught me how to design with both educational and professional standards in mind. Though our solutions wouldn't go live, we approached every decision as if it would, consulting with real stakeholders and gathering authentic user feedback. This dual perspective enhanced my ability to balance academic rigor with practical application—a skill that's proven valuable as I've transitioned to professional UX work.

Looking Forward

The Teaching & Learning Excellence Division expressed interest in incorporating elements of our design into their future system upgrades, particularly the calendar integration and impact visualization features. If this were to come to fruition, I would make sure to conduct:

  • Longitudinal A/B testing to measure which impact stories and testimonials drive the highest conversion
  • An accessibility audit to ensure screen-reader compatibility for all interactive elements

TL;DR – Quick Recap

The Problem

Course evaluation participation dropped from 94% to 34% after going digital. Students forgot or didn’t see the value.

My Role

Led UX design and strategy, user research, usability testing, and stakeholder communication.

What I Did

Conducted user interviews, created personas, mapped journeys, built a calendar integration, and validated with 10 users across 2 test rounds.

Impact

The 23% lift in task completion fosters student trust and retention, supporting ACC's equity-focused enrollment plan of reaching 85,000 students by 2030.
