01 - Overview

TEACH.org is a non-profit organization dedicated to increasing the number of teachers in the U.S. TEACH offers several resources to make the application process for Educator Preparation Programs (EPPs) streamlined and manageable, the biggest being its Application Checklist. It also provides 1:1 coaching with an advisor, scholarships, and application fee reimbursements to encourage prospective teachers to finish the application process.

My Role

I was the lead UX Researcher on this project. I was in charge of planning and facilitating our research efforts (competitive analysis, user interviews, surveys, and usability testing). I also compiled and analyzed the findings from our research activities and wrote the Final Research Report for our client at the end of the project.

Client

TEACH.org

My Role

UX Research
Interaction Design

Team

Jennifer Hicks, UX Designer
Lilian Szugyi, UX Designer
Matt Ropel, UX Designer

Duration

March - June 2020

Tools

Figma
Notion
Google Meet and Zoom

The Challenge

Many users who create application checklists don’t come back, with only 13% of users viewing their checklists more than 3 times within a 3-month period. Our team was tasked with redesigning the Application Checklist and encouraged to make TEACH’s other resources more findable and discoverable.

Constraints

Timeline

We needed to complete our work within a 10-week timeline.

TEACH Users

We didn’t have access to users who had interacted with TEACH.org. We worked around this by talking to current and former teachers during our user research phase and recruiting usability testing participants who had applied to an academic program within the last 10 years.

Coronavirus

Because of the Coronavirus pandemic, all work on this project had to be done remotely. Our team used Notion and Google Meet to collaborate throughout the project. All of our user interviews and usability testing sessions were conducted over either Zoom or Google Meet.

The Process

The following is the process we went through:

Research

Competitive Analysis
User Interviews
Survey
Affinity Diagramming

Ideate

HMW Statements
Rapid Sketching
Low-Fidelity Wireframing
Feedback from the Client

Design

Iterations
High-Fidelity Wireframing
Interactive Prototyping

Validate

Benchmark Usability Testing
Prototype Usability Testing

02 - Identifying the Problem

Before diving into the Application Checklist redesign, we wanted to look at other websites similar to TEACH. We also wanted to talk with teachers about their experiences in the profession, motivations behind wanting to become a teacher, and experiences with the application process. We decided to do a competitive analysis, user interviews, and deploy a survey.

How Do Competitor Websites Handle an Application Process?

We looked at nine sites dedicated to applying to teaching programs, as well as sites that help users complete a task they don't do often.

Key Findings:

What Insights Can Teachers Give Us?

We also wanted more information about what motivates someone to become a teacher, what factors are most important in choosing a preparation program, the pain points of the application process, and helpful resources when applying. We conducted four user interviews with current and former K-12 teachers. We also deployed a survey and received 42 responses from current and former teachers across the U.S. From here, we created an affinity diagram consisting of over 100 cards to help us see the themes in our responses.

Key Findings:

Our remote affinity diagram helped us see trends in our research.

03 - Coming Up With Solutions

Up to this point, we had put off defining a specific problem statement because of the ambiguity of what the problem actually was. However, as we began our ideation phase, it became increasingly apparent that we needed to narrow the scope of the project to something attainable in the remaining weeks. We each came up with an array of How Might We (HMW) statements using the insights from our research as a guide. Coming back together, we realized all of our HMW statements had similar themes: 

We workshopped our HMW statements until we arrived at one main problem statement.

Problem Statement

How might we create a simple, engaging, and manageable application process so that students can become teachers?

The Motivation Map

To start generating solution ideas, we ran a Google Venture Design Sprint exercise with our problem statement in mind. Each of us ended up with a 3-pane mini storyboard of how TEACH could support and encourage applicants throughout the process.  I sketched rapidly through the Design Sprint exercise and ended up with my first version sketches.

I wanted to motivate the user to continue the application process by creating an interactive map for them to move through. I broke the application down into manageable chunks, with individual pages devoted to each step of the process and links to the TEACH resources applicable to that step. The user also had an account page where they could clearly see which part of the application they were on for each program.

One of my first version sketches.

Refining My Solutions

Next, I moved my iteration efforts to Figma, incorporating the UI library TEACH provided us into my designs.

I continued my idea of breaking up application steps by grouping similar steps together; however, I decided to forgo the whimsical approach of an interactive map because I wanted my designs to fit the professional tone of the rest of the TEACH website. I also considered that a more straightforward approach might be easier for the developers at TEACH to implement if they wanted to adopt our designs. I began brainstorming application groupings as well.

Each grouping is still broken out into another page with more information about how to complete that step. Resources and guides related to each step are found on this page. Users will also be able to see which categories they have completed on the side of the page to increase motivation by showing them what they've already accomplished.

In their account, users are still able to see what part of the application process they are on for each of their applications. They can easily continue their application or start an application for a new program.

I removed the help text from this iteration and instead incorporated an ever-present floating help button on each page with applicable TEACH resources that will display when clicked.

My initial TEACH wireframes.

The Final Flow

In the end, we took bits and pieces from each team member’s solutions and settled on a final flow to focus our design efforts on.

Comparing Programs

Where the user finds information about educator preparation programs and decides to start an application checklist.

Application Steps

A manageable view of application step groupings to make the process seem less daunting and overwhelming.

Individual Steps

A breakdown of steps in the application process with helpful TEACH resources and guides.

Help System

Prevalent and consistent information about TEACH resources throughout the entire application process.

04 - Finalizing Our Designs

What Do Users Think About the Current Site?

Once we had decided on the focus of our flow, we ran moderated benchmark testing on the TEACH site. We wanted to gauge user activity around comparing programs, interacting with application checklists, and locating TEACH resources. We also wanted to identify benchmark metrics to test our prototypes against. Four participants who had applied to an academic program in the last 10 years completed our testing sessions. During the sessions, we had participants complete tasks that guided them through each part of the flow we were focused on redesigning.

Previous EPP Explorer Experience

During our benchmark testing, we had participants compare two different programs using TEACH's Educator Prep Program (EPP) Explorer and start an application checklist for one of the programs. It became immediately apparent that participants had difficulty comparing programs. School cards display the program types offered, but to get to more information about the school or programs, users have to click into the school page. Each participant used a different method to compare the two programs, such as opening the programs in different tabs or starting new checklists to compare application steps. We realized this may also be contributing to the sharp drop-off of users returning to their already-started checklists.

Design Solutions:

  • Make comparing programs easier by displaying important information at the card level.
  • Add a compare feature allowing users to see information from several programs side-by-side.
Previous EPP Explorer experience.
Comparing two programs with the new EPP Explorer.

Our EPP Explorer Redesign

Highlights:

  • EPP Explorer cards by program with easily scannable information.
  • Compare feature allowing users to compare up to four different programs.
  • Compare page with most important program information.

First, we decided to break down the EPP Explorer cards by program instead of by school. We focused on putting important, easily scannable program information on each card. We also implemented a compare feature allowing users to compare up to four different programs. The compare page has the most vital information for each program, as indicated by our participants during our initial research and benchmark testing. We encourage users to start a checklist for the program or learn more about the school at the bottom of the compare page. We think this will also reduce the number of checklists that are started and abandoned by allowing users to easily see program details side-by-side.

Redesigned EPP Explorer and Compare Programs pages.

Previous Application Step Experience

During our benchmark testing, we had participants navigate to an already-started checklist, comment on their progress, and tell us how much of the application they had left to complete. Several participants expressed concern that they couldn't tell how much longer the application would take because there was no indication of how long individual tasks take to complete.

Design Solutions:

  • Combine application steps into manageable sections.
  • Add estimated times for application steps.
  • Add a progress bar with deadlines and important dates.
Previous Application Step experience.
Time Investment definitions are found at the start of each Application Category flow.

Our Application Step Redesign

Highlights:

  • Manageable application step groupings.
  • Overview and Summary pages with varying levels of information.
  • Progress bar with upcoming deadlines.
  • Tasks categorized as a low, medium, or high time investment.

We grouped all components of the application into six overarching categories to make the application process feel more manageable. Our Application Overview page is the home base for each application checklist, but users can see each application step laid out on the Application Summary page. Users are able to start or continue where they left off on the checklist from either page.

We wanted to help users manage their time by creating a progress bar with upcoming deadlines. Deadlines are also displayed next to their related tasks on the Summary page. To give users an idea of how long a task will take to complete, we categorize each task as a low, medium, or high time investment. Only steps falling into the medium or high categories are shown on the Overview page so users can anticipate which tasks will take longer to complete. Definitions of each time category can be found on individual step pages.

Redesigned Application Overview and Application Summary pages.

Previous Individual Step Experience

Individual application steps were previously found by expanding application sections. Users had to manually click into each section and check off completed tasks to advance their application progress. TEACH mentioned that many users weren't engaging with the checkbox after completing application steps.

During our benchmark testing, we asked participants to complete an application task. Several participants had to be prompted to advance their progress by clicking the checkbox, stating that they didn't notice the checkbox until prompted.

We also had participants find two TEACH resources from the application checklist. Several participants struggled to find the resources, needing prompting to complete the task.

Design Solutions:

  • Create a wizard-like experience for each application category to guide users through sections.
  • Make the prompt for users to check off tasks more noticeable.
  • Make TEACH resources easily findable from any page in the checklist.
Previous Individual Step experience.
Individual Step page with applicable TEACH resources displayed.

Our Individual Steps Redesign

Highlights:

  • Wizard-like feature guiding users through application categories with positive reinforcement along the way.
  • Applicable TEACH resources on individual step pages.
  • Help Bar on the bottom of each page of the checklist.

We decided to guide users through each of the six main application categories using a wizard-like approach. When they start a new category, users will see an overview of each step in the section, with a time estimate of how long it will take to complete. They are guided through the category by either completing or skipping the step. Each step has detailed information on how to complete it, with applicable resources displayed on the side. After completing each step of the category, users are met with a congratulatory message for coming one step closer to becoming a teacher. They can also share their application progress with friends and family on social media. At the end of each section, users can either continue onto the next section or go back to the Overview page.

We decided to also create a help bar displayed on the bottom of each page of the checklist. Because talking with someone during the application process was such an important finding during our initial research, we decided to always show 1:1 coaching in the help bar. Users can select Guides to find more information on other TEACH resources.

Individual Step pages working through a category flow.

05 - So, Did Our Designs Solve the Problem?

We created prototypes of our designs and tested them with four more participants who had applied to an academic program in the last 10 years. We kept the same tasks and metrics from our benchmark testing to gauge the success of our designs.

We found that:

  • Using our prototype, users were able to navigate to and compare programs in half the time it took them during benchmark testing.1
  • All participants commented positively on the progress bar and time investment features.
  • Participants enjoyed how our prototype guided them through the application categories.
  • Participants were able to find TEACH resources in less than half the time of benchmark testing.2

1. Average benchmark time = 315 seconds vs. average prototype time = 114 seconds.
2. Average benchmark time = 200 seconds vs. average prototype time = 88 seconds.

We also had participants in both our benchmark and prototype testing groups complete the System Usability Scale (SUS) questionnaire at the end of their session. The average SUS score for our prototype increased by 11.25 points over the benchmark, putting it in the "best imaginable" range and corresponding to an A usability grade.
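For readers unfamiliar with how SUS scores like these are derived: the standard 10-item SUS questionnaire is scored with a fixed formula (odd items contribute response − 1, even items contribute 5 − response, and the sum is scaled by 2.5 to a 0–100 scale). A minimal sketch of that standard formula, not tied to our actual testing data:

```python
def sus_score(responses):
    """Compute a standard SUS score from 10 item responses, each rated 1-5.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The total is multiplied by 2.5 to yield a score from 0 to 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A hypothetical participant answering 4 on odd items and 2 on even items:
print(sus_score([4, 2] * 5))  # 75.0
```

A group's reported SUS score is simply the mean of these per-participant scores, which is how a between-group difference like the 11.25-point increase above is computed.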

The benchmark testing SUS Score vs. the prototype testing SUS score.
I submitted a detailed Research Report with all our research findings to the client at the end of the project.

Feedback

“I really really like the site. It makes me really want to sign up [for an account]. It’s tailored to me, it’s everything and the details I care about. It quells the anxiety in me. It’s delightful.”
- P5, Prototype Testing
"You all did an excellent job. We’re already putting together some concrete next steps about how to bring your research and designs to life.”
- Chief Product Officer, TEACH.org
"I loved your designs – they were clean, modern, and user friendly. The information on the pages/tiles and the workflow was very thoughtful. I am impressed with what you achieved in 10 weeks.”
- Product Manager, TEACH.org

What I Learned

The biggest lesson I learned from this project was how important it is to have a properly defined problem statement before diving into the design phase. Once we had settled on our single HMW statement it was much easier to finish the rest of the project.

Going into this project, I was also anxious because not only was it my first time working with a client, but the work also had to be done completely remotely. I learned a lot about communicating and collaborating using only the tools at hand. This ended up being a blessing in disguise: we could touch base at a moment's notice, and we were all able to collaborate on every part of the project.
