TEACH.org is a non-profit organization dedicated to increasing the number of teachers in the U.S. TEACH offers several resources to make the application process for Educator Preparation Programs (EPPs) streamlined and manageable, the biggest being their Application Checklist. They also offer resources such as 1:1 coaching with an advisor, scholarships, and application fee reimbursements to encourage prospective teachers to finish the application process.
I was the lead UX Researcher on this project. I was in charge of planning and facilitating our research efforts (competitive analysis, user interviews, surveys, and usability testing). I also compiled and analyzed the findings from our research activities and wrote the Final Research Report for our client at the end of the project.
Many users who create application checklists don’t come back: only 13% of users view their checklists more than three times within a three-month period. Our team was tasked with redesigning the Application Checklist and encouraged to make TEACH’s other resources more findable and discoverable.
Before diving into the Application Checklist redesign, we wanted to look at other websites similar to TEACH. We also wanted to talk with teachers about their experiences in the profession, motivations behind wanting to become a teacher, and experiences with the application process. We decided to do a competitive analysis, user interviews, and deploy a survey.
We looked at nine sites dedicated to applying to teaching programs, as well as sites that help users complete a task they don't do often.
We also wanted more information about what motivates someone to become a teacher, what factors are most important in choosing a preparation program, the pain points of the application process, and helpful resources when applying. We conducted four user interviews with current and former K-12 teachers. We also deployed a survey and received 42 responses from current and former teachers across the U.S. From here, we created an affinity diagram consisting of over 100 cards to help us see the themes in our responses.
Up to this point, we had put off defining a specific problem statement because of the ambiguity of what the problem actually was. However, as we began our ideation phase, it became increasingly apparent that we needed to narrow the scope of the project to something attainable in the remaining weeks. We each came up with an array of How Might We (HMW) statements using the insights from our research as a guide. Coming back together, we realized all of our HMW statements had similar themes:
To start generating solution ideas, we ran a Google Ventures Design Sprint exercise with our problem statement in mind. Each of us ended up with a 3-pane mini storyboard of how TEACH could support and encourage applicants throughout the process. I sketched rapidly through the Design Sprint exercise and ended up with my first version sketches.
I wanted to motivate the user to continue the application process by creating an interactive map for them to move through. I broke the application down into manageable chunks, with an individual page devoted to each step of the process; each page included the TEACH resources applicable to that step. The user also had an account page where they could clearly see which part of the application they were on for each program.
Next, I moved my iteration efforts to Figma, incorporating the UI library TEACH provided us into my designs.
I continued my idea of breaking up the application by grouping similar steps together; however, I decided to forego the whimsical approach of an interactive map because I wanted my designs to fit the professional tone of the rest of the TEACH website. I also considered that a more straightforward approach might be easier for the developers at TEACH to implement if they wanted to adopt our designs. I began brainstorming application groupings as well.
Each grouping is still broken out into another page with more information about how to complete that step. Resources and guides related to each step are found on this page. Users will also be able to see which categories they have completed on the side of the page to increase motivation by showing them what they've already accomplished.
In their account, users are still able to see what part of the application process they are on for each of their applications. They can easily continue their application or start an application for a new program.
I removed the help text from this iteration and instead incorporated an ever-present floating help button on each page with applicable TEACH resources that will display when clicked.
In the end, we took bits and pieces from each team member’s solutions and settled on a final flow to focus our design efforts on.
Once we had decided on the focus of our flow, we ran moderated benchmark testing on the TEACH site. We wanted to gauge user activity around comparing programs, interacting with application checklists, and locating TEACH resources. We also wanted to identify benchmark metrics to test our prototypes against. Four participants who had applied to an academic program in the last 10 years completed our testing sessions. During the sessions, we had participants complete tasks that guided them through each part of the flow we were focused on redesigning.
During our benchmark testing, we had participants compare two different programs using TEACH's Educator Prep Program (EPP) Explorer and start an application checklist for one of the programs. It became immediately apparent that participants had difficulty comparing programs. School cards display the program types offered, but to get to more information about the school or programs, users have to click into the school page. Each participant used a different method to compare the two programs, such as opening the programs in different tabs or starting new checklists to compare application steps. We realized this may also be contributing to the sharp drop-off of users returning to their already-started checklists.
First, we decided to break down the EPP Explorer cards by program instead of by school. We focused on putting important, easily scannable program information on each card. We also implemented a compare feature allowing users to compare up to four different programs. The compare page shows the most vital information for each program, as indicated by our participants during our initial research and benchmark testing. We encourage users to start a checklist for the program or learn more about the school at the bottom of the compare page. We think this will also reduce the number of checklists that are started and abandoned by allowing users to easily see program details side-by-side.
During our benchmark testing, we had participants navigate to an already-started checklist, comment on their progress, and tell us how much of the application they had left to complete. Several participants expressed concern that they couldn't tell how much longer the application would take because there was no indication of how long individual tasks take to complete.
We grouped all components of the application into six overarching categories to make the application process feel more manageable. Our Application Overview page is ground zero for each application checklist, but users can see each application step laid out in the Application Summary page. Users are able to start or continue where they left off on the checklist from either page.
We wanted to help users manage their time by creating a progress bar with upcoming deadlines. Deadlines are also displayed next to their related tasks on the Summary page. To give users an idea of how long a task will take to complete, we categorize each task as a low, medium, or high time investment. Only steps falling into the medium or high categories are shown on the Overview page so users can anticipate which tasks will take longer to complete. Definitions of each time category can be found on individual step pages.
Individual application steps were previously found by expanding application sections. Users had to manually click into each section and check off each completed task to advance their application progress. TEACH mentioned that many users weren't engaging with the checkbox after completing application steps.
During our benchmark testing, we asked participants to complete an application task. Several participants had to be prompted to advance their progress by clicking the checkbox, stating that they didn't notice the checkbox until prompted.
We also had participants find two TEACH resources from the application checklist. Several participants struggled to find the resources, needing prompting to complete the task.
We decided to guide users through each of the six main application categories using a wizard-like approach. When they start a new category, users will see an overview of each step in the section, with a time estimate of how long it will take to complete. They are guided through the category by either completing or skipping the step. Each step has detailed information on how to complete it, with applicable resources displayed on the side. After completing each step of the category, users are met with a congratulatory message for coming one step closer to becoming a teacher. They can also share their application progress with friends and family on social media. At the end of each section, users can either continue onto the next section or go back to the Overview page.
We decided to also create a help bar displayed on the bottom of each page of the checklist. Because talking with someone during the application process was such an important finding during our initial research, we decided to always show 1:1 coaching in the help bar. Users can select Guides to find more information on other TEACH resources.
We created prototypes of our designs and tested four more participants who had applied to an academic program in the last 10 years. We kept the same tasks and metrics from our benchmark testing to gauge the success of our designs.
We also had participants in both our benchmark and prototype testing groups complete the System Usability Scale (SUS) questionnaire at the end of their session. The average SUS score for our prototype increased by 11.25 points, placing it in the "best imaginable" range and corresponding to an A usability grade.
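For context on how those scores are derived, SUS is scored on a standard 0–100 scale from ten alternating positively and negatively worded Likert items. A minimal sketch of the standard scoring rule (the function name and the sample responses are illustrative, not our participants' data):

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) are positively worded, so each
    contributes (response - 1); even-numbered items are negatively
    worded, so each contributes (5 - response). The sum of the ten
    contributions is scaled by 2.5 to land on a 0-100 scale.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-indexed
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A hypothetical participant: 4 on every positive item, 2 on every
# negative item -> (5*3 + 5*3) * 2.5 = 75.0
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Because each item contributes at most 4 points before scaling, an 11.25-point average gain corresponds to roughly a full step on several items across the questionnaire.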
“I really really like the site. It makes me really want to sign up [for an account]. It’s tailored to me, it’s everything and the details I care about. It quells the anxiety in me. It’s delightful.”
- P5, Prototype Testing
"You all did an excellent job. We’re already putting together some concrete next steps about how to bring your research and designs to life.”
- Chief Product Officer, TEACH.org
"I loved your designs – they were clean, modern, and user friendly. The information on the pages/tiles and the workflow was very thoughtful. I am impressed with what you achieved in 10 weeks.”
- Product Manager, TEACH.org
The biggest lesson I learned from this project was how important it is to have a properly defined problem statement before diving into the design phase. Once we had settled on our single HMW statement, the rest of the project became much easier to finish.
Going into this project, I was also anxious because not only was it my first time working with a client, but the work also had to be done completely remotely. I learned a lot about how to communicate and work collaboratively relying only on our remote tools. This ended up being a blessing in disguise: not only were we able to touch base at the drop of a hat, we were all able to collaborate on every part of the project.