
Altus Insights

I worked with the product team at Altus Assessments to define the product vision and experience for a new analytics platform that would expand the company into new markets.

[Image: Altus Insights product mockup]

Context

Altus Assessments started with a single test, Casper, which sees 120,000 applicants every cycle. Our primary customer base is undergraduate medical admissions teams. We now provide three types of assessments (Casper, Snapshot, and Duet) that generate unique data points for programs to use in their admissions process.

Problem statement

We provide our customers with a sheet of data in .csv format, and they are expected to perform their own analysis on their own time.

With the launch of two new assessments, there are now a multitude of ways to use these tools to evaluate an applicant, and our customers are getting overwhelmed and confused about how to use these data points.

With this new product, we hoped to help our customers understand how and when to use the data that Altus provides in their admissions process, so that they feel confident about the students they bring into their program.

Product discovery

To help our customers understand how and when to use the data, we started by understanding what their current process looks like and where their pain points are.

Over the last six months, I studied and interviewed about 40 university admissions teams, using a combination of one-on-one interviews, feedback sessions, prototypes, and card sort exercises to understand the scope of the problem space and where our product priorities lie.


In addition, I created a visual map of the customer journey to help the team understand:

1) what the admissions process looks like

2) where our customers should use these data points to get the best value

[Image: program admissions journey map]

Most programs' admissions processes follow a general pattern containing these critical parts.

[Image: overview of the admissions process]

Its goal is to get from 20,000 applicants to 200 seats as efficiently, accurately, and equitably as possible.

The admissions team primarily uses two ways to evaluate applicants: 1) quantitative analysis, for example looking at scores, establishing threshold cutoffs, or ranking applicants from highest to lowest; and 2) qualitative analysis, for example looking at the applicants themselves through interviews and detailed reviews of their admissions files.
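To make the quantitative side concrete, here is a minimal sketch of the kind of spreadsheet work a program might do today. The column names, cutoff values, and slot counts are all hypothetical, not Altus's actual data schema:

```python
import pandas as pd

# Hypothetical applicant file; these column names are illustrative,
# not Altus's actual schema.
applicants = pd.read_csv("applicants.csv")  # e.g. columns: id, gpa, casper_score

# 1) Threshold cutoff: drop anyone below a minimum GPA.
GPA_CUTOFF = 3.5
pool = applicants[applicants["gpa"] >= GPA_CUTOFF]

# 2) Rank the remaining pool from highest to lowest Casper score and
#    keep enough applicants to fill the interview slots.
INTERVIEW_SLOTS = 400
shortlist = pool.sort_values("casper_score", ascending=False).head(INTERVIEW_SLOTS)

print(f"{len(applicants)} applicants -> {len(shortlist)} invited to interview")
```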

A mighty team of 2

[Image: the admissions team of two]

And most, if not all, of the time, it's a team of two doing this: a program director and an admissions assistant/program coordinator.

They present their process and findings to an admissions committee and the dean of admissions, who can also interview applicants. Every few years they review their admissions process internally and with a regulatory body to make sure it is up to standard.

Larger programs have a data scientist on hand to help with the analysis, but the majority of our customer base (60%) is smaller programs, where this team of two does everything and is always time-crunched.

This is a snapshot of what our end-users are like:

[Image: end-user persona snapshot]

Current challenges

So we primarily talked to the team of two involved in the day-to-day operations of the process. These are the challenges our end users face on a daily basis.

  1. How do I incorporate all these assessments into my existing process, and when is the best time to use them?

  2. Am I selecting the right candidates? Is my selection process defensible? Are all my data points reliable? For example, reference letters are not an accurate indicator of a suitable applicant, but there is no way of knowing how removing one of these tools would affect the applicant pool.

  3. Do I have a fair process that selects a healthy balance of ethnicities, genders, and socioeconomic statuses? Right now most programs only look at these aspects toward the end of the admissions process, not throughout, because it's hard to get a big-picture view of how the demographics change.

Defining the product vision

[Images: design sprint artifacts]

Before tackling these challenges, it was important to understand the vision and expectations for this product: the big picture. I partnered with the VP of Strategy to conduct a Google Ventures-style design sprint with multiple team leads, including the senior executive team, the CSM team, marketing, subject matter experts, and the engineering and data science teams.

My goals were to: 1) align everyone on the vision of Altus Insights ("How might we help our customers understand the data they are using without being experts?"), and 2) gather as much domain knowledge as possible from the data and admissions experts to tackle these challenges.

A guide to using the data

Through these product discovery exercises, I designed a decision support system that takes in data from Altus, such as test results and demographic data, and associates it with unique data provided by our customers.

To help program admins navigate different types of data, I introduced a recommendation process, where administrators answer a couple of questions about their program and then see a recommended way of using our assessments. This recommendation paints a picture of how they can use Altus products in their current admissions process to select the best students.
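As a rough illustration of that flow, here is a minimal sketch in which answers to two program questions select a suggested sequence. The questions, thresholds, and wording are illustrative placeholders, not the logic we shipped:

```python
# Hypothetical recommendation flow: two answers about the program map to
# a suggested assessment sequence. All thresholds and text are made up.

def recommend(applications_per_cycle: int, interviews_applicants: bool) -> list[str]:
    plan = []
    if applications_per_cycle > 5000:
        plan.append("Apply Casper early, as a first-pass screen on the full pool.")
    else:
        plan.append("Review Casper scores alongside the full application file.")
    if interviews_applicants:
        plan.append("Weigh the remaining assessments before the interview stage.")
    else:
        plan.append("Weigh the remaining assessments at final ranking.")
    return plan

for step in recommend(applications_per_cycle=8000, interviews_applicants=True):
    print("-", step)
```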


For a user to truly understand the recommendation, it is best seen in context with real data. So the next piece of the puzzle was to let users take the recommendation and apply it to their own data, forming a unique evaluation funnel that tells them exactly which data points to look at, and at which stage of their process.
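Conceptually, the funnel behaves like a sequence of named filters over the applicant pool, with each stage reporting how many applicants remain. This sketch assumes hypothetical column names and criteria:

```python
import pandas as pd

# Minimal funnel sketch: each stage is a named filter applied to the pool,
# so the user can see exactly where applicants drop out. Column names and
# criteria are hypothetical.
applicants = pd.read_csv("applicants.csv")

stages = [
    ("GPA cutoff",       lambda df: df[df["gpa"] >= 3.5]),
    ("Casper threshold", lambda df: df[df["casper_score"] >= df["casper_score"].quantile(0.25)]),
    ("Interview invite", lambda df: df.nlargest(400, "casper_score")),
]

pool = applicants
for stage_name, criterion in stages:
    before = len(pool)
    pool = criterion(pool)
    print(f"{stage_name}: {before} -> {len(pool)} applicants remain")
```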

Contextualize the impact of selection criteria

In addition, a customer can look at the demographic composition of their class, see how their admissions criteria at each step affect it, and then see how the results Altus provides help them increase that diversity.
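A minimal sketch of that demographic view, reusing the same hypothetical funnel idea: after each stage, break down the remaining pool by a demographic field (here a made-up "gender" column) to see how the criteria shift the composition:

```python
import pandas as pd

# After each funnel stage, report the demographic share of the remaining
# pool. The "gender" column and the criteria are illustrative stand-ins
# for whatever fields a program actually tracks.
applicants = pd.read_csv("applicants.csv")

stages = [
    ("GPA cutoff",       lambda df: df[df["gpa"] >= 3.5]),
    ("Interview invite", lambda df: df.nlargest(400, "casper_score")),
]

pool = applicants
for stage_name, criterion in stages:
    pool = criterion(pool)
    share = pool["gender"].value_counts(normalize=True).round(2)
    print(f"After {stage_name}: {share.to_dict()}")
```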

During product discovery, another interesting thing I learned was that some medical schools compete for the same applicant pool, each trying to get the best applicants in that cohort. Their primary interest is how they compare to other schools in the same state: whether the selection criteria they set are too low or too high to attract the right applicants.

So in addition to visualizing demographic composition, I designed a way for medical schools to compare themselves against others across states. They can then use this information to adjust their selection criteria for the next year.

Validate with our end users

I sat down with some of our end-users to walk them through the designs.

My goal was to present these initial ideas as design prototypes to program administrators and identify the highest-value wins for our programs. The designs helped anchor the conversations, keeping them focused and explicit about the ideas we were trying to achieve.

What they most commonly reported as valuable:

  1. Being able to tweak metrics and see the effects on the applicant pool as well as the effects on demographics

  2. Ability to look at the composition of the applicant pool to determine the effectiveness of thresholds, tools, recruitment efforts, etc.

  3. Being able to easily report this internally and externally

    1. Defend process internally and to accreditation bodies

    2. Defend selection of individual applicants

    3. Market/recruit for the program

  4. Being able to pinpoint which step in their process is creating issues, or whether they have a problem at all

  5. Knowing what other programs are doing in comparison, since some programs compete for the same applicant pool

Using these designs and engaging with our customers, I helped guide our team toward a unified vision. The designs kept the team focused on the big picture and informed what we should plan for the next quarter.