SEQL athlete discovery app — rebuilding a broken recruiter tool from the ground up

Turned an MVP that shipped without user research into a trusted recruiter platform by solving the data credibility problem at its root — through a connected athlete app that keeps profiles accurate over time.
Role
Lead Designer + Researcher
Product type
Consumer Desktop
Timeline
Mar 2021 - Sept 2021
MacBook mockup
01 - the context

An MVP that launched before anyone talked to a user

College recruiting is a numbers problem. Out of roughly 8 million high school athletes in the US, fewer than 500,000 will compete at the collegiate level. Recruiters are trying to find the right ones — but the tools available to them were making that harder, not easier.

SEQL had an idea worth building: a recruiter-facing platform with verified athlete data, real search tools, and a direct connection to athlete profiles. The MVP had been built by an external agency and shipped — without user research or usability testing.


When I joined, my brief was to diagnose what was broken and rebuild the experience around what recruiters actually needed.

02 - the users

Three distinct recruiter types — each with different budgets, sports, and tracking needs

We interviewed recruiters with varying levels of seniority and experience, and quickly found that "recruiter" wasn't a monolith. Three distinct profiles emerged, each with meaningfully different constraints. For the MVP, we focused on revenue sport recruiters — the highest-need, highest-budget segment — and scoped non-revenue sport support for post-launch. That was a deliberate prioritization decision, not an oversight.
Breaking down recruiter needs by sport and division informed three distinct user archetypes

D1 Revenue sport recruiter

Football, basketball. Larger discretionary budgets. Recruits for one sport only. Often tracking athletes from middle school onward — needs to group and tier prospects by graduation year, position, and degree of interest over a multi-year horizon.

"I'm managing relationships that span years. I need to know exactly where every prospect stands at any point in time."

D2 recruiter

Football, basketball, baseball. Similar tracking needs to D1 but with less budget. More price-sensitive about platform costs and less tolerant of paying for data they can't verify.

"I can't afford to waste a recruiting visit on someone whose profile turned out to be outdated."

Non-revenue sport recruiter

Limited budgets. Often recruits across multiple sports simultaneously. Has unique tracking and filtering needs that existing platforms — built around revenue sport requirements — completely ignore.

"Every tool is built for football. My filters don't even exist on most platforms."

03 - the problem

The existing tools were expensive, fragmented — and actively distrusted

"Athlete-focused programs like BeRecruited prey on families who don't understand the process, and use statistics that aren't verifiable for the recruiter. The emails are typically immediately deleted when received. Kids are spending thousands of dollars a year that, from the college recruiter perspective, are worthless."

— D1 recruiter, user research interview

This quote wasn't an outlier. Across our interviews, the same pattern emerged: recruiters had become deeply skeptical of athlete profile platforms as a category — not just the bad ones, but all of them. That skepticism was the central problem the product had to solve.

Beyond this deeply ingrained mistrust in recruiting products, the existing application had problems fundamental enough that it needed to be replaced entirely.

Dashboard mockup
Problem 1

Built on the wrong assumptions

Map-first search and a prominent chat feature reflected the same fundamental misunderstanding of the recruiting environment. Recruiters search by attributes, not location. And direct recruiter-to-athlete communication outside official channels is NCAA-regulated — making chat not just unhelpful but a feature that couldn't legally ship.

Two different screens, one root cause: the product was designed without recruiter input.

Problem 2

Filtering too shallow to be useful

Recruiters track hundreds of students simultaneously across multi-year pipelines. They search by attributes, not names, and need multi-tiered groupings by graduation year, position, and custom attributes.

The existing board couldn't support any of that at scale.

Problem 3

A trust deficit across the board

Profiles were user-sourced, rarely updated, and unverified. Recruiters had learned not to rely on what they saw. There was no annotation, no verification layer, no way to request updates — no mechanism to make the data trustworthy over time.

04 - the opportunity

Reframing the problem as a system, not a single product

To restore trust, we needed to rethink how data moved through the system. We weren't designing a better profile viewer — we were designing a two-sided system that keeps athlete data accurate over time.
Dashboard mockup
Opportunity 1

Attribute-first search

Replace the map default with sport-specific, stat- and measurable-driven filtering as the primary entry point.

Opportunity 2

A verification layer

If recruiters could flag, annotate, and request verification of athlete data, profiles would improve over time rather than stagnate.

Opportunity 3

Replace the spreadsheet

A built-in prospect board organized by priority, position, and graduating year could make SEQL the single place recruiting decisions lived.

05 - the design process

Two research tracks running concurrently

I structured discovery in two parallel tracks: a heuristic audit of the existing MVP, and user sessions with active college recruiters — including sessions where recruiters reviewed competitor platforms directly while we observed and took notes. Running them simultaneously let me separate execution problems from structural ones quickly.

Heuristic audit of the MVP

A systematic review of the existing product against usability principles gave the team a shared, documented baseline — rather than competing intuitions about what was broken. It also surfaced the map-first issue explicitly: the default view was oriented around geography when every recruiter we'd spoken to searched by attributes first.

Recruiter interviews

Sessions with active college recruiters focused on their actual workflows — how they currently found and tracked prospects, what they'd tried before, and crucially, what would need to be true for them to trust a platform enough to replace their spreadsheets. The trust findings came directly from these conversations and reshaped the product direction entirely.

Designing in parallel with the athlete app

The recruiter platform and the SEQL Athlete App were being designed simultaneously. Decisions I made on the recruiter side — what data fields to surface, how verification requests would work — had direct consequences for what the athlete app had to support, and vice versa. Managing that cross-product dependency shaped how I approached handoff documentation to ensure both sides of the system would work as intended when built.

06 - the solution

A recruiter platform built on a two-sided data architecture

The most important design decision wasn't a screen — it was the system. The Athlete Discovery App and the SEQL Athlete App were built to work in tandem: athletes update their profiles in the athlete app, and that data flows directly into the recruiter platform. When a recruiter requests missing information, a push notification goes to the athlete to update their profile.
Dashboard mockup

Sport-specific athlete search

A discovery page with filtering that adapts to the sport — position, height, wingspan, graduating year, and attributes that vary by discipline. A football recruiter's filters look meaningfully different from a diving recruiter's. Generic search was useless for real recruiting decisions; specificity was the whole point.

Dashboard mockup

Verified data layer with shared annotations

Rather than treating all athlete data as equivalent, the platform surfaces verified versus unverified information transparently. Recruiters can add notes — particularly useful after scouting events — request verification of specific data points, and flag discrepancies. Critically, notes and annotations are shared across recruiter profiles within the same program: Recruiter A and Recruiter B always see the most up-to-date information on any athlete, eliminating the siloed knowledge problem that made spreadsheets necessary in the first place.

Recruiters can:

  • Add notes
  • Flag discrepancies
  • Share insights across staff

👉 Turning profiles into collaborative, evolving records

Dashboard mockup

Information request flow

Athlete profiles were often missing the details that mattered most — GPA, transcript records, current measurables like a 40-yard dash time. Rather than leaving recruiters to work around the gaps, the platform lets them request specific missing information directly. A push notification goes to the athlete, nudging them to update their profile. This keeps records accurate and shifts the burden of maintenance off the recruiter without relying on athletes to self-manage proactively.

How it works

  • Athletes update their profiles in the athlete app
  • Data flows directly into the recruiter platform
  • Recruiters request missing or updated information
  • Athletes receive notifications and respond

👉 This creates a continuous feedback loop instead of static profiles
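The loop above can be sketched as a tiny state machine. Everything here (class names, fields, statuses) is an illustrative assumption, not SEQL's actual code:

```python
# Minimal sketch of the information-request feedback loop.
from dataclasses import dataclass
from enum import Enum


class RequestStatus(Enum):
    SENT = "sent"            # recruiter asked for missing data
    NOTIFIED = "notified"    # athlete received the push notification
    FULFILLED = "fulfilled"  # athlete updated their profile


@dataclass
class InfoRequest:
    athlete_id: str
    fields_needed: list      # e.g. ["gpa", "forty_yd_dash"]
    status: RequestStatus = RequestStatus.SENT


def notify_athlete(req: InfoRequest) -> None:
    # In the real system this would fire a push notification through
    # the athlete app; here we just advance the request's state.
    req.status = RequestStatus.NOTIFIED


def fulfill(req: InfoRequest, profile: dict, updates: dict) -> None:
    # The athlete's update flows straight back into the recruiter
    # platform, so the profile the recruiter sees is always current.
    profile.update(updates)
    req.status = RequestStatus.FULFILLED
```

A request moves SENT → NOTIFIED → FULFILLED, and the profile update is the side effect that closes the loop.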

Dashboard mockup

Prospect tracking board

A prospect board organized by priority level, sport, position, and graduating year — designed to replace the spreadsheet entirely. D1 revenue sport recruiters tracking athletes from middle school onward can now group and filter their full pipeline in one place: all 2024 Top Prospects at Running Back, instantly. No open tabs, no manual spreadsheet updates, no duplicated effort between staff members.

  • Group by year, position, priority
  • Shared visibility across recruiting staff
  • Real-time updates as athlete data changes
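The board's grouping logic can be sketched in a few lines; the data shape and names below are hypothetical, not the platform's real model:

```python
# Illustrative sketch of prospect-board grouping over an in-memory
# list of prospects; not the platform's actual code.
from collections import defaultdict


def group_prospects(prospects, keys=("grad_year", "position", "priority")):
    """Bucket prospects under keys like (2024, 'RB', 'Top')."""
    board = defaultdict(list)
    for p in prospects:
        board[tuple(p[k] for k in keys)].append(p["name"])
    return dict(board)


prospects = [
    {"name": "A. Smith", "grad_year": 2024, "position": "RB", "priority": "Top"},
    {"name": "B. Jones", "grad_year": 2024, "position": "RB", "priority": "Top"},
    {"name": "C. Lee", "grad_year": 2025, "position": "QB", "priority": "Watch"},
]

# "All 2024 Top Prospects at Running Back, instantly":
top_rbs_2024 = group_prospects(prospects)[(2024, "RB", "Top")]
```

Because every recruiter on a staff reads from the same shared board rather than a private spreadsheet, the grouping is computed once and stays consistent for everyone.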

07 - metrics

A structural problem solved, not just a UI refreshed

The redesigned app shipped as the beta release with measurably improved usability scores over the original MVP. More significantly, the two-sided architecture — athlete app feeding recruiter app, recruiters pushing information requests back to athletes — addressed the data trust problem that no amount of UI improvement could have fixed on its own. Post-launch feedback from recruiters indicated the prospect board reduced reliance on manual spreadsheets, and the shared annotation system eliminated the duplicate effort that came from multiple recruiters tracking the same athlete independently.

What I'd do differently

The original MVP launched without usability testing — meaning I inherited problems that could have been caught much earlier. I'd advocate harder for lightweight testing at the MVP stage on future projects; fixing structural issues post-launch is substantially more expensive than catching them in a prototype. I'd also push for athlete-side testing earlier in the process. The recruiter experience was only as good as the data feeding it, and we didn't fully pressure-test that dependency until we were already building.