Chegg (Chegg Skills) · Chief of Staff, COO · 2020-2021

Making Job Search a Decision System

Drove adoption of new tooling for job seekers and career coaches, enabling just-in-time interview prep in the near term and surfacing the critical leading and lagging indicators that guided decision-making on major program changes.

Decision Systems · Build vs. Buy · Data Visibility · Internal Tools · Product Operations · Stakeholder Management · Process Design · Zapier · Tableau · Salesforce

Problem Overview: Bootcamp grads not getting hired and no visibility into why

Context

Chegg Skills provided online programs that trained students for entry-level roles as software engineers, data analysts, UX designers, and more. Every program came with six months of dedicated career coaching upon graduation. Moreover, each graduate who maintained a baseline level of job search activity was guaranteed a job within six months, or their tuition would be refunded.

It was clear that graduates were having a rough time: the job market was tumultuous, and there were murmurs across the career coaching team that many of their graduates were either unable or unwilling to conduct a respectable job search.

Visibility for coaches depended on self-reported spreadsheets and manual check-ins, which didn't scale with rapid caseload growth.

Resulting Chaos

The org couldn't tell who needed help, who was gaming the system, or whether the guarantee itself put the business at risk.

In other words: they couldn't distinguish between bad luck, bad behavior, and bad systems.

It was crucial that we find ways to best serve our graduates, get them hired, empower our coaches to facilitate that success, and remove a potentially large liability for the business.

I came on as a product-minded Chief of Staff to turn moral and financial risk into something the department could reason about.

And I set my sights on turning off-platform activity into decision-grade signals.

The Output: Solutions Implemented, Decisions Enabled

Before diving into the process, here's what we did and how it helped:

What Changed

  • Rolled out Huntr (after a build-vs.-buy analysis) across our jobseeker population
    • Leveraged webhooks via Zapier (routing sketched below) to:
      • update the CRM (Salesforce) so the Careers org could assess population-level jobseeker health
      • send event-driven alerts so coaches could reach out for just-in-time interview prep
  • Built a separate data warehouse from event & entity data to power predictive hiring models
    • Worked with data engineers & scientists to understand leading indicators of successful vs. stagnant hires
    • Defined "healthy jobseeker" and compliance metrics
    • Drove segmentation & hireability logic (different playbooks for different jobseekers)
    • Produced interactive Tableau visualizations
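
To make the routing concrete, here's a minimal sketch of the logic those Zapier flows implemented. The endpoint path, payload field names, and Salesforce helper are hypothetical stand-ins; in production this lived in no-code Zapier steps rather than a custom service.

```python
# Minimal sketch of the Huntr -> Zapier -> Slack/Salesforce routing.
# Field names, the endpoint path, and the Salesforce stub are hypothetical.
import requests
from flask import Flask, request

app = Flask(__name__)

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."  # placeholder
MEANINGFUL = {"APPLICATION", "PHONE_SCREEN", "FIRST_INTERVIEW", "SECOND_INTERVIEW"}

@app.route("/huntr/events", methods=["POST"])
def handle_event():
    event = request.get_json()
    activity = event.get("activity_type")  # hypothetical field name
    if activity not in MEANINGFUL:
        return "", 204  # drop noisy signals (e.g. networking outreach)

    # Alert the coach so they can run just-in-time interview prep.
    requests.post(SLACK_WEBHOOK_URL, json={
        "text": f"{event['jobseeker']} has a {activity} at {event['company']}: "
                "time for interview prep."
    })

    # Keep the CRM record current for population-level health rollups.
    update_salesforce(event["jobseeker_id"], activity)
    return "", 200

def update_salesforce(jobseeker_id: str, activity: str) -> None:
    """Stub standing in for the Zapier -> Salesforce step."""
    ...
```

The design choice worth noting: the filter runs before any routing, so only the meaningful event types ever reach coaches or the CRM.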

What it enabled for the Careers Organization

  • Quick alerts so coaches could intervene ahead of meaningful events (screens & interviews)
  • Reduced "invisible job search" states
  • Consistent source of truth for the organization
    • Employer relations representatives could query and make more effective intros

What it enabled for Executive Leadership

  • Shared language to analyze and predict jobseeker performance
  • Built targets, cohort cuts, and attribution to understand what interventions were working
    • Defined baseline hired rate trajectory to forecast impact of operational improvements vs. experiments
  • Funded program-level interventions for stagnant jobseekers
    • Based on quicker detection of stagnation at a cohort level
  • Launched upstream course and program overhauls inspired by data-driven discoveries

At a Glance: The System We Designed

After settling on Huntr as our tool of choice for the org and coordinating the consent retrieval process with Legal, I documented the workflows so the team could independently troubleshoot and make changes as needs arose.

Step 1: Define Our Surface Area

In short: where did we have agency and authority to make changes and measure what mattered? I conducted interviews with relevant stakeholders and students to understand major pain points and how the org functioned as a whole:

  • All career coaches
  • Select Academic Success Managers: those who worked with students prior to graduation
  • Leadership Teams
  • Students
    • Current students
    • Struggling, actively job-seeking graduates
    • Recent, successfully hired graduates
  • Product, Engineering, Legal, and Data teams

My conversations yielded additional clarifying guardrails:

  1. Coaches were overloaded by a growing caseload of jobseekers
    • They had little time to learn new software or wait long for new tooling
  2. Software development capacity was committed to marketing, admissions (sales), and education initiatives
    • Building new custom tooling would be expensive and would sacrifice other company initiatives
    • It would also require lobbying, validation, and the design of new systems
  3. Admissions quality was a major, unaddressable factor
    • The students entering many of the programs were not well suited for employment in these fields
      • Chegg Skills as a whole was delaying investments in digital literacy training and in additional quality filters for incoming students
    • For better or for worse, this was politically sensitive
      • There was meaningful resistance to suggestions made too far upstream in the student experience
      • My mandate was to start within Career Services and use our findings to validate further change

Step 2: Define Principles & Tradeoffs

Principle 1: Fewer, Clearer Leading Indicators > Exhaustive Data Collection

We deliberately narrowed the signals collected.

  • Defined "meaningful activity" as job Applications, Phone Screens, First and Second Interviews
  • Determined that networking activity (e.g. outreach, informational interviews) was too noisy to measure reliably
    • We highly encouraged jobseekers to network aggressively and track their efforts
    • We did not anchor early models on it
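
A minimal sketch of that taxonomy, assuming illustrative names rather than the production schema; the point is that networking events were tracked but flagged as unmodeled:

```python
# Illustrative signal taxonomy. "Modeled" signals fed the early forecasting
# models; networking activity was encouraged and tracked, but excluded.
from enum import Enum

class Signal(Enum):
    APPLICATION = ("application", True)
    PHONE_SCREEN = ("phone_screen", True)
    FIRST_INTERVIEW = ("first_interview", True)
    SECOND_INTERVIEW = ("second_interview", True)
    NETWORKING_OUTREACH = ("networking_outreach", False)      # too noisy
    INFORMATIONAL_INTERVIEW = ("informational_interview", False)

    def __init__(self, label: str, modeled: bool):
        self.label = label
        self.modeled = modeled

# Only these four drive the "meaningful activity" metrics and models.
MODELED_SIGNALS = {s for s in Signal if s.modeled}
```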

The tradeoff: We sacrificed visibility into upstream networking gaps, knowing this area would go undetected.

I argued it was worth narrowing the data set to achieve a forecasting model faster.

The result: High application volume paired with low interview volume revealed key deficiencies further upstream, in the program itself before graduation. Networking wouldn't have solved that.

Principle 2: Business Visibility > Jobseeker UX

  • We standardized workflows across our jobseeker population, even though:
    • It reduced nuances in individual interview processes
    • It forced all jobseekers into the same interface
    • It prioritized fast event routing over polished UI

This let us meet coaches inside their current tooling (e.g. Slack, Salesforce) instead of building new internal dashboards.

The tradeoff: Adoption lagged, and rollout was slower than expected. Spreadsheet users offered meaningful resistance.

The result: Early adopters triggered the system's alerts, and coaches intervened to prep their clients. This led to better outcomes for those jobseekers, and a virtuous cycle of success stories coaches could evangelize to encourage more adoption.

Principle 3: Coach Action > Comprehensive Analysis

Alert first, intervene immediately, instrument later.

  • Built event-driven business workflows before the data pipelines for analysis and forecasting
  • Avoided waiting for custom builds that would have delayed relief for frontline teams

The tradeoff: We contaminated our ability to measure a clean "before" state because operational improvements were happening in real time.

I argued this was reasonable and necessary: Careers wasn't a lab. It was a live business with financial risk. Improving outcomes immediately mattered more than isolating measurement.

The result: Executive forecasting matured to inform larger program changes and systemic interventions, including upstream academic program overhauls.

Principle 4: Speed of Iteration > System Completeness

  • Prioritized tools with minimal engineering lift
  • Launched with partial visibility
  • Deferred complex behavioral metrics
  • Accepted that definitions would evolve as we learned

The tradeoff: Blind spots existed and persisted early (networking, behavioral nuances, segmentation complexity)

The result: We rolled out a system that could evolve rather than waiting for a perfect internal system.

Step 3: Build vs. Buy vs. Buy vs. Buy

Build (Rejected Early)

We considered building internal tooling to manage job search tracking for grads and coaches.

Why we didn't:

  • Coaches needed relief immediately.
  • We had yet to determine which signals mattered most.
  • Engineering was committed to admissions, marketing, and education initiatives.
  • Custom builds would have required lobbying, validation, months of iteration.

Tradeoff: We sacrificed full control over UX and schema in favor of speed and surface area.

Buy #1: Huntr (Selected)

Huntr offered:

  • Kanban-based job tracking
  • Simple admin-level visibility
  • Webhooks + API access
  • Customizable fields
  • Chrome extension to reduce friction

Ultimately, Huntr's real value was its event surface and API rather than its UI.

Why Huntr won

  • Immediate activity instrumentation
    • Webhooks allowed Zapier compatibility
    • Ability to push events into Salesforce + Slack
  • Fast org-wide rollout without engineering dependency

Tradeoff: Huntr wasn't built for enterprise-scale forecasting out of the box, so we needed to build out our own data logic for decision intelligence.

Buy #2: Evaluate Alternatives (e.g. Placement.com, TealHQ)

We explored alternatives that boasted:

  • Stronger coach/admin tooling
  • More direct job matching + job board integrations
  • More customized reporting
  • Cleaner UIs

Why these tools lost

Ultimately: slower rollout potential, much less control

  • Less mature event routing capability
  • Reporting required custom builds

Huntr wasn't perfect, but it was the fastest path to operational leverage followed by instrumentation. Remember: we prioritized impact now, while collecting data to be structured in the future.

Step 4: Roll out, document, implement

Rollout Strategy

  • Coordinated consent retrieval with Legal before onboarding
  • Phased invitation waves to avoid overwhelming coaches
  • Identified early adopters to demonstrate value
  • Standardized board templates to reduce noise

Change Management

With coaches overloaded, we:

  • Met them in Slack instead of asking them to learn new dashboards
    • Kept cohort-level analysis to org (and eventually, executive) leadership early on
  • Routed interview events to public channels but tagged coaches for visibility
  • Immediately reduced manual spreadsheet auditing

Grad motivation was uneven:

  • New grads adopted quickly
  • Many spreadsheet-native grads resisted
  • Success stories from early adoption created internal evangelism

Documentation

  • Created workflow diagrams so team could troubleshoot independently
  • Clarified how Huntr interacted with Zapier, Salesforce, Typeform, and Slack
  • Reduced reliance on single technical owner

Result:

The system became operational, iterable infrastructure rather than an experiment.

Step 5: Build decision logic

Rollout enabled immediate event visibility. Shortly after, we shifted from monitoring to reasoning.

Defining the "Healthy Jobseeker"

We went from anecdotes to structured criteria, sketched in code after this list:

  • Consistent application volume
  • Interview conversion activity
  • Time-to-hire from graduation
  • Compliance thresholds for tuition-refund guarantee eligibility
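
Here's a sketch of how those criteria might compose into a single check. The thresholds and field names are hypothetical placeholders, not the definitions we actually tuned with data science:

```python
# Illustrative "healthy jobseeker" and compliance checks. Thresholds are
# made-up placeholders; the real cutoffs came from cohort analysis.
from dataclasses import dataclass

@dataclass
class JobseekerSnapshot:
    apps_per_week: int
    interviews_to_date: int        # phone screens + first/second interviews
    weeks_since_graduation: int

def meets_compliance(s: JobseekerSnapshot, min_apps: int = 5) -> bool:
    """Baseline activity required for tuition-refund guarantee eligibility."""
    return s.apps_per_week >= min_apps

def is_healthy(s: JobseekerSnapshot,
               min_apps: int = 5, interview_window_weeks: int = 6) -> bool:
    """Healthy = sustained application volume plus interview conversion
    (or still inside the grace window where zero interviews is expected)."""
    converting = (s.interviews_to_date > 0
                  or s.weeks_since_graduation <= interview_window_weeks)
    return s.apps_per_week >= min_apps and converting
```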

Segment and Diagnose

We surfaced the following patterns:

  • High applications -> low interview volume
    • Revealed broader skills gaps, which we broke down into what we could fix and where
  • Low applications -> low interviews
    • Revealed motivation or coaching gaps
  • Strong early interview volume
    • Signaled jobseekers on track, who needed lighter-touch support

This produced differentiated playbooks, rather than either completely customized, ad-hoc coaching or one-size-fits-all coaching.
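
A sketch of that segmentation, with hypothetical cutoffs standing in for the ones cohort analysis produced:

```python
# Illustrative diagnostic segmentation behind the playbooks. Cutoffs are
# hypothetical; the production values came from cohort-level analysis.
def segment(applications: int, interviews: int,
            high_apps: int = 40, min_interviews: int = 2) -> str:
    if applications >= high_apps and interviews < min_interviews:
        return "skills_gap"       # converting poorly: fix upstream curriculum
    if applications < high_apps and interviews < min_interviews:
        return "engagement_gap"   # low activity: motivation/coaching playbook
    return "on_track"             # strong interview volume: lighter touch
```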

Forecast and Attribute

  • Established baseline hired-rate trajectories by cohort
  • Built targets and compared actuals against expected paths
  • Separated:
    • Operational improvements
    • Experiments
    • Structural constraints

This shifted conversations with execs from "Are we hitting the number?" to "Which lever is moving the number?"
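
As a sketch of the mechanics, assuming a made-up baseline curve (ours was fit from historical cohort trajectories):

```python
# Hypothetical baseline hired-rate trajectory by weeks since graduation.
BASELINE_HIRED_RATE = {4: 0.10, 8: 0.25, 12: 0.40, 16: 0.55, 20: 0.65, 24: 0.72}

def variance_vs_baseline(weeks_since_grad: int, actual_hired_rate: float) -> float:
    """Positive = cohort ahead of the expected path; negative = stagnating."""
    expected = BASELINE_HIRED_RATE.get(weeks_since_grad)
    if expected is None:
        raise ValueError(f"no baseline for week {weeks_since_grad}")
    return actual_hired_rate - expected

# e.g. a cohort at week 12 with 31% hired:
# variance_vs_baseline(12, 0.31) -> -0.09, an early stagnation flag.
```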

Impact

The data eventually supported:

  • Funding program-level interventions
  • Remediation for stagnant cohorts
  • Course-level overhauls inspired by skill-gap patterns