Solutions Applied Data Scientist, Healthcare
About this role
Company Overview
We are building Protege to solve the biggest unmet need in AI — getting access to the right training data. The process today is time intensive, incredibly expensive, and often ends in failure. The Protege platform facilitates the secure, efficient, and privacy-centric exchange of AI training data.
Solving AI’s data problem is a generational opportunity. We’re backed by world-class investors and already powering partnerships with some of the most ambitious teams in AI. The company that succeeds will be one of the largest in AI — and in tech.
We’re a lean, fast-moving, high-trust team of builders who are obsessed with velocity and impact. Our culture is built for people who thrive on ambiguity, own outcomes, and want to shape the future of data and AI.
Role Overview
We are hiring a Solutions Applied Data Scientist to help design, construct, and validate complex healthcare data cohorts used for AI model training. This role sits within the delivery organization, working closely with Solutions Leads and delivery engineers to solve complex data challenges that arise during customer projects.
- Solutions Leads own the customer relationship and overall delivery of projects. The Solutions Applied Data Scientist serves as their technical partner for more complex data problems, including cohort construction, multi-source dataset assembly, feasibility analysis, and data validation.
- You will help translate research generated by Protege’s Data Lab and customer requirements into practical dataset definitions, determine whether those requirements can be met with available data, and build the SQL and analysis needed to construct the resulting datasets.
- You will also collaborate with delivery engineers when solutions require changes to data pipelines, infrastructure, or large-scale data movement.
- This is a highly applied role focused on solving real-world dataset challenges, not research or model development.
The ideal candidate enjoys solving messy real-world data problems, working directly with large healthcare datasets, writing complex SQL, and collaborating closely with cross-functional teams. Our environment has a lot going on as we grow, so we're looking for someone energized by the fast pace of the industry and our company!
What You'll Do
Technical Escalation & Delivery Collaboration
During delivery projects, Solutions Leads may encounter complex data challenges that require deeper analysis or technical problem-solving. You will act as a technical partner, helping solve challenges such as:
- Complex cohort definitions that require multi-source joins
- Linking datasets across different data partners
- Investigating unexpected gaps or anomalies in delivered data
- Evaluating whether requested variables or labels exist in available datasets
- Determining whether a dataset can realistically satisfy model requirements
You will work collaboratively with Solutions Leads to unblock delivery challenges while keeping projects moving toward successful completion.
When solutions require infrastructure or pipeline changes, you will partner with the Solutions Engineer and internal platform engineering teams to implement the required workflows.
Cohort Definition & Dataset Construction
Work with Solutions Leads to translate customer requirements into concrete dataset logic. You will help ensure that datasets accurately represent the intended population and meet customer specifications.
Responsibilities include:
- Writing complex SQL queries to construct cohorts
- Implementing inclusion and exclusion logic
- Joining datasets across multiple data sources
- Validating linkage between datasets
- Identifying and resolving inconsistencies or missing fields
- Partnering with Solutions Leads to resolve complex data questions that arise during project delivery
- Escalating to or collaborating with delivery engineers when dataset construction requires pipeline changes or large-scale data processing
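As an illustration of the inclusion/exclusion logic this work involves, here is a minimal sketch against a synthetic in-memory database. All table names, column names, and diagnosis codes are hypothetical examples, not Protege's actual schema:

```python
import sqlite3

# Synthetic example: table/column names and ICD-10 codes are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, birth_year INTEGER);
CREATE TABLE diagnoses (patient_id INTEGER, icd10_code TEXT, dx_year INTEGER);
INSERT INTO patients VALUES (1, 1980), (2, 2010), (3, 1975), (4, 1990);
INSERT INTO diagnoses VALUES
  (1, 'E11.9', 2021),  -- type 2 diabetes: meets inclusion criterion
  (2, 'E11.9', 2022),  -- meets inclusion, but excluded by age below
  (3, 'I10',   2021),  -- hypertension only: not in the inclusion set
  (4, 'E11.9', 2019);  -- meets inclusion criterion
""")

# Inclusion: at least one E11.* diagnosis. Exclusion: under 18 as of 2024.
cohort = conn.execute("""
    SELECT DISTINCT p.patient_id
    FROM patients p
    JOIN diagnoses d ON d.patient_id = p.patient_id
    WHERE d.icd10_code LIKE 'E11%'       -- inclusion criterion
      AND (2024 - p.birth_year) >= 18    -- exclusion criterion
    ORDER BY p.patient_id
""").fetchall()

cohort_ids = [row[0] for row in cohort]
print(cohort_ids)  # [1, 4]
```

Real cohort definitions layer many such criteria across multiple sources, but the shape of the work is the same: encode the clinical requirements as explicit, reviewable SQL.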
Data Quality Validation & Completeness Analysis
Before complex datasets are delivered to customers, you will work closely with Solutions Leads to validate that they meet agreed acceptance criteria. You will also review bespoke QA methodology and suggest platform improvements to Product and Engineering to reduce custom work across engagements.
Responsibilities include:
- Performing data completeness analysis
- Investigating missing or anomalous data
- Verifying cohort logic results
- Validating row counts and dataset structure
- Creating summary statistics and validation outputs
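A sketch of what automated acceptance checks can look like, using synthetic records and hypothetical field names and thresholds (real checks would run against the actual delivery tables and the acceptance criteria agreed with the customer):

```python
# Synthetic delivered records; field names and thresholds are illustrative.
records = [
    {"patient_id": 1, "age": 44, "dx_code": "E11.9"},
    {"patient_id": 2, "age": 61, "dx_code": None},
    {"patient_id": 3, "age": 52, "dx_code": "I10"},
]

EXPECTED_ROWS = 3                # agreed cohort size
REQUIRED_FIELDS = ["patient_id", "age", "dx_code"]
MAX_NULL_RATE = 0.5              # hypothetical acceptance threshold

# Row-count check against the agreed cohort size.
assert len(records) == EXPECTED_ROWS, "row count mismatch"

# Completeness: fraction of non-null values per required field.
completeness = {
    f: sum(r[f] is not None for r in records) / len(records)
    for f in REQUIRED_FIELDS
}
print(completeness)  # dx_code is 2/3 populated; other fields are complete

# Flag any field whose null rate exceeds the agreed threshold.
failing = [f for f, rate in completeness.items() if 1 - rate > MAX_NULL_RATE]
print(failing)  # [] -> dataset passes the completeness check
```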
Data Feasibility
Many customer projects involve AI researchers who are defining the healthcare datasets required to train or evaluate models. You will work with these customer teams to translate research goals into practical dataset specifications.
Responsibilities include:
- Reviewing dataset requests from AI researchers and model development teams
- Helping clarify and refine requirements for model training or evaluation datasets
- Evaluating whether requested variables or labels exist in available data sources
- Identifying proxy variables or alternative dataset structures when ideal variables are unavailable
- Assessing feasibility of requested cohort definitions given real-world data constraints
- Explaining data limitations, tradeoffs, and potential biases to technical stakeholders
- Iterating with researchers to converge on datasets that are both scientifically meaningful and operationally feasible
This role requires someone who is comfortable engaging with technically sophisticated stakeholders while grounding conversations in the realities of messy, real-world data.
Data Partner & Source Data Analysis
Many datasets originate from external healthcare data partners.
You will help analyze partner datasets to:
- Understand schema and field availability
- Assess data quality and completeness
- Identify required transformations
- Evaluate feasibility of cohort logic
This work helps ensure that projects are grounded in what data actually exists.
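A first pass at a partner extract often starts with simple field-availability profiling. The sketch below uses a tiny synthetic CSV; in practice this would run against large CSV or Parquet files from an external partner, and the column names are hypothetical:

```python
import csv
import io

# Hypothetical partner extract; column names are illustrative only.
raw = """patient_id,sex,dx_code,dx_date
1,F,E11.9,2021-03-01
2,M,,2022-07-15
3,F,I10,
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Field availability: which columns exist, and how often are they populated?
profile = {
    field: sum(bool(r[field]) for r in rows) / len(rows)
    for field in rows[0].keys()
}
print(profile)  # dx_code and dx_date are each populated in 2 of 3 rows
```

Even this basic profile answers feasibility questions early: a cohort definition that hinges on `dx_date` is at risk if a third of the source rows lack it.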
Delivery Tooling & Workflow Improvements
As delivery patterns emerge, you will help develop tools and reusable workflows that improve efficiency.
Examples include:
- Reusable SQL templates for cohort construction
- Automated validation checks
- Scripts for dataset preparation
- Tools that reduce manual delivery work
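One way a reusable SQL template might look, sketched with the standard library's `string.Template`; the table and column names are placeholders, not a real schema:

```python
from string import Template

# Hypothetical shared template for code-prefix inclusion criteria.
COHORT_TEMPLATE = Template("""
SELECT DISTINCT p.patient_id
FROM $patient_table p
JOIN $dx_table d ON d.patient_id = p.patient_id
WHERE d.$code_column LIKE '$code_prefix%'
""")

def render_cohort_sql(patient_table, dx_table, code_column, code_prefix):
    """Fill in one concrete cohort query from the shared template."""
    return COHORT_TEMPLATE.substitute(
        patient_table=patient_table,
        dx_table=dx_table,
        code_column=code_column,
        code_prefix=code_prefix,
    ).strip()

sql = render_cohort_sql("patients", "diagnoses", "icd10_code", "E11")
print(sql)
```

Templating the common pattern once means each new engagement changes only parameters, not hand-written SQL, which is exactly the kind of leverage that reduces bespoke delivery work.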
This role is an important bridge between manual dataset delivery and scalable data infrastructure.
What Success Looks Like
30 days: Learn the delivery motion and source-data reality. Build working knowledge of Solutions workflows, healthcare data partners, common cohort patterns, and how complex requests get escalated. Shadow active projects, understand existing QA approaches, and start contributing to scoped feasibility and validation work.
60 days: Own scoped technical escalations and create early leverage. Independently support complex cohort-definition and dataset-construction work, write and validate SQL/Python workflows, and help Solutions Leads answer hard feasibility questions with clear tradeoffs.
90 days: Become a trusted technical partner across delivery. Handle the hardest dataset problems with limited oversight, improve QA and repeatability, and propose workflow or platform improvements that reduce bespoke work across engagements.
What You Bring
- Experience working with large structured healthcare datasets
- Strong SQL skills and experience writing complex queries
- Experience using AI coding tools such as Claude Code or Codex
- Experience joining and transforming large datasets
- Experience performing data validation and exploratory analysis
- Strong Python skills for data analysis and scripting
- Experience working with structured file formats (CSV, Parquet, etc.)
- Ability to translate ambiguous requirements into concrete data logic
- Strong communication skills and ability to collaborate with technical and non-technical stakeholders
Protege Values
1. We pass the loved ones' test — integrity isn't negotiable, even when it's costly
2. We always find a way — obstacles are expected, giving up isn't
3. We go fast and grow fast — velocity is a competitive advantage and we treat it that way
4. We practice kindness and candor — hard conversations happen here, and they happen with care
5. We deliver together — no silos, no lone heroes, no passengers
6. We own the outcome — full accountability, continuous improvement, mastery over time