
Data Intern

Ready
📍 Remote 📅 Posted May 8, 2026
Apply on Ready’s website →

About this role

We're looking for a Geospatial Data Engineering Intern to help build and scale our geospatial data infrastructure over the summer. This role is designed for a junior or senior undergraduate with at least one prior internship under their belt, strong data engineering instincts, and the team awareness to ship work in a shared production codebase. You'll get hands-on experience building pipelines that ingest, transform, and serve geospatial data with exposure to AI agent tooling along the way. This role begins as a full-time, 3-month summer internship and then continues part-time through September.

You'll work directly with our data team, contributing to operational infrastructure that powers geospatial analysis and decision-making across the organization. Your primary focus will be building reliable, well-documented data pipelines with a geospatial backbone, while getting meaningful exposure to applied AI systems and helping us complete an in-flight migration from Airflow 2 to Airflow 3.

About your role at Ready ⚡️

You’ll spend the majority of your time on geospatial data engineering, with supporting work in geospatial analysis and applied AI.

Geospatial Data Engineering (Primary Focus)

- Build and improve Airflow ELT pipelines that ingest, transform, and serve geospatial datasets at scale; work across both our Airflow 2 and Airflow 3 repositories and actively assist with the Airflow 2-to-3 migration, including porting DAGs, validating parity, and helping retire legacy pipelines

- Write clean, type-hinted Python and well-structured SQL, including geospatial operations (PostGIS, spatial joins, CRS management) against Athena (Trino), PostgreSQL, Redshift, and DuckDB (a short illustrative sketch follows this list)

- Develop modular dbt models with semantic layer definitions and documented business logic for geospatial tables

- Contribute to data quality systems, including schema validation, freshness monitoring, and spatial integrity checks

- Support DataHub adoption through schema documentation, lineage tracking, and metadata management for geospatial assets

- Triage failing DAG runs, read Airflow task logs, and own fixes end-to-end

- Communicate progress through documentation, code reviews, and regular updates
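
A minimal, hypothetical sketch of that kind of work (not from Ready's codebase; the function name, file paths, and layer names are placeholders), using type-hinted Python with GeoPandas to manage CRSs and run a spatial join:

    import geopandas as gpd

    def join_parcels_to_service_areas(
        parcels_path: str, service_areas_path: str
    ) -> gpd.GeoDataFrame:
        """Tag each parcel with the service area that contains it."""
        parcels = gpd.read_file(parcels_path)
        service_areas = gpd.read_file(service_areas_path)

        # CRS management: reproject one layer so both share a coordinate
        # reference system before joining.
        service_areas = service_areas.to_crs(parcels.crs)

        # Spatial join: attach service-area attributes to each parcel
        # whose geometry falls within a service area.
        return gpd.sjoin(parcels, service_areas, how="left", predicate="within")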

GeoData Science (Supporting Research)

- Contribute to research-oriented analyses such as tree canopy classification, network resiliency analysis, and spatial feature extraction

- Design and document reproducible analytical workflows that feed into production pipelines

- Translate complex geospatial methods into clear, accessible outputs for non-technical stakeholders

- Share learnings on emerging GeoAI methods and geospatial tooling with the team

AI Engineering (Applied Exposure)

- Assist with building data agents using tools like LangGraph, LangChain, or Bedrock Agent Core

- Support development and iteration on pipelines and text-to-SQL approaches for natural-language data access

- Contribute to MCP server development and agent evaluation as needed

- Document agent failure modes and help refine prompts based on feedback

A bit about you 🥇

- Currently a junior or senior undergraduate (or higher) in Computer Science, Data Science, GIS, Geospatial Engineering, Software Engineering, or a related field

- At least one prior internship (or equivalent team-based engineering experience); you've shipped code in a shared repo, gone through code review, and worked a ticket end-to-end

- Available to work full-time for 3 months during the summer, then part-time through the fall semester

- Strong fundamentals in Python, including classes, inheritance, decorators, type hints, and explicit imports

- Strong fundamentals in SQL: joins, CTEs, window functions, and aggregations

- Comfortable working in Git/GitHub with a dev → main PR-to-deploy workflow

- Comfortable on the Unix command line or eager to learn (bash, navigating a filesystem, running scripts)

- Familiarity with geospatial concepts (CRS, spatial joins, indexing) and tooling such as PostGIS, GeoPandas, or QGIS is a plus

- Exposure to AWS or similar cloud providers; ideally S3, IAM, Athena, Glue, ECS, or Redshift

- Experience with Airflow or a similar orchestration tool is a plus, as is a strong eagerness to learn quickly; you'll be ramping on two versions in parallel and contributing directly to our migration effort

- Familiarity with dbt, Pandas, or Parquet/columnar data is a plus

- Exposure to AI agent architectures (e.g., ReAct) and protocols (A2A, MCP, AG-UI) is a plus

About Ready 🚀

- Creative problem solvers approaching a legacy industry with a new point of view

- Humble but ambitious, knowledgeable but curious, persistent but not obnoxious

- Concise and effective in written and spoken communication

- Comfortable working remotely

- One team, one dream

About what you get…

- Competitive hourly wage: $35–$40 per hour

- 100% remote work from home

- Opportunities to learn and grow – all things startups

- A chance to play a role in defining the roadmap as we pursue a bold vision and a big goal

- Work from (almost) anywhere. Ready is a remote-first company, but for security and compliance reasons, employees are not permitted to work from China (excluding Hong Kong, Macau, and Taiwan), Russia, Iran, or North Korea. These restrictions are in place to protect our systems, data, and intellectual property.

- To get away - we all convene 1-2x a year for [optional, encouraged] retreats

- The charter to realize a market that is set to receive $65 billion in grant funding across the United States

This listing was aggregated by Perik.ai from Ready’s public job board. Click the button above to view the full job description and apply directly.