Perik.ai See who’s hiring. Apply before everyone else.

Data Engineer

Rocketpartners
📍 Makati 📅 Posted April 30, 2026
Apply on Rocketpartners’ website →

About this role

Responsibilities

• Design and implement scalable data models supporting operational and analytical reporting, including Bronze, Silver, and Gold data layers in line with Medallion architecture
• Apply dimensional modelling techniques (Star/Snowflake schemas) to support business intelligence requirements and own semantic models for end-user reporting tools
• Build, optimise, and maintain ETL/ELT pipelines ingesting data from multiple source systems (ERP, CRM, supply chain systems)
• Engineer robust processes for transforming complex and semi-structured data (including JSON) into structured datasets
• Develop logic to parse, cleanse, and standardise raw data into reliable, analytics-ready formats; monitor and validate data quality across all pipeline stages
• Implement data validation, reconciliation, and error-handling mechanisms
• Support migration from legacy systems to modern cloud-based platforms (e.g. Microsoft Fabric or similar) and optimise SQL queries and data refresh processes
• Work closely with BI developers, analysts, and business stakeholders to understand reporting and data needs, translating requirements into technical data solutions
• Contribute to data governance practices including documentation, standards, and lineage tracking

Requirements

• Minimum 5 years' experience in data engineering or a similar role
• Proven experience building and maintaining ETL/ELT pipelines in a commercial environment
• Experience working with large-scale datasets, ideally in retail, wholesale, distribution, or supply chain environments
• Demonstrated experience working with cloud-based data platforms
• Advanced proficiency in SQL for data transformation and performance tuning
• Strong experience with data transformation tools (e.g. Power Query, dbt, or similar)
• Proficiency in Python for data processing and automation
• Experience with Microsoft Fabric or similar platforms (Azure Synapse, Databricks, Snowflake)
• Strong understanding of Medallion architecture and data lakehouse concepts
• Experience working with semi-structured data formats (e.g. JSON, XML)
• Knowledge of data modelling techniques (Star/Snowflake schemas)
• Experience with Power BI or similar BI tools
• Strong problem-solving, analytical, and communication skills, with the ability to translate technical concepts for non-technical stakeholders
• Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related field preferred; relevant cloud certifications highly regarded
• Coachable mindset: eager to validate AI-generated code and to understand when to seek human guidance on technical decisions

What We Offer

• Mentorship and training opportunities
• Competitive salary
• Opportunity to work with cloud-first technologies and methodologies
• Career growth in a fast-paced, innovative company

This listing was aggregated by Perik.ai from Rocketpartners’ public job board. Click the button above to view the full job description and apply directly.

