Data Platform Engineer
About this role
At Techwave, we foster a culture of growth and inclusivity. We ensure everyone associated with the brand is challenged at every step and given the opportunities they need to excel. People are at the core of everything we do.
Who are we?
Techwave is a leading global IT and engineering services and solutions company driving digital transformation. We enable clients to maximize their potential and reach a wider market with a broad array of technology services, including, but not limited to, Enterprise Resource Planning, Application Development, Analytics, Digital, and the Internet of Things (IoT).
Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to enable businesses to accelerate their growth.
Plus, we're a team of dreamers and doers who are pushing the boundaries of what's possible.
And we want YOU to be a part of it.
Job Description
Role: Data Platform Engineer – Senior Associate (Data & Analytics | Data Operations)
Experience: 5+ Years of experience
Location: Hyderabad
Summary:
• Techwave’s Data & Analytics function is dedicated to designing and delivering robust global data platforms that enable business solutions and high-quality analytics. The Data Platform Engineer – Senior Associate is responsible for the reliability, scalability, and operational excellence of the enterprise data platform and its services. This role focuses on production support, platform reliability improvements, automation, and standardization, enabling scalable delivery and maintenance of trusted data products aligned to the Data & Analytics strategy.
Responsibilities
• Design and implement scalable ingestion, orchestration, compute, and storage patterns for enterprise analytics and data products.
• Build and maintain CI/CD for data platform assets and pipeline deployments; develop reusable templates/modules to accelerate onboarding and delivery. Maintain the Information Architecture (IA).
• Drive production reliability through SRE-aligned practices: monitoring, alerting, SLOs/SLIs, error budgets, automated remediation, and runbooks.
• Implement data quality and data observability patterns and integrate with enterprise monitoring.
• Partner with security and governance stakeholders to embed access, encryption, key management, and masking/tokenization patterns where required. Maintain, enforce, and automate role-based access control (RBAC). Support user onboarding requests, respond to and remediate security incidents, and provide evidence for audits. Ensure business continuity processes are documented and reviewed.
• Optimize cost and performance for warehouses, queries, and data loads. Recommend and deploy schema changes (tables, views, materialized views, etc.). Identify and analyze pipeline failures and performance issues; recommend and deploy query tuning and optimizations.
• Manage service requests, change requests, and incidents on an ongoing basis. Provide Major Incident Management (MIM), including enhanced response times, support, Root Cause Analysis (RCA), and remediation.
• Deliver weekly operational reporting on service request status, incident management, RCA tickets, platform improvements, and platform consumption (FinOps) monitoring.
• Mentor associate engineers; contribute to standards and documentation.
Skills
• Strong platform engineering/operational background with 5+ years of relevant experience.
• Hands-on Snowflake experience is required.
• Proficiency in Python and SQL.
• Experience with ETL/ELT tools and orchestration (e.g., Informatica, dbt).
• Proven experience with API-based data ingestion (REST APIs, JSON/XML).
• Experience implementing and maintaining CI/CD, IA, and environment management for data platforms.
• Strong operational discipline: incident management, problem management, and release/change practices.
• Familiarity with cloud data platforms (e.g., Azure, AWS).
• Strong understanding of data lifecycle.
• Agile/Scrum delivery exposure.
Education / Professional Experience / Qualifications
• Bachelor’s degree in Information Technology, Computer Science, Engineering, or a related discipline.
• Experience with lakehouse and/or modern warehouse platforms; streaming platforms; data catalog/lineage.
• Experience with enterprise authentication/authorization patterns, RBAC, secrets management.
• Exposure to FinOps practices for platform consumption.