Cloud Data Engineer
TECH AALTO PTE. LTD.

Job Role: Cloud Data Engineer – GCP Migration & Transformation Specialist
Experience Level: 8+ years (with recent GCP project experience)
Job Summary:
We are looking for a skilled Data Engineer with strong experience in Google Cloud Platform (GCP) and enterprise data migrations. The ideal candidate will have a deep understanding of the GCP data landscape and Medallion architecture, plus hands-on experience migrating data models and ETL pipelines from on-premises platforms such as Teradata and Cloudera to GCP BigQuery.
Key Responsibilities:
- Design and implement scalable data pipelines using GCP BigQuery and Cloud Composer, following Medallion architecture principles.
- Lead data model and data migrations from on-premises platforms such as Teradata and Cloudera to BigQuery, ensuring accuracy and performance.
- Develop BigQuery SQL-based transformations and standard ELT conversion patterns (e.g., pushdown optimization, partitioning strategies, staging layer design).
- Map legacy Informatica workflows and Teradata BTEQ scripts to BigQuery SQL-based ELT solutions.
- Migrate Control-M workflows to GCP Cloud Composer, ensuring orchestration compatibility and performance.
- Design modular and reusable BigQuery pipeline patterns applicable across multiple use cases.
- Collaborate with reporting teams to build dashboards and reports, ensuring BigQuery data type compatibility with tools such as QlikSense.
- Troubleshoot data pipeline issues, ensuring timely resolution and optimal system performance.
Required Skills & Experience:
- Strong working knowledge of GCP services, including BigQuery, Cloud Composer, and the overall GCP data ecosystem.
- Experience with Medallion architecture in a cloud-based data platform.
- Hands-on experience with data migration and transformation projects from Teradata, Cloudera, or similar on-premises systems to GCP.
- Proficiency in BigQuery SQL for advanced data transformation tasks.
- Familiarity with standard data engineering patterns such as partitioning, staging, modularity, and pipeline reusability.
- Experience in converting legacy Informatica and BTEQ workflows to cloud-native solutions.
- Experience with QlikSense or similar BI tools, including handling data type compatibility with BigQuery.
- Strong debugging and issue resolution skills in a cloud-based data environment.
- Experience with data governance, data quality checks, and performance tuning in BigQuery.
- Exposure to CI/CD practices and infrastructure as code (IaC) for pipeline deployments.
By applying, you consent to the disclosure, collection and use of your personal data for employment/recruitment and related purposes in accordance with the Tech Aalto Privacy Policy, published on Tech Aalto’s website (https://www.techaalto.com/privacy/).
Confidentiality is assured, and only shortlisted candidates will be notified for interviews.