Data Engineer

HORIZON COMPUTER MANAGEMENT PTE. LTD.


Posted: 2 days ago
Location: Singapore, Singapore
Salary: SGD 9,500 - SGD 12,500 per month
Contract type: Full time

Key Responsibilities

  • Data Pipeline Development:
    Design, build, and maintain scalable and reliable data pipelines to support business needs.
    Automate data extraction, transformation, and loading (ETL/ELT) processes.
  • Database and Storage Management:
    Develop and optimize data storage solutions, including relational and NoSQL databases.
    Ensure data integrity, security, and accessibility across platforms.
  • Data Integration:
    Integrate data from multiple sources, including APIs, databases, and third-party systems.
    Collaborate with data analysts, scientists, and stakeholders to understand data requirements.
  • Performance Optimization:
    Monitor and improve the performance of data pipelines and systems.
    Address data quality issues and implement robust data validation processes.
  • Cloud and Big Data Technologies:
    Utilize cloud platforms (e.g., AWS, Azure, GCP) for scalable data processing.
    Implement and manage big data technologies like Hadoop, Spark, or Kafka.
  • Collaboration & Documentation:
    Work closely with cross-functional teams to align data infrastructure with business goals.
    Document data workflows, schemas, and processes for seamless knowledge sharing.

Key Requirements

  • Education:
    Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience:
    At least 10 years of experience in data engineering or a related field.
    Proven expertise in designing and managing complex data architectures.
  • Technical Skills:
    Proficiency in SQL, Python, and Java/Scala for data manipulation and pipeline development.
    Experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
    Strong understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
    Familiarity with big data frameworks like Hadoop, Spark, or Kafka.
    Hands-on experience with cloud platforms such as AWS (Glue, Redshift), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
  • Certifications:
    Relevant certifications in cloud data engineering (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) are a plus.
  • Soft Skills:
    Strong problem-solving and analytical thinking abilities.
    Excellent communication and collaboration skills.
    Ability to work independently and in a team-oriented environment.