Our client reserves the right not to make an appointment. In considering candidates for appointment into advertised posts, preference will be accorded to persons from a designated group in accordance with the approved Employment Equity Plan.

Senior Data Engineer - Contract - Onsite (PDG2000483502)

Overview

Reference
PDG2000483502

Salary
ZAR600 - ZAR695/hour

Job Location
- South Africa -- City of Cape Town -- Cape Town

Job Type
Contract

Posted
02 July 2025

Closing date
02 August 2025 21:59


Our client, a global tech firm, is seeking two Senior Data Engineers to join its team in Cape Town (onsite) on a contract basis.

Summary

We are seeking a highly experienced Senior Data Engineer to lead the design, development, and optimization of data pipelines, APIs, and data platforms. This role will focus on ETL/ELT processes using Matillion and Snowflake, API development, and integration with machine learning workflows and Databricks. The ideal candidate will have a strong background in data engineering, cloud platforms, and modern data architecture.

Responsibilities

  • Design, build, and maintain ETL/ELT pipelines using Matillion and Snowflake.
  • Develop and manage RESTful APIs for data access and integration.
  • Collaborate with data scientists and ML engineers to integrate data pipelines with machine learning workflows.
  • Optimize Snowflake data warehouse performance and manage data models.
  • Implement data quality, governance, and security best practices.
  • Work with Databricks for data processing, transformation, and ML model support.
  • Automate data validation and reconciliation processes.
  • Document data architecture, pipelines, and integration processes.

Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data modeling, data warehousing, and data integration roles.

Required Skills

  • Strong experience with Matillion ETL and Snowflake.
  • Proficiency in Databricks, including integration with machine learning workflows.
  • Experience with data management requirements and integration platforms, with strong API expertise.
  • Experience with API development frameworks.
  • Solid understanding of data warehousing, data modeling, and ELT best practices.
  • Experience with CI/CD pipelines, version control (Git), and DevOps practices.
  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java, or Scala).
  • Experience with data modeling, data warehousing, and data integration.
  • Familiarity with cloud platforms and data services.
  • Understanding of data governance, security, and compliance requirements.


Contact information

Ashley Singh