Description:
About the Role
We are seeking a highly skilled Senior Data Engineer to join our dynamic data team. You will be responsible for designing, building, and maintaining robust and scalable data pipelines and architectures that enable analytics and business intelligence at enterprise scale. This role is ideal for professionals who thrive in complex data environments, value clean, efficient, and secure data engineering practices, and enjoy collaborating with cross-functional teams to deliver high-quality, reliable data solutions.
Key Responsibilities
Data Architecture & Pipeline Development
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes
- Implement data modeling and architecture solutions, including Data Vault, dimensional modeling, and normalized structures
- Optimize data storage and retrieval to support analytics, reporting, and machine learning workloads
- Integrate structured and unstructured data from multiple sources, including cloud and on-premises systems
- Develop and maintain data warehouses, data lakes, and cloud-native data platforms

Cloud & Big Data Engineering
- Work with cloud platforms such as AWS, Azure, or GCP for data storage, processing, and orchestration
- Implement and manage data processing frameworks such as Spark, Databricks, or Hadoop
- Develop scalable, automated, and reliable batch and real-time data workflows
- Ensure data pipelines meet performance, security, and compliance standards

Collaboration & Process Improvement
- Collaborate with data analysts, data scientists, software engineers, and business stakeholders to understand requirements and deliver solutions
- Mentor junior and intermediate data engineers, promoting best practices and code quality standards
- Participate in Agile/Scrum ceremonies, including sprint planning, stand-ups, and retrospectives
- Continuously research and recommend new tools, technologies, and approaches to improve data engineering efficiency, reliability, and scalability
Requirements & Qualifications
Must-Have Skills:
- 6+ years of professional experience in data engineering or software engineering with a focus on data
- Strong expertise in SQL, relational databases (PostgreSQL, MSSQL, MySQL), and NoSQL databases (MongoDB, Cassandra, DynamoDB)
- Experience with Data Vault modeling and implementation
- Hands-on experience with ETL/ELT tools and processes
- Proficiency in programming/scripting languages such as Python, Scala, or Java
- Experience building data pipelines and architectures in cloud platforms (AWS, Azure, GCP)
- Strong understanding of data warehousing, data lakes, and cloud-native data platforms
Posted: 26 Nov 2025
Source: gumtree.co.za