Description:
What You'll Be Doing
- Creating large, complex data sets that meet both functional and non-functional business requirements
- Designing, developing, and maintaining scalable ETL pipelines and data workflows
- Building reliable infrastructure to extract, transform, and load data from diverse sources, often cloud-based
- Identifying opportunities to automate, optimise, and improve internal data processes
- Developing tools and solutions that turn data into actionable business insights
- Ensuring data quality, consistency, integrity, and security across environments
- Driving best practices in data engineering, performance, and documentation

Skills & Experience You Bring
- Bachelor's Degree in Computer Science, Engineering, Mathematics, or similar
- 5-7 years' experience in data engineering, database management, or a related role
- Strong programming capability in Python or Scala
- Advanced SQL skills and experience with relational databases (MSSQL, MySQL, PostgreSQL, etc.)
- Exposure to NoSQL technologies
- Proven experience in performance tuning and database optimisation
- Solid understanding of data integration concepts and practices
- Exposure to BI tools (Power BI, Yellowfin, etc.)
- Experience with MS SQL Replication and data archiving strategies is beneficial
- Experience with cloud technologies (AWS, Azure, GCP) and tools like S3, Lambda, Redshift, BigQuery, or Snowflake
- Familiarity with big-data ecosystems such as Apache Spark, Databricks, or Hive
- Understanding of data modelling, warehousing concepts, and data governance
- Exposure to data cleansing and de-duplication strategies

Bonus Points For
- Experience with streaming platforms (Kafka, Spark Streaming, Flink)
- Knowledge of Docker and Kubernetes
- Understanding of CI/CD and infrastructure-as-code
- Exposure to machine learning workflows or MLOps
Posted: 03 Jan 2026 | Source: gumtree.co.za