Description:
Role overview
You will:
- Design and maintain scalable, secure, high-performance data architectures
- Build robust ETL/ELT pipelines for batch and streaming data
- Enable analytics, BI, and AI workloads through reliable data platforms
- Ensure regulatory compliance and data governance for sensitive financial data
Requirements:
- 5+ years of experience in data engineering or data architecture
- Strong proficiency in SQL and data modelling (dimensional and/or normalized)
- Experience building data pipelines using tools such as Airflow, dbt, Spark, or similar
- Strong programming skills in Python, Scala, or Java
- Hands-on experience with cloud data services (BigQuery, Redshift, Snowflake, Databricks)
- Experience with both batch and streaming data systems (Kafka, Kinesis, Pub/Sub)
- Solid understanding of data security, privacy, and governance principles
- Experience in fintech, banking, payments, or financial services
- Familiarity with regulatory frameworks (PCI DSS, SOC 2, GDPR)
- Experience supporting analytics, BI, and machine learning workloads
- Knowledge of event-driven and real-time data architectures
- Prior experience leading architectural decisions or data platform migrations
Posted: 26 Jan 2026
Source: gumtree.co.za