Data Engineer

Virgin Active South Africa Pty Ltd
Cape Town | Full-day | Full-time

Description:

Your Purpose...
  • Is to “change people’s lives for the better through wellness”;
  • We deliver social wellness experiences that enable people to meet their personal wellness goals through holistic physical, mental, nutritional and social wellbeing.
  • This role focuses on designing, building, and optimizing scalable cloud-based data solutions using Microsoft Azure technologies. You will be responsible for batch and real-time data ingestion, transformation, and integration to support analytics, reporting, and advanced data use cases.
  • You will work closely with business stakeholders, data analysts, and platform teams to deliver reliable, secure, and high-performing data pipelines.
Your Duties and Responsibilities...

Data Engineering & Pipeline Development

  • Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory (ADF).
  • Build and optimize large-scale data processing solutions using Azure Databricks (Spark).
  • Implement real-time data ingestion using Azure Event Hubs.
  • Develop and maintain scalable data models for analytics and reporting.
  • Perform data transformation, cleansing, and enrichment processes.
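As an illustration of the transformation, cleansing, and enrichment work described above, here is a minimal standalone sketch in plain Python. All record fields and names are hypothetical; in this role the equivalent logic would typically run as PySpark on Azure Databricks rather than over an in-memory list.

```python
from datetime import date

# Hypothetical raw member check-in records, as a pipeline stage might receive them.
raw_checkins = [
    {"member_id": "M001", "club": "cape town ", "visit_date": "2026-02-20"},
    {"member_id": "M001", "club": "cape town ", "visit_date": "2026-02-20"},  # duplicate
    {"member_id": "M002", "club": None, "visit_date": "2026-02-21"},          # missing club
]

def cleanse_and_enrich(records):
    """Deduplicate, drop rows missing required fields, and enrich with a weekday."""
    seen, out = set(), []
    for rec in records:
        if rec["club"] is None:                  # cleansing: required field missing
            continue
        key = (rec["member_id"], rec["visit_date"])
        if key in seen:                          # deduplication on business key
            continue
        seen.add(key)
        clean = {**rec, "club": rec["club"].strip().title()}  # standardise text
        y, m, d = map(int, rec["visit_date"].split("-"))
        clean["weekday"] = date(y, m, d).strftime("%A")       # enrichment
        out.append(clean)
    return out

print(cleanse_and_enrich(raw_checkins))
```

The same drop/dedupe/derive pattern maps directly onto Spark DataFrame operations (`dropna`, `dropDuplicates`, `withColumn`) at scale.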

Cloud and Data Platform Engineering

  • Support and enhance Azure-based data lake and data warehouse architectures.
  • Optimize data storage, partitioning, and performance strategies.
  • Ensure high availability, scalability, and cost-efficiency of data solutions.
  • Automate workflows and support CI/CD for data pipelines.

Data Integration & Streaming

  • Integrate structured and unstructured data from multiple enterprise systems.
  • Design solutions for both batch and streaming data pipelines.
  • Collaborate with integration teams on event-driven architectures.
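The event-driven pattern referred to above can be sketched with a toy in-process dispatcher. This is a plain-Python stand-in with hypothetical event names; in practice the events would arrive via Azure Event Hubs and the handlers would be streaming jobs.

```python
from collections import defaultdict

class EventBus:
    """Toy publish/subscribe dispatcher illustrating event-driven routing."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event):
        # Deliver the event to every handler registered for its type.
        for handler in self._handlers[event["type"]]:
            handler(event)

bus = EventBus()
visits = []
bus.subscribe("member.checkin", lambda e: visits.append(e["member_id"]))
bus.publish({"type": "member.checkin", "member_id": "M001"})
bus.publish({"type": "member.checkout", "member_id": "M001"})  # no subscriber: ignored
print(visits)
```

The key property, producers and consumers coupled only by event type, is what carries over to the real Event Hubs architecture.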

Data Quality, Governance and Security

  • Implement data validation, monitoring, and reconciliation processes.
  • Apply data governance and security best practices across Azure services.
  • Document data lineage, transformations, and architecture components.
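The validation and reconciliation duties above can be illustrated with a small check that compares a source extract against the loaded target. Function and key names here are hypothetical; a production version would compare checksums or row counts per partition.

```python
def reconcile(source_keys, target_keys):
    """Compare key sets between source and target and report discrepancies."""
    source, target = set(source_keys), set(target_keys)
    return {
        "source_count": len(source),
        "target_count": len(target),
        "missing_in_target": sorted(source - target),      # rows that failed to load
        "unexpected_in_target": sorted(target - source),   # rows with no source origin
        "matched": source == target,
    }

report = reconcile(["a", "b", "c"], ["a", "b"])
print(report)
```

A pipeline run would typically emit such a report to monitoring and fail or alert when `matched` is false.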

Collaboration and Delivery

  • Translate business requirements into scalable technical solutions.
  • Partner with analytics and BI teams to deliver trusted datasets.
  • Participate in agile delivery cycles and code reviews.
Our Minimum Requirements...

We can’t live without…

  • Bachelor’s degree in Computer Science, Engineering, Data Science, or related field.
  • 3–5 years of hands-on experience in data engineering roles.
  • Strong expertise in:
    • Azure Data Factory
    • Azure Databricks
    • Azure Event Hubs
    • SQL (advanced level)
    • Python (preferred)
  • Experience designing data lakes and data warehouse solutions.
  • Strong understanding of ETL/ELT design patterns.
  • Experience working with Azure cloud services and security models.
  • Knowledge of data modeling (dimensional and normalized models).
  • Experience with Delta Lake and Spark optimization.
  • Familiarity with DevOps practices and CI/CD pipelines.
  • Exposure to event-driven architecture concepts.
  • Strong troubleshooting and performance tuning skills.
  • Excellent communication and stakeholder engagement abilities.
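As a toy illustration of the dimensional modelling listed above, here is a hypothetical star-schema fragment: a fact table of visits joined to a club dimension and aggregated by region. Club and region values are invented for the example.

```python
# Dimension table: surrogate key -> descriptive attributes (hypothetical data).
dim_club = {
    1: {"club_name": "Cape Town CBD", "region": "Western Cape"},
    2: {"club_name": "Sandton", "region": "Gauteng"},
}

# Fact table: one row per measurement, referencing the dimension by key.
fact_visits = [
    {"club_key": 1, "visits": 120},
    {"club_key": 2, "visits": 95},
    {"club_key": 1, "visits": 80},
]

def visits_by_region(facts, dim):
    """Join facts to the dimension and roll up the measure by region."""
    totals = {}
    for row in facts:
        region = dim[row["club_key"]]["region"]
        totals[region] = totals.get(region, 0) + row["visits"]
    return totals

print(visits_by_region(fact_visits, dim_club))
```

In a warehouse this is the same join-and-aggregate a BI tool issues against fact and dimension tables in SQL.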

Core Interpersonal Skills:

  • Emotional Intelligence – proven ability to anticipate the needs of others before they are voiced.
  • Active Listening – able to genuinely hear concerns, defuse tense situations, and ensure members and staff feel heard and valued.
  • Cultural Awareness and Fluency – understand international norms, etiquette and diverse backgrounds to provide a welcoming environment for our international and VIP members.
  • Empathy – be able to understand and be aware of someone else’s feelings, especially during stressful situations such as travel delays or billing disputes.

Operational & Leadership Skills Required:

  • Attention to Detail – noticing the small things.
  • Resilience and Composure – able to remain calm and optimistic under extreme pressure, during peak periods or system failures.
  • Problem Solving and Conflict Resolution – able to turn challenges into opportunities by thinking and acting quickly to resolve issues before they impact a member’s experience.
  • Adaptability – flexible enough to handle unpredictable shifts in processes, resource changes, last-minute VIP guest arrivals, or sudden changes in event plans.
  • Relatability – the capacity to connect with diverse stakeholders across all backgrounds.
  • Situational Awareness – constantly monitoring the environment to identify members who need support or help.
  • Curiosity – the ability to seek out and transform standard transactions into personalised stories and memorable experiences.
  • Time Management – the ability to meet stringent deadlines efficiently while maintaining a relaxed and unhurried demeanor for guests.
  • Digital Fluency – proficient in modern technology and systems.


23 Feb 2026; from: careers24.com
