Description:
Data Engineer (5 Positions) | Gauteng, Johannesburg Metro | Hybrid | 12-Month Contract
Job Overview
This role requires experienced Data Engineers to join a leading organisation in the banking and finance sector on a 12-month contract basis. The positions are based in Johannesburg, Gauteng, with a hybrid working arrangement. Candidates must possess strong expertise in Hadoop, Informatica, and Oracle technologies. The successful individuals will work full-time hours; however, the service includes a defined standby/on-call component to guarantee rapid response and continuity during critical incidents or outside core support hours. They will contribute to the development, optimisation, and maintenance of data infrastructure that supports business intelligence and analytics initiatives.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and workflows using Hadoop ecosystem tools and Informatica.
- Manage and optimise Oracle databases to ensure data integrity, availability, and performance.
- Collaborate with business analysts and data scientists to understand data requirements and translate them into technical solutions.
- Implement data ingestion, transformation, and validation processes to support data warehousing and analytics activities.
- Ensure data quality, consistency, and accuracy through thorough testing and validation procedures.
- Monitor and troubleshoot data processing workflows and resolve technical issues promptly.
- Document data architecture, pipeline designs, and operational processes clearly and comprehensively.
- Adhere to data governance policies, security standards, and compliance requirements relevant to the banking and finance industry.
- Support continuous improvement initiatives by identifying opportunities to optimise data handling and processing.
- Participate in agile teams and contribute to sprint planning, reviews, and retrospectives.
Required Qualifications
- A recognised bachelor's degree in Computer Science, Information Technology, Engineering, or a related discipline.
- Professional certifications in Hadoop, Informatica, Oracle (such as Oracle OCP), or related technologies are advantageous.
Experience
- Minimum of three years' experience as a Data Engineer or in a similar role within the banking and finance sector or a comparable industry.
- Proven experience in developing and managing data pipelines using Hadoop ecosystem components, including HDFS, MapReduce, Hive, or Spark.
- Strong background in Informatica PowerCenter or equivalent ETL tools.
- Demonstrable experience in Oracle database management, including performance tuning and query optimisation.
- Experience working in a hybrid or agile working environment is desirable.
Knowledge and Skills
- In-depth knowledge of big data technologies and data pipeline architecture.