Posted on: 11 September 2023
ID 885575

Senior Data Engineer

Reporting to: BI Team Lead

Want to be part of a team that's pioneering in our industry? Then look no further. Our mission is to unlock the power of our data by finding innovative ways to analyse, organise, integrate and visualise information.

By doing this we empower leaders across the business to make decisions quickly and confidently. You'll have access to cutting-edge technologies and techniques and be part of a wider team of high-performing analysts, data scientists, product managers and marketers.
Excited by the power of data? You could be just who we want on our team.

Purpose of the Role:

This role requires a motivated, energetic, inquisitive, and highly numerate person who has a solid technical base with an appreciation of the value that can be derived from well-designed data systems.
The successful candidate will be responsible for building and maintaining highly performant data pipelines that support data management and data analytics solutions over very large data sets.

The ideal candidate is a very experienced and highly technical data delivery specialist who enjoys optimising data systems and building them from the ground up. The Data Engineer role will be responsible for ingesting and maintaining large data sets in an optimal manner that is aligned to the reference architecture and solution design provided by the solution architects. The person must be self-managed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of playing a key role in modernising our data processing systems.

Duties include, but are not limited to:

Work as part of a multi-disciplinary, data-focussed team.
Develop greenfield projects using both our Azure cloud platform and on-premises setups.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Develop and maintain operational and analytical data systems.
Create robust ELT/ETL products for batch, micro-batch and near real-time data pipelines.
Develop and release data components in a DataOps environment.
Follow test-driven development practices.
Demo work to both technical and non-technical stakeholders.
Create documentation and training material for the solutions being delivered.
Guide and mentor junior team members.

This job description is not intended to be an exhaustive list of responsibilities. The job holder may be required to complete any other reasonable duties in order to achieve business objectives.

Essential Criteria:

Working knowledge of event-driven systems, message queuing, stream processing, and highly scalable big data pipelines and data stores (e.g., Confluent Cloud, Apache Kafka, Apache Flink, RabbitMQ, Azure Event Hubs, Kinesis, etc.).
Skilled and experienced in the Azure or AWS cloud platforms.
Advanced working knowledge of SQL (DDL, DML, JSON, XML) and extensive experience in dealing with large datasets and managing incremental batch loading methodologies (CDC, CT, CDO).
Advanced understanding of relational data structures including keys, constraints, and triggers.
Performance tuning and optimisation of RDBMS platforms.
Highly skilled and experienced in using relational or NoSQL database technologies (MS SQL Server, MongoDB, Cosmos DB, etc.) in an environment with high data volumes and many transactional systems.
Understand how to design and implement a conceptual, logical and physical data model that supports the needs of the organization.
Solid understanding and experience in data modelling, data management and governance methodologies.
Good understanding of data-related frameworks, methodologies, and patterns.
Experience in designing and developing ETL/ELT processes and pipelines for large data sets.
Strong analytical skills related to working with structured, semi-structured and unstructured data sets.
Proficiency in Python, Java, or Scala.
Experienced in implementing CI/CD pipelines through technologies such as GitLab, Azure DevOps, etc.
Experience deploying data systems in an Infrastructure-as-Code (IaC) manner, preferably using Terraform.
Experience supporting and working with cross-functional teams in a dynamic environment.
Communicate effectively with both technical and non-technical stakeholders.
Demonstrates consistent behaviour aligned to the company values and organisational culture.

Desirable Criteria:

Implementing security, disaster recovery, high availability, auditing, monitoring and alerting solutions in Azure.
Understanding of costing and optimising spend in cloud platforms such as Azure, AWS, GCP, etc.
Deep understanding of C# concepts and advanced programming techniques.
Understanding of advanced software development principles and patterns such as SOLID.
Deep understanding of Python concepts and advanced programming techniques.
Understanding of alternative data modelling approaches such as Data Vault.
Experience with Test-Driven or Business-Driven Development.
Experience with MS Power BI, Tableau or any similar data visualisation tool that enables self-service data analysis.

Person Specifications:

Adaptability
Ownership & Accountability
Initiating Action
Resilience
Team Orientation
Integrity
Innovation
