Working with us is not like working anywhere else, which is why we recruit people who will take a bolder, smarter approach to spot opportunities, solve problems and deliver results.
Our culture is all about encouraging you to think independently and to challenge convention to deliver the best result. That's how we continue to achieve extraordinary things in extraordinary locations.
Job Description
Purpose of the Role:
The Data Engineer will be responsible for designing, developing, and maintaining data pipelines and systems in the Azure cloud environment. The incumbent will work collaboratively within an integrated team of Data Engineers, Data Designers, Data Scientists, Database Administrators, DevOps Engineers, and Data Architects. Together, this integrated team ensures the smooth deployment and vigilant monitoring of data solutions.
Key Responsibilities
Data Engineering:
- Data acquisition & source management
- Develop and deploy efficient data pipelines using Azure Synapse Analytics, adhering to Medallion architecture principles and ensuring alignment with the semantic layer concept.
- Develop APIs & data feeds
- Extract, Transform, and Load (ETL) data from various sources (e.g., databases, APIs, flat files) into Azure data platforms.
- Data Quality
- Ensure data quality and consistency during the ETL process.
- Data Governance
- Implement and enforce data governance policies and security measures in compliance with industry standards and organizational requirements.
- Monitor and audit data access and usage to ensure data privacy and security.
- Testing
- Work closely with cross-functional teams to design and implement automated testing strategies for Synapse resources.
- Participate in release management processes and coordinate deployments across different environments
- Monitor and manage build and deployment processes, troubleshoot issues, and implement improvements
- Ensure seamless deployment and monitoring of data solutions.
- Implement and manage CI/CD pipelines to streamline the deployment process.
- Data Integration
- Contribute to integration standards for system-to-system requirements.
- Implement integrations with single sources of truth
- Data Storage, transfer & access
- Set and maintain standards for how data is stored, transferred, and accessed in the data lake
- Monitor Data Products
- Leverage Parquet partitioning strategies to enhance data retrieval efficiency for optimal performance.
- Employ and optimise orchestration tools to ensure data is available in the required storage on time, while minimizing excess workload
- Monitor and fine-tune system performance for optimal efficiency.
- Utilize automation with utility templates to streamline data processing workflows.
- Optimise data storage and retrieval so that performance and cost efficiency are maximized.
- Assist with developing, maintaining, and optimizing DevOps agent pipelines for Azure Synapse Analytics.
Qualifications and Experience
- Degree in Computer Science, Engineering, or related field.
- 5 to 8 years' relevant experience.
- Strong proficiency in Azure cloud services and tools, including Azure Synapse Analytics, and Medallion architecture.
- Minimum of 5 years hands-on experience in designing and implementing solutions using Common Data Model (CDM), demonstrating proficiency in data modelling and integration.
- Proficient in Scala or PySpark for developing and optimizing data processing pipelines.
- Strong command of SQL for querying and manipulating structured data.
- Familiarity with Azure Stream Analytics and Azure Event Hubs.
- Experience with source control management systems like Git.
- Solid understanding of both streaming and batch data processing techniques, with a demonstrated ability to work with real-time and batch data pipelines.
- Experience with data governance, security, and compliance practices.
- In-depth knowledge of Parquet partitioning strategies for optimizing data retrieval performance.
- Proven experience in automating data processing workflows using utility templates.
- Performance- and results-oriented
- Collaborative team member who is comfortable with duty rotation
- Effective communication abilities
- Proficiency in navigating change within an evolving environment
- Capability to perform effectively in high-pressure situations
- Leadership ability, with the adaptability to follow when necessary or step up to lead when required
- Knowledge and interest in computer systems and the latest technologies
- Travel: Regionally, minimum twice a year
- Location: Cape Town
- Place of work: Hybrid (at least 1 office day per week)
Follow us on LinkedIn for the latest news.
If you are already a First Quantum employee and have access to the First Quantum network, log into First Quantum MINE > Careers to apply internally for this opportunity.
If you are an employee without network access, contact your Site Recruiter.