- Amazing brand with cutting-edge technology
- Excellent global team collaboration
- Strong work-life balance with flexible hours
- Agile working environment
EXPERIENCE: 4-6 years of related working experience.
COMMENCEMENT: As soon as possible
Qualifications/Experience
- South African citizens/residents are preferred.
- Relevant IT / Business / Engineering Degree
- Candidates with one or more of the following certifications are preferred:
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator - Associate
- AWS Certified Developer - Associate
- AWS Certified Solutions Architect - Associate
- AWS Certified Solutions Architect - Professional
- HashiCorp Certified: Terraform Associate
- SQL - Oracle/PostgreSQL
- GROUP Cloud Data Hub (CDH) (Advantageous)
- Amazon QuickSight
- Business Intelligence (BI) Experience
- Demonstrated expertise in data modelling and Oracle SQL.
- Exceptional analytical skills for analysing large and complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with the ability to work independently and multi-task.
- Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
- Experience preparing specifications from which programs are designed, coded, tested and debugged.
- Strong organisational skills. Experience working with enterprise collaboration tools such as Confluence and JIRA.
- Experience developing technical documentation and artefacts.
- Experience working with Data Quality Tools such as Great Expectations.
- Experience developing and working with REST APIs is a bonus.
- Basic experience in Networking and troubleshooting network issues.
- Knowledge of the Agile Working Model.
- Building a network community with the required data stewards within Group IT.
- Self-driven data sourcing within that community.
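To give candidates a feel for the "thorough testing and data validation" work listed above, here is a minimal pure-Python sketch of rule-based data validation in the spirit of tools like Great Expectations. The column names and rules are hypothetical, not taken from the role itself.

```python
# Sketch of rule-based data validation: each rule is a named predicate
# applied to every row, and failures are collected for reporting.
# Column names ("id", "amount") are hypothetical examples.

def validate_rows(rows, rules):
    """Apply each rule to every row; return a list of failure messages."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if not rule(row):
                failures.append(f"row {i}: failed '{name}'")
    return failures

# Hypothetical expectations for a customer extract.
rules = {
    "id_not_null": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

rows = [
    {"id": 1, "amount": 100.0},   # passes both checks
    {"id": None, "amount": -5.0}, # fails both checks
]

failures = validate_rows(rows, rules)
```

In practice a tool such as Great Expectations would express these checks declaratively and produce structured validation reports, but the underlying idea is the same: data contracts enforced before transformations are trusted.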
- Advantageous Technical Skills / Technology
- Terraform
- Python 3.x
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- GROUP CDEC Blueprint
- Basic experience/understanding of AWS Components (in order of importance):
- Trino (distributed SQL queries)
- Glue (ETL Scripting)
- CloudWatch
- SNS
- Athena
- S3
- Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild / CodePipeline
- CloudFormation
- Technical data modelling and schema design (not drag and drop)
- Kafka
- AWS EMR
- Redshift
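As a flavour of the ETL work the AWS components above support, here is a minimal extract-transform-load sketch in plain Python. All names, the sample data, and the in-memory "sink" are hypothetical; a real AWS Glue job would read from and write to S3 rather than local strings and dicts.

```python
# Minimal ETL sketch: extract CSV records, aggregate them, load the
# result into a sink. Stand-in for the kind of step a Glue job performs.
import csv
import io

# Hypothetical raw input (stand-in for an object read from S3).
RAW_CSV = "region,sales\nEMEA,100\nEMEA,50\nAPAC,70\n"

def extract(text):
    """Parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Aggregate sales per region."""
    totals = {}
    for rec in records:
        totals[rec["region"]] = totals.get(rec["region"], 0) + int(rec["sales"])
    return totals

def load(totals, sink):
    """Write results to a sink (stand-in for a target table or bucket)."""
    sink.update(totals)

sink = {}
load(transform(extract(RAW_CSV)), sink)
# sink now holds {"EMEA": 150, "APAC": 70}
```

The same extract/transform/load shape carries over directly to PySpark on Glue or EMR, with DataFrames replacing the lists and dicts.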
- Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms.
- Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.
Please note that if you have not received a response from us within two weeks, your application was unsuccessful.
#isanqa #isanqajobs #Datascientist #Dataengineer #AWStechnologies #BigData #FuelledbyPassionIntegrityExcellence