- Amazing brand with cutting-edge technology
- Excellent teams with global collaboration
- Strong work-life balance with flexible hours
- Agile working environment
COMMENCEMENT: As soon as possible
Qualifications/Experience
- Master's degree in Computer Science, Software Engineering, or a related field, or a similar qualification
- At least 3 years' experience as a Data Engineer
- 3 years' experience as a Python Data Engineer
- 2 years' experience on AWS
- 3 years' experience with AWS services for data engineering, such as:
- ECS (Elastic Container Service)
- Athena
- Lambda
- Glue
- 3 years' experience with databases for data engineering, e.g. AWS DynamoDB (NoSQL)
- 3 years' experience in Python (Python 3.x) and PySpark for writing Spark applications (see the sketch after this list)
- 3 years' experience with scripting (PowerShell, Bash)
- Basic experience with AWS components such as:
- VPC / IAM
- CloudWatch (Metrics and Logs)
- Parameter Store
- Secrets Manager
- SNS
- S3
- Kinesis Streams
- 1 year's experience in data modelling (Oracle SQL)
- Analytical skills for analysing large and complex data sets
- Business Intelligence (BI) experience
- Data extraction and data preparation for Tableau
- Experience with Streaming (e.g., Kafka)
- Technical data modelling and schema design (not drag and drop)
- Agile experience (e.g., Scrum)
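To make the Python/PySpark requirement above concrete, here is a minimal, hypothetical sketch (not part of any actual GROUP codebase) of the kind of Spark application the role involves: reading raw CSV data from S3, applying a simple cleanup, and writing partitioned Parquet that services such as Glue and Athena can query. All bucket names, paths, and column names are placeholders.

```python
# Minimal, hypothetical PySpark job: raw CSV from S3 -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sample-logistics-ingest").getOrCreate()

# Read raw container events from a placeholder S3 landing path.
events = (
    spark.read
    .option("header", "true")
    .csv("s3://example-landing-bucket/container-events/")
)

# Basic cleansing: normalise a column name, parse timestamps, derive a partition date,
# and drop duplicate events.
cleaned = (
    events
    .withColumnRenamed("Container_ID", "container_id")
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["container_id", "event_ts"])
)

# Write partitioned Parquet to a placeholder curated layer, queryable via Glue/Athena.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/container-events/")
)

spark.stop()
```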
Responsibilities
Our main goal is to drive GROUP logistics towards becoming a data-driven organisation by ingesting data into our GROUP Cloud Data Hub and helping to build and provide data assets on the semantic layer for GROUP Transport logistics and container management.
- Drive and safeguard the ingestion of logistics data into the Cloud Data Hub (see the sketch after this list)
- Drive and safeguard Business Object provisioning across all relevant data lake layers
- Develop and operate the necessary features for data provisioning
- Run and operate all necessary cloud components for data provisioning
- Safeguard, improve, and automate all necessary AWS cloud components
- Technical safeguarding of all changes and used components concerning the ITLBHM AWS components across all participants (Hub, AG Data Engineer Department, 3rd-party supplier, ...), from code changes (pull request reviews) to component and architecture changes
- Maintain and functionally enhance the data provisioning for ITLBHM Business Objects
- Drive error analysis and resolution during data ingestion and data provisioning for ITLBHM Business Objects
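As a rough illustration of the ingest work described above, the following is a minimal, hypothetical AWS Lambda sketch: triggered by an S3 object-created event, it forwards a small notification record to a Kinesis stream so downstream provisioning jobs can pick it up. Stream, bucket, and key names are placeholders and do not reflect the actual GROUP Cloud Data Hub interfaces.

```python
# Minimal, hypothetical Lambda handler: S3 object-created event -> Kinesis notification.
import json
import os

import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = os.environ.get("INGEST_STREAM_NAME", "example-ingest-stream")


def handler(event, context):
    """Forward S3 object-created notifications to a Kinesis stream."""
    records = event.get("Records", [])
    for record in records:
        s3_info = record["s3"]
        payload = {
            "bucket": s3_info["bucket"]["name"],
            "key": s3_info["object"]["key"],
            "size": s3_info["object"].get("size"),
        }
        kinesis.put_record(
            StreamName=STREAM_NAME,
            Data=json.dumps(payload).encode("utf-8"),
            PartitionKey=payload["key"],
        )
    return {"forwarded": len(records)}
```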