GoldenRule is currently recruiting an AWS Data Cloud Engineer for a renowned insurance giant. The role is a contract position with a hybrid mode of working.
Skills And Experience
- Strong knowledge of AWS security groups, permissions, VPCs, roles, and services.
- Experience developing engineering applications in corporate environments.
- Demonstrated project development and leadership skills.
- A Master's degree in Computer Science or Software Engineering is preferred.
- Current understanding of best practices regarding system security measures.
- Advanced education in, and application of, business analysis techniques and strategy.
- Experience with software engineering, customer experience, and civil engineering is preferred.
- Experience working with cross-departmental teams to facilitate the orderly execution of a proposed project plan.
- Professional experience and a high-level understanding of working with various operating systems and their implications.
- Professional work experience in team building and project organization.
Responsibilities
- Coordinate and prioritize the work of the Developers and the Product Owner (PO).
- Assist with cross-team projects that involve development.
- Ensure AWS practices are followed with sound security and governance.
- Coordinate with the cloud COE team for any AWS account level security and access.
- Resolve any security findings raised by Group Security on the AWS accounts you are responsible for.
- Manage the CI/CD pipeline in line with the approved architecture.
- Ensure appropriate review gates are in place for deployment, and that sound practices and patterns are followed.
- Monitor code quality through peer code reviews.
- The Quality Assurance Team, consisting of the organisation's staff and/or other third parties, will conduct exploratory testing to identify defects.
- Meet the criteria specified in the Definition of Done as documented on the Feature Team's instance.
- Strong conceptual understanding of the context and business requirements; able to understand business needs and high-level designs, produce low-level design documents, and implement code in accordance with best practice.
- Ability to perform data quality checks in a methodical manner to understand how to accurately utilize client data.
- Expert-level programming skills on AWS are required to meet the challenges of advanced data manipulation, complex programming logic, and large data volumes.
- Ability to communicate results and methodology with the project team and clients.
- Ability to meet deadlines and thrive in an insurance environment.
- Provide solutions for data-driven applications involving large and complex data, including reconciliation and test cases.
- Understand the customer's business processes and the pain points that need attention.
- Understand and analyze source systems.
- Design the solution architecture for the entire flow, from source systems to the end reporting data marts.
- Design conceptual and physical data models for a global data warehouse in AWS (ETL versus ELT).
- Produce high-level and low-level designs for ETL components in AWS.
- Test prototypes and oversee handover to operational teams.
- Propose best practices/standards.
- Build monitoring and testing mechanisms for data transformations.
- Continuously improve AWS workloads in terms of scalability, reliability, and monitoring.
- Analyze and enhance the architecture of the current implementation.
- Manage personal delivery on projects and enhancements.
- Ensure personal service level agreement standards are met.
- Implement initiatives to improve application performance.
- Ensure quality of programming code.
- Translate business requirements into system requirements.
- Design and document robust, scalable solutions according to set standards.