Create and deliver data-driven solutions that add business value through the use of statistical models, machine learning algorithms, data mining and visualisation techniques.
Requirements
Experience & Educational Requirements:
- Degree in Computer Science, Computer Engineering, Actuarial Science, Statistics, Business Sciences or Business Mathematics. Other degrees or postgraduate qualifications with a statistical/modelling component and/or relevant experience will be considered.
- A minimum of 4 years' relevant experience in data science
- Proficiency in programming languages such as SQL and Python
- In-depth knowledge of machine learning modelling libraries (e.g. scikit-learn, PyTorch, TensorFlow)
- Experience with deep learning and natural language processing will be advantageous
- 2-5 years' data analysis experience within an insurance environment
- Familiarity with visualisation tools (e.g. Tableau, Power BI, Plotly)
- Experience in designing and building production systems in AWS, Google Cloud, or Azure
- Exposure to continuous integration/continuous deployment (CI/CD)
- Familiarity with version control systems (e.g. Git or AWS CodeCommit)
- Experience in the short-term insurance industry will be advantageous
INTERNAL PROCESS
- Identify and develop predictive and prescriptive models to enable better business decision making.
- Identify, understand and interpret data structures across various databases within the business to facilitate data analysis and continuous monitoring of activities.
- Explore data and processes for insights that help improve the business.
- Participate and/or lead discovery processes with business stakeholders to identify problems and opportunities that may be addressed with statistical modelling or machine learning.
- Identify and develop the hypothesis testing framework and modelling approach to address the business requirements.
- Make strategic recommendations on data collection and experimental design incorporating business requirements and knowledge of best practices.
- Prepare the data for analysis and modelling, which includes data cleaning, standardisation, transformation, dimension reduction and feature engineering.
- Identify and train suitable models/algorithms to discover patterns and make predictions.
- Extend existing code and develop custom code to implement statistical models, machine learning algorithms and data mining techniques for large datasets in a computationally efficient manner.
- Compare model performance, select the best algorithm and motivate this choice in a non-technical manner.
- Interpret results and translate findings into clear and actionable insights that can be easily validated with the project sponsor.
- Communicate findings to business stakeholders of varying skill levels and roles, presenting trends, correlations and patterns found in complex datasets in a manner that clearly and concisely conveys meaningful insights.
- Assist business users in the use of the models and interpretation of model output.
- Deploy ML models to production on cloud platforms such as AWS, Google Cloud, or Azure.
- Assist with monitoring and reporting on model accuracy after models have been embedded in operations.
- Develop and maintain productive and collaborative working relationships with peers and stakeholders.
- Positively influence and participate in change initiatives.
- Continuously develop one's own professional, industry and legislative knowledge.
- Contribute to continuous innovation through the development, sharing and implementation of new ideas.