If you are looking to join a team where your opinion is valued, your contributions are noticed, and you enjoy working with fun and talented people from all over the world, then this is the place for you.
If you have a desire to work in an organisation that is:
- Passionate about its people
- Focused on delivering the very best tech to our customers
- Offering the flexibility to work how and where you are most successful
- Obsessed with our customers' success
- The leading SaaS platform to automate partnerships - affiliate, influencer, technology partners, and more!
- Entrepreneurial in spirit with a culture that rewards collaboration and curiosity
- Obsessed with making a difference in business and to the wider community
Your Role at Impact:
The Senior Analytics Engineer is a technical data professional, able to manage, process and analyse large datasets using big data technologies such as Apache Spark, SingleStore and BigQuery, and to visualise and report on these datasets. The ideal candidate will be proficient in designing and implementing efficient data workflows that move, transform, aggregate and enrich data from various sources into a centralised data warehouse and purpose-built data marts, ensuring internal code management and data quality standards are adhered to, while also providing users with access to standard reports, rich visualisations and other analytical data assets.
What You'll Do:
- Design, develop and maintain data models, data marts and analytical data stores
- Work closely with Subject Matter Experts (SMEs), Business and Technical stakeholders to define and document business logic and transformation rules to be used in data load jobs and (materialised) analytical views
- Build and maintain data load and transformation jobs to populate data lakes, data marts and data warehouses following the Extract-Load-Transform (ELT) and Extract-Transform-Load (ETL) paradigms as appropriate
- Create and maintain reusable data assets ready for consumption by machine learning models, data visualisation tools and data analysts
- Create and maintain entity-relationship diagrams (ERDs), data dictionaries and data flow diagrams
- Create and maintain table and column metadata
- Manage code releases, deployment cycles and the associated change management processes
- Build and maintain standard reports for internal stakeholders
- Contribute to the development and expansion of common utility libraries used by data teams
- Maintain high standards of quality, integrity and accuracy in produced data assets
- Troubleshoot and resolve any issues that arise relating to data assets in the production environment in a timely manner
- Optimise total system performance related to ETL/ELT workloads and analytical queries, ensuring efficient use of compute resources and stability of data systems
- Optimise code related to ELT/ETL workloads for simplicity, reusability and efficiency, in line with best practice
- Conduct periodic integrity checks on production data assets
- Safeguard sensitive company data
- Work with the data Quality Assurance (QA) function to extend and enhance programmatic validation of production data assets
- Stay up-to-date with the latest big data technologies and best practices
- Automate manual data load, data transformation and data management processes
- Review and sign off on code changes
- Mentor and train junior colleagues
- Actively participate in the hiring process and performance management of team members
What You Have:
- Bachelor's or Master's degree in Computer Science, Data Science or related field
- 6+ years of experience in data pipeline development and data warehousing using big data technologies such as Apache Spark, Google DataFlow, SingleStore, Impala, Kudu and/or BigQuery
- Proven track record in developing enterprise-level data marts
- Experience with Databricks advantageous
- Experience with dbt advantageous
- Experience with Google Cloud Platform and BigQuery advantageous
- Strong SQL development experience required
- Strong Python programming skills required
- Strong knowledge of relational database management systems
- Strong data modelling and schema design experience
- Experience with workflow management tools such as Airflow, Luigi or Oozie advantageous
- Knowledge of data integration patterns, data load patterns and best practices required
- Knowledge of software development best practices and version control tools
- Strong analytical and problem-solving skills
- Strong written and verbal communication skills
- Good leadership and workload management skills and experience advantageous
- Ability to work in a team environment and collaborate with internal stakeholders
- Affiliate & Partnerships Industry Fundamentals Certification by PXA
- Unlimited PTO policy
  - Take the time off that you need. We are truly committed to a positive work-life balance, recognising that it is important to be happy and fulfilled in both
- Training & Development
  - Learning the advanced partnership automation products
- Medical Aid and Provident Fund
  - Group schemes with Discovery & Bonitas for medical aid
  - Group scheme with Momentum for provident fund
- Stock Options
  - 3-year vesting schedule pending Board approval
- Internet Allowance
- Flexible work hours
- Casual work environment
This position will be based in Cape Town post Covid-19; candidates interested in relocating are welcome to apply.