
Senior Analytics Engineer


Cape Town, ZA
  • Job Type: Full-Time
  • Function: Data Science
  • Industry: Cloud Business Apps
  • Post Date: 05/13/2024
  • Company Address: 223 E. De La Guerra St, Santa Barbara, CA, 93101

About

Impact, the world’s leading partnership management platform, is transforming the way businesses manage and optimize all types of partnerships—including traditional rewards affiliates, influencers, commerce content publishers, B2B, and more.

Job Description

Our Company:

At Impact, we are passionate about our people and our technology, and we are obsessed with customer success. Working together enables us to grow rapidly, win, and serve the largest brands in the world. We use cutting-edge technology to solve real-world problems for our clients and continue to pull ahead of the pack as the leading SaaS platform for businesses to automate their partnerships and grow their revenue like never before. We have an entrepreneurial spirit and a culture where ambition and curiosity are rewarded. If you are looking to join a team where your opinion is valued, your contributions are noticed, and you enjoy working with fun and talented people from all over the world, then this is the place for you! The company’s powerful, purpose-built platform makes it easy for businesses to create, manage, and scale an ecosystem of partnerships with the brands and communities that customers trust to make purchases, get information, and entertain themselves at home, at work, or on the go. To learn more about how Impact’s technology platform and partnerships marketplace is driving revenue growth for global enterprise brands such as Walmart, Uber, Shopify, Lenovo, L’Oreal, Fanatics and Levi’s, visit


Your Role at Impact:

The Senior Analytics Engineer is a technical data professional able to manage, process and analyse large datasets using big data technologies such as Apache Spark, SingleStore and BigQuery, and to visualise and report on these datasets. The ideal candidate will be proficient in designing and implementing efficient data workflows that move, transform, aggregate and enrich data from various sources into a centralised data warehouse and purpose-built data marts, ensuring internal code management and data quality standards are adhered to, while also providing users access to standard reports, rich visualisations and other analytical data assets.

The position requires a strong analytical mindset, attention to detail, programming skills and experience with big data technologies. This is a highly collaborative role as the engineer needs to engage with Subject Matter Experts to implement business logic, understand source data structures and ensure data outputs are accurate, fit-for-purpose, pass quality assurance and provide value to the business.

What You'll Do:

  • Design, develop and maintain data models, data marts and analytical data stores
  • Work closely with Subject Matter Experts (SMEs), Business and Technical stakeholders to define and document business logic and transformation rules to be used in data load jobs and (materialised) analytical views
  • Build and maintain data load and transformation jobs to populate data lakes, data marts and data warehouses following the Extract-Load-Transform (ELT) and Extract-Transform-Load (ETL) paradigms as appropriate
  • Create and maintain reusable data assets ready for consumption by machine learning models, data visualisation tools and data analysts
  • Create and maintain entity-relationship diagrams (ERDs), data dictionaries and data flow diagrams
  • Create and maintain table and column metadata
  • Manage code releases, deployment cycles and the associated change management processes
  • Build and maintain standard reports for internal stakeholders
  • Contribute to the development and expansion of common utility libraries used by data teams
  • Maintain high standards of quality, integrity and accuracy in produced data assets
  • Troubleshoot and resolve any issues that arise relating to data assets in the production environment in a timely manner
  • Optimise total system performance related to ETL/ELT workloads and analytical queries, ensuring efficient use of compute resources and stability of data systems
  • Optimise code related to ELT/ETL workloads for simplicity, reusability and efficiency, in line with best practice
  • Conduct periodic integrity checks on productionalized data assets
  • Safeguard sensitive company data
  • Work with the data Quality Assurance (QA) function to extend and enhance programmatic validation of productionalized data assets
  • Stay up-to-date with the latest big data technologies and best practices
  • Automate manual data load, data transformation and data management processes
  • Review and sign off on code changes
  • Mentor and train junior colleagues
  • Actively participate in the hiring process and performance management of team members

What You Have:

  • Bachelor's or Master's degree in Computer Science, Data Science or related field
  • 6+ years of experience in data pipeline development and data warehousing using big data technologies such as Apache Spark, Google DataFlow, SingleStore, Impala, Kudu and/or BigQuery
  • Proven track record in developing enterprise-level data marts
  • Experience with Databricks advantageous
  • Experience with dbt advantageous
  • Experience with Google Cloud Platform and BigQuery advantageous
  • Strong SQL development experience required
  • Strong Python programming skills required
  • Strong knowledge of relational database management systems
  • Strong data modelling and schema design experience
  • Experience with workflow management tools such as Airflow, Luigi or Oozie advantageous
  • Knowledge of data integration patterns, data load patterns and best practices required
  • Knowledge of software development best practices and version control tools
  • Strong analytical and problem-solving skills
  • Strong written and verbal communication skills
  • Good leadership and workload management skills and experience advantageous
  • Ability to work in a team environment and collaborate with internal stakeholders

Nice to have:

  • Affiliate & Partnerships Industry Fundamentals Certification by PXA

Benefits:

  • Casual work environment, including working from home
  • Flexible work hours
  • Unlimited PTO policy
    • Take the time off that you need. We are truly committed to a positive work-life balance, recognising that it is important to be happy and fulfilled in both
  • Primary Caregiver Leave
  • Training & Development
    • Learning the advanced partnership automation products
  • Medical Aid and Provident Fund 
    • Group schemes with Discovery & Bonitas for medical aid
    • Group scheme with Momentum for provident fund
  • Restricted Stock Units
    • 3-year vesting schedule pending Board approval
  • Internet Allowance
  • Fitness club fee reimbursements
  • Technology Stipend

Impact is proud to be an equal opportunity workplace. All employees and applicants for employment shall be given fair treatment and equal employment opportunity regardless of their race, ethnicity or ancestry, color or caste, religion or belief, age, sex (including gender identity, gender reassignment, sexual orientation, pregnancy/maternity), national origin, weight, neurodivergence, disability, marital and civil partnership status, caregiving status, veteran status, genetic information, political affiliation, or other prohibited non-merit factors.