Analytics Engineer

At Thread we’ve set out to rethink clothes shopping for the digital age. By combining expert stylists, machine learning algorithms and a marketplace with one of the largest fashion ranges in the world, we’ve created a truly unique online platform that gives everyone a highly personalised selection of the perfect clothes just for them – in their size, budget and taste. It’s your own personal clothes store. 

Over 1 million people and counting use Thread. We’ve recently launched womenswear which is growing faster than our menswear business and, for 25% of our customers, we’ve become the only way they shop. 

There’s a $10bn+ opportunity globally to build the Spotify of retail, and we’re leading the way. We believe having an intelligent assistant in your pocket is how the majority of people will shop in the future, and we’re looking for a talented Analytics Engineer who wants to help millions of people discover its value.

The Role

We’re hiring an analytics engineer to be the first dedicated engineer working on Thread’s analytics infrastructure. This person will bring engineering best practices to our dbt, BigQuery and Looker projects. They will maintain and develop our core data models and integrate new data sources, cleaning and transforming them so they’re ready for end users. 

We believe teams work best when they have access to high-quality data they can trust. We’re looking for someone to focus exclusively on improving data access and data quality. The role will report to our Head of Data, Ed Snelson, and work collaboratively with the engineering team and the wider Data Science team of Analysts and Data Scientists. 

Key Responsibilities:

  • Manage and improve analytics infrastructure at Thread, including our dbt, BigQuery and Looker projects. 
  • Design and build internal analytics solutions through discussions with teams, discovering and understanding their unique needs and requirements. 
  • Help guide analytics engineering priorities at Thread, understanding the biggest levers we have in improving data quality and access. 
  • Implement new Product Analytics tools such as Amplitude and Segment. 
  • Help drive data quality at the source: rather than working around quality issues downstream, you will advocate for fixing them where the data is produced. 
  • Raise the standards of confidence in our data through data quality tests and monitoring. 
  • Implement metrics with guidance from stakeholders, creating a single source of truth. 
  • Diagnose and solve issues related to end-user query performance and data quality. 

Required Experience:

  • Hands-on experience building analytics solutions with dbt, Looker and cloud data warehouses using the ELT paradigm. 
  • Very strong SQL and data modeling skills. 
  • Experience with cloud data warehousing and database design.
  • Experience with modern data ingestion tools such as Fivetran or Stitch. 
  • Strong software development process skills (git, CI/CD) in an analytics context. 
  • Experience with scripting languages to solve problems. 
  • Experience with workflow orchestration tools such as Airflow or Luigi. 


If this role sounds exciting to you, please apply below. If you’re unsure, please still get in touch—we welcome applicants from different backgrounds and would love to hear from you.