Big Data Engineer 🕺


AB Tasty empowers teams to build optimization roadmaps that delight users and yield maximum results. By analyzing thousands of customer journeys, AB Tasty helps companies better understand user behavior, identify and correct points of friction on their websites, and create personalized web experiences—all in the service of increasing conversion rates.
AB Tasty isn’t about helping brands push a hard sell. They’re about helping them sell better by creating more positive consumer and user experiences across digital properties.
They must be on to something, since they are proud to say they have:

  • 900+ customers, including Le Bon Coin, Cdiscount, Carrefour …
  • 240+ employees in 6 countries on 3 continents (Americas, Europe, Asia)
  • Raised $24 million to grow globally
  • Bonus: they're nice, too.

2 prerequisites if you want to join the AB Tasty team:

  • Be a genuinely kind person! (they take this one seriously, it's half the battle in the recruitment process)
  • Have a passion, whatever it may be :).

Job Description

The data team, composed of 7 members who work closely with both the DevOps and Data Science teams, is mainly in charge of developing and monitoring the data collection pipeline. This pipeline, which processes a few terabytes of data per day, is deployed on a Google Cloud Platform environment and is critical to AB Tasty and Flagship. Several GCP environments are available to ensure proper development and deployment of features, as well as data modeling and documentation.

You will work on several major areas:

  • Streaming data pipeline (Dataflow, Pub/Sub, Bigtable, BigQuery):
    • Performance / cost optimization

  • Data delivery:
    • SQL / API / architecture optimization / performance

  • Data QA:
    • Ensure the quality of collected data, both from a technical standpoint (data consistency) and from a business-analysis standpoint for the client (time-series analysis)

  • Innovation / research:
    • Test and learn new technologies
    • Propose innovative and valuable features for us and our clients

  • Internal analysis:
    • For example, analyzing how our clients use the platform in order to optimize the way we pre-process results

Besides the projects planned annually in the roadmap, you will have other tasks related to the continuous improvement of the platform:

  • Keep the versions of the technologies and tools we use up to date
  • Improve platform monitoring for each service and pipeline KPI
  • Write public and internal documentation
What we're looking for

  • Technical skills:
    • Strong knowledge of at least one of the following technologies: Apache Beam, Spark, or Hadoop
    • Knowledge of Java, Python, and/or Golang
    • At least 2 years of experience as a (Big) Data Engineer
    • Experience with a cloud platform is a plus
    • Good level of English (written and spoken)
    • Knowledge of ClickHouse
  • Soft skills required:
    • You understand the CRO and/or web analytics business
    • You know what it means to work within a team and with other teams
    • You know how to put yourself in the client's shoes when analyzing their needs
Additional Information

    • Contract Type: Full-Time
    • Location: Paris, France (75003)
    • Experience: > 2 years