CDI - Data Engineer - Big Data - Blockchain

  • Paris
  • Full-Time
  • Start Date: 28 July 2019
  • Apply Now


Nyctale - Business Intelligence and AI SaaS platform to analyze, measure, and control blockchain network activity.
Nyctale is an innovative startup in the fields of Data Science and Blockchain technologies. We develop advanced analytics tools to analyze usage, measure value, and track money flows on these new transactional networks. We leverage our expertise in Machine Learning and Deep Learning applied to graph data to deliver the interpretation tools needed to understand blockchain systems. Our SaaS platform targets application providers and financial institutions. Nyctale supports the development of decentralized applications by providing strategic insights to key players in this emerging and promising industry.

Job Description

Data is the fuel of Nyctale, and its quality is essential. We are therefore looking for a data engineer to ensure that our data is properly administered, up-to-date, well documented, and easily queryable. Once on our team, you will:

  • Implement and maintain data extraction, processing, and storage processes in large-scale data systems (data pipelines, data warehouses) for internal needs, customer analytics, and reporting features;
  • Work closely with Data Scientists and help them implement and maintain machine learning systems (feature generation, training, evaluation, publishing);
  • Design, build, and launch new data models and datasets in production;
  • Define and manage SLAs for datasets across the different storage layers to ensure performance meets needs;
  • Optimize the performance of slow queries and algorithms;
  • Define and manage the overall schedule and availability of all datasets;
  • Enjoy a fast-paced work environment.

Preferred Experience

Job Requirements:

  • Degree in Computer Science or a related field;
  • Proficient with Python and AWS (EMR, RDS, S3, Athena...);
  • Experience in custom ETL design, implementation, and maintenance;
  • Proficient and comfortable with Spark and SQL;
  • Deep familiarity with distributed processing (MapReduce) and, more generally, with big data technologies and unstructured data;
  • Experience configuring and maintaining distributed computing systems such as Hadoop or Spark;
  • Innately curious and organized, able to analyze data, identify anomalies in deliverables, and propose solutions to address them;
  • Mastery of our environment: Linux;
  • Mastery of our languages: Python, Bash, Scala;
  • Mastery of everyday tools: Docker, Git;
  • Ability to understand business processes and how to measure and improve them in an organization;
  • Understanding of blockchain concepts is, obviously, a huge plus;
  • Previous professional experience is highly valued.

Recruitment Process

An interview with each of the founders (one of whom is technical) and a meeting with the team over a beer (or a non-alcoholic beverage of your choice) at La Felicità, next to Station F.

Additional Information

  • Contract Type: Full-Time
  • Start Date: 28 July 2019
  • Location: Paris, France (75013)
  • Education Level: Master's Degree
  • Experience: > 6 months