Job Search

  • James Chase is looking for a Data Engineering Manager to lead a small, talented team in scaling their product catalogue ingestion platform. The role involves mentoring a high-performing team, acting as a subject matter expert on data initiatives, and ensuring data quality through automated testing. The ideal candidate should have experience with large-scale data processing and be proficient in Python, SQL, and cloud platforms.

  • Immersum is looking for a Lead Data Engineer to own the design and delivery of scalable, high-performing data pipelines. This role involves leading a team of 5 Data Engineers, developing their skills and ensuring best practices. The candidate will need strong leadership experience and expertise with AWS, Snowflake, Airflow, and DBT.

  • Tenth Revolution Group is looking for a Data Engineer specializing in Databricks and Snowflake. You will be involved in high-value data projects from design through to production, particularly in cloud environments such as Azure, AWS, or GCP. Responsibilities include designing scalable data architectures, developing data ingestion and transformation pipelines (ETL/ELT), and optimizing data processing performance.

  • Glocomms is looking for a Data Engineer to be a part of a dynamic team. The selected candidate will work on a critical business project, focusing on the ingestion, transformation, storage, and provisioning of data within the data lake. This role involves implementing Snowflake as a cloud-based data platform, alongside development, build projects, and operational maintenance tasks. Candidates should have a minimum of 3 years of experience in data engineering, strong experience with AWS Cloud Architecture, and proficiency in Spark and Scala. French language skills are required.

  • Signify Technology is looking for a Senior Data Engineer to join a dynamic data team. You'll be responsible for designing and maintaining scalable data infrastructure, building robust pipelines, and optimizing data workflows to support analytics, product development, and business intelligence efforts. Proficiency in Python and SQL is required, along with experience with modern data stack tools like Spark and Airflow.

  • Synchrone Fr is looking for a Cloud Data Engineer (M/F) to join ambitious digital transformation projects. As a Data Engineer, you will be responsible for designing and deploying robust ETL/ELT pipelines on cloud platforms such as AWS, GCP, and Azure. You will also optimize real-time and batch data processing using tools like Apache Spark and Kubernetes, ensuring data quality and governance while collaborating with Data Science and DevOps teams.

  • Eneco is looking for a Data Science Forecaster to contribute to the development of forecasting products that meet the needs of stakeholders in the energy sector. You will play a critical role in researching and improving proprietary power production forecasting models for renewable energy sources. The focus will be on enhancing forecast accuracy, collaborating with various teams, and providing actionable insights that drive trading decisions.

  • L’AGENT X is looking for an experienced Data Engineer to strengthen its Data & AI team. The role involves designing, implementing, and optimizing models to store and process data, maintaining data quality, developing distributed data processing algorithms, and deploying data pipelines. Candidates should have at least 5 years of experience in critical environments, be proficient in Big Data tools such as DBT, Apache Hadoop, and Spark, and have expertise in Snowflake.

  • twentyAI is looking for a DevOps Engineer to support a Snowflake-based data platform. You'll collaborate with data engineers and cloud architects to deliver scalable, secure cloud infrastructure for real-time trading and analytics in a hybrid working environment. Key responsibilities include building CI/CD pipelines, automating AWS infrastructure with Terraform, deploying containers using Docker and Kubernetes, ensuring observability and security, and integrating various tools. Required skills include 3+ years in DevOps roles, expertise in AWS, strong knowledge of Terraform and container technologies, production experience with Snowflake, and proficiency in scripting languages such as Python or Bash. Strong team collaboration and an Agile mindset are essential. Familiarity with commodity trading or finance, as well as data governance tools, is a plus.