Job Search

  • Published on

    Futures is looking for a Data Engineer to join a long-term programme within the National Data Exploitation Capability (NDEC). This role will focus on supporting both national and international investigations into serious and organised crime. The successful candidate will be responsible for designing, building, and maintaining data pipelines, working with a strong tech stack including Microsoft SQL Server, Python, Apache Airflow, and Elastic/OpenSearch.

  • Published on

    Tandym Group is looking for a DBT Developer to enhance and develop DBT models as part of a health services network. Responsibilities include remodeling existing DBT models to improve data processing, developing new ETL pipelines, and working with various healthcare payer data. Candidates should possess experience with DBT, Python, SQL scripting, and a strong understanding of ETL pipeline management.

  • Published on

    Harnham is looking for a Machine Learning / GenAI Engineer to help build scalable, production-grade systems that power their sales and customer success operations. The role sits within a mature data science and engineering team and focuses on operational ML and GenAI solutions with tangible business impact: not just notebooks, but fully integrated tools that deliver value across the funnel. The candidate will need strong hands-on Python experience, familiarity with tools like DBT and Airflow, and the ability to build ML solutions that scale across large user bases.

  • Published on

    Tandym Group is looking for a DBT Developer to develop, deploy, and enhance DBT models focused on data processing and standardization in the health services network. The successful candidate will remodel existing DBT models, develop new ETL pipelines, and work with payer data including claims and eligibility. Proficiency in DBT, Python, and SQL scripting is essential, along with a strong understanding of ETL pipeline management and Apache Airflow.

  • Published on

    Tandym Group is looking for a DBT Developer to join their health services network. In this full-time remote role, the DBT Developer will enhance and develop DBT models for better data processing and standardization. Responsibilities include remodeling existing DBT models, developing new pipelines for payers, and ensuring data from multiple sources is seamlessly integrated into the data warehouse.

  • Published on

    L’AGENT X is looking for an experienced Data Engineer to strengthen its DATA & AI team. The role involves designing, implementing, and optimizing models to store and process data, maintaining data quality, developing distributed data processing algorithms, and deploying data pipelines. Candidates should have at least 5 years of experience in critical environments and be proficient in Big Data tools such as DBT, Apache Hadoop, and Spark, with expertise in Snowflake.

  • Published on

    Iris Software Inc. is looking for a Data Engineer with expertise in designing, building, and optimizing scalable data pipelines. In this hybrid role, you will work 3 days onsite per week in Toronto, collaborating closely with cross-functional teams, including Data Scientists and Analysts, to ensure efficient data flow across systems. Key responsibilities include developing data processing pipelines using PySpark and Databricks, orchestrating workflows with Apache Airflow, and ensuring compliance with data security standards. Candidates should have a degree in Computer Science or a related field, alongside 7 years of experience in data engineering, with a strong understanding of data governance and excellent communication skills.

  • Published on

    Hays is looking for a Senior Machine Learning Engineer to lead the development of next-generation personalization systems. Responsibilities include architecting advanced models for recommendations, optimizing generative AI applications, and designing scalable ML infrastructure. Required skills are expertise in machine learning frameworks, GCP services, and deep learning architectures.

  • Published on

    searchlink experts is looking for a Senior Software Solutions / Data Engineer (m/f/d) to join their team in Stuttgart. The successful candidate will be responsible for designing, developing, and maintaining a full-stack application, specifically a Data Analytics Platform, within a cloud infrastructure primarily using AWS. Key tasks include collaborating with product owners and offshore developers to ensure timely feature delivery, establishing code quality standards, and managing infrastructure effectively. Required qualifications include proficiency in Python and SQL, strong AWS experience, and a solid understanding of networking architecture, complemented by preferred skills in JavaScript and Docker.

  • Published on

    La Fosse is looking for a Lead Data Engineer to oversee the stability and enhancement of their cloud-native data platform. Responsibilities include leading a skilled team, managing complex data pipelines, and driving improvements in architecture and automation. Experience with Google Cloud Platform and Apache Airflow is valued, with familiarity in dbt, Terraform, and Kubernetes a plus.