Job Search

  • Published on

    iVedha Inc. is looking for an Intermediate Airflow Developer with over 2 years of experience to help transition our existing Windows Scheduler jobs to Apache Airflow DAGs. You will play a critical role in modernizing and optimizing our task automation processes by converting existing jobs into efficient, manageable, and scalable workflows in Airflow. Your responsibilities will include developing and optimizing DAGs, implementing security measures, and designing data pipelines.

  • Published on

    Samba TV is looking for a Data Engineer responsible for developing scalable, high-performance data pipelines and infrastructure that power Samba TV's analytics and insights. The ideal candidate will play a critical role in designing and implementing architectural improvements, ensuring best practices, and optimizing data workflows while collaborating closely with Data Science, Analytics, and Product teams.

  • Published on

    Samba TV is looking for a Tech Lead - Data Engineering who will be responsible for leading the development of scalable, high-performance data pipelines that power Samba TV's analytics and insights. In this role, you will design and implement architectural improvements, mentor a team of engineers, and collaborate with Data Science, Analytics, and Product teams to deliver robust data solutions that drive business impact. The ideal candidate will have 5+ years of experience in Data Engineering or Software Engineering, expertise in technologies like Apache Airflow and Databricks, and strong problem-solving skills.

  • Published on

    S-Communication Services GmbH is looking for a Senior Data Engineer (m/w/d) to enhance our cloud-based data platform for Business Intelligence and Data Science. Responsibilities include connecting new data sources to the Data Warehouse, ensuring data quality, designing and implementing data architectures and ETL processes, and working independently on prototypes to inform technology decisions. The ideal candidate should have a degree in Computer Science (or a comparable field), experience in a similar position, proficiency in modern software development with Python, familiarity with database systems like Google BigQuery, and experience with cloud environments such as Google Cloud Platform.

  • Published on

    (Junior) Data Engineer (m/w/d)

    BFS health finance GmbH, Dortmund, DE

    BFS health finance is looking for a (Junior) Data Engineer (m/w/d) who will play a crucial role in designing and implementing our data warehouse. You will be responsible for data quality and data integrity across the data warehouse by developing and implementing a robust, scalable data architecture (in-memory database). You will design and optimize data pipelines and ETL processes to extract, transform, and load data from various sources, identify ways to improve data processing speed and efficiency, and administer a user management system with a rights-and-roles concept. A degree in informatics or data science, or comparable training, along with relevant professional experience, is required. Strong knowledge of databases, data modeling, and data integration, together with experience in ETL processes and data pipelines, is essential. Communication skills in both German and English are necessary, along with proficiency in SQL, Python, Apache Airflow, and Kafka.