Job Search

  • Published on

    IT Chapter is looking for a Data Engineer to lead data delivery on a data and analytics platform. The role involves designing and implementing data pipelines and collaborating with stakeholders to ensure data governance standards are met. Required skills include programming in Spark with Scala and experience with Azure data architecture.
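    To give a flavor of the pipeline work this role describes, below is a minimal sketch of a batch pipeline. The listing asks for Spark with Scala; PySpark is used here purely for illustration, and all paths, schemas, and column names are hypothetical.

    ```python
    # Minimal batch-pipeline sketch over a hypothetical sales dataset.
    # Source/sink paths and column names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sales-pipeline").getOrCreate()

    # Ingest raw CSV data (hypothetical Azure storage path).
    raw = spark.read.option("header", True).csv(
        "abfss://raw@account.dfs.core.windows.net/sales/"
    )

    # Basic cleansing and typing.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
    )

    # Aggregate to a reporting-friendly shape.
    daily = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

    # Persist as Parquet for downstream analytics (hypothetical sink path).
    daily.write.mode("overwrite").parquet(
        "abfss://curated@account.dfs.core.windows.net/daily_sales/"
    )
    ```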

  • Published on

    Cherry Pick is looking for a Data Engineer to work with a client in the aerospace sector. The Data Engineer will operate in a Microsoft-oriented environment focused on data structuring, integration, and traceability, collaborating closely with both business and technical teams to build robust, reliable, and scalable pipelines. Responsibilities include designing and implementing data integration pipelines using Microsoft Fabric and Azure Synapse, constructing data lineage, and implementing the Medallion architecture (bronze/silver/gold). The role also involves managing versions and tickets via DevOps tools, moving pipelines from development to production, debugging, and continuously improving workflows. Candidates should have senior-level experience (minimum 6-8 years), mastery of Microsoft Fabric, Synapse, Spark Notebooks, Data Lineage, and DevOps, and a good understanding of Agile methodologies (Scrum). Fluency in professional English is required, as the work environment is 100% English-speaking; proactivity, solution orientation, autonomy, and a sense of responsibility are essential. Remote work is possible, with 3 to 4 trips per year (meetings or workshops) covered by the company. The position starts ASAP and is intended as a long-term engagement.
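    The Medallion architecture mentioned above stages data in three layers: bronze (raw landed data), silver (cleansed and conformed data), and gold (business-ready aggregates). Below is a minimal PySpark sketch of that flow, assuming a Delta-enabled lakehouse such as a Fabric notebook environment; the table paths and column names are hypothetical.

    ```python
    # Minimal Medallion (bronze/silver/gold) sketch in PySpark.
    # Assumes Delta Lake is available; paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

    # Bronze: land the raw source data unchanged.
    bronze = spark.read.json("Files/landing/orders/")
    bronze.write.mode("append").format("delta").save("Tables/bronze_orders")

    # Silver: deduplicate, enforce types, drop invalid rows.
    silver = (
        spark.read.format("delta").load("Tables/bronze_orders")
             .dropDuplicates(["order_id"])
             .withColumn("order_ts", F.to_timestamp("order_ts"))
             .filter(F.col("order_id").isNotNull())
    )
    silver.write.mode("overwrite").format("delta").save("Tables/silver_orders")

    # Gold: aggregate into a business-ready reporting table.
    gold = (
        spark.read.format("delta").load("Tables/silver_orders")
             .groupBy(F.to_date("order_ts").alias("order_date"))
             .agg(F.count("order_id").alias("order_count"))
    )
    gold.write.mode("overwrite").format("delta").save("Tables/gold_daily_orders")
    ```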

  • Published on

    S-Communication Services GmbH is looking for a Senior Data Engineer (m/f/d) to enhance its cloud-based data platform for Business Intelligence and Data Science. Responsibilities include connecting new data sources to the Data Warehouse, ensuring data quality, designing and implementing data architectures and ETL processes, and independently building prototypes to inform technology decisions. The ideal candidate has a degree in Computer Science (or a comparable field) and experience in a similar position, is proficient in modern software development with Python, is familiar with database systems such as Google BigQuery, and has experience with cloud environments such as Google Cloud Platform.
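    As an illustration of the ETL work described, here is a minimal Python sketch that loads a prepared file into BigQuery using the google-cloud-bigquery client; the project, dataset, table, and file names are hypothetical.

    ```python
    # Minimal sketch of loading a cleaned dataset into BigQuery.
    # Project, dataset, table, and file names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Configure the load: CSV with a header row, schema auto-detected,
    # replacing any existing table contents.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    with open("daily_metrics.csv", "rb") as source_file:
        load_job = client.load_table_from_file(
            source_file,
            "example-project.analytics.daily_metrics",
            job_config=job_config,
        )

    load_job.result()  # Wait for the load job to complete.
    print(f"Loaded {load_job.output_rows} rows.")
    ```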