Job Search

  • Published on

    EPAM Systems is looking for a Senior Data Platform Engineer with deep expertise in Databricks and multi-cloud environments. In this role, you will engineer scalable, high-performance data solutions for Fortune 1000 clients. You thrive on building data platforms from scratch and driving innovation across cloud platforms, ensuring optimal performance and security while maintaining compliance standards.

  • Published on

    EPAM Systems is looking for a Lead Data Software Engineer fluent in the Scala programming language to join their data teams. The role involves developing and operating critical data pipelines, consulting with stakeholders, and migrating data products to Databricks. Strong expertise in Apache Spark, Databricks, and CI/CD is required. Candidates should have a passion for data solutions and be comfortable operating in a dynamic environment.

  • Published on

    EPAM Systems is looking for a Lead Data Software Engineer (Spark/Scala/Databricks), an open-minded professional fluent in the Scala programming language. The successful candidate will be responsible for developing, monitoring, and operating critical curated data pipelines, consulting with stakeholders to improve KPI systems, and leveraging a cloud-based tech stack to enhance data solutions.

  • Published on

    EPAM Systems is looking for a Senior Data Software Engineer to design, develop, monitor, and operate data pipelines, with responsibilities including integrating datasets for analytical use cases and enabling data teams to follow client standards. The ideal candidate has 3+ years of hands-on experience with Databricks or similar ETL tools, strong skills in Apache Spark, PySpark, and Python, and proven expertise in AWS Glue and Amazon S3. The position allows remote work across Spain, adheres to high engineering standards, and offers a progressive environment with significant career growth opportunities.

  • Published on

    EPAM Systems is looking for a Senior Data Software Engineer with expertise in Databricks or similar ETL tools to join their dynamic team. Responsibilities include designing, developing, and maintaining data pipelines, as well as integrating datasets for analytical use cases. The ideal candidate will have over 3 years of experience with Apache Spark, PySpark, AWS Glue, and CI/CD practices. Join EPAM to contribute to innovative projects while enjoying flexible work options in Málaga or remotely across Spain.

  • Published on

    EPAM Systems is looking for a Senior Data Engineer to support our insurance client in Zurich. The role follows a hybrid model, combining remote and on-site work. The candidate will be responsible for designing, implementing, and maintaining data-driven applications, developing end-to-end solutions, and engineering reliable data pipelines.

  • Published on

    EPAM Systems is looking for a Senior Python Engineer with Spark to join our team. You will play a crucial role in our financial data and analytics team, taking responsibility for data integration and transformation. The role involves developing models and managing extensive structured financial data. You will design and build robust software that efficiently processes this data using our medallion architecture. We use the latest technology, including Databricks on Microsoft Azure, and you will have the flexibility of remote work in Kyrgyzstan, with the option to work from our office in Bishkek.

  • Published on

    Samba TV is looking for a Tech Lead - Data Engineering who will be responsible for leading the development of scalable, high-performance data pipelines that power Samba TV's analytics and insights. In this role, you will design and implement architectural improvements, mentor a team of engineers, and collaborate with Data Science, Analytics, and Product teams to deliver robust data solutions that drive business impact. The ideal candidate will have 5+ years of experience in Data Engineering or Software Engineering, expertise in technologies like Apache Airflow and Databricks, and strong problem-solving skills.

  • Published on

    Swipejobs is looking for a DevOps Engineer responsible for managing technology across the full stack of our production system, including performing code reviews and analysis, providing automation and optimization support, and driving the setup of our CI/CD environment. The ideal candidate will have at least 3 years of experience in the Australian market, strong skills in AWS and automation tools like Ansible, and familiarity with technologies such as Kafka and Docker.