Data Engineer (AWS, Spark, Scala)
About the Role
We are working with a well-known IT services firm in Lyon that is looking for a freelance Data Engineer for a business-critical mission. The role requires onsite presence three days per week at the client's Lyon office. You will join project teams focused on the ingestion, transformation, storage, and provisioning of data in the client's data lake. This includes implementing Snowflake as a cloud-based data platform, along with various development and operational tasks.
About the Candidate
The ideal candidate has at least 3 years' experience as a Data Engineer. Strong experience with AWS cloud architecture is essential, especially services such as S3, Glue, Lambda, Athena, and ECS. Proficiency in Spark and Scala is mandatory, and familiarity with DevOps tools such as Terraform, Ansible, Git, and Maven is advantageous. Fluency in French is required to collaborate effectively within the team.
Desired Skills and Experience
- Experience working on AWS Cloud Architecture (S3, Glue, Lambda, Athena, ECS).
- Hands-on experience with Spark.
- Strong programming skills in Scala.
- Knowledge of DevOps tools such as Ansible, Terraform, Git, and Maven.
Company Culture and Benefits
Joining Glocomms means being part of a vibrant and collaborative environment. You will have the opportunity to contribute significantly to exciting projects that shape our data strategies and implementations. Our company values innovation and teamwork while supporting personal growth in a fast-paced technology landscape.