GCP Data Engineer (Java, Spark, ETL)
Staffworx — London, GB
About the Role
Staffworx is seeking a GCP Data Engineer for a hybrid role based in the London area. The position is part of a digital Google Cloud transformation programme aimed at enhancing data capabilities within the organisation.
Responsibilities
- Develop ETL processes for data ingestion and preparation.
- Implement and optimize real-time data processing workflows using GCP services, such as Dataflow, Pub/Sub, and BigQuery Streaming.
- Work with technologies including Python, PySpark, Java, SparkSQL, and Unix/Linux platforms.
- Utilize version control tools (Git, GitHub) and automated deployment tools.
- Familiarity with data orchestration tools such as GCP Cloud Composer is highly desirable.
- Google Cloud Platform certification is a strong advantage.
About the Candidate
The ideal candidate should have:
- Proficiency in programming languages including Python, PySpark, and Java.
- A deep understanding of real-time data processing and event-driven architectures.
- The ability to work with a range of GCP services, including BigQuery, Cloud Run, Dataflow, and Cloud Storage.
About the Company
Staffworx Limited is a UK-based recruitment consultancy supporting the global e-commerce, software, and consulting sectors. The company specialises in connecting talented individuals with opportunities across the digital landscape.
Company Culture and Benefits
The role offers an initial 6-month contract with the likelihood of long-term extension. Staffworx values a collaborative work environment where innovation and expertise in cloud technologies are highly regarded.