GCP Data Engineer (Java, Spark, ETL)
About the Role
This is a hybrid GCP Data Engineer position based in London, working on a digital Google Cloud transformation programme. Responsibilities include developing ETL processes for data ingestion and preparation, optimising real-time data processing workflows with tools such as Dataflow and Pub/Sub, and using SparkSQL alongside other GCP services.
Responsibilities
- Develop ETL processes for data ingestion and preparation.
- Use SparkSQL, Cloud Run, Dataflow, Cloud Storage, and BigQuery.
- Work with Google Data Studio and Unix/Linux platforms.
- Optimize real-time data processing workflows and manage data orchestration with tools such as Cloud Composer.
- Utilize version control tools (Git, GitHub) and automated deployment tools.
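The ETL duties above can be sketched as a minimal batch pipeline. The Python example below (stdlib only; the function names, sample records, and cleaning rules are illustrative assumptions, not details from the role) shows the extract/prepare/load shape of such a job, emitting newline-delimited JSON of the kind a BigQuery load job accepts:

```python
import csv
import io
import json

# Illustrative raw input: one malformed user_id and one bad amount.
RAW_CSV = """user_id,event,amount
1,purchase,19.99
2,refund,-5.00
,purchase,3.50
3,purchase,not_a_number
"""

def extract(raw: str):
    """Extract: parse CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records):
    """Prepare: drop rows missing a user_id or with a non-numeric
    amount, and cast the surviving fields to proper types."""
    prepared = []
    for rec in records:
        if not rec["user_id"]:
            continue
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue
        prepared.append({"user_id": int(rec["user_id"]),
                         "event": rec["event"],
                         "amount": amount})
    return prepared

def load(records) -> str:
    """Load: serialise to newline-delimited JSON (NDJSON)."""
    return "\n".join(json.dumps(r) for r in records)

if __name__ == "__main__":
    print(load(transform(extract(RAW_CSV))))
```

In a real Dataflow or Spark job the same three stages would run over distributed collections rather than in-memory lists, but the validate-then-cast preparation step carries over directly.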
About the Candidate
Ideal candidates will have a deep understanding of real-time data processing and event-driven architectures. Proficiency in Python, PySpark, and Java is required. Google Cloud Platform certification(s) are a strong advantage.
About the Company
Staffworx Limited is a UK-based recruitment consultancy dedicated to supporting the global E-commerce, software, and consulting sectors. The agency provides comprehensive recruitment services to clients and candidates alike.
Company Culture and Benefits
Staffworx values expertise and innovation within its teams. Candidates can expect professional growth opportunities and a collaborative work environment, with the chance to contribute to significant digital transformation projects.