Senior Data Engineer

TalentRemedy | Herndon, US
About the Role

Our client builds its own LEO satellites for RF data collection and analysis and is currently seeking a Senior Data Engineer to design, build, and deploy world-class algorithms for scalable cloud processing.

The role sits within the Data Engineering team in the Data & Analytics group. Data Engineering manages the transition to production of advanced machine learning and geolocation algorithms, and develops and maintains the scalable data processing platforms that power exploratory data analysis and real-time analytics for geospatial data exploration.

Responsibilities

As the Senior Data Engineer, your main responsibilities will be to:

  • Write efficient, clean, and testable Python code for data engineering workflows.
  • Design, build, and maintain scalable ETL pipelines to support data ingestion and transformation at scale.
  • Develop and optimize parallel processing frameworks to enhance data throughput and performance.
  • Implement and maintain pipeline orchestration using tools such as Airflow or similar.
  • Design and manage cloud-native data solutions on AWS, including S3, RDS, and other managed services.
  • Perform database maintenance and optimization to ensure reliability, integrity, and performance across systems.
  • Containerize applications using Docker, and deploy/manage them using Kubernetes.
  • Work closely with Processing Algorithms & Data Science teams to integrate, optimize, and deploy state-of-the-art algorithms to production-ready applications.
  • Apply debugging and problem-solving skills to support data-intensive applications in production, with on-call responsibility.
  • Participate in collaborative software development practices and provide design feedback.
  • Work in a fast-paced agile environment, communicating and tracking development activities using tools such as JIRA and Confluence.
  • Work independently within a geographically distributed team.

About the Candidate

Your skills and qualifications should include:

  • B.S. degree in Computer Science, Electrical/Computer Engineering, or comparable experience.
  • 3+ years of professional software development experience using Python.
  • Proven experience in building and maintaining ETL pipelines and data workflows.
  • Hands-on experience with AWS services and solutions (e.g., Amazon S3, Amazon EC2).
  • Experience with modern data orchestration tools (e.g., Apache Airflow, Argo Workflows).
  • Deep understanding of parallel processing and performance optimization techniques.
  • Experience in Docker containerization and Kubernetes for deployment.
  • Familiarity with monitoring and logging solutions, especially Grafana and OpenTelemetry (OTel).

Desirable Skills

  • Knowledge of Infrastructure as Code (IaC) tools (e.g., Terraform).
  • Understanding of streaming data tools (e.g., Apache Kafka, Spark).
  • Experience with frameworks for parallelizing compute-heavy tasks (e.g., Ray, Spark, Dask).

Company Culture and Benefits

TalentRemedy fosters a culture of innovation and collaboration. We emphasize small, hands-on engineering teams that encourage rapid prototyping and the productization of new ideas.

Expect a stimulating work environment where you can grow, collaborate with a globally distributed team, and make impactful contributions to our mission of excellence in data collection and analysis.