Senior SDET Engineer

Fivetran · Data AI · Bangalore, India · Engineering Department

Fivetran is seeking a Senior Software Development Engineer in Test (SDET) to design, develop, and maintain test automation tools and frameworks. The role involves improving test stability and performance, coaching quality engineers, and building automation suites from scratch. The candidate will work with Java, cloud platforms, CI/CD pipelines, and internal tools, including AI-assisted tooling for productivity.

What you'd actually do

  1. Design and evolve scalable automation frameworks in Java with a strong emphasis on reliability, performance, and data correctness.
  2. Build the datalakes team’s automation suite from the ground up, establishing the foundations, patterns, and best practices for scalable and maintainable test automation.
  3. Write high-signal test strategies for complex features and automate them across multiple environments.
  4. Improve CI/CD pipelines by building robust validation layers that prevent silent failures and regressions.
  5. Build internal tools and proofs of concept to improve engineering productivity, including faster debugging and failure triaging, improved test scenario generation, flaky test detection, and better log analysis and root-cause identification.
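To give a flavor of the flaky-test detection work mentioned above, here is a minimal, hypothetical sketch in plain Java 11 (the class, `Result` shape, and `flakyTests` helper are illustrative names, not Fivetran internals): a test that both passes and fails across the same run history gets flagged as flaky.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class FlakyTestDetector {

    // One test outcome from a single CI run (hypothetical shape, for illustration).
    static final class Result {
        final String testName;
        final boolean passed;

        Result(String testName, boolean passed) {
            this.testName = testName;
            this.passed = passed;
        }
    }

    // A test is flagged as flaky if the supplied run history contains
    // at least one pass AND at least one failure for the same test name.
    static Set<String> flakyTests(List<Result> history) {
        Map<String, Set<Boolean>> outcomes = new HashMap<>();
        for (Result r : history) {
            outcomes.computeIfAbsent(r.testName, k -> new HashSet<>()).add(r.passed);
        }
        Set<String> flaky = new HashSet<>();
        for (Map.Entry<String, Set<Boolean>> e : outcomes.entrySet()) {
            if (e.getValue().size() == 2) { // saw both true and false
                flaky.add(e.getKey());
            }
        }
        return flaky;
    }

    public static void main(String[] args) {
        List<Result> history = new ArrayList<>();
        history.add(new Result("testSchemaDrift", true));
        history.add(new Result("testSchemaDrift", false));
        history.add(new Result("testIdempotency", true));
        System.out.println(flakyTests(history)); // prints [testSchemaDrift]
    }
}
```

In a real CI setup the run history would come from build metadata (e.g. Buildkite job results) rather than an in-memory list, and the flag threshold would typically consider pass rate over a window rather than any single mixed outcome.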

Skills

Required

  • Java 11
  • JUnit 5
  • Datalakes and Catalog technologies
  • Docker
  • Kubernetes
  • AWS
  • GCP
  • Azure
  • WireMock
  • Bazel
  • Buildkite
  • 5+ years of experience in the software industry
  • Strong proficiency in Java and a deep understanding of object-oriented programming principles
  • Experience designing scalable automation frameworks and identifying negative, edge, and high-risk scenarios
  • Strong understanding of distributed systems risks such as schema drift, idempotency failures, data inconsistencies, and race conditions
  • Hands-on experience with cloud platforms such as AWS, GCP, or Azure, and with containerized environments such as Docker and Kubernetes
  • Experience building and maintaining CI/CD pipelines using tools such as Jenkins, CircleCI, or Buildkite
  • Experience working with backend automation frameworks
  • Basic familiarity with AI-assisted developer tools and productivity workflows
  • Ability to use AI tools effectively for tasks such as debugging, test design, documentation, or log analysis, with appropriate validation of outputs
  • Good judgment in verifying generated results and maintaining quality, reliability, and safety in engineering workflows
  • Interest in using modern tooling to improve engineering productivity and effectiveness
  • Exposure to datalake technologies and concepts such as object storage, file-based table formats, metadata management, and large-scale data processing workflows

Nice to have

  • Knowledge of database testing and ETL or data pipeline validation
  • Familiarity with testing data movement, schema evolution, partitioning, and data correctness in datalake environments

What the JD emphasized

  • Extensive industry experience
  • Highly technical
  • Detail-oriented
  • Creative
  • Motivated
  • Focused on achieving results