Software Engineer III - Contributory Network

ZoomInfo · Enterprise · Waltham, MA · 936 Engineering - Data Engineering

A Software Engineer III role focused on building and maintaining data pipelines that ingest, transform, and process first-party data. The role is primarily backend data engineering, with a significant front-end component for monitoring and configuration dashboards. It uses both streaming and batch processing technologies, with an emphasis on data quality and observability.

What you'd actually do

  1. Design and build data pipelines that ingest, validate, and transform first-party data from customer integrations (CRM systems, email providers, recording platforms)
  2. Develop and maintain ETL/ELT workflows for processing contributed contact information, opportunity metadata, and engagement signals at scale
  3. Build and enhance front-end interfaces for monitoring, configuration, and data quality dashboards using Angular
  4. Work with streaming and batch processing technologies to handle real-time and scheduled data flows
  5. Collaborate closely with product managers to translate business requirements into technical solutions
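The ingest → validate → transform pattern in item 1 can be sketched in plain Java. This is an illustrative, minimal stand-in, not the actual ZoomInfo pipeline: the `Contact` record, `isValid`, and `normalize` names are hypothetical, and a real pipeline would express the same stages as Beam or Spark transforms.

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

// Hypothetical pipeline stage: validate and normalize contributed contact records.
public class ContactPipeline {

    // Minimal stand-in for a contributed record (field names are illustrative).
    public record Contact(String email, String company) {}

    // Validation: drop records without a plausible email address.
    static boolean isValid(Contact c) {
        return c.email() != null && c.email().contains("@");
    }

    // Transformation: normalize fields before loading downstream.
    static Contact normalize(Contact c) {
        return new Contact(c.email().trim().toLowerCase(Locale.ROOT),
                           c.company().trim());
    }

    // The ingest -> validate -> transform stage over one batch.
    public static List<Contact> process(List<Contact> batch) {
        return batch.stream()
                .filter(ContactPipeline::isValid)
                .map(ContactPipeline::normalize)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Contact> batch = List.of(
                new Contact("  Alice@Example.COM ", " Acme "),
                new Contact("not-an-email", "Bad Co"));
        // Only the valid record survives, with email and company normalized.
        System.out.println(process(batch));
    }
}
```

In a streaming deployment the same `filter`/`map` logic would run per message from a Kafka or Pub/Sub consumer rather than over an in-memory batch.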

Skills

Required

  • 3+ years of professional software engineering experience
  • Solid experience building and operating data pipelines (ETL/ELT) at scale
  • Proficiency in Java
  • Experience with data processing frameworks such as Apache Beam, Apache Airflow, or Apache Spark
  • Hands-on experience with streaming technologies (Kafka, Pub/Sub, or similar)
  • Understanding of data modeling, schema design, and data quality practices
  • Working experience with modern front-end frameworks, ideally Angular
  • Ability to build functional, clean UIs for internal tools, dashboards, and configuration screens
  • Comfortable working with REST APIs and integrating front-end with backend services
  • Bachelor's degree in Computer Science, Software Engineering, or a related field
  • Strong problem-solving skills and attention to detail
  • Effective communicator who works well in a collaborative, cross-functional environment
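The "data quality practices" requirement above typically means computing metrics like per-field completeness over a batch. A minimal sketch, assuming rows arrive as string maps (the `passRate` helper and field names are hypothetical):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical data-quality check: fraction of rows whose field passes a rule.
public class QualityCheck {

    public static double passRate(List<Map<String, String>> rows,
                                  String field,
                                  Predicate<String> rule) {
        if (rows.isEmpty()) return 1.0;  // vacuously complete
        long passed = rows.stream()
                .filter(r -> rule.test(r.get(field)))  // rule sees null if field absent
                .count();
        return (double) passed / rows.size();
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows = List.of(
                Map.of("email", "a@x.com"),
                Map.of("email", ""),
                Map.of("email", "b@y.com"));
        // Completeness of "email": 2 of 3 rows are non-blank.
        System.out.println(passRate(rows, "email", v -> v != null && !v.isBlank()));
    }
}
```

In practice such checks run as a pipeline stage, and the resulting metrics feed the monitoring dashboards mentioned in the responsibilities.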

Nice to have

  • Experience with Kubernetes (GKE/EKS) for distributed workloads
  • Familiarity with Snowflake, BigQuery, or similar cloud data warehouses
  • Experience with Terraform or infrastructure-as-code tools
  • Exposure to data integration patterns involving CRM systems or email/calendar APIs
  • Experience in a B2B data company or data-as-a-product environment
  • Familiarity with cloud platforms, preferably GCP (BigQuery, GKE, Dataflow/DataProc)

What the JD emphasized

  • data pipelines
  • ETL/ELT
  • streaming technologies
  • data quality