Senior Data Pipeline Analyst

ZoomInfo · Enterprise · Vancouver, BC · 913 Engineering - Data Ops

This role focuses on becoming an expert in the company's data pipeline, which ingests, processes, and profiles millions of company records. The analyst will read code to understand data transformations, contribute to design conversations with Engineering and Product, and shape the evolution of the data infrastructure. The role involves solving ambiguous data challenges and ensuring the pipeline infrastructure evolves to meet customer needs. It requires systems thinking, technical depth in data pipelines/ETL, and the ability to analyze code (Python, Java, SQL).

What you'd actually do

  1. Master our company data pipeline architecture: how data flows from ingestion through profiling, what transforms are applied at each stage, and how components interconnect
  2. Read and analyze production code to understand data transformations, trace data lineage, and assess how proposed changes would impact the system
  3. Develop frameworks for evaluating tradeoffs between technical complexity, implementation effort, and customer impact
  4. Create clear documentation, system maps, and knowledge resources that capture architecture decisions, dependencies, and design rationale
  5. Participate actively in design conversations with Engineering and Product about our next-generation pipeline, bringing data quality insights, technical feasibility assessments, and informed opinions on architectural decisions
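To make item 2 concrete, here is a minimal, purely illustrative sketch of what "tracing data lineage" through pipeline stages can look like. The stage names, fields, and size bands below are invented examples for illustration, not ZoomInfo's actual schema or code.

```python
# Illustrative toy pipeline: each stage transforms a company record,
# and a lineage log captures the output of every stage in order.
# All names here (normalize_domain, profile_headcount, size bands)
# are hypothetical examples, not the real system.

def normalize_domain(record):
    """Lowercase and strip the company website field."""
    record["domain"] = record.get("domain", "").strip().lower()
    return record

def profile_headcount(record):
    """Bucket raw employee count into a coarse size band."""
    n = record.get("employees", 0)
    record["size_band"] = (
        "SMB" if n < 100 else "Mid-Market" if n < 1000 else "Enterprise"
    )
    return record

PIPELINE = [normalize_domain, profile_headcount]

def run_with_lineage(record):
    """Apply each stage in order, logging one lineage entry per stage."""
    lineage = []
    for stage in PIPELINE:
        record = stage(dict(record))  # copy so each stage's input is preserved
        lineage.append({"stage": stage.__name__, "output": dict(record)})
    return record, lineage

result, trace = run_with_lineage({"domain": " Example.COM ", "employees": 250})
```

Reading production code in this role means reconstructing exactly this kind of stage-by-stage picture (what each transform consumes, emits, and assumes) so that a proposed change can be assessed against every downstream stage.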

Skills

Required

  • Data pipeline expertise
  • ETL systems knowledge
  • Data processing infrastructure experience
  • Ability to read and understand code (Python, Java, SQL)
  • Systems thinking
  • Technical analysis
  • Problem-solving
  • Communication skills

Nice to have

  • Experience with infrastructure transition
  • Experience with architectural design conversations
  • Experience with data quality frameworks
  • Experience with documentation and knowledge sharing

What the JD emphasized

Technical depth

  • systems thinking in a systems-focused environment
  • data pipelines, ETL systems, data processing infrastructure
  • read code (Python, Java, SQL); code analysis
  • data transformations; trace how data flows through systems
  • assess technical tradeoffs; evaluate whether a proposed solution is feasible
  • contribute meaningfully to design conversations; informed opinions on architectural decisions
  • data quality insights; technical feasibility assessments

Ownership and growth

  • build deep expertise in a complex domain (pipeline architecture)
  • lead strategic initiatives; project leadership capabilities
  • contribute to the infrastructure transition; own pipeline architecture decisions as systems stabilize
  • lead strategic data improvement initiatives

Analytical and hands-on

  • writing code to analyze data patterns
  • manually investigating edge cases; hands-on verification of data quality
  • dig into details, then zoom out to see the bigger picture
  • rigorous testing, impact analysis, validation, root cause analysis
  • repeatable approaches to testing; data quality investigations

Clear communicator

  • explain technical complexity to non-technical audiences
  • worked effectively with Engineering, Product, or cross-functional teams
  • translating between technical constraints and business needs
  • cross-functional coordination; build partnerships and institutional knowledge

Comfortable with ambiguity

  • thrive in evolving environments where priorities shift
  • problems aren't always well-defined; figuring things out as you went
  • maintain momentum and quality when the path forward isn't perfectly clear
  • ambiguous problems where the solution isn't obvious; iterative and creative problem-solving

Problem scope

  • determine when problems should be solved at the pipeline/profiler level vs. through downstream approaches
  • improving location verification, integrating new data sources, solving novel data extraction challenges
  • emerging requirements; system-level improvement opportunities
  • manual investigation alongside code analysis