Senior Analytics Engineer

Outreach · Enterprise · United States · Business Systems

Senior Analytics Engineer role focused on building and maintaining data models and sources for an AI platform company. The role involves SQL, data modeling, documentation, and partnering with analysts and subject matter experts to ensure data quality and accessibility. While the company uses AI, this role is primarily focused on the data infrastructure and analytics layer supporting it, not direct AI/ML model development.

What you'd actually do

  1. Model and document new datasets (both structured and semi-structured) to unlock their value across all our business units
  2. Partner with subject matter experts to document, align, and automate the business metrics that define success
  3. Work with Analysts to optimize their use of data, either through query reviews, modeling exercises, or dashboard audits
  4. Own the design, monitoring, and deployment of certified data sources on our reporting platforms and warehouses
  5. Insist on the highest standards for data reproducibility, auditability and compliance
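The modeling, metric-automation, and data quality responsibilities above can be sketched with a minimal, hypothetical example (the table, view, and metric names are illustrative, not from the posting): a raw table is modeled into a documented, "certified" metric view, and a basic quality rule is asserted before the metric is trusted. In practice this logic would live in a dbt model with its documentation and tests in a `schema.yml` file.

```python
import sqlite3

# Hypothetical sketch: model a raw table into a certified metric view,
# then enforce a simple data quality check. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_opportunities (
        opportunity_id INTEGER PRIMARY KEY,
        stage          TEXT NOT NULL,   -- e.g. 'open', 'won', 'lost'
        amount_usd     REAL NOT NULL
    );
    INSERT INTO raw_opportunities VALUES
        (1, 'won',  1000.0),
        (2, 'lost',  500.0),
        (3, 'won',  2500.0);

    -- Certified metric model: win rate and won revenue.
    CREATE VIEW certified_sales_metrics AS
    SELECT
        AVG(CASE WHEN stage = 'won' THEN 1.0 ELSE 0.0 END) AS win_rate,
        SUM(CASE WHEN stage = 'won' THEN amount_usd END)   AS won_revenue_usd
    FROM raw_opportunities;
""")

# Data quality rule: amounts must be non-negative before metrics are certified.
bad_rows = conn.execute(
    "SELECT COUNT(*) FROM raw_opportunities WHERE amount_usd < 0"
).fetchone()[0]
assert bad_rows == 0, "data quality check failed: negative amounts found"

win_rate, won_revenue = conn.execute(
    "SELECT win_rate, won_revenue_usd FROM certified_sales_metrics"
).fetchone()
print(win_rate, won_revenue)  # 2 of 3 opportunities won; 3500.0 USD won
```

The view acts as the single documented source for the metric, so analysts query `certified_sales_metrics` rather than re-deriving the logic from raw data.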

Skills

Required

  • SQL
  • Data Modeling
  • Documentation
  • Business Metrics Alignment
  • Data Warehousing
  • ELT environments
  • Stakeholder Communication

Nice to have

  • Postgres
  • Snowflake
  • dbt
  • Airflow
  • Tableau
  • ML pipelines (SageMaker)
  • Multi-dimensional data modeling
  • Spark Ecosystem (Delta Lake, Databricks)
  • Cloud platforms (Snowflake, Databricks)

What the JD emphasized

  • Expert SQL
  • History of working with or as a Business Analyst
  • Record of preemptive communication and documentation strategies favoring self-service education
  • Familiarity with the Outreach tech stack: Snowflake, dbt, Airflow, Tableau
  • History of query and data pipeline optimization
  • Working knowledge of ML pipelines (e.g., SageMaker)
  • Knowledge of multi-dimensional data modeling
  • Experience working in modern ELT environments
  • Code documentation best practices
  • Experience working with business stakeholders on requirements gathering and documentation
  • Experience with the Spark ecosystem (Delta Lake, Databricks, etc.) is a plus
  • Experience in data modeling, schema design, and data quality best practices, with functional experience on cloud platforms like Snowflake or Databricks, is a nice to have