Senior Data Analytics Engineer

Snowflake · Menlo Park, CA, United States · Data Analytics and AI

Senior Data Analytics Engineer on Snowflake's internal data platform, powering analytics for Finance, Sales, and HR. The role involves shaping platform architecture, leading cross-functional data initiatives, ensuring data governance, and driving adoption of AI-assisted data workflows built on LLM-powered tools and Cortex Agents. Requires deep expertise in Snowflake, dbt, and AI development tools.

What you'd actually do

  1. Shape the architecture and design principles for Snowflake's internal data platform, owning decisions that affect how data flows, is governed, and accessed across Finance, Sales, HR, and other business-critical functions
  2. Act as a technical leader today, with the potential and interest to evolve into a people manager, balancing hands-on impact with team-level strategy and coaching
  3. Lead cross-functional data initiatives from problem framing through delivery, aligning stakeholders across Engineering, Analytics, and Business Systems on what to build and why
  4. Define and enforce RBAC frameworks, row access policies, and masking policies at scale, ensuring the right data reaches the right people with appropriate governance controls
  5. Drive adoption of AI-assisted data workflows by integrating LLM-powered tools, Cortex Agents, and automation capabilities into the team's data engineering practice

Skills

Required

  • 8+ years of experience (or equivalent) in analytics engineering, data architecture, or related roles
  • 2+ years of experience leading and mentoring analytics engineering or data teams, with a track record of elevating team performance and developing talent
  • Proven ability to operate as a technical leader today with the potential to evolve into a people manager, balancing hands-on impact with team-level strategy and coaching
  • Deep expertise in Snowflake, including advanced data modeling, RBAC design, performance optimization, and the broader feature set (Cortex, Dynamic Tables, Streams, Tasks)
  • Expert-level dbt skills (dbt Core and/or dbt Platform), including macro development, testing frameworks, CI/CD integration, and management of large-scale dbt projects
  • Strong proficiency with AI development tools, including LLM-powered workflows, agent frameworks (Cortex Agents, Claude Code, Cortex Code), and prompt engineering for data use cases
  • Experience designing and implementing RBAC frameworks at scale, including row access policies, masking policies, and role hierarchies in Snowflake or similar platforms
  • Excellent written and verbal communication skills, with a track record of writing clear technical specs, architecture documents, and stakeholder-facing materials
  • Demonstrated ability to operate with principal-level scope: driving cross-functional initiatives, influencing without authority, and delivering outcomes across multiple teams
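For context on the governance requirements above, policies like these are expressed directly in Snowflake SQL DDL. A minimal sketch follows; the policy, role, table, and column names are illustrative, not taken from the posting:

```sql
-- Mask a sensitive column for everyone except an authorized role
-- (role and object names are hypothetical)
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('HR_ANALYST') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE employees MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Restrict which rows each role can see via a row access policy,
-- driven by a mapping table of role -> region
CREATE ROW ACCESS POLICY region_policy AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'SYSADMIN'
  OR region IN (SELECT region FROM region_map WHERE role_name = CURRENT_ROLE());

ALTER TABLE sales ADD ROW ACCESS POLICY region_policy ON (region);
```

Defining these policies at scale, and organizing the role hierarchy they reference, is the core of the RBAC work the role describes.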

Nice to have

  • Experience working at a technology company with a large-scale internal data platform
  • Familiarity with Python for data pipeline development, scripting, and automation
  • Experience presenting technical content to external audiences (customers, conferences)
  • Experience with Snowflake Cortex AI or GenAI-native data tooling in production environments

What the JD emphasized

  • AI-assisted data workflows
  • Cortex Agents
  • LLM-powered tools

Other signals

  • AI-native thinkers
  • AI as a high-trust collaborator
  • integrating LLM-powered tools, Cortex Agents, and automation capabilities
  • AI-assisted data workflows