Data Architecture Lead

Snowflake · Data AI · Menlo Park, CA, United States · Data Analytics and AI

Lead the architecture and development of AI-powered tooling and agent frameworks for data and analytics engineering teams, focusing on LLM-enabled automation and prompt engineering within the Snowflake ecosystem. This senior individual contributor role involves defining architectural standards, elevating RBAC frameworks, and partnering with product teams.

What you'd actually do

  1. Define and enforce architectural standards across analytics and data engineering teams, covering data modeling patterns in dbt, RBAC frameworks, deployment workflows, and code quality standards
  2. Build and deploy AI-powered tooling and skills — including agent frameworks, prompt-driven workflows, and LLM-enabled automation — for use across analytics and data engineering teams
  3. Elevate the RBAC framework for the data platform, including designing and implementing row access policies, column masking policies, and role-based access control patterns at scale in Snowflake
  4. Partner with Snowflake's product and engineering teams on feature testing and feedback — write meaningful product specifications, develop proof-of-concepts, and help push the Snowflake platform forward
  5. Represent Snowflake's internal data capabilities externally, meeting with customers to demonstrate what is possible on the platform and sharing real-world architectural patterns and learnings
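To make item 3 concrete, here is a minimal toy sketch of the logic behind a Snowflake-style row access policy: a per-row predicate evaluated against the querying role, applied as a filter over the result set. The role names, region mapping, and function names are illustrative assumptions, not anything specified in the posting; in Snowflake itself this would be a SQL `ROW ACCESS POLICY` attached to a table, not Python.

```python
from dataclasses import dataclass

# Illustrative role -> visible-regions mapping (hypothetical names).
ROLE_REGIONS = {
    "SALES_ANALYST_EMEA": {"EMEA"},
    "SALES_ANALYST_AMER": {"AMER"},
    "DATA_ADMIN": {"EMEA", "AMER", "APAC"},  # admin role sees every region
}

@dataclass
class Row:
    region: str
    revenue: float

def row_access_policy(current_role: str, row: Row) -> bool:
    """Mirror a row access policy predicate: may this role see this row?"""
    return row.region in ROLE_REGIONS.get(current_role, set())

def apply_policy(current_role: str, rows: list[Row]) -> list[Row]:
    """Filter a result set the way the platform applies the policy at query time."""
    return [r for r in rows if row_access_policy(current_role, r)]
```

The design point this models: access rules live next to the data as a single predicate, so every query through any role gets the same filtering without per-query `WHERE` clauses.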

Skills

Required

  • 8+ years of experience in data engineering, analytics engineering, or data architecture roles
  • Deep expertise in Snowflake, including advanced data modeling, performance optimization, RBAC design, and the broader Snowflake feature set (Cortex, Dynamic Tables, Streams, Tasks, etc.)
  • Expert-level dbt skills (dbt Core and/or dbt Platform), including macro development, testing frameworks, CI/CD integration, and large-scale project management
  • Hands-on experience with Airflow or similar orchestration platforms for data pipeline management
  • Strong proficiency with AI development tools, including experience with LLM-powered workflows, agent frameworks (e.g., Cortex Agents, Claude Code, Cortex Code), and prompt engineering for data use cases
  • Experience designing and implementing RBAC frameworks at scale — including row access policies, masking policies, and role hierarchies in Snowflake or similar platforms
  • Excellent written and verbal communication skills, with a track record of writing clear technical specs, architecture documents, and stakeholder-facing materials
  • Ability to operate with principal-level scope: driving cross-functional initiatives, influencing without authority, and delivering outcomes across multiple teams
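As a small illustration of "prompt engineering for data use cases," here is a stdlib-only sketch of a templated prompt that asks an LLM to draft dbt schema tests for a model. The template wording, model name, and function names are assumptions for illustration; a real workflow would send the rendered prompt to an actual LLM endpoint (e.g. Cortex), which is out of scope here.

```python
from string import Template

# Hypothetical prompt template for generating dbt schema tests.
# Keeping it as a named Template makes the prompt versionable and testable.
DBT_TEST_PROMPT = Template(
    "You are a dbt expert. Given the model `$model` with columns "
    "$columns, propose schema tests (unique, not_null, accepted_values) "
    "as a dbt YAML snippet. Return only YAML."
)

def build_prompt(model: str, columns: list[str]) -> str:
    """Render the prompt for one model; columns are comma-separated."""
    return DBT_TEST_PROMPT.substitute(model=model, columns=", ".join(columns))
```

Treating prompts as parameterized templates rather than ad hoc strings is what lets them be reviewed, versioned, and unit-tested like any other shared tooling.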

Nice to have

  • Experience working at a technology company with a large-scale internal data platform
  • Prior experience building or contributing to internal developer tooling, shared libraries, or platform-as-a-product initiatives
  • Familiarity with Python for data pipeline development, scripting, and automation
  • Experience presenting technical content to external audiences (customers, conferences, etc.)

What the JD emphasized

  • AI-powered tooling
  • agent frameworks
  • LLM-enabled automation
  • prompt engineering
  • RBAC framework
  • row access policies
  • column masking policies
  • role-based access control patterns
