Software Engineer - APG

Snowflake · Data AI · Menlo Park, CA, United States · Engineering

This role is part of the Applied Performance Group (APG) Engineering team at Snowflake, focusing on performance issues, competitive benchmarking, and bridging engineering and customer-facing organizations. Although the posting mentions the "agentic enterprise" and "AI-native thinkers", the core responsibilities and required skills center on databases, data warehousing, data engineering, performance analysis, and competitive analysis, not on building or shipping AI/ML models or systems. AI/ML appears only in the "strongly desired" section, and refers to using AI tools or developing with AI technologies rather than a core requirement of the role.

What you'd actually do

  1. Be an expert in Snowflake architecture, query processing, workload profiling, data engineering, and Snowflake tools
  2. Be an expert in competitor solutions/products for analytics and data engineering
  3. Define, automate, execute, and publish competitive benchmarks
  4. Partner closely with other engineering teams to deliver both results and guidance on performance gaps, challenges, advantages, and disadvantages
  5. Collaborate with Product Management and Engineering to continuously improve Snowflake's product and ecosystem roadmaps

Skills

Required

  • SQL performance analysis
  • Python
  • Java
  • Database expertise
  • Data Warehouse expertise
  • Data Engineering expertise
  • Performance analysis
  • Cloud platforms
  • Large-scale infrastructure-as-a-service platforms
  • Snowflake competitors experience
  • Data ingestion
  • Data transformation
  • Data platform design
  • BI tools
  • Analytics tools
  • Coding/programming experience in process automation
  • Communication skills

Nice to have

  • Databricks
  • Netezza
  • Oracle
  • Teradata
  • Greenplum
  • Google BigQuery
  • Amazon Redshift
  • Microsoft Synapse
  • Postgres
  • Apache Spark
  • Ray
  • Dask
  • Apache Kafka
  • Apache Flink
  • C/C++
  • Scala
  • Ruby
  • Perl
  • Bash
  • AI and Machine Learning technologies
  • PyTorch
  • University degree in computer science, engineering, mathematics or related fields, or equivalent experience

What the JD emphasized

  • Minimum 2 years of experience in a technical role delivering Database, Data Warehouse, or Data Engineering implementations and/or benchmark initiatives
  • Proven track record of experience with Snowflake competitors
  • Deep technical expertise in databases, data warehouses, data processing, and applications
  • Strong SQL performance analysis experience