Senior Partner Solution Engineer

Snowflake · Data AI · Mexico · Remote · Solution Engineering

This role is for a Senior Partner Solution Engineer at Snowflake, focused on enabling Systems Integrator (SI) partners to leverage Snowflake's platform for AI and data solutions. The role involves setting technical strategy, providing architectural expertise, developing joint solutions, and supporting partners in customer engagements. While AI is called out as a key area for partners, the core responsibilities center on platform enablement, data pipelines, and integration rather than direct AI/ML model development or deployment.

What you'd actually do

  1. Help Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake
  2. Apply forward-looking strategic thinking: quickly grasp the essence of new concepts and business value messaging
  3. Build a strong understanding of how SIs generate revenue, the industry priorities and complexities they face, and where Snowflake products can have the most impact on their services
  4. Help develop and launch joint differentiated solution offerings with SI Partners
  5. Engage in conversations with other technologists and deliver presentations at the C-level

Skills

Required

  • PySpark or Scala and SQL
  • ELT/data pipelines
  • Spark
  • Kafka
  • DevOps and CI/CD processes and tools
  • major cloud platforms and tooling (Azure, AWS, or Google Cloud)
  • Big Data or Cloud integration technologies
  • large-scale systems in production
  • CS fundamentals (data structures, algorithms, distributed systems)
  • English fluency

Nice to have

  • Matillion, Fivetran, Informatica, dbt Cloud
  • database technologies
  • AI/ML use cases
  • Developer Platform, Developer Experiences or Developer Productivity
  • multiple programming languages
  • database internals

What the JD emphasized

  • Production-level, hands-on expertise in developing and deploying PySpark or Scala and SQL to build ELT/data pipelines in production environments
  • Hands-on experience designing and building highly scalable data pipelines using Spark and Kafka to ingest data from various systems
  • 5+ years of industry experience designing, building, and supporting large-scale systems in production