Senior Data Platform Architect

Snowflake · Data AI · London, United Kingdom · Solution Engineering

Snowflake is seeking a Senior Data Platform Architect for its Applied Field Engineering team. The role provides technical leadership in designing and architecting the Snowflake Cloud Data Platform for enterprise customers. Responsibilities include working with sales teams, understanding customer needs, delivering compelling demonstrations, supporting Proofs of Concept, and closing business. The role requires expertise in data ingestion, transformation, and lakehouse workloads, along with the ability to engage both business and technical executives. Experience in pre-sales, data engineering, and a range of big data technologies is expected.

What you'd actually do

  1. Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
  2. Work hands-on with prospects and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation
  3. Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
  4. Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing

Skills

Required

  • Multi-cloud data architecture
  • Architecture and data engineering in the enterprise data space
  • Pre-sales environment experience
  • Presentation skills for technical and executive audiences
  • Ability to connect customer business problems to Snowflake solutions
  • Deep discovery of a customer's architecture framework
  • Large-scale database and/or data warehouse technology
  • ETL
  • Analytics
  • Cloud technologies
  • Data Lake
  • Data Mesh
  • Data Fabric
  • SQL
  • Python
  • Pandas
  • Spark
  • PySpark
  • Hadoop
  • Hive
  • Big data technologies
  • Data integration services and tools
  • ETL and ELT data pipelines
  • Apache NiFi
  • Matillion
  • Fivetran
  • Qlik
  • Informatica
  • Streaming technologies
  • Kafka
  • Flink
  • Spark Streaming
  • Kinesis
  • Real-time or near-real-time use cases
  • CDC (change data capture)
  • Data lakehouse architectures
  • Iceberg
  • Delta
  • Parquet
  • Architectural expertise in data engineering
  • Master's degree in computer science, engineering, mathematics, or a related field, or equivalent experience

Nice to have

  • Bachelor's degree

What the JD emphasized

  • 10+ years of architecture and data engineering experience within the Enterprise Data space
  • 5+ years of experience in a pre-sales environment (Sales Engineer, Solutions Engineer, Solutions Architect, etc.)