Solution Engineer - Data Engineering Specialist (FSI)

Snowflake · Data AI · New York, NY, United States · Solution Engineering

Solution Engineer specializing in Data Engineering for the Financial Services industry, focusing on Snowflake's Cloud Data Platform. The role involves technical leadership in designing and architecting solutions for enterprise customers, working closely with sales teams to understand customer needs, provide demonstrations, support Proofs of Concept, and close business. Responsibilities include leveraging expertise in data ingestion, transformation, and lakehouse workloads, and collaborating with various internal teams.

What you'd actually do

  1. Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
  2. Work hands-on with prospects and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation
  3. Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
  4. Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing

Skills

Required

  • 10+ years of architecture and data engineering experience within the Enterprise Data space
  • 5+ years of experience in a pre-sales environment (Sales Engineer, Solutions Engineer, Solutions Architect, etc.)
  • Proven experience working with Financial Services customers
  • Outstanding presentation skills to both technical and executive audiences
  • Ability to connect a customer’s specific business problems to Snowflake’s solutions
  • Ability to conduct deep discovery of a customer’s architecture framework
  • Broad range of experience with large-scale database and/or data warehouse technology, ETL, analytics, and cloud technologies
  • Hands-on development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies
  • Deep understanding of data integration services and tools for building ETL and ELT data pipelines such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica
  • Familiarity with streaming technologies (e.g., Kafka, Flink, Spark Streaming, Kinesis) and real-time or near-real-time use cases (e.g., CDC)
  • Experience designing interoperable data lakehouse architectures and working with Iceberg, Delta, and Parquet
  • Strong architectural expertise in data engineering

Nice to have

  • Master’s degree in computer science, engineering, mathematics, or a related field, or equivalent experience