Solutions Consultant

Snowflake · Data AI · Mexico · Remote · Professional Services

A Solutions Consultant role focused on deploying Snowflake's cloud products and services, helping customers modernize legacy environments and migrate to Snowpark/Snowflake. The role involves acting as the customer-facing expert, feeding requirements from project experience back to internal tooling teams, and guiding customers on data engineering pipelines. Experience with AI/ML, data warehousing, and cloud projects is beneficial, as is the ability to architect Spark and Scala environments and develop best practices for distributed computing and machine learning frameworks.

What you'd actually do

  1. Deliver exceptional outcomes for our teams and customers during modernization projects.
  2. Engage with customers to migrate from legacy environments to Snowpark/Snowflake.
  3. Act as the expert for our customers and partners throughout the migration process.
  4. Beyond customer engagements, work with our internal team to provide requirements for our SnowConvert utility, based on project experiences.
  5. Ensure our tooling is continuously improved based on implementation experience.

Skills

Required

  • University degree in computer science, engineering, mathematics or related fields, or equivalent experience
  • 2–5 years of experience as a solutions architect, data architect, database administrator, or data engineer
  • Experience in Data Warehousing, Business Intelligence, AI/ML, application modernization, or Cloud projects
  • Experience building real-time and batch data pipelines using Spark and Scala
  • Proven track record of results with multi-party, multi-year digital transformation engagements
  • Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders
  • Strong organizational skills, ability to work independently and manage multiple projects simultaneously
  • Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations
  • Hands-on experience in a technical role (SQL, data warehousing, cloud data, analytics, or ML/AI)
  • Extensive knowledge of and experience with large-scale database technology (e.g. Snowflake, Netezza, Exadata, Teradata, Greenplum, etc.)
  • Software development experience with Python, Java, Spark, and other scripting languages
  • Proficiency in implementing data security measures, access controls, and security design within the Snowflake platform
  • Internal and/or external consulting experience

Nice to have

  • Ability to outline the architecture of Spark and Scala environments
  • Guide customers on architecting and building data engineering pipelines on Snowflake
  • Run workshops and design sessions with stakeholders and customers
  • Create repeatable processes and documentation as a result of customer engagement
  • Scripting in Python and shell for ETL workflows
  • Develop best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
  • Weigh in on and develop frameworks for distributed computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and machine learning as part of our tools development and overall modernization processes

What the JD emphasized

  • AI-native thinkers
  • Reinventing how they work
  • Low-ego individuals
  • Experimental mindset
  • Rapidly testing emerging capabilities
  • Willingness to forge ahead to deliver outcomes for customers in a new arena, with a new product set
  • Passion for solving complex customer problems
  • Ability to learn new technology and build repeatable solutions/processes
  • Ability to anticipate project roadblocks and have mitigation plans in-hand