Data Engineer, Go-to-market

Notion · Enterprise · San Francisco, CA · Engineering

Notion is seeking a Data Engineer to build foundational datasets and pipelines for its go-to-market strategy. The role involves integrating product and business systems, supporting marketing and sales reporting, and designing scalable data pipelines. The ideal candidate has 3+ years of data engineering experience, familiarity with SaaS marketing/sales datasets and business systems, experience with cloud data solutions, SQL expertise, and object-oriented programming skills. AI expertise is not required, but curiosity and a willingness to adopt AI tools are encouraged.

What you'd actually do

  1. You'll build core datasets to serve as the sources of truth for Notion’s marketing and sales reporting, integrating data to and from business systems and Notion’s product.
  2. You’ll partner closely with our Marketing, Sales, Revenue Operations, Business Technology, Business Intelligence, and Data Science teams to support critical reporting and analysis needs.
  3. You'll design, build, and monitor pipelines that meet today's requirements but can gracefully scale with our growing data volume.
  4. You’ll help democratize access to high-quality data across go-to-market teams, staff, and the entire company.

Skills

Required

  • 3+ years of experience as a data engineer building core datasets and supporting business verticals
  • Experience working with both Marketing and Sales datasets at a SaaS company
  • Experience building integrations with business systems such as Salesforce, NetSuite, Marketo, and Zendesk
  • A self-starter who continuously gathers and synthesizes high-impact needs from business partners, designs and implements the appropriate technical solutions, and communicates effectively about deliverables, timelines, and tradeoffs
  • Hands-on experience shipping scalable data solutions in the cloud (e.g., AWS, GCP, Azure), across multiple data stores (e.g., Snowflake, Redshift, Hive, SQL/NoSQL, columnar storage formats) and modeling methodologies (e.g., dimensional modeling, data marts, star/snowflake schemas)
  • Expert-level SQL
  • Comfort with object-oriented programming languages (e.g., Python, Java, Scala)

Nice to have

  • Hands-on experience designing and building highly scalable, reliable data pipelines with the big-data stack (e.g., Airflow, dbt, Spark, Hive, Parquet/ORC, Protobuf/Thrift)
  • Experience working with go-to-market stakeholders
  • Experience at a fast-growing company, or eagerness to contribute in such an environment