Senior Machine Learning Engineer - Policy & Safety

at Spotify · Consumer · London, United Kingdom +1

Senior Machine Learning Engineer on the Policy & Safety team, responsible for building and scaling ML-driven systems for content moderation, policy enforcement, and compliance within Spotify's consumer experience. This role involves working at the intersection of ML, platform engineering, and regulatory compliance to ensure user safety and trusted interactions.

What you'd actually do

  1. Design, build, and deploy machine learning models and systems for content moderation and policy enforcement.
  2. Develop and maintain data pipelines for compliance and safety-related data.
  3. Collaborate with cross-functional teams (Trust & Safety, Legal, Public Affairs) to integrate safety measures into new features.
  4. Contribute to the re-architecture of large-scale systems to enhance user protection and enable safer interactions.
  5. Ensure systems are compliant with relevant regulations and policies.

Skills

Required

  • Experience building and deploying machine learning models in production.
  • Proficiency in Python and ML frameworks (e.g., TensorFlow, PyTorch).
  • Experience with data engineering and building data pipelines.
  • Understanding of content moderation and safety systems.
  • Familiarity with regulatory compliance requirements.
  • Strong software engineering skills.

Nice to have

  • Experience with large-scale distributed systems.
  • Knowledge of NLP techniques for content analysis.
  • Experience working in a consumer-facing product environment.

What the JD emphasized

  • critical path
  • regulatory compliance
  • safety by default
  • large-scale rearchitecture
  • ML-driven systems
  • proactively protect users
  • empower safer interactions

Other signals

  • content moderation infrastructure
  • detection models
  • policy enforcement systems
  • compliance data pipelines

We design Spotify’s consumer experience—end to end, moment to moment, across every screen, platform, and partner integration. Our mission is to make listening feel effortless, personal, and joyful for billions of users around the world. That means turning complexity into clarity across hundreds of touchpoints—from our mobile and desktop apps to the smart speakers, TVs, cars, and integrations where Spotify shows up every day. If it touches a consumer, we shape it. We bring deep insight into human behavior, design, and technology to craft experiences that feel intuitive, expressive, and unmistakably Spotify.

The Policy & Safety team sits within Content Platform in the Experience Mission, building the systems that keep Spotify safe, compliant, and trusted by millions of users and creators. This team owns Spotify’s content moderation infrastructure — from detection models to policy enforcement systems and compliance data pipelines.

Working at the intersection of machine learning, platform engineering, and regulatory compliance, the team partners closely with Trust & Safety, Legal, and Public Affairs. They’re on the critical path for every new content type and social feature — including messaging, comments, and collaborative experiences — ensuring safety is built in from day one. With a strong focus on “safety by default,” the team is investing in large-scale rearchitecture and ML-driven systems to proactively protect users and empower safer interactions across the platform.
