Senior Staff Software Engineer, AI/ML, Security

Google · Kirkland, WA +3

Senior Staff Software Engineer focused on AI/ML Security for Google Cloud, specifically on securing LLMs against threats like prompt injection and data exfiltration. The role involves defining technical strategy, designing and implementing scalable systems, influencing security roadmaps across Google, and resolving technical disagreements. It requires experience in ML infrastructure, AI model security, and delivering enterprise-grade security products.

What you'd actually do

  1. Define and drive the long-term technical goal and architectural strategy for Model Armor and Sensitive Data Protection, ensuring Google Cloud remains a leader in generative AI security.
  2. Lead the design and implementation of highly scalable, low-latency systems to detect and mitigate emerging threats such as prompt injection, jailbreaking, and sensitive data leakage across enterprise environments.
  3. Act as a principal technical influencer to align security roadmaps across Google Cloud, DeepMind, and other product areas, navigating constraints like latency, cost, and global compliance.
  4. Resolve technical disagreements among executive engineers and researchers, forging consensus to deliver on high-stakes, cross-functional objectives across multiple time zones.
  5. Partner with executive leadership to evaluate ML security research and provide technical mentorship to staff and executive engineers, elevating the organization’s overall domain expertise.

Skills

Required

  • 8 years of experience in software development
  • 7 years of experience leading technical project strategy, ML design, and working with industry-scale ML infrastructure (e.g., model deployment, model evaluation, data processing, debugging, fine-tuning)
  • 5 years of experience with one or more of the following: speech/audio (e.g., technology duplicating and responding to the human voice), reinforcement learning (e.g., sequential decision making), ML infrastructure, or specialization in another ML field
  • 5 years of experience with software design and architecture, and with testing and launching software products
  • 5 years of experience in cloud computing

Nice to have

  • Master’s degree or PhD in Engineering, Computer Science, or a related technical field
  • 8 years of experience with data structures and algorithms
  • 5 years of experience in a technical leadership role leading project teams and setting technical direction
  • 5 years of programming experience in Java
  • Experience with AI model security, adversarial machine learning, or data privacy (e.g., prompt injection defenses, LLM introspection)
  • Experience delivering enterprise-grade security products that achieve measurable, high-impact business and customer outcomes

What the JD emphasized

  • secure LLMs
  • prompt injection
  • jailbreaking
  • data exfiltration
  • enterprise customers
  • Google Cloud scale
  • extreme ambiguity
  • high-stakes, cross-Google initiatives
  • AI model security
  • adversarial machine learning
  • data privacy

Other signals

  • LLM security
  • generative AI security