AWS Forward Deployed Engineer - GPS

This role focuses on building and deploying GenAI-enabled solutions, agentic platforms, and workflows for clients, requiring hands-on experience with AWS GenAI services like Bedrock, Bedrock Agents, and Knowledge Bases. The engineer will translate business needs into AI solutions, develop scalable AI engineering patterns, and apply architecture decisions balancing quality, safety, latency, and cost.

What you'd actually do

  1. Embed with clients to identify business needs and translate high-value GenAI use cases into solutions.
  2. Build AI-enabled solutions, agentic platforms, and workflows across enterprise AI platforms.
  3. Develop scalable AI engineering patterns, tool-use approaches, and human-in-the-loop controls.
  4. Apply architecture decisions that balance quality, safety, latency, cost, and model risk.
  5. Deliver production-quality code using strong practices in testing, CI/CD, logging, versioning, and documentation.
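The tool-use approaches and human-in-the-loop controls in the list above can be sketched as a small dispatch loop. This is a minimal illustration, not any AWS API: the model call is stubbed out (in practice the tool call would come from a Bedrock Converse response), and all names here (`ToolCall`, `execute`, the example tools) are hypothetical.

```python
# Minimal sketch of an agent tool-use loop with a human-in-the-loop gate.
# The model client is stubbed; in a real system the ToolCall would be parsed
# from a Bedrock Converse API response (boto3 "bedrock-runtime" client).
# All names are illustrative, not part of any AWS SDK.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ToolCall:
    name: str
    args: dict

# Tool registry: each tool declares whether a human must approve it first.
TOOLS: dict[str, tuple[Callable[..., str], bool]] = {
    "lookup_order": (lambda order_id: f"order {order_id}: shipped", False),
    "issue_refund": (lambda order_id: f"refund issued for {order_id}", True),
}

def execute(call: ToolCall, approve: Callable[[ToolCall], bool]) -> str:
    """Run a tool call, gating sensitive tools behind a human approval callback."""
    if call.name not in TOOLS:
        return f"error: unknown tool {call.name!r}"
    fn, needs_approval = TOOLS[call.name]
    if needs_approval and not approve(call):
        return f"blocked: human rejected {call.name}"
    return fn(**call.args)
```

Gating only the tools flagged as sensitive keeps low-risk lookups fast while routing irreversible actions (refunds, writes) through a reviewer, which is one common way the "human-in-the-loop controls" requirement is interpreted.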

Skills

Required

  • 4+ years of experience in software engineering, data engineering, data science, or analytics engineering
  • 1+ years of hands-on experience building and deploying GenAI/LLM-powered solutions in client or production environments
  • 1+ years of experience with AWS, including hands-on experience with at least one of the following key platform technologies: Amazon Bedrock, Bedrock Agents, Knowledge Bases, Guardrails
  • 1+ years of experience leading project workstreams/engagements and translating business problems into AI solutions
  • 1+ years of experience building reliable, maintainable, and well-documented code
  • Ability to travel 50%, on average
  • Must be legally authorized to work in the United States without the need for employer sponsorship
  • Ability to obtain and maintain a US government security clearance

Nice to have

  • Experience with cloud environments (AWS, Azure, and/or Google Cloud) and common platform services (storage, compute, IAM, networking)
  • Demonstrated ability to work directly alongside client technical teams and program stakeholders in fast-paced, ambiguous delivery environments
  • Data engineering experience (Spark, Airflow/dbt, streaming, data modeling) or an ML/data science background (feature engineering, experimentation, model evaluation)
  • Experience with MLOps/LLMOps practices: evaluation frameworks, model monitoring, and prompt management
  • Experience integrating LLM solutions with enterprise systems via APIs, microservices, or event-driven architectures
  • Experience operating within hybrid onshore/offshore teams
  • Familiarity with security, privacy, and compliance considerations
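The "evaluation frameworks" mentioned in the MLOps/LLMOps bullet above can be as simple as an offline harness scoring model outputs against expected content. This is a hedged sketch under stated assumptions: the model is a stand-in callable (in practice it would wrap a Bedrock or other inference call), and the eval cases and function names are illustrative.

```python
# Minimal sketch of an offline LLM evaluation harness. The "model" is a
# plain callable taking a prompt and returning text; in practice it would
# wrap an inference call (e.g. Amazon Bedrock). All names are illustrative.

from typing import Callable

# Each case pairs a prompt with substrings the response must contain.
EVAL_SET = [
    {"prompt": "Which US state hosts us-east-1?", "must_include": ["Virginia"]},
    {"prompt": "Expand the acronym IAM.", "must_include": ["Identity", "Access"]},
]

def evaluate(model: Callable[[str], str], cases: list[dict]) -> float:
    """Return the fraction of cases whose response contains every required substring."""
    passed = 0
    for case in cases:
        response = model(case["prompt"])
        if all(s in response for s in case["must_include"]):
            passed += 1
    return passed / len(cases)
```

Running a harness like this on every prompt or model change is a lightweight form of the regression testing that LLMOps practice (evaluation, monitoring, prompt management) calls for.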

What the JD emphasized

  • GenAI-enabled solutions
  • GenAI use cases
  • agentic platforms
  • tool-use approaches
  • human-in-the-loop controls
  • model risk

Other signals

  • Build AI-enabled solutions, agentic platforms, and workflows
  • Develop scalable AI engineering patterns, tool-use approaches
  • Prototype and deliver working AI solutions