Forward Deployed Engineer – AWS

Forward Deployed Engineer (FDE) for AWS at Deloitte, focused on building and deploying enterprise-scale GenAI solutions for clients. The role involves translating business needs into AI use cases, prototyping and delivering working AI solutions, and developing scalable AI engineering patterns, agentic platforms, and workflows. The FDE makes architecture decisions that balance quality, safety, latency, and cost, and delivers production-quality code with strong engineering practices.

What you'd actually do

  1. Prototype and deliver working AI solutions using industry expertise and emerging capabilities.
  2. Build AI-enabled solutions, agentic platforms, and workflows across enterprise AI platforms.
  3. Develop scalable AI engineering patterns, tool-use approaches, and human-in-the-loop controls.
  4. Apply architecture decisions that balance quality, safety, latency, cost, and model risk.
  5. Deliver production-quality code using strong practices in testing, CI/CD, logging, versioning, and documentation.

Skills

Required

  • 3+ years of experience in software engineering, data engineering, data science, or analytics engineering
  • 1+ years of experience with AWS AI & Data services, including hands-on experience with at least one of the following key platforms/products: Amazon Bedrock, Bedrock Agents, Knowledge Bases, Guardrails
  • 1+ years of hands-on experience building and deploying GenAI/LLM-powered solutions in client or production environments
  • 1+ years of experience leading project workstreams/engagements and translating business problems into AI solutions
  • 1+ years of experience building reliable, maintainable, and well-documented code
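As a rough illustration of the hands-on Bedrock experience the requirements describe, the sketch below invokes a model through the Bedrock Converse API with boto3. The model ID, region, and inference settings are assumptions, not requirements from the posting; a client engagement would layer on guardrails, retries, evaluation, and logging.

```python
def build_converse_request(prompt: str,
                           model_id: str = "anthropic.claude-3-haiku-20240307-v1:0"):
    # Assemble keyword arguments in the shape the Bedrock Converse API expects.
    # The model ID and inference settings here are illustrative assumptions.
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

def ask(prompt: str, region: str = "us-east-1") -> str:
    # Requires AWS credentials with permission to invoke the chosen model.
    import boto3  # imported here so the request builder works without boto3 installed
    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_converse_request(prompt))
    # The Converse API returns the assistant turn under output.message.content.
    return response["output"]["message"]["content"][0]["text"]
```

Separating request construction from the network call keeps the prompt/config logic unit-testable without AWS credentials, which is the kind of engineering practice the "reliable, maintainable, and well-documented code" requirement points at.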

Nice to have

  • Experience with cloud environments (AWS, Azure, and/or Google Cloud) and common platform services (storage, compute, IAM, networking)
  • Demonstrated ability to work directly alongside client technical teams and program stakeholders in fast-paced, ambiguous delivery environments
  • Data engineering experience (Spark, Airflow/dbt, streaming, data modeling) or an ML/data science background (feature engineering, experimentation, model evaluation)
  • Experience with MLOps/LLMOps practices: evaluation frameworks, model monitoring, and prompt management
  • Experience integrating LLM solutions with enterprise systems via APIs, microservices, or event-driven architectures
  • Experience operating within hybrid onshore/offshore teams
  • Familiarity with security, privacy, and compliance considerations

What the JD emphasized

  • GenAI-enabled solutions
  • agentic platforms
  • tool-use approaches
  • production-quality code

Other signals

  • building working software
  • enterprise-scale impact