AI Analytics Engineer (AI & Analytics Platform)

Airtable · Enterprise · Austin, New York, San Francisco · Data

Airtable is seeking an AI Analytics Engineer to build and maintain the context layer, evaluation frameworks, and adoption strategy for internal AI analytics tools. This role involves building AI agent systems, designing evaluation frameworks, and developing automated insight generation systems to enable self-service analytics.

What you'd actually do

  1. Build and maintain context infrastructure: Translate institutional business knowledge into structured formats — business glossaries, dbt model enrichment, semantic layer definitions in Omni Analytics — so that AI tools can answer questions accurately, not just confidently.
  2. Design and run evaluation frameworks: Develop predefined test cases, accuracy benchmarks, and validation workflows that measure whether AI-generated insights are trustworthy. Own the feedback loop between eval results and context improvements.
  3. Build and orchestrate AI agent systems: Help design, build, and iterate on the agent architectures that power our analytics tools — including prompt pipelines, tool orchestration, query routing logic, and guardrails that determine when AI should answer autonomously vs. escalate for human validation.
  4. Experiment and evaluate: Test prompt configurations, agent behaviors, and model outputs across different use cases — using eval results and accuracy metrics to drive continuous improvement.
  5. Develop internal AI tooling and workflows: Build tools and automations that improve DS&A's own efficiency — identifying opportunities where AI can accelerate the team's work and executing on them.
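The evaluation and guardrail work in items 2 and 3 can be sketched minimally. Everything below is illustrative, not Airtable's actual stack: the `EvalCase` shape, the whitespace-based SQL normalization, and the confidence threshold are all assumptions standing in for a real eval harness and routing layer.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class EvalCase:
    """A predefined test case: a question and the SQL it should produce."""
    question: str
    expected_sql: str


def normalize(sql: str) -> str:
    # Crude normalization so cosmetic differences (case, whitespace)
    # don't count as failures; a real harness would compare result sets.
    return " ".join(sql.lower().split())


def run_eval(cases: List[EvalCase], generate_sql: Callable[[str], str]) -> float:
    """Run every case through the generator and return an accuracy metric."""
    passed = sum(
        normalize(generate_sql(c.question)) == normalize(c.expected_sql)
        for c in cases
    )
    return passed / len(cases)


def route(confidence: float, threshold: float = 0.8) -> str:
    # Guardrail: answer autonomously only above a confidence threshold,
    # otherwise escalate to human validation.
    return "auto" if confidence >= threshold else "human_review"
```

The point of the sketch is the feedback loop the JD describes: eval accuracy drops point to gaps in the context layer, and the routing threshold is one knob for deciding when AI answers on its own versus escalating.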

Skills

Required

  • SQL proficiency
  • Modern data tools (dbt, Databricks, Snowflake, or similar)
  • Building context infrastructure
  • Designing evaluation frameworks
  • Building and orchestrating AI agent systems
  • Prompt engineering
  • Tool orchestration
  • Guardrails
  • Translating business knowledge into structured formats
  • Developing internal AI tooling and workflows
  • Building automated insight generation systems
  • Communicating complex business logic
  • Writing clear documentation

Nice to have

  • Experience in SaaS or tech environments

What the JD emphasized

  • AI & Analytics Platform
  • AI-powered analytics tools
  • natural-language-to-SQL capabilities
  • Claude
  • Omni Analytics
  • AI-native world
  • AI agent systems
  • prompt pipelines
  • tool orchestration
  • query routing logic
  • guardrails
  • human validation
  • eval results
  • accuracy metrics
  • internal AI tooling
  • automated insight generation systems
  • LLMs
  • prompt engineering
  • AI tooling
  • build systems around them
  • SQL-proficient
  • modern data tools
  • dbt
  • Databricks
  • Snowflake
  • AI-powered analytics
  • best practices don't exist yet
  • own workstreams end-to-end
  • scoping the problem
  • building the solution
  • iterating based on feedback
  • bring ideas to the table
  • move things forward
  • step-by-step direction
  • data-related roles
  • analytics engineer
  • data analyst
  • data scientist
  • partnering with business stakeholders
  • SaaS
  • tech environments

Other signals

  • building systems that make AI accurate
  • designing workflows that make AI trustworthy
  • partnering across the business to drive adoption
  • building and orchestrating AI agent systems
  • developing internal AI tooling and workflows
  • building automated insight generation systems