Sr. Data Operations Analyst

Tempus AI · Vertical AI · Chicago, IL

This role is responsible for building and maintaining the data architecture that powers AI agents and cross-functional data products within the diagnostics business unit. It involves translating operational requirements into technical specifications, operationalizing data pipelines, and owning reporting and dashboard development. The role also supports GenAI and agent lifecycle management, including defining success metrics and evaluation frameworks.

What you'd actually do

  1. Build and own the operational data architecture: Serve as the technical lead in translating diagnostic business needs into operational data processes. Define data models, source-to-target mappings, and the structural logic that connects SFDC, OPUS, LIMS, Hub, and lab systems into unified, reliable data products.
  2. Operationalize diagnostic data pipelines: Use SQL and data transformation tools to create and combine operational workflow data streams — including order, case, communication, and physician activity data — into formats that power dashboards, agents, and downstream analytics.
  3. Lead data architecture for AI agent development: Partner with AI Engineering and Tempus One agent teams to define the data requirements for diagnostic workflow agents (e.g., order status, tissue request automation, cancellation workflows). Translate operational needs into BRDs and technical specifications that engineering teams can build from.
  4. Support GenAI and agent lifecycle management: Assist in scoping, piloting, and ongoing monitoring of AI agents for diagnostic workflows. Define success metrics, establish ground truth evaluation frameworks, and manage the feedback loop between operations and the AI team post-launch.
  5. Build cross-functional data products: Extend operational reporting into shared data products used by BI, Commercial Systems, and business leadership. Ensure that data is structured, documented, and accessible enough to serve as input for agent builds, executive reporting, and cross-team analytics.
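To make the pipeline work in items 1–2 concrete, here is a minimal sketch of the kind of source-to-target join involved: order records from a CRM-like source and case records from a LIMS-like source combined into one unified view per order. All table, column, and system names here are illustrative stand-ins, not actual Tempus schemas.

```python
import sqlite3

# Two hypothetical source tables standing in for SFDC order data and
# LIMS case data, loaded into an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sfdc_orders (order_id TEXT, physician TEXT, status TEXT);
    CREATE TABLE lims_cases  (order_id TEXT, case_state TEXT);
    INSERT INTO sfdc_orders VALUES ('O-1', 'Dr. Lee', 'received'),
                                   ('O-2', 'Dr. Kim', 'cancelled');
    INSERT INTO lims_cases  VALUES ('O-1', 'sequencing'),
                                   ('O-2', 'closed');
""")

# Source-to-target mapping: one unified row per order, the shape a
# dashboard or an order-status agent would consume.
rows = conn.execute("""
    SELECT o.order_id, o.physician, o.status, c.case_state
    FROM sfdc_orders o
    LEFT JOIN lims_cases c USING (order_id)
    ORDER BY o.order_id
""").fetchall()

for row in rows:
    print(row)
```

In practice a tool like dbt would version and test this transformation as a model rather than running it ad hoc, but the structural idea — joining per-system streams on a shared key into a documented, reusable data product — is the same.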

Skills

Required

  • Deep proficiency in SQL and experience with data transformation tools (e.g., dbt); ability to write and optimize complex queries across multi-source schemas
  • Experience with data visualization tools (Looker strongly preferred; Tableau acceptable)
  • Demonstrated experience building or operationalizing AI/ML data pipelines, agent workflows, or automated data products
  • Excellent interpersonal and communication skills; proven ability to translate technical data requirements to non-technical stakeholders and business leadership
  • Comfort with ambiguity, ability to create structure in fast-moving environments, and strong instinct for prioritization when data access is incomplete
  • Proficient in Google Suite (Sheets, Docs, Slides) and Excel
  • Bachelor's degree in an analytical, computational, or healthcare-related field (e.g., Data Science, Bioinformatics, Computer Science, Biomedical Engineering, Public Health)
  • 4–6 years of relevant experience in data analytics, data engineering, healthcare analytics, or clinical operations data

Nice to have

  • Experience in healthcare diagnostics, oncology, or life sciences
  • Familiarity with CRM systems (Salesforce/SFDC) and clinical data systems (EMR, LIMS)
  • Experience with prompt engineering, LLM agent testing, or AI agent evaluation frameworks
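The "ground truth evaluation frameworks" mentioned above reduce, at their simplest, to scoring an agent against labeled expected answers. A minimal sketch, assuming each test case pairs an input with a labeled expected output; the function and the toy order-status agent are hypothetical examples, not part of any Tempus system:

```python
def evaluate(agent, cases):
    """Return the accuracy of `agent` over (input, expected) ground-truth pairs."""
    correct = sum(1 for inp, expected in cases if agent(inp) == expected)
    return correct / len(cases)

# Toy agent: canned answers standing in for an order-status workflow agent.
canned = {"O-1": "received", "O-2": "cancelled"}
agent = lambda order_id: canned.get(order_id, "unknown")

# Labeled ground-truth set; the agent has no answer for O-3.
ground_truth = [("O-1", "received"), ("O-2", "cancelled"), ("O-3", "shipped")]
print(evaluate(agent, ground_truth))  # 2 of 3 cases correct
```

A real framework would track per-case failures and feed them back to the operations and AI teams post-launch, as item 4 of the responsibilities describes, rather than reporting a single aggregate number.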

What the JD emphasized

  • primary owner of dashboards
  • define the data layer that makes agent building possible
  • partner cross-functionally to ensure data products are built on a stable, scalable foundation
  • primary builder and maintainer
  • leader in translating diagnostic business needs into operational data processes
  • define the data requirements for diagnostic workflow agents
  • establish ground truth evaluation frameworks

Other signals

  • AI Engineering
  • AI agents
  • data products
  • operational data architecture
  • data pipelines