Services Finance, Data Scientist

Apple · Big Tech · Cupertino, CA · Corporate Functions

This role focuses on leveraging AI coding assistants and agentic AI frameworks to build and deploy analytical solutions within Services Finance at Apple. The Data Scientist will partner with stakeholders to understand business problems, design data-driven solutions, prototype rapidly using AI tools, and build production-ready models and automated pipelines. The role requires strong statistical and machine learning fundamentals, proficiency in Python or R and SQL, and experience with prompt engineering and agentic workflows.

What you'd actually do

  1. Partner with stakeholders to understand complex business problems and identify opportunities for data-driven impact
  2. Design and implement analytical frameworks that directly address business needs while leveraging AI tools to accelerate delivery
  3. Rapidly prototype solutions using AI coding assistants to validate approaches and gather early feedback
  4. Build and deploy production-ready models and automated analytics pipelines, ensuring scalability, reliability, and performance monitoring
  5. Conduct rigorous statistical analysis and model validation to measure impact and ensure solution effectiveness

Skills

Required

  • 5+ years of experience in a data science or related analytical role
  • Bachelor's degree in applied mathematics, statistics, computer science, data science, economics, or related quantitative field
  • Creative and curious thinker with ability to translate business problems into data requirements and actionable solutions
  • Proven "builder" mentality: demonstrated ability to independently execute from idea to implementation, with track record of shipping production solutions with minimal oversight
  • Excellent communication skills with ability to present complex findings to both technical and non-technical audiences
  • Strong programming proficiency in Python or R, with demonstrated experience using AI coding assistants (e.g., Claude Code, GitHub Copilot) to accelerate development
  • Expertise in SQL and data wrangling with large-scale datasets
  • Strong foundation in statistical methods and machine learning, with experience applying these techniques to solve business problems
  • Experience with prompt engineering and developing agentic AI workflows for automation and efficiency

Nice to have

  • Advanced modeling expertise: Time series forecasting, Bayesian methods, or anomaly detection
  • Data visualization & storytelling: Experience with interactive visualization tools and report generation frameworks such as Shiny, Quarto, Streamlit, Tableau, etc.
  • Data infrastructure proficiency: Modern data platforms (Snowflake, BigQuery, Spark), workflow orchestration (Airflow, GitHub Actions, etc.), and containerization (Docker)
  • API development and integration: Building and consuming REST APIs, database connectivity
  • Software engineering practices: Git, CI/CD pipelines, shell scripting, and production deployment experience

What the JD emphasized

  • AI coding assistants
  • agentic AI frameworks
  • prompt engineering
  • agentic AI workflows

Other signals

  • production-ready models
  • automated analytics pipelines