Software Engineer, Frameworks/UI, Sensing & Connectivity

Apple · Big Tech · Cupertino, CA · Software and Services

Software Engineer on the Location & Motion team at Apple, responsible for extending existing frameworks to support customer-facing features. This role involves creating algorithms that fuse sensor data to extract insights and using large language models to deliver those insights to customers. The engineer will also build internal UI for visualization and validation, and collaborate with AIML experts.

What you'd actually do

  1. Leverage and extend existing frameworks to support customer-facing features on Apple platforms such as iOS
  2. Create algorithms that fuse sensor data to extract insights
  3. Use large language models to deliver these insights to customers
  4. Develop customer-facing features that deliver insights, plus internal UI to help visualize and validate functionality
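To illustrate the kind of sensor-fusion work item 2 describes, here is a minimal sketch of a complementary filter, one common way to fuse gyroscope and accelerometer readings into a stable orientation estimate. The type name, constants, and synthetic inputs below are illustrative assumptions, not Apple APIs or anything specified in this posting.

```swift
import Foundation

/// Illustrative complementary filter (not an Apple API): blends short-term
/// gyroscope integration with a long-term accelerometer reference angle.
struct ComplementaryFilter {
    var angle: Double = 0      // fused pitch estimate, in radians
    let alpha: Double = 0.98   // weight favoring the gyro over the accelerometer

    mutating func update(gyroRate: Double, accelAngle: Double, dt: Double) -> Double {
        // Integrate the gyro for responsiveness, then blend in the
        // accelerometer angle to correct gyro drift over time.
        angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle
        return angle
    }
}

var filter = ComplementaryFilter()
// Feed synthetic samples: gyro reports 0.1 rad/s, accelerometer reads 0.05 rad.
for _ in 0..<100 {
    _ = filter.update(gyroRate: 0.1, accelAngle: 0.05, dt: 0.01)
}
print(filter.angle)
```

In production code on Apple platforms, this raw blending is typically unnecessary because the system frameworks already expose fused device motion; the sketch only shows the underlying idea.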

Skills

Required

  • Proficiency with Objective-C, Swift, or a similar systems programming language
  • Solid software engineering fundamentals — including data structures & algorithms, object-oriented design, and concurrency
  • Inter-process communication and systems development experience
  • Understanding of patterns, approaches, and constraints for large language model prompting
  • Ability to work collaboratively and explain complex ideas clearly

Nice to have

  • Strong analytical and quantitative skills with a solid foundation in mathematics and physics
  • Experience with SwiftUI and SwiftData
  • Interest in climate science, environmental impact reduction, or sustainability metrics
  • A proven track record of shipping applications / delivering code to production

What the JD emphasized

  • customer-facing features
  • large language models
  • fuse sensor data
  • on-device sensors
  • machine learning
