Data Foundations Engineer

Data Foundations Engineer responsible for designing and scaling modern data architectures, building high-performance data pipelines, and enabling analytics and ML use cases. The role focuses on data modeling and scalable systems within the Wallet, Payments, and Commerce products, with exposure to MLOps, GenAI/RAG, and LLMs.

What you'd actually do

  1. Design and scale modern data architectures powering the Wallet, Payments, and Commerce products.
  2. Build high-performance data pipelines that enable analytics and ML use cases, applying strong fundamentals in data modeling and scalable systems.
  3. Design and implement scalable batch and near-real-time data pipelines.
  4. Develop ETL/ELT workflows optimized for performance and cost.
  5. Implement dimensional data models and standardize business metrics.
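To ground the dimensional-modeling and metric-standardization responsibilities above, here is a minimal sketch of a star schema with one standardized metric. All table, column, and metric names (dim_merchant, fact_payment, gross payment volume) are hypothetical illustrations, not from the JD; Python's stdlib sqlite3 stands in for a real warehouse such as Snowflake or Databricks.

```python
import sqlite3

# Toy star schema: a payment fact table joined to a merchant dimension.
# Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_merchant (
    merchant_key  INTEGER PRIMARY KEY,
    merchant_name TEXT,
    country       TEXT
);
CREATE TABLE fact_payment (
    payment_id   INTEGER PRIMARY KEY,
    merchant_key INTEGER REFERENCES dim_merchant(merchant_key),
    amount_usd   REAL,
    paid_at      TEXT
);
INSERT INTO dim_merchant VALUES (1, 'Acme', 'US'), (2, 'Globex', 'DE');
INSERT INTO fact_payment VALUES
    (10, 1, 120.00, '2024-01-03'),
    (11, 1,  80.00, '2024-01-04'),
    (12, 2,  50.00, '2024-01-04');
""")

# A "standardized metric" means one agreed-upon definition computed in one
# place -- here, gross payment volume (GPV) per merchant.
rows = conn.execute("""
    SELECT m.merchant_name, ROUND(SUM(f.amount_usd), 2) AS gpv_usd
    FROM fact_payment f
    JOIN dim_merchant m USING (merchant_key)
    GROUP BY m.merchant_name
    ORDER BY m.merchant_name
""").fetchall()
print(rows)  # [('Acme', 200.0), ('Globex', 50.0)]
```

In practice the same pattern scales up: facts land via batch or near-real-time pipelines, dimensions are conformed across products, and metric definitions live in a shared semantic layer rather than ad hoc queries.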

Skills

Required

  • Data Engineering
  • Data Modeling
  • SQL
  • Python
  • Scala
  • Java
  • Spark
  • Kafka
  • Airflow
  • Lakehouse architectures
  • AWS
  • Azure
  • GCP
  • Snowflake
  • Databricks
  • Trino
  • OLAP/NRT systems
  • Superset
  • Tableau
  • CI/CD
  • Data observability
  • Infrastructure-as-code
  • MLOps
  • GenAI/RAG pipelines
  • LLMs (prompt engineering, fine-tuning, RAG)

Nice to have

  • Willingness to participate in a rotating on-call schedule

What the JD emphasized

  • 6+ years of experience in data engineering for analytics or ML systems
  • Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG)
  • Experience in FinTech, Wallet, or Payments domain
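Since hands-on RAG experience is emphasized, here is a deliberately minimal sketch of the retrieval step: rank documents against a query by cosine similarity over bag-of-words vectors, then place the top hit into an LLM prompt. The documents, query, and prompt wording are made up for illustration; a production pipeline would use an embedding model and a vector store instead of word counts.

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    # Bag-of-words term counts; a real RAG pipeline would use embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical knowledge-base snippets.
docs = [
    "refund policy for wallet payments",
    "how to configure spark shuffle partitions",
    "merchant onboarding checklist",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

# Retrieval-augmented prompt: retrieved context + the user question.
context = retrieve("wallet refund", docs)[0]
prompt = f"Answer using this context: {context}\nQuestion: wallet refund"
print(context)  # refund policy for wallet payments
```

The "augmented" part of RAG is just the last two lines: grounding the model's answer in retrieved context rather than its parametric memory.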
