Data Foundations Engineer

Data Foundations Engineer responsible for designing and scaling modern data architectures, building high-performance data pipelines, and enabling analytics and ML use cases. The role requires strong fundamentals in data modeling and scalable systems, plus experience with cloud platforms and data processing tools. Hands-on experience with LLMs is required, with a focus on the FinTech, Wallet, or Payments domain; exposure to MLOps and GenAI/RAG pipelines is a plus.

What you'd actually do

  1. Design and scale modern data architectures powering Wallet, Payments, and Commerce products.
  2. Build high-performance data pipelines that enable analytics and ML use cases, applying strong fundamentals in data modeling and scalable systems.
  3. Design and implement scalable batch and near-real-time data pipelines.
  4. Develop ETL/ELT workflows optimized for performance and cost.
  5. Instrument APIs and user journeys to capture behavioral and transactional data.
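The ETL/ELT duty above can be pictured with a toy, standard-library-only sketch: extract raw transaction records, transform them (drop failures, normalize amounts to integer cents), and load per-user aggregates. The field names (`user_id`, `amount`, `status`) are illustrative assumptions, not part of the JD; a production pipeline would run this logic in Spark or a similar engine.

```python
from collections import defaultdict

def run_etl(raw_records):
    """Toy ETL step: filter failed transactions and aggregate amounts per user."""
    totals = defaultdict(int)
    for rec in raw_records:
        if rec.get("status") != "SUCCESS":        # drop failed transactions
            continue
        cents = round(float(rec["amount"]) * 100)  # normalize to integer cents
        totals[rec["user_id"]] += cents
    return dict(totals)

raw = [
    {"user_id": "u1", "amount": "12.50", "status": "SUCCESS"},
    {"user_id": "u1", "amount": "3.25", "status": "SUCCESS"},
    {"user_id": "u2", "amount": "99.99", "status": "FAILED"},
]
print(run_etl(raw))  # {'u1': 1575}
```

Keeping monetary values as integer cents in the transform step avoids floating-point drift in downstream aggregates, a common convention in payments pipelines.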

Skills

Required

  • 8+ years of experience in data engineering for analytics or ML systems.
  • Strong SQL proficiency.
  • 8+ years of experience in Python, Scala, or Java.
  • Hands-on experience with Spark, Kafka, and Airflow (or similar).
  • Strong understanding of data modeling and lakehouse architectures (e.g., Iceberg).
  • Experience with AWS, Azure, or GCP.
  • Experience with Snowflake, Databricks, Trino, OLAP/near-real-time (NRT) systems, and BI tools such as Superset or Tableau.
  • Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).
  • Experience in FinTech, Wallet, or Payments domain.
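The RAG requirement above can be sketched minimally with the standard library: retrieve the most relevant document by keyword overlap, then assemble a grounded prompt for an LLM. This is a conceptual toy, real pipelines would use embeddings and a vector store, and the model call itself is omitted here.

```python
def retrieve(query, docs):
    """Return the doc sharing the most terms with the query (naive retrieval)."""
    q_terms = set(query.lower().split())
    return max(docs, key=lambda d: len(q_terms & set(d.lower().split())))

def build_prompt(query, context):
    """Assemble a prompt that grounds the model's answer in retrieved context."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Wallet top-ups settle within two hours.",
    "Refunds for failed payments are issued in 5 business days.",
]
query = "How long do refunds take?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)
```

Swapping the keyword-overlap scorer for embedding similarity, and the in-memory list for a vector index, turns this toy into the standard RAG shape the JD refers to.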

Nice to have

  • Familiarity with CI/CD, data observability, infrastructure-as-code.
  • Exposure to MLOps and GenAI/RAG pipelines.

What the JD emphasized

  • 8+ years of experience in data engineering for analytics or ML systems.
  • Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).

Other signals

  • data pipelines
  • ML use cases
  • data modeling
  • scalable systems
  • GenAI/RAG pipelines
  • LLMs