Senior Lead Software Engineer - Databricks & Python

JPMorgan Chase · Banking · Glasgow, Lanarkshire, United Kingdom · Corporate Sector

Senior Lead Software Engineer role focused on building and optimizing Python/PySpark big data pipelines on Databricks for regulatory reporting in a fintech environment. The role involves designing and delivering microservices, owning end-to-end build-and-run, and providing technical leadership in areas like CI/CD and observability. While AI coding assistants are mentioned for accelerating delivery, the core function is data engineering and software development, not AI/ML model building.

What you'd actually do

  1. Lead by example: contribute production‑quality code daily, write tests, perform code reviews, and pair program.
  2. Design and deliver secure, high‑quality microservices and web applications with Java/Spring Boot and React; own deep debugging, root‑cause analysis, and performance tuning for high-availability services.
  3. Own end‑to‑end build‑and‑run: design, implement, test, deploy, and operate services (“you build it, you run it”).
  4. Design, develop, and maintain robust big data pipelines using Python and PySpark on the Databricks platform on AWS, optimizing complex queries and data processing workflows to ensure efficient performance at scale.
  5. Provide technical leadership and act as SME for microservices, CI/CD, observability, performance engineering, and data modelling.
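
The pipeline duty in point 4 centres on large-scale group-and-aggregate work. PySpark itself needs a Spark runtime, so as an illustrative sketch only, here is the pure-Python analogue of the aggregation shape such a regulatory pipeline typically takes (the record fields and function name are hypothetical, not from the JD):

```python
from collections import defaultdict

# Hypothetical trade records, standing in for rows a PySpark job would
# read from Parquet/Delta files on S3.
trades = [
    {"desk": "rates", "notional": 100.0},
    {"desk": "rates", "notional": 250.0},
    {"desk": "fx",    "notional": 75.0},
]

def aggregate_notional_by_desk(rows):
    """Group rows by desk and sum notionals -- the pure-Python analogue
    of df.groupBy("desk").agg(F.sum("notional")) in PySpark."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["desk"]] += row["notional"]
    return dict(totals)

print(aggregate_notional_by_desk(trades))  # {'rates': 350.0, 'fx': 75.0}
```

In a real Databricks job the same logic would run distributed across a cluster; the point of the sketch is only the aggregate-by-key shape common to regulatory reporting workloads.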

Skills

Required

  • Strong hands-on experience in data engineering or related roles
  • Strong proficiency in Python and PySpark for large-scale data processing
  • Advanced proficiency in Java and Spring Boot; strong fundamentals, design patterns, and secure coding
  • Full‑stack delivery with React (component/state management) and secure RESTful API design
  • Demonstrated experience with AWS, Databricks, and the Apache Spark ecosystem
  • Reliability and performance engineering: concurrency, thread management, caching, and resiliency patterns (circuit breakers, retries, backoff), with cost awareness
  • Proven track record shipping and operating production systems; comfortable troubleshooting in Kubernetes, CI/CD, and cloud environments
  • Relational and NoSQL databases: schema design, performance tuning, and secure data access
  • Experience with AWS cloud services (S3, ECS, SNS/SQS, Lambda, etc.)
  • Strong analytical skills, with the ability to investigate data issues, identify root causes, and implement solutions
  • Experience with the complete SDLC, Jules/Jenkins, Spinnaker, Sonar, and Agile methodologies
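
The resiliency-patterns bullet above (circuit breakers, retries, backoff) is the most concrete item in the list; as a hedged, stdlib-only sketch of one of those patterns, not any particular library's API (tenacity in Python or resilience4j in Java are the usual production choices):

```python
import time

def with_retries(fn, max_attempts=3, base_delay=0.1, sleep=time.sleep):
    """Call fn, retrying with exponential backoff on failure.
    Hypothetical helper illustrating the retry/backoff pattern."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the last error
            sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...

# Usage: a flaky call that succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky, sleep=lambda _: None))  # prints: ok
```

A production version would also cap the delay, add jitter to avoid thundering herds, and retry only on transient error types, which is where a dedicated library earns its keep.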

Nice to have

  • Understanding of regulatory finance/external reporting (workflows, aggregations, reconciliations, controls)
  • Cloud certifications; proven cloud‑native delivery on AWS
  • Experience with large‑scale distributed systems, event‑driven architectures, and messaging/streaming patterns
  • Observability/SRE depth: telemetry pipelines, alerting strategies, incident response, post‑mortems, and continuous improvement
  • Experience with data orchestration tools (Airflow, Step Functions, etc.)
  • Understanding of the financial services industry and regulatory requirements
  • Databricks or AWS certifications
  • Automated testing frameworks, e.g. Playwright, Cucumber, Gherkin
  • Experience with data formats (Parquet, JSON, CSV, Avro) and Delta Lake

What the JD emphasized

  • regulatory finance/external reporting
  • regulatory requirements