Data Engineer

Visa · Fintech · London, United Kingdom

Visa is seeking a Data Engineer to build and evolve scalable data engineering capabilities that support Data Science, AI, and client-facing products for European markets. The role involves requirement analysis, building ETL processes, developing data models, managing data pipelines with tools like Apache Airflow, ensuring data quality, and collaborating with stakeholders. Experience with big data tools (Hadoop, Hive, Spark), Python, SQL, cloud services, and GenAI applications is required.

What you'd actually do

  1. Understand and translate business needs into data models supporting long-term solutions
  2. Build, manage, and deploy large-scale ETL processes to generate data assets for the region
  3. Build modular, reusable code with configurability and scalability in mind, adhering to the low-level design
  4. Perform thorough unit testing of development tasks and document the test results using standard defined templates
  5. Build, schedule, and manage DAGs in Apache Airflow efficiently
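
The orchestration duties above can be sketched as a minimal Apache Airflow DAG. This is an illustrative assumption, not a detail from the posting: the `dag_id`, schedule, retry policy, and task bodies are all hypothetical placeholders for the kind of extract/transform/load flow described.

```python
# Minimal Airflow 2.x DAG sketch; names and schedule are illustrative,
# not taken from the job description.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (stubbed here)."""


def transform():
    """Apply business rules to the extracted records (stubbed here)."""


def load():
    """Write the transformed data asset to its target store (stubbed here)."""


with DAG(
    dag_id="regional_data_assets",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load
    t_extract >> t_transform >> t_load
```

The `>>` operator declares task ordering, so Airflow's scheduler runs each step only after its upstream task succeeds; retries and scheduling are handled declaratively rather than in the task code.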

Skills

Required

  • 2-4 years of development experience building data pipelines and writing ETL code using Hive, PySpark, SQL, and Unix
  • Experience in writing and optimizing SQL queries in a big data environment
  • Experience working in Linux/Unix environment and exposure to command line utilities
  • Experience creating/supporting production software/systems and a proven track record of identifying and resolving performance bottlenecks for production systems
  • Exposure to code version control systems (e.g. git, GitHub)
  • Experience working with cloud services (e.g. AWS, GCP, Azure)
  • Familiarity with common agentic coding tools
  • Hands-on experience building GenAI-based applications or workloads
  • Ability to understand a diverse set of business domains and requirements
  • Good understanding of agile working practices and related program management skills
  • Experience with workflow orchestration tools (e.g., Apache Airflow) and designing reliable data workflows
  • Experience applying data quality frameworks and practices (e.g., automated checks, reconciliation and data observability)
  • Strong communication and presentation skills, with the ability to interact with cross-functional team members at varying levels
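
The data-quality requirement above (automated checks and reconciliation) can be as simple as comparing row counts between a source and its loaded target. A minimal sketch in plain Python, where the function name and tolerance parameter are illustrative assumptions:

```python
def reconcile_counts(source_count: int, target_count: int,
                     tolerance: float = 0.0) -> bool:
    """Return True when the target row count is within `tolerance`
    (expressed as a fraction of the source count) of the source count."""
    if source_count == 0:
        # An empty source reconciles only against an empty target.
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance


# A load that dropped 2 of 1,000 rows fails a zero-tolerance check
# but passes with a 0.5% tolerance.
print(reconcile_counts(1000, 1000))                   # exact match
print(reconcile_counts(1000, 998))                    # fails at tolerance=0.0
print(reconcile_counts(1000, 998, tolerance=0.005))   # passes at 0.5%
```

In production such checks typically run as a post-load task in the pipeline, failing the run (and alerting) rather than silently publishing an incomplete data asset.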

Nice to have

  • Advanced degree in a technical field (e.g., Computer Science, Statistics)
  • Experience with visualization tools like Tableau and Power BI
  • Exposure to Financial Services or the Payments Industry
  • Hands-on experience with CI/CD and automation pipelines (e.g., GitHub Actions, Jenkins, Azure DevOps) including testing and release practices

What the JD emphasized

  • Hands-on experience building GenAI-based applications or workloads

Other signals

  • data pipelines
  • ETL
  • GenAI applications
  • data quality
  • data models