Sr. Data Engineer

Adobe · Enterprise · San Jose, CA +8

This role is for a Sr. Data Engineer focused on building data integrations on the AWS technology stack for Adobe's Customer Experience Orchestration enterprise customers. Responsibilities include collaborating with architects and engineering teams to define requirements, prepare data, build data pipelines, and develop new features for data integrations. The role requires strong Python/PySpark experience, AWS knowledge, and experience with tools such as Apache Airflow, MongoDB, and MySQL. Experience with Docker/Kubernetes is a plus. The role sits within an enterprise AI context, but the core craft is data engineering and integration rather than AI/ML model building.

What you'd actually do

  1. Collaborate with data architects, enterprise architects, solution consultants, and product engineering teams to capture customer data integration requirements, conceptualize solutions, and build the required technology stack
  2. Collaborate with enterprise customers' engineering teams to identify data sources, profile and quantify the quality of those sources, develop tools to prepare data, and build data pipelines that integrate customer and third-party data sources with Adobe solutions
  3. Develop new features and improve existing integrations with the customer's data ecosystem
  4. Encourage the team to think out of the box and overcome engineering obstacles while incorporating innovative design principles
  5. Collaborate with a project manager to bill and forecast time for customer solutions
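As a rough illustration of the "profile and quantify quality of data sources" duty above, here is a minimal pure-Python sketch of a data-quality profiling helper. The function name, field names, and report shape are hypothetical; real work in this role would more likely be done at scale with PySpark, per the Skills section.

```python
from collections import Counter

def profile_records(records, required_fields):
    """Quantify basic quality of a batch of source records:
    per-field completeness (share of records with a non-empty value)."""
    total = len(records)
    missing = Counter()  # counts empty/absent values per required field
    for rec in records:
        for field in required_fields:
            if rec.get(field) in (None, ""):
                missing[field] += 1
    completeness = {
        f: (1 - missing[f] / total) if total else 0.0
        for f in required_fields
    }
    return {"record_count": total, "completeness": completeness}

# Hypothetical sample: two customer records, one missing an email
sample = [
    {"id": "c1", "email": "a@example.com"},
    {"id": "c2", "email": ""},
]
report = profile_records(sample, ["id", "email"])
```

A report like this (record count plus completeness ratios) is the kind of quantified quality signal that would feed the "prepare data" and pipeline-design steps.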

Skills

Required

  • Experience as an enterprise data engineer with a consulting background
  • AWS Certified Data Engineer – Associate or AWS Certified Cloud Practitioner
  • 5+ years of experience building, operating, and maintaining fault-tolerant, scalable data processing integrations on AWS
  • 7+ years of experience with the Python programming language, preferably including PySpark
  • Software development experience with Apache Airflow, MongoDB, and MySQL
  • Strong capacity to manage numerous projects
  • BS/MS degree in Computer Science or equivalent industry experience
  • Ability to identify and resolve problems in production-grade, large-scale data processing workflows
  • Excellent communication skills
  • Experience creating and maintaining unit tests and continuous integration

Nice to have

  • Experience using Docker or Kubernetes
  • Experience and knowledge of web analytics or digital marketing
  • Experience and knowledge of Customer Data Platforms (CDP) or Data Management Platforms (DMP)
  • Experience and knowledge of Adobe Experience Cloud solutions

What the JD emphasized

  • 5+ years of experience building, operating, and maintaining fault-tolerant, scalable data processing integrations on AWS
  • 7+ years of experience with the Python programming language, preferably including PySpark
  • Strong capacity to manage numerous projects is a must