Sr. Data Engineer

Adobe · Enterprise · San Jose, CA +6

This role is for a Sr. Data Engineer focused on building data integrations using AWS technologies for Adobe's Customer Experience Orchestration enterprise customers. The engineer will collaborate with various teams to capture requirements, conceptualize solutions, and build the necessary technology stack, including developing new features and improving existing data integrations. The role involves profiling data quality, preparing data, and building pipelines that integrate customer and third-party data with Adobe solutions. Experience with Python, PySpark, Apache Airflow, MongoDB, and MySQL is required; Docker and Kubernetes experience is a plus. The role also involves collaborating with a Project Manager on billing and forecasting.

What you'd actually do

  1. Collaborate with data architects, enterprise architects, solution consultants, and product engineering teams to capture customer data-integration requirements, conceptualize solutions, and build the required technology stack
  2. Collaborate with the enterprise customer's engineering team to identify data sources, profile and quantify their quality, develop tools to prepare data, and build pipelines that integrate customer and third-party data sources with Adobe solutions
  3. Develop new features and improve existing data integrations with the customer's data ecosystem
  4. Encourage the team to think outside the box and overcome engineering obstacles while incorporating innovative design principles.
  5. Collaborate with a Project Manager on billing and time forecasting for customer solutions

Skills

Required

  • Experience as an enterprise Data Engineer from a consulting background
  • AWS Certified Data Engineer – Associate or AWS Certified Cloud Practitioner
  • 5+ years of experience building, operating, and maintaining fault-tolerant, scalable data-processing integrations on AWS
  • 7+ years of experience with the Python programming language, preferably including PySpark
  • Software development experience with Apache Airflow, MongoDB, and MySQL
  • Ability to identify and resolve problems in production-grade, large-scale data-processing workflows
  • Experience creating and maintaining unit tests and continuous integration.

Nice to have

  • Experience using Docker or Kubernetes
  • BS/MS degree in Computer Science or equivalent industry experience
  • Experience and knowledge of Web Analytics or Digital Marketing
  • Experience and knowledge of Customer Data Platform (CDP) or Data Management Platform (DMP)
  • Experience and knowledge of Adobe Experience Cloud solutions

What the JD emphasized

  • A strong capacity to manage numerous projects simultaneously is a must