Software Engineer III - Python/PySpark

JPMorgan Chase · Banking · Jersey City, NJ +1 · Corporate Sector

A Software Engineer III role with Python/PySpark expertise at JPMorgan Chase, focused on designing and delivering technology products within the Financial Planning and Analysis team. Responsibilities include executing software solutions, creating production code, producing architecture artifacts, working with data scientists, optimizing data workflows, implementing data governance, and using Databricks for data processing. Requires 3+ years of experience in system design, application development, Python/Java, PySpark/Spark, AWS services, and database technologies.

What you'd actually do

  1. Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  2. Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
  3. Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  4. Works closely with data scientists and analysts to comprehend data requirements and deliver solutions that align with business objectives
  5. Optimizes and troubleshoots data workflows to ensure high performance and reliability

Skills

Required

  • software engineering concepts
  • system design
  • application development
  • testing
  • operational stability
  • Python
  • Java
  • PySpark
  • Spark
  • big data frameworks
  • AWS services
  • cloud-based data solutions
  • SQL
  • PostgreSQL
  • DynamoDB
  • modern programming languages
  • database querying languages
  • Software Development Life Cycle
  • agile methodologies
  • CI/CD
  • Application Resiliency
  • Security
  • cloud
  • artificial intelligence
  • machine learning
  • mobile

Nice to have

  • data warehousing solutions
  • ETL processes
  • data visualization tools and techniques
  • Databricks

What the JD emphasized

  • 3+ years applied experience
  • Expertise in PySpark/Spark
  • Hands-on experience with AWS services and cloud-based data solutions