Software Engineer II

JPMorgan Chase · Banking · Bengaluru, Karnataka, India · Commercial & Investment Bank

A Software Engineer II role focused on Data Management and Governance within JPMorgan Chase's Chief Data Office (CDO). The role involves designing, building, and deploying dashboards and reports using Qlik, Python, and Jira to track OKRs, KPIs, and KRIs related to data standards and controls. It requires collaboration with a range of stakeholders and offers exposure to enterprise-wide data initiatives.

What you'd actually do

  1. Support and Enhance Controls & Reporting: Maintain, develop, and extend the existing CIB controls and reporting processes using Jira, Python, and Qlik, while leveraging APIs to source data from various firmwide metadata systems of record (SORs)
  2. Design and Maintain Data Integration Pipelines: Build and optimize processes to aggregate, transform, and load data from multiple firmwide metadata systems into a consistent reporting data model
  3. Build Best-in-Class Metadata Reporting: Influence and deliver a production-grade metadata reporting capability that enables ad-hoc, dynamic, and self-service reporting across diverse data sources
  4. Drive Data Standard Compliance: Design, build, and deploy dashboards and reports that track OKR, KPI, and KRI metrics to measure implementation progress, adoption, and compliance with the CIB Data Standard and Technical Control Procedures
  5. Enable Data-Driven Decision Making: Design and deliver intuitive visualizations and reporting solutions that provide stakeholders with clear visibility into metadata quality, lineage, and governance metrics—highlighting areas requiring attention and driving prioritized action on remediation and compliance
  6. Collaborate Across the Enterprise: Partner with Business Analysts, Data Management Owners, Application Owners, CIB Information Architecture teams, and other Lines of Business to understand requirements and deliver impactful solutions
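The integration work in items 1–2 can be sketched roughly as follows — a minimal Python example that normalizes records from two hypothetical metadata systems of record (SORs) into one consistent reporting table. All field names, schemas, and sample payloads here are invented for illustration; a real pipeline would source these via each SOR's API.

```python
import sqlite3

# Hypothetical payloads from two metadata SORs with different schemas;
# in practice these would be fetched from firmwide APIs.
sor_a_records = [
    {"dataset": "trades", "owner": "alice", "quality_score": 0.92},
    {"dataset": "positions", "owner": "bob", "quality_score": 0.78},
]
sor_b_records = [
    {"asset_name": "trades", "steward": "alice", "lineage_complete": True},
    {"asset_name": "positions", "steward": "bob", "lineage_complete": False},
]

def normalize_a(rec):
    """Map SOR A's schema onto the common reporting model."""
    return {"dataset": rec["dataset"], "owner": rec["owner"],
            "quality_score": rec["quality_score"], "lineage_complete": None}

def normalize_b(rec):
    """Map SOR B's schema onto the same model; fields it lacks stay NULL."""
    return {"dataset": rec["asset_name"], "owner": rec["steward"],
            "quality_score": None, "lineage_complete": rec["lineage_complete"]}

def build_reporting_model(conn, records):
    """Load normalized records into a single reporting table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS metadata_report (
        dataset TEXT, owner TEXT, quality_score REAL, lineage_complete INTEGER)""")
    conn.executemany(
        "INSERT INTO metadata_report "
        "VALUES (:dataset, :owner, :quality_score, :lineage_complete)",
        records)

conn = sqlite3.connect(":memory:")
rows = [normalize_a(r) for r in sor_a_records] + [normalize_b(r) for r in sor_b_records]
build_reporting_model(conn, rows)
```

The design point is the one the role description makes: each source keeps its own schema, and the pipeline's job is the mapping into one consistent model that downstream Qlik dashboards can query.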

Skills

Required

  • data integration
  • reporting
  • analytics development
  • Qlik (QlikSense)
  • Tableau
  • aggregating data from multiple sources
  • ETL/ELT tools (Pentaho, Informatica, Apache Spark)
  • SQL
  • Python
  • OLTP databases
  • OLAP databases
  • NoSQL databases
  • Data Lake
  • Data Warehouse
  • Data Lakehouse patterns
  • reporting templates
  • metrics reporting
  • data standards adoption
  • data governance
  • communication (presentations, documents, workshops, meetings)

Nice to have

  • AI/ML techniques
  • Agile methodologies
  • BCBS 239
  • GDPR
  • CCPA
  • React
  • Bootstrap

What the JD emphasized

• Minimum of 3 years of experience in data integration, reporting, and/or analytics development
  • Strong skills and demonstrable knowledge of developing metrics, dashboards, and reports using Qlik (primarily QlikSense) and Tableau
  • Experience in aggregating data from multiple sources (direct database access, API, files, etc.) into a consistent reporting data model
• Use of ETL and ELT tools such as Pentaho, Informatica, and Apache Spark
  • Experience with, and confidence in, using SQL and Python to implement data integration and reporting solutions
  • Experience reporting on metrics that measure how well teams are adopting and adhering to data standards and policies
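That last point — metrics on data-standard adoption — can be illustrated with a small sketch. The record layout and the notion of "required vs. implemented controls" are hypothetical; the actual KPI/KRI definitions would come from the CIB Data Standard.

```python
# Hypothetical per-application control counts; real data would come
# from the reporting model fed by the metadata SORs.
records = [
    {"app": "app1", "controls_required": 10, "controls_implemented": 9},
    {"app": "app2", "controls_required": 8, "controls_implemented": 4},
]

def adoption_rate(recs):
    """Percentage of required controls implemented across all apps."""
    done = sum(r["controls_implemented"] for r in recs)
    total = sum(r["controls_required"] for r in recs)
    return round(100 * done / total, 1)

print(adoption_rate(records))  # → 72.2
```

A dashboard would typically slice this by line of business or application owner to highlight where remediation effort should be prioritized.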