Assistant Vice President; Quantitative Finance Analyst

Bank of America · Banking · Charlotte, NC

A Quantitative Finance Analyst role focused on developing and automating quantitative analytics and modeling projects, including data analysis, ETL, data modeling, and scripting in Python and SQL, within a financial services context.

What you'd actually do

  1. Automate new and existing EIT tests from second- and first-line coverage (including data analysis, data mapping, business-rule creation and coding on a Python platform, pilot testing, QA, and scheduling to production).
  2. Gather and document test requirements covering both the test script and the data required for testing.
  3. Create automation flows and write recipes in Python.
  4. Participate in proof-of-concept projects to expand automation capability and explore innovative new tools.
  5. Collaborate with FLU and GCOR test owners and perform automation assessments.
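Coding a business rule as an automated test (step 1 above) typically means expressing the rule as a function and running it over a dataset. A minimal sketch, assuming an invented rule and invented field names (`status`, `amount`), purely for illustration:

```python
# Hypothetical business-rule test (rule and field names are invented,
# not from the job description).

def check_rule(record):
    """Sample rule: settled transactions must have a non-negative amount."""
    if record["status"] == "settled" and record["amount"] < 0:
        return "FAIL"
    return "PASS"

def run_test(records):
    """Apply the rule to every record and summarize pass/fail counts."""
    results = [check_rule(r) for r in records]
    return {"total": len(results), "failures": results.count("FAIL")}

sample = [
    {"id": 1, "status": "settled", "amount": 100.0},
    {"id": 2, "status": "settled", "amount": -25.0},
    {"id": 3, "status": "pending", "amount": -5.0},
]
print(run_test(sample))  # → {'total': 3, 'failures': 1}
```

In practice the rule logic would come from the FLU/GCOR test owners' requirements and the records from mapped source data, with the pass/fail summary feeding QA and production scheduling.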

Skills

Required

  • Bachelor's degree or equivalent in Engineering (any), Computer Science, Computer Information Systems, Management Information Systems, or related
  • 5 years of progressively responsible experience in the job offered or a related IT occupation
  • Gathering business requirements, analyzing user needs, and working on design documents
  • Creating complex data models and performing the Extraction, Transformation, and Loading (ETL) of raw source data into those models by cleansing the data and applying business logic
  • Automating with Unix shell scripting, and performance-tuning databases and data warehouses
  • Developing, testing, and deploying the code using Agile methodologies
  • Utilizing SQL, PL/SQL, and Python as scripting languages
  • Working with Hadoop, Oracle, Teradata, and DB2 databases
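The ETL requirement above (extract raw source data, cleanse it, apply business logic, load it into a data model) can be sketched end to end with Python's built-in `sqlite3` module. The schema, field names, and cleansing rules here are assumptions for illustration, not the bank's actual pipeline:

```python
import sqlite3

# Illustrative ETL sketch: extract raw rows, cleanse and transform them,
# then load them into a modeled table. All names and rules are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw staging table with untyped, messy source data.
cur.execute("CREATE TABLE raw_trades (trade_id TEXT, amount TEXT, region TEXT)")
cur.executemany(
    "INSERT INTO raw_trades VALUES (?, ?, ?)",
    [("T1", " 100.50 ", "emea"), ("T2", "bad", "amrs"), ("T3", "75", "APAC")],
)

# Target data model with proper types.
cur.execute("CREATE TABLE trade_model (trade_id TEXT, amount REAL, region TEXT)")

# Transform and load: strip whitespace, drop unparseable amounts,
# normalize the region code to upper case.
for trade_id, amount, region in cur.execute("SELECT * FROM raw_trades").fetchall():
    try:
        clean_amount = float(amount.strip())
    except ValueError:
        continue  # cleansing rule: skip rows whose amount is not numeric
    cur.execute(
        "INSERT INTO trade_model VALUES (?, ?, ?)",
        (trade_id, clean_amount, region.upper()),
    )
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM trade_model").fetchone()[0])  # → 2
```

A production version of the same flow would target Oracle, Teradata, DB2, or Hadoop rather than SQLite, and push set-based transformations into SQL or PL/SQL where the volume warrants it.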

What the JD emphasized

  • 5 years of progressively responsible experience
  • 5 years of experience in each of the following:
  • Gathering business requirements, analyzing user needs, and working on design documents
  • Creating complex data models and responsible for the Extraction, Transformation and Loading (ETL) of the raw source data to the data models by cleansing and applying the business logic
  • Working on Automation using Unix shell scripting and performance tuning of Database and Datawarehouse
  • Developing, testing, and deploying the code using Agile methodologies
  • Utilizing scripting languages of SQL, PL/SQL, and Python, and databases of Hadoop, Oracle, Teradata, and DB2