(USA) Senior Software Engineer

Walmart · Retail · Bentonville, AR

Senior Software Engineer role focused on building platforms, Machine Learning models, and AI Agents within Walmart's Finance Technology department. The role involves processing large quantities of data, developing ML models for observability and alerting, and orchestrating systems for end-to-end visibility. Experience with traditional software engineering practices is required, with a plus for Big Data, analytics, and ML model deployment.

What you'd actually do

  1. Work and partner with teams to build platforms, machine learning models, and AI agents under the Finance Data Factory team within Finance Technology.
  2. Develop machine learning models to enable observability, monitoring, data quality, and impact-based alerting; orchestrate with multiple systems in the landscape to provide end-to-end visibility and connectivity between systems to better serve the business and customers.
  3. Provide visibility into data correctness, completeness, and accuracy for downstream systems.
  4. Identify data gaps upfront, enabling proactive alerting.
  5. Enable data transparency and insights, helping kick off the data remediation process in upstream systems.

Skills

Required

  • Minimum 3 to 4 years of software development experience
  • Professional frontend development experience using JavaScript, HTML/CSS, and React or Next.js
  • Strong Java, Spring Framework, Kafka, SQL, and cloud experience (e.g., Azure)
  • Strong experience in RESTful Microservices
  • Container technologies such as Docker and Kubernetes
  • Strong API design, development, and management
  • Experience with cloud native technology, CI/CD (KITT)
  • Experience with third-party libraries and APIs
  • Strong experience with a UI tech stack using React and React Native
  • Experience with UI testing frameworks like Jest
  • Software development in Java or Python (PySpark preferred)
  • Solid skills in the Java stack (Spring, Maven, Hibernate)
  • Strong experience with cloud services using Azure or Google Cloud Platform (GCP)
  • Experience working with CI/CD tooling (Jenkins, Travis CI)
  • Experience with cloud deployments (scaling, resiliency, load balancing, etc.)
  • Experience with SQL and NoSQL databases (e.g., MSSQL, GCP BigQuery)
  • Experience working with message streaming systems (e.g., Kafka, RabbitMQ, Pub/Sub)
  • Experience working in an agile environment (Scrum, daily standups, etc.)
  • Experience delivering and supporting a large-scale production system
  • Experience as a core contributor to a software project: understanding domain and business requirements and being responsible for critical parts of the application
  • Experience working with a logging and application monitoring stack: Splunk, Dynatrace, Grafana
  • Good communication skills and experience working on multi-team projects

Nice to have

  • Strong knowledge of retail and one or more business practices across domains such as product, finance, marketing, technology, and business systems.
  • Big Data processing and feature engineering
  • Exceptional analytical and quantitative problem-solving skills.
  • Experience with large dataset mining tools and open-source programming tools (e.g., R, Python).
  • Expertise in solving optimization problems using integer programming.
  • Ability to deconstruct abstract problems, formulate hypotheses, and conduct detailed analyses.
  • Experience navigating ambiguity and building rapid business cases/prototypes.
  • Hands-on experience with analytics and reporting tools such as Google Cloud Platform (BigQuery), Microsoft Azure, Power BI, and Microsoft Excel.
  • Time-series forecasting, anomaly detection, and LLM tuning using sound statistical and deep learning methods.
  • Deployment of machine learning models to cloud platforms to serve internal customers at scale.
  • Experience in Technical Solutions Architecture and design leadership.

What the JD emphasized

  • core contributor to a software project

Other signals

  • Develop machine learning models
  • AI Agents
  • Finance Data Factory
  • Observability, monitoring, data quality, and impact-based alerting
  • Orchestrate with multiple systems