Senior Full-stack Software Engineer – Verification Data Platform

NVIDIA · Semiconductors · Durham, NC +2

NVIDIA is seeking a Senior Full-stack Software Engineer to build a next-generation event-driven data platform for its GPU development. The role involves developing real-time data processing pipelines, building microservices and AI agents, and establishing guidelines for data architecture and observability. Experience with Kafka, Flink, Spark Streaming, and distributed systems is required; familiarity with LLM integration and agentic AI frameworks is a plus.

What you'd actually do

  1. Participate in the full life-cycle (development, test, and deployment) of web applications and services for test automation and chip verification.
  2. Develop and optimize real-time data processing pipelines using Kafka Streams, Apache Flink and Spark Streaming, ensuring high throughput, reliability, and low-latency performance.
  3. Collaborate closely with hardware engineering and chip verification teams to understand data requirements and deliver robust, scalable data solutions and UIs.
  4. Establish optimal guidelines for streaming data architecture, schema management, data retention and platform observability (monitoring, logging, tracing).
  5. Build microservices and AI agents that are reliable, scalable, and maintainable.
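The streaming work in items 2 and 4 centers on keyed, windowed aggregation. A minimal sketch of that core idea in plain Python, over an in-memory event list (this is not Kafka Streams itself; `Event`, `tumbling_window_counts`, and the sample keys are invented for illustration):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    key: str           # e.g., a test-bench or chip-block identifier
    timestamp_ms: int  # event time, not arrival time
    value: int

def tumbling_window_counts(events, window_ms):
    """Group events into fixed, non-overlapping windows per key and sum
    their values -- the shape of a Kafka Streams
    groupByKey().windowedBy(...).aggregate(...) topology."""
    windows = defaultdict(int)
    for e in events:
        # Align each event to the start of its window.
        window_start = (e.timestamp_ms // window_ms) * window_ms
        windows[(e.key, window_start)] += e.value
    return dict(windows)

events = [
    Event("gpu-block-a", 1_000, 2),
    Event("gpu-block-a", 1_500, 3),
    Event("gpu-block-a", 2_200, 1),  # falls into the next 2s window
    Event("gpu-block-b", 1_100, 5),
]
print(tumbling_window_counts(events, window_ms=2_000))
# {('gpu-block-a', 0): 5, ('gpu-block-a', 2000): 1, ('gpu-block-b', 0): 5}
```

A real pipeline would read from Kafka topics, handle late data with watermarks, and persist state, but the keyed-window aggregation above is the piece the JD keeps returning to.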

Skills

Required

  • Java
  • Python
  • JavaScript/TypeScript
  • front-end frameworks (e.g., Ember.js, Vue.js)
  • distributed systems
  • microservices
  • Apache Kafka
  • Kafka Streams
  • Apache Flink
  • event-driven data pipelines
  • concurrency
  • data structures
  • algorithms
  • Redis
  • Linux
  • BS/MS in Computer Science or related field or equivalent experience
  • 8+ years of experience for BS holders or 5+ years for MS holders

Nice to have

  • Elastic Stack (Elasticsearch, Kibana, Logstash)
  • OpenTelemetry, Grafana
  • building agentic AI frameworks
  • LLM-powered autonomous agents
  • integrating LLMs with real-time data streams
  • chip design process
  • EDA verification workflows
  • ClickHouse

What the JD emphasized

  • Proven in-depth knowledge of Java and Python.
  • Deep understanding of Apache Kafka and proven experience building applications with Kafka Streams, Apache Flink, or other event-driven data pipelines.
  • Deep understanding of scalable data caching solutions, specifically Redis.

Other signals

  • AI agent development
  • Build microservices and AI agents
  • integrating LLMs with real-time data streams