Resident Solutions Architect - Public Sector

Databricks · Data + AI · CA · Remote · Professional Services Operations

This role is for a Big Data Solutions Architect on the Professional Services team, working with clients on short- to medium-term engagements using the Databricks platform. The architect will provide data engineering, data science, and cloud technology project support, including designing reference architectures, creating how-to guides, and productionizing customer use cases. The role involves consulting on architecture and design, bootstrapping or implementing customer projects, and providing escalated support for operational issues. It requires experience in data engineering, distributed computing (Spark), cloud ecosystems, and MLOps, with a focus on delivering value to customers through the Databricks platform.

What you'd actually do

  1. You will work with clients on short- to medium-term engagements to address their big data challenges using the Databricks platform.
  2. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.
  3. You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases.
  4. You will guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications.
  5. You will consult on architecture and design, and bootstrap or implement customer projects, leading to customers' successful understanding, evaluation, and adoption of Databricks.

Skills

Required

  • 6+ years of experience in data engineering, data platforms, and analytics
  • Comfortable writing code in either Python or Scala
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
  • Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Design and deployment of performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and whiteboarding skills
  • Experience working with clients and managing conflicts
  • Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions
  • Databricks Certification

Nice to have

  • Candidates with an active Secret or higher clearance are strongly encouraged to apply.

What the JD emphasized

  • U.S. citizenship and eligibility for a U.S. government Secret clearance are required