Databricks Consultant

Consultant role focused on implementing and optimizing the Databricks platform for clients, covering data engineering, data analytics, and machine learning model deployment. Requires hands-on expertise in Databricks, cloud platforms (AWS, Azure, GCP), SQL, Python, and Apache Spark.

What you'd actually do

  1. Data Engineering: Apply your project experience to design, develop, test, and maintain scalable data pipelines and data products using tools like Auto Loader, Lakeflow Declarative Pipelines (formerly Delta Live Tables), and Structured Streaming.
  2. BI & Data Analytics: Enable business intelligence and data analytics by building and optimizing Databricks SQL warehouses (formerly SQL endpoints), data models, and integrations with BI tools.
  3. Machine Learning: Assist in the development and deployment of machine learning models using Databricks, Unity Catalog for governance, and MLflow for the ML lifecycle.
  4. Client Advisory: Advise clients on best practices for implementing and operating the full suite of Databricks capabilities, drawing from successful real-world use cases.
  5. Technical Implementation: Engage in the hands-on implementation of comprehensive Databricks solutions, from data ingestion and transformation to AI/ML model deployment.
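As a rough idea of what the pipeline work in item 1 looks like day to day, here is a minimal Auto Loader ingest sketch. It only runs on the Databricks Runtime (`spark` is the notebook's preconfigured SparkSession and the `cloudFiles` source is Databricks-specific), and every path, schema location, and table name below is a hypothetical placeholder, not part of the role description.

```python
# Minimal sketch of a Databricks Auto Loader pipeline (Databricks Runtime only).
# `spark` is the SparkSession that Databricks notebooks provide automatically;
# all paths and table names are hypothetical placeholders.
from pyspark.sql import functions as F

# Incrementally discover new JSON files landing in cloud storage.
raw = (
    spark.readStream
    .format("cloudFiles")                       # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/orders")
    .load("s3://example-bucket/landing/orders/")
)

# Light transformation before persisting.
cleaned = raw.withColumn("ingested_at", F.current_timestamp())

# Append to a Delta table using a Unity Catalog three-level name.
(
    cleaned.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .trigger(availableNow=True)                 # process the backlog, then stop
    .toTable("main.sales.orders_bronze")
)
```

The `availableNow` trigger is a common pattern for scheduled incremental jobs: each run drains whatever files have arrived since the last checkpoint and then exits, rather than running as a continuous stream.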

Skills

Required

  • Databricks platform
  • SQL
  • Python
  • Apache Spark
  • AWS
  • Azure
  • GCP

Nice to have

  • Databricks certifications
  • Public sector experience
  • Active public sector clearance

What the JD emphasized

  • hands-on expertise
  • hands-on implementation

Other signals

  • Databricks Platform
  • data engineering
  • data analytics
  • machine learning
  • ML model deployment
  • Databricks SQL endpoints
  • Unity Catalog
  • MLflow