Senior Databricks Developer - Government Health Sector
About the Role
We're hiring a Senior Databricks Developer to join our team in the Government Health sector. This role offers the opportunity to work remotely while making a significant impact on public health outcomes through advanced data analytics.
What You'll Do
- Lead the design and development of large-scale, resilient, and performant ETL/ELT data pipelines using PySpark/Scala within Databricks notebooks and jobs.
- Architect and manage the Delta Lake environment, focusing on data ingestion, quality enforcement using Delta Live Tables, and schema evolution for complex public health datasets.
- Optimise Databricks clusters, notebooks, and Spark jobs for cost-efficiency and performance, specifically targeting bottlenecks in high-volume batch and streaming workloads.
- Define and enforce data governance practices within the Lakehouse, utilising Unity Catalog for centralised metadata and access control, adhering to government standards.
- Collaborate closely with government analysts and data scientists to transition analytical models and research findings into scalable, production-ready pipelines.
- Champion CI/CD and MLOps practices for Databricks notebooks and workflows, utilising tools like Azure DevOps or Jenkins.
- Mentor and guide junior engineers on Databricks development standards, Spark optimisation, and modern data engineering practices.
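The data-quality responsibility above centres on Delta Live Tables expectations, which drop or flag rows that violate declared rules. As a minimal stand-alone sketch of that pattern in plain Python (the table, rule names, and sample records are illustrative assumptions, not from this posting):

```python
# A stand-alone analogue of DLT-style quality expectations: rows failing
# any rule are dropped, and drops are counted per rule for monitoring.
# All names and sample data below are illustrative assumptions.

def apply_expectations(rows, expectations):
    """Return rows passing every rule, plus a per-rule drop count."""
    passed = []
    drop_counts = {name: 0 for name in expectations}
    for row in rows:
        ok = True
        for name, rule in expectations.items():
            if not rule(row):
                drop_counts[name] += 1
                ok = False
        if ok:
            passed.append(row)
    return passed, drop_counts

admissions = [
    {"patient_id": "P1", "age": 34},
    {"patient_id": None, "age": 51},   # fails valid_id
    {"patient_id": "P3", "age": -2},   # fails valid_age
]
rules = {
    "valid_id": lambda r: r["patient_id"] is not None,
    "valid_age": lambda r: isinstance(r["age"], int) and 0 <= r["age"] <= 120,
}
clean, dropped = apply_expectations(admissions, rules)
```

On Databricks itself this logic would instead be declared with DLT expectation decorators on a pipeline table, so the platform records the quality metrics automatically.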
Requirements
- 6+ years of progressive professional experience in Data Engineering, with at least 3 years dedicated to developing solutions on the Databricks Platform.
- Expert-level proficiency in PySpark and/or Scala for distributed data processing.
- Hands-on experience with Delta Lake architecture, including DLT, time travel, and VACUUM operations.
- Deep understanding of cloud infrastructure (Azure preferred) and how Databricks integrates with cloud storage (ADLS Gen2) and services.
- Expert proficiency in SQL and dimensional modelling principles.
- Proven experience with CI/CD, Infrastructure as Code (e.g., Terraform), and Databricks command-line tools for automation.
- Exceptional communication and problem-solving skills, with the ability to analyse complex requirements and design resilient solutions in a highly regulated environment.
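The SQL and dimensional-modelling requirement above can be sketched with a minimal star schema: one fact table joined to a dimension and aggregated. This uses stdlib `sqlite3` purely for portability; on Databricks the same query would run against Delta tables. Table and column names are illustrative assumptions.

```python
# Minimal star-schema sketch: a fact table joined to one dimension,
# then aggregated. Names and figures are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region_name TEXT);
CREATE TABLE fact_admissions (
    admission_id INTEGER PRIMARY KEY,
    region_id INTEGER REFERENCES dim_region(region_id),
    admission_count INTEGER
);
INSERT INTO dim_region VALUES (1, 'NSW'), (2, 'VIC');
INSERT INTO fact_admissions VALUES (10, 1, 120), (11, 1, 80), (12, 2, 95);
""")

# Aggregate the fact table by the dimension attribute.
rows = cur.execute("""
    SELECT d.region_name, SUM(f.admission_count) AS total
    FROM fact_admissions AS f
    JOIN dim_region AS d ON d.region_id = f.region_id
    GROUP BY d.region_name
    ORDER BY d.region_name
""").fetchall()
```

The same join-to-dimension-then-aggregate shape is what most reporting queries over a Lakehouse gold layer reduce to.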
What We Offer
- Competitive salary with an estimated range of AUD 120,000 - 150,000 per year.
- Visa sponsorship for international candidates.
- Remote work flexibility.
- Opportunities for professional development and mentorship.
- Work on impactful projects that enhance public health outcomes.
This Senior Databricks Developer role combines remote flexibility, competitive pay, and visa sponsorship with impactful work in the Government Health sector.
About AI Talent
Explore AI Talent careers in 2026 and discover job openings across remote, hybrid, and office roles. Use our advanced filters to tailor your search, track your applications, and gain insights into companies. Stay updated with industry news and find the right opportunity to grow your career at AI Talent.
Who Will Succeed Here
- Expertise in building and optimising ETL/ELT data pipelines using PySpark and Scala within Databricks, demonstrating a strong understanding of Delta Lake architecture and performance tuning.
- Proven experience working in a remote environment, showcasing self-discipline and the ability to manage time effectively while collaborating with cross-functional teams through tools like Azure DevOps and CI/CD pipelines.
- Strong background in cloud infrastructure management using Terraform and Azure, with a strategic mindset for implementing scalable solutions in the government health sector.