Resources and insights
Our Blog
Explore insights and practical tips on mastering the Databricks Data Intelligence Platform and the full spectrum of today's modern data ecosystem.
Most teams that move to Databricks get the hard part right. They migrate the processing engine, rebuild the transformation logic, and stand up Unity Catalog. Then they leave Azure Data Factory running in the background: connected to everything, owned by nobody, and quietly accumulating cost and complexity. That's the gap this entry addresses.
How to Pass Terraform Outputs to Databricks Asset Bundles (DABs)
As teams migrate deployment definitions into Databricks Asset Bundles, Terraform still owns the Azure layer: Key Vaults, resource groups, networking. This post walks through a clean, CI/CD-ready pattern for passing Terraform outputs directly into bundle variable overrides, eliminating manual config steps and the environment drift that follows them.
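As a rough sketch of the pattern, a small glue script can read `terraform output -json` and hand the values to the Databricks CLI as `--var` overrides. The Terraform root path ("infra/azure"), the bundle target ("dev"), and the variable names are all hypothetical placeholders here:

```python
import json
import subprocess

# Read Terraform outputs as JSON from the root module that owns the Azure layer
# ("infra/azure" is a hypothetical path).
result = subprocess.run(
    ["terraform", "output", "-json"],
    cwd="infra/azure",
    capture_output=True,
    text=True,
    check=True,
)
outputs = json.loads(result.stdout)

# Map Terraform output names to bundle variables (both sets of names are hypothetical).
var_flags = [
    f"--var=key_vault_uri={outputs['key_vault_uri']['value']}",
    f"--var=resource_group={outputs['resource_group']['value']}",
]

# Deploy with the overrides so nothing is hand-copied into databricks.yml.
subprocess.run(["databricks", "bundle", "deploy", "-t", "dev", *var_flags], check=True)
```

Dropped into a CI job, the same script means every deploy picks up whatever Terraform most recently provisioned, with no values pasted between repos.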
Grant read permission on individual secrets in Unity Catalog
Implement granular secret access control in Unity Catalog that goes beyond traditional Key Vault-level permissions. This advanced approach uses Unity Catalog UDFs with service credentials to create secret-specific access controls, allowing you to grant users access to individual secrets rather than entire Key Vaults.
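A minimal sketch of the core idea, assuming a Unity Catalog service credential named "kv-reader" and a vault at my-vault.vault.azure.net (both hypothetical): the function body hard-codes a single secret name, so the function itself becomes the grantable unit.

```python
from azure.keyvault.secrets import SecretClient

# dbutils is predefined in Databricks notebooks; "kv-reader" is a hypothetical
# Unity Catalog service credential with read access to the vault.
credential = dbutils.credentials.getServiceCredentialsProvider("kv-reader")

client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",  # hypothetical vault URL
    credential=credential,
)

def get_api_key() -> str:
    # Hard-coding one secret name is the point: registering this body as a
    # Unity Catalog UDF and granting EXECUTE on it exposes only "api-key",
    # not the rest of the vault.
    return client.get_secret("api-key").value
```

Register one such function per secret and grant EXECUTE per principal; Unity Catalog then acts as the per-secret gate while the underlying credential stays hidden from callers.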
Unity Catalog to Azure Key Vault: No more dbutils.secrets()
Learn how to securely connect Azure Databricks to Key Vault using Unity Catalog Service Credentials for enterprise-grade secret management and governance.
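Illustratively, the swap looks roughly like this, with the vault URL and credential name as placeholders; `getServiceCredentialsProvider` returns an Azure token provider that the Key Vault SDK accepts directly:

```python
from azure.keyvault.secrets import SecretClient

# Before: a Key Vault-backed secret scope read through dbutils.
# password = dbutils.secrets.get(scope="kv-scope", key="sql-password")

# After: a Unity Catalog service credential ("kv-service-credential" is a
# placeholder name) supplies an Azure token provider directly.
credential = dbutils.credentials.getServiceCredentialsProvider("kv-service-credential")
client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",  # placeholder vault URL
    credential=credential,
)
password = client.get_secret("sql-password").value
```

Access is then governed by a grant on the service credential in Unity Catalog rather than by workspace-level secret scope ACLs.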