Big Data

Running an Azure Databricks notebook in a CI/CD release stage

What happens if you have multiple environments? Or what happens if you need to make sure your notebook passes unit tests before making its way to production?

In this post I will cover how you can execute a Databricks notebook from a release stage and push changes to production after a successful run and sign-off from a pre-deployment approval gate.
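As a rough sketch of what such a release stage could send, here is how one might build the request body for the Databricks Jobs runs-submit endpoint (`POST /api/2.1/jobs/runs/submit`) to run a notebook once on a throwaway cluster. The notebook path, run name, and cluster settings below are illustrative assumptions, not values from the post:

```python
import json

def build_runs_submit_payload(notebook_path, node_type_id="Standard_DS3_v2"):
    """Build the JSON body for a one-time notebook run via the Databricks
    Jobs runs-submit API (POST /api/2.1/jobs/runs/submit).
    The cluster spec here is a placeholder, not a recommendation."""
    return {
        "run_name": "ci-notebook-test",
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",  # placeholder runtime version
            "node_type_id": node_type_id,
            "num_workers": 1,
        },
        "notebook_task": {"notebook_path": notebook_path},
    }

# Hypothetical test notebook path; the body would be POSTed with a bearer token.
payload = build_runs_submit_payload("/Shared/tests/run_unit_tests")
body = json.dumps(payload)
```

The release pipeline would poll the returned `run_id` until the run reaches a terminal state and fail the stage if the result is not `SUCCESS`.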

Read More »
Big Data

CI/CD with Databricks and Azure DevOps

So you’ve created notebooks in your Databricks workspace, collaborated with your peers, and now you’re ready to operationalize your work. This is a simple process if you only need to copy to another folder within the same workspace. But what if you need to separate your DEV and PROD environments?

Things get a little more complicated when it comes to automating code deployment; hence the need for CI/CD.
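One building block a deployment pipeline can use is the Workspace import API (`POST /api/2.0/workspace/import`), which takes the notebook source base64-encoded. A minimal sketch of constructing that body, with an assumed target path and dummy source:

```python
import base64

def build_import_payload(target_path, source_code, language="PYTHON"):
    """Build the body for the Databricks Workspace import API
    (POST /api/2.0/workspace/import). The `content` field must be
    base64-encoded notebook source; `overwrite` replaces an existing file."""
    return {
        "path": target_path,
        "format": "SOURCE",
        "language": language,
        "overwrite": True,
        "content": base64.b64encode(source_code.encode("utf-8")).decode("ascii"),
    }

# Hypothetical PROD path and trivial source, for illustration only.
payload = build_import_payload("/PROD/etl/ingest", "print('hello from prod')")
```

A release pipeline would send one such request per notebook against the PROD workspace URL, authenticated with a token scoped to that workspace.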

Read More »
Big Data

Managing passwords and secrets in Azure Databricks with KeyVault

When it comes to securing passwords, exchanging secrets between users sharing the same notebooks, or simply keeping them out of clear text, changes in the integration between Databricks and Azure have made things even easier… I’m referring to Databricks secret scopes backed by Azure Key Vault.
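To give a flavor of the setup, here is a sketch of the request body for creating a Key Vault-backed secret scope via the Databricks Secrets API (`POST /api/2.0/secrets/scopes/create`). The scope name, resource ID, and DNS name are placeholders for a real Key Vault:

```python
def build_keyvault_scope_payload(scope_name, keyvault_resource_id, keyvault_dns_name):
    """Body for the Databricks Secrets API (POST /api/2.0/secrets/scopes/create)
    creating a scope backed by Azure Key Vault. Values passed in are
    placeholders, not a real vault."""
    return {
        "scope": scope_name,
        "scope_backend_type": "AZURE_KEYVAULT",
        "backend_azure_keyvault": {
            "resource_id": keyvault_resource_id,
            "dns_name": keyvault_dns_name,
        },
    }

payload = build_keyvault_scope_payload(
    "kv-secrets",
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
    "https://<vault>.vault.azure.net/",
)
# Inside a notebook, a secret in that scope would then be read with:
# dbutils.secrets.get(scope="kv-secrets", key="my-password")
```

Once the scope exists, the secret values themselves live only in Key Vault; notebooks reference them by scope and key, never in clear text.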

Read More »