Find out how you can connect to an Azure SQL Database or Azure SQL Data Warehouse using a service principal.
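As a minimal sketch of the idea: once a service principal has acquired an access token for Azure SQL (e.g. with the MSAL library's `ConfidentialClientApplication` against the `https://database.windows.net/.default` scope, not shown here), the token must be packed into the binary structure the Microsoft ODBC driver expects before it is handed to `pyodbc`. The server and database names below are placeholders.

```python
import struct

# Connection attribute defined by the Microsoft ODBC Driver for SQL Server
# for supplying an Azure AD access token.
SQL_COPT_SS_ACCESS_TOKEN = 1256

def pack_access_token(token: str) -> bytes:
    """Encode a bearer token the way the ODBC driver expects:
    a 4-byte little-endian length prefix followed by the token in UTF-16-LE."""
    raw = token.encode("utf-16-le")
    return struct.pack("<i", len(raw)) + raw

# Usage on a cluster with pyodbc and the ODBC Driver 17 for SQL Server installed
# (token acquisition via MSAL is assumed and elided):
# conn = pyodbc.connect(
#     "Driver={ODBC Driver 17 for SQL Server};"
#     "Server=myserver.database.windows.net;Database=mydb",
#     attrs_before={SQL_COPT_SS_ACCESS_TOKEN: pack_access_token(token)},
# )
```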
What happens if you have multiple environments? Or what happens if you need to make sure your notebook passes unit testing before making its way to production?
In this post I will cover how you can execute a Databricks notebook and push changes to production upon successful execution and approval through a stage pre-deployment approval gate.
So you’ve created notebooks in your Databricks workspace, collaborated with your peers, and now you’re ready to operationalize your work. This is a simple process if you only need to copy to another folder within the same workspace. But what if you need to separate your DEV and PROD environments?
Things get a little more complicated when it comes to automating code deployment; hence the need for CI/CD.
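To give a flavor of what the deployment step in such a pipeline can look like: a minimal sketch of pushing a notebook's source into a PROD workspace through the Databricks Workspace API (`POST /api/2.0/workspace/import`). The target path is a placeholder, and the actual HTTP call (with a bearer token) is elided.

```python
import base64
import json

def build_import_request(source: str, prod_path: str) -> dict:
    """Build the JSON body for /api/2.0/workspace/import:
    the notebook source is shipped base64-encoded."""
    return {
        "path": prod_path,          # target path in the PROD workspace
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,          # replace the previous deployed version
        "content": base64.b64encode(source.encode("utf-8")).decode("ascii"),
    }

body = build_import_request("print('hello from prod')", "/Prod/etl/notebook")

# A pipeline agent would then POST json.dumps(body) with an
# "Authorization: Bearer <token>" header to
# https://<region>.azuredatabricks.net/api/2.0/workspace/import
```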
Have you ever wanted to process new files added to your Azure Blob storage account in near real time? Have you tried Azure Event Hubs, only to find your files are too large for that to be a practical solution?
Let me introduce you to the ABS-AQS file source…
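A minimal sketch, assuming a Databricks cluster where the `abs-aqs` Structured Streaming source is available: the source drains blob-created events from an Azure Storage Queue (AQS) and reads only the new files, instead of repeatedly listing the container. The queue name and connection string below are placeholders.

```python
# Hypothetical names; requires a Databricks cluster with the abs-aqs source.
aqs_options = {
    "fileFormat": "json",                # format of the files landing in blob storage
    "queueName": "blob-created-events",  # hypothetical AQS queue fed by Event Grid
    "connectionString": "<storage-account-connection-string>",  # deliberately elided
}

# On the cluster (spark and input_schema are provided by the notebook context):
# stream = (spark.readStream
#           .format("abs-aqs")
#           .options(**aqs_options)
#           .schema(input_schema)
#           .load())
```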
When it comes to securing passwords, sharing secrets between users of the same notebooks, or simply keeping them out of clear text, changes in the integration between Databricks and Azure have made things even easier… I’m referring to Databricks secrets backed by Azure Key Vault.