This post is the beginning of a series of articles about building analytical capabilities in Azure using Data Lake, Databricks, and Power BI. On the surface, these technologies seem like they were specifically designed to complement each other, as they provide the set of foundational capabilities necessary to develop scalable and cost-effective business intelligence
Recently, I had a chance to work with Azure Analysis Services (AS) sourcing data from Azure SQL Data Warehouse (DW) external tables. Optimizing the processing of Azure Analysis Services partitions that sit on top of Azure DW external tables is a bit different from working with regular (physical) data tables, and I will discuss the
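To give a flavor of the partition processing involved, here is a minimal sketch of refreshing a single tabular partition from PowerShell by sending a TMSL refresh command through Invoke-ASCmd (SqlServer module). The server URL and the database, table, and partition names are placeholders for illustration, not taken from the post itself:

Import-Module SqlServer

# TMSL "refresh" command targeting a single partition; type "full" reloads
# the partition's data from the source (here, the DW external table).
$tmsl = @'
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyTabularModel",
        "table": "FactSales",
        "partition": "FactSales_2018"
      }
    ]
  }
}
'@

# Send the command to the Azure Analysis Services instance.
Invoke-ASCmd -Server "asazure://<region>.asazure.windows.net/<servername>" -Query $tmsl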
I have run into this a few times now, and every time it took me a while to figure out what was going on, so I figured that if I wrote about it on my blog, maybe I would not forget about it next time. Microsoft has a great article here that details how to set up Azure
In an earlier post, we talked about a self-service process to hydrate the Data Lake Store. We also mentioned the need to use PowerShell to load data files larger than 2 GB. Here is the PowerShell script. Provide your credentials to log in to Azure:

$MyAzureName = "<YourAzureUsername>";
$MyAzurePassword = ConvertTo-SecureString '<YourAzurePassword>' -AsPlainText -Force;
$AzureRMCredential = New-Object System.Management.Automation.PSCredential($MyAzureName, $MyAzurePassword);
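A plausible continuation of that script, sketched here under the assumption that the AzureRM modules are in use (the account name and file paths are placeholders, not from the original post), logs in with the credential and then streams the large file into the store:

# Log in non-interactively with the credential built above.
Login-AzureRmAccount -Credential $AzureRMCredential

# Import-AzureRmDataLakeStoreItem (AzureRM.DataLakeStore module) uploads in
# chunks, so it is not constrained by the 2 GB limit of the portal upload.
Import-AzureRmDataLakeStoreItem -AccountName "myadlsaccount" `
    -Path "C:\data\large_file.csv" `
    -Destination "/raw/large_file.csv"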
Do you have to be a developer in order to implement a solution that ties together Power BI and Azure Data Lake? I argue that you don’t. However, there are several things you need to be familiar with before you get going, so I decided to cover them in this article.