Last updated: 20/01/2025
Introduction
This case study explores the process of migrating a standard Power BI reporting solution to Microsoft Fabric. It describes the scope, structure and benefits of such a move, as well as the difficulties encountered along the way.
Scope of project
The project required Data Active to migrate existing Power BI financial reports to Microsoft Fabric.
The Power BI reports used an ODBC connection to a SQL Server instance to extract data, with SQL queries at the extract level performing initial filtering and data manipulation.
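To illustrate the extract-level filtering described above, here is a minimal sketch. It uses Python's built-in sqlite3 as a stand-in for the ODBC connection to SQL Server, and the table and column names are hypothetical, not taken from the original reports:

```python
import sqlite3

# Stand-in for the ODBC connection; in the real solution this would be
# a SQL Server connection string. All names below are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE GLTransactions (Account TEXT, Period TEXT, Amount REAL)")
conn.executemany(
    "INSERT INTO GLTransactions VALUES (?, ?, ?)",
    [("4000", "2024-01", 100.0), ("4000", "2023-12", 50.0), ("9999", "2024-01", 25.0)],
)

# Filtering and shaping happen inside the query itself, so only the
# rows the report actually needs ever leave the server.
query = """
    SELECT Account, Period, SUM(Amount) AS Total
    FROM GLTransactions
    WHERE Period >= '2024-01'
    GROUP BY Account, Period
"""
rows = conn.execute(query).fetchall()
```

Pushing the filter into the query keeps bandwidth down, which is the same property the later Fabric solution preserves at the bronze and silver layers.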
The data sources would need to be recreated in Fabric in a way that allowed future reports to be built on them. In other words, the data sources needed to be future-proof.
The Power BI apps contained financial reports: Profit and Loss reports at various levels, a Balance Sheet, Cost of Sales reports, and Aged Debtors and Aged Creditors reports. Each of these reports needed to be recreated exactly as it was.
Structure of the project
The Fabric solution would be built following the medallion architecture.
The bronze level consisted of a Dataflow Gen2 that extracted the data from the SQL Server instance via the same ODBC connection. In this step, however, all the tables were extracted, giving the project scope to expand based on client requirements.
This data is then saved to SQL tables in the Lakehouse.
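The bronze step can be sketched as follows. This is a minimal illustration, again using sqlite3 to stand in for both the ODBC source and the Lakehouse; the table names are hypothetical:

```python
import sqlite3

# Stand-ins for the ODBC source and the Lakehouse staging store.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE Customers (Id INTEGER, Name TEXT)")
source.execute("CREATE TABLE Invoices (Id INTEGER, Amount REAL)")
source.execute("INSERT INTO Customers VALUES (1, 'Acme')")
source.execute("INSERT INTO Invoices VALUES (1, 99.5)")

lakehouse = sqlite3.connect(":memory:")

# Extract *all* tables, not only those the current report needs, so
# future reports can build on the same bronze layer.
tables = [r[0] for r in source.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for name in tables:
    cursor = source.execute(f"SELECT * FROM {name}")
    columns = [d[0] for d in cursor.description]
    lakehouse.execute(f"CREATE TABLE {name} ({', '.join(columns)})")
    placeholders = ", ".join("?" for _ in columns)
    lakehouse.executemany(
        f"INSERT INTO {name} VALUES ({placeholders})", cursor.fetchall())
```

The key design point is that bronze is a faithful, unfiltered copy: all shaping is deferred to the silver layer.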
The silver level also makes use of a Dataflow Gen2, this time applying report-specific transformations to the data, which is now pulled from the Lakehouse.
Once these transformations are applied, the resulting tables are saved to production SQL tables in the Lakehouse.
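A silver-layer transform of the kind described above might look like this sketch, where a raw bronze table is reshaped into a report-ready production table. The table names, column names and category mapping are all hypothetical:

```python
import sqlite3

db = sqlite3.connect(":memory:")

# Hypothetical bronze table as landed by the extract step.
db.execute("CREATE TABLE bronze_GL (Account TEXT, Amount REAL)")
db.executemany("INSERT INTO bronze_GL VALUES (?, ?)",
               [("4000", 100.0), ("4100", 40.0), ("5000", -30.0)])

# Illustrative report-specific rule: accounts starting with '4' are
# Revenue, everything else is Cost. The real mapping would follow the
# client's chart of accounts.
db.execute("""
    CREATE TABLE silver_ProfitAndLoss AS
    SELECT CASE WHEN Account LIKE '4%' THEN 'Revenue' ELSE 'Cost' END AS Category,
           SUM(Amount) AS Total
    FROM bronze_GL
    GROUP BY Category
""")
result = db.execute(
    "SELECT Category, Total FROM silver_ProfitAndLoss ORDER BY Category").fetchall()
```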
Lastly, the gold level is a semantic model. The relationships between tables were recreated as they were in the initial Power BI app, and the measures were then rebuilt on top of them.
This final semantic model was then pulled into Power BI and is the base on which the new financial report was built.
The final product
The final product was a lightweight data warehouse and reporting solution that refreshed quickly and reliably.
Though the client-facing application remained exactly the same, the backend was a major improvement over simply using Power BI to do ETL.
Benefits of the project
The initial setup of the data warehouse was fairly straightforward. Many of the Data Active team members are familiar with the Power Query interface in Dataflow Gen2, and with its close-to-no-code environment, creating the extract and transform flows was easy.
The semantic model was also very familiar, allowing the team to quickly recreate the Power BI model and measures.
Upon completion, the team noticed the following:
- The solution processed data 25% faster than the original Power BI ETL.
- The apps were faster for end users, with near-instant reloads thanks to the semantic model.
- Consolidated datasets, which meant:
  - Fewer calls to the SQL server, and therefore less bandwidth used.
  - Quicker turnaround on new development.
  - A more efficient working environment.
- The development server could be downscaled, since the cloud now handles the data processing.
- Fewer moving parts than a general-purpose data warehouse platform.
Difficulties experienced
Many Power BI developers make use of empty tables to group measures. This is a simple process: a developer goes to “Enter Data”, gives the table a name such as “Balance Sheet Calculations”, and stores all future balance sheet measures in it.
This is less intuitive in Fabric. A workaround is to create a blank table in SQL, give it a name like “Balance Sheet Calculations”, and pull it into the semantic model. Doing this for every new grouping would be tedious, so the team created a single blank table and grouped the measures using display folders.
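The blank-table workaround amounts to creating an empty SQL table whose only job is to act as a home for measures. A minimal sketch, again using sqlite3 as a stand-in (the placeholder column is a dummy that would be hidden in the model):

```python
import sqlite3

db = sqlite3.connect(":memory:")

# A blank table created purely to hold measures in the semantic model.
# The name mirrors the example in the text; the column is a dummy.
db.execute('CREATE TABLE "Balance Sheet Calculations" (Placeholder INTEGER)')

# No rows are ever inserted. Once the table is pulled into the semantic
# model, the dummy column is hidden and measures are organised under
# display folders rather than separate per-grouping tables.
count = db.execute(
    'SELECT COUNT(*) FROM "Balance Sheet Calculations"').fetchone()[0]
```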
This change in measure location and naming convention meant that the measures in the app broke, and every measure had to be re-added to its visuals. Thankfully, the app in this case study was small, but this could be a big ask for apps with thousands of visuals and measures. So, tread carefully!
Another difficulty was changing the structure of any table: the whole table had to be deleted and recreated with the new structure, after which every dependent flow and semantic model needed to be updated.
Conclusion
In conclusion, Microsoft Fabric is an easy-to-use, powerful, purpose-built data warehouse and ETL tool. Its integrations with many connectors, data sources, and storage options make data manipulation and storage a breeze. Power Query, with its familiar, close-to-no-code environment, makes creating data tables a quick process. Thanks to semantic models, a blank Power BI report can be connected quickly and effortlessly to a fully functioning, powerful data source with prebuilt relationships and measures, cutting down on development time and boosting reliability. Microsoft have really hit the ball out of the park with this one!