
Why Sunsetting Default Semantic Models is Fabric's Best Decision Yet

  • Jihwan Kim
  • 5 days ago
  • 3 min read

In this post, I’d like to share what I have learned from a recent Microsoft Fabric announcement that, in my opinion, is one of the most important steps forward for the platform’s maturity. Microsoft is officially sunsetting the automatic creation of default semantic models for Lakehouses and Warehouses.

For many in the community, this has been a hot request since Fabric was first released. While the auto-generated models were convenient for quick demos, they created significant friction in governed, enterprise environments. For those of us who live and breathe data modeling and lifecycle management, this change isn't a loss of a feature; it's a massive and welcome win for the platform.




From Magic to Management


The core problem with default semantic models was that they were too automatic. When you created a Lakehouse or a Warehouse, Fabric would instantly create a Power BI semantic model with the same name. On the surface, this seemed helpful—it lowered the barrier to getting data into a report.

In practice, however, it created ambiguity and clutter. Which was the true semantic layer? The one that was auto-generated, or the one a developer would later build with proper DAX measures, hierarchies, and security? It blurred the lines between data engineering and semantic modeling, two disciplines that should be distinct and deliberate.

This new direction moves Fabric away from "magic" and towards intentional, manageable, and governable BI assets. It’s a strategic shift that places control back into the hands of developers and architects, forcing us to be explicit about our design choices.



The Two-Stage Farewell


Microsoft is rolling out this change in two stages, with both planned for release by the end of December 2025.

  • Stage 1: Disabling Auto-Creation: Soon, any newly created Warehouse, Lakehouse, SQL Database, or Mirrored Database will no longer auto-generate a default semantic model. Existing artifacts will not be affected at this stage.

  • Stage 2: Migrating Existing Models: In the second stage, all existing default semantic models will be decoupled from their parent items. They will become regular, standalone semantic models that must be managed by users just like any other.

This phased approach gives us time to adapt, but the direction is clear: the era of the implicit, auto-created model is over.
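
Before Stage 2 lands, it may be worth taking stock of which default models exist in a workspace so they don't become unowned orphans once they are decoupled. Below is a minimal inventory sketch for a Fabric notebook. It assumes the semantic-link (sempy) library that ships with Fabric and its fabric.list_datasets() helper; the parent item names and the exact DataFrame column names are assumptions and may differ in your environment.

import sempy.fabric as fabric

# Default semantic models share the name of their parent Lakehouse/Warehouse,
# so a simple name match is a quick (if imperfect) way to flag candidates
# before Stage 2 turns them into standalone, user-managed models.
parent_item_names = {"SalesLakehouse", "FinanceWarehouse"}  # hypothetical item names

datasets = fabric.list_datasets()  # one row per semantic model in the current workspace
candidates = datasets[datasets["Dataset Name"].isin(parent_item_names)]

print(candidates[["Dataset Name", "Dataset ID"]])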



The New Workflow is Intentional by Design


So, what does this mean for my day-to-day work? It means I get to adopt a cleaner, more robust, and more professional workflow. Here is how I see the ideal process evolving:

  1. Data Engineering is for Data Engineers: The Lakehouse and Warehouse can now focus on their primary purpose: ingesting, transforming, and preparing clean, reliable, and performant Delta tables. This is the domain of the data engineer, using Spark or SQL.

  2. Semantic Modeling is a Deliberate Act: Once the data is prepared, a BI developer or data modeler performs a separate, conscious action: creating a new Power BI semantic model (a sketch of this step follows this list). This will almost always be a Direct Lake model to get the best performance and data freshness. This new model is a distinct asset, not an automatic side effect.

  3. Clear Ownership and Lifecycle: This explicit creation process establishes clear ownership. The semantic model is no longer a child of the Lakehouse; it's a first-class citizen in the workspace with its own lifecycle. This makes it a perfect candidate for proper source control and robust CI/CD processes.

  4. Unambiguous Governance: From a governance perspective, this is a dream. There is one, and only one, intentionally created semantic model for a given business area. It can be endorsed, certified, and managed without the confusion of a default model lingering in the workspace.
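
To make step 2 concrete, here is a minimal sketch of what creating the model as a deliberate act can look like from a Fabric notebook. It assumes the open-source semantic-link-labs (sempy_labs) library and its directlake.generate_direct_lake_semantic_model helper; the helper's exact parameters, as well as the workspace, lakehouse, and table names, are assumptions and may differ by version. The same step can of course be done through the Fabric portal or Tabular Editor instead.

from sempy_labs import directlake

workspace = "Sales Analytics"        # hypothetical workspace name
lakehouse = "SalesLakehouse"         # hypothetical Lakehouse holding the clean Delta tables
model_name = "Sales Semantic Model"  # the explicitly created, owned BI asset

# Create a Direct Lake semantic model over selected Delta tables only,
# as an intentional act by the BI developer rather than a side effect of the Lakehouse.
directlake.generate_direct_lake_semantic_model(
    dataset=model_name,
    lakehouse_tables=["fact_sales", "dim_date", "dim_customer"],
    lakehouse=lakehouse,
    workspace=workspace,
    overwrite=False,
)

From there, DAX measures, hierarchies, and row-level security are layered onto this one model, and because it is a standalone item it can be committed to source control and promoted through CI/CD just like any other workspace artifact.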

This change isn't about adding a few extra clicks to our workflow. It's a fundamental shift that elevates Microsoft Fabric from a collection of powerful tools into a serious, governable, enterprise-grade analytics platform. It forces us to be better architects, and that is always a good thing.


I hope this helps you have fun building more robust and manageable solutions in Fabric.

How is your team planning to adapt to this change?
