Understanding Composite Semantic Models with Direct Lake and Import Tables
- Jihwan Kim
In this post, I'd like to share how I started learning about Composite Semantic Models in Power BI and Fabric, especially now that Direct Lake mode can coexist with Import tables. When this feature first appeared in preview, I was curious whether it could simplify data architecture and improve performance for mixed workloads. After some testing, I've learned both the strengths and the limitations of this hybrid approach.
Why This Feature Matters
For years, I had to choose between Import mode and DirectQuery for semantic models. Import was fast but required scheduled refreshes; DirectQuery was real-time but slower. Then came Direct Lake, which unlocked near real-time querying from OneLake without sacrificing performance — a huge step forward.
Until recently, though, I couldn’t mix Direct Lake with Import tables in the same model. I had to choose one or the other.
Now, I can combine Direct Lake, Import, and DirectQuery tables within a single semantic model — creating a Composite Model that flexibly blends performance, freshness, and connectivity.
My Experiment: A Hybrid Sales Model
To explore this, I built a small test model that blends different types of tables:
| Table | Source | Mode | Purpose |
| --- | --- | --- | --- |
| FactInternetSales | OneLake (Delta Parquet) | Direct Lake | Core transactional table |
| DimDate, DimProduct, DimSalesTerritory, DimCustomer | SQL Database | Import | Lookup for attributes and segmentation |
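To make the split concrete, here is a minimal Python sketch of the same layout. The table and type names are mine, purely for illustration; this is not a Power BI API, just a data structure mirroring the table above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ModelTable:
    """One table in the composite model, tagged with its storage mode."""
    name: str
    source: str
    mode: str  # "DirectLake", "Import", or "DirectQuery"


TABLES = [
    ModelTable("FactInternetSales", "OneLake (Delta Parquet)", "DirectLake"),
    ModelTable("DimDate", "SQL Database", "Import"),
    ModelTable("DimProduct", "SQL Database", "Import"),
    ModelTable("DimSalesTerritory", "SQL Database", "Import"),
    ModelTable("DimCustomer", "SQL Database", "Import"),
]

# Group tables by storage mode to see the composite split at a glance.
by_mode: dict[str, list[str]] = {}
for t in TABLES:
    by_mode.setdefault(t.mode, []).append(t.name)

print(by_mode["DirectLake"])  # ['FactInternetSales']
print(by_mode["Import"])      # the four dimension tables
```

One fact table on Direct Lake, four dimensions on Import: that single split drives everything discussed below about refresh behavior and cost.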
Why I Chose This Mix
- FactInternetSales needs up-to-date data; Direct Lake delivers it at near-Import speed.
- Dimension tables change less often; Import mode avoids unnecessary compute on every query.
The model refreshes quickly: only the Import tables participate in scheduled refresh cycles, while the Direct Lake table stays synced with OneLake.

What I Observed
1. Better Performance from Direct Lake
Direct Lake really feels like Import: query response times were almost identical for large aggregations, even though the data wasn't preloaded into the model file. Direct Lake reads column segments from the Delta Parquet files in OneLake on demand, so it avoids the duplicate data copy that Import requires.
2. Seamless Integration
Relationships between Import and Direct Lake tables worked normally. Filtering from a Customer dimension in Import mode properly cross-filtered the FactInternetSales table in Direct Lake mode.
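The cross-filtering behavior can be pictured with a toy example. This is not how the engine is implemented internally, just a sketch of filter propagation: a slicer selection on the (Import) customer dimension restricts the keys, and those keys filter the (Direct Lake) fact rows. All table and column names here are illustrative.

```python
# Toy dimension (Import side) and fact (Direct Lake side) tables.
customers = [
    {"CustomerKey": 1, "Country": "Korea"},
    {"CustomerKey": 2, "Country": "Japan"},
]
sales = [
    {"SalesKey": 10, "CustomerKey": 1, "Amount": 100},
    {"SalesKey": 11, "CustomerKey": 2, "Amount": 250},
    {"SalesKey": 12, "CustomerKey": 1, "Amount": 50},
]

# A slicer selection on the dimension side...
selected_keys = {c["CustomerKey"] for c in customers if c["Country"] == "Korea"}

# ...cross-filters the fact rows through the relationship key,
# regardless of each table's storage mode.
filtered_sales = [s for s in sales if s["CustomerKey"] in selected_keys]
total = sum(s["Amount"] for s in filtered_sales)
print(total)  # 150
```

The point of the sketch: the relationship contract (key in, filtered rows out) is the same whether the fact table is Import or Direct Lake, which is exactly what I observed in the report.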
3. Refresh Behavior
When I refreshed the model, only Import tables participated. Direct Lake tables stayed live. That means refresh times are now predictable and lightweight.
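The refresh scope is easy to reason about if you think of it as a filter over the model's tables. A minimal sketch, assuming the same hypothetical table list as before:

```python
def tables_in_scheduled_refresh(tables: list[dict]) -> list[dict]:
    """Return only the tables that a scheduled refresh reprocesses.

    In a composite model, Import tables are reloaded on refresh, while
    Direct Lake tables stay synced with OneLake and need no data copy.
    """
    return [t for t in tables if t["mode"] == "Import"]


model = [
    {"name": "FactInternetSales", "mode": "DirectLake"},
    {"name": "DimDate", "mode": "Import"},
    {"name": "DimProduct", "mode": "Import"},
    {"name": "DimCustomer", "mode": "Import"},
]

refreshed = tables_in_scheduled_refresh(model)
print([t["name"] for t in refreshed])  # ['DimDate', 'DimProduct', 'DimCustomer']
```

Because the large fact table drops out of the refresh scope entirely, refresh duration is driven only by the small dimension tables, which is why the timings became predictable.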
Governance and Deployment Considerations
From a governance perspective, this hybrid capability creates new flexibility — but also new complexity.
Version Control and Deployment:
- Direct Lake tables bind to Lakehouse Delta tables, so those dependencies must be stable before deployment.
- Import tables still rely on refresh pipelines; managing both requires solid orchestration.
Currently, updates or fixes to the composite semantic model, as well as modifications to reports connected to it, can be made directly through web editing in the Power BI service. When the workspace is linked to source control, this approach keeps synchronization with the Azure DevOps or GitHub main branch smooth.
Alternatively, changes to reports connected to the Direct Lake + Import composite semantic model can be made locally using Visual Studio Code. In this case, the connection string (or dataset binding path) to the composite semantic model must be obtained and configured manually to establish the proper linkage between the local environment and the remote model.
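For the local setup, the remote model is typically reached through the workspace's XMLA endpoint. The general shape of that connection string is shown below; the workspace and semantic model names are placeholders you would substitute with your own.

```
Data Source = powerbi://api.powerbi.com/v1.0/myorg/<Workspace Name>;
Initial Catalog = <Composite Semantic Model Name>;
```

The exact binding mechanism depends on the tool (for example, how a PBIP report definition references its dataset), so treat this as the general pattern rather than a copy-paste value.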

Practical Scenarios for Enterprises
From testing in a production-like setup, I see three patterns where this hybrid model shines:
1. Operational + analytical mix: use Direct Lake for transactional facts that update continuously, and Import for slower-changing reference tables.
2. Migration path to Fabric: organizations moving from classic Import models can migrate large tables to OneLake incrementally while keeping smaller ones in Import.
3. Scalable, cost-efficient refresh: hybrid models reduce refresh compute cost, since only Import tables consume capacity during scheduled refreshes.
This new capability — mixing Direct Lake and Import tables — is a quiet but major shift for Power BI and Fabric. It lets me build models that balance freshness, performance, and governance without splitting logic across multiple datasets.
For BI architects, this means simpler pipelines and a more unified data foundation.
I hope this helps you have fun learning and experimenting with Composite Semantic Models in Microsoft Fabric, and maybe inspires you to try blending Direct Lake with Import in your next project.


