Direct Lake – arguably one of the most tangible elements of newness that Fabric presents to the BI professional. The new Power BI storage mode in Fabric that lets you query data directly from OneLake with the performance of Import mode but without the scheduled refresh overhead? I’m sold. While Direct Lake has been around since 2023, it’s really taken until this year for Fabric uptake in the market to reach the scale at which Direct Lake is becoming a commonplace approach in the Power BI sphere.
My energy in mastering this new storage mode initially went into understanding how to build for it, what you can’t do in it (still cursing the absence of field parameters and praying for this to make its way onto the roadmap…), and how to monitor its performance and manage fallback (more on that from Chris Webb here).
This week, I had a query from a client that left me scratching my head. A seemingly fundamental question – how do we give users access to Direct Lake reports?
Easy, I say – pick your access strategy: workspace role, direct report access, app audience, or via link (shudder). App audience is my preferred method (more granular than workspace roles, less granular than direct access). So we add the Entra group in question to the app audience and ta-dah… oh wait, there’s an error. No problem, I say – let’s whack the Entra group on the underlying model as well. Read access only – no need to add Build, Reshare etc. Oh, there’s still an error?
At this moment I thought hang on… Direct Lake is reading straight from the Lakehouse… and I had a sinking feeling I was about to have to consider granting access to the Lakehouse and recreating the same RLS I’d just set up in Power BI there as well. Fear not – this is where shareable cloud connections come in.
When you first publish a Direct Lake Semantic Model, the default cloud connection is used, leveraging Single Sign-On (SSO). This means the report consumer’s own identity is what Power BI uses to query the SQL analytics endpoint.
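To make that concrete, here’s a minimal sketch of querying the SQL analytics endpoint interactively and checking which identity it sees. The server and database names are placeholders, and I’m assuming pyodbc with ODBC Driver 18 installed – adapt to your own environment.

```python
# Minimal sketch: confirm which identity the SQL analytics endpoint sees under SSO.
# <workspace-endpoint> and <lakehouse-name> are placeholders, not real values.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<workspace-endpoint>.datawarehouse.fabric.microsoft.com;"
    "DATABASE=<lakehouse-name>;"
    "Authentication=ActiveDirectoryInteractive;"  # prompts for *your* Entra sign-in
    "Encrypt=yes;"
)
# Under SSO-style access, this returns the signed-in user, not a shared identity.
print(conn.execute("SELECT SUSER_SNAME() AS executing_identity;").fetchval())
```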

The alternative? Create a fixed identity that supports centralised access management, allowing the Semantic Model to query the Lakehouse through a Service Principal. This way, IT teams manage the credentials, create shareable cloud connections, and share those connections with the intended creators. Report consumers never need to be provisioned on the Lakehouse, so my “you don’t need to use workspace roles for consumers and end up with 1000s of workspaces” argument remains intact. More on that here: How One Task Flow Made Me Question Fabric Complexity – Livadata
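For contrast, here’s a rough sketch of what that fixed identity looks like in practice – the Service Principal (not any report consumer) is the identity hitting the SQL analytics endpoint. All of the IDs, the secret and the endpoint below are placeholders, and it assumes the Service Principal has already been granted access to the Lakehouse’s SQL analytics endpoint.

```python
# Minimal sketch: the fixed-identity pattern - a Service Principal queries the
# SQL analytics endpoint, so individual consumers need no Lakehouse permissions.
# All IDs, secrets and endpoint names below are placeholders.
import struct
import pyodbc
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-app-id>",
    client_secret="<service-principal-secret>",  # in reality, pull this from Key Vault (see later)
)

# Acquire a token for SQL and pack it the way the ODBC driver expects.
token = credential.get_token("https://database.windows.net/.default").token
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)
SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC connection attribute for access tokens

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<workspace-endpoint>.datawarehouse.fabric.microsoft.com;"
    "DATABASE=<lakehouse-name>;Encrypt=yes;",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
# Now returns the Service Principal's identity, regardless of who views the report.
print(conn.execute("SELECT SUSER_SNAME() AS executing_identity;").fetchval())
```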
So – what do you need to know?
There are two key Microsoft Learn articles to keep handy to make this work:
Manage Direct Lake semantic models – Microsoft Fabric | Microsoft Learn
The one element of this to consider is that, for seasoned engineers, hardcoding Service Principal keys into a platform is a no-no.

Why? Because keys are effectively passwords. If you paste one directly into the model UI, you’re creating a static credential that:
- Can’t be rotated automatically (so if IT regenerates the secret, your model breaks until you manually update it).
- Increases exposure risk: anyone with edit rights to the model could accidentally overwrite or mishandle it.
- Falls foul of compliance best practice, under which secrets should be centrally managed, logged, and tightly controlled.
Blogs like this one from Sue Bayes show how to securely use Azure Key Vault secrets in Microsoft Fabric notebooks, and that’s the gold standard: a central vault where secrets can be rotated, audited, and retrieved securely. But within the Semantic Model UI itself, we don’t yet have that luxury. Right now you’re forced to key in the Service Principal secret directly.
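For the parts of the workflow that do live in notebooks or scripts (automation around the connection, secret rotation, and so on), here’s a minimal sketch of pulling the Service Principal secret from Key Vault instead of hardcoding it. The vault URL and secret name are placeholders, and it assumes the executing identity has permission to read secrets from that vault.

```python
# Minimal sketch: retrieve the Service Principal secret from Azure Key Vault
# instead of hardcoding it. Vault URL and secret name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault>.vault.azure.net"

client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
sp_secret = client.get_secret("<fabric-sp-client-secret>").value

# Inside a Fabric notebook, the built-in notebookutils.credentials.getSecret
# utility offers a one-liner alternative for the same job.
```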
Yes, once you save it, the value won’t be visible to you or anyone else, which is something, but it’s still a maintenance headache and not aligned with most enterprise security standards. For ease of operations (and everyone’s peace of mind), I think most engineers and IT teams would like to see better Key Vault integration here so we can tighten up that weak spot.
Thanks to one of Dufrain’s newer Microsoft superstars, Bharani, for being my researcher on this topic and cracking the setup in record time for one of our current projects! 🚀
If you’ve had to manage access for Direct Lake reports, I’d love to hear how you approached it. Drop a comment or reach out on LinkedIn to have a chat!