Why Innovation is the New Norm
I’m starting this blog today from the comfort of an Avanti train from Manchester to London, heading home after a truly mind-melting day at a “Future of BI” workshop with some of my Dufrain team. We’re seeing rapid innovation across Fabric and AI, and even in the last few weeks, it feels like the landscape for BI professionals has completely shifted. My team at Dufrain have been hard at work researching some of these new developments, and a group of us got together in Manchester today to learn, discuss and plan what the future holds for our work in the BI space.
Today we focused on Automations through Fabric Data Agents and Semantic Link, Agentic Development for Power BI, and MCP servers. I’ll link below to some of the great content I’ve seen recently on these developments, but I’d like to summarise my key thoughts on what innovations like these mean for Power BI Gurus.
“Computer says no” is no longer an option

I’ve considered myself somewhat of an expert in Power BI for a number of years now. If someone asks me “Can Power BI do this?”, I feel confident in being able to say “Yes”, “No” or “Sort of” very quickly. Watching demos from my team on the variety of use cases for working with Power BI through an MCP Server, or creating a Fabric Data Agent, I’m left feeling that gone are the days when we can quickly say “Power BI can’t do that”. With the capabilities these new innovations put in front of us, the boundaries of what’s possible have moved. I don’t feel there is any request that we can rule out on tech limitation alone. Projects have budgets, and scope has to be defined of course – this is not new. What’s new is the limitations we impose on ourselves as people building things in Power BI Desktop. We are now in a world where if someone says “Hey, can you make this pig fly?” the response is no longer “Of course pigs can’t fly” but instead “Hey team… let’s see if we can make a pig fly!”
Practical Power BI is not Parkour

It’s exciting to see how people are pushing boundaries with Fabric and AI in BI. Every new demo could spark inspiration for someone’s next big solution. That said, I’ve always been an advocate for keeping Power BI practical. Over-engineered, “Parkour-style” builds that turn reports into black boxes help no one. If you need a tool to generate 100 measures in an hour… maybe ask why you need 100 measures in the first place. “Sales by Product” doesn’t need to be a measure… Sales is the measure… Product is the dimension. A well-designed model does the heavy lifting.
For many organisations, there’s growing pressure to “do AI” just to avoid missing the boat. Largely – I support this. With the right intentions and guardrails, even less mature teams can start small and explore how AI might help speed up delivery or improve the user experience. But the real question should always be: Will this make things meaningfully better?
Take something simple like documenting a Power BI model… I already know 3–4 ways to automate it. The point isn’t to chase the flashiest method — it’s to find what’s cost-effective, easy to implement and maintain, and actually fits your needs. Practical BI teams should avoid doing clever things just for the sake of being clever. The key is keeping your work current while staying relevant to what your users need. Not creating the most innovative way to get from A to B for the sake of it!
I’m still yelling about Power BI Governance (and I won’t stop)

If you know me at all… you would have known this was coming! The best time to start your investment in governing your Power BI & Fabric platform? Yesterday. The second best time? Today! I delivered my session on the importance of striking a balance between innovation and control a number of times across 2023–2024, but I feel a 2025 revival may be due (watch this space!). The more we innovate, the more there is to control – but we can’t let over-governance hamper innovation. As the demand to alter tenant settings, enable permissions and switch on new “things” increases, platform owners and admins may feel overwhelmed with how to govern solutions and patterns that are so new. It’s important to create sandbox environments for your innovative teams and let them play – and this is where monitoring is key. If you only have one Fabric capacity for your production assets, be prepared for some throttling if a rogue notebook eats up your CUs. People are going to learn and experiment – plan for it. Key steps to consider for platform owners include:
- Install the Fabric Capacity Metrics App and monitor it regularly
- Consider your wider Fabric Platform Monitoring tooling – I recommend FUAM as standard now over our previous bespoke REST API solution.
- Audit tenant settings regularly, and avoid blanket “on/off” approaches where you could instead enable settings for specific subsets of the organisation. This allows controlled release of new features to trusted individuals.
- Define and communicate workspace and access strategies. Ensure people don’t spin up 50 new workspaces for exciting new innovations. Utilise workspace folders and sandbox environments to keep experiments under control.
- Ensure that experiments being adopted are properly productionised. If your experiment works (as we hope it does) people will start using it. That’s when you need to move it from the sandbox into a well-governed, production-ready environment.
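As a taste of what the workspace-strategy point above can look like in practice, here’s a minimal Python sketch of an admin audit script. It pulls workspace metadata from the Power BI admin REST API and flags active workspaces that don’t follow an agreed naming convention – the `PRD-`/`SBX-` prefixes, the function names and the convention itself are all illustrative assumptions, not a prescribed standard, and you’d need a real Entra ID access token with admin API permissions to run the fetch.

```python
import json
import urllib.request

# Power BI admin API endpoint for listing all workspaces in the tenant.
# Note: the admin "groups" endpoint requires the $top parameter.
ADMIN_GROUPS_URL = "https://api.powerbi.com/v1.0/myorg/admin/groups"


def fetch_workspaces(token, top=500):
    """Pull workspace metadata via the Power BI admin REST API.

    `token` is an Entra ID bearer token with tenant admin API access
    (acquiring it is out of scope for this sketch).
    """
    req = urllib.request.Request(
        f"{ADMIN_GROUPS_URL}?$top={top}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])


def flag_rogue_workspaces(workspaces, allowed_prefixes=("PRD-", "SBX-")):
    """Return names of active workspaces outside the naming convention.

    The prefixes are a hypothetical convention: PRD- for production,
    SBX- for sandboxes where experiments are allowed to live.
    """
    return [
        ws["name"]
        for ws in workspaces
        if ws.get("state") == "Active"
        and not ws["name"].startswith(allowed_prefixes)
    ]
```

Something this simple, scheduled to run weekly, is often enough to spot the “50 new workspaces for exciting new innovations” problem before it becomes a clean-up job.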
And I promise, it’s a lot more fun to make pigs fly when you’re not constantly cleaning up after them.
We need to untie the knot

There are so many methods of approaching the same use case right now, it’s getting murky to navigate. Separating where Power BI Copilot, other Copilots, Fabric Data Agents, Semantic Link, Azure AI Foundry, and MCP Servers all start and end in my brain has left me feeling a little like George R. R. Martin when he was stuck trying to untangle the Meereenese Knot in A Dance with Dragons.
It took GRRM over a decade to figure out how Tyrion, Quentyn, Victarion, and Aegon were all supposed to arrive in Meereen without breaking the story. Likewise, we’re all staring at this expanding Microsoft toolkit, trying to piece together how these innovations connect without overcomplicating our architecture. I don’t think it’ll take us 11 years to get there… but as innovation accelerates, it might take a little patience before a clear, strategic vision emerges. My aim for now is to understand what each element brings in terms of unique features, cost and performance, ease to run, ease to implement, and wider architectural fit including security considerations.
So Megan… what are all these innovations you’re talking about?
If phrases like Agentic Development, Fabric Data Agents and MCP have been bringing you out in a cold sweat reading this blog… I’ll point you towards some of the great content I’ve read in the last few weeks that has been super valuable to my team during our research.
Agentic Development – Rui Romano
Rui shows how we can provide a set of user stories, data source schemas, development standards and resources, and ask GitHub Copilot to design our data model, build it in a pbip file including our measures, and then optimise the result using the Best Practice Analyzer.
A mind-melting demo that shows the rate at which AI innovation is accelerating how we approach BI. While the schema and user stories here are simple – and we’re still exploring how this method fares with more complex scenarios – this is something I can see rapidly evolving in the coming years.
Documenting Power BI Models via Fabric Data Agent – Marthe Moengen / Chris Webb
I’m pretty sure Power BI developers enjoy writing documentation for their models about as much as anyone enjoys having to read it… a Word doc full of screenshots of models and snips of DAX isn’t much fun for anyone, but is often a vital part of completing a build and handing it over to BAU teams to run. Chris and Marthe present methods here for using Fabric Data Agents to automate and enrich the documentation process for writers and readers alike.
Semantic Link – Kurt Buhler
Another tool that’s popped up in our research and has a lot of potential is Semantic Link — a way to access and interact with Power BI models programmatically using Python or REST APIs. For Power BI developers, this means we can start to automate all the stuff we usually do manually in Desktop — like pulling out metadata, validating measures, or even analysing dependencies between calculations.
I’ve always been a bit sceptical of overcomplicating Power BI with extra layers of complexity, but this functionality allows us to get creative within the Fabric environment itself, making it in my view much more accessible and governable than third party tooling.
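To make that concrete, here’s a minimal sketch of the documentation use case mentioned above. It assumes you’ve already pulled measure metadata with Semantic Link inside a Fabric notebook (the commented `sempy` call and its column names are assumptions based on its measure-listing output, not verified here); the formatting half is plain Python.

```python
# Inside a Fabric notebook, Semantic Link could supply the metadata, e.g.:
#   import sempy.fabric as fabric
#   measures = fabric.list_measures("Sales Model").to_dict("records")
# Below: a hypothetical sample of that metadata, then the formatting step.


def measures_to_markdown(measures):
    """Render measure metadata as a markdown documentation table."""
    lines = [
        "| Measure | Table | Expression |",
        "| --- | --- | --- |",
    ]
    for m in measures:
        lines.append(
            f"| {m['Measure Name']} | {m['Table Name']} "
            f"| `{m['Measure Expression']}` |"
        )
    return "\n".join(lines)


# Illustrative sample record, shaped like a row of measure metadata.
sample = [
    {
        "Measure Name": "Total Sales",
        "Table Name": "Sales",
        "Measure Expression": "SUM(Sales[Amount])",
    },
]

print(measures_to_markdown(sample))
```

The point isn’t this exact table – it’s that once the model’s metadata is accessible programmatically, “documentation” becomes a formatting exercise you can run on every deployment instead of a manual chore.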
MCP Servers – Kurt Buhler
And this is where I start getting way out of my depth… I’ve been seeing a lot of buzz about “Chat with your data” style features across Copilot, Fabric Data Agents and more. This I can grasp and get excited about – safe within the confines of the Fabric UI, right? But MCP Servers allow us to take our Power BI work into interfaces like Claude Desktop. I’ve made my point already about Power BI Parkour – it’s cool, I’m glad someone has done it… but will I be recommending it to my Fabric clients? I don’t see why I would when there are other options. If you can gain X% more control over features but that requires Y% more overhead, skills and management… why bother? It does seem though that there are a number of smaller automations that MCP can enable – including greater control over access and refreshes – so I don’t think we should be counting this tech out yet.
In short… build cool things, and govern them properly. The future of BI might involve flying pigs – let’s just make sure they’re house-trained before we let them roam free.