Microsoft Fabric Tour 2025 in Redmond, WA at the Microsoft Reactor Building

It was a beautiful Saturday, May 31st, in Redmond, Washington, when over 600 people descended on the Microsoft Reactor Building to hear great speakers talking about Microsoft Fabric.

The speakers and attendees were great. The speakers made time for questions, and the questions were asked and answered with heartfelt passion and conviction. The organizers were also treated to some extra hustle when the total number of attendees turned out to be more than twice what was expected a week before kickoff.

As a person who doesn't always spend most of their time in the world of "Data", I learned a lot more about one of the most exciting platforms that is not discussed enough. Everyone can rave about AI, but data is the blood that drives the beating heart, and data is found everywhere.

This is why it was great to learn about the expansion of data mirroring and the reach of Microsoft OneLake, covering sources that many clients already own, such as Microsoft Dataverse, Snowflake, Azure SQL and SQL Server on-premises (accessed using the on-premises data gateway).

Other key technology advances were also shared.

Security continues to be a focus, and enhanced security features have been added, including richer administration. This includes defining access permissions once and having them consistently applied in numerous places.

There was also a significant amount of time spent on near real-time processing and query performance. It was great to hear people taking performance seriously. In the past, performance often became the forgotten stepchild that only got considered when there was a problem, or got overlooked because of advancements in hardware, but bottlenecks are not just hardware!

Microsoft Fabric also includes Microsoft Azure Synapse Analytics. When I first started learning about and watching the use of Azure Synapse Analytics, I watched the deep, deep technical gurus climb a steep learning curve, so it was nice to hear that this learning curve is also being addressed. As some would say, the path to simplicity must first deep dive into the complexity. "Simple" is by far not the correct term, but as technology matures, grace and usefulness do get impacted by complexity, so the lean towards use and usability continues to be seen from the Microsoft development teams.

As an application-focused resource, I continue to watch new features related to business rules, so the announcement that Microsoft Fabric now supports user data functions was an interesting one. To get "User Data Functions" summarized, I asked for a little help (thank you, ChatGPT), and here is what I found out: "User Data Functions are reusable, parameterized functions that you define using T-SQL or Spark (PySpark/Scala) and then call from various components in Fabric. They encapsulate business rules, data transformations, or logic that you want to apply consistently across your data environment." These have some very interesting potential.

And down the rabbit hole I went when I asked how UDFs relate to, or could be used with, the Power Platform:

"User Data Functions (UDFs) are a Microsoft Fabric feature and not native to the Power Platform, but you can bridge them into Power Platform solutions—particularly when working with Power BI, Power Apps, or Power Automate solutions that consume or manipulate data from Microsoft Fabric, OneLake, or Synapse Lakehouse."

How UDFs are used, by platform:

  • Power BI: Directly in semantic models, Lakehouse queries, or reports via SQL endpoints.

  • Power Apps: Indirectly, by connecting to Fabric-enriched tables or APIs.

  • Power Automate: Triggers Fabric pipelines or notebooks where UDFs apply logic, with results flowing back.

  • Dataverse: Can act as a bridge, syncing UDF-enriched data between Fabric and Power Platform.
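To make the idea concrete, here is a minimal sketch of the pattern in plain Python. The function and rule here are made up for illustration, and this is not the actual Fabric SDK; the point is the shape: one reusable, parameterized business rule, defined once and shared by every caller.

```python
# Illustrative sketch only: a reusable, parameterized business rule,
# the same shape a Fabric User Data Function encapsulates.
# (The real Fabric runtime would host and invoke this for you.)

def classify_customer(total_spend: float, months_active: int) -> str:
    """One business rule, defined once, applied consistently everywhere."""
    if months_active <= 0:
        return "inactive"
    monthly_avg = total_spend / months_active
    if monthly_avg >= 500:
        return "premium"
    if monthly_avg >= 100:
        return "standard"
    return "starter"

# Any caller (a report, a pipeline, an app) reuses the same logic:
print(classify_customer(6000.0, 12))
print(classify_customer(50.0, 10))
```

Because every consumer calls the same function instead of re-implementing the rule, a change to the rule lands everywhere at once, which is exactly the consistency pitch.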

One of the coolest benefits for companies working with the Microsoft Stack is that it is always pushing the bleeding edge and always offering mind-bending options. And notice that I have not even mentioned Microsoft Fabric and AI Foundry!

"Fabric's data agents can now integrate with Azure AI Foundry via the Azure AI Agent Service"


MCP Servers

MCP Servers are Model Context Protocol servers. Here are just a few of the many to choose from:

Awesome MCP Servers

Best MCP Servers and Clients List

"The Model Context Protocol (MCP) is an open standard that acts like a universal connector for AI applications. MCP allows AI to access external tools and data sources, such as cloud storage or code repositories, in a standardized way. This makes it easier to build AI-powered tools that can interact with the real world, from automating tasks to fetching live data." Retrieved from 10 Best MCP Servers You Need To Know About – Bind AI, May 23, 2025.
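Under the hood, MCP is built on JSON-RPC 2.0, so a client asking a server to run a tool is just a small structured message. Here is a rough sketch of what a `tools/call` request looks like; the tool name and arguments are hypothetical, just to show the message shape.

```python
import json

# MCP rides on JSON-RPC 2.0: a client invokes a server-side tool
# with a "tools/call" request. The tool name and arguments below
# are made up for illustration.
def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": tool_name,
            "arguments": arguments,
        },
    }
    return json.dumps(request)

msg = build_tool_call(1, "fetch_weather", {"city": "Redmond"})
print(msg)
```

The "universal connector" claim comes from exactly this: any client that can emit this envelope can talk to any server that understands it, regardless of what the tool does behind the scenes.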

BUT TELL ME MORE 

Okay, so I really appreciate articles that "Explain It Like I'm Five", so take a deep dive into this one: Explain It Like I’m Five: What the Heck Is an MCP Server? - Phase 2

Now, if we put our Microsoft hat on and look at the growing Microsoft Stack:

Introducing the Azure MCP Server - Azure SDK Blog

Securing the Model Context Protocol: Building a safer agentic future on Windows | Windows Experience Blog

Copilot Studio! Extend your agent with Model Context Protocol (preview) - Microsoft Copilot Studio | Microsoft Learn

And why are these changing everything?

What are MCP Servers And Why It Changes Everything 


MS Build 2025: The new interaction platform

Last year, when I was listening to MS Build 2024, I must admit a lot of it was so technical and so focused on foundation that I wasn't very excited. It was a bit like swimming through mud. Technology changes daily, and although this continued with big changes, it was still the cement of the foundation.

This year, listening to MS Build 2025, reality struck a huge "Wow" for me: there really are huge changes that solve real needs and frustrations.

Across the global customer engagement landscape, AI "prompts" are redefining how professionals can engage with customer data. As someone who has implemented Microsoft Dynamics 365 CE across diverse markets, I’ve seen firsthand how organizations are balancing innovation with responsibility. There are data initiatives, privacy alignment, access balancing, and overall conversations about how to use these new tools while people are already using them.

The opportunity is substantial: "according to IDC, organizations that integrate AI into their customer engagement workflows are improving customer lifetime value by 25% on average, while reducing time spent on administrative tasks by up to 40%. These gains aren’t just about efficiency—they’re about accessibility."

For UI/UX designers and solution architects of "heartbeat applications", such as business applications that manage a huge asset pool of critical information, the focus is shifting to building interactive and collaborative access. This evolution doesn’t mean abandoning control—it means creating interfaces that surface the right data at the right time to the right person, without overwhelming the person. It also means building transparency into AI suggestions, so users understand why a recommendation was made. The way we interact with systems is changing and it is changing fast.

As a technologist bridging the gap between business think and technology think, this is not an unusual ask. It is extremely exciting to be able to learn and interact with the software we have long championed, AND training and learning curves are reduced.

One of my favorite tips for anyone diving into learning AI is to use AI to learn AI. Ask your favorite AI tool how to write the best prompt, or ask AI to build out a training plan for you using your favorite way to learn. The options are endless.


Across the Data Sources: Azure, Dataverse and others

In today's world of fast-paced change and increasing reliance on data, it is important to focus on the other side of complexity: make sure your data is aligned and accessible using the least complex and yet highest quality solution.

Seamless integrations are technically put into place with considerations such as:

1) Is the data secure?

2) Are there data quality and review processes in place so that the data is clean and accurate?

3) Can you access the data at the desired speeds?

The Microsoft Stack of technologies continues to deepen and is sometimes hard to explain. Take, for instance, the "common data universe", Dataverse: more than a Microsoft SQL Server database, with layers and layers of role-based entitlement, external security tying deeply into the entire infrastructure using Microsoft Entra ID (formerly Azure Active Directory), and so many choices. Choices such as being able to split the database using Business Units, which can significantly increase performance.
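That role-based entitlement shows up even in a simple read: a Dataverse Web API call runs as an authenticated Entra user and only returns rows that user can see. As a rough sketch, here is what building such an OData query looks like; the org URL and token are placeholders, not a real environment.

```python
import urllib.parse
import urllib.request

# Sketch: building a Dataverse Web API (OData) query.
# The org URL and bearer token are placeholders; a real call is
# authorized through Microsoft Entra ID and the results are filtered
# by the caller's role-based entitlements and Business Unit.
ORG_URL = "https://yourorg.api.crm.dynamics.com"  # placeholder

def build_accounts_query(select_cols, top=10):
    query = urllib.parse.urlencode({
        "$select": ",".join(select_cols),  # only fetch needed columns
        "$top": str(top),
    })
    url = f"{ORG_URL}/api/data/v9.2/accounts?{query}"
    return urllib.request.Request(url, headers={
        "Authorization": "Bearer <token>",  # placeholder token
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    })

req = build_accounts_query(["name", "revenue"], top=5)
print(req.full_url)
```

The useful point is that nothing in the query itself enforces security: the same request returns different rows for different users, because entitlement is applied by the platform, not the caller.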

If you take Dataverse and then consider that Azure extends the options, you end up with so many technically beautiful possibilities:

  • Azure Data Factory: Ingest and transform data from Microsoft and non-Microsoft sources. Azure Data Factory is a managed cloud service built for complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects. See Introduction to Azure Data Factory - Azure Data Factory | Microsoft Learn to read more.

  • Azure Synapse Analytics: A bit of a learning curve for the technical folks (I have seen that firsthand more than once), but Azure Synapse can be used to create a customer data lake and analyze large datasets. "Azure Synapse brings together the best of SQL technologies used in enterprise data warehousing, Spark technologies used for big data, Data Explorer for log and time series analytics, Pipelines for data integration and ETL/ELT, and deep integration." Retrieved from What is Azure Synapse Analytics? - Azure Synapse Analytics | Microsoft Learn. What more do you need?


ACCESS TO APIs? Of course!

  • Azure API Management: Securely expose data services. "Azure API Management is made up of an API gateway, a management plane, and a developer portal, with features designed for different audiences in the API ecosystem. These components are Azure-hosted and fully managed by default." Azure API Management - Overview and key concepts | Microsoft Learn. Not always required, but really good to know about, as the technology options are either layered or run in parallel.
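From a consumer's point of view, the gateway piece is simple: calls go to the gateway URL with a subscription key in the `Ocp-Apim-Subscription-Key` header, and the gateway handles routing, throttling, and policy. A small sketch, with the gateway URL, path, and key all placeholders:

```python
import urllib.request

# Sketch: calling a data service exposed through an Azure API
# Management gateway. The gateway URL, API path, and key are
# placeholders; "Ocp-Apim-Subscription-Key" is the header APIM
# uses for subscription-key authorization.
GATEWAY = "https://contoso.azure-api.net"  # placeholder gateway URL

def gateway_request(path: str, subscription_key: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"{GATEWAY}{path}",
        headers={"Ocp-Apim-Subscription-Key": subscription_key},
    )

req = gateway_request("/customers/v1/orders", "<your-key>")
print(req.full_url)
```

Note that the caller never sees the backend service's real address; that indirection is what lets you layer APIM in front of existing services without changing them.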



From Words to Pictures

Taking a dive into Sora (https://sora.chatgpt.com/) today to see what it can do when it comes to translating words into pictures or videos. It did an interesting job with a vague topic. I asked it to create a picture of change management, and what continues to be critically important is the way you put your prompts together. A slight change in the prompt created two very different experiences. The first is the futuristic version and the second is the simple ask.

We can live in the PRESENT, but the Past or the Future is also sometimes a choice that people don't realize they are making.

Futuristic Change Management


The Hidden ROI of Dynamics 365 Copilot: What You Might Be Missing

Individuals will always have a preferred method for working.

Some individuals will push their comfort zones and always dive into the latest and greatest tooling and options, while others take the stance of: if it is working for me, I am not changing unless I have a huge reason to change or I don't have a choice.

We are living in a world of more and more choices.

One of the hidden ROI benefits of Copilot is that consistent Copilots roll across all of the Microsoft products and the stack of available technologies. This not only includes products within Modern Workplace (Excel, Word, Teams, etc.) or Business Applications (Dataverse, Dynamics 365 Sales, Dynamics 365 Customer Service, Power Platform), but also a wave across all of the various "more developer or IT centric" tooling (Visual Studio, GitHub, Azure DevOps, etc.).

  • Copilot has the ability to silently break down silos while also maintaining individual choice.

Data here, data there, data everywhere. Everyone is creating data on a daily basis, good data, bad data, helpful data, temporary data.

  • In the world of noise, Copilot can be the saw that cuts through to the answers.

Speaking of data, the culture of a company contains a huge amount of proprietary data. Tribal Knowledge is a company asset that is rarely harvested, and yet this knowledge can be the differentiator for a company: the difference between a "WOW" experience, where issues are quickly discovered and successfully resolved, and an "average" experience, where people do their jobs and go home. Using AI to tap into Tribal Knowledge is an untapped goldmine.

  • Tribal Knowledge, the gold that is often untapped.

There are most likely areas of the business or processes that happen like clockwork every month, every quarter, or even daily. Employees go on autopilot and get the work done that always needs to get done. They don't have to think too hard about this if they have been at the company for a long time. Most probably don't like this type of work: it is mundane and repetitive, and yet it drives other dependencies, so it has to get done.

  • Agents might make your Employees happier. 

Speaking of employees: you might be surprised to learn that they might be using AI already. ChatGPT is right at their fingertips and, although helpful, might not be where you want proprietary customer or company-specific questions being asked. Many a risk has been taken in pure innocence: "I just needed a quick image", "the question was pretty generic", "I didn't use any names".

  • Providing people with the right AI options, ones that give them even more, mitigates the use of tools that might introduce company risk.