Your Technology Stack Wasn’t Built for AI

AI tools don’t fail in isolation. They rely on the same files, permissions, and systems your organization already uses. If those aren’t aligned, adoption struggles.


Organizations are starting AI adoption by repeating the same patterns we’ve seen in every wave of tool and process change. They buy the license, announce the initiative, and ask the team to use it.

The real test is what a user does first. They’ll start with something simple, like trying to find a file or a basic piece of information:

“Where is the latest company handbook?”
“What is our PTO policy?”

The response is incomplete or outdated. Trust in the tool is lost, and you’re now fighting an uphill battle. You already have users who step into a new tool or process and tell you why it won’t work. It all feels like friction rather than help.

In AI enablement work, this is the part that AI strategies tend to miss.

The tool itself didn’t fail. The environment it operates in did.

AI tools work with existing organizational knowledge. They operate on the same files, permissions, systems, and communication patterns your team already depends on. If your organization doesn’t have a reliable shared context, the AI tools won’t either.

Previous software lived in its own world with its own workflows. AI operates across systems. That changes the problem organizations are actually trying to solve.

Most tools only needed to work inside one context. If your files, permissions, chats, documents, and wikis were never designed to function as a shared information environment, the AI will surface that immediately. Minor inefficiencies that used to be contained to one area are now visible to every employee at once.

We’ve been through this many times in the tech industry, but the pace of change is now exponential and touches every role in an organization. Users had more time to get on board with the internet or smartphones. Organizations now have a responsibility to give their teams access to these tools and ensure they can use them to their full capability.

One of the quickest wins I see available to organizations is understanding how their technology stack plays into empowering their end users with AI tools.

Microsoft 365 Copilot is a clear illustration of the shared-context problem. You can pay the $30/user/month to turn on the license for a user, but in many organizations I imagine that has only led to immediate disappointment. When leaders say AI “isn’t working,” the first place the issue appears is in the organization’s source of truth.

Is your SharePoint house in order?

No one’s really is. All those years of unorganized files, duplication, and outdated information are coming back to haunt us. Think of users’ files synced from their machines to OneDrive, confusing Copilot about which copy is the source of truth. All those FileName-V1-2.docx files containing similar information make your environment harder for AI tools to work in. Multiply this across your users and the problem becomes exponential. AI tools are getting better at discerning versions, but if your users can’t tell you which file is the real one, you can’t expect the AI to know either. You’ll just end up with one more opinion on it.
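As a rough illustration of the housekeeping involved, a short script can surface filenames that differ only by version suffixes. This is a generic sketch over a local folder; the suffix patterns are assumptions to adapt, and auditing SharePoint itself would go through Microsoft’s own admin tooling rather than a filesystem walk.

```python
import re
from collections import defaultdict
from pathlib import Path

# Strip common version/copy suffixes: "-V1-2", "_v3", " (1)", "-final", "-draft".
# These patterns are illustrative; extend them to match your own naming habits.
VERSION_SUFFIX = re.compile(
    r"([ _-]+(v\d+([.-]\d+)*|final|copy|draft|\(\d+\)))+$",
    re.IGNORECASE,
)

def base_name(path: Path) -> str:
    """Normalize a filename so versioned copies collapse to one key."""
    return VERSION_SUFFIX.sub("", path.stem.lower())

def find_version_sprawl(root: str) -> dict:
    """Group files under `root` that look like versions of the same document."""
    groups = defaultdict(list)
    for p in Path(root).rglob("*"):
        if p.is_file():
            groups[(base_name(p), p.suffix.lower())].append(p)
    # Only groups with more than one file are candidate duplicates.
    return {key: paths for key, paths in groups.items() if len(paths) > 1}
```

Running this over a synced OneDrive folder gives a worklist of candidate duplicates for a human to resolve; the point is that a person, not the AI, decides which copy is the source of truth.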

Your data security, governance, and authentication policies now have to be taken more seriously too.

If you haven’t built and don’t understand a solid foundation, you’re handing users a very smart assistant that can gather information quickly. Think about salaries, PII, HR records, and financial information. It could be anything. How sure are you that users can’t access executive-level files, meetings, and conversations?
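To make that question concrete, here is a minimal sketch of one kind of spot check on a plain file share: flag files whose names suggest sensitive content and that any local user can read. The keyword list and the POSIX permission check are simplifying assumptions; SharePoint, OneDrive, and Drive have much richer sharing models that need the vendor’s admin APIs to audit properly.

```python
import stat
from pathlib import Path

# Keywords that suggest sensitive content; tune this list for your organization.
SENSITIVE = ("salary", "compensation", "payroll", "termination", "board")

def flag_broadly_readable(root: str) -> list:
    """Flag sensitive-looking files that any user on this host can read.

    Checks POSIX mode bits only; cloud sharing permissions are a separate,
    deeper audit through the vendor's admin tooling.
    """
    flagged = []
    for p in Path(root).rglob("*"):
        if not p.is_file():
            continue
        if not any(k in p.name.lower() for k in SENSITIVE):
            continue
        if p.stat().st_mode & stat.S_IROTH:  # world-readable bit is set
            flagged.append(p)
    return flagged
```

A check like this won’t prove your environment is safe, but it turns “how sure are you?” from a rhetorical question into a reviewable list.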

You also need to be prepared to pay. If your AI enablement strategy involves free versions of tools, or users each running their own paid instances, your information may be used to train models, and that’s a lawsuit waiting to happen.

If your policy right now is “we don’t have a policy,” or it is “don’t use AI tools,” I’ve got bad news for you: your users are already using AI on their personal devices, off your network.

A lack of policy, an AI strategy, or sanctioned tools doesn’t prevent AI use; it moves that use outside your organization’s visibility and governance.

Let’s be honest, too: this isn’t entirely Microsoft’s fault, though it would be easy to treat it as a Microsoft configuration issue. They give admins a lot of control, but using it requires knowledge, configuration, and, most importantly, time. Everything is moving so quickly that it’s hard for strained IT teams to maintain the knowledge required to secure the environment.

These aren’t just Microsoft 365 Copilot problems. AI tools connect across systems. You’ll have the same problem with Google Drive and Gemini. ChatGPT and Claude both have connectors into Google, Microsoft, and many other productivity and operational tools. Information that was once department- or role-specific is now available through a single interface. I’m seeing new apps and connectors added daily. Is your Atlassian environment locked down for Jira and Confluence spaces? How about the Figma designs for that top-secret project that hasn’t been announced yet? Your sales opportunities?

There are different ways to configure these tools, too. Some let users connect their own accounts, scoping permissions to that user. Others are configured at the organization level using service accounts. Someone moving quickly with elevated permissions might authenticate their own account and expose administrator-level data.

When organizations create a shared context, the benefits appear quickly.

The tools are only getting smarter and more deeply embedded, and there are so many possibilities to remove friction for users. Continuing the Microsoft Copilot example, users can quickly gather recaps of their day and action items by prompting across email, Teams messages, meetings, SharePoint, and everywhere else they work. For asynchronous and remote teams, this means fewer disruptions and more empowerment, without all the DMs or “quick 5-minute calls” that turn into an hour.

Microsoft Copilot has features to share branding and document templates across the organization so everyone starts from the same context, along with shared access to recordings, recaps, and action items produced from calls. The feature package is robust, but keeping up with even one of these tools is a full-time job.

The specific AI tool an organization uses will continue to change. In my experience, different tools work better for different roles. The pace of change also means you may be using something different in three months than you are today.

As an example from the last year: Google’s Gemini was lagging behind ChatGPT and Claude, and it really felt like Google was done for. Then it made some huge leaps in 2025 and is gaining speed as a general-use LLM.

Another example is Claude Code. In December 2025, Claude Opus 4.5 was released and turned the coding world on its head. ChatGPT’s Codex has lost some ground, but, don’t worry, the latest Codex 5.3 release is now causing a stir.

Right now, there is no “best tool forever.”

The competition is fierce, and organizations that can move with the changes will thrive. Today it’s Claude; tomorrow it’s Gemini. That needs to be OK. If it takes a year to get Claude Code approved in your organization, you’ll be disappointed a year from now when everyone is using something else and you’re still behind.

New features are released every week and, without help, it’s impossible to keep up with the pace of change.

This work doesn’t fit cleanly into existing IT and operations roles.

If you neglected the housekeeping and investment these tools required before AI, it’s now preventing you from seeing the full ROI of your investments in them. It’s time to invest so you can move forward.

The pain points are exponential for any organization right now because AI isn’t just a tool for one segment of your organization; it’s an empowerment tool for every person in every role.

A shared context of files and communication has value up and down your organization. If people can’t find the information, or, even worse, can’t trust it, you are only creating more confusion, meetings, and misalignment. The AI is then just creating more work instead of helping. Does that sound familiar?

These are not insurmountable problems, but the reality is that AI depends on how the organization actually operates. It requires the organization to invest time and build a real technical foundation across its operational workflows. It’s an exciting time to be working across operations, AI, and productivity tooling.

If this feels familiar, a short conversation is often the easiest place to start.
