AI Strategy: We’ve Done This Before
Higher education has navigated major disruptions before by learning collectively rather than individually. As AI reshapes the field, communities of practice offer the same proven path forward. This piece makes the case for that approach and points to a free resource to help institutions get started.
We've Done This Before
When online education began scaling in the early 2000s, there wasn't a shared definition of quality. Programs were launching quickly, new student populations were emerging, and expectations were shifting faster than the systems designed to support them. Across institutions, people were building in real time without a clear set of standards to anchor to. Most were asking some version of the same question: are we actually doing this well?
There wasn't a central authority that stepped in to solve it. What happened instead was more distributed and, in hindsight, more effective. Training programs brought practitioners together. Informal networks formed. Communities of practice emerged across institutions. People compared notes, shared what was working, admitted what wasn't, and pressure-tested ideas before they hardened into policy.
Over time, the field got sharper, not because one organization figured it out, but because many were learning alongside each other at the same time. That is how standards emerged, how instincts improved, and how online education matured into something far more credible than it was at the start.
A Pattern Worth Recognizing
We saw a version of this again during the pandemic. Nearly overnight, institutions were forced to shift delivery models, adopt new tools, and rethink how they supported students. Faculty, staff, and leaders were navigating unfamiliar territory with limited guidance, and once again the response leaned heavily on shared learning. Working groups formed, peer networks reconvened, and rapid knowledge-sharing became the primary way institutions stayed functional. It wasn't perfect, but it was effective. That moment accelerated adoption, normalized new models, and permanently reshaped expectations for flexibility and access across the sector.
The Cost of Moving Without Each Other
AI is creating a similar inflection point, though the pace feels different this time. Capabilities are advancing faster than most institutions are structured to absorb, and organizations are making consequential decisions without the benefit of long-term evidence. Underneath many of these efforts is the same familiar tension: are we doing this right?
The risk is that individuals and institutions try to answer that question in isolation.
When organizations move quickly without shared spaces to think, predictable patterns emerge. Efforts are duplicated. Decisions are made on incomplete information. Progress within one unit can look more solid than it actually is when no one is comparing notes across the institution or the system. We have seen how that plays out.
Communities of practice (CoPs) offer a more effective path forward, particularly in moments like this. At their best, they create space for practitioners to engage honestly with the work, including what is working, what is unclear, and what is still evolving. That kind of psychological safety directly improves the quality of decision-making. When people can surface challenges early, institutions can adjust earlier and avoid more costly missteps down the road.
CoPs also expand perspective in ways that are difficult to replicate internally. Some of the most useful insights come from adjacent contexts where the circumstances differ but the underlying problem is similar. A healthcare system thinking through outcomes measurement, a technologist navigating governance at scale, a higher education leader rethinking student engagement: these practitioners are often working through parallel questions. That is where thinking sharpens. That is where better decisions begin to take shape.
At T3, we see a consistent pattern. The institutions navigating this moment well are not necessarily the ones with the most polished AI strategies. They are the ones creating intentional space to learn while they build, comparing approaches, surfacing what they are actually seeing, and adjusting based on evidence from across contexts. They are not outsourcing their thinking, but they are not doing it alone either.
Most organizations say they value collaboration. Far fewer invest in the conditions required for it to function. If groups only share outcomes, they are reporting, not learning. If there is no space to test ideas or name uncertainty openly, progress slows even when activity remains high.
We did not navigate the early growth of online education, or the rapid shifts of the pandemic, by working in silos. We navigated those moments by building shared understanding in real time, and in doing so, changed how the field operates. There is no reason this moment should be any different for AI.
That is part of the thinking behind the AI Community of Practice Starter Kit. It is a practical resource designed to help institutions move from scattered experimentation to more coordinated, sustainable learning. It outlines how to define a community's purpose, structure participation, run early sessions, and maintain momentum over time. The goal is not to prescribe a single model, but to make it easier to get started and to connect individual experimentation to broader institutional strategy.
We made it free because this work needs to scale, and we are here to help. If you are trying to stand up a community of practice, or strengthen one that already exists, it is a useful place to begin. And if you want to go further, T3 is always interested in partnering with institutions that are building this capability in a more intentional way.
The institutions that navigate this moment best will not be the ones that move the fastest alone. They will be the ones that learn the fastest together.
Work with T3 Advisory
T3 works with colleges, universities, and state systems to build governance structures, faculty engagement strategies, and change-management capacity that make AI adoption sustainable and student-centered. Visit t3advisory.com or connect with us on LinkedIn.
Daniel Gannon is a Director at T3 Advisory, specializing in academic strategy and AI transformation consulting for institutions, systems, non-profits, and foundations.

