This Is Not an Adoption Technology: Reflections from the AASCU Symposium for Provosts
Part 1 of 2
A reflection from the AASCU Symposium for Provosts, March 2026
There is a word we use in higher education that has been quietly giving us permission to move slowly. That word is adoption.
Erin Mote, CEO of InnovateEDU, called it out directly at the AASCU Symposium for Provosts this week: "This is not an adoption technology. This [AI] is an arrival technology."
She is right. And it changes everything about what leaders are actually being asked to do.
I will dig into that distinction more fully in tomorrow's post. But first: what the room taught me.
Who Was in the Room
This week, I had the privilege of presenting at and participating in the AASCU Symposium for Provosts in Washington, D.C. AASCU, the American Association of State Colleges and Universities, convened two days of focused conversation among provosts and chief academic officers from regional public universities across the country, alongside a remarkable set of co-presenters, practitioners, and partners.
The design of the convening was intentional in a way that is genuinely rare: not a parade of vendor demos or a policy briefing dressed up as a symposium, but honest, peer-to-peer dialogue about one of the most disorienting moments any of us have worked through in higher education.
The room was full of people doing hard, consequential work at institutions that do not have the luxury of waiting to figure this out. Regional public universities enroll more than three quarters of the nearly two million Pell-eligible students at four-year institutions in this country. They tend to sit in economically distressed regions. Their students are often first-generation, often working, often one unexpected expense away from stopping out. What happens at these institutions with AI does not stay there. It shapes what the technology means for social mobility in America.
The Hardest Questions Were About People, Not Tools
When I presented findings from T3's national study on AI adoption in higher ed, the questions that came back were not about which tools to buy or which policies to write first. They were about people.
What do you do when your leadership has turned over three times in two years and a new president has handed you seven new initiatives, none of them AI?
How do you have an honest conversation with faculty who are not just skeptical but scared?
How do you lead within a system where your institution is being asked to move faster than the system office can issue guidance?
How do you open genuine dialogue with stakeholders who are not performing resistance but carrying real fear about what this moment means for their work, their value, and their careers?
These are not technology questions. They are change management questions. And they are the ones keeping academic leaders up at night.
One thread that surfaced repeatedly was the belonging dimension of AI anxiety on campus. There is emerging data suggesting that more students are being accused of AI-assisted cheating than are actually cheating. That is not primarily an integrity problem. It is a belonging problem. When fear drives policy and policy drives suspicion, students feel it, particularly students who were already navigating campus on thinner margins of trust and belonging. The leaders in that room felt the weight of that.
What I appreciated was that no one pretended otherwise. There was no performing of confidence in the room. There was honest acknowledgment that the emotional reality of this moment matters, that faculty and staff are not obstacles to AI strategy but are the strategy, and that they deserve leaders who will name the complexity of what they are being asked to absorb, not just cast a vision and expect everyone to follow.
The technology has arrived. The people have not all arrived with it yet. And rushing them is not the same as leading them.
The Students Already Know This
Student ambassadors from Coppin State University: Mattox Slaughter, Jamilah El-Amin, and Lisa Afolabi, with T3 Advisory’s Audrey Ellis.
One of the most energizing parts of the symposium was the presence of student AI ambassadors from Coppin State University, who were there not as audience members or symbolic gestures but as genuine participants and resources. They asked thoughtful questions of presenters and fielded questions from provosts. They demonstrated AI fluency. They were, frankly, a model of what "students as partners" actually looks like when an institution means it rather than just says it.
This was a through-line of the entire convening: we are not only there to support students' learning. We have a great deal to learn from them. Students are already living in this moment. They are using tools, navigating ambiguity, and developing practical fluency that many institutions have not yet built into any formal program. The question is whether we are creating the conditions for that knowledge to flow in both directions, or whether we are designing AI strategies for students without them at the table.
When Asked What They Needed, Provosts Asked for Small Things
One of the most revealing moments of the two days came not from a presentation but from a facilitated conversation. Representatives from some of the most well-resourced philanthropic organizations working in higher education were in the room: Gates, Lumina, Mellon Foundation, Axim Foundation, and more. They asked directly and genuinely: what would be most helpful? What do you need? What should we fund?
The requests that came back were thoughtful and real: workshops for humanities faculty, AI workflow tools to reduce administrative burden, platforms to share evidence from funded projects, resources to build faculty engagement.
Modest. Reasonable. Bounded.
These requests reflected genuine need and genuine care. And I want to sit with them for a moment, because I think they reveal something important.
Resource constraints came up repeatedly throughout the two days as one of the most significant barriers to AI progress at regional public universities. This is consistent with what we hear in T3's research: institutions are not struggling to find interest or energy. They are struggling to find the capacity, infrastructure, and sustained investment to move beyond pockets of experimentation toward something coherent and lasting.
And yet, when funders with significant resources asked what institutions needed, the room made incremental asks.
There is a real tension here. Right now, there are substantial pools of philanthropic and federal funding oriented toward AI in higher education. These dollars will not wait indefinitely, and many institutions and organizations will compete for them. Regional public universities, the institutions enrolling the majority of Pell-eligible students in this country, have every reason to be at the front of that line. Their students are precisely who these investments are meant to serve.
So the question I keep sitting with is this: if we know resource constraints are the barrier, and we know the money exists, what would it look like to ask for something commensurate with the actual challenge?
What if the ask was not a workshop series but a funded AI leadership role, or a team of them? What if it was not a platform to share evidence but a multi-institution research and learning infrastructure with real staffing behind it? What if it was not a pilot but a sustained, multi-year investment in building the governance, capacity, and coordination that our research shows is what actually moves institutions forward?
The arrival framing matters here too. Adoption-sized asks are appropriate for a technology you are still deciding whether to bring in. They are not appropriate for a technology that has already changed the conditions your institution operates in. If we have accepted that AI has arrived, our asks need to reflect that.
Tomorrow I will dig into what arrival actually means, why it is a fundamentally different frame than adoption, and what it demands of leaders who are ready to take it seriously.
A sincere thank you to AASCU for the invitation to present at and participate in this symposium. The intentionality of the design, the quality of the conversation, and the commitment to honest dialogue across institutional contexts made this one of the more meaningful convenings I have attended in some time. Terry and Lisa, thank you for giving us a platform and for creating a space where the hard questions were welcome. We do not take it lightly.
This blog originally appeared on T3 Advisory’s LinkedIn Page.