An AI University would be great
Especially since it is defined as "a great university"
About a month ago, a piece by Scott Latham entitled “Are you ready for the AI university?” was published in the Chronicle of Higher Education. It’s paywalled, but don’t worry, you don’t need to read it as I’ll summarise it here: imagine you work in a university (as faculty or staff) and that you can imagine a super-being who can do everything you dislike doing (no matter how small) and do it better and cheaper than you do. Imagine that being exists. Who will pay you to do that stuff? The answer: no one. Now call that imagined super-being AI, and you know what the article says.
This type of piece is a widespread trope these days. Take something you do, imagine what it would be like for it to be done really, really well. Claim AI can do that. And strike fear into the reader about that very same AI coming for their jobs.
However, that “thought experiment” (I guess that is what you would call it) with respect to Universities is premised on two things:
There exists (or will soon exist) an AI that could do all of this stuff; and
That Universities would then move to replace people with AIs.
I should add that the latter is a subset of the conjecture that Universities could manage to do anything at all, let alone that.
To be fair, the article does tackle both of these things. I’ll say it is a tad optimistic on whether Universities could adjust, given a massive amount of historical evidence to the contrary. But it is trying to reset people’s expectations of what AI might be able to do:
The big mistake faculty members make is underestimating the existential threat AI represents to their livelihoods. Professors need to dispense with the delusional belief that AI can’t do their job. Faculty members often claim that AI can’t do the advising, mentoring, and life coaching that humans offer, and that’s just not true. They incorrectly equate AI with a next-generation learning-management system, such as Blackboard or Canvas, or they point out AI’s current deficiencies. They’re living in a fantasy. AI is being used to design cars and discover drugs: Do professors really think it can’t narrate and flip through PowerPoints as well as a human instructor?
He goes on:
As with other IT applications on campus, such as data storage and enterprise resource planning, colleges will rent capacity and processing from third-party providers such as OpenAI or Salesforce. Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, help AI understand the structure of the discipline, and then develop lectures, videos, supporting documentation, and assessments.
Well I can’t argue with that; after all, I’m in the business now of making that happen.
Of course, even this is rather odd. There are already many people who can do these things better than the people who are actually holding these jobs. The idea that Universities are selecting on competency to do these things is pretty much laughable. Instead, they are selecting on other factors that, for want of a better term, enhance other aspects of University branding rather than core job competence. And those other things have an important quality that AI cannot replicate: the people doing them would happily do them for free. The only way they earn a living is (a) by being scarce relative to the value those things bring to Universities in terms of brand and (b) by being willing to do the other stuff, performatively, as part of the deal. In other words, faculty are fairly cheap labour when it comes down to it. An AI is going to cost something, and so it can’t compete with job satisfaction. This is precisely the reason why the online education revolution that was supposed to completely disrupt the University did no such thing.
This is important for working out what will happen here. Imagine that there are two things a faculty member does: research and teaching. Imagine that they will do the research for free, but it has value in ultimately attracting students, so long as they have some toe in teaching. The teaching is costly to them, but can be done by AI. Latham imagines that the University will pay the faculty less or not employ them at all. But, in reality, the combination of research plus a toe in teaching means that it won’t happen. Instead, the faculty will keep the toe, but the University will hand them an AI so that they don’t need to put more than that toe in. This is for a simple reason: if they don’t need the toe, then no one needs the University. So, the idea that Universities will exist and be run solely by AI doesn’t add up.
This is really an argument about the end of Universities
Of course, the Universities may not exist at all when AI comes in. This is not something that Latham thoroughly entertains. Instead, he writes:
Importantly, AI will not only replicate the current learning approach. It will draw from all course sections on campus and correlate course pedagogy with student performance. By extension, it will develop the “best practices” in a discipline as a means for better understanding educational approaches that yield better student outcomes such as retention, degree progress, and graduation. AI will undertake a process of continuous learning and improvement to ensure a personalized student experience. Two students might both be enrolled in Greek History 1010 in the same semester, but they won’t be taking the same class: It will be entirely tailored to their learning styles. Today, such AI capabilities are already a reality.
And when it comes to the administration of students, this is even stronger:
Picture this: Students will no longer sign up for courses; they will work with their AI agents to build personalized instruction. A student who requires a biology course as part of their major won’t take the standard three-credit course with a lecture and lab that meets for 14 weeks with the same professor. Instead, the student will ask their AI agent to construct a course that transcends the classroom, campus, and time. The AI agent would find expert scholars across the globe, line up real-time or recorded video lectures, and simultaneously incorporate material from YouTube, Google, and university libraries. If the AI agent can’t find lab space on campus, it will help find a lab with capacity halfway across the world and enable the student to participate using an augmented-reality headset. AI agents that have evolved with a student throughout college will be able to design assessments that reflect that student’s learning style, ensuring they have achieved fluency in the subject. …
If we look at day-to-day operations at the average college, AI will first disrupt the registrar. Classroom scheduling and capacity management will be entirely automated — well beyond the current system of Excel spreadsheets that most institutions still employ. Student records will be entirely managed by AI using blockchain, and they will be owned by the student and empower the student. Gone will be the days when universities held students hostage by not sharing their student records. Similarly, student credit transfer — long a bane of most institutions — will be entirely automated. AI will “read,” evaluate, and award the appropriate amount of student credit. AI will also streamline and automate student financial aid at the institutional level.
All of this assumes that a student is still attending a University. But why do that if an AI can teach you stuff really effectively and serve you up to employers in a neat package? Latham can’t imagine it:
Will traditional universities survive? Absolutely. Millions of students will continue to want an old-fashioned college experience complete with dorm rooms, a football stadium, and world-class dining. However, these experiences are not mutually exclusive: Even these tradition-bound institutions will employ AI. The market expectation will be that top-tier institutions will provide both an unparalleled student experience and AI-empowered education.
At least there is a place for the Chronicle of Higher Education in that world.
Is any of this bad?
The big takeaway from this piece has a strong smell of impending disaster.
None of this can happen, though, if professors and administrators continue to have their heads in the sand. For everyone who works in higher education, there is a great deal of pain and disruption to come. We can minimize the damage, though, by helping people understand how AI will transform higher education. Uncomfortable as they are, these are the conversations we need to start having if we want to be ready for what’s coming.
But we should step back for a moment and realise just how wrong this is. Suppose AIs can come along and do your job well. If you are committed to a system that does your job well, that should be something you celebrate. Now, I know, you may only be committed to that if you are the one doing it. To quote Gavin Belson from Silicon Valley:
I don't know about you people, but I don't wanna live in a world where someone else makes the world a better place better than we do.
But are you? If you asked anyone (if you can still find them) who had to wash clothes by hand whether they wanted the washing machine to be invented, I am pretty sure the answer would be a hard “yes” regardless of what that meant for their value in society.
And we at Universities are supposed to be committed to the value of the jobs we are doing. In fact, that is why we complain all the time about the stuff that is annoying and hard to do that distracts from concentrating on the “important stuff,” aka the stuff we would want to do anyway … for free.
In other words, if you are worried about the AI University and how well it could do the job, then you either have to admit that doing University stuff well is not what you are on about or that you want to be protected from competition from the good.
For my own part, I want AI to be used as much as possible to make whatever I do better, regardless of whether I do it or not. Practically, however, I don’t think AI is going to get there (sadly) quite as fast as people think, and even when it does, the Universities will adopt it at a rate that is nowhere near as fast as they should. Thus, I can happily cheer for it to happen more quickly from a nice, comfortable, technological high ground.