Monday, April 27, 2026

You Will Not Like It, But There Is a Solution for AI in the Classroom

The mood in faculty meetings has tipped into a kind of resigned helplessness. Detection tools fail. Honor codes wobble. Bans get circumvented by a sophomore on a phone. So we shrug and say nothing works. But that is not true. Something works. The reason you have not heard much about it is that it is expensive, slow, and demands the resources universities ration most carefully: collective faculty attention and sustained administrative support.

Here is the solution. Take every program, in every discipline, and pull it apart down to the learning outcomes. Audit each outcome against what AI now does competently. Strip out the procedural skills that machines have automated. Strengthen the advanced, conceptual outcomes that survive the audit. Add new outcomes that did not exist five years ago, the ones that deal with directing, verifying, and integrating AI work. Then rebuild the curriculum upward from the revised outcomes. New sequences. New assignments. New assessments. Not the old course with AI bolted on the side. A different course.

Most courses are not designed; they are inherited. A syllabus arrives in a shared drive, gets a date update, and circulates again. The learning outcomes, if anyone notices them, were written by a committee a decade ago and rounded off to fit the textbook. In many professional fields, those outcomes are not even local. They come from accreditation bodies and professional standards, which are themselves the product of expert consensus rather than evidence about how learning actually progresses. Calculus before statistics. Anatomy before physiology. Theory before practice. Ask anyone to defend a particular sequence on empirical grounds, and the answer reduces to tradition. This is not a failing; it is what disciplines do when they lack a better mechanism for coordinating themselves. But it means the curriculum we have inherited is far less solid than it appears.

To deconstruct a course means to set the inheritance aside. Strip the readings. Strip the assignments. Strip the schedule. What remains should be a small set of claims about what students should know and be able to do. That is the course. Everything else is delivery. If the outcomes are vague or quietly procedural, they have to be rewritten first. In regulated fields, this forces a conversation with the accrediting body about whether the standards still describe competent practice. That conversation is overdue in most fields.

Then apply the test. For each outcome, ask whether AI can already do it competently, partially, or not at all. Outcomes AI fully handles should not be assessed anymore. Format an APA citation. Calculate a t-test. Translate a paragraph. Summarize a chapter. These belong to a vanished regime. What remains are the outcomes that demand judgment under ambiguity, interpretation, defense of choices, ethical reasoning, complex multi-stage tasks. These need sharpening. Then add the outcomes that are new. Students must learn to decompose a task and decide which pieces belong to the human and which to the machine. They must learn to specify inputs precisely. They must learn to verify, triangulate sources, revise AI drafts toward a real audience, and leave a recognizable human value-added in the final artifact. These are not soft skills bolted on. They are content. They belong in the outcomes list, with rubrics and assessments to match.

Only after the outcomes are revised can the course be rebuilt. New assignment sequences follow from new outcomes, not the other way around. A research methods course that drops formatting and adds source verification looks different from week one. A composition course that drops grammar drills and adds revision of AI drafts has a different rhythm. One reformed course is hard. A rebuilt major requires every faculty member in the program to do the audit, then coordinate so prerequisites still mean something. A reformed discipline requires cross-institutional collaboration, because no single department has the standing to declare what counts as competence in chemistry or accounting.

The reasons we resist are not obscure. The work is enormous. The cost is real. The collaboration is uncomfortable, because it forces colleagues to argue about what the discipline is for, a question most programs have spent decades politely avoiding. There is no vendor selling this as a turnkey solution, because it cannot be one.

The solution exists. The hard part is deciding, together, that we are willing and able to do the work. And yes, governments should help. I know none of this is likely to happen soon, so it is going to get worse before it gets better. We just need a longer-term perspective.

