A mastery view of prerequisites

Imagine if EVERY student came to class having read the assigned readings, knowing the background material, and fully prepared to participate at the highest level.

“Wait a minute!” you say. “Isn’t that what prerequisites are for?”

Perhaps.

Prerequisites are often included in upper-level courses as a way of establishing prior knowledge. But when prerequisites are courses, this can create issues:

For one thing, we all know students who have taken a course yet are unprepared for subsequent upper-level material, and similarly students who have not taken a prerequisite course but would do just fine. Or, to take an even more extreme case: students who bomb the final exam in a prerequisite course. At present, their ONLY option is to retake the entire course, and there is no guarantee they will do any better. (In fact, the evidence points in the opposite direction.)

Wouldn’t it make more sense to test students on the foundational knowledge before class begins? Better yet, if they don’t know the material, shouldn’t they be required to study it until they can demonstrate mastery of it? Then they walk into class thoroughly prepared to move the discussion forward.

We already do something like this for math and other subjects via Advanced Placement tests, but shouldn’t it become de rigueur for all upper-level courses? Maybe this is a solution to rampant course retakes. You only get one shot at a course, but you get many shots to demonstrate mastery and move on in the program. This way, course grades are accurate reflections of effort in the class, while mastery is demonstrated separately.