Monday, January 9, 2023

Learning outcomes: Going along with useless, baseless bullshit, wasting time and money, with no end in sight

By Gayle Greene 

The Terrible Tedium of “Learning Outcomes” 
(Chronicle of Higher Ed, 1-4-23)

Accreditors’ box-checking and baroque language have taken over the university. 

Every six years, the accountability police swoop down on my campus in the form of WASC, the Western Association of Schools and Colleges. The West Coast accreditation organization comes to Scripps, as it comes to all colleges in our region, to do our reaccreditation. The process used to take a couple of months, generating a flurry of meetings, self-studies, reports to demonstrate we’re measuring up. We’d write a WASC report — “wasp,” we called it, for the way it buzzed around making a pest of itself. 

The WASC committee would come to campus, stirring up much hoopla and more meetings. They’d write up a report on our report, and after their visit, we’d write a report responding to their report on our report; the reports would be circulated, and more meetings would take place. Then it was over, and we could get back to work. It’s fairly pro forma with us; Scripps College runs a tight ship. 

At least that’s how it used to be, just one of those annoying things to be got through, like taxes. Now that the reaccreditation process has become snarled in proliferating state and federal demands, it’s morphed from a wasp into Godzilla, a much bigger deal — more meetings, reports, interim reports, committees sprouting like mold on a basement wall. WASC demands that we come up with “appropriate student-outcome measures to demonstrate evidence of student learning and success,” then develop tools to monitor our progress and track changes we’ve made in response to the last assessment. 

There are pre-WASC preps and post-WASC post mortems, a flurry of further meetings to make sure we’re carrying out assessment plans, updating our progress, and updating those updates. Every professor and administrator is involved, and every course and program is brought into the review. The air is abuzz with words like models and measures, performance metrics, rubrics, assessment standards, accountability, algorithms, benchmarks, and best practices. Hyphenated words have a special pizzazz — value-added, capacity-building, performance-based, high-performance — especially when one of the words is data: data-driven, data-based, benchmarked-data. The air is thick with this polysyllabic pestilence, a high-wire hum like a plague of locusts. Lots of shiny new boilerplate is mandated for syllabi, spelling out the specifics of style and content, and the penalties for infringements, down to the last detail. 

. . . 

Then the boxes with “comments, results, and summaries” are to be incorporated into an Educational Effectiveness Review Report. “By applying the rubric to last year’s senior theses enables you to evaluate both the rubric and your results to help fine-tune the assessment of this year’s theses.” (That sentence is why some of us still care about dangling participles.) This is all written in a language so abstract and bloodless that it’s hard to believe it came from a human being. But that is the point, phasing out the erring human being and replacing the professor with a system that’s “objective.” It’s lunacy to think you can do this with teaching, or that anyone would want to. 

. . . 

Do not think I am singling out Scripps College for special criticism. From what I’ve heard, it’s as bad or worse elsewhere. I think most of our faculty see our dean and president as indefatigable women who work for and not against us and genuinely respect the liberal arts. This outcomes-assessment rigmarole has been foisted on all colleges, adding a whole new layer of bureaucratic make-work. Reports and meetings bleed into one another like endless war. Forests die for the paperwork, brain cells die, spirits too — as precious time and energy are sucked into this black hole. And this is to make us more … efficient? Only in an Orwellian universe. This is to establish a “culture of evidence,” we’re told. Evidence of what? Evidence of compliance, I’m afraid. 

. . . 

Outcomes are “what a student must be able to do at the conclusion of the course,” explains an online source, and in order to ensure these, it is best to use verbs that are measurable and avoid misinterpretation. Verbs like write, recite, identify, sort, solve, build, contrast, prioritize, arrange, implement, summarize, estimate are good because they are open to fewer interpretations than verbs like know, understand, appreciate, grasp the significance of, enjoy, comprehend, feel, learn. This latter set of verbs is weak because the words are less measurable, more open to interpretation.

 . . . 

“Academics are grown-up people who do not need the language police to instruct them about what kind of verbs to use,” wrote Frank Furedi in a blistering denunciation of “learning outcomes” in Times Higher Education in 2012. Warning faculty against using words like know, understand, appreciate because “they’re not subject to unambiguous test” fosters “a climate that inhibits the capacity of students and teachers to deal with uncertainty.” Dealing with ambiguity is one of the most important things the liberal arts can teach.

. . . 

We in the humanities try to teach students to think, question, analyze, evaluate, weigh alternatives, tolerate ambiguity. Now we are being forced to cram these complex processes into crude, reductive slots, to wedge learning into narrowly prescribed goal outcomes, to say to our students, “here is the outcome, here is how you demonstrate you’ve attained it, no thought or imagination allowed.” 

Find the whole article HERE

* * *

...All this assessing requires a lot of labor, time and cash. Yet even its proponents have struggled to produce much evidence — beyond occasional anecdotes — that it improves student learning. “I think assessment practices are ripe for re-examining,” said David Eubanks, assistant vice president for assessment and institutional effectiveness at Furman University in Greenville, S.C., who has worked in assessment for years and now speaks out about its problems. “It has forced academic departments to use data that’s not very good,” he added. “And the process of getting this data that’s not very good can be very painful.”....
...Essentially, the ACCJC adopted MSLOs [measurable student learning outcomes] as the overarching basis for accrediting community colleges based on their faith in the theoretical treatises of a movement.... After repeated requests for research showing that such use of MSLOs is effective, none has been forthcoming from the ACCJC [accreditors]. Prior to large scale imposition of such a requirement at all institutions, research should be provided to establish that continuous monitoring of MSLOs has resulted in measurable improvements in student success at a given institution. No such research is forthcoming because there is none….
