Assessment Rocks

Not-so-helpful Advice

I solemnly swear that this page contains No Actual Useful Advice. If you want to learn how to do assessment effectively, check out the resources page, attend a conference, or seek out one of the many excellent books on assessment. If you want to have a little fun, continue below. New articles posted occasionally. This page may include paid promotions or affiliate links. You can also just send me cash through the mail.

Even the Wild Horses Rolled Their Eyes

On a recent tour stop in Chincoteague Island, Virginia, I had two distinct experiences that illustrated the importance of listening to students and of engaging in continuous improvement.
The first experience was at the local McDonald's. They had a big sign in the window advertising a tasty (?) new variety of McFlurry, called the Banana Split McFlurry, and I figured I'd check it out.

McFlurry Advertisement

I went inside, and there was a small sign on one of the ordering kiosks: "Sorry, our McFlurry machine is broken." I was sad but not at all surprised, because McDonald's has a long history of broken ice cream machines. In fact, there have been lawsuits from franchise owners who are simply fed up (pun intended) with having to pay cartel-level prices to a McDonald's-owned company simply to repair a machine they own. I'm not going to wander into the right-to-repair legal issues here (which might be interesting some time in the future, in case my primary ax ever breaks), but let me get to the point: THEY ARE SPENDING MONEY TO ADVERTISE A PRODUCT THEY ROUTINELY CANNOT SELL TO THEIR CUSTOMERS...resulting in customers leaving unhappy and hungry.

In contrast, right across the street was the hotel where the band partied in the pool and got some rest. While it was not the newest hotel property in town, it was evident that the staff paid close attention to what guests wanted. They had a gas fire pit for keeping warm on cool nights:

Fire Pit

They turned what had been a shabby drive-through into a comfortable place to relax out of the sun:

BW Couches

They worked hard to keep the pool attractive and clean:

Pool

They also invested in landscaping and in keeping the grounds attractive.

BW external

In contrast to McDonald's - where you could not even access a product they were heavily advertising - this Best Western (I do not own any stock, BTW) paid attention to what guests wanted, and then worked hard to give it to them. Sometimes people think running a business or managing a university is extremely complex and difficult, or they fall all over themselves chasing the latest fad or whatever competitors appear to be doing. But as this example points out, it's not that hard: find out what your students and other customers want and need, and then give it to them. It's really that simple! Looking back at these pictures, I still really have a craving for a KitKat. Time to go barter with the kids over their leftover Halloween candy...

Precision vs. Meaning

Someone asked me today the best way to record a date. In the United States, we often write a date as: August 12, 1991, or 08/12/1991. In Europe, they commonly put the day first, such as 12/08/1991. For those who are old enough, one of the primary Y2K fears was that computers would stop working because dates in computers were often recorded using just a two-digit year (08/12/91). The fear was that instead of recognizing that '00 meant "2000," computers would think it was 1900, which would mess up all kinds of other important calculations (such as 'what additional fees can we slip into this Ticketmaster order without anyone complaining?').
In addition to difficulties with converting dates and times between continents, there's the time zone problem. If an event is said to "happen" at 5:03 PM (such as Jeremy spilling barbecue sauce on his work shirt), is that 5:03 PM in the time zone where the event occurred (followed immediately by Jeremy looking around to make sure no one is watching and then licking the barbecue sauce off his shirt), or is it 5:03 PM in the time zone of the person reporting the event, or some standardized time zone that we can all agree on?
My suggestion, to increase precision, is to measure time the same way SPSS measures time, which is to count the number of seconds that have elapsed since October 14, 1582. (SPSS doesn't specify how to handle events before that date, but I suppose we could mark them with negative seconds; I'll let the historians worry about that problem.) For instance, as I write this, the time would be 13,940,076,213. Very precise and specific, with no room for confusion. Unfortunately, such a measure of time, while precise, is not very meaningful, because it takes considerable mental power (and, ironically, time) to figure out what that number means.
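If you want to play along at home, here's a minimal sketch in Python (my own toy version of the idea; I'm not claiming this is what SPSS actually does under the hood):

    from datetime import datetime, timezone

    # Toy version of the SPSS convention: count the seconds elapsed since
    # October 14, 1582 (the start of the Gregorian calendar).
    SPSS_EPOCH = datetime(1582, 10, 14, tzinfo=timezone.utc)

    def spss_seconds(moment: datetime) -> int:
        """Seconds between the SPSS epoch and the given moment."""
        return int((moment - SPSS_EPOCH).total_seconds())

    # Precise? Absolutely. Meaningful? You tell me.
    print(spss_seconds(datetime.now(timezone.utc)))  # something in the 13-billion range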

In contrast, if I use a very imprecise measure of time, say "the 1980s", this is immediately meaningful and, depending on your age and perspective, brings to mind various feelings, thoughts, images, sounds, or even smells. But it is also quite vague, because, as those of us who lived through the 1980s will tell you, there is a big difference between 1987 (when rock ruled the airwaves) and 1981 (when disco was still in its groove. Ick.).

We face a similar challenge when it comes to assessment. While precision and accuracy are important in assessment, letting our obsession with precision get carried away is expensive and time-consuming. And an overemphasis on precision tests the limits of meaningfulness. For example, we care if students know how to write. But we don't really care if they know how to write 1.84902% better than last year's students. We just want to know if what we are doing is working and what we can do to get better. This - knowing if what we are doing is working and how to get better - is the heart of assessment, and it means we sometimes have to let go of our obsession with precision. Oh, which reminds me, it's now 13,940,077,137, so I'd better get going to my next meeting!

World Tour!

I recently completed a world tour (well, Berlin plus an airport layover in Frankfurt) with the Assessment Rocks house band and sat in on a few gigs with Steel Panther. Here's a tour promotional photo:

World Tour Promotional Photo

Here are a couple key things I learned on the 2024 world tour:

The Tyranny of Assessment

The Russian Republic of Chechnya recently banned all music that had tempos below 80 beats per minute or above 116 beats per minute. This means songs like Nirvana's Smells Like Teen Spirit would be banned (slightly too fast at 117 BPM, although their unplugged version might make the cut). As a rocker, I object! I wonder if a song like Psychostick's NSFW, which uses the f--- word 455 times, would be permissible under these guidelines? If they promised to keep their performance of the song within the required tempo range, would that work? (Aside: How widely known is the F-word around the world? This would be a good topic for a dissertation if you ask me.)

As I was thinking about the tyranny of banning songs with specific tempos, it got me thinking about tyranny in assessment. I personally have tried to avoid tyranny in my own work, but I have seen it - from accreditors who have extremely stringent assessment requirements (which are based only on anecdotal evidence), to peer reviewers who insist that one must use a specific rubric to assess everything, to institutions that unjustly use assessment to close programs and cut funding.

At its heart, assessment should be about learning, not about control. It should be about empowering people, not taking power away. It should be about growing and improving, not forcing adherence to a silly set of rules. The more you squeeze, the more slips through your fingers.

As assessment leaders, let's let go of tyranny and keep our work focused on learning. And when we let go of tyranny, it opens up all kinds of possibilities, like rocking out to the classic rock anthem Three Little Pigs by Green Jelly (144 beats per minute).

Top 10 Tips for Successful Assessment from an Extremely Seasoned Assessment Rocker

I was thinking about it this morning, and, as hard as it is to believe, I have now been professionally involved in higher education assessment for more than 15 years. I used to say "more than 10 years,"...how time flies! I am so grateful for the many people who have so freely shared their advice and guidance with me over the years - and I still have so much to learn! To keep the tradition of sharing going, here are the top 10 things I've learned in assessment so far!

The Late Show Theater in New York City

#10: Faculty are not the enemy. Without faculty, the university would just be a social club with mediocre sports teams. Much of the work of assessment is about serving faculty.

#9: Students are not the enemy. Without students, my job would not exist. Much of the work of assessment is about serving students.

#8: Accreditors are not the enemy. I've met the accreditors and they are us. They are member organizations. If there is a problem with an accreditor, volunteer. Sign up. Join. Change comes from within.

#7: Assessment fads come and go. Don't worry about them. Let them pass in the night like a tour bus rolling past a White Castle at 2 AM. Okay, on second thought maybe you can stop for just a minute, or maybe just hit the drive-through. But it's not the destination. You know your students and your faculty. Focus on what meets their needs rather than on what some outsider claims is the next hottest thing in assessment (especially this lousy site).

#6: Surveys are a starting point, not an ending point. I've never once handed someone survey results and had them respond: "Thank you, this answers every single one of my questions and now we know exactly what to do with our lousy degree program." Surveys are great and provide information, but they will never tell you what to do. Which brings us to:

#5: Data don't speak for themselves. All data require interpretation. Be aware of your own bias and lens when you are interpreting data and recognize that others will see the data differently.

#4: Recognize the limitations of data analysis. In assessment we routinely have to make decisions with incomplete or inconclusive data. It is always possible to collect more data, but usually this just delays a decision you know must be made and rarely changes the decision.

#3: Get connected: With colleagues through professional organizations, with those at nearby schools, at conferences and events. You will learn so much from them and it will keep this job from feeling so lonely.

#2: Avoid assessment dogma. There is no One Right Way to do assessment. Have a big toolbox where you can pull out assessment methods and instruments that meet the interests and needs of the faculty and students you work for. Dogma kills good assessment. Innovation, creativity, and openness to new ways of assessing things will create energy, interest, and engagement.

#1: Have fun. If you enjoy your work, others will enjoy working with you. Why be miserable? Life is too short. Assessment gives you an amazing way to engage with faculty and students, and it should be fun!

Hertz's Investment in EVs Fails: Shocking.

Last year I was scheduled to present at a conference. Instead of driving my own extremely functional but also extremely boring and borderline unsafe Toyota Corolla (it is smaller than just about any other vehicle on the road and lacks any driver-assist features), I thought it would be fun to rent a car for the trip. Hertz had been advertising the availability of electric vehicles, and I thought, what a great time to try one out!

A white Tesla Model 3

I planned my route in advance to make sure there was a charger along the way, and built in about two extra hours for charging (I planned to eat lunch while the car charged at a convenient stop). I go to pick up the Tesla early Monday morning, and the first thing I notice is that the car is not currently charging. The pick-up location has no chargers. I get into the car, and it indicates 15% battery remaining. Crap. I take it to the nearest Supercharger. The car tells me: "60 minutes to full charge." Then: "80 minutes to full charge." Then: "100 minutes to full charge."

Now, I'm no time or physics expert, but generally it is my understanding that, as time elapses, the remaining time to wait before continuing your journey should be decreasing...?

It becomes obvious, after sitting in the car for 25 minutes and adding only about 30 additional miles, that it is not going to be possible to complete the journey with this vehicle. So I drive home, it sits in my driveway, and I fume for the whole trip to KC in my personal vehicle.

I'm not here to whine or complain about Hertz (I've done that in plenty of other places), and they did refund all charges for the "trip" (in quotes because I missed my first presentation due to being LATE), but I am here to point out that understanding Hertz's failure is important for us assessment rockers out there.

Hertz bet big on EVs (and even had a deal with Uber where certain drivers could rent a Tesla to use for their ridesharing work). But what they didn't understand was that you can't just buy a bunch of EVs and throw them into a system designed for vehicles with internal combustion engines and expect them to work. Why didn't they, for example, supply their rental locations with something simple, like extension cords, to improve the performance of these rentals?

An orange extension cord

A stupid extension cord. Just have one lying around so the vehicle could be plugged in overnight and get a few more miles on it for the next customer. But Hertz failed to consider how this change to their business would affect all other parts of their business. You can't make a major change and expect it just to work given that everything else in the business was built around ICE vehicles. It is an important lesson for assessment leaders - as we seek to implement change, we can't just change one thing and expect everything else to just fall into place. Higher education institutions are complicated systems (maybe even more complicated than renting cars to people), and when we implement change we have to consider the whole system in which those various processes operate. And maybe keep a few extension cords lying around just in case.

Assessment: This is the Way logo in the style of the Mandalorian

In the Disney+ TV series The Mandalorian, the title character (named Din Djarin) lives his life by a very strict creed. Three of its most commonly cited rules are: never remove your helmet in front of another living being, always help another Mandalorian, and never finish a sentence with a preposition.

Mandalorian and Grogu

He follows these three rules extremely carefully throughout the first two seasons. Amazingly, even though he spends much of his time on desert planets and in various spaceships (with no obviously visible humidifiers), he never once removes his helmet to pick his nose. Frequently, when a decision is made that aligns with some aspect of the Mandalorian creed, Din (or another Mandalorian) states "This Is the Way." This statement immediately ends any additional discussion about what to do, strategy, etc. However...partway through season 2, he meets Bo-Katan Kryze, the leader of a group of Mandalorians known as the Mandalore resistance. Bo-Katan and her followers have also pledged to follow the creed of the Mandalorians, but shockingly they are more than happy to remove their helmets. THIS IS NOT THE WAY, Mando (as his friends call him) must be screaming in his head. He is SHOCKED. Simply SHOCKED! To see that others who claim to follow the creed behave so differently than he does. (I'm still waiting to see if, when he finally removes his helmet, he goes straight for the nose.)

What does this have to do with assessment? Well, there are many similarities between us assessment leaders and the Mandalorians. We too have taken a creed of sorts, in that we have committed our work to the task of ensuring and improving student learning. And, like the Mandalorians, too often we dogmatically follow elements of the "creed" that, while they perhaps made sense at one point, are not critically important to our work.

For the Mandalorians, maybe it made sense at one point to always wear helmets. Maybe rocks were falling from the sky and members kept getting bonked in the head. Maybe some of the early Mandalorians were unbearably ugly. And by the start of season 3 it is obvious that, whatever the reason once was, it is now lost to time.

Similarly, as assessment leaders we may have committed ourselves to a specific piece of assessment dogma: maybe it's insisting everything be measured with a rubric, or maybe it's requiring every single program, no matter how small or large, to complete a report annually using the same 15-page template, or maybe it's insisting that only certain action verbs can be used to write effective learning outcomes statements. Those are all dogma. What is most important is that we keep focused on why we got into this work in the first place: to ensure and improve student learning. And let everything else go...

Unsatisfied with Satisfaction

Interpreting results from satisfaction surveys can be a bit tricky. The most important thing to remember about satisfaction is that it is the gap between expectations and performance. If performance exceeds expectations, then you have positive satisfaction. If performance is below expectations, then you have dissatisfaction.
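If it helps to think of it as arithmetic, here's a minimal sketch in Python (using a made-up 1-to-10 scale of my own, not any validated instrument):

    def satisfaction_gap(performance: float, expectations: float) -> float:
        """Positive means delighted, zero means merely fine, negative means dissatisfied."""
        return performance - expectations

    print(satisfaction_gap(performance=9, expectations=5))  #  4: pleasantly surprised
    print(satisfaction_gap(performance=2, expectations=8))  # -6: keep reading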

Let me provide an illustration. I was on a car trip across the state to give a presentation at an institution. The drive was 4.5 hours, and by the time I arrived in town I was looking forward to getting a fresh, crispy, juicy salad for dinner (I know a salad isn't rock-and-roll, so don't give away my secret.) I had in mind a salad perhaps a bit like this:

A very Tasty Salad

I picked up my order at the restaurant, drove back to the hotel, went up the stairs to my room, and sat down in front of the TV to watch the football game and enjoy my tasty, juicy salad. I opened the clamshell, and this is what I saw (this is the real, actual picture of my salad):

a horrible salad

I laughed out loud. I laughed so hard it made the $4.99 I paid for the salad totally worth it. I've sat through comedy movies that cost me $8.50 plus $12 for popcorn and soda where I didn't laugh nearly as hard as I laughed at this salad. In my mind I was imagining the conversation in the kitchen:

Manager: "Someone finally ordered a salad! I'll need it to go in five minutes!"

Cook (a 16-year-old who doesn't really want to be at work in the first place): "It's a Sunday night! We are running low on a few items! Like: carrots, tomatoes, onions, cheese, cucumbers, olives, lettuce."

Manager (who thought he would be running an upscale, four-star bistro instead of this dumpy diner): "I SAID, give me a salad! Now you are down to 3 minutes!"

Cook: "Fine. All we have is some brown lettuce that I found under the ketchup in the back of the fridge. I was going to eat it to give myself food poisoning, but I suppose I could give it to this idiot."

Now, that's dissatisfaction!

GOAT Coffee

I heard on the news yesterday that one of the greatest metal guitarists of all time - Kirk Hammett of Metallica - is now selling his own coffee line. Hearing this news made me feel really old, and not just because coffee is a drink I always considered a drink for grown-ups.

Coffee on the top of a mountain with cozy chairs and a table

Then after poking around on the internet a bit, I realized I am probably the only person left in the United States who does not already have their own coffee line. Making matters worse, this means that almost every good coffee company name is now taken. For example, there's a coffee company called "GOAT" (Greatest Of All Time). Think about that for a moment. They are claiming to produce The Greatest Coffee Of All Time. Not just good. Not great. Not excellent. The Greatest Of All Time. While as a rock-and-roller, I appreciate the idea of going big or going home...I wonder if it is even possible to defend this assertion. I mean, how many coffees can they have tested in their lifetime? What evidence do they have that they really are better than any other coffee? Their website includes "their story," which says its mission is to "craft new coffee experiences for you." I am hard pressed to describe a "coffee experience," let alone a New Coffee Experience. Most of the time, I don't want an experience - I just want something to keep my head off my keyboard at 2:00 PM.

In my experience, most higher education institutions have at least one GOAT department. (If you work at a school with an animal science program, your school might literally have a GOAT department.) By GOAT, I mean departments that are so full of themselves, so obsessed with their own greatness and excellence, that they cannot imagine ever needing to improve.

How to spot a GOAT department? Here are some key symptoms:

GOAT departments. GOAT coffee. The lesson from assessment is that there is no such thing. That no matter how good you are, you can always be better. And the moment you stop working to improve, the moment you think you've made it, is the moment when you start to decline. Think about it: Michael Jordan, the greatest to ever play the game, practiced extremely hard. He was the greatest of all time and yet he still put in the work. Back at it (but maybe a refresh on my coffee first!).

Never in the History of Humankind

There's an old saying that if you put 1,000,000 monkeys on 1,000,000 typewriters for 1,000,000 years, eventually one of the monkeys will fling poo directly into your face and laugh. I'm not sure about the monkeys, but I know my kids would laugh.
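(For the record, the math behind the non-poo version of the saying looks roughly like this; a back-of-the-envelope Python sketch with numbers I made up, assuming every keystroke is random and independent:)

    import math

    # Chance that at least one of many independent attempts succeeds.
    def p_at_least_once(p_single: float, attempts: float) -> float:
        # use logs so the tiny per-attempt probability doesn't vanish to zero
        return 1 - math.exp(attempts * math.log1p(-p_single))

    # Probability that one random 10-keystroke burst on a 50-key typewriter spells "assessment"
    p_word = (1 / 50) ** 10

    # 1,000,000 monkeys, one 10-keystroke burst every 10 seconds, for 1,000,000 years
    bursts = 1_000_000 * (60 * 60 * 24 * 365 / 10) * 1_000_000

    print(p_at_least_once(p_word, bursts))  # effectively 1.0 - the monkeys do get there eventually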

But the point of the monkey story is that, given enough time and opportunities, any event can occur by chance. I call BS (baloney stuff) on this one. Here's my counter-argument:

In the course of my career in higher education I have often been responsible for presenting program enrollment numbers to program leaders and faculty. It seems like a simple request: count up how many students are enrolled in a certain degree program. But it doesn't matter if the counts I am presenting are for undergraduate programs or graduate programs, large or small programs, new programs or old programs, or programs whose name is an anagram for "adieu crotch Teena"; invariably, upon receiving the reported counts, a member of the faculty committee will, after frowning at the numbers for a few minutes, state "I don't think this is quite correct."

It is possible the lack of agreement between the count of students in the program I produce and what faculty expect to see is due to some systematic error I cause. As any reader of this site can tell you, I am by no means perfect. I've evn ben known to mke a spleling eror or too.

It is also possible that the discrepancy is due to the complexity of how students enroll in our programs. Some students enroll in more than one program (should we count them in one program to avoid double counting, or in both to capture their true enrollment pattern?), some students enroll part-time (are you interested in enrollment FTE, to gauge the load on the faculty, or headcount, in case you want to stuff them all in a room to tell them the program is moving online immediately due to the outbreak of an infectious disease?), and sometimes students participate in programs before they are fully enrolled.
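To make the ambiguity concrete, here's a toy sketch in Python (completely made-up students and a made-up 12-credit full-time load; no resemblance to anyone's real data):

    # Three enrollment records; note the double major and the part-time student
    enrollments = [
        {"student": "Teena", "program": "History", "credits": 12},
        {"student": "Teena", "program": "Education", "credits": 3},
        {"student": "Jeremy", "program": "History", "credits": 6},
    ]

    history = [e for e in enrollments if e["program"] == "History"]

    # Headcount: how many distinct humans are in the program
    headcount = len({e["student"] for e in history})

    # FTE: total credits divided by a full-time load of 12
    fte = sum(e["credits"] for e in history) / 12

    print(headcount, fte)  # 2 students, but only 1.5 FTE - and both answers are "correct"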

But I think the most likely reason, and you can quote me on this, is lack of sufficient funding. Due to the pandemic, budget cuts, and a worldwide banana shortage, for the last two years my office has been getting by with only 734,852 monkeys typing on 734,852 typewriters. But with proper funding we can get my monkey team back above that 1,000,000 mark, and then, some day in the future, I will have the joy of sitting in a meeting and waiting for that magical moment when a data user carefully reviews program enrollment counts, looks up with a smile slowly spreading across their face, and says, "By Jove, I think you've done it!"

Let's Face It: Nobody Is Reading Your Report - Not Even by Accident

"5:00 PM: Solve world hunger: tell no one."

One of the top fallacies in assessment - in all of science, really - is that a well-written report, posted on your lousy website, with the answer to an immediately pressing problem - like solving the puzzle of why there is such a high correlation between the amount of blood running out of your nose and the likelihood that the Kleenex box will be empty (just ask Billie Eilish) - will be sufficient to create action to solve your problem. (Here's a preview: the solution? Stop picking your nose.)

Unfortunately, a study by the World Bank found that few of its PDF reports received much attention. In fact, just under a third had never been downloaded, not even once, and only 13% had more than 250 downloads. Those are pretty sad statistics, especially because almost every website now gets dozens of hits from random web bots who want to steal your content (I solved this by not having any content of any value - haha, take that!).

But maybe, if you are like me, you are thinking, "but I am sure my assessment reports are far more interesting than some stupid World Bank report on world hunger and economic policy!" Now that I've written this down I can see how ridiculously insane that sounds. Because I hope avoiding worldwide financial meltdowns and decreasing world hunger is more important than discovering how satisfied the eighth floor of Jackson Hall is with our recent response to their bedbug infestation (so you don't have to go looking: 43% were "screaming into their pillows").

It is therefore obvious that if we want people to pay attention to our assessment and scientific findings, we are going to have to find other ways of communicating these results. Creating a PDF, no matter how beautifully prepared, and dropping it on an obscure page on your lousy website on a Friday afternoon is not getting it done. Hey, here's an idea! Collect a bunch of empty Kleenex boxes (these should be easy to find the next time you have a bloody nose). Then fill them with copies of your latest report, and place them back in their various locations. Turn off your building's humidifier, stir up some dust into the air, and, soon enough, people all over your building will be putting their eyes on your report. Problem solved!