
Learning is hard

Yesterday, Herbert Tsang and Qinqin Zhang from TWU hosted a colloquium on Academic Integrity at our Langley campus. I learned a bit about learning after my presentation...

Qinqin had asked me to present on 'Technology and Academic Integrity', and I happily agreed, as I have some ideas about that particular topic. I am not an expert on academic dishonesty and how to prevent it, but I am an expert in educational technology, and I have thought deeply and read widely about it.

Early in the day, the keynote speaker talked about what I would call 'humanizing academic integrity policies'. Colloquium participants shared horror stories of students failing papers for missing closing quotation marks; we talked about the moral bankruptcy of 'zero-tolerance' and similar policies that remove any agency from faculty and students. We talked about trust, honesty, courage, and other character traits displayed by academically honest people. She spoke of relaxing some policies to reflect the complexity of academic dishonesty, and of treating violations as formative learning experiences rather than occasions for punitive sanctions.

It was a great morning, with a significant amount of agreement in the room about what she was saying. There were gentle critiques of Turnitin, and a couple of people clearly endorsed the institution's adoption and use of Turnitin. Nothing at all out of the ordinary.

I started my presentation late in the day (slide deck here), just after the last coffee break, when we were running a little behind schedule. I opened with a Padlet for participants to share their views on the following stems:

  • learning is...
  • teaching is...
  • students are...

The submitted answers clearly reflected an understanding that teaching and learning are complex, messy, non-linear processes that involve the most difficult task in the world: changing someone's mind.

The next couple of stems were:

  • technology is...
  • algorithms are...

Again, the responses reflected the fact that people in the room were rather suspicious of technology. We talked about the recent NY Times article describing how 'smart home' technology is being used to abuse and terrorize victims of domestic violence. We talked about how Alexa is showing up in classrooms, and about the negative effects of having for-profit, third-party companies process and store the conversations of marginalized students, including undocumented minors who may face deportation.

The first submission related to algorithms was "racist, sexist, homophobic". I agreed with one commenter who spoke at length about how that was true. We talked about why it was true. We talked about the commoditization of the web and the idea of surveillance capitalism. None of this was controversial. Some of it was new, like the idea that university students are largely digitally illiterate.

It was at this point that I brought Turnitin into the equation.

I pointed out that Turnitin was guilty on every count we had just raised against algorithmic technology. Surveillance capitalism. For-profit. Appropriating student work. Black-box algorithms that nobody understands. Digital redlining. Policing students. Creating a culture of suspicion and the presumed guilt of students.

Suddenly, the Turnitin apologists were on the defensive. Suddenly my sources, Jesse Stommel and Audrey Watters, who had informed every bit of my presentation and the preceding discussion, were "way to the radical left" in education, with a small group of niche followers who have clearly drunk the Kool-Aid. And Audrey isn't even teaching right now (oh the horror...)! So anything that they say should be taken with a grain of salt. Obviously, none of these criticisms are remotely relevant to the ideas that I had presented.

I am disappointed in myself for not having a coherent, polite response. All I could muster was "That's fair." (Sometimes, my introversion is really inconvenient.) It wasn't fair, and it wasn't until I was driving home that I clued in that the person who was completely undermining my sources had been, only a few minutes prior, totally on board. In fact, that person was so on board that I had joked they should be credited as an author of my slides.

It wasn't until still later that I remembered which slide was on the screen while this person was complaining about how radical and 'leftist' my sources were: it called for educators to trust students more and to stop reducing them to data points. Why is that radical?

Why is compassion 'left-wing'? Shouldn't compassion be a 'human thing'?

But all in all, it does go to show that we can be blind to our biases and to the inconsistencies in our own views. The person who became so defensive isn't a bad person, but when an inconsistency between their views and their actions became apparent, it was extremely difficult for them to step outside their initial reaction and see what was going on. That first reaction was an ad hominem response, criticizing a person rather than an idea.

But I didn't need to take their discomfort and their response as a personal affront. They were experiencing cognitive dissonance, which is necessary for learning but is also difficult. I believe that what I had to say is important, and that the participants in the colloquium likely hadn't encountered resistance to using Turnitin before. After all, it's the kind of SaaS that makes life easier for faculty and promotes academic integrity at the same time. That should be a no-brainer, right?

Unfortunately, too many technologies in education really are pushed as 'no-brainers'. Audrey Watters has dedicated her career to warning us that the 'no-brainers' have all been done before, and that the results are far from encouraging. Apparently, learning that lesson is hard.