Camosun Story #69: Tim

Over the past few months, CETL educational developers have been working with faculty across the college to explore the advantages and disadvantages of Generative Artificial Intelligence (GenAI) in teaching and learning. As we talked to more and more faculty, we discovered several who were already working GenAI into their assessments and talking about its implications with their students, and I wanted to share some of their stories with you. So here is the first of these interviews focused on using GenAI in the classroom, with Tim, a full-time Instructor in the School of Business at Camosun who teaches everything from International Management to Marketing Research to Workplace Professionalism.

When I asked Tim how he is integrating GenAI into his teaching, he told me “I’d been teaching about AI for the last ten years or so when it became apparent that something like GenAI was imminent.  Up until recently, I’ve taught it in a very general way and stayed abreast of its development.  But with the rise of ChatGPT over the past year, I was asked, along with three other Instructors in the School of Business, if I would be willing to put some Professional Development time into figuring out what a good response to AI might be.  We were starting to see people misusing it from an academic honesty perspective.”

Tim spent quite a bit of time over summer 2023 keeping an eye on the various AI releases (at one point there were about a dozen new English-language AIs released every day of the week), and by the end of August he had built a list categorizing about 80 of them in an Excel sheet that anyone can access.

As Tim explored, he concluded that “It’s a mistake to be afraid of AI. What I tell students, is: You’ve been told that AI is coming for your job. It’s not. Somebody who knows how to use AI is coming for your job. That means you had better get out ahead of the curve and learn how to use it effectively.”

Tim explained that his approach is to turn artificial intelligence into a Research Assistant. “When I went to college and grad school, the Internet as we know it didn’t exist. Instead, we spent time going to the library, digging through card catalogs, and writing notes on cue cards.  Took forever. The Internet changed all that. But while it’s become easy to find information, it’s hard to sift through because there’s so much out there. I think AI is most useful in an academic world as a Research Assistant because it can find information and put ideas together for you in minutes rather than in hours or more. That said, we still have to teach students how to determine what information is valid.”

In other words, Tim encourages students to use GenAI tools to find ideas but to personally review the sources and websites where the ideas come from. “You have to be careful because AIs sometimes make things up. For example, I asked an AI tool to create a timeline of Camosun College history, and it did so in two minutes. Beautifully presented. All the key events were there, but they were placed in the wrong years, and some were out by ten or more years. The AI had done the research and found the events, but when it couldn’t figure out when these events happened relative to each other, it made things up and presented them as fact. If I didn’t know Camosun’s history, I’d have believed it.” Lesson learned: “Use AI to do the initial research and collect basic information, but then go dig and make sure that the information was used correctly.”

I wondered how Tim supports students as they work with AI tools in class and for assessments. Aside from warning them about plagiarism and checking original sources, he works with them to ensure they understand what they are presenting (in Tim’s classes, students present their papers live). “I come from government, where if the Minister of Advanced Education has a question in the middle of your presentation, she doesn’t wait until the end to ask. So, to replicate real-world experiences, I interrupt students in the middle of their presentations and pepper them with questions to make sure they understand what they are presenting. Demonstrating comprehension is critically important. It’s also important they understand that while AI will do the writing for them, if they fail to develop their ability to write, they will harm their professional and personal development. In a very real sense, learning to write is learning to think.”

Tim also teaches his students how to use various AI tools in his 400-level class. “I teach them how to use ChatGPT and the one built into Bing, which is the easiest to use, as well as how to get the tool to show you the original sources and provide APA citations.” “In my 400-level course, student teams do an hour-long group presentation on a particular topic each week. I give them a Backgrounder on their topic, and their job is to boil it down to something that can be explained in an hour to people who know nothing about it. For example, for a recent presentation on Fake News, I had the student team use the Gamma AI tool to build a PowerPoint-like website. It does the research, but also allows you to edit the results.” Tim sees Gamma AI and other GenAI tools as the next step up from the Internet and says, “If we don’t get on board and learn how to use them, we will be left behind by those who do.”

In his lower-level classes, Tim’s approach to students using AI is a bit different. “In the Market Research class students take after completing Intro-level Statistics, AI can’t really help. Student teams conduct Primary Research, interviewing real clients from the community, design a survey, obtain ethics approval, collect data, and analyze it using Excel. Then we do Boardroom Simulations in the last two weeks of class where they present their Findings, Conclusions, Options, and Recommendations to the Board, of which I am the Chair. It’s great fun!”

In Tim’s Workplace Professionalism course, “students complete a series of short presentations on various topics, and AI can be very helpful in conducting secondary research. I check their comprehension in real time by asking questions during their presentations. I think academic research skills are likely to change in the future, much as they did when we learned to use the Internet. This means we have to focus on comprehension and application.”

When I asked how students are reacting to AI, Tim said “They’re not afraid of it at all. They live on their screens, and this is just another way of getting something done. The industrious ones will use AI to build a framework and then they will do the deep dive themselves because they’re curious. The ones who are looking for shortcuts will not do the deep dive and just pretend they understand. That’s why it’s on us to check for comprehension.”

I wondered how Tim’s colleagues have been reacting to all of this.  “It depends on what you’re teaching. If I was still teaching Statistics, AI wouldn’t bother me at all because there are already thousands of videos online students can watch until they understand the concepts.  It’s when students must engage in research that it becomes dangerous. In fact, some of my colleagues are playing with the idea of accepting only peer-reviewed sources because it is more challenging for AI to work behind paywalls (although there are ways around this).”

As we reached the end of our conversation, I asked Tim what the future holds for GenAI and his classes. He indicated he would still be teaching GenAI tools in his 400-level courses but said, “we’ll see when I review their final papers this term whether I will have to begin checking for comprehension even more now.”

As for GenAI itself, Tim says “It’s not clear to me where AI is going to end up. On November 1, 50 countries (including Canada) – countries that recognize that AI has unintended consequences for economies – met at Bletchley Park and signed a declaration about how to regulate AI going forward. But regulations or not, we’re rapidly reaching the stage where you either use GenAI or get replaced by someone who knows how to use it. That’s why I’m teaching it.”