Camosun Story #93: Liz and how GenAI can support student learning

I’ve interviewed Liz before, first back in 2021 when I was talking to faculty about their teaching experiences during COVID, and then about her Open Education work as part of our Open Sustainability project. This past March, Liz also received an Open Education Recognition award for her work in Open Education at the college. But this story is different. I was in the Lansdowne office one day several months ago when Liz came running in to talk to my colleague Sue about a Generative Artificial Intelligence (GenAI) assignment she had just finished running with her students. And she was SO excited I just had to see if she would tell me this story too…and she did!

Liz has been a faculty member in the Dental Hygiene program at Camosun for 35 years. In addition to supporting her students, Liz has a passion for keeping up with and teaching students about the use of current technologies and exploring different ways of evaluating learning. “Traditional ways of evaluation just don’t inspire students in their educational journey, so I try to be innovative and to find ways to ignite excitement in students.”

Preliminaries complete, we dove into a discussion of how Liz worked with GenAI in her course during the Winter 2024 term. “I didn’t know a lot about GenAI when I started this journey, but I knew that we are already behind what our students know, and that GenAI is a game changer in the information world.” Of course, there’s a lot of concern in our educational institutions about what the growth of GenAI means for us, resulting in resistance and fear amongst administrators, staff, and faculty. Not that this is anything new, as Liz noted, saying, “I remember back in high school when there was concern about the impact of calculators. I find it interesting how the initial reaction of education is to try to keep new technology out, but can we stop students from using it? I don’t think that’s possible. So, I took a different approach when it came to GenAI.” And related to all of this, Liz believes that one of the most critical things we can do for students is to teach them the difference between information, disinformation, and misinformation. “In a world where so much information comes from unreliable sources, we need to teach students to critically examine what they’re reading and assess it for validity and reliability.”

Liz began by learning about GenAI – what it is and how it works – and also met with Patsy, one of the librarians at Camosun, who helped her understand its benefits and limitations a bit more. Then Liz met with Sue and Kristina, two instructional designers in CETL, to discuss what she was thinking. “What we see in dental hygiene is patients coming into clinic after going to ‘Doctor Google’ to ‘research’ their symptoms [research in quotes because, as Liz notes, there is a difference between academic research and ‘looking stuff up’ on the Internet]. Equipped with findings from Google, patients can believe they know their problems and come seeking validation, so it’s important for students to learn how to ask patients investigative questions, in a nonjudgemental way, to assess where the information came from and determine its reliability and validity.”

To support students in building this skill, Liz decided to add a new assessment to her nutrition course, chosen because its outcomes are broad enough to allow for flexibility. She built the activity and assessment around the role nutrition plays in how the gut microbiome may contribute to inflammation, and how that inflammation may affect the mouth and vice versa. “After choosing the topic, we [Liz, Kristina, and Sue] discussed how students could use GenAI to explore it. Students, in groups, chose a topic relating inflammation, nutrition, and periodontal disease, then created a prompt which they entered into ChatGPT. Each group then examined the information provided and had to look for traditional peer-reviewed evidence to determine the reliability of the ChatGPT information.”

Before Liz set her students loose, Patsy came to the class to give students an introduction to GenAI, walk them through how to use ChatGPT (the tool Liz recommended students use), and explain how to check for source reliability. Then they began. “I wasn’t sure how it was going to play out. Students learned something about nutrition, of course, but they also learned about GenAI and how it works,” supporting some soft-skills development around the use of AI.

Liz had students use traditional academic research tools to verify the sources presented by ChatGPT. She had broken them into larger groups than she normally would because “larger groups invited more conversation and discussion among the students and presented less risk because only one of them needed to sign up for ChatGPT. We then had a class where students presented their findings. They put up their prompts on the board and we talked about what they’d discovered. Then we put up the information they found and discussed the sources of the information. And that was where the discussion took off, because in each case, many of the sources provided by AI were made up: sometimes the article title was correct, but the author was incorrect, and several of the journals cited were nonexistent, which was eye-opening for them.” And that direct experience taught them more than Liz ever could.

The class then moved on to a discussion of how they verified the information ChatGPT provided. “Again, the discussion was very rich. Students noticed that the information provided by GenAI was often general, although they were surprised by how much of the information was accurate overall. The other thing they noted was that ChatGPT provided a lot of qualifiers before answering prompts, for example saying, ‘you know, I’m not a doctor…,’ which they also found interesting.” In the end, students learned that GenAI might be useful for providing basic information as a starting point, but not for the specific information that may be needed in evidence-based care for patients.

Liz was excited by the engaging conversation the assignment produced. “Students were pumped. It was one of those magical classes where students are all talking, saying ‘Yeah, we found that – did you find that too?’ and ‘What do you think about that?’ They learned so much more than they could have learned by reading a single research paper, which wouldn’t have created that excitement and engagement. When I asked them what they thought of this assignment, they said it was their favourite assignment of the whole year. It was another example of how, when you get out of the way of students and allow them to learn, with you as the guide on the side, it blows your mind.”

Liz’s assignment is also exciting for a few other reasons. First, from an employment standpoint: “One of the things employers will be looking for is knowledge of GenAI and how to use it, but with a healthy skepticism.” And second, from a realization that this is the direction we should be heading with student assessments. “Sometimes in education, we’re afraid to let go of control. But we need to look at where our students are today and ask: Who are they? What do they want and need to learn? What kinds of tools are they familiar with? And we have to catch up to them.”

Liz emphasized the collaboration that went into creating this assessment. “I would never have been able to do this without support, whether that was reminders about concerns around student data or help maneuvering the intricacies of the technology. When I first sat down with Kristina and Sue and said, ‘we should be teaching students more about GenAI because they’re already using it but may not be aware of the benefits and limitations,’ they walked me through a thought process that helped me get to where I wanted to be. Then Kristina provided me with a sample, and I modified it from there. But without that collaboration, along with the support and encouragement to take the risk, this assessment would not have happened.”

As we wrapped up our conversation, Liz had some final words. “I think that the whole college community can benefit and learn from an experience like this. We have such a rich teaching and learning environment here, and there are so many instructors doing amazing things, but they are still not well known across the college. I think it’s a shame there aren’t more opportunities for cross-college learning and sharing.” We in CETL agree and will continue to support instructors in sharing their experiences so we can all learn from each other.

Camosun Story #69: Tim

Over the past few months, CETL educational developers have been working with faculty across the college exploring the advantages and disadvantages of Generative Artificial Intelligence (GenAI) in teaching and learning.  As we talked to more and more faculty, we discovered several who were already working GenAI into their assessments and talking about its implications with their students and I wanted to share some of their stories with you.  So here is the first of these interviews focused on using GenAI in the classroom with Tim, a full-time Instructor in the School of Business at Camosun who teaches everything from International Management to Marketing Research to Workplace Professionalism.

When I asked Tim how he is integrating GenAI into his teaching, he told me “I’d been teaching about AI for the last ten years or so when it became apparent that something like GenAI was imminent.  Up until recently, I’ve taught it in a very general way and stayed abreast of its development.  But with the rise of ChatGPT over the past year, I was asked, along with three other Instructors in the School of Business, if I would be willing to put some Professional Development time into figuring out what a good response to AI might be.  We were starting to see people misusing it from an academic honesty perspective.”

Tim spent quite a bit of time over summer 2023 keeping an eye on the various AI releases (at one point there were about a dozen new English-language AIs released every day), and by the end of August, he had built a list categorizing about 80 of them in an Excel sheet that anyone can access.

As Tim explored, he concluded that “It’s a mistake to be afraid of AI. What I tell students, is: You’ve been told that AI is coming for your job. It’s not. Somebody who knows how to use AI is coming for your job. That means you had better get out ahead of the curve and learn how to use it effectively.”

Tim explained that his approach is to turn artificial intelligence into a Research Assistant. “When I went to college and grad school, the Internet as we know it didn’t exist. Instead, we spent time going to the library, digging through card catalogs, and writing notes on cue cards.  Took forever. The Internet changed all that. But while it’s become easy to find information, it’s hard to sift through because there’s so much out there. I think AI is most useful in an academic world as a Research Assistant because it can find information and put ideas together for you in minutes rather than in hours or more. That said, we still have to teach students how to determine what information is valid.”

In other words, Tim encourages students to use GenAI tools to find ideas but to personally review the sources and websites where the ideas come from.  “You have to be careful because AIs sometimes make things up. For example, I asked an AI tool to create a timeline of Camosun College history, and it did so in two minutes. Beautifully presented. All the key events were there, but they were placed in the wrong years, and some were out by ten or more years. The AI had done the research and found the events, but when it couldn’t figure out when these events happened relative to each other, it made things up and presented them as fact.  If I didn’t know Camosun’s history, I’d have believed it.” Lesson learned: “Use AI to do the initial research and collect basic information, but then go dig and make sure that the information was used correctly.”

I wondered how Tim supports students as they work with AI tools in class and for assessments. Aside from warning them about plagiarism and checking original sources, he works with them to ensure they understand what they are presenting (in Tim’s classes, students present their papers live).  “I come from government, where if the Minister of Advanced Education has a question in the middle of your presentation, she doesn’t wait until the end to ask.  So, to replicate real-world experiences, I interrupt students in the middle of their presentations and pepper them with questions to make sure they understand what they are presenting.  Demonstrating comprehension is critically important. It’s also important they understand that while AI will do the writing for them, if they fail to develop their ability to write, they will harm their professional and personal development. In a very real sense, learning to write is learning to think.”

Tim also teaches his students how to use various AI tools in his 400-level class.  “I teach them how to use ChatGPT and the one built into Bing, which is the easiest to use, as well as how to get the tool to show you the original sources and provide APA citations. In my 400-level course, student teams do an hour-long group presentation on a particular topic each week. I give them a Backgrounder on their topic, and their job is to boil it down to something that can be explained in an hour to people who know nothing about it. For example, for a recent presentation on Fake News, I had the student team use the Gamma AI tool to build a PowerPoint-like website.  It does the research, but also allows you to edit the results.”  Tim sees Gamma AI and other GenAI tools as the next step up from the Internet and says, “If we don’t get on board and learn how to use them, we will be left behind by those who do.”

In his lower-level classes, Tim’s approach to students using AI is a bit different.  “In the Market Research class, which students take after completing intro-level Statistics, AI can’t really help. Student teams conduct Primary Research: they interview real clients from the community, design a survey, obtain ethics approval, collect data, and analyze it using Excel. Then we do Boardroom Simulations in the last two weeks of class where they present their Findings, Conclusions, Options, and Recommendations to the Board, of which I am the Chair.  It’s great fun!”

In Tim’s Workplace Professionalism course, “students complete a series of short presentations on various topics, and AI can be very helpful in conducting secondary research.  I check their comprehension in real-time by asking questions during their presentations.  I think in the future academic research skills are likely to change much as they did when we learned to use the Internet.  This means we have to focus on comprehension and application.”

When I asked how students are reacting to AI, Tim said “They’re not afraid of it at all. They live on their screens, and this is just another way of getting something done. The industrious ones will use AI to build a framework and then they will do the deep dive themselves because they’re curious. The ones who are looking for shortcuts will not do the deep dive and just pretend they understand. That’s why it’s on us to check for comprehension.”

I wondered how Tim’s colleagues have been reacting to all of this.  “It depends on what you’re teaching. If I was still teaching Statistics, AI wouldn’t bother me at all because there are already thousands of videos online students can watch until they understand the concepts.  It’s when students must engage in research that it becomes dangerous. In fact, some of my colleagues are playing with the idea of accepting only peer-reviewed sources because it is more challenging for AI to work behind paywalls (although there are ways around this).”

As we reached the end of our conversation, I asked Tim what the future holds for GenAI and his classes, and he indicated he would still be teaching GenAI tools in his 400-level courses but said, “we’ll see when I review their final papers this term whether I will have to begin checking for comprehension even more now.”

As for GenAI itself, Tim says “It’s not clear to me where AI is going to end up. On November 1, 50 countries (including Canada) – countries who recognize that AI has unintended consequences for economies – met at Bletchley Park and signed a declaration about how to regulate AI going forward. But regulations or not, we’re rapidly reaching the stage where you either use GenAI or get replaced by someone who knows how to use it. That’s why I’m teaching it.”