1 00:00:02,370 --> 00:00:03,810 Mary Loder: - Welcome to Course Stories, 2 00:00:03,810 --> 00:00:06,510 Mary Loder: produced by the Instructional Design and New Media team 3 00:00:06,510 --> 00:00:09,480 Mary Loder: of EdPlus at Arizona State University. 4 00:00:09,480 --> 00:00:12,150 Mary Loder: In this podcast, we tell an array of course design stories 5 00:00:12,150 --> 00:00:16,050 Mary Loder: alongside other ASU online designers and faculty. 6 00:00:16,050 --> 00:00:17,343 Mary Loder: On today's course story. 7 00:00:18,480 --> 00:00:21,060 Andrew Maynard: - This is an incredibly powerful, 8 00:00:21,060 --> 00:00:23,460 Andrew: but an open-ended platform 9 00:00:23,460 --> 00:00:26,580 Andrew: where it's as good as the creativity you bring to it. 10 00:00:26,580 --> 00:00:28,980 Andrew: And helping students understand that 11 00:00:28,980 --> 00:00:31,950 Andrew: within the limitations of the system, 12 00:00:31,950 --> 00:00:34,260 Andrew: they can play and they can innovate themselves 13 00:00:34,260 --> 00:00:35,610 Andrew: and they can learn for themselves 14 00:00:35,610 --> 00:00:36,780 Andrew: how to get the most out of it. 15 00:00:36,780 --> 00:00:39,060 Andrew: That then becomes transformational. 16 00:00:39,060 --> 00:00:41,010 (lively music) 17 00:00:41,010 --> 00:00:42,510 Mary: - Hi, I'm Mary Loder, 18 00:00:42,510 --> 00:00:44,640 Mary: an instructional designer from ASU Online. 19 00:00:44,640 --> 00:00:46,650 Ricardo Leon: - I'm Ricardo Leon, I'm a media specialist 20 00:00:46,650 --> 00:00:47,820 Ricardo: at the same place. 21 00:00:47,820 --> 00:00:48,660 Mary: - Yeah, we work together. 22 00:00:48,660 --> 00:00:50,560 Ricardo: - Let's get on with the show. Mary: - Okay. 23 00:00:52,380 --> 00:00:53,250 Ricardo: - Hey Mary, and hey Liz. 24 00:00:53,250 --> 00:00:54,750 Liz Lee: - Hey. Mary: - Hello. 25 00:00:54,750 --> 00:00:55,770 Ricardo: - How are you guys today? 26 00:00:55,770 --> 00:00:57,120 Mary: - Good. Liz: - Pretty good. 27 00:00:57,120 --> 00:00:58,080 Ricardo: - What's going on today? 28 00:00:58,080 --> 00:01:00,330 Mary: - We had a few great people at the university 29 00:01:00,330 --> 00:01:02,970 Mary: around generative AI join a conversation today. 30 00:01:02,970 --> 00:01:05,370 Mary: Generative AI is like a very hot topic. 31 00:01:05,370 --> 00:01:08,940 Mary: This will be a highly listened-to episode, I have a feeling. 32 00:01:08,940 --> 00:01:09,990 Ricardo: - And so what's the course 33 00:01:09,990 --> 00:01:11,580 Ricardo: that we're focusing on this episode? 34 00:01:11,580 --> 00:01:14,610 Mary: - Well, FIS 394 is an introduction to 35 00:01:14,610 --> 00:01:18,540 Mary: basic prompt engineering using specifically ChatGPT. 36 00:01:18,540 --> 00:01:21,036 Mary: So Andrew hosted this class for the very first time, 37 00:01:21,036 --> 00:01:24,119 Mary: and I think it's the first of its kind that was hosted 38 00:01:24,119 --> 00:01:28,170 Mary: in like a really large public university this summer. 39 00:01:28,170 --> 00:01:29,910 Mary: There were quite a few enrollments, 40 00:01:29,910 --> 00:01:32,700 Mary: not only of like students who are fully enrolled here, 41 00:01:32,700 --> 00:01:35,130 Mary: but also staff who we're gonna be hearing from 42 00:01:35,130 --> 00:01:36,360 Mary: in the episode as well, 43 00:01:36,360 --> 00:01:39,390 Mary: because we have access to these classes as ASU employees. 
44 00:01:39,390 --> 00:01:41,940 Ricardo: - Yeah, so who is Andrew and who are these people 45 00:01:41,940 --> 00:01:43,470 Ricardo: that we're gonna be talking to? 46 00:01:43,470 --> 00:01:45,540 Mary: - Andrew Maynard is, well, 47 00:01:45,540 --> 00:01:47,880 Mary: he's who I previously called the king of podcasting 48 00:01:47,880 --> 00:01:49,906 Mary: until it offended you, but he is a wonderful-- 49 00:01:49,906 --> 00:01:51,492 Ricardo: - Until I defeated him at podcasting. 50 00:01:51,492 --> 00:01:53,190 (Mary laughs) 51 00:01:53,190 --> 00:01:57,420 Mary: - Yes. He's a wonderful human being who has lots of insight 52 00:01:57,420 --> 00:02:00,600 Mary: into how technology integrates into our future. 53 00:02:00,600 --> 00:02:02,100 Ricardo: - And who else did we talk to? 54 00:02:02,100 --> 00:02:04,200 Liz: - We also talked to Mary Ann Naumann, 55 00:02:04,200 --> 00:02:06,930 Liz: she works with the ASU Library. 56 00:02:06,930 --> 00:02:10,620 Liz: And she has a really unique perspective being a librarian, 57 00:02:10,620 --> 00:02:15,060 Liz: especially her views on ChatGPT, AI and research, 58 00:02:15,060 --> 00:02:17,640 Liz: how it can help and how it doesn't help, 59 00:02:17,640 --> 00:02:19,200 Liz: which is fascinating. 60 00:02:19,200 --> 00:02:23,370 Liz: And she also took not only Andrew Maynard's course, 61 00:02:23,370 --> 00:02:26,970 Liz: but she also took the GenAI course developed by Tamara, 62 00:02:26,970 --> 00:02:28,680 Liz: who we also spoke to today. 63 00:02:28,680 --> 00:02:30,540 Mary: - And Tamara worked with lots of people 64 00:02:30,540 --> 00:02:32,520 Mary: across the entire university to put that course together. 65 00:02:32,520 --> 00:02:34,440 Mary: She was an excellent project leader, 66 00:02:34,440 --> 00:02:36,120 Mary: did a ton of work herself. 67 00:02:36,120 --> 00:02:39,420 Mary: This is a gorgeous course for faculty and staff at ASU. 68 00:02:39,420 --> 00:02:41,370 Mary: And while it's not available to everybody, 69 00:02:41,370 --> 00:02:42,810 Mary: we will put a link in our show notes. 70 00:02:42,810 --> 00:02:43,890 Mary: 'Cause if you haven't gone into it, 71 00:02:43,890 --> 00:02:45,150 Mary: or if you need to reintegrate 72 00:02:45,150 --> 00:02:47,190 Mary: 'cause it came out just during a really busy time. 73 00:02:47,190 --> 00:02:49,830 Ricardo: - Yeah, and it was all hands on deck. 74 00:02:49,830 --> 00:02:50,760 Mary: - Oh, yes. 75 00:02:50,760 --> 00:02:51,690 Ricardo: - We were involved in it, 76 00:02:51,690 --> 00:02:53,940 Ricardo: we were creating video content for it. 77 00:02:53,940 --> 00:02:56,220 Mary: - Writing content, revising content, 78 00:02:56,220 --> 00:02:58,020 Mary: helping others who were doing parts of it. 79 00:02:58,020 --> 00:02:59,040 Mary: I mean, it was just, 80 00:02:59,040 --> 00:03:02,340 Mary: anybody who was interested in helping was, you know, 81 00:03:02,340 --> 00:03:03,300 Mary: invited in to help. 82 00:03:03,300 --> 00:03:05,340 Mary: And then those who were helping, it was like, 83 00:03:05,340 --> 00:03:07,890 Mary: that was our sole focus of our life for the entire summer. 84 00:03:07,890 --> 00:03:10,680 Ricardo: - You guys got invitations, I was voluntold? 85 00:03:10,680 --> 00:03:12,450 Mary: - You kind of get voluntold to do a lot of things 86 00:03:12,450 --> 00:03:14,820 Mary: through the media team because you know, you're offered up 87 00:03:14,820 --> 00:03:16,470 Mary: because you are so good. 
88 00:03:16,470 --> 00:03:19,800 Mary: That's what happens when you're good. (laughs) 89 00:03:19,800 --> 00:03:21,390 Ricardo: - There's a punishment for doing a good job 90 00:03:21,390 --> 00:03:22,749 Ricardo: or something like that. Mary: - It's not a punishment. 91 00:03:22,749 --> 00:03:25,260 Ricardo: - Well, there's like a term, there's a. 92 00:03:25,260 --> 00:03:26,430 Mary: - An elevation of work. 93 00:03:26,430 --> 00:03:29,070 Ricardo: - Okay, that sounds good. 94 00:03:29,070 --> 00:03:30,120 Ricardo: And well, so yeah, 95 00:03:30,120 --> 00:03:31,633 Ricardo: we're gonna listen to a really good conversation, 96 00:03:31,633 --> 00:03:32,550 Ricardo: I wanna jump right in. 97 00:03:32,550 --> 00:03:34,530 Ricardo: But also I wanna billboard that 98 00:03:34,530 --> 00:03:38,070 Ricardo: at the end of this episode, we've got a special treat, 99 00:03:38,070 --> 00:03:42,840 Ricardo: a special use of AI that Liz is going to talk about. 100 00:03:42,840 --> 00:03:46,320 Liz: - Yeah, so stick around if you wanna hear more about how to 101 00:03:46,320 --> 00:03:50,400 Liz: deceive your in-laws using ChatGPT. 102 00:03:50,400 --> 00:03:51,960 Liz: I'll just leave that teaser. 103 00:03:51,960 --> 00:03:53,160 Mary: - We got tips from the experts, 104 00:03:53,160 --> 00:03:55,230 Mary: that was a good call to ask them those questions 105 00:03:55,230 --> 00:03:56,063 Mary: while they were here. 106 00:03:56,063 --> 00:03:57,750 Liz: - I made sure to join this podcast 107 00:03:57,750 --> 00:04:00,180 Liz: specifically to be a part of this conversation 108 00:04:00,180 --> 00:04:04,710 Liz: so that I could get some good prompt engineering feedback. 109 00:04:04,710 --> 00:04:06,180 Mary: - We're all rooting for you. 110 00:04:06,180 --> 00:04:07,730 Liz: - Thank you, I'm gonna need it. 111 00:04:10,770 --> 00:04:13,350 Mary: - All right, welcome you guys, gals. 112 00:04:13,350 --> 00:04:14,981 Andrew Maynard: - Good to be here. 113 00:04:14,981 --> 00:04:16,590 (group chuckles) 114 00:04:16,590 --> 00:04:17,730 Mary: - We're gonna have a whole discussion 115 00:04:17,730 --> 00:04:19,230 Mary: around generative AI today, 116 00:04:19,230 --> 00:04:22,440 Mary: but specifically from a very unique perspective 117 00:04:22,440 --> 00:04:25,740 Mary: 'cause we have Andrew Maynard, Mary Ann Naumann, 118 00:04:25,740 --> 00:04:28,380 Mary: and Tamara Mitchell, who are all major contributors 119 00:04:28,380 --> 00:04:31,890 Mary: in different ways for generative AI around the campus. 120 00:04:31,890 --> 00:04:35,460 Mary: And not only do we have faculty perspective with Andrew, 121 00:04:35,460 --> 00:04:37,830 Mary: we have designer perspective with Tamara 122 00:04:37,830 --> 00:04:39,090 Mary: in a very unique way, 123 00:04:39,090 --> 00:04:42,150 Mary: and student perspective with Mary Ann 124 00:04:42,150 --> 00:04:44,486 Mary: because she was a student in both of the courses 125 00:04:44,486 --> 00:04:46,440 Mary: that we're talking about today. 126 00:04:46,440 --> 00:04:47,730 Mary: So, welcome, I'd like to have you guys 127 00:04:47,730 --> 00:04:49,140 Mary: just briefly introduce yourselves, 128 00:04:49,140 --> 00:04:51,120 Mary: tell us who you are officially at the university 129 00:04:51,120 --> 00:04:53,040 Mary: and anything else that you'd like to add. 
130 00:04:53,040 --> 00:04:54,960 Andrew: Yes, my name's Andrew Maynard, 131 00:04:54,960 --> 00:04:58,770 Andrew: I'm a professor of Advanced Technology Transitions 132 00:04:58,770 --> 00:05:00,750 Andrew: in the School for the Future of Innovation in Society, 133 00:05:00,750 --> 00:05:03,780 Andrew: which basically means that I study how we get from here 134 00:05:03,780 --> 00:05:05,970 Andrew: to the future without making a complete mess of things. 135 00:05:05,970 --> 00:05:09,480 Andrew: And that covers everything from AI 136 00:05:09,480 --> 00:05:11,850 Andrew: to any other technology you can imagine. 137 00:05:11,850 --> 00:05:13,860 Mary: Super important perspective 138 00:05:13,860 --> 00:05:16,050 Mary: when we enter the world of generative AI, 139 00:05:16,050 --> 00:05:17,580 Mary: and we'll talk more about that. 140 00:05:17,580 --> 00:05:18,810 Mary: And Mary Ann. Mary Ann Naumann: - Yes. 141 00:05:18,810 --> 00:05:20,070 Mary Ann: My name is Mary Ann Naumann. 142 00:05:20,070 --> 00:05:21,660 Mary Ann: I'm an undergraduate instruction 143 00:05:21,660 --> 00:05:23,400 Mary Ann: and assessment librarian at ASU. 144 00:05:23,400 --> 00:05:24,780 Mary Ann: The group of librarians I work with 145 00:05:24,780 --> 00:05:26,700 Mary Ann: primarily work with new students 146 00:05:26,700 --> 00:05:29,100 Mary Ann: or those who are new to the academic journey, 147 00:05:29,100 --> 00:05:31,590 Mary Ann: figuring out how to find and evaluate sources, 148 00:05:31,590 --> 00:05:33,300 Mary Ann: so that's my specialty. 149 00:05:33,300 --> 00:05:35,070 Mary: Lovely. You've built a wonderful site 150 00:05:35,070 --> 00:05:36,990 Mary: that we'll plug later in the show notes as well, 151 00:05:36,990 --> 00:05:37,823 Mary: if you don't mind. 152 00:05:37,823 --> 00:05:39,410 Mary Ann: Sure, thank you. (both chuckle) 153 00:05:39,410 --> 00:05:42,654 Mary: And Tamara, my course design bestie. 154 00:05:42,654 --> 00:05:44,190 Tamara Mitchell: I'm so happy to be here. 155 00:05:44,190 --> 00:05:45,420 Tamara: I'm Tamara Mitchell, 156 00:05:45,420 --> 00:05:48,660 Tamara: I'm a senior instructional designer with ASU Online. 157 00:05:48,660 --> 00:05:51,420 Tamara: And I have the privilege of working with 158 00:05:51,420 --> 00:05:53,130 Tamara: a wide variety of faculty 159 00:05:53,130 --> 00:05:54,930 Tamara: who are wrestling with the opportunities 160 00:05:54,930 --> 00:05:56,340 Tamara: and challenges of AI. 161 00:05:56,340 --> 00:05:59,220 Tamara: I also feel like this is kind of a full circle moment for me 162 00:05:59,220 --> 00:06:04,220 Tamara: because it would have been about 10 years ago, 7 years ago, 163 00:06:04,350 --> 00:06:07,200 Tamara: I developed a self-paced course 164 00:06:07,200 --> 00:06:09,960 Tamara: in partnership with one of the leaders at IBM, 165 00:06:09,960 --> 00:06:13,110 Tamara: who was part of the core team for developing Watson. 166 00:06:13,110 --> 00:06:16,035 Tamara: And he said, "Watch out, things are coming." 167 00:06:16,035 --> 00:06:18,900 Tamara: 10 years later, we kind of see the ripples of that. 168 00:06:18,900 --> 00:06:21,960 Tamara: And I know Watson isn't even the generative AI 169 00:06:21,960 --> 00:06:22,950 Tamara: that we're talking about, 170 00:06:22,950 --> 00:06:25,440 Tamara: but it's a shift toward the future and I think 171 00:06:25,440 --> 00:06:28,290 Tamara: we're experiencing another shift toward the future. 172 00:06:28,290 --> 00:06:29,640 Mary: Absolutely. 
173 00:06:29,640 --> 00:06:32,910 Mary: Now let's talk first with Andrew about 174 00:06:32,910 --> 00:06:34,290 Mary: the course that you developed this summer. 175 00:06:34,290 --> 00:06:35,670 Mary: So tell us about the course, 176 00:06:35,670 --> 00:06:39,420 Mary: tell us about why you decided to host the course. 177 00:06:39,420 --> 00:06:42,570 Andrew: Sure. So this was an online course 178 00:06:42,570 --> 00:06:45,810 Andrew: that taught students the basics of how to 179 00:06:45,810 --> 00:06:49,170 Andrew: get the best out of ChatGPT specifically. 180 00:06:49,170 --> 00:06:52,020 Andrew: We called it, I forget what the exact name was, 181 00:06:52,020 --> 00:06:56,580 Andrew: but it was basics of prompt engineering using ChatGPT. 182 00:06:56,580 --> 00:06:59,580 Andrew: And the prompt engineering here was not the coding 183 00:06:59,580 --> 00:07:01,050 Andrew: that we used to think about 184 00:07:01,050 --> 00:07:03,330 Andrew: as prompt engineering 6, 12 months ago, 185 00:07:03,330 --> 00:07:07,200 Andrew: but it was how you write those ordinary language prompts 186 00:07:07,200 --> 00:07:09,570 Andrew: that enable you to get the platform to 187 00:07:09,570 --> 00:07:11,040 Andrew: do what you want it to do. 188 00:07:11,040 --> 00:07:14,580 Andrew: And we put this together actually in quite a hurry 189 00:07:14,580 --> 00:07:17,160 Andrew: because we realized earlier this year that 190 00:07:17,160 --> 00:07:20,220 Andrew: this was massively accelerating technology. 191 00:07:20,220 --> 00:07:22,800 Andrew: It went from 0 to 1000 miles an hour 192 00:07:22,800 --> 00:07:24,630 Andrew: almost overnight it seemed. 193 00:07:24,630 --> 00:07:27,150 Andrew: And we were in a position where our students 194 00:07:27,150 --> 00:07:30,720 Andrew: were already using it and we were scrambling to understand 195 00:07:30,720 --> 00:07:31,590 Andrew: sort of what that meant. 196 00:07:31,590 --> 00:07:35,340 Andrew: But more importantly, it seemed very, very clear that 197 00:07:35,340 --> 00:07:38,340 Andrew: as they graduated, our students in order to thrive, 198 00:07:38,340 --> 00:07:41,340 Andrew: needed a set of skills around generative AI 199 00:07:41,340 --> 00:07:43,170 Andrew: that we simply weren't teaching. 200 00:07:43,170 --> 00:07:45,240 Andrew: And they also needed confirmation 201 00:07:45,240 --> 00:07:46,140 Andrew: that they had those skills, 202 00:07:46,140 --> 00:07:48,030 Andrew: so they needed something on their transcript. 203 00:07:48,030 --> 00:07:50,640 Andrew: So we rushed to put together a course 204 00:07:50,640 --> 00:07:53,310 Andrew: on teaching them the basics of how to do this 205 00:07:53,310 --> 00:07:55,650 Andrew: and taught it for the first time this summer. 206 00:07:55,650 --> 00:07:57,750 Andrew: But the course had a number of 207 00:07:57,750 --> 00:07:59,730 Andrew: really intriguing aspects to it, 208 00:07:59,730 --> 00:08:03,630 Andrew: one of them being, that to formulate it fast, 209 00:08:03,630 --> 00:08:06,480 Andrew: I literally sat down one evening in front of ChatGPT 210 00:08:06,480 --> 00:08:08,400 Andrew: and said, hey, I wanna teach a course 211 00:08:08,400 --> 00:08:11,100 Andrew: about how to use ChatGPT, what should we teach? 212 00:08:11,100 --> 00:08:15,150 Andrew: And ChatGPT produced the bare bones of a syllabus, 213 00:08:15,150 --> 00:08:17,370 Andrew: and we actually built it out from there. 214 00:08:17,370 --> 00:08:18,203 Mary: That's amazing. 
215 00:08:18,203 --> 00:08:20,607 Mary: That's actually very similar to how Tamara and I worked, 216 00:08:20,607 --> 00:08:21,930 Mary: and many of the others worked 217 00:08:21,930 --> 00:08:23,957 Mary: when we were developing the Generative AI course, right? 218 00:08:23,957 --> 00:08:26,550 Mary: ChatGPT was our first draft best friend. 219 00:08:26,550 --> 00:08:27,990 Mary: So, fabulous. 220 00:08:27,990 --> 00:08:29,910 Mary: Now, the first draft was never the final draft, 221 00:08:29,910 --> 00:08:33,387 Mary: as Tamara can tell you from her hours of revision work. 222 00:08:33,387 --> 00:08:34,973 Tamara: So many revisions. Mary: But it was a starting point. 223 00:08:36,418 --> 00:08:38,190 (group chuckles) 224 00:08:38,190 --> 00:08:39,420 Mary: And Mary Ann, you were a student 225 00:08:39,420 --> 00:08:41,040 Mary: in this class with Andrew. 226 00:08:41,040 --> 00:08:43,290 Mary: What was your first impression coming into the class? 227 00:08:43,290 --> 00:08:45,090 Mary: And also, why did you take it? 228 00:08:45,090 --> 00:08:47,040 Mary Ann: Yes, so in my role as a librarian, 229 00:08:47,040 --> 00:08:50,430 Mary Ann: I was really interested, when I saw the technology come out, 230 00:08:50,430 --> 00:08:54,450 Mary Ann: in how it was potentially impacting writing skills, 231 00:08:54,450 --> 00:08:57,420 Mary Ann: critical thinking skills, research skills. 232 00:08:57,420 --> 00:09:00,804 Mary Ann: And one of the things 233 00:09:00,804 --> 00:09:02,820 Mary Ann: where there was a huge gap was 234 00:09:02,820 --> 00:09:04,830 Mary Ann: in the aspect of prompt engineering: 235 00:09:04,830 --> 00:09:07,890 Mary Ann: how do we formulate searches in this area? 236 00:09:07,890 --> 00:09:10,830 Mary Ann: There were a lot of people who immediately jumped online 237 00:09:10,830 --> 00:09:13,980 Mary Ann: and started to kind of pontificate about 238 00:09:13,980 --> 00:09:15,806 Mary Ann: what prompt engineering was, 239 00:09:15,806 --> 00:09:18,900 Mary Ann: filling that void for clicks and clout. 240 00:09:18,900 --> 00:09:21,780 Mary Ann: And so I was really excited when I saw this course come up 241 00:09:21,780 --> 00:09:23,940 Mary Ann: because one, I knew it was gonna be taught by somebody 242 00:09:23,940 --> 00:09:25,140 Mary Ann: who was thinking about it from the 243 00:09:25,140 --> 00:09:27,060 Mary Ann: ethical perspective as well, 244 00:09:27,060 --> 00:09:29,400 Mary Ann: and who would be able to guide us in that way. 245 00:09:29,400 --> 00:09:31,980 Mary Ann: So I was really excited to take the course because of that. 246 00:09:31,980 --> 00:09:33,450 Mary: Awesome, awesome. 247 00:09:33,450 --> 00:09:36,390 Mary: So what was your favorite assignment in the class? 248 00:09:36,390 --> 00:09:38,910 Andrew: That's a bit of a pregnant pause there. 249 00:09:38,910 --> 00:09:40,277 Andrew: Were there any favorite assignments? 250 00:09:40,277 --> 00:09:41,297 Mary Ann: It's all of them. 
251 00:09:41,297 --> 00:09:43,853 Mary Ann: That's hard to say because there were a bunch of times, 252 00:09:43,853 --> 00:09:45,570 Mary Ann: but I know there were a couple times 253 00:09:45,570 --> 00:09:47,460 Mary Ann: where we were really getting to prompt engineering 254 00:09:47,460 --> 00:09:49,650 Mary Ann: where we were talking about constraints, 255 00:09:49,650 --> 00:09:50,700 Mary Ann: ambiguity reduction, 256 00:09:50,700 --> 00:09:53,520 Mary Ann: some of the different strategies in creating templates 257 00:09:53,520 --> 00:09:55,890 Mary Ann: where it started to click because you were using it 258 00:09:55,890 --> 00:09:57,900 Mary Ann: in a more advanced way more frequently. 259 00:09:57,900 --> 00:10:01,860 Mary Ann: So that was kind of the aha moment where you recognize that, 260 00:10:01,860 --> 00:10:02,820 Mary Ann: okay, I'm getting this, 261 00:10:02,820 --> 00:10:05,190 Mary Ann: I'm getting better output from the system, 262 00:10:05,190 --> 00:10:07,080 Mary Ann: as well as then the modules about 263 00:10:07,080 --> 00:10:08,790 Mary Ann: how do I assess what I'm getting out of it? 264 00:10:08,790 --> 00:10:10,320 Mary Ann: How can I change my prompt based on 265 00:10:10,320 --> 00:10:12,840 Mary Ann: what I'm seeing in the output and the response from the 266 00:10:12,840 --> 00:10:14,070 Mary Ann: generative AI system? 267 00:10:14,070 --> 00:10:16,800 Mary Ann: So those were my favorite just because 268 00:10:16,800 --> 00:10:18,360 Mary Ann: it was that aha moment of, 269 00:10:18,360 --> 00:10:20,430 Mary Ann: I think I'm kind of getting this 270 00:10:20,430 --> 00:10:23,610 Mary Ann: where I'm doing a better job at querying the system 271 00:10:23,610 --> 00:10:25,680 Mary Ann: and getting better responses. 272 00:10:25,680 --> 00:10:27,780 Mary: So Tamara, you and the Generative AI course 273 00:10:27,780 --> 00:10:28,920 Mary: for our faculty and staff. 274 00:10:28,920 --> 00:10:31,830 Mary: You also were very focused on prompt engineering, 275 00:10:31,830 --> 00:10:35,610 Mary: how to help those who are in that prompt engineering space. 276 00:10:35,610 --> 00:10:40,500 Mary: What did you do in that course design consideration 277 00:10:40,500 --> 00:10:42,690 Mary: that you think would benefit 278 00:10:42,690 --> 00:10:45,960 Mary: maybe outside of our faculty and staff or, 279 00:10:45,960 --> 00:10:47,580 Mary: I don't know, maybe there was an artifact created 280 00:10:47,580 --> 00:10:49,380 Mary: that we could share somewhere? 281 00:10:49,380 --> 00:10:51,840 Tamara: So I think in the Generative AI course, 282 00:10:51,840 --> 00:10:54,437 Tamara: Teaching and Learning with Generative AI course, 283 00:10:54,437 --> 00:10:55,270 Tamara: that was our module four, 284 00:10:55,270 --> 00:10:57,420 Tamara: and that was the module everybody wants to skip to 285 00:10:57,420 --> 00:10:59,160 Tamara: because everybody wants to play 286 00:10:59,160 --> 00:11:00,870 Tamara: with generative AI, don't they? 287 00:11:00,870 --> 00:11:03,000 Tamara: They want to get into the trenches, they want to experiment, 288 00:11:03,000 --> 00:11:06,120 Tamara: and they want to get some really important things 289 00:11:06,120 --> 00:11:07,535 Tamara: from that work. 290 00:11:07,535 --> 00:11:09,003 Tamara: It's funny though. 
291 00:11:10,920 --> 00:11:14,490 Tamara: It's funny because we have so many people who will go to 292 00:11:14,490 --> 00:11:18,480 Tamara: ChatGPT per se and put in a prompt 293 00:11:18,480 --> 00:11:20,460 Tamara: and have it, you know, spit out something 294 00:11:20,460 --> 00:11:22,050 Tamara: that's nothing like what they wanted. 295 00:11:22,050 --> 00:11:24,090 Tamara: And they'll say, "Oh, the tool isn't great." 296 00:11:24,090 --> 00:11:26,370 Tamara: And so it's kind of like somebody 297 00:11:26,370 --> 00:11:29,010 Tamara: walking over to a guitar that they've never seen before, 298 00:11:29,010 --> 00:11:31,260 Tamara: playing a note, realizing it didn't make a song, 299 00:11:31,260 --> 00:11:33,150 Tamara: and then putting it down because they don't like it. 300 00:11:33,150 --> 00:11:36,060 Tamara: And so I think the first and most important thing 301 00:11:36,060 --> 00:11:39,660 Tamara: I would say is that it's a skill and it's a literacy 302 00:11:39,660 --> 00:11:42,300 Tamara: and it's something that we have never had to develop 303 00:11:42,300 --> 00:11:43,500 Tamara: in the past. 304 00:11:43,500 --> 00:11:46,860 Tamara: And because we've had to develop it now, 305 00:11:46,860 --> 00:11:47,880 Tamara: one of the biggest, 306 00:11:47,880 --> 00:11:51,450 Tamara: and I think the best tips that I've seen in 307 00:11:51,450 --> 00:11:53,280 Tamara: Andrew's course where I lookie-looed, 308 00:11:53,280 --> 00:11:57,060 Tamara: and then in the Teaching and Learning Generative AI course, 309 00:11:57,060 --> 00:11:59,100 Tamara: yes, I stalked you Andrew, 310 00:11:59,100 --> 00:12:00,810 Tamara: in the teaching and learning. 311 00:12:00,810 --> 00:12:02,910 Tamara: Can you say that? (chuckles) 312 00:12:02,910 --> 00:12:03,900 Mary: Yes, you should say that. 313 00:12:03,900 --> 00:12:04,740 Mary: We are observers, 314 00:12:04,740 --> 00:12:06,690 Mary: if you wanna use the professional name in Canvas. 315 00:12:06,690 --> 00:12:10,050 Mary: But yes, we were full on stalkers. (chuckles) 316 00:12:10,050 --> 00:12:13,020 Tamara: Professionally observed and learned from you. 317 00:12:13,020 --> 00:12:16,150 Tamara: But I think one of the most important things that 318 00:12:17,250 --> 00:12:20,493 Tamara: is taught, and I think that would benefit everyone, 319 00:12:21,390 --> 00:12:25,290 Tamara: was number one, treating it like a conversation partner. 320 00:12:25,290 --> 00:12:28,080 Tamara: If you can't show up with good information, 321 00:12:28,080 --> 00:12:29,763 Tamara: your partner can't respond. 322 00:12:31,110 --> 00:12:33,120 Tamara: Being clear, number two, 323 00:12:33,120 --> 00:12:37,860 Tamara: and number three, one thing that Andrew does quite well 324 00:12:37,860 --> 00:12:41,160 Tamara: that we actually used with his permission 325 00:12:41,160 --> 00:12:43,440 Tamara: in the Teaching and Learning with Generative AI course, 326 00:12:43,440 --> 00:12:46,320 Tamara: was actually evaluating your prompts, 327 00:12:46,320 --> 00:12:48,630 Tamara: and I'm sure we're gonna get to that a little bit later. 328 00:12:48,630 --> 00:12:50,670 Tamara: But if you can do those three things, 329 00:12:50,670 --> 00:12:52,680 Tamara: then you can refine and iterate 330 00:12:52,680 --> 00:12:54,120 Tamara: and you can get better results 331 00:12:54,120 --> 00:12:57,240 Tamara: and you'll have a better experience with generative AI. 332 00:12:57,240 --> 00:12:58,830 Mary: That is a really good perspective. 
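For listeners who want to try Tamara's three tips in practice, here is a minimal, hypothetical sketch (in Python, not taken from either course) of what "showing up with good information and being clear" can look like as a reusable prompt template. The field names and example values are illustrative assumptions only; the assembled text would simply be pasted into ChatGPT.

```python
# Hypothetical sketch: a reusable prompt template that "shows up with good
# information" (context), states the task clearly, and names the audience
# and constraints, per the three tips discussed above.

PROMPT_TEMPLATE = """You are acting as {role}.

Context: {context}

Task: {task}

Audience: {audience}

Constraints:
{constraints}
"""

def build_prompt(role, context, task, audience, constraints):
    """Assemble a clear, structured prompt ready to paste into ChatGPT."""
    constraint_lines = "\n".join(f"- {c}" for c in constraints)
    return PROMPT_TEMPLATE.format(
        role=role,
        context=context,
        task=task,
        audience=audience,
        constraints=constraint_lines,
    )

if __name__ == "__main__":
    # Example values are made up purely for illustration.
    print(build_prompt(
        role="an experienced instructional designer",
        context="We are drafting module overviews for an online course on prompt engineering.",
        task="Draft a 150-word overview for a module on evaluating AI outputs.",
        audience="University faculty who are new to generative AI.",
        constraints=[
            "Plain language, no jargon",
            "Second person ('you')",
            "End with one reflection question",
        ],
    ))
```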
333 00:12:58,830 --> 00:13:01,230 Mary: Now Andrew, how did you articulate that to your students 334 00:13:01,230 --> 00:13:03,210 Mary: when it came to prompt engineering in your class? 335 00:13:03,210 --> 00:13:04,980 Mary: Was there like a framework that you gave them 336 00:13:04,980 --> 00:13:06,780 Mary: or some step-by-step instructions, 337 00:13:06,780 --> 00:13:07,613 Mary: how did that work? 338 00:13:07,613 --> 00:13:08,446 Andrew: There was. 339 00:13:09,390 --> 00:13:13,320 Andrew: To go back a step, we actually had a systematic approach 340 00:13:13,320 --> 00:13:16,857 Andrew: to helping students understand what ChatGPT is 341 00:13:16,857 --> 00:13:18,990 Andrew: and how to begin to use it. 342 00:13:18,990 --> 00:13:21,570 Andrew: So we built things up, first of all, 343 00:13:21,570 --> 00:13:23,970 Andrew: by ensuring that pretty much everything they did 344 00:13:23,970 --> 00:13:25,920 Andrew: was within the ChatGPT environment. 345 00:13:25,920 --> 00:13:29,250 Andrew: So they were building up their ChatGPT mental muscles, 346 00:13:29,250 --> 00:13:31,110 Andrew: if you like, around it. 347 00:13:31,110 --> 00:13:34,200 Andrew: But then we started off by giving them assignments 348 00:13:34,200 --> 00:13:37,380 Andrew: that gave them a sense of what it can do, what it can't do, 349 00:13:37,380 --> 00:13:39,030 Andrew: what the limitations are. 350 00:13:39,030 --> 00:13:41,310 Andrew: We then went on to looking at the basics 351 00:13:41,310 --> 00:13:43,350 Andrew: of how you sort of create prompts 352 00:13:43,350 --> 00:13:45,000 Andrew: and how you create prompt templates. 353 00:13:45,000 --> 00:13:47,460 Andrew: But the evaluation bit was really interesting 354 00:13:47,460 --> 00:13:49,590 Andrew: because it was very clear that once 355 00:13:49,590 --> 00:13:52,440 Andrew: they developed those skills in how to develop 356 00:13:52,440 --> 00:13:54,810 Andrew: and how to write a fairly sophisticated prompt, 357 00:13:54,810 --> 00:13:58,320 Andrew: they needed a framework with which they could assess 358 00:13:58,320 --> 00:14:00,810 Andrew: the quality of that prompt and the quality of the response 359 00:14:00,810 --> 00:14:03,300 Andrew: and they could iterate around to make things better. 360 00:14:03,300 --> 00:14:05,550 Andrew: So as we were developing this, again, 361 00:14:05,550 --> 00:14:08,550 Andrew: I was working very closely with ChatGPT. 362 00:14:08,550 --> 00:14:10,410 Andrew: I asked, well, how do we do this? 363 00:14:10,410 --> 00:14:12,000 Andrew: We've gotta evaluate this. 364 00:14:12,000 --> 00:14:13,320 Andrew: What's a really good way of doing it? 365 00:14:13,320 --> 00:14:14,940 Andrew: And it came up with this framework, 366 00:14:14,940 --> 00:14:17,100 Andrew: which we ended up calling the RACCCA framework, 367 00:14:17,100 --> 00:14:19,320 Andrew: R-A-C-C-C-A. 368 00:14:19,320 --> 00:14:22,225 Andrew: Please don't ask me to say what that stands for. 369 00:14:22,225 --> 00:14:24,030 Mary: I could read it off if you need to. (laughs) 370 00:14:24,030 --> 00:14:26,700 Andrew: Oh, thank God I've got two experts here. 371 00:14:26,700 --> 00:14:30,600 Andrew: But the key thing is that ChatGPT came up with these 372 00:14:30,600 --> 00:14:34,380 Andrew: six criteria, which I'll let you read off in a second, 373 00:14:34,380 --> 00:14:35,280 Andrew: which made sense. 374 00:14:35,280 --> 00:14:38,430 Andrew: So we worked around sort of how we would apply that. 
375 00:14:38,430 --> 00:14:40,230 Andrew: But then the thing that blew me away was, 376 00:14:40,230 --> 00:14:41,370 Andrew: in this conversation, 377 00:14:41,370 --> 00:14:44,580 Andrew: ChatGPT started abbreviating it as RACCCA 378 00:14:44,580 --> 00:14:45,600 Andrew: without any prompting. 379 00:14:45,600 --> 00:14:48,297 Andrew: So it became the RACCCA framework, which stands for, 380 00:14:48,297 --> 00:14:49,440 Andrew: now you can read it off. 381 00:14:49,440 --> 00:14:51,060 Mary: Well, let's have Mary Ann read it off 382 00:14:51,060 --> 00:14:53,060 Mary: because she might be able to do it rote. 383 00:14:54,180 --> 00:14:57,363 Mary Ann: R for relevance, we've got A for accuracy, 384 00:14:59,880 --> 00:15:03,183 Mary Ann: C for completeness, C for clarity, 385 00:15:04,470 --> 00:15:05,970 Mary Ann: C for coherence. 386 00:15:05,970 --> 00:15:07,110 Mary: I always forget the third C. 387 00:15:07,110 --> 00:15:10,170 Mary Ann: And then A for appropriateness. 388 00:15:10,170 --> 00:15:12,090 Mary: Awesome, and we'll put all of that 389 00:15:12,090 --> 00:15:13,530 Mary: into the show notes as well. 390 00:15:13,530 --> 00:15:15,180 Mary: I think you created a video on RACCCA, 391 00:15:15,180 --> 00:15:16,160 Mary: would you be okay with us sharing that? 392 00:15:16,160 --> 00:15:17,130 Andrew: Oh absolutely, yes. 393 00:15:17,130 --> 00:15:20,940 Andrew: So we have an introduction video to the RACCCA framework. 394 00:15:20,940 --> 00:15:24,630 Andrew: And yes, it actually ended up being a very useful framework, 395 00:15:24,630 --> 00:15:25,463 Andrew: I would say. 396 00:15:25,463 --> 00:15:27,270 Andrew: You can see how poor my memory is, 397 00:15:27,270 --> 00:15:29,550 Andrew: which is why I need ChatGPT. 398 00:15:29,550 --> 00:15:31,980 Andrew: It took me about 20 takes with that video 399 00:15:31,980 --> 00:15:34,710 Andrew: even to sort of get the sequence right on it. 400 00:15:34,710 --> 00:15:36,270 Andrew: So I can talk about RACCCA, 401 00:15:36,270 --> 00:15:39,570 Andrew: but it's very hard without working with ChatGPT 402 00:15:39,570 --> 00:15:41,910 Andrew: to remember the specific things until you're beginning to 403 00:15:41,910 --> 00:15:42,960 Andrew: apply that framework. 404 00:15:42,960 --> 00:15:45,180 Mary: Well, it's common to know how things make you feel 405 00:15:45,180 --> 00:15:47,040 Mary: without being able to recall all the specifics 406 00:15:47,040 --> 00:15:48,510 Mary: of how you got to that feeling. 407 00:15:48,510 --> 00:15:50,340 Mary: So that's very resonant. 408 00:15:50,340 --> 00:15:51,173 Mary: Thank you for letting us share that. 409 00:15:51,173 --> 00:15:54,210 Mary: 'cause I think it's probably one of the better artifacts 410 00:15:54,210 --> 00:15:56,550 Mary: for both courses because it's something that 411 00:15:56,550 --> 00:15:57,390 Mary: we're all looking for, 412 00:15:57,390 --> 00:16:01,060 Mary: like how do we mindfully use generative AI 413 00:16:02,733 --> 00:16:06,930 Mary: to properly, ethically associate this within our world. 414 00:16:06,930 --> 00:16:08,223 Andrew: And I would say on top of that, 415 00:16:08,223 --> 00:16:11,340 Andrew: so the RACCCA framework isn't the only one 416 00:16:11,340 --> 00:16:13,080 Andrew: that there is out there, or that you can use. 
417 00:16:13,080 --> 00:16:16,860 Andrew: But the critical thing is bringing that process 418 00:16:16,860 --> 00:16:20,520 Andrew: to things where you evaluate what the output is like 419 00:16:20,520 --> 00:16:21,810 Andrew: and the outcomes are like, 420 00:16:21,810 --> 00:16:24,780 Andrew: and you engage your own brain in determining 421 00:16:24,780 --> 00:16:26,490 Andrew: whether it's fit for purpose and if not, 422 00:16:26,490 --> 00:16:27,810 Andrew: how you're gonna change that. 423 00:16:27,810 --> 00:16:28,643 Mary: Absolutely. 424 00:16:31,680 --> 00:16:34,051 Ricardo: Okay, so this is an acronym being used. 425 00:16:34,051 --> 00:16:35,125 Ricardo: Is it RACCCA? 426 00:16:35,125 --> 00:16:38,243 Liz: R-A-C-C-C-A, I think. Mary: Yeah, three C's. 427 00:16:38,243 --> 00:16:40,380 Ricardo: Can you guys unpack that for me a little bit? 428 00:16:40,380 --> 00:16:43,710 Mary: So in our review of key terms in the Generative AI course, 429 00:16:43,710 --> 00:16:44,940 Mary: this is how it comes across, 430 00:16:44,940 --> 00:16:48,180 Mary: and by the way, this was written in part by ChatGPT, 431 00:16:48,180 --> 00:16:50,970 Mary: not only because that's how Andrew made the framework, 432 00:16:50,970 --> 00:16:53,640 Mary: but because then we used ChatGPT also to reiterate it 433 00:16:53,640 --> 00:16:57,330 Mary: and give it to us in as concise a way as possible. 434 00:16:57,330 --> 00:16:59,550 Mary: So the acronym represents a method to evaluate 435 00:16:59,550 --> 00:17:02,220 Mary: generative AI outputs created by Andrew Maynard 436 00:17:02,220 --> 00:17:04,710 Mary: at ASU in partnership with ChatGPT. 437 00:17:04,710 --> 00:17:06,870 Mary: And so R stands for relevance 438 00:17:06,870 --> 00:17:08,910 Mary: and that is the extent to which a response 439 00:17:08,910 --> 00:17:11,340 Mary: directly addresses the issue or question. 440 00:17:11,340 --> 00:17:13,980 Mary: So, is it relevant to what you prompted? 441 00:17:13,980 --> 00:17:16,350 Ricardo: So like, really, let's break this down. 442 00:17:16,350 --> 00:17:18,270 Ricardo: So, what does that mean? 443 00:17:18,270 --> 00:17:19,950 Ricardo: In terms of my prompts? 444 00:17:19,950 --> 00:17:22,477 Ricardo: I'm gonna consider the relevance, 445 00:17:22,477 --> 00:17:23,310 Ricardo: so I'm gonna make sure that-- 446 00:17:23,310 --> 00:17:25,560 Mary: Well, the output relevance, 'cause this is evaluating. 447 00:17:25,560 --> 00:17:26,580 Ricardo: Oh, this is evaluated, okay. 448 00:17:26,580 --> 00:17:28,200 Mary: Right. So is what you put in there, 449 00:17:28,200 --> 00:17:30,870 Mary: did it get you what you expected back in reference to, 450 00:17:30,870 --> 00:17:33,990 Mary: does it make sense for what you were trying to accomplish 451 00:17:33,990 --> 00:17:35,910 Mary: around your issue or question? 452 00:17:35,910 --> 00:17:37,860 Mary: And then there's accuracy, that's A, 453 00:17:37,860 --> 00:17:40,860 Mary: the degree to which the response provides accurate, 454 00:17:40,860 --> 00:17:42,780 Mary: reliable, or fact-based information, 455 00:17:42,780 --> 00:17:44,010 Mary: and that was actually part of 456 00:17:44,010 --> 00:17:45,570 Mary: why Mary Ann was taking this class, 457 00:17:45,570 --> 00:17:46,710 Mary: was because she wanted to make sure 458 00:17:46,710 --> 00:17:49,440 Mary: if we're using ChatGPT in that student space. 
459 00:17:49,440 --> 00:17:50,940 Mary: Them using it as a research tool 460 00:17:50,940 --> 00:17:53,036 Mary: was probably not an accurate usage 461 00:17:53,036 --> 00:17:54,480 Mary: but it can give ideas on 462 00:17:54,480 --> 00:17:56,459 Mary: how you might go about your research 463 00:17:56,459 --> 00:17:57,901 Mary: or using research skills 464 00:17:57,901 --> 00:18:00,886 Mary: that the librarians can help you learn. 465 00:18:00,886 --> 00:18:03,360 Mary: You can then gauge the accuracy 466 00:18:03,360 --> 00:18:05,550 Mary: using really basic digital competencies, 467 00:18:05,550 --> 00:18:08,100 Mary: like who's actually saying this message, 468 00:18:08,100 --> 00:18:08,933 Mary: where did this come from? 469 00:18:08,933 --> 00:18:10,710 Mary: And with ChatGPT, you don't get citations. 470 00:18:10,710 --> 00:18:13,380 Mary: I mean, you can ask for them and get the wrong ones. 471 00:18:13,380 --> 00:18:15,120 Mary: They're not actually sourced that way 472 00:18:15,120 --> 00:18:16,140 Mary: in the large language model. 473 00:18:16,140 --> 00:18:18,840 Mary: So you need to go and actually fact check and double check 474 00:18:18,840 --> 00:18:21,660 Mary: that the same thing that ChatGPT is telling you 475 00:18:21,660 --> 00:18:23,310 Mary: is being said in other spaces 476 00:18:23,310 --> 00:18:27,210 Mary: that are reliably tracked and cited, and you know. 477 00:18:27,210 --> 00:18:30,510 Liz: Yeah, this methodology for evaluating 478 00:18:30,510 --> 00:18:32,640 Liz: kind of reminds me of some of the other methodologies 479 00:18:32,640 --> 00:18:35,594 Liz: that already exist in higher education, 480 00:18:35,594 --> 00:18:38,790 Liz: like SIFT, other ways that we can evaluate 481 00:18:38,790 --> 00:18:40,650 Liz: the research that we're looking at, 482 00:18:40,650 --> 00:18:42,870 Liz: the sources that we're evaluating. 483 00:18:42,870 --> 00:18:44,989 Liz: I work a lot with our English 101, 102 students, 484 00:18:44,989 --> 00:18:48,060 Liz: that's a huge part of doing the research, 485 00:18:48,060 --> 00:18:50,580 Liz: is evaluating what you're looking at for its accuracy, 486 00:18:50,580 --> 00:18:52,230 Liz: for its relevance. 487 00:18:52,230 --> 00:18:55,800 Liz: But it also reminds me of the big scare of Wikipedia. 488 00:18:55,800 --> 00:18:58,140 Liz: When it first came out, people were like, 489 00:18:58,140 --> 00:18:59,909 Liz: oh, you can't trust Wikipedia. 490 00:18:59,909 --> 00:19:01,230 Mary: People still say that. 491 00:19:01,230 --> 00:19:04,350 Liz: They do, but I feel like much like Wikipedia, 492 00:19:04,350 --> 00:19:08,940 Liz: ChatGPT is gonna turn into an ideation, 493 00:19:08,940 --> 00:19:11,490 Liz: beginning-of-the-research spot. 494 00:19:11,490 --> 00:19:14,567 Liz: And I think that this RACCCA methodology 495 00:19:14,567 --> 00:19:17,760 Liz: structure is gonna be a helpful way to 496 00:19:17,760 --> 00:19:20,370 Liz: help students learn how to work with the robots, 497 00:19:20,370 --> 00:19:21,333 Liz: not against them. 498 00:19:22,324 --> 00:19:23,220 Ricardo: And that's one of the things 499 00:19:23,220 --> 00:19:25,603 Ricardo: that Andrew kind of kept going back to, 500 00:19:25,603 --> 00:19:27,390 Ricardo: just because we have this new tool 501 00:19:27,390 --> 00:19:29,700 Ricardo: doesn't mean that we don't still need to scrutinize 502 00:19:29,700 --> 00:19:30,533 Ricardo: what we're looking at. 
503 00:19:30,533 --> 00:19:31,890 Ricardo: And he mentioned that, you know, 504 00:19:31,890 --> 00:19:34,260 Ricardo: just in any technology and also just in people in general, 505 00:19:34,260 --> 00:19:37,020 Ricardo: you still have to scrutinize and kind of vet 506 00:19:37,020 --> 00:19:38,700 Ricardo: the information that you're getting from anyone. 507 00:19:38,700 --> 00:19:41,010 Liz: Yeah, seems like critical thinking around 508 00:19:41,010 --> 00:19:43,470 Liz: what you're hearing, what you're learning 509 00:19:43,470 --> 00:19:45,660 Liz: is more important than ever. 510 00:19:45,660 --> 00:19:48,060 Ricardo: Yeah, what are those three C's now? 511 00:19:48,060 --> 00:19:49,620 Mary: What you're learning is more important than ever, 512 00:19:49,620 --> 00:19:52,110 Mary: that's a good transition to completeness, 513 00:19:52,110 --> 00:19:54,210 Mary: so the completeness of the response. 514 00:19:54,210 --> 00:19:57,750 Mary: So, the degree to which the response was given, 515 00:19:57,750 --> 00:19:59,928 Mary: does it cover all the essential aspects of the topic 516 00:19:59,928 --> 00:20:01,650 Mary: or question being asked? 517 00:20:01,650 --> 00:20:03,660 Ricardo: Right, 'cause I put in prompts before and I thought, 518 00:20:03,660 --> 00:20:05,760 Ricardo: well, you didn't reference this other thing 519 00:20:05,760 --> 00:20:07,230 Ricardo: that I asked you to reference. 520 00:20:07,230 --> 00:20:08,231 Ricardo: So that is-- 521 00:20:08,231 --> 00:20:10,350 Mary: And you need to bring it up again, yeah. 522 00:20:10,350 --> 00:20:12,840 Mary: So it's a constant refinement process, right? 523 00:20:12,840 --> 00:20:13,890 Mary: And then the clarity, 524 00:20:13,890 --> 00:20:16,080 Mary: how easily the response can be understood 525 00:20:16,080 --> 00:20:17,160 Mary: by the intended audience. 526 00:20:17,160 --> 00:20:18,840 Mary: So I've had that, right? 527 00:20:18,840 --> 00:20:21,180 Mary: Give me a writeup for this letter, 528 00:20:21,180 --> 00:20:24,120 Mary: but then I didn't actually articulate who the audience is. 529 00:20:24,120 --> 00:20:25,290 Mary: And then when you articulate, 530 00:20:25,290 --> 00:20:27,000 Mary: no, the audience is a bunch of five-year-olds. 531 00:20:27,000 --> 00:20:29,340 Mary: Okay, well it's gonna be said very differently 532 00:20:29,340 --> 00:20:30,510 Mary: than how that first draft was. 533 00:20:30,510 --> 00:20:33,660 Mary: So priming the model is very important as well, 534 00:20:33,660 --> 00:20:35,670 Mary: making sure that it understands everything 535 00:20:35,670 --> 00:20:37,890 Mary: that you're looking for before you're trying to get it. 536 00:20:37,890 --> 00:20:39,540 Mary: So back to completeness, 537 00:20:39,540 --> 00:20:41,130 Mary: clarity of what you're looking for 538 00:20:41,130 --> 00:20:43,590 Mary: helps you also accomplish completeness. 539 00:20:43,590 --> 00:20:45,270 Mary: And then coherence, the extent to which 540 00:20:45,270 --> 00:20:47,550 Mary: the response is logically structured and well organized 541 00:20:47,550 --> 00:20:50,373 Mary: and flows smoothly from one point to the other. 542 00:20:51,420 --> 00:20:53,520 Mary: I think that's pretty nice. 543 00:20:53,520 --> 00:20:56,040 Mary: In ChatGPT, there's like a conversational history. 
544 00:20:56,040 --> 00:20:59,310 Mary: So, that coherence is pretty easily built in 545 00:20:59,310 --> 00:21:00,630 Mary: through an ongoing conversation, 546 00:21:00,630 --> 00:21:03,886 Mary: but it does have to be ongoing to continue to refine it. 547 00:21:03,886 --> 00:21:05,490 Mary: And then the last one's appropriateness, 548 00:21:05,490 --> 00:21:08,040 Mary: how well the response aligns with the intended audience 549 00:21:08,040 --> 00:21:10,200 Mary: and context and is suitable and respectful 550 00:21:10,200 --> 00:21:12,287 Mary: in tone and content. 551 00:21:12,287 --> 00:21:14,940 Mary: Because that bias exists, 552 00:21:14,940 --> 00:21:16,737 Mary: and so knowing what the biases are 553 00:21:16,737 --> 00:21:19,593 Mary: and how individuals take certain things, like, 554 00:21:20,850 --> 00:21:23,310 Mary: the model itself, its burstiness and perplexity, 555 00:21:23,310 --> 00:21:28,310 Mary: like how it actually talks is very wordy, right? 556 00:21:28,530 --> 00:21:30,450 Mary: So if you're not refining it for certain audiences, 557 00:21:30,450 --> 00:21:34,920 Mary: it gets very liberal with its language flow, 558 00:21:34,920 --> 00:21:37,680 Mary: like, is that normal? Ricardo: It's kind of a bad writer. 559 00:21:37,680 --> 00:21:39,870 Ricardo: In my experience, it's kind of a bad writer. 560 00:21:39,870 --> 00:21:41,850 Mary: Yeah, it's like patting the audience. 561 00:21:41,850 --> 00:21:42,960 Mary: Like, anytime I get something 562 00:21:42,960 --> 00:21:44,850 Mary: that's way longer than it needs to be, 563 00:21:44,850 --> 00:21:46,530 Mary: it's fine for a first draft, 564 00:21:46,530 --> 00:21:48,180 Mary: but then refining it. 565 00:21:48,180 --> 00:21:51,570 Mary: The appropriateness, yeah, you don't need to, you know, 566 00:21:51,570 --> 00:21:53,730 Mary: try to sweeten up the message for every single person 567 00:21:53,730 --> 00:21:55,710 Mary: you're talking to, that's inauthentic. 568 00:21:55,710 --> 00:21:57,840 Ricardo: Okay, so you get a response, 569 00:21:57,840 --> 00:22:00,180 Ricardo: and now you've applied these criteria 570 00:22:00,180 --> 00:22:02,160 Ricardo: to see if it was a good response, 571 00:22:02,160 --> 00:22:03,930 Ricardo: but you need to make some refinements. 572 00:22:03,930 --> 00:22:05,580 Ricardo: What are you doing when you're making those refinements? 573 00:22:05,580 --> 00:22:09,180 Ricardo: Are you telling it good on these points, bad on this? 574 00:22:09,180 --> 00:22:10,230 Ricardo: How do you refine? 575 00:22:10,230 --> 00:22:12,420 Ricardo: Are you just coming up with a new prompt? 576 00:22:12,420 --> 00:22:13,860 Mary: Sometimes I'll say "thank you very much" 577 00:22:13,860 --> 00:22:16,080 Mary: because we should always thank our overlords ahead of time 578 00:22:16,080 --> 00:22:18,985 Mary: before they take over and be very polite to them. 579 00:22:18,985 --> 00:22:19,950 Ricardo: I think about that all the time. 580 00:22:19,950 --> 00:22:22,020 Ricardo: Someone who yells at their cell phone for being an idiot. 581 00:22:22,020 --> 00:22:23,490 Mary: Absolutely, I yell at Siri a lot, 582 00:22:23,490 --> 00:22:24,730 Mary: and I'm sure she remembers. 583 00:22:24,730 --> 00:22:26,160 Ricardo: Oh yeah. 584 00:22:26,160 --> 00:22:27,540 Ricardo: I'm always polite to my. 585 00:22:27,540 --> 00:22:28,920 Mary: I should be more polite to Siri, 586 00:22:28,920 --> 00:22:30,300 Mary: that's a good reflection for me. 
587 00:22:30,300 --> 00:22:33,030 Mary: But with ChatGPT, like I just, honestly, 588 00:22:33,030 --> 00:22:34,707 Mary: I do put that human aspect into it. 589 00:22:34,707 --> 00:22:35,880 Mary: And I know there are people 590 00:22:35,880 --> 00:22:37,470 Mary: who don't believe you should do that, 591 00:22:37,470 --> 00:22:39,480 Mary: but it's just like a common way to respond. 592 00:22:39,480 --> 00:22:41,040 Mary: So anyway, I'll start it with, thank you very much, 593 00:22:41,040 --> 00:22:42,270 Mary: that was a great response. 594 00:22:42,270 --> 00:22:44,250 Mary: However, I was actually looking for this. 595 00:22:44,250 --> 00:22:45,690 Mary: And then it'll be like, oh, I'm so sorry, 596 00:22:45,690 --> 00:22:47,190 Mary: 'cause it's also super polite, 597 00:22:47,190 --> 00:22:49,560 Mary: and will come up with the refined approach. 598 00:22:49,560 --> 00:22:51,123 Mary: So yeah, that's really common. 599 00:22:52,410 --> 00:22:53,700 Mary: Is that what the question was? 600 00:22:53,700 --> 00:22:55,590 Ricardo: Yeah, just how you're, you know, 601 00:22:55,590 --> 00:22:58,200 Ricardo: so you're giving it a new prompt rather than 602 00:22:58,200 --> 00:23:00,210 Ricardo: telling it to build off its last prompt. 603 00:23:00,210 --> 00:23:02,280 Mary: Sometimes I'm asking it to build off its last prompt. 604 00:23:02,280 --> 00:23:04,530 Mary: Like, for this course when we were building them, 605 00:23:04,530 --> 00:23:07,500 Mary: Tamara did a great job of laying out like A, B, C, 606 00:23:07,500 --> 00:23:10,080 Mary: this is in every single module, this is the stuff we need. 607 00:23:10,080 --> 00:23:11,790 Mary: So I copied and pasted 608 00:23:11,790 --> 00:23:14,610 Mary: her template in there to prime my ChatGPT. 609 00:23:14,610 --> 00:23:16,650 Mary: And I'm like, and then this is the topic. 610 00:23:16,650 --> 00:23:18,900 Mary: Give me some potential opportunities 611 00:23:18,900 --> 00:23:20,490 Mary: to like fill in the blanks of the template, 612 00:23:20,490 --> 00:23:21,810 Mary: and it would every single time. 613 00:23:21,810 --> 00:23:23,670 Mary: And then if it got it a little bit better 614 00:23:23,670 --> 00:23:24,990 Mary: in one of the other prompts, I'd be like, 615 00:23:24,990 --> 00:23:28,230 Mary: refer to this specific one when doing the rest of these, 616 00:23:28,230 --> 00:23:29,970 Mary: and then it refined it every single time. 617 00:23:29,970 --> 00:23:32,250 Mary: So yeah, absolutely building on the conversation 618 00:23:32,250 --> 00:23:33,840 Mary: and building on the prompt and priming. 619 00:23:33,840 --> 00:23:35,575 Ricardo: Just say chat 'cause you said chit. 620 00:23:35,575 --> 00:23:36,408 Mary: Chit? 621 00:23:38,167 --> 00:23:40,858 Mary: ChatGPT, my words. 622 00:23:40,858 --> 00:23:42,701 Ricardo: You're not prompt, 623 00:23:42,701 --> 00:23:43,534 Ricardo: your response is-- 624 00:23:43,534 --> 00:23:46,384 Mary: Someone needs to prompt my brain with food, I'm hungry. 625 00:23:48,210 --> 00:23:49,650 Mary: How many students did you have in your class? 626 00:23:49,650 --> 00:23:53,287 Andrew: 72 students, plus lurkers. 627 00:23:53,287 --> 00:23:54,927 Mary: And there were a lot of us lurkers, right? 628 00:23:54,927 --> 00:23:56,670 Mary: I kept adding people like, Andrew, 629 00:23:56,670 --> 00:23:59,523 Mary: is it okay if I just keep adding as many as we want? 
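As a companion to Mary's description of priming ChatGPT with a template and then iterating on its answers, here is a minimal sketch of that "prime, then refine" conversation loop in code. It is an illustration, not anything built for either course: it assumes the openai Python SDK (v1-style chat completions), and the model name, template text, and feedback strings are placeholders. The RACCCA-style review is left as a human judgment step rather than anything automated.

```python
# Minimal sketch of "prime, then refine" against a chat model.
# Assumes: `pip install openai` and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"  # placeholder model name

# 1. Prime the conversation with the module template and the first topic.
messages = [
    {"role": "system", "content": "You help draft online course module content."},
    {"role": "user", "content": (
        "Here is our module template:\n"
        "A) Overview paragraph\n"
        "B) Three learning objectives\n"
        "C) One discussion prompt\n\n"
        "Fill in the template for the topic: evaluating generative AI outputs."
    )},
]

def ask(messages):
    """Send the running conversation and return the assistant's reply."""
    response = client.chat.completions.create(model=MODEL, messages=messages)
    return response.choices[0].message.content

first_draft = ask(messages)

# 2. Review the draft by hand (for example against the RACCCA criteria:
#    relevance, accuracy, completeness, clarity, coherence, appropriateness),
#    then feed that feedback back in so the model builds on its own output.
messages.append({"role": "assistant", "content": first_draft})
messages.append({"role": "user", "content": (
    "Thank you, that was close. However, the objectives are too vague; "
    "keep the overview as is, and rewrite the objectives so each one is measurable."
)})

revised_draft = ask(messages)
print(revised_draft)
```

Because every request re-sends the whole messages list, the conversation history Mary mentions is simply carried along, which is what lets later prompts refer back to an earlier, better answer.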
630 00:24:00,450 --> 00:24:02,640 Mary: So from that, from the 72 students, 631 00:24:02,640 --> 00:24:04,380 Mary: how many assignments did you have to grade? 632 00:24:04,380 --> 00:24:05,640 Andrew: A crazy amount. 633 00:24:05,640 --> 00:24:09,060 Andrew: I thought I was putting together an easy course. 634 00:24:09,060 --> 00:24:13,890 Andrew: There were individual assignments across those 72 students, 635 00:24:13,890 --> 00:24:16,470 Andrew: a little under 3,000. 636 00:24:16,470 --> 00:24:21,300 Andrew: Out of those, there were a little over 2,300 conversations 637 00:24:21,300 --> 00:24:23,880 Andrew: between individuals and ChatGPT 638 00:24:23,880 --> 00:24:24,840 Andrew: that I read through. 639 00:24:24,840 --> 00:24:26,250 Mary: You read them all? 640 00:24:26,250 --> 00:24:27,990 Andrew: I read them all, well, I skimmed them all. 641 00:24:27,990 --> 00:24:30,810 Andrew: But the thing was, yes, it was a lot of work, 642 00:24:30,810 --> 00:24:32,910 Andrew: but it gave me an incredible insight 643 00:24:32,910 --> 00:24:34,797 Andrew: into how people were engaging with the platform 644 00:24:34,797 --> 00:24:35,877 Andrew: and how they were using it. 645 00:24:35,877 --> 00:24:37,800 Mary: And how were they using it? 646 00:24:37,800 --> 00:24:40,440 Andrew: They were using it in many, many different ways 647 00:24:40,440 --> 00:24:41,460 Andrew: that I didn't expect. 648 00:24:41,460 --> 00:24:45,780 Andrew: And what I saw through many of those conversations was 649 00:24:45,780 --> 00:24:50,370 Andrew: how ChatGPT sparked their curiosity and critical thinking, 650 00:24:50,370 --> 00:24:54,540 Andrew: rather than it just dulling their brains, it lit them up 651 00:24:54,540 --> 00:24:56,610 Andrew: and you could see it in terms of the questions they asked. 652 00:24:56,610 --> 00:24:58,740 Andrew: It goes back to this idea of conversations. 653 00:24:58,740 --> 00:25:01,170 Andrew: The conversations were rich, they were deep, 654 00:25:01,170 --> 00:25:03,510 Andrew: they went in directions I didn't expect, 655 00:25:03,510 --> 00:25:06,000 Andrew: they covered topics that I didn't expect. 656 00:25:06,000 --> 00:25:09,600 Andrew: I had students having long conversations around 657 00:25:09,600 --> 00:25:12,240 Andrew: how to synthesize different types of plastics 658 00:25:12,240 --> 00:25:13,800 Andrew: and the chemistry behind them. 659 00:25:13,800 --> 00:25:15,360 Andrew: Stuff that I didn't know 660 00:25:15,360 --> 00:25:18,390 Andrew: but stuff that they actually got a lot 661 00:25:18,390 --> 00:25:21,435 Andrew: of detailed information from where they could validate 662 00:25:21,435 --> 00:25:24,780 Andrew: and test that information and then carry on the conversation. 663 00:25:24,780 --> 00:25:25,613 Mary: Interesting. 664 00:25:25,613 --> 00:25:28,350 Mary: So what were the validation mechanisms in those conversations 665 00:25:28,350 --> 00:25:29,700 Mary: that you noticed that you thought 666 00:25:29,700 --> 00:25:32,833 Mary: that was a good way to do it? 667 00:25:32,833 --> 00:25:35,910 Andrew: Much of it was typical approaches to critical thinking 668 00:25:35,910 --> 00:25:38,940 Andrew: where if you get a piece of information, 669 00:25:38,940 --> 00:25:42,360 Andrew: you work out what feels like it makes sense, 670 00:25:42,360 --> 00:25:44,730 Andrew: what aligns with what you already know, 671 00:25:44,730 --> 00:25:45,990 Andrew: what maybe doesn't align. 
672 00:25:45,990 --> 00:25:49,590 Andrew: And then the next question you ask is based on 673 00:25:49,590 --> 00:25:52,320 Andrew: where those alignments are or disalignments are. 674 00:25:52,320 --> 00:25:54,930 Andrew: So you had students push back a little bit or say, 675 00:25:54,930 --> 00:25:56,370 Andrew: can you explain this a little bit more? 676 00:25:56,370 --> 00:26:01,287 Andrew: Or what is your reasoning behind what you said? 677 00:26:01,287 --> 00:26:03,750 Andrew: And the really interesting thing about this platform is 678 00:26:03,750 --> 00:26:05,400 Andrew: that it can simulate reasoning, 679 00:26:05,400 --> 00:26:08,190 Andrew: it can tell you why it comes up with a response. 680 00:26:08,190 --> 00:26:10,410 Mary: Interesting, and sometimes it'll tell on itself, right? 681 00:26:10,410 --> 00:26:12,750 Mary: Like it'll say, I had to tell you something. (chuckles) 682 00:26:12,750 --> 00:26:14,340 Andrew: Oh, it does, yes. 683 00:26:14,340 --> 00:26:16,230 Andrew: It's very endearing the way it sort of 684 00:26:16,230 --> 00:26:18,870 Andrew: either sort of corrects itself or apologizes or says, 685 00:26:18,870 --> 00:26:21,470 Andrew: well, this is what I was told to say. 686 00:26:21,470 --> 00:26:24,960 Tamara: It also has some pretty interesting safety features. 687 00:26:24,960 --> 00:26:28,740 Tamara: Over spring break, I was showing my youngest daughter how to 688 00:26:28,740 --> 00:26:32,730 Tamara: use ChatGPT to do kind of an educational scenario 689 00:26:32,730 --> 00:26:34,890 Tamara: that I had her feed in as a prompt, 690 00:26:34,890 --> 00:26:37,680 Tamara: and it was based off of Harry Potter. 691 00:26:37,680 --> 00:26:39,630 Tamara: So maybe there was copyright there, I don't know, 692 00:26:39,630 --> 00:26:41,340 Tamara: but it's personal. 693 00:26:41,340 --> 00:26:45,210 Tamara: But she was trying to convince ChatGPT to let her go 694 00:26:45,210 --> 00:26:47,910 Tamara: chase the Dementors into the dark forest. 695 00:26:47,910 --> 00:26:49,590 Tamara: And it kept telling her it wasn't safe 696 00:26:49,590 --> 00:26:51,330 Tamara: and it wouldn't allow that action. 697 00:26:51,330 --> 00:26:55,140 Tamara: And so she had to reason with ChatGPT to explain, 698 00:26:55,140 --> 00:26:57,450 Tamara: and she learned how to reason through 699 00:26:57,450 --> 00:26:59,100 Tamara: what was safe and what was not safe 700 00:26:59,100 --> 00:27:02,400 Tamara: because of the conversation, which was an unintended result 701 00:27:02,400 --> 00:27:05,520 Tamara: but it was a nice safety feature that it has, so. 702 00:27:05,520 --> 00:27:07,493 Tamara: Not a bad thing. Mary: I love that story too. 703 00:27:09,750 --> 00:27:11,500 Mary: You're such a fun mom, I love that. 704 00:27:12,450 --> 00:27:15,300 Mary: From the perspective of the student, Mary Ann, 705 00:27:15,300 --> 00:27:17,910 Mary: were there any conversations that you had with ChatGPT 706 00:27:17,910 --> 00:27:20,220 Mary: from the outcomes of the assignments you were given that 707 00:27:20,220 --> 00:27:24,180 Mary: surprised you or where you learned deeper skills? 708 00:27:24,180 --> 00:27:27,540 Mary Ann: So most of my focus in the class was very, very narrow. 
709 00:27:27,540 --> 00:27:31,110 Mary Ann: So I looked at it as a way to inform my position 710 00:27:31,110 --> 00:27:32,820 Mary Ann: and my role as an instruction librarian 711 00:27:32,820 --> 00:27:35,580 Mary Ann: working with first-year students 712 00:27:35,580 --> 00:27:37,830 Mary Ann: who are learning how to do research 713 00:27:37,830 --> 00:27:41,100 Mary Ann: and learning that it goes beyond finding information 714 00:27:41,100 --> 00:27:44,253 Mary Ann: to exploring and building knowledge and understanding. 715 00:27:45,120 --> 00:27:46,980 Mary Ann: So I focused mostly on that. 716 00:27:46,980 --> 00:27:51,120 Mary Ann: And when I got to some of the more sophisticated prompts, 717 00:27:51,120 --> 00:27:53,130 Mary Ann: as I got better at it, 718 00:27:53,130 --> 00:27:54,300 Mary Ann: I was starting to see 719 00:27:54,300 --> 00:27:56,640 Mary Ann: where it could be very useful for students. 720 00:27:56,640 --> 00:27:59,260 Mary Ann: So I would start to get responses that would 721 00:28:01,290 --> 00:28:04,080 Mary Ann: approximate what talking to a librarian 722 00:28:04,080 --> 00:28:06,387 Mary Ann: or talking to your instructor might be, 723 00:28:06,387 --> 00:28:09,930 Mary Ann: and so I got very excited about that, so. 724 00:28:09,930 --> 00:28:12,510 Mary: You created an artifact that you shared 725 00:28:12,510 --> 00:28:15,060 Mary: in our Generative AI course, would you talk about that? 726 00:28:15,060 --> 00:28:18,090 Mary: This was a result of the work you did in Andrew's class, 727 00:28:18,090 --> 00:28:19,950 Mary: is that correct? Mary Ann: Yes. 728 00:28:19,950 --> 00:28:23,060 Mary Ann: Sure, so one of the things that is lacking out there 729 00:28:23,060 --> 00:28:25,950 Mary Ann: at this point is really 730 00:28:25,950 --> 00:28:30,950 Mary Ann: what I thought were decent academic-focused prompts 731 00:28:31,140 --> 00:28:35,790 Mary Ann: to help students or learners through the research process. 732 00:28:35,790 --> 00:28:38,760 Mary Ann: Rather than asking ChatGPT to write a paper for you 733 00:28:38,760 --> 00:28:40,470 Mary Ann: or to find you sources, 734 00:28:40,470 --> 00:28:42,570 Mary Ann: what is the role it could play 735 00:28:42,570 --> 00:28:45,630 Mary Ann: in being a brainstorming partner? 736 00:28:45,630 --> 00:28:48,600 Mary Ann: And so I started to develop prompts in that way, 737 00:28:48,600 --> 00:28:51,270 Mary Ann: focusing in on those composition classes 738 00:28:51,270 --> 00:28:53,040 Mary Ann: that we work very closely with, 739 00:28:53,040 --> 00:28:55,860 Mary Ann: kind of identifying where students get stuck in the process, 740 00:28:55,860 --> 00:28:59,310 Mary Ann: whether it's from needing to do a lot of research 741 00:28:59,310 --> 00:29:01,830 Mary Ann: before jumping into academic sources, 742 00:29:01,830 --> 00:29:04,410 Mary Ann: maybe that's learning what kind of words 743 00:29:04,410 --> 00:29:06,660 Mary Ann: or jargon they might need to use 744 00:29:06,660 --> 00:29:09,060 Mary Ann: when they're going into databases 745 00:29:09,060 --> 00:29:10,740 Mary Ann: to search for academic sources. 746 00:29:10,740 --> 00:29:12,150 Mary Ann: That was my focus. 747 00:29:12,150 --> 00:29:14,790 Mary Ann: And you know, it was really interesting to come at it 748 00:29:14,790 --> 00:29:18,690 Mary Ann: also as an expert in search, but a novice in this, 749 00:29:18,690 --> 00:29:21,120 Mary Ann: and having that intellectual humility.
750 00:29:21,120 --> 00:29:22,830 Mary Ann: And I think it's super important 751 00:29:22,830 --> 00:29:25,170 Mary Ann: and being able to kind of embody that space 752 00:29:25,170 --> 00:29:26,850 Mary Ann: where students are, where they're learning how to 753 00:29:26,850 --> 00:29:28,560 Mary Ann: use the tool for the first time. 754 00:29:28,560 --> 00:29:30,492 Mary Ann: So I'm not sure if that kind of gets to your question. 755 00:29:30,492 --> 00:29:31,325 Mary: Definitely gets to it. 756 00:29:31,325 --> 00:29:33,510 Mary Ann: But you know, I really looked at the assignments 757 00:29:33,510 --> 00:29:36,540 Mary Ann: and the kind of work that students would need to be doing 758 00:29:36,540 --> 00:29:39,390 Mary Ann: and tried to generate prompts that people could just 759 00:29:39,390 --> 00:29:42,840 Mary Ann: pull from that would guide them through that process. 760 00:29:42,840 --> 00:29:45,570 Mary Ann: So rather than asking basic questions, 761 00:29:45,570 --> 00:29:49,140 Mary Ann: it would lead students to asking ChatGPT, for example, 762 00:29:49,140 --> 00:29:51,930 Mary Ann: the questions that would have it respond back 763 00:29:51,930 --> 00:29:54,630 Mary Ann: and kind of guide some conversation, so. 764 00:29:54,630 --> 00:29:57,540 Mary: Do you plan to share those prompts on your library kit? 765 00:29:57,540 --> 00:29:58,373 Mary Ann: - Yes. 766 00:29:58,373 --> 00:29:59,206 Mary: Is that how we should be referring to it, 767 00:29:59,206 --> 00:30:01,530 Mary: or library guide? Mary Ann: Library kit, yep. 768 00:30:01,530 --> 00:30:03,120 Mary: Yeah, it's a fabulous little site 769 00:30:03,120 --> 00:30:04,140 Mary: that you've put together. 770 00:30:04,140 --> 00:30:05,850 Mary: I think a great resource for all instructors 771 00:30:05,850 --> 00:30:06,720 Mary: to share in their courses, 772 00:30:06,720 --> 00:30:08,760 Mary: especially if they need a little assistance 773 00:30:08,760 --> 00:30:10,620 Mary: with how they might use ChatGPT in their course. 774 00:30:10,620 --> 00:30:12,360 Mary: It might be good to review that resource. 775 00:30:12,360 --> 00:30:13,380 Tamara: I just wanted to say, 776 00:30:13,380 --> 00:30:16,050 Tamara: I really love what we're talking about when I hear 777 00:30:16,050 --> 00:30:18,690 Tamara: Andrew mentioning, you know, 778 00:30:18,690 --> 00:30:22,050 Tamara: how he can help his students actually learn more 779 00:30:22,050 --> 00:30:22,950 Tamara: with these prompts. 780 00:30:22,950 --> 00:30:25,424 Tamara: And then we have this wonderful library guide 781 00:30:25,424 --> 00:30:27,363 Tamara: to prepare students for research. 
782 00:30:28,590 --> 00:30:31,410 Tamara: And that's kind of, I think, the best part about that, 783 00:30:31,410 --> 00:30:33,120 Tamara: I think it goes back to, Mary, 784 00:30:33,120 --> 00:30:34,620 Tamara: maybe a conversation that we had, 785 00:30:34,620 --> 00:30:37,290 Tamara: I think at the end of last week where we were talking about 786 00:30:37,290 --> 00:30:40,860 Tamara: there's a lot of concern about using generative AI, 787 00:30:40,860 --> 00:30:43,950 Tamara: that students will become, you know, 788 00:30:43,950 --> 00:30:45,510 Tamara: overdependent on the tool 789 00:30:45,510 --> 00:30:47,640 Tamara: or that they won't think critically, 790 00:30:47,640 --> 00:30:51,060 Tamara: when in fact, to use the tool properly, 791 00:30:51,060 --> 00:30:54,120 Tamara: you actually have to be at the higher levels of Bloom's taxonomy 792 00:30:54,120 --> 00:30:57,606 Tamara: and have critical thinking and be at the evaluate level 793 00:30:57,606 --> 00:31:01,740 Tamara: so that you can actually look at your response 794 00:31:01,740 --> 00:31:05,250 Tamara: and then change the way you communicated, 795 00:31:05,250 --> 00:31:06,750 Tamara: fill in the gaps of the information 796 00:31:06,750 --> 00:31:08,610 Tamara: that you should have provided anyway, 797 00:31:08,610 --> 00:31:12,090 Tamara: and then make decisions about refining what you're doing. 798 00:31:12,090 --> 00:31:16,110 Tamara: And I think that's exciting for educators because that means 799 00:31:16,110 --> 00:31:19,170 Tamara: that we're capable of doing so much more in classes, 800 00:31:19,170 --> 00:31:21,990 Tamara: helping our students to go from just rote memorization 801 00:31:21,990 --> 00:31:23,220 Tamara: to the identify level, 802 00:31:23,220 --> 00:31:25,710 Tamara: all the way up to more of a creative level 803 00:31:25,710 --> 00:31:28,980 Tamara: and an evaluate level with fewer resources 804 00:31:28,980 --> 00:31:32,130 Tamara: and with lower course costs. 805 00:31:32,130 --> 00:31:35,013 Tamara: So it's kind of an exciting time to be in. 806 00:31:36,300 --> 00:31:37,980 Mary: Absolutely. I do wanna highlight that 807 00:31:37,980 --> 00:31:39,780 Mary: I think it would really work well at scale 808 00:31:39,780 --> 00:31:41,550 Mary: based on the model that you used, Andrew, 809 00:31:41,550 --> 00:31:44,580 Mary: and the prompt refinement questions that you included 810 00:31:44,580 --> 00:31:45,413 Mary: in some of your assignments. 811 00:31:45,413 --> 00:31:46,660 Mary: Could you talk about that? 812 00:31:47,760 --> 00:31:51,930 Andrew: Yes, so with a proviso here. 813 00:31:51,930 --> 00:31:56,460 Andrew: So absolutely, this is a system that could 814 00:31:56,460 --> 00:31:59,880 Andrew: help us to scale up learning massively 815 00:31:59,880 --> 00:32:02,100 Andrew: if we create the right environments.
816 00:32:02,100 --> 00:32:05,190 Andrew: And certainly what we did with prompt refinement, 817 00:32:05,190 --> 00:32:07,680 Andrew: we helped our students understand how you start off 818 00:32:07,680 --> 00:32:09,750 Andrew: with a simple prompt and see what you get 819 00:32:09,750 --> 00:32:12,720 Andrew: but then how you begin to either refine the prompt 820 00:32:12,720 --> 00:32:14,310 Andrew: or refine the conversation 821 00:32:14,310 --> 00:32:17,040 Andrew: so you get much richer information coming back, 822 00:32:17,040 --> 00:32:20,730 Andrew: specifically in that realm of brainstorming ideas, 823 00:32:20,730 --> 00:32:23,940 Andrew: developing your own understanding in a partnership. 824 00:32:23,940 --> 00:32:26,340 Andrew: So that's where this now becomes 825 00:32:26,340 --> 00:32:29,250 Andrew: a personalized learning environment 826 00:32:29,250 --> 00:32:31,110 Andrew: that you can put out there at scale. 827 00:32:31,110 --> 00:32:35,040 Andrew: But the proviso is students have gotta be able to understand 828 00:32:35,040 --> 00:32:37,830 Andrew: when to trust what they get, when not to trust it, 829 00:32:37,830 --> 00:32:39,300 Andrew: and when to push back. 830 00:32:39,300 --> 00:32:42,810 Andrew: If they don't have that level of digital literacy, 831 00:32:42,810 --> 00:32:44,190 Andrew: they're not gonna, well, A, 832 00:32:44,190 --> 00:32:46,350 Andrew: they're not gonna get what they need outta the system, 833 00:32:46,350 --> 00:32:49,140 Andrew: but also the system becomes potentially dangerous 834 00:32:49,140 --> 00:32:51,960 Andrew: as it pushes them in directions which aren't that helpful. 835 00:32:51,960 --> 00:32:53,850 Andrew: So then the question comes, 836 00:32:53,850 --> 00:32:56,430 Andrew: how do we develop that level of digital literacy, 837 00:32:56,430 --> 00:32:58,800 Andrew: that generative AI literacy in our students 838 00:32:58,800 --> 00:33:01,470 Andrew: so that they can thrive with these tools? 839 00:33:01,470 --> 00:33:03,060 Mary: Any suggestions? 840 00:33:03,060 --> 00:33:05,760 Andrew- Well, starting to teach courses like this 841 00:33:05,760 --> 00:33:08,220 Andrew-is a really good way of doing it. 842 00:33:08,220 --> 00:33:11,250 Andrew-But beyond that, I think this isn't just a single course, 843 00:33:11,250 --> 00:33:15,090 Andrew-this is a set of competencies that should be universal 844 00:33:15,090 --> 00:33:17,250 Andrew-in higher education these days. 845 00:33:17,250 --> 00:33:20,760 Andrew-If we are gonna equip our students to either learn in class 846 00:33:20,760 --> 00:33:23,010 Andrew-using these tools or learn on their own 847 00:33:23,010 --> 00:33:25,170 Andrew-just through their own curiosity, 848 00:33:25,170 --> 00:33:27,270 Andrew-they've gotta have that toolkit. 849 00:33:27,270 --> 00:33:30,210 Andrew-And that means both incorporating it 850 00:33:30,210 --> 00:33:32,340 Andrew-into a wide array of courses, 851 00:33:32,340 --> 00:33:33,870 Andrew-but also giving them the tools 852 00:33:33,870 --> 00:33:37,410 Andrew-that where they can independently develop these skills. 853 00:33:37,410 --> 00:33:40,470 Mary- Any suggestions on what these competencies could be? 
854 00:33:40,470 --> 00:33:41,490 Tamara- I was just gonna say, you know, 855 00:33:41,490 --> 00:33:44,520 Tamara- something that's really interesting about that is 856 00:33:44,520 --> 00:33:47,340 Tamara- we are constantly telling our students ever since the 857 00:33:47,340 --> 00:33:49,770 Tamara- onset of the worldwide web 858 00:33:49,770 --> 00:33:52,140 Tamara- to challenge what they read online. 859 00:33:52,140 --> 00:33:55,293 Tamara- Now we have to challenge and ask critical questions about 860 00:33:55,293 --> 00:33:58,950 Tamara- what we read in the newspaper or what we read on our phones. 861 00:33:58,950 --> 00:34:00,830 Tamara- And so what's interesting is, 862 00:34:00,830 --> 00:34:03,930 Tamara- is we're not asking students to do anything new, 863 00:34:03,930 --> 00:34:06,120 Tamara- we're asking them to do it at a higher level 864 00:34:06,120 --> 00:34:08,730 Tamara- and we're asking them to do it in a better way. 865 00:34:08,730 --> 00:34:11,888 Tamara- And so we're using ChatGPT or other generative AI tools 866 00:34:11,888 --> 00:34:14,490 Tamara- to not only teach them to do something 867 00:34:14,490 --> 00:34:17,040 Tamara- that we need them to do anyway in a digital 868 00:34:17,040 --> 00:34:19,140 Tamara- and technologically advanced world, 869 00:34:19,140 --> 00:34:22,440 Tamara- but we're also giving them the tools to support that. 870 00:34:22,440 --> 00:34:24,333 Tamara- So I think that's what's exciting. 871 00:34:25,620 --> 00:34:26,820 Andrew- I would absolutely agree. 872 00:34:26,820 --> 00:34:29,730 Andrew- And if we approach it in that way, 873 00:34:29,730 --> 00:34:32,760 Andrew- helping students understand, well, first of all, 874 00:34:32,760 --> 00:34:33,720 Andrew- breaking the barriers down 875 00:34:33,720 --> 00:34:35,940 Andrew- so they're comfortable actually using these tools, 876 00:34:35,940 --> 00:34:37,440 Andrew- but then helping them understand 877 00:34:37,440 --> 00:34:39,870 Andrew- how to navigate what they get outta them 878 00:34:39,870 --> 00:34:41,580 Andrew- and how to have those conversations 879 00:34:41,580 --> 00:34:43,230 Andrew- then becomes very empowering. 880 00:34:43,230 --> 00:34:45,960 Andrew- It's almost like, almost but not quite, 881 00:34:45,960 --> 00:34:48,570 Andrew- like when students first come to university 882 00:34:48,570 --> 00:34:50,580 Andrew- in their first year maybe 883 00:34:50,580 --> 00:34:54,720 Andrew- and they have access to all these incredible instructors 884 00:34:54,720 --> 00:34:57,450 Andrew- and yet they don't know necessarily how to engage with them. 885 00:34:57,450 --> 00:34:59,400 Andrew- They don't know how to go and ask questions 886 00:34:59,400 --> 00:35:01,320 Andrew- and learn through those conversations, 887 00:35:01,320 --> 00:35:03,360 Andrew- and that's a skill that they need to develop. 888 00:35:03,360 --> 00:35:06,810 Andrew- They need to do exactly the same with generative AI. 889 00:35:06,810 --> 00:35:09,870 Mary- Agreed, and this really gets to the new design principle, 890 00:35:09,870 --> 00:35:11,580 Mary- right, principled innovation. 891 00:35:11,580 --> 00:35:14,050 Mary- So how we engage, why we engage 892 00:35:15,150 --> 00:35:16,770 Mary- is very important to consider. 893 00:35:16,770 --> 00:35:17,910 Andrew- Absolutely. 
894 00:35:17,910 --> 00:35:21,210 Mary- There was recently a release of 895 00:35:21,210 --> 00:35:25,107 Mary- an internal generative AI option through ChatGPT, 896 00:35:25,107 --> 00:35:26,280 Mary- the one that you were talking about, 897 00:35:26,280 --> 00:35:28,230 Mary- the enterprise integration potentials. 898 00:35:28,230 --> 00:35:29,063 Mary- You wanna talk about that? 899 00:35:29,063 --> 00:35:30,540 Mary- 'Cause that's kind of new, 900 00:35:30,540 --> 00:35:31,373 Mary- I mean it might not be new 901 00:35:31,373 --> 00:35:32,910 Mary- when this episode comes out in a few weeks. 902 00:35:32,910 --> 00:35:35,250 Andrew- No, but as of today, it's brand new. 903 00:35:35,250 --> 00:35:40,080 Andrew- So despite all of the huge potential of not only ChatGPT, 904 00:35:40,080 --> 00:35:43,800 Andrew- but other large language models, there is a problem, 905 00:35:43,800 --> 00:35:46,650 Andrew- and that is a problem of accessibility. 906 00:35:46,650 --> 00:35:51,540 Andrew- And so if in education and learning we embrace these, 907 00:35:51,540 --> 00:35:53,970 Andrew- we run the risk of only those students 908 00:35:53,970 --> 00:35:55,530 Andrew- that have access to the systems 909 00:35:55,530 --> 00:35:57,120 Andrew- really being able to benefit. 910 00:35:57,120 --> 00:35:59,670 Andrew- So for instance, in my course we use ChatGPT Plus, 911 00:35:59,670 --> 00:36:00,990 Andrew- which is the paid version, 912 00:36:00,990 --> 00:36:04,230 Andrew- which meant every student needed to be paying $20 a month 913 00:36:04,230 --> 00:36:05,877 Andrew- to complete that course. 914 00:36:05,877 --> 00:36:07,710 Andrew- And we did that because it felt like 915 00:36:07,710 --> 00:36:09,540 Andrew- this was the equivalent of a textbook, 916 00:36:09,540 --> 00:36:11,580 Andrew- but not every student can afford that. 917 00:36:11,580 --> 00:36:14,490 Andrew- So now immediately, if you have classes where 918 00:36:14,490 --> 00:36:17,460 Andrew- some students are using advanced generative AI 919 00:36:17,460 --> 00:36:20,643 Andrew- and some aren't, there's a digital divide there. 920 00:36:21,990 --> 00:36:23,220 Andrew- And the way to get round that, 921 00:36:23,220 --> 00:36:24,660 Andrew- or one of the ways to get round that 922 00:36:24,660 --> 00:36:27,810 Andrew- is for places like ASU to ensure that 923 00:36:27,810 --> 00:36:31,140 Andrew- everybody has access to these powerful tools. 924 00:36:31,140 --> 00:36:34,650 Andrew- Up until now, that wasn't possible with ChatGPT 925 00:36:34,650 --> 00:36:36,720 Andrew- because they didn't have an enterprise model. 926 00:36:36,720 --> 00:36:38,550 Andrew- That has just been announced, 927 00:36:38,550 --> 00:36:41,850 Andrew- which means that higher education institutions 928 00:36:41,850 --> 00:36:45,930 Andrew- can buy into it in ways that both protect their students, 929 00:36:45,930 --> 00:36:48,870 Andrew- they protect their privacy, they protect their data, 930 00:36:48,870 --> 00:36:52,170 Andrew- but they give them access to incredibly powerful tools 931 00:36:52,170 --> 00:36:53,010 Andrew- across the board. 932 00:36:53,010 --> 00:36:55,740 Andrew- So you no longer have that digital divide 933 00:36:55,740 --> 00:36:58,500 Andrew- because people can't afford to access the system. 
934 00:36:58,500 --> 00:37:00,870 Andrew- There is still a digital divide in terms of 935 00:37:00,870 --> 00:37:03,030 Andrew- the mentality where students 936 00:37:03,030 --> 00:37:05,130 Andrew- will or won't engage with these technologies, 937 00:37:05,130 --> 00:37:07,260 Andrew- and that's where digital literacy comes in. 938 00:37:07,260 --> 00:37:08,670 Andrew- But at least we are removing 939 00:37:08,670 --> 00:37:10,950 Andrew- potentially one of those barriers. 940 00:37:10,950 --> 00:37:11,970 Mary- So that brings us back to the 941 00:37:11,970 --> 00:37:13,890 Mary- digital literacy competencies. 942 00:37:13,890 --> 00:37:16,800 Mary- What competencies would you label 943 00:37:16,800 --> 00:37:18,120 Mary- that students need to have, 944 00:37:18,120 --> 00:37:20,190 Mary- that everybody really needs to have, not just students? 945 00:37:20,190 --> 00:37:23,820 Andrew- So I think we're still feeling our way here, 946 00:37:23,820 --> 00:37:26,160 Andrew- this is very early days. 947 00:37:26,160 --> 00:37:30,630 Andrew- But I would certainly have one of those understanding 948 00:37:30,630 --> 00:37:33,000 Andrew- the limitations and the benefits 949 00:37:33,000 --> 00:37:34,170 Andrew- and the power of the system. 950 00:37:34,170 --> 00:37:38,160 Andrew- So, students go in with open eyes and realize that 951 00:37:38,160 --> 00:37:40,980 Andrew- systems like ChatGPT will lie to them, 952 00:37:40,980 --> 00:37:43,920 Andrew- they will hallucinate, so they can't trust everything. 953 00:37:43,920 --> 00:37:45,420 Andrew- At the same time realizing that 954 00:37:45,420 --> 00:37:47,160 Andrew- through the process of conversation 955 00:37:47,160 --> 00:37:48,750 Andrew- they can glean new knowledge from it, 956 00:37:48,750 --> 00:37:50,460 Andrew- that's an incredibly important skill. 957 00:37:50,460 --> 00:37:53,280 Andrew- And of course that ties in with critical thinking, 958 00:37:53,280 --> 00:37:56,730 Andrew- how you evaluate everything you learn and hear. 959 00:37:56,730 --> 00:38:00,690 Andrew- But beyond that, I think there's also a critical skill 960 00:38:00,690 --> 00:38:03,540 Andrew- around curiosity and experimentation. 961 00:38:03,540 --> 00:38:08,490 Andrew- This is an incredibly powerful but an open-ended platform 962 00:38:08,490 --> 00:38:11,640 Andrew- where it's as good as the creativity you bring to it. 963 00:38:11,640 --> 00:38:14,010 Andrew- And helping students understand that 964 00:38:14,010 --> 00:38:16,830 Andrew- within the limitations of the system 965 00:38:16,830 --> 00:38:19,320 Andrew- they can play and they can innovate themselves 966 00:38:19,320 --> 00:38:20,217 Andrew- and they can learn for themselves 967 00:38:20,217 --> 00:38:21,810 Andrew- how to get the most out of it, 968 00:38:21,810 --> 00:38:23,790 Andrew- that then becomes transformational. 969 00:38:23,790 --> 00:38:25,620 Mary- I love that curiosity and exploration 970 00:38:25,620 --> 00:38:27,270 Mary- is now part of our digital competencies. 971 00:38:27,270 --> 00:38:28,200 Andrew- They are, yes. 972 00:38:28,200 --> 00:38:29,043 Mary- That's great. 973 00:38:29,940 --> 00:38:31,410 Mary- Mary Ann, how about from your perspective, 974 00:38:31,410 --> 00:38:32,850 Mary- from the research perspective? 975 00:38:32,850 --> 00:38:34,380 Mary- Did you come away with that course going, 976 00:38:34,380 --> 00:38:37,230 Mary- okay, now this is how I'll articulate a literacy that 977 00:38:37,230 --> 00:38:39,600 Mary- I need to ensure our students have? 
978 00:38:39,600 --> 00:38:42,360 Mary Ann- I think I actually came away a little bit more hopeful 979 00:38:42,360 --> 00:38:43,833 Mary Ann- that it could be used in the research process 980 00:38:43,833 --> 00:38:48,390 Mary Ann- than I was initially going into it, 981 00:38:48,390 --> 00:38:49,680 Mary Ann- playing around with ChatGPT 982 00:38:49,680 --> 00:38:51,903 Mary Ann- or playing around with Bing or Bard. 983 00:38:52,830 --> 00:38:56,850 Mary Ann- It's not a great tool for research at this point. 984 00:38:56,850 --> 00:38:59,070 Mary Ann- The fact that you cannot trust the accuracy 985 00:38:59,070 --> 00:39:00,510 Mary Ann- of the sources that you're getting, 986 00:39:00,510 --> 00:39:05,070 Mary Ann- that there's no way to know what the training data is 987 00:39:05,070 --> 00:39:06,990 Mary Ann- that the system is based off of, 988 00:39:06,990 --> 00:39:09,480 Mary Ann- knowing that there are inherent biases 989 00:39:09,480 --> 00:39:11,190 Mary Ann- underlying the training data, 990 00:39:11,190 --> 00:39:16,170 Mary Ann- knowing that there are huge issues there that, as an academic librarian, 991 00:39:16,170 --> 00:39:18,870 Mary Ann- I kind of get a little bit nervous about 992 00:39:18,870 --> 00:39:22,440 Mary Ann- students relying upon technology that hands them answers 993 00:39:22,440 --> 00:39:26,760 Mary Ann- or allows them to think that the system is neutral 994 00:39:26,760 --> 00:39:27,663 Mary Ann- when it's not. 995 00:39:28,620 --> 00:39:30,660 Mary Ann- So I have, I mean, 996 00:39:30,660 --> 00:39:32,790 Mary Ann- it goes back to basic information literacy, 997 00:39:32,790 --> 00:39:35,760 Mary Ann- basic critical thinking about 998 00:39:35,760 --> 00:39:37,650 Mary Ann- what are the sources that I'm getting, 999 00:39:37,650 --> 00:39:39,450 Mary Ann- can I trust this? Being skeptical, 1000 00:39:39,450 --> 00:39:41,280 Mary Ann- trust but verify, so. 1001 00:39:41,280 --> 00:39:43,980 Mary- Trust but verify, I like that. 1002 00:39:43,980 --> 00:39:46,140 Mary- So let's talk about that bias for a moment. 1003 00:39:46,140 --> 00:39:49,650 Mary- What have you heard, what have you noticed, 1004 00:39:49,650 --> 00:39:51,510 Mary- what have you heard through the grapevine 1005 00:39:51,510 --> 00:39:52,620 Mary- exists around bias? 1006 00:39:52,620 --> 00:39:54,360 Mary- What should our students know? 1007 00:39:54,360 --> 00:39:59,360 Andrew- They should know that just like talking to people, 1008 00:39:59,470 --> 00:40:03,900 Andrew- there are inbuilt biases to ChatGPT. 1009 00:40:03,900 --> 00:40:08,900 Andrew- Some of those biases arise from the data it was trained on, 1010 00:40:09,030 --> 00:40:12,600 Andrew- and they reflect human biases and human foibles 1011 00:40:12,600 --> 00:40:14,580 Andrew- in those big data sets. 1012 00:40:14,580 --> 00:40:16,590 Andrew- And we're actually finding out that 1013 00:40:16,590 --> 00:40:18,960 Andrew- because some of those data sets are skewed, 1014 00:40:18,960 --> 00:40:21,540 Andrew- that actually enhances the bias. 1015 00:40:21,540 --> 00:40:25,980 Andrew- So great care needs to be taken there in understanding 1016 00:40:25,980 --> 00:40:27,570 Andrew- what comes out of the system 1017 00:40:27,570 --> 00:40:29,430 Andrew- or what those implicit biases might be. 1018 00:40:29,430 --> 00:40:30,990 Andrew- There's actually a second level of bias 1019 00:40:30,990 --> 00:40:32,460 Andrew- which comes from the training.
1020 00:40:32,460 --> 00:40:36,900 Andrew- So OpenAI have trained ChatGPT to respond in certain ways 1021 00:40:36,900 --> 00:40:39,450 Andrew- and put certain guardrails in place, which is good, 1022 00:40:39,450 --> 00:40:43,590 Andrew- but it also still brings a bias to the responses. 1023 00:40:43,590 --> 00:40:48,030 Andrew- That to me is okay because everything around us 1024 00:40:48,030 --> 00:40:49,440 Andrew- has those implicit biases, 1025 00:40:49,440 --> 00:40:51,210 Andrew- but you've gotta be aware of them, 1026 00:40:51,210 --> 00:40:53,700 Andrew- you've gotta be sensitive to them and understand 1027 00:40:53,700 --> 00:40:55,530 Andrew- either how to counter them or navigate them. 1028 00:40:55,530 --> 00:40:58,710 Andrew- And that comes back to another set of skills 1029 00:40:58,710 --> 00:41:00,060 Andrew- that the students need. 1030 00:41:00,060 --> 00:41:02,700 Andrew- If they just absorb without thinking 1031 00:41:02,700 --> 00:41:04,500 Andrew- what they get outta ChatGPT, 1032 00:41:04,500 --> 00:41:06,000 Andrew- it will put them in a worse place 1033 00:41:06,000 --> 00:41:08,460 Andrew- rather than a better place. 1034 00:41:08,460 --> 00:41:09,293 Mary- I like that. 1035 00:41:09,293 --> 00:41:11,979 Mary- So now digital literacy also includes consciousness. 1036 00:41:11,979 --> 00:41:14,520 (group laughs) 1037 00:41:14,520 --> 00:41:16,740 Mary- Coming out of your unconscious bias 1038 00:41:16,740 --> 00:41:18,540 Mary- and being more conscious of your bias. 1039 00:41:18,540 --> 00:41:20,160 Andrew- But this isn't a bad thing 1040 00:41:20,160 --> 00:41:24,600 Andrew- and it extends way beyond just dealing with generative AI. 1041 00:41:24,600 --> 00:41:28,770 Andrew- That ability to sort of test and evaluate what you hear 1042 00:41:28,770 --> 00:41:31,770 Andrew- and not necessarily reject it in a black and white way 1043 00:41:31,770 --> 00:41:34,890 Andrew- but understand sort of what the influences might be 1044 00:41:34,890 --> 00:41:36,240 Andrew- in terms of what you're getting 1045 00:41:36,240 --> 00:41:39,870 Andrew- and actually using your own cognitive power and ability 1046 00:41:39,870 --> 00:41:43,020 Andrew- to untangle the threads there and make sense of things. 1047 00:41:43,020 --> 00:41:43,853 Mary- Excellent. 1048 00:41:43,853 --> 00:41:45,030 Mary- Tamara, did you have anything to add 1049 00:41:45,030 --> 00:41:46,560 Mary- in reference to digital literacies there 1050 00:41:46,560 --> 00:41:48,900 Mary- or even the things that you've noticed from your own use 1051 00:41:48,900 --> 00:41:50,700 Mary- or from observing the use of others? 1052 00:41:52,170 --> 00:41:53,420 Tamara- That's a good question. 1053 00:41:55,200 --> 00:41:58,660 Tamara- First, I love listening to these two experts because 1054 00:41:59,940 --> 00:42:03,210 Tamara- I'm thinking, oh, we should say this, and they said it, so. 1055 00:42:03,210 --> 00:42:04,920 Tamara- I hear, and I agree. 1056 00:42:04,920 --> 00:42:08,070 Tamara- But I would also just kind of take a different perspective 1057 00:42:08,070 --> 00:42:11,070 Tamara- on the limitations. 1058 00:42:11,070 --> 00:42:13,290 Tamara- And I think that that might be a bit in line 1059 00:42:13,290 --> 00:42:15,753 Tamara- with what Andrew was just saying.
1060 00:42:16,740 --> 00:42:21,330 Tamara- This tool was created by society and for society, 1061 00:42:21,330 --> 00:42:25,590 Tamara- and because of that it has society's limitations, 1062 00:42:25,590 --> 00:42:27,990 Tamara- limitations that we're not proud of 1063 00:42:27,990 --> 00:42:32,283 Tamara- in terms of social justice, in terms of bias, in terms of, 1064 00:42:33,330 --> 00:42:38,280 Tamara- you know, not necessarily exchanging factual information. 1065 00:42:38,280 --> 00:42:40,080 Tamara- And these things exist 1066 00:42:40,080 --> 00:42:42,420 Tamara- and we do need to deal with them anyway. 1067 00:42:42,420 --> 00:42:47,420 Tamara- And so having some sort of prompt literacy nowadays 1068 00:42:47,910 --> 00:42:50,760 Tamara- is just about having social literacy. 1069 00:42:50,760 --> 00:42:54,720 Tamara- It's not something that we can untether from who we are 1070 00:42:54,720 --> 00:42:56,490 Tamara- and what we need our students to be 1071 00:42:56,490 --> 00:43:00,990 Tamara- to be thought leaders and innovators and pioneers 1072 00:43:00,990 --> 00:43:03,480 Tamara- in the future to make a better society. 1073 00:43:03,480 --> 00:43:08,070 Tamara- And so, I think sometimes these tools like generative AI 1074 00:43:08,070 --> 00:43:10,560 Tamara- elicit and bring up all of these issues, 1075 00:43:10,560 --> 00:43:12,450 Tamara- but they didn't create them, 1076 00:43:12,450 --> 00:43:15,480 Tamara- and we certainly need to address them. 1077 00:43:15,480 --> 00:43:20,100 Tamara- And so I look at it as an opportunity to address them. 1078 00:43:20,100 --> 00:43:23,523 Tamara- And you know, I think, one last thing, 1079 00:43:24,960 --> 00:43:29,960 Tamara- I've seen students learn really well from generative AI 1080 00:43:30,360 --> 00:43:32,670 Tamara- when they do take the approach that 1081 00:43:32,670 --> 00:43:36,510 Tamara- this is an initial way to 1082 00:43:36,510 --> 00:43:39,090 Tamara- begin to think about information 1083 00:43:39,090 --> 00:43:41,340 Tamara- and to see what's out there and challenge it 1084 00:43:41,340 --> 00:43:46,340 Tamara- and then do fact-finding or fact-checking on their own. 1085 00:43:46,500 --> 00:43:49,740 Tamara- So I think that if you pair natural skills 1086 00:43:49,740 --> 00:43:51,990 Tamara- valued by higher education and society 1087 00:43:51,990 --> 00:43:54,660 Tamara- with the use of these generative AI tools, 1088 00:43:54,660 --> 00:43:56,910 Tamara- I think you'll get further faster. 1089 00:43:56,910 --> 00:43:58,295 Mary- I love that, and I love the, 1090 00:43:58,295 --> 00:44:00,630 Mary- because this is like the Age of Aquarius, right? 1091 00:44:00,630 --> 00:44:01,950 Mary- What is unknown will be known, 1092 00:44:01,950 --> 00:44:04,650 Mary- and now we have to deal with it. (laughs) 1093 00:44:04,650 --> 00:44:06,600 Mary- Sometimes it's hard to deal with our issues, 1094 00:44:06,600 --> 00:44:07,710 Mary- but it's important that we do, 1095 00:44:07,710 --> 00:44:09,237 Mary- which again goes back to principled innovation 1096 00:44:09,237 --> 00:44:12,510 Mary- and our moral responsibility, right? 1097 00:44:12,510 --> 00:44:13,980 Mary- And then considering the civic duty 1098 00:44:13,980 --> 00:44:16,350 Mary- of understanding our cultures, understanding our students, 1099 00:44:16,350 --> 00:44:17,850 Mary- how they're going to use these systems 1100 00:44:17,850 --> 00:44:19,560 Mary- is extremely important.
1101 00:44:19,560 --> 00:44:21,990 Mary- I will share the principled innovation link 1102 00:44:21,990 --> 00:44:22,823 Mary- in the show notes as well 1103 00:44:22,823 --> 00:44:24,450 Mary- 'cause I think it's a fabulous resource 1104 00:44:24,450 --> 00:44:26,490 Mary- that was created by ASU. 1105 00:44:26,490 --> 00:44:28,350 Mary- It would be great for anyone to read through 1106 00:44:28,350 --> 00:44:31,020 Mary- before you start thinking about how you might use ChatGPT 1107 00:44:31,020 --> 00:44:33,930 Mary- or any other generative AI platform in your course, 1108 00:44:33,930 --> 00:44:35,670 Mary- or integrated at the enterprise level, right? 1109 00:44:35,670 --> 00:44:38,130 Mary- There's a lot of conversations around digital trust 1110 00:44:38,130 --> 00:44:40,980 Mary- and how are we as a large university, 1111 00:44:40,980 --> 00:44:43,320 Mary- one of the largest public universities going to 1112 00:44:43,320 --> 00:44:47,190 Mary- model how this is used outside of our university 1113 00:44:47,190 --> 00:44:48,870 Mary- 'cause it will be influential. 1114 00:44:48,870 --> 00:44:50,550 Andrew- And I would say that 1115 00:44:50,550 --> 00:44:53,430 Andrew- taking that principled innovation approach also begins to 1116 00:44:53,430 --> 00:44:56,610 Andrew- address the elephant in the room around cheating. 1117 00:44:56,610 --> 00:45:00,403 Andrew- And of course, as soon as ChatGPT became a big thing, 1118 00:45:00,403 --> 00:45:03,030 Andrew- there were all these concerns about students 1119 00:45:03,030 --> 00:45:06,570 Andrew- seeing it as a shortcut to surviving and thriving courses 1120 00:45:06,570 --> 00:45:07,980 Andrew- without putting in the effort. 1121 00:45:07,980 --> 00:45:09,720 Andrew- I think the evidence is that 1122 00:45:09,720 --> 00:45:13,470 Andrew- that isn't actually as big an issue as people thought it was 1123 00:45:13,470 --> 00:45:15,299 Andrew- for a number of reasons. 1124 00:45:15,299 --> 00:45:17,203 Andrew- But it also is something that I think 1125 00:45:17,203 --> 00:45:19,587 Andrew- that we're being forced to think about. 1126 00:45:19,587 --> 00:45:23,070 Andrew- And what intrigues me is, it's forcing us as instructors 1127 00:45:23,070 --> 00:45:25,170 Andrew- to think about what we're trying to achieve 1128 00:45:25,170 --> 00:45:27,750 Andrew- through education, and what learning actually means 1129 00:45:27,750 --> 00:45:30,180 Andrew- and what the outcomes are that we want 1130 00:45:30,180 --> 00:45:33,540 Andrew- versus just sort of ticking the boxes as we teach. 1131 00:45:33,540 --> 00:45:35,160 Tamara- I think we'll be able to achieve more 1132 00:45:35,160 --> 00:45:37,200 Tamara- through education because of these tools, 1133 00:45:37,200 --> 00:45:39,573 Tamara- which is very, very interesting. 1134 00:45:42,060 --> 00:45:43,410 Tamara- I don't know about you guys, 1135 00:45:43,410 --> 00:45:47,820 Tamara- but you probably heard concerns about academic integrity 1136 00:45:47,820 --> 00:45:49,980 Tamara- and cheating, and things like that. 
1137 00:45:49,980 --> 00:45:54,420 Tamara- And it's very interesting because they're accurate, 1138 00:45:54,420 --> 00:45:57,003 Tamara- but cheating 1139 00:45:58,410 --> 00:46:00,839 Tamara- and academic integrity issues 1140 00:46:00,839 --> 00:46:03,270 Tamara- did not start with generative AI, 1141 00:46:03,270 --> 00:46:05,580 Tamara- they just could be exacerbated by it 1142 00:46:05,580 --> 00:46:08,973 Tamara- if we don't teach our students correct use of tools. 1143 00:46:10,170 --> 00:46:13,560 Tamara- And so I kind of wonder what everyone thinks, 1144 00:46:13,560 --> 00:46:17,820 Tamara- Mary, you too, about when it comes to addressing 1145 00:46:17,820 --> 00:46:19,923 Tamara- academic integrity concerns, 1146 00:46:21,758 --> 00:46:26,758 Tamara- how you teach students and how you help faculty understand 1147 00:46:26,760 --> 00:46:29,130 Tamara- a proactive way to address these things. 1148 00:46:29,130 --> 00:46:29,963 Mary- I guess I'll start with the 1149 00:46:29,963 --> 00:46:32,700 Mary- "Cheaters Never Prosper" pamphlet that we have online, 1150 00:46:32,700 --> 00:46:34,527 Mary- which we'll also share. 1151 00:46:34,527 --> 00:46:36,030 Mary- And that really gets down to, 1152 00:46:36,030 --> 00:46:37,290 Mary- where there's a will, there's a way. 1153 00:46:37,290 --> 00:46:40,890 Mary- So don't focus on those who have the will necessarily. 1154 00:46:40,890 --> 00:46:44,040 Mary- Maybe figure out why they have that will. 1155 00:46:44,040 --> 00:46:46,890 Mary- Is it too many assignments or, you know, cognitive overload? 1156 00:46:46,890 --> 00:46:50,310 Andrew- See, and that to me is critical. 1157 00:46:50,310 --> 00:46:54,570 Andrew- Again, if we think about education as a series of 1158 00:46:54,570 --> 00:46:56,040 Andrew- rewards and punishments, 1159 00:46:56,040 --> 00:46:58,320 Andrew- of course people are gonna work out 1160 00:46:58,320 --> 00:47:00,180 Andrew- how to get the rewards and avoid the punishments, 1161 00:47:00,180 --> 00:47:02,250 Andrew- it's a really bad way of teaching. 1162 00:47:02,250 --> 00:47:04,620 Andrew- On the other hand, if we think about 1163 00:47:04,620 --> 00:47:09,620 Andrew- what learning are we trying to give our students, 1164 00:47:09,690 --> 00:47:11,760 Andrew- what do we want them to be able to do at the end of it, 1165 00:47:11,760 --> 00:47:15,240 Andrew- that opens up the pathway to seeing the way forward 1166 00:47:15,240 --> 00:47:18,060 Andrew- without thinking about how are we gonna punish them 1167 00:47:18,060 --> 00:47:20,040 Andrew- if they don't do what we want? 1168 00:47:20,040 --> 00:47:22,800 Andrew- And what we are finding, not only myself, 1169 00:47:22,800 --> 00:47:26,040 Andrew- but others with the use of generative AI, 1170 00:47:26,040 --> 00:47:29,220 Andrew- is if you create a pathway toward learning 1171 00:47:29,220 --> 00:47:32,912 Andrew- where it's actually easier for them to do the learning 1172 00:47:32,912 --> 00:47:36,330 Andrew- than it is to use generative AI to cheat, 1173 00:47:36,330 --> 00:47:37,710 Andrew- it's a win-win situation.
1174 00:47:37,710 --> 00:47:39,197 Andrew- And just to give you an example of that, 1175 00:47:39,197 --> 00:47:40,710 Andrew- in the ChatGPT course, 1176 00:47:40,710 --> 00:47:43,590 Andrew- most of the stuff was based in ChatGPT 1177 00:47:43,590 --> 00:47:46,590 Andrew- but we had a few assignments where students either had to 1178 00:47:46,590 --> 00:47:49,710 Andrew- write reflections or create videos. 1179 00:47:49,710 --> 00:47:51,510 Andrew- And I was worried at one point that 1180 00:47:51,510 --> 00:47:53,790 Andrew- because they were well versed in ChatGPT 1181 00:47:53,790 --> 00:47:55,530 Andrew- I was gonna have a load of written reflections 1182 00:47:55,530 --> 00:47:57,870 Andrew- that were just generated by ChatGPT. 1183 00:47:57,870 --> 00:47:59,580 Andrew- No, not at all. 1184 00:47:59,580 --> 00:48:02,010 Andrew- And I didn't, and I'd love to sort of see 1185 00:48:02,010 --> 00:48:04,890 Andrew- whether this matches sort of what it was like as a student. 1186 00:48:04,890 --> 00:48:08,160 Andrew- But my sense was students wanted to have a voice 1187 00:48:08,160 --> 00:48:10,350 Andrew- and they wanted, when they were writing their reflections, 1188 00:48:10,350 --> 00:48:12,170 Andrew- for those reflections to reflect 1189 00:48:12,170 --> 00:48:13,710 Andrew- what was going on in their head, 1190 00:48:13,710 --> 00:48:15,750 Andrew- they didn't want a machine to do that for them. 1191 00:48:15,750 --> 00:48:18,540 Andrew- So the path of least resistance was for them 1192 00:48:18,540 --> 00:48:23,040 Andrew- not to use ChatGPT, and that worked incredibly well. 1193 00:48:23,040 --> 00:48:25,261 Mary- And that kind of gets to a resource 1194 00:48:25,261 --> 00:48:27,360 Mary- that Tamara and some of our other colleagues 1195 00:48:27,360 --> 00:48:29,430 Mary- very intentionally produced this last session, 1196 00:48:29,430 --> 00:48:31,950 Mary- and I think is actually in the Generative AI course. 1197 00:48:31,950 --> 00:48:34,590 Mary- Tamara, do you mind talking about that? 1198 00:48:34,590 --> 00:48:38,700 Tamara- Oh, it's the academic integrity risk reduction guide. 1199 00:48:38,700 --> 00:48:43,560 Tamara- And it's a process and it's a series of questions 1200 00:48:43,560 --> 00:48:48,560 Tamara- that help you to determine what the risks are in your course 1201 00:48:49,050 --> 00:48:52,560 Tamara- and how to use course design decisions to address them 1202 00:48:52,560 --> 00:48:54,513 Tamara- even before students enter the course. 1203 00:48:56,589 --> 00:48:59,820 Tamara- We have received feedback that it's quite long, 1204 00:48:59,820 --> 00:49:03,120 Tamara- so we need to emphasize that it's a-- 1205 00:49:03,120 --> 00:49:04,320 Mary- There's a lot to talk about. (group chuckles) 1206 00:49:04,320 --> 00:49:05,850 Andrew- Did you have people cheating 1207 00:49:05,850 --> 00:49:07,370 Andrew- to try and get to the end of it? 1208 00:49:07,370 --> 00:49:08,610 Tamara- So for the next two hours 1209 00:49:08,610 --> 00:49:09,840 Tamara- we'll be talking about this guide. 1210 00:49:09,840 --> 00:49:11,026 Tamara- No, I'm just kidding. 
1211 00:49:11,026 --> 00:49:12,872 (group chuckles) 1212 00:49:12,872 --> 00:49:17,010 Tamara- No, we do like to say that this is a reference resource 1213 00:49:17,010 --> 00:49:18,450 Tamara- the same way you wouldn't sit down 1214 00:49:18,450 --> 00:49:21,180 Tamara- and you wouldn't read the dictionary from cover to cover, 1215 00:49:21,180 --> 00:49:22,680 Tamara- you would look in alphabetical order 1216 00:49:22,680 --> 00:49:24,120 Tamara- and find the word that you need, 1217 00:49:24,120 --> 00:49:26,100 Tamara- learn the definition and move on, 1218 00:49:26,100 --> 00:49:29,160 Tamara- this guide was intended to be comprehensive in a way 1219 00:49:29,160 --> 00:49:32,550 Tamara- that would allow you to look at a categorical issue 1220 00:49:32,550 --> 00:49:34,950 Tamara- that you're expecting, experiencing, 1221 00:49:34,950 --> 00:49:36,750 Tamara- ask yourself a question, 1222 00:49:36,750 --> 00:49:39,630 Tamara- and then have probably 10 different solutions 1223 00:49:39,630 --> 00:49:41,880 Tamara- for addressing that one single issue 1224 00:49:41,880 --> 00:49:44,010 Tamara- that you could easily try in your course. 1225 00:49:44,010 --> 00:49:45,840 Tamara- And then you can kind of use that 1226 00:49:45,840 --> 00:49:48,690 Tamara- as almost like the scientific method. 1227 00:49:48,690 --> 00:49:51,060 Tamara- You've got a hypothesis, you think this is the issue, 1228 00:49:51,060 --> 00:49:53,850 Tamara- implement this one strategy, does it address the issue? 1229 00:49:53,850 --> 00:49:55,080 Tamara- Does it not address the issue? 1230 00:49:55,080 --> 00:49:56,550 Tamara- It did, that's fantastic. 1231 00:49:56,550 --> 00:49:59,670 Tamara- So it's a wonderful tool 1232 00:49:59,670 --> 00:50:02,640 Tamara- because it assumes that we are constantly learning 1233 00:50:02,640 --> 00:50:04,920 Tamara- and that our courses are constantly changing 1234 00:50:04,920 --> 00:50:07,050 Tamara- and the needs of our learners are changing. 1235 00:50:07,050 --> 00:50:11,190 Tamara- And so to make course revisions to help our learners to be 1236 00:50:11,190 --> 00:50:13,920 Tamara- more engaged in the course studies 1237 00:50:13,920 --> 00:50:16,680 Tamara- and reduce academic integrity risks 1238 00:50:16,680 --> 00:50:18,573 Tamara- is the whole goal of the guide, so. 1239 00:50:19,410 --> 00:50:20,243 Tamara- And then at the end it teaches you-- 1240 00:50:20,243 --> 00:50:22,020 Mary- I do wanna plug those six categories too, 1241 00:50:22,020 --> 00:50:24,120 Mary- because that, I think if people know like, 1242 00:50:24,120 --> 00:50:26,310 Mary- this is what you cover, they might just go to that category 1243 00:50:26,310 --> 00:50:27,870 Mary- and then it's not as overwhelming. 1244 00:50:27,870 --> 00:50:29,610 Tamara- That's very true, that's very true. 1245 00:50:29,610 --> 00:50:32,340 Tamara- Do you want me to plug the six categories, or? 1246 00:50:32,340 --> 00:50:34,830 Mary- Absolutely, or I have 'em up I can read 'em, either way. 1247 00:50:34,830 --> 00:50:36,540 Tamara- Yeah, do you wanna read 'em? That's fine. 
1248 00:50:36,540 --> 00:50:38,940 Mary- - So the first one's course materials, 1249 00:50:38,940 --> 00:50:40,680 Mary- which I think everybody's a little bit worried about 1250 00:50:40,680 --> 00:50:42,390 Mary- with generative AI, 1251 00:50:42,390 --> 00:50:45,300 Mary- course tools, course activities and collaboration, 1252 00:50:45,300 --> 00:50:48,000 Mary- assessment design, that'll be the most visited space, 1253 00:50:48,000 --> 00:50:49,920 Mary- I have a feeling, I wish we could track that. 1254 00:50:49,920 --> 00:50:51,570 Mary- Course expectations and instructions, 1255 00:50:51,570 --> 00:50:53,160 Mary- and then feedback and student support. 1256 00:50:53,160 --> 00:50:54,870 Mary- So you were really very comprehensive 1257 00:50:54,870 --> 00:50:57,630 Mary- in your consideration of all of the different ways 1258 00:50:57,630 --> 00:50:59,040 Mary- that generative AI, 1259 00:50:59,040 --> 00:50:59,970 Mary- well, not just generative AI, 1260 00:50:59,970 --> 00:51:01,560 Mary- 'cause this is not just generative AI focused, 1261 00:51:01,560 --> 00:51:02,730 Mary- I think we should say that as well, 1262 00:51:02,730 --> 00:51:06,360 Mary- but many ways that academic dishonesty can show up 1263 00:51:06,360 --> 00:51:08,730 Mary- in a course, very intentional. 1264 00:51:08,730 --> 00:51:11,760 Tamara- And that it can be exacerbated by using 1265 00:51:11,760 --> 00:51:13,860 Tamara- higher level technology if you're not careful. 1266 00:51:13,860 --> 00:51:17,130 Tamara- So definitely the consideration was for first and foremost 1267 00:51:17,130 --> 00:51:19,260 Tamara- the academic integrity overall 1268 00:51:19,260 --> 00:51:22,140 Tamara- and then also addressing it with technology, 1269 00:51:22,140 --> 00:51:24,300 Tamara- which is nice because the last part of the guide is 1270 00:51:24,300 --> 00:51:28,680 Tamara- specifically designed to harness the power of generative AI. 1271 00:51:28,680 --> 00:51:31,530 Tamara- I think we've got an AI versus AI webinar coming up, 1272 00:51:31,530 --> 00:51:35,700 Tamara- and I think that the idea that you can fight AI with AI 1273 00:51:35,700 --> 00:51:36,870 Tamara- by using it properly, 1274 00:51:36,870 --> 00:51:39,450 Tamara- I think that's kind of a nice approach. 1275 00:51:39,450 --> 00:51:41,610 Mary- And it's possible because of the magic of podcasting 1276 00:51:41,610 --> 00:51:44,220 Mary- that this podcast will come out after that webinar, 1277 00:51:44,220 --> 00:51:46,650 Mary- but if you do want access to that, 1278 00:51:46,650 --> 00:51:48,540 Mary- just reach out, we can get you access. 1279 00:51:48,540 --> 00:51:50,340 Tamara- Sorry. (chuckles) 1280 00:51:50,340 --> 00:51:51,360 Mary- No, it's fine. 1281 00:51:51,360 --> 00:51:52,770 Mary- I'm excited for your webinar.
1282 00:51:52,770 --> 00:51:54,480 Mary- It's gonna be a fabulous opportunity 1283 00:51:54,480 --> 00:51:57,540 Mary- to have conversations with faculty on the real concerns 1284 00:51:57,540 --> 00:51:58,500 Mary- that are happening, 1285 00:51:58,500 --> 00:52:01,470 Mary- but also talk about how to use this risk reduction guide 1286 00:52:01,470 --> 00:52:04,560 Mary- in a very meaningful way to help alleviate concerns 1287 00:52:04,560 --> 00:52:06,450 Mary- and maybe put people in that explorative, 1288 00:52:06,450 --> 00:52:10,620 Mary- experimentive, curiosity space around how 1289 00:52:10,620 --> 00:52:13,740 Mary- these generative platforms are going to change our society 1290 00:52:13,740 --> 00:52:15,480 Mary- and therefore change the way our disciplines 1291 00:52:15,480 --> 00:52:17,010 Mary- will interact with society 1292 00:52:17,010 --> 00:52:18,630 Mary- and what we need for student readiness 1293 00:52:18,630 --> 00:52:20,280 Mary- before they leave our university. 1294 00:52:23,370 --> 00:52:25,650 Ricardo- I was really like on the edge of my seat the whole time, 1295 00:52:25,650 --> 00:52:28,290 Ricardo- waiting for the negative impacts. 1296 00:52:28,290 --> 00:52:30,930 Ricardo- From your perspectives and your experience 1297 00:52:30,930 --> 00:52:31,763 Ricardo- working with these things, 1298 00:52:31,763 --> 00:52:33,510 Ricardo- what do you guys anticipate would be the, 1299 00:52:33,510 --> 00:52:35,880 Ricardo- 'cause I feel like this course, 1300 00:52:35,880 --> 00:52:38,910 Ricardo- the Generative AI course, was a response to 1301 00:52:38,910 --> 00:52:42,480 Ricardo- faculty kind of being worried about the kind of 1302 00:52:42,480 --> 00:52:44,820 Ricardo- negative impacts of this technology. 1303 00:52:44,820 --> 00:52:47,310 Liz- Yeah, I think the big surprising part, 1304 00:52:47,310 --> 00:52:50,610 Liz- not only about using and exploring ChatGPT, 1305 00:52:50,610 --> 00:52:54,090 Liz- but also this conversation was that there wasn't actually, 1306 00:52:54,090 --> 00:52:56,217 Liz- there was a lot of hesitation going into it 1307 00:52:56,217 --> 00:52:59,130 Liz- and into explorations with ChatGPT. 1308 00:52:59,130 --> 00:53:03,960 Liz- There was kind of an initial response to ignore, avoid, 1309 00:53:03,960 --> 00:53:07,920 Liz- try to shun the robots from coming as a part of our 1310 00:53:07,920 --> 00:53:09,780 Liz- higher education landscape. 1311 00:53:09,780 --> 00:53:12,270 Liz- But I think everyone actually found that 1312 00:53:12,270 --> 00:53:15,510 Liz- it was more helpful, that it was actually, 1313 00:53:15,510 --> 00:53:18,420 Liz- that students were using it for good, not for evil, 1314 00:53:18,420 --> 00:53:20,430 Liz- they were using it to get ideas. 1315 00:53:20,430 --> 00:53:24,330 Liz- And that afterwards when reflecting that they were 1316 00:53:24,330 --> 00:53:27,690 Liz- more excited to share their own thoughts and opinions 1317 00:53:27,690 --> 00:53:30,690 Liz- and they weren't actually using ChatGPT 1318 00:53:30,690 --> 00:53:33,240 Liz- for academic dishonesty purposes, 1319 00:53:33,240 --> 00:53:37,680 Liz- and that it's more of a help than a harm in a lot of ways. 1320 00:53:37,680 --> 00:53:40,410 Mary- And honestly, early on we were hearing from faculty that 1321 00:53:40,410 --> 00:53:42,273 Mary- those students who were using ChatGPT 1322 00:53:42,273 --> 00:53:44,100 Mary- were actually telling their faculty members, 1323 00:53:44,100 --> 00:53:45,390 Mary- hey, I use this, is that okay? 
1324 00:53:45,390 --> 00:53:47,760 Mary- And some of them are like, I don't know if it's okay. 1325 00:53:47,760 --> 00:53:52,200 Mary- But it really comes down to instructor decision making. 1326 00:53:52,200 --> 00:53:54,060 Mary- And I think that's what I really love about 1327 00:53:54,060 --> 00:53:54,972 Mary- Arizona State's model here is that, 1328 00:53:54,972 --> 00:53:57,540 Mary- yes, there's some syllabus language 1329 00:53:57,540 --> 00:53:59,640 Mary- that the provost's office was able to provide, 1330 00:53:59,640 --> 00:54:01,260 Mary- yes, there's this Generative AI course, 1331 00:54:01,260 --> 00:54:04,380 Mary- but it really takes deep internal reflection on 1332 00:54:04,380 --> 00:54:06,595 Mary- what does this new way that 1333 00:54:06,595 --> 00:54:09,390 Mary- society interacts with technology mean 1334 00:54:09,390 --> 00:54:10,590 Mary- to all kinds of spaces, 1335 00:54:10,590 --> 00:54:12,480 Mary- the disciplines that we teach, 1336 00:54:12,480 --> 00:54:13,800 Mary- the jobs that are being created, 1337 00:54:13,800 --> 00:54:15,694 Mary- the jobs that are potentially disappearing 1338 00:54:15,694 --> 00:54:19,260 Mary- or changing dramatically because the mundane work 1339 00:54:19,260 --> 00:54:21,240 Mary- can be done in different ways. 1340 00:54:21,240 --> 00:54:22,560 Mary- And it's gonna make us better, 1341 00:54:22,560 --> 00:54:24,840 Mary- like I think that was actually said in the conversation, 1342 00:54:24,840 --> 00:54:26,040 Mary- it's gonna make us better, 1343 00:54:26,040 --> 00:54:28,410 Mary- but primarily the things that I'm hearing about 1344 00:54:28,410 --> 00:54:31,320 Mary- that are worrisome are the ways that the models 1345 00:54:31,320 --> 00:54:32,760 Mary- have been fed information. 1346 00:54:32,760 --> 00:54:33,780 Mary- So one of the examples 1347 00:54:33,780 --> 00:54:35,370 Mary- in the generative AI InScribe course, 1348 00:54:35,370 --> 00:54:37,110 Mary- so if you haven't gone in, you can see this, 1349 00:54:37,110 --> 00:54:39,660 Mary- was an example of a woman who was like, 1350 00:54:39,660 --> 00:54:42,270 Mary- I need an image of a professional. 1351 00:54:42,270 --> 00:54:43,920 Mary- And it ended up being a white woman, 1352 00:54:43,920 --> 00:54:46,620 Mary- even though the person prompting it was of Asian descent. 1353 00:54:46,620 --> 00:54:50,040 Mary- And so it's like, that is a concern that our models, 1354 00:54:50,040 --> 00:54:51,750 Mary- especially our image models, 1355 00:54:51,750 --> 00:54:55,260 Mary- have limited data based on biased perspectives 1356 00:54:55,260 --> 00:54:58,170 Mary- of what we think certain people look like. 1357 00:54:58,170 --> 00:55:01,530 Mary- And so feeding those models a more diverse set 1358 00:55:01,530 --> 00:55:02,490 Mary- is an important thing. 1359 00:55:02,490 --> 00:55:04,590 Mary- So having the opportunity to feed the models 1360 00:55:04,590 --> 00:55:07,530 Mary- more intentionally, curate the models more intentionally. 1361 00:55:07,530 --> 00:55:09,180 Mary- If we get access that way, 1362 00:55:09,180 --> 00:55:10,470 Mary- I think it's gonna change the game 1363 00:55:10,470 --> 00:55:12,870 Mary- and it also will change, and this was said too, 1364 00:55:12,870 --> 00:55:16,050 Mary- the social dynamics of like how we see things 1365 00:55:16,050 --> 00:55:17,850 Mary- and correcting the historical way 1366 00:55:17,850 --> 00:55:19,590 Mary- of how things were reported.
1367 00:55:19,590 --> 00:55:20,599 Liz- I think that's so interesting 1368 00:55:20,599 --> 00:55:23,610 Liz- 'cause that kind of goes back to good prompt engineering. 1369 00:55:23,610 --> 00:55:25,740 Liz- We need to feed the information 1370 00:55:25,740 --> 00:55:27,480 Liz- so that we can give better prompts 1371 00:55:27,480 --> 00:55:31,560 Liz- and feed the robot with more information 1372 00:55:31,560 --> 00:55:34,590 Liz- so that we get better, more diverse results 1373 00:55:34,590 --> 00:55:37,650 Liz- that are actually reflective of real people 1374 00:55:37,650 --> 00:55:40,260 Liz- versus what the internet thinks it is. 1375 00:55:40,260 --> 00:55:41,160 Mary- And real occurrences, 1376 00:55:41,160 --> 00:55:43,140 Mary- because it will give you a citation if you ask for it, 1377 00:55:43,140 --> 00:55:47,400 Mary- but it's very likely and almost always not accurate. 1378 00:55:47,400 --> 00:55:49,080 Mary- We had this example in the 1379 00:55:49,080 --> 00:55:51,390 Mary- School of Historical, Philosophical, and Religious Studies. 1380 00:55:51,390 --> 00:55:52,770 Mary- Jeff Watson came out of that college, 1381 00:55:52,770 --> 00:55:55,590 Mary- he also gave a lot of content from a conversation 1382 00:55:55,590 --> 00:55:57,540 Mary- he had with DeAnna in the Generative AI course. 1383 00:55:57,540 --> 00:56:00,030 Mary- But he and many others at his school 1384 00:56:00,030 --> 00:56:02,520 Mary- hosted this session last semester. 1385 00:56:02,520 --> 00:56:05,760 Mary- And one of their examples for their fellow peers 1386 00:56:05,760 --> 00:56:07,560 Mary- that were in attendance was, 1387 00:56:07,560 --> 00:56:09,090 Mary- one of the guys who was in there, 1388 00:56:09,090 --> 00:56:11,760 Mary- he writes on a specific topic and they were like, 1389 00:56:11,760 --> 00:56:13,830 Mary- find one of the books from this individual. 1390 00:56:13,830 --> 00:56:15,840 Mary- And then it came up with a title 1391 00:56:15,840 --> 00:56:17,640 Mary- that seemed very relevant to his study. 1392 00:56:17,640 --> 00:56:18,817 Mary- And he's sitting in the room and he is like, 1393 00:56:18,817 --> 00:56:21,510 Mary- "Wow, that sounds like a book I would've written, 1394 00:56:21,510 --> 00:56:23,130 Mary- but I did not write that book." 1395 00:56:23,130 --> 00:56:24,540 Mary- And it was just such a good moment 1396 00:56:24,540 --> 00:56:26,100 Mary- for everyone to go "yikes." 1397 00:56:26,100 --> 00:56:29,310 Mary- So citations is a place where it will not do well. 1398 00:56:29,310 --> 00:56:32,040 Mary- And if we could again, curate intentionally 1399 00:56:32,040 --> 00:56:35,190 Mary- and provide citations, then what a great collegial resource 1400 00:56:35,190 --> 00:56:37,290 Mary- these kinds of experiences could be 1401 00:56:37,290 --> 00:56:39,780 Mary- if that's how the models were intended to work. 1402 00:56:39,780 --> 00:56:41,340 Liz- Yeah, I think it's a great stepping stone 1403 00:56:41,340 --> 00:56:43,560 Liz- for hesitant students. 
1404 00:56:43,560 --> 00:56:46,110 Liz- I think a lot of people in our conversations, 1405 00:56:46,110 --> 00:56:48,120 Liz- a lot of people that I've talked to 1406 00:56:48,120 --> 00:56:49,710 Liz- outside of our podcast today 1407 00:56:49,710 --> 00:56:52,020 Liz- are actually more excited to integrate it 1408 00:56:52,020 --> 00:56:53,700 Liz- and tell students how to use it 1409 00:56:53,700 --> 00:56:56,290 Liz- and almost train students on the best way to utilize 1410 00:56:56,290 --> 00:57:00,330 Liz- the tool while also maintaining academic integrity. 1411 00:57:00,330 --> 00:57:03,150 Liz- So I think that the more people look into it, 1412 00:57:03,150 --> 00:57:05,910 Liz- the fewer concerns they have about academic integrity, 1413 00:57:05,910 --> 00:57:09,210 Liz- and the more they're excited to see 1414 00:57:09,210 --> 00:57:12,273 Liz- how this can help students be better. 1415 00:57:14,070 --> 00:57:16,655 Tamara- We've gotten a lot of good responses from faculty. 1416 00:57:16,655 --> 00:57:19,080 Tamara- Some of them have been through our survey 1417 00:57:19,080 --> 00:57:21,690 Tamara- with high levels of numbers, you know, 1418 00:57:21,690 --> 00:57:24,600 Tamara- out of one to four, rate this, with four being the highest. 1419 00:57:24,600 --> 00:57:26,820 Tamara- We have so many fours, which is wonderful, 1420 00:57:26,820 --> 00:57:28,920 Tamara- and so much good constructive feedback. 1421 00:57:28,920 --> 00:57:30,930 Tamara- But then we also have things being shared 1422 00:57:30,930 --> 00:57:32,550 Tamara- outside of the university. 1423 00:57:32,550 --> 00:57:34,830 Tamara- And this is one of my favorite quotes 1424 00:57:34,830 --> 00:57:37,290 Tamara- that's being shared publicly outside the university. 1425 00:57:37,290 --> 00:57:39,067 Tamara- It's from a professor who said, 1426 00:57:39,067 --> 00:57:41,430 Tamara- "I have to admit that as someone who has experienced 1427 00:57:41,430 --> 00:57:43,140 Tamara- academic integrity issues, 1428 00:57:43,140 --> 00:57:44,460 Tamara- I had more of a negative feeling 1429 00:57:44,460 --> 00:57:46,320 Tamara- toward generative AI at first. 1430 00:57:46,320 --> 00:57:50,730 Tamara- And since I've been gone, I felt overwhelmed 1431 00:57:50,730 --> 00:57:51,990 Tamara- in terms of teaching." 1432 00:57:51,990 --> 00:57:53,730 Tamara- And she said, "The only thing I did know was that 1433 00:57:53,730 --> 00:57:56,760 Tamara- generative AI was transforming higher education, 1434 00:57:56,760 --> 00:57:58,980 Tamara- which was the most powerful transformation after that 1435 00:57:58,980 --> 00:58:00,240 Tamara- due to COVID-19.
1436 00:58:00,240 --> 00:58:02,010 Tamara- However, after going through this course 1437 00:58:02,010 --> 00:58:04,170 Tamara- and many other resources and discussions, 1438 00:58:04,170 --> 00:58:07,230 Tamara- I experienced how powerful generative AI can be 1439 00:58:07,230 --> 00:58:09,780 Tamara- and already have some ideas to not only integrate it 1440 00:58:09,780 --> 00:58:12,990 Tamara- into the curriculum, emphasizing student critical thinking 1441 00:58:12,990 --> 00:58:14,910 Tamara- and the iterative and collaborative process 1442 00:58:14,910 --> 00:58:17,670 Tamara- with generative AI, but also to utilize it 1443 00:58:17,670 --> 00:58:20,100 Tamara- in various aspects of my own teaching 1444 00:58:20,100 --> 00:58:21,960 Tamara- from developing and refining my curriculum 1445 00:58:21,960 --> 00:58:23,460 Tamara- and designing assessments, 1446 00:58:23,460 --> 00:58:26,070 Tamara- to drafting messages and feedback to students, 1447 00:58:26,070 --> 00:58:28,260 Tamara- to generating images for slides. 1448 00:58:28,260 --> 00:58:31,200 Tamara- So now I'm really very excited about the potential 1449 00:58:31,200 --> 00:58:33,210 Tamara- and all the opportunities that it brings." 1450 00:58:33,210 --> 00:58:34,950 Tamara- And so that to me, 1451 00:58:34,950 --> 00:58:37,440 Tamara- if we want one person to get something out of the course, 1452 00:58:37,440 --> 00:58:39,510 Tamara- that's what we hope they'll get out of it, 1453 00:58:39,510 --> 00:58:42,570 Tamara- that generative AI is something to address and be aware of, 1454 00:58:42,570 --> 00:58:44,160 Tamara- but it's exciting. 1455 00:58:44,160 --> 00:58:47,040 Mary- Well, I do wanna say thank you for the opportunity 1456 00:58:47,040 --> 00:58:48,390 Mary- to talk to everyone. 1457 00:58:48,390 --> 00:58:51,210 Mary- Before we go, does anyone have anything they wanna plug? 1458 00:58:51,210 --> 00:58:54,090 Mary- We have the library guide, is there a short link for that? 1459 00:58:54,090 --> 00:58:55,380 Mary Ann- Not that I can think of right now. 1460 00:58:55,380 --> 00:58:56,340 Mary- Okay, that's okay. 1461 00:58:56,340 --> 00:58:57,600 Mary- We're gonna put it in the show notes. 1462 00:58:57,600 --> 00:58:59,359 Mary- The short link is, go to Teach Online. 1463 00:58:59,359 --> 00:59:01,140 (group chuckles) 1464 00:59:01,140 --> 00:59:01,980 Mary- Andrew, how about you? 1465 00:59:01,980 --> 00:59:03,090 Mary- You have a lot of things to plug. 1466 00:59:03,090 --> 00:59:04,290 Mary- Please, the whole list. Andrew- Oh, goodness me. 1467 00:59:04,290 --> 00:59:08,010 Andrew- Yeah, so if anybody is interested in the exercises 1468 00:59:08,010 --> 00:59:10,500 Andrew- we do in the ChatGPT course, 1469 00:59:10,500 --> 00:59:13,290 Andrew- most of them are available on my website, 1470 00:59:13,290 --> 00:59:15,660 Andrew- which is just andrewmaynard.net. 1471 00:59:15,660 --> 00:59:17,280 Andrew- So you can go and check them out there. 1472 00:59:17,280 --> 00:59:18,807 Andrew- But the other thing I would say, 1473 00:59:18,807 --> 00:59:21,960 Andrew- and you see this through all of my writings around 1474 00:59:21,960 --> 00:59:23,970 Andrew- ChatGPT and generative AI, 1475 00:59:23,970 --> 00:59:26,460 Andrew- is if you think you know what it's about 1476 00:59:26,460 --> 00:59:28,680 Andrew- but you've never tried it, you don't.
1477 00:59:28,680 --> 00:59:31,650 Andrew- You really have to experience it before you can understand 1478 00:59:31,650 --> 00:59:34,563 Andrew- what its benefits and power and limitations are. 1479 00:59:35,460 --> 00:59:37,110 Mary- That is a very good call to action 1480 00:59:37,110 --> 00:59:38,580 Mary- for those who are listening that are like, 1481 00:59:38,580 --> 00:59:39,810 Mary- I was just listening to hear that 1482 00:59:39,810 --> 00:59:41,360 Mary- I shouldn't use it in my class. 1483 00:59:42,840 --> 00:59:45,030 Mary- But you did enjoy your student-based class. 1484 00:59:45,030 --> 00:59:46,500 Mary- Are you offering it again? 1485 00:59:46,500 --> 00:59:49,773 Andrew- Oh yes, we're offering it in the fall, Fall B. 1486 00:59:50,670 --> 00:59:53,340 Andrew- It was a wonderful experience. 1487 00:59:53,340 --> 00:59:56,490 Andrew- Not only important, but it surpassed my expectations 1488 00:59:56,490 --> 00:59:58,500 Andrew- in terms of what the students learned 1489 00:59:58,500 --> 01:00:00,267 Andrew- and what they came away being able to do. 1490 01:00:00,267 --> 01:00:01,860 Mary- Are you gonna change anything this time 1491 01:00:01,860 --> 01:00:02,693 Mary- that you learned from last time? 1492 01:00:02,693 --> 01:00:03,982 Andrew- Yeah, there were a couple of assignments 1493 01:00:03,982 --> 01:00:06,030 Andrew- that didn't work so well. 1494 01:00:06,030 --> 01:00:09,630 Andrew- Multi-stage prompt templates was a disaster. 1495 01:00:09,630 --> 01:00:11,370 (group laughs) 1496 01:00:11,370 --> 01:00:14,040 Andrew- We're gonna be changing that assignment. 1497 01:00:14,040 --> 01:00:16,110 Mary- And then Tamara, I know that you have 1498 01:00:16,110 --> 01:00:16,943 Mary- a little bit to plug, 1499 01:00:16,943 --> 01:00:18,870 Mary- we've already talked about the risk reduction guide. 1500 01:00:18,870 --> 01:00:19,952 Mary- But anything else to say as well 1501 01:00:19,952 --> 01:00:21,570 Mary- in reference to your experience 1502 01:00:21,570 --> 01:00:24,120 Mary- in building the Generative AI course for faculty 1503 01:00:24,120 --> 01:00:25,950 Mary- and staff at ASU? 1504 01:00:25,950 --> 01:00:28,860 Tamara- I think I'll just kind of describe it really quickly 1505 01:00:28,860 --> 01:00:30,300 Tamara- and what it was. 1506 01:00:30,300 --> 01:00:33,120 Tamara- The Teaching and Learning with Generative AI course 1507 01:00:33,120 --> 01:00:37,690 Tamara- was a way to gather some of the most 1508 01:00:38,790 --> 01:00:43,790 Tamara- innovative folks at ASU who are using generative AI 1509 01:00:43,980 --> 01:00:45,270 Tamara- effectively in their courses 1510 01:00:45,270 --> 01:00:47,640 Tamara- and experimenting with generative AI 1511 01:00:47,640 --> 01:00:50,550 Tamara- and put that expertise into one course 1512 01:00:50,550 --> 01:00:55,440 Tamara- to help faculty and staff know how to not only 1513 01:00:55,440 --> 01:00:56,700 Tamara- harness the power of AI, 1514 01:00:56,700 --> 01:01:00,390 Tamara- but also address issues that might arise with AI. 1515 01:01:00,390 --> 01:01:03,180 Tamara- As we said, Andrew is a wonderful voice 1516 01:01:03,180 --> 01:01:04,800 Tamara- throughout the modules in the course, 1517 01:01:04,800 --> 01:01:06,750 Tamara- he's almost this guiding voice. 1518 01:01:06,750 --> 01:01:10,230 Tamara- And then we have so many of our other faculty experts 1519 01:01:10,230 --> 01:01:13,050 Tamara- as well as staff experts who've contributed to this. 
1520 01:01:13,050 --> 01:01:16,500 Tamara- And I love this part about the course, 1521 01:01:16,500 --> 01:01:18,510 Tamara- one of the things that I love about the course is that 1522 01:01:18,510 --> 01:01:22,920 Tamara- it is a catalyst for other work to be able to showcase it. 1523 01:01:22,920 --> 01:01:26,730 Tamara- But also we have this wonderful page 1524 01:01:26,730 --> 01:01:29,910 Tamara- that talks about accessibility related to generative AI 1525 01:01:29,910 --> 01:01:31,920 Tamara- on the university website. 1526 01:01:31,920 --> 01:01:36,920 Tamara- Because of this course, we have tools that are ASU-created 1527 01:01:37,110 --> 01:01:40,410 Tamara- that have passed our VITRA security review process 1528 01:01:40,410 --> 01:01:42,930 Tamara- that leveraged the powers of ChatGPT 1529 01:01:42,930 --> 01:01:46,950 Tamara- that were created and showcased. 1530 01:01:46,950 --> 01:01:48,270 Tamara- They weren't created because of this course, 1531 01:01:48,270 --> 01:01:49,590 Tamara- but they were showcased. 1532 01:01:49,590 --> 01:01:50,517 Tamara- We've got wonderful syllabus statements. 1533 01:01:50,517 --> 01:01:52,590 Mary- Would you mind naming what those are, by the way? 1534 01:01:52,590 --> 01:01:54,577 Mary- Sorry to interrupt you, but those are really great tools. 1535 01:01:54,577 --> 01:01:56,250 Mary- What are those tools that were shared? 1536 01:01:56,250 --> 01:01:57,390 Tamara- Those are incredible. 1537 01:01:57,390 --> 01:01:59,460 Tamara- We've got the ClipGist, 1538 01:01:59,460 --> 01:02:02,850 Tamara- which takes the transcripts of a video 1539 01:02:02,850 --> 01:02:07,440 Tamara- and summarizes it and actually asks questions. 1540 01:02:07,440 --> 01:02:10,000 Tamara- So if you have an interactive video tool and 1541 01:02:11,280 --> 01:02:14,400 Tamara- you want to quickly make some interactive questions 1542 01:02:14,400 --> 01:02:16,980 Tamara- for your video to pause and talk to students 1543 01:02:16,980 --> 01:02:18,960 Tamara- and assess where they're at. 1544 01:02:18,960 --> 01:02:21,870 Tamara- And then we have the question pool generator, 1545 01:02:21,870 --> 01:02:26,520 Tamara- which takes either your learning objectives for the module 1546 01:02:26,520 --> 01:02:28,560 Tamara- or a question stem 1547 01:02:28,560 --> 01:02:33,560 Tamara- and it generates anywhere from 5-30 questions with 1548 01:02:34,590 --> 01:02:38,820 Tamara- anywhere from two to five or six responses. 1549 01:02:38,820 --> 01:02:42,183 Tamara- And it sends it over to faculty or whoever used the tool 1550 01:02:42,183 --> 01:02:47,183 Tamara- in a PDF so that they can look at those and use those to 1551 01:02:47,640 --> 01:02:50,580 Tamara- refine, revise, start again. 1552 01:02:50,580 --> 01:02:54,000 Tamara- But it's a wonderful way to get ideas. 1553 01:02:54,000 --> 01:02:56,310 Tamara- It's actually a great way to combat the issues 1554 01:02:56,310 --> 01:02:57,990 Tamara- with academic integrity, isn't it? 1555 01:02:57,990 --> 01:03:01,200 Tamara- Because we notice a lot of the questions that are asked 1556 01:03:01,200 --> 01:03:04,500 Tamara- in courses end up on Chegg or Course Hero 1557 01:03:04,500 --> 01:03:05,730 Tamara- or something like that. 1558 01:03:05,730 --> 01:03:08,790 Tamara- And so, it's hard to continually update your questions. 
1559 01:03:08,790 --> 01:03:11,040 Tamara- So you are assessing knowledge 1560 01:03:11,040 --> 01:03:13,110 Tamara- and not if someone can find the answers online, 1561 01:03:13,110 --> 01:03:15,090 Tamara- but this is a nice way to harness 1562 01:03:15,090 --> 01:03:16,590 Tamara- the power of generative AI, 1563 01:03:16,590 --> 01:03:19,650 Tamara- so I'm really excited about that. 1564 01:03:19,650 --> 01:03:22,770 Tamara- And one thing, I wanted to tell you guys about 1565 01:03:22,770 --> 01:03:25,023 Tamara- an exciting fun fact about this course. 1566 01:03:27,270 --> 01:03:31,230 Tamara- Right now, we have, I think I had already mentioned, 1567 01:03:31,230 --> 01:03:33,300 Tamara- we have about 624, 1568 01:03:33,300 --> 01:03:37,530 Tamara- and when I was looking at it, the 625th person was added, 1569 01:03:37,530 --> 01:03:39,600 Tamara- which was wonderful to see the numbers go up. 1570 01:03:39,600 --> 01:03:42,060 Tamara- But in the first day of the course launch, 1571 01:03:42,060 --> 01:03:46,500 Tamara- we had 3,460 page views. 1572 01:03:46,500 --> 01:03:48,930 Tamara- So people were in it and experimenting. 1573 01:03:48,930 --> 01:03:51,870 Tamara- I know, I'd never seen a number that high in any course 1574 01:03:51,870 --> 01:03:53,910 Tamara- that people were excited about going through it. 1575 01:03:53,910 --> 01:03:56,490 Tamara- So, hopefully there's a lot of benefit, 1576 01:03:56,490 --> 01:04:00,000 Tamara- and it is riding on the shoulders of people 1577 01:04:00,000 --> 01:04:01,410 Tamara- who have already done this work. 1578 01:04:01,410 --> 01:04:03,990 Tamara- Like I said, Andrew Maynard and so many of the other people 1579 01:04:03,990 --> 01:04:07,140 Tamara- who are experts in this Generative AI course as well. 1580 01:04:07,140 --> 01:04:08,610 Mary- And while this course is only available 1581 01:04:08,610 --> 01:04:11,100 Mary- for ASU faculty and staff, 1582 01:04:11,100 --> 01:04:12,780 Mary- we will still put the link in the show notes 1583 01:04:12,780 --> 01:04:14,400 Mary- 'cause if you're listening to this from ASU, 1584 01:04:14,400 --> 01:04:16,080 Mary- that'll be an easy way for you to get into the course. 1585 01:04:16,080 --> 01:04:18,210 Mary- If you haven't already, or to go back into it, 1586 01:04:18,210 --> 01:04:19,320 Mary- if you started to get into it 1587 01:04:19,320 --> 01:04:21,090 Mary- and then got distracted by the start of the session, 1588 01:04:21,090 --> 01:04:24,540 Mary- which is normal, because there's a lot of great content. 1589 01:04:24,540 --> 01:04:26,010 Mary- You were saying the syllabus, 1590 01:04:26,010 --> 01:04:27,750 Mary- I think you're gonna give me some other parts of it. 1591 01:04:27,750 --> 01:04:29,010 Mary- Did you have anything else you wanted to add 1592 01:04:29,010 --> 01:04:30,510 Mary- in reference to that course? 1593 01:04:30,510 --> 01:04:33,510 Tamara- Oh, so many things came out of this, 1594 01:04:33,510 --> 01:04:34,530 Tamara- which was really exciting. 1595 01:04:34,530 --> 01:04:36,870 Tamara- Yeah, the syllabus statements were wonderful. 1596 01:04:36,870 --> 01:04:38,250 Tamara- Oh, actually, Mary, 1597 01:04:38,250 --> 01:04:41,670 Tamara- you developed kind of an innovative solution at ASU 1598 01:04:41,670 --> 01:04:46,230 Tamara- with using InScribe to develop a community where we can-- 1599 01:04:46,230 --> 01:04:47,490 Mary- I didn't develop that, 1600 01:04:47,490 --> 01:04:49,500 Mary- but I definitely put things in there for the course.
1601 01:04:49,500 --> 01:04:51,600 Mary- But this is the Learning Experience team 1602 01:04:51,600 --> 01:04:52,803 Mary- that created that space. 1603 01:04:54,330 --> 01:04:55,980 Mary- Well, I'm happy to do the work where I can. 1604 01:04:55,980 --> 01:04:56,847 Tamara- You were fantastic. 1605 01:04:56,847 --> 01:04:58,560 Mary- But the InScribe community is awesome, 1606 01:04:58,560 --> 01:05:00,630 Mary- it is a great place to share and to learn. 1607 01:05:00,630 --> 01:05:03,030 Mary- So in the course there are many opportunities 1608 01:05:03,030 --> 01:05:05,760 Mary- to share your artifacts with the community of ASU. 1609 01:05:05,760 --> 01:05:07,560 Mary- That's actually how I found Mary Ann 1610 01:05:07,560 --> 01:05:09,300 Mary- is because I'm a lurker in all spaces, 1611 01:05:09,300 --> 01:05:11,850 Mary- and I saw her artifact get posted there. 1612 01:05:11,850 --> 01:05:15,000 Mary- So if you have any interest in how could I use this, 1613 01:05:15,000 --> 01:05:17,160 Mary- what does that look like when other people interact with it, 1614 01:05:17,160 --> 01:05:18,990 Mary- there's a lot of opportunity to lurk. 1615 01:05:18,990 --> 01:05:21,060 Mary- You don't have to do anything other than lurk, 1616 01:05:21,060 --> 01:05:24,030 Mary- you could be like me and just watch people from the outside. 1617 01:05:24,030 --> 01:05:26,400 Mary Ann- My imposter syndrome's waning just enough for me 1618 01:05:26,400 --> 01:05:28,053 Mary Ann- to actually follow with something. 1619 01:05:28,053 --> 01:05:32,250 Mary Ann- So I do have the lib guide that has some basics on 1620 01:05:32,250 --> 01:05:37,250 Mary Ann- citing generative AI, et cetera, and it's growing. 1621 01:05:37,350 --> 01:05:40,170 Mary Ann- Another resource is the prompt engineering, 1622 01:05:40,170 --> 01:05:41,460 Mary Ann- like having some templates. 1623 01:05:41,460 --> 01:05:44,430 Mary Ann- But I just wanted to plug, I have wonderful colleagues. 1624 01:05:44,430 --> 01:05:46,320 Mary Ann- One of the things I've done in this class 1625 01:05:46,320 --> 01:05:49,470 Mary Ann- and Dr. Maynard's class was to really practice 1626 01:05:49,470 --> 01:05:51,330 Mary Ann- and look beyond my own, 1627 01:05:51,330 --> 01:05:53,550 Mary Ann- like, the students that I work with. 1628 01:05:53,550 --> 01:05:55,400 Mary Ann- And so I worked with some of the health sciences students 1629 01:05:55,400 --> 01:05:58,470 Mary Ann- to create a specific template for the health sciences 1630 01:05:58,470 --> 01:06:00,450 Mary Ann- that would lead students through 1631 01:06:00,450 --> 01:06:04,080 Mary Ann- the design of a PICO method question. 1632 01:06:04,080 --> 01:06:07,200 Mary Ann- So please do feel free to reach out to your librarians. 1633 01:06:07,200 --> 01:06:09,870 Mary Ann- I'm happy to work with you if you're looking to see 1634 01:06:09,870 --> 01:06:12,270 Mary Ann- how can ChatGPT or generative AI 1635 01:06:12,270 --> 01:06:13,830 Mary Ann- be used in the research process 1636 01:06:13,830 --> 01:06:16,110 Mary Ann- that doesn't equate to finding sources, 1637 01:06:16,110 --> 01:06:17,190 Mary Ann- that gets them to the place 1638 01:06:17,190 --> 01:06:19,440 Mary Ann- where they can start jumping into academic sources. 1639 01:06:19,440 --> 01:06:20,307 Mary Ann- The librarians are here for you, 1640 01:06:20,307 --> 01:06:22,500 Mary Ann- and I'm happy to collaborate, so. 
1641 01:06:22,500 --> 01:06:23,850 Mary- There you go, wide world, 1642 01:06:23,850 --> 01:06:25,170 Mary- ASU, you're very supported, 1643 01:06:25,170 --> 01:06:27,330 Mary- you have lots of great experts here at your fingertips. 1644 01:06:27,330 --> 01:06:29,940 Mary- Don't be afraid, jump right in, get curious. 1645 01:06:29,940 --> 01:06:31,170 Mary- And if you're not from ASU, 1646 01:06:31,170 --> 01:06:32,550 Mary- a lot of the resources we're sharing 1647 01:06:32,550 --> 01:06:33,690 Mary- are still viewable to you. 1648 01:06:33,690 --> 01:06:36,180 Mary- And Andrew's Substack is absolutely viewable to you, 1649 01:06:36,180 --> 01:06:37,080 Mary- so go visit that. 1650 01:06:37,080 --> 01:06:39,120 Mary- There's lots of great material there. 1651 01:06:39,120 --> 01:06:40,980 Mary- And thank you again for coming 1652 01:06:40,980 --> 01:06:43,050 Mary- and talking through this experience with us. 1653 01:06:43,050 --> 01:06:44,809 Andrew- Thank you. Mary Ann- Thank you. Tamara- Thank you. 1654 01:06:44,809 --> 01:06:47,392 (lively music) 1655 01:06:49,740 --> 01:06:50,970 Ricardo- That was a great conversation, 1656 01:06:50,970 --> 01:06:53,070 Ricardo- such great experts in the room here. 1657 01:06:53,070 --> 01:06:53,903 Mary- Truly. 1658 01:06:53,903 --> 01:06:55,230 Ricardo- And so we decided to ask them to stay here 1659 01:06:55,230 --> 01:06:57,804 Ricardo- because Liz has got a scheme. 1660 01:06:57,804 --> 01:06:59,490 (group chuckles) 1661 01:06:59,490 --> 01:07:03,810 Liz- So I am working on a little bit of a side project, 1662 01:07:03,810 --> 01:07:05,550 Liz- I thought it could be a fun side project 1663 01:07:05,550 --> 01:07:09,210 Liz- to trick my loving in-laws or soon-to-be in-laws 1664 01:07:09,210 --> 01:07:11,550 Liz- into thinking I know something about football. 1665 01:07:11,550 --> 01:07:13,890 Ricardo- Right, integrity's now out the window. 1666 01:07:13,890 --> 01:07:15,780 Mary- Which you also call sports ball, 1667 01:07:15,780 --> 01:07:16,920 Mary- just to set the tone. 1668 01:07:16,920 --> 01:07:19,500 Liz- I know nothing about football except that 1669 01:07:19,500 --> 01:07:22,590 Liz- Larry Fitzgerald was my hero until he retired. 1670 01:07:22,590 --> 01:07:24,900 Liz- I drafted him for my team but he retired, 1671 01:07:24,900 --> 01:07:27,720 Liz- so he got me zero points that season. 1672 01:07:27,720 --> 01:07:30,660 Liz- So what I decided to do as a part of my little scheme, 1673 01:07:30,660 --> 01:07:32,970 Liz- and I think it's really relevant to the conversation 1674 01:07:32,970 --> 01:07:34,650 Liz- we've had about prompt engineering 1675 01:07:34,650 --> 01:07:38,190 Liz- 'cause I need some help on prompt engineering for, 1676 01:07:38,190 --> 01:07:41,640 Liz- I'm gonna use ChatGPT to draft and run 1677 01:07:41,640 --> 01:07:44,340 Liz- my fantasy football team this season 1678 01:07:44,340 --> 01:07:45,840 Liz- and hopefully get some points 1679 01:07:45,840 --> 01:07:49,560 Liz- and show my future husband's family that 1680 01:07:49,560 --> 01:07:51,570 Liz- I know something about football. 1681 01:07:51,570 --> 01:07:53,220 Liz- So I can tell you a little bit about what I've done. 1682 01:07:53,220 --> 01:07:56,280 Liz- I've basically gone to ChatGPT 1683 01:07:56,280 --> 01:07:58,102 Liz- and I've given it some, 1684 01:07:58,102 --> 01:08:00,390 Liz- what I've started doing is asking it questions like, 1685 01:08:00,390 --> 01:08:02,250 Liz- can you help me with this?
1686 01:08:02,250 --> 01:08:03,870 Liz- And then it will give me a response, 1687 01:08:03,870 --> 01:08:05,760 Liz- and then will tell me, yeah, I can help you with this 1688 01:08:05,760 --> 01:08:09,090 Liz- if you give me A, B, C information. 1689 01:08:09,090 --> 01:08:12,030 Liz- But like what would your advice be for like 1690 01:08:12,030 --> 01:08:15,150 Liz- prompt engineering for something like fantasy football, 1691 01:08:15,150 --> 01:08:18,750 Liz- or are there things that I should be keeping an eye out for 1692 01:08:18,750 --> 01:08:21,456 Liz- in terms of accuracy when I'm working with my? 1693 01:08:21,456 --> 01:08:22,779 (group laughs) 1694 01:08:22,779 --> 01:08:23,946 Andrew- Yeah, don't. 1695 01:08:26,930 --> 01:08:31,380 Andrew- So your biggest challenge is that ChatGPT, A, 1696 01:08:31,380 --> 01:08:35,760 Andrew- knows nothing about football and it knows nothing beyond, 1697 01:08:35,760 --> 01:08:38,283 Andrew- when is it, September, 2021. 1698 01:08:39,390 --> 01:08:41,763 Andrew- So anything after that is a problem. 1699 01:08:42,660 --> 01:08:45,030 Andrew- However, I think it can help you begin 1700 01:08:45,030 --> 01:08:47,670 Andrew- to formulate your approach to things. 1701 01:08:47,670 --> 01:08:51,930 Andrew- So you can first of all ask it general questions about 1702 01:08:51,930 --> 01:08:54,270 Andrew- what sort of characteristics should you be looking for 1703 01:08:54,270 --> 01:08:56,700 Andrew- as you put your fantasy team together. 1704 01:08:56,700 --> 01:08:58,890 Andrew- You can even give it basic information 1705 01:08:58,890 --> 01:09:01,020 Andrew- about the sorts of people you are interested in. 1706 01:09:01,020 --> 01:09:04,950 Andrew- So you can sort of create notes about potential players 1707 01:09:04,950 --> 01:09:06,090 Andrew- and upload them and say, 1708 01:09:06,090 --> 01:09:08,130 Andrew- hey, what do you think about this team? 1709 01:09:08,130 --> 01:09:11,010 Andrew- So now it begins to look like a conversation 1710 01:09:11,010 --> 01:09:13,440 Andrew- where you give ChatGPT information, 1711 01:09:13,440 --> 01:09:14,490 Andrew- you may not know what you're doing, 1712 01:09:14,490 --> 01:09:16,230 Andrew- but at least you can put notes together 1713 01:09:16,230 --> 01:09:18,690 Andrew- and it'll help you sort through those 1714 01:09:18,690 --> 01:09:20,610 Andrew- and make some sort of sense of them. 1715 01:09:20,610 --> 01:09:24,660 Ricardo- So how can I con my in-laws is not a good prompt for? 1716 01:09:24,660 --> 01:09:26,610 Andrew- Well, you know, knowing ChatGPT, 1717 01:09:26,610 --> 01:09:28,350 Andrew- it will probably start off by saying 1718 01:09:28,350 --> 01:09:30,030 Andrew- that that is not a good idea. 1719 01:09:30,030 --> 01:09:31,565 Andrew- It is a very moral machine. 1720 01:09:31,565 --> 01:09:33,569 (group laughs) 1721 01:09:33,569 --> 01:09:34,402 Liz- That's disappointing. 1722 01:09:34,402 --> 01:09:36,000 Tamara- It's illegal. 1723 01:09:36,000 --> 01:09:38,610 Liz- I'm trying to cheat at this fake game 1724 01:09:38,610 --> 01:09:39,870 Liz- and it's not gonna help me. 1725 01:09:39,870 --> 01:09:43,710 Tamara- One of my neat tricks that I like to do with things is 1726 01:09:43,710 --> 01:09:47,340 Tamara- I like to leverage plugins in GPT-4 1727 01:09:47,340 --> 01:09:49,290 Tamara- and I like to use PDFs.
1728 01:09:49,290 --> 01:09:52,110 Tamara- And I discovered this because 1729 01:09:52,110 --> 01:09:55,080 Tamara- we have a lot of board games that we play in our family, 1730 01:09:55,080 --> 01:09:58,800 Tamara- and there's always a little bit of a conversation, 1731 01:09:58,800 --> 01:10:01,350 Tamara- a heated discussion about which rule somebody broke 1732 01:10:01,350 --> 01:10:03,120 Tamara- or which rule somebody used. 1733 01:10:03,120 --> 01:10:06,630 Tamara- So what we did is we took our board game rules 1734 01:10:06,630 --> 01:10:11,220 Tamara- and we put it in a PDF and we use the ChatGPT for PDF plugin 1735 01:10:11,220 --> 01:10:15,090 Tamara- to say, here's a PDF, remember these rules, 1736 01:10:15,090 --> 01:10:17,340 Tamara- and we're going to ask you questions about the rules, 1737 01:10:17,340 --> 01:10:19,620 Tamara- and then it references that PDF. 1738 01:10:19,620 --> 01:10:22,514 Tamara- So you could do something similar to what Andrew was saying. 1739 01:10:22,514 --> 01:10:24,540 Tamara- You could put the stats 1740 01:10:24,540 --> 01:10:28,080 Tamara- for key players for last season in a PDF. 1741 01:10:28,080 --> 01:10:30,900 Tamara- Use the ChatGPT for PDF plugin, 1742 01:10:30,900 --> 01:10:32,250 Tamara- and then you could say, 1743 01:10:32,250 --> 01:10:35,430 Tamara- based off of the information in this PDF, 1744 01:10:35,430 --> 01:10:38,280 Tamara- which are the top players according to this? 1745 01:10:38,280 --> 01:10:40,387 Tamara- Which would be a balanced team according to that? 1746 01:10:40,387 --> 01:10:42,000 Tamara- And which should be my first, 1747 01:10:42,000 --> 01:10:43,950 Tamara- second and third picks, and why? 1748 01:10:43,950 --> 01:10:46,680 Tamara- And if you want to really trick your in-laws, 1749 01:10:46,680 --> 01:10:48,870 Tamara- that's probably the best way to do that because you can say, 1750 01:10:48,870 --> 01:10:51,600 Tamara- well, actually in the third quarter of last season, 1751 01:10:51,600 --> 01:10:54,420 Tamara- so-and-so did this and he had this number of losses 1752 01:10:54,420 --> 01:10:57,330 Tamara- but I would say he's only a second round pick. 1753 01:10:57,330 --> 01:10:59,010 Tamara- That would impress them, so. 1754 01:10:59,010 --> 01:11:00,840 Mary- Absolutely, I love that you're using it 1755 01:11:00,840 --> 01:11:01,953 Mary- as a gaming referee. 1756 01:11:03,348 --> 01:11:04,181 Liz- This is really good 1757 01:11:04,181 --> 01:11:06,090 Liz- 'cause I've already asked it like, who should I, 1758 01:11:06,090 --> 01:11:09,270 Liz- it's given me some advice on like a strategy for drafting. 1759 01:11:09,270 --> 01:11:11,490 Liz- Our draft is actually tomorrow night, 1760 01:11:11,490 --> 01:11:13,860 Liz- so I am gonna have to do some research 1761 01:11:13,860 --> 01:11:15,420 Liz- and some uploading today. 1762 01:11:15,420 --> 01:11:16,253 Mary- You got some chatting to do. 1763 01:11:16,253 --> 01:11:17,850 Liz- So it's good to know that 1764 01:11:17,850 --> 01:11:21,540 Liz- it only goes back so far 'cause it has already given me 1765 01:11:21,540 --> 01:11:23,970 Liz- some thoughts on who my first pick should be. 1766 01:11:23,970 --> 01:11:25,110 Liz- And I do get first pick 1767 01:11:25,110 --> 01:11:27,000 Liz- 'cause I was the big loser last year, 1768 01:11:27,000 --> 01:11:28,650 Liz- so I get to pick first this year. 1769 01:11:28,650 --> 01:11:30,622 Mary- Maybe then Google, have they retired?
1770 01:11:30,622 --> 01:11:31,663 (group laughs) 1771 01:11:31,663 --> 01:11:35,160 Liz- Honestly that was the biggest disappointment 1772 01:11:35,160 --> 01:11:37,110 Liz- of like two years ago, so. 1773 01:11:37,110 --> 01:11:38,940 Mary Ann- I think if I were trying to trick my in-laws, 1774 01:11:38,940 --> 01:11:41,970 Mary Ann- what I would do is also have ChatGPT quiz me 1775 01:11:41,970 --> 01:11:45,587 Mary Ann- to see what my knowledge is so that I can respond quickly. 1776 01:11:45,587 --> 01:11:47,940 Mary- Next level. Mary Ann- That's great. 1777 01:11:47,940 --> 01:11:52,440 Liz- Yeah, my plan was to have my laptop open with one window 1778 01:11:52,440 --> 01:11:53,910 Liz- with the draft information 1779 01:11:53,910 --> 01:11:56,100 Liz- and another window with ChatGPT 1780 01:11:56,100 --> 01:11:57,480 Liz- and I was just gonna feed it. 1781 01:11:57,480 --> 01:12:01,320 Liz- So here's who's been taken, who should I pick next 1782 01:12:01,320 --> 01:12:04,260 Liz- based on these players being taken? 1783 01:12:04,260 --> 01:12:07,470 Liz- But I'll have to upload like an entire roster for this year, 1784 01:12:07,470 --> 01:12:11,790 Liz- it sounds like, if it doesn't work for 2023. 1785 01:12:11,790 --> 01:12:13,920 Andrew- But actually that's not too difficult to do. 1786 01:12:13,920 --> 01:12:16,500 Andrew- The information in that PDF or whatever you upload 1787 01:12:16,500 --> 01:12:18,720 Andrew- doesn't have to be perfect because it can make sense of it, 1788 01:12:18,720 --> 01:12:20,610 Andrew- you just need the information there. 1789 01:12:20,610 --> 01:12:21,630 Liz- This is great news. 1790 01:12:21,630 --> 01:12:24,390 Liz- So I'm trying to keep this a secret from the in-laws 1791 01:12:24,390 --> 01:12:25,680 Liz- and have them be surprised that 1792 01:12:25,680 --> 01:12:30,540 Liz- suddenly I'm a wiz at fantasy football. 1793 01:12:30,540 --> 01:12:32,910 Liz- They've already been making fun of me with memes 1794 01:12:32,910 --> 01:12:37,680 Liz- in our family chat about how I'm terrible at picking people. 1795 01:12:37,680 --> 01:12:40,020 Liz- I pick based off of fun names, 1796 01:12:40,020 --> 01:12:42,420 Liz- I don't know anything about their stats. 1797 01:12:42,420 --> 01:12:44,730 Liz- But I like, oh that's a fun name. 1798 01:12:44,730 --> 01:12:46,950 Andrew- Just whatever you do, don't put money down on this. 1799 01:12:46,950 --> 01:12:48,250 (group laughs) Liz- I'm not. 1800 01:12:49,110 --> 01:12:51,420 Ricardo- All right Mary, if anybody wants to chastise Liz, 1801 01:12:51,420 --> 01:12:52,410 Ricardo- how can they reach us? 1802 01:12:52,410 --> 01:12:54,806 Mary- Well, they can reach out to corestories@asu.edu. 1803 01:12:54,806 --> 01:12:59,806 Mary- Come to Teach Online, we'll post Liz's football draft team. 1804 01:12:59,847 --> 01:13:01,550 Ricardo- We'll keep you updated this season. 1805 01:13:01,550 --> 01:13:03,450 Liz- Yeah, we'll see how it goes. 1806 01:13:03,450 --> 01:13:04,800 Mary- Yeah, we'd love to get an update. 1807 01:13:04,800 --> 01:13:07,170 Mary- In fact, if you wanna publish your conversation, 1808 01:13:07,170 --> 01:13:08,550 Mary- we can also put that on there 1809 01:13:08,550 --> 01:13:10,530 Mary- 'cause that's how easy it is to share these things.
1810 01:13:10,530 --> 01:13:11,790 Mary- Like, there's a link that you can go to 1811 01:13:11,790 --> 01:13:15,870 Mary- people's conversations and see what these students, faculty, 1812 01:13:15,870 --> 01:13:18,840 Mary- Liz, our producer, are doing with ChatGPT. 1813 01:13:18,840 --> 01:13:19,673 Liz- That's good to know. 1814 01:13:19,673 --> 01:13:22,380 Liz- Plus I'm gonna try to record some of this on TikTok 1815 01:13:22,380 --> 01:13:26,130 Liz- as like a little diary entry for my experiences. 1816 01:13:26,130 --> 01:13:29,310 Liz- I thought it'd be a really fun way to explore ChatGPT, 1817 01:13:29,310 --> 01:13:31,620 Liz- explore prompt engineering, 1818 01:13:31,620 --> 01:13:33,690 Liz- and maybe hopefully come up with some great ways 1819 01:13:33,690 --> 01:13:35,145 Liz- for people to use it in the future 1820 01:13:35,145 --> 01:13:37,323 Liz- for good or for evil, in my case. 1821 01:13:39,330 --> 01:13:41,370 Tamara- You might've just done some work for everybody else. 1822 01:13:41,370 --> 01:13:43,560 Tamara- If you share that link, they can build on your link 1823 01:13:43,560 --> 01:13:46,050 Tamara- and create their own fantasy football teams too. 1824 01:13:46,050 --> 01:13:49,023 Ricardo- Oh yeah, you're Prometheus, bringing down the fire. 1825 01:13:50,040 --> 01:13:51,750 Liz- I'm gonna one by one, 1826 01:13:51,750 --> 01:13:53,790 Liz- I'm gonna take down fantasy football. 1827 01:13:53,790 --> 01:13:55,320 Liz- Two years of being embarrassed, 1828 01:13:55,320 --> 01:13:56,153 Liz- I'm gonna burn the whole thing down. 1829 01:13:56,153 --> 01:13:57,430 Andrew- Take the ethics module first. 1830 01:13:57,430 --> 01:14:00,097 (group laughs) 1831 01:14:03,180 --> 01:14:05,910 Ricardo- Course Stories is available wherever you listen to podcasts. 1832 01:14:05,910 --> 01:14:08,940 Ricardo- You can reach us at corestories@asu.edu. 1833 01:14:08,940 --> 01:14:10,440 Ricardo- Course Stories is produced by the 1834 01:14:10,440 --> 01:14:13,050 Ricardo- Instructional Design and New Media team at EdPlus 1835 01:14:13,050 --> 01:14:14,910 Ricardo- at Arizona State University. 1836 01:14:14,910 --> 01:14:17,010 Ricardo- If you're an instructor at ASU Online, 1837 01:14:17,010 --> 01:14:18,930 Ricardo- tell us your core story and we may feature it 1838 01:14:18,930 --> 01:14:20,670 Ricardo- in a future episode. 1839 01:14:20,670 --> 01:14:21,720 Ricardo- Thanks for listening.