With accessibility to online education increasing, the retention of online students has become a concern of academic leaders in higher education (Allen & Seaman, 2015). As a result, many universities have launched initiatives to improve course completion, program completion, and student support services (Johnson, Adams Becker, Estrada, & Freeman, 2015). Although many of the reasons students withdraw from an online course are beyond an instructor's control, attrition can be reduced through various means.
This is the third article in our series on Classroom Assessment Techniques (CATs), which can be used to gauge lesson effectiveness and student comprehension. To review, CATs were developed by Angelo and Cross (1983) as efficient checks of whether students understand a certain concept. For more examples of these formative assessments, please see our previous posts:
In this article, we present three CATs that focus on developing Higher Order Thinking Skills (see Collins, 2014) and that can be used in face-to-face, hybrid, or online teaching.
The ability to establish presence is closely connected to the ability of the instructor to create a sense of community among learners in an online course. (Palloff & Pratt, The Excellent Online Instructor, 2011)
Research has long pointed to engagement as a key predictor of student success (Pascarella & Terenzini, 2005; Kuh, 2005). Fortunately, new online learning environments and tools (see ASU Online Digital Learning Platform) provide a variety of opportunities for students to engage not only with course content, but through student-student and student-instructor interaction as well (Swan, 2004).
In a previous post (see Gauging Student Understanding: CATs are puuuuur-fect), we introduced instructors to the idea of using Classroom Assessment Techniques (CATs) by Angelo and Cross (1983) to check whether students understand a certain concept. To recap, CATs are generally short, non-graded, and student-centered activities that provide instructors with feedback about lesson effectiveness and student comprehension. Best of all, they require little preparation, class time, or grading. In the following section, we present three additional CATs and suggest ways to adapt them to online courses.
Note: This is a highly interactive article! Please click on all of the hyperlinks. They either take you to the game mentioned OR to an article about the game’s use in education.
The Games for Change (G4C) Festival in New York City has come a long way over the past few years. When I started attending the conference in 2010, the emphasis on using games to educate was at the periphery, not because attendees didn’t believe in the potential of games in the learning space, but because the money simply wasn’t there to create commercial quality learning games. There also wasn’t universal support for the idea that learning could be fun. (“They are having too much fun to be learning.”)
With finals week coming to an end and grading about to be completed, it is only natural to make a mad dash for the door and enjoy a well-deserved break. There is no question that we all need a break to relax and find inspiration, but before heading out, keep in mind that there is a good chance that you will have to return and teach the same or a similar course again. To save time in the future and perhaps “tweak” some elements of the course, consider the following five things successful instructors do when the semester is over.
1. Reflect on the Course
Reflection is not only an important aspect of student learning but also an opportunity for faculty to gain insights from past teaching experiences. At the end of the semester, when memories are still fresh, take a few moments to reflect on the course (e.g., what went well, and what did not?). It is helpful to write down a few notes to avoid forgetting important details over the break. Alternatively, one can discuss the course with a colleague, a friend, or instructional support staff at the university.
2. Make a Plan
Through reflection, certain topics or issues emerge that one would like to address in future courses. Since there might be multiple options or solutions, it is good to brainstorm potential actions. Occasionally, this might require talking to others or finding additional information and resources. Then it is time to select an option and make a plan! One recommended method is to identify the required steps and develop a timeline (e.g., What steps need to be done? By when?). Depending on the number of topics, it might be necessary to prioritize.
3. Archive the Course
Many Learning Management Systems (LMSs) provide the option to archive a course, which can come in handy when teaching it again. Instead of starting from scratch, one has a version that is already developed and potentially reusable. The archived course is also a good place to make changes without impacting enrolled students. Additionally, many universities are now making teaching portfolios a critical component of degree programs or tenure requirements. Certain LMSs allow instructors to share archived versions without revealing confidential student information.
4. Ask Students for Permission
Student artifacts are a powerful and helpful resource that can serve as models for future students or as evidence of student learning. To avoid any privacy or copyright concerns, the end of the semester is a good time to ask students for written permission to share those artifacts outside of class.
5. Take Time Off
Although it might be tempting to continuously “tweak” a course, it is also important to relax and focus on other aspects of life. Taking time off generally helps one find new inspiration and motivation for the next semester. It is completely fine to stop thinking about teaching for a while and instead read a book or explore new places… as long as one remembers that the next semester is right around the corner.
Do you do anything at the end of the semester that helps with your future teaching? Please share your tips with our community.
As it turns out, you don’t need Muppets to teach a successful online course.
I worried about this a few months ago as I began to prepare for my Arizona State University Online “Media Research Methods” class. This concern blossomed when, out of curiosity, I signed up for a Harvard EdX online course on computer science basics and watched the first lecture.
The video began with a two-and-a-half minute segment of a fuzzy puppet waking up in a dorm room and then rushing across the Harvard Yard to get to this class. It wasn’t some generic puppet — it was a Muppet; apparently the dorm is around the corner from Sesame Street. And the rock-star lecture was delivered from the stage of what looked like a packed opera house, complete with props, multiple camera angles, wireless mikes and professional editing.
My virtual classroom, on the other hand, was my home office, with me sitting in front of my iMac’s built-in camera talking into a cheap microphone. The contrast just shows, I guess, what that $60,000 a year buys those Harvard kids.
But I shook off my feelings of inferiority, completed my class materials and recorded my 24 lectures. Now that the seven-week course is over, I have learned a few things to share with any of you who might be considering teaching online, too.
Based on experience from my own (ahem) checkered undergrad days, I know that firm deadlines help most students stay on track in a course. I had previously taught this course, which is focused on statistical literacy for journalists, as an in-person lecture class. Back then, after each class there was a quiz on the material that had to be completed before the next class began.
But students taking online classes often are working full-time jobs, living in time zones across the world, or have family responsibilities that make such a rigid schedule difficult to meet. To balance these competing demands of enforced progress and flexible scheduling, I organized the class so that each week there were three to four lectures, chapters and quizzes to complete. But only one deadline — all of the week’s work had to be submitted online by 6 a.m. Arizona time each Monday of the following week.
When they chose to do the work was up to them. Some students spaced it out over the course of each week. Others did it all on Sunday — or sometimes into the wee hours of Monday. I made it very clear in my syllabus that no excuses — last-minute computer crashes, pets eating homework, alien abductions, whatever — would be accepted for missing the weekly deadline; after all, I told them, they had all week to do the work.
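A single fixed deadline lands at very different local times for students scattered across time zones, which is part of why this one-deadline design matters. As a quick illustration (not part of the original course materials, and the date below is hypothetical), Arizona observes no daylight saving time, so a 6 a.m. America/Phoenix deadline converts cleanly with Python's `zoneinfo`:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical Monday deadline: 6 a.m. Arizona time.
# America/Phoenix skips daylight saving, so it is UTC-7 year-round.
deadline = datetime(2024, 1, 8, 6, 0, tzinfo=ZoneInfo("America/Phoenix"))

# The same instant in a few zones a distance student might live in.
for zone in ("America/New_York", "Europe/London", "Australia/Sydney"):
    local = deadline.astimezone(ZoneInfo(zone))
    print(zone, local.strftime("%a %H:%M"))  # e.g. America/New_York -> Mon 08:00
```

For a Sydney student, the Monday 6 a.m. Arizona cutoff is already midnight into Tuesday, so "all week to do the work" reads differently depending on where you live.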
Just as it is easier to write long than to write tight, it takes a lot of thought to hold an online lecture to an effective duration. The typical in-person course is built around 50- or 90-minute classroom sessions, but I knew that a considerable portion of that time goes to more than the instructor droning on from lecture notes — questions, discussion, demonstration and such.
ASU offers its online instructors a lot of support and research on best practices in teaching such courses. One of the key recommendations was to limit video lectures to about 15 minutes. The reason is that attention wanders when content is being delivered in one direction — teacher to student — without any other interaction. Better several short talks focused on one concept than one long one that tries to cover several topics.
To adapt to the time limits of online teaching, I had to ruthlessly pare my slides to the essence of the objective I wanted each lesson to deliver. I could do this because I designed the course around a textbook (“Seeing Through Statistics — 4th Edition”, by Jessica Utts) in which each chapter was the focus of the corresponding lecture. In my syllabus, I told students that they were responsible not only for what I said in a lecture, but also any other material that was discussed in the companion chapter.
This actually worked quite well. The lectures highlighted the concepts I wanted the students to learn, and they could find additional explanation and examples in the text, as well as questions and exercises at the end of each chapter. It certainly would be possible to do the same thing without a textbook, but I would strongly recommend supplying handouts, tipsheets, practice exercises and other supplemental reading to go along with each lecture.
ASU has a video studio that some online instructors use for recording their lectures, but arranging studio time seemed like a hassle to me. Instead, I bought a $99 screen-capture package called Camtasia 2 for Mac, which allowed me to record my lectures at home on my own schedule. (Plenty of other choices exist, including QuickTime Pro and ScreenFlow.) With Camtasia, I could record my PowerPoint slides in a window while my image, as I talked into my webcam, was displayed in a small box in a bottom corner of the slides. Camtasia also allows video editing — useful when the phone would ring in the middle of a lecture, for instance.
One problem (noted by other Camtasia users, too) was that sometimes the audio would shift increasingly out of sync with the video, making the little box with my face look like a badly dubbed foreign film. I sort of solved the problem by recording an intro with my talking head, then doing the slides with audio only and no little box, and then closing with another talking-head shot reminding students to do the quiz. I'd then splice these together with the editing tools.
At the end of a lecture, Camtasia would convert it into the widely used .mp4 video format, and I would upload the result to ASU's servers for the courseware to deliver on demand. Depending on how much motion occurred during the lecture, the files were about 150 to 250 MB.
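Those file sizes are consistent with typical screencast bitrates. As a back-of-the-envelope check (assuming roughly 15-minute lectures, per the recommendation mentioned above — the exact durations are my assumption), the implied average bitrate works out to:

```python
def avg_bitrate_mbps(size_mb: float, minutes: float) -> float:
    """Average bitrate in megabits per second for a video file.

    size_mb: file size in megabytes; minutes: running time.
    """
    return size_mb * 8 / (minutes * 60)  # MB -> megabits, minutes -> seconds

# A ~15-minute lecture in the 150-250 MB range implies roughly 1.3-2.2 Mbps,
# a plausible range for a 720p-ish screencast with a webcam inset.
print(round(avg_bitrate_mbps(150, 15), 2))  # 1.33
print(round(avg_bitrate_mbps(250, 15), 2))  # 2.22
```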
Quizzes and Tests
I designed the course so students could earn up to 500 points. Each chapter quiz was worth 10 points — low stakes, so there was less anxiety, but enough that missing a deadline would start to subtract from the final grade. I also had a midterm and a final exam, covering the concepts of each half of the course. These tests were worth 100 points each. And I started the semester with a quiz about the detailed syllabus, an easy 10 points that eliminated later attempts at claiming “Gee, I didn't know” about class schedules or policies.
With 60 students signed up — and there could have been as many as 200 — grading all those quizzes and tests could have been drudgery. Instead, I made all the assignments multiple-choice questions that could be automatically graded by ASU's online courseware. ASU uses Pearson eCollege, but other platforms like Blackboard exist. It also would be possible to use web survey sites like SurveyMonkey or QuestionPro to build your own quizzes and tests without a big courseware infrastructure.
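Machine-graded multiple-choice quizzes are simple to model. The sketch below is purely illustrative — it is not how Pearson eCollege or any particular courseware works internally, and the answer key is hypothetical — but it shows why this format scales to 60 (or 200) students with no grading drudgery:

```python
def grade_quiz(answer_key: dict, submission: dict, points_per_question: float) -> float:
    """Score a multiple-choice submission against an answer key.

    Unanswered questions simply earn no points.
    """
    correct = sum(1 for q, ans in answer_key.items() if submission.get(q) == ans)
    return correct * points_per_question

# Hypothetical 4-question, 10-point chapter quiz.
key = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
student = {"q1": "b", "q2": "d", "q3": "c", "q4": "c"}  # one wrong answer
print(grade_quiz(key, student, 2.5))  # 7.5
```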
I should add that one of my policies is that all quizzes and exams are open book, open notes, open Internet — and no time limit on completion other than the weekly deadline. As a journalist who does statistical work on occasion, I am more interested in making sure I am correct than in memorization skill. I have a bookcase full of stats and reference books, I refer to them often when I do that kind of work, and I allow my students to do the same. So does all that make the course so easy that everyone gets an A+? I'll save that answer for the end.
Like clockwork, students began asking about opportunities for extra points right after the midterm. I had suggested in my syllabus that I might offer up to 20 extra points, in the form of some exercise that used class concepts — but no busywork. I did that, with two exercises that most students completed.
Another strong recommendation from the ASU Online support team was to build a “discussion board” feature into the class. I was dubious at first, imagining how stilted “discussion” would be if it consisted of postings from scattered students and responses given perhaps days later. Even so, I set it up so that students would earn points each week (for a maximum of 50 points through the semester) for posting about a media report they had encountered of some kind of research study relevant to what we were doing in class.
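The 500-point structure described in this course adds up neatly, assuming one 10-point quiz per recorded lecture (the post mentions 24 lectures; the 24-quiz count is my inference, not something the author states). A quick sanity-check sketch:

```python
# Point components described in the post. The 24-quiz count is an
# assumption based on the 24 recorded lectures.
points = {
    "chapter quizzes (24 x 10)": 24 * 10,
    "syllabus quiz": 10,
    "midterm": 100,
    "final exam": 100,
    "discussion board": 50,
}

total = sum(points.values())
print(total)  # 500 -- plus up to 20 optional extra-credit points on top
```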
To my surprise, the discussion board turned out to be lively, engaging and a lot of fun — for me included. I made sure to check in with it each night and post a sentence or two of my own reaction to each posting; I learned from the evaluations at the end of the course that the students really appreciated knowing that their instructor actually was reading what they were posting. The studies (including links to the original material) that were posted and the comments the students made demonstrated that most actually were becoming critical consumers of media reports — a central goal of the course.
The only problem was that a few students would wait until late Sunday night to post, which meant that there would be no chance for others to read and react. But I will address that the next time I do the course by having some sort of point-penalty for last-minute postings.
When all the virtual chalk dust had settled and the final exams were in, I evaluated the success of the course. Two students maxed the points; one student flunked, though he apparently quit doing the work for several weeks without formally dropping the course. Overall, about a quarter earned an A, a third were in the B range, another quarter in the C range and the rest limped in with Ds.
Was it too easy, or too hard? The so-called curve (a statistical concept in itself) looked pretty good to me. And the strictly anonymous student evaluations that were completed by more than 90% of the students were almost uniformly positive, despite the fact that many of them were from students who didn’t earn As or Bs.
One student noted in the evaluation: “What I liked the most was how helpful this class was in terms of building statistical literacy and confidence. Before this class, I would just shut down as soon as I saw numbers.” Another wrote simply: “I LOVED this class!”
Over 100 third-party tools and services are used by faculty and students in ASU Online courses. With the 50+ companies indicated here in bold, ASU Online has established a connection with a vendor representative. These relationships provide ASU Online with opportunities to create true partnerships.
Does increasing the amount of time students spend viewing faculty research videos and playing course specific games positively impact course grades?
Yes, it does, in both online and face-to-face courses and regardless of student GPAs.
These questions were the basis for a paper on the effects of faculty-research videos and games in an upper-division empirical-methods course. We expected that a media-rich environment would increase student engagement and that this increased student engagement would enhance performance and increase student satisfaction with the course.
Working with ASU Online and the College of Liberal Arts and Sciences, Political Science faculty created 10- to 20-minute videos in which they presented a research project; they explained what they found, how they found it, and why they asked the question.
The videos are part of a repository and, although the impetus for the project was the growing demand for research methods online, the videos have been used in online, face-to-face and blended courses.
We assumed that the video examples of real research projects, tied to theory, would add depth and breadth to the course. In turn, this would make research methods more interesting; increase student engagement; improve the quality and quantity of student interactions with faculty; expose online students to a wider range of faculty than their coursework typically might offer, thus providing them with some of the benefits accrued by traditional students; and raise course grades.
These expectations are consistent with the media-integration model, which dominates the theoretical literature and research on technology inclusion. Earlier research found that streaming video and audio explanations are better than textbooks at explaining complex concepts.
We found that online students spent 21 percent of their online time viewing the faculty-research videos. The correlation between video-viewing time and course grade (49%) was the same as the correlation between course grade and GPA (50%). We expected the relationship between GPA and course grade to be strong but were surprised to find that the relationship between video viewing and course grade was equally strong. The correlation between viewing the research videos and GPA was 24 percent. When we controlled for GPA, the partial correlation between faculty video viewing and course grade was 73%, which was considerably higher than the zero order correlation between course grade and video viewing. GPA and faculty-research videos had independent effects on course grade; they did not substitute for one another. Moreover, within “GPA groups” video viewing had a strong, positive influence on course grades.
Anecdotal evidence is consistent with these findings. Students wrote, “the videos were fantastic and…fun to watch; as an online student, the videos are essential to making me feel a connection to the faculty; (the videos) were indeed informative.”
The videos “helped make the abstract concepts a bit more concrete and easier to grasp. I think the videos were…a good addition to the course.”
Another student identified a benefit beyond the scope of the project when s/he wrote, “I quite enjoyed the videos and found them useful not only as a tool to explain course concepts and research examples, but also – and probably more importantly – as an opportunity to sort of ‘sample’ other professors. The videos provided a kind of mini course from a few professors, which helped in selecting classes for the following semesters.”
Our project had one more element. The spring 2013 online section added games associated with each week's course material. These were drill-and-practice games emphasizing memory, repetition and retention, and Quizlet (our creation tool) offered students variations on the drill-and-practice theme.
The data regarding the impact of these tools is unequivocal. Students using drill and practice online vocabulary games had higher quiz and course grades than students who opted not to use the games.
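Drill-and-practice vocabulary games of the kind described here boil down to repeated retrieval of term-definition pairs, with missed items re-asked until they stick. A minimal sketch of that loop (the vocabulary below is hypothetical, and this is not Quizlet's implementation):

```python
import random

# Hypothetical research-methods vocabulary; not taken from the actual course.
CARDS = {
    "independent variable": "the factor a researcher manipulates",
    "dependent variable": "the outcome a researcher measures",
    "random sample": "a subset where every member has an equal chance of selection",
}

def drill(cards: dict, answer_fn, max_rounds: int = 5) -> int:
    """Re-ask missed terms each round until the deck is cleared.

    answer_fn(term) returns the student's attempted definition.
    Returns the number of rounds needed (capped at max_rounds).
    """
    remaining = list(cards)
    for round_num in range(1, max_rounds + 1):
        random.shuffle(remaining)  # vary presentation order each round
        remaining = [t for t in remaining if answer_fn(t) != cards[t]]
        if not remaining:
            return round_num
    return max_rounds

# A perfect "student" clears the deck in a single round.
print(drill(CARDS, CARDS.get))  # 1
```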
Additionally, the inclusion of “mini-games” or “learning objects” in the course, while not measured, appeared anecdotally to increase student satisfaction.
On an anonymous course evaluation, one student wrote, “this instructor went out of her way to make it easy for distance-learning students. She…tied the different elements of the course together …. (making) learning much easier (textbook, homework, quizzes, discussions and games).” Another wrote, “The many sources of media ….brought diversity to the course material.”
Our project was a success. Many faculty who made research videos reported that students referred to the videos in conversations. This was an unexpected consequence but one that was entirely consistent with the project’s goal of making students more comfortable with research.
The videos helped students become reviewers and evaluators of research instead of passive consumers; the games made them comfortable with the language. Course material often seen as dry became engaging, and as students came to appreciate it, teaching became easier and more rewarding.
This post further breaks down some of the activities of Instructional Designers mentioned in the previous TeachOnline post, Introducing the ASU Instructional Designer, and discusses the importance of the relationship between the faculty and instructional designer.