E-xcellence in Teaching
Editors: Manisha Sawhney & Natalie Ciarocco

  • 01 Jun 2017 8:19 AM | Anonymous

    Flipping the Classroom Improves Performance in
    Research Methods in Psychology Courses

     Ellen Furlong
    Illinois Wesleyan University

    Despite having taught it many times, Research Methods in Psychology remains one of the most challenging courses I teach. The difficulty arises primarily because Methods has two major goals: (1) to teach students the required concepts and (2) to train them to understand, evaluate, design, and conduct research. In short, we must teach both content (what is a hypothesis?) and skill (where is the hypothesis in this article? Is it strong? What is my hypothesis?), usually in just one semester.

    The first few times I taught Methods, I tackled this problem by covering content in class and relying on a semester-long APA-style research proposal to give students practice. On the surface this worked modestly well: students typically wrote interesting papers that showed at least a superficially solid ability to apply their knowledge.

    One semester I challenged my students with something new: I assigned a very short two-page article (Kille, Forest, & Wood, 2013) and asked questions about it (e.g., True or False: One of Kille and colleagues’ (2013) hypotheses involved a rating of the likelihood that the marriages of four well-known couples would break up in the next five years). This activity was a disaster. Although students readily defined a hypothesis or a dependent variable, almost none could correctly identify or differentiate them in the article. This revealed both a shallow understanding of the psychological concepts and a lack of practice applying and working with them.

    I found this troubling not only for my students who would go on to graduate school or take upper-level seminars, but perhaps most of all for my students who would likely not receive more training in methods and might graduate without the ability to consume research critically. Successful consumers of research need not only to describe the concepts involved in research, but to apply them readily to the newspapers, blog posts, or Buzzfeed articles they read. This is especially important in today’s age of ‘disinformation’ and fake news.

    In short, the problem with Research Methods is that to practice the skills involved in research, students first need to understand the concepts. And given the pressures of the semester, we often don’t have enough time for both.

    This is hardly a new problem; others facing similar difficulties have often turned to flipped classrooms (see, for example, Peterson, 2016, and Wilson, 2013, who used flipped courses for similar reasons in statistics courses). A typical flipped classroom presents the traditional lecture-based material (i.e., the foundational concepts) in online videos that students watch on their own before coming to class. During class, students work together under the guidance of the instructor to practice applying these concepts and honing skills (e.g., Lage, Platt, & Treglia, 2000). This allows students to do the “easy” parts of learning (listening to a professor lecture, memorizing material, etc.) at home, while doing the hard parts (actually thinking about and applying the material) in the classroom with the professor’s help.

    Flipped classrooms have many advantages. First, students can learn the content at their own pace, because they can watch the lectures as often as they need to understand the material. Second, through classroom activities, students can assess their own knowledge early, so they know what they don’t know before the exam and can target their practice accordingly. Third, because students practice their research skills in the classroom, I can provide one-on-one time with them. I can offer instant feedback, see where they struggle, and scaffold them to success. I can correct their mistakes while they are making them, and adjust activities in the moment to ensure they fully meet my course goals. When students practice their skills at home, I may have no idea where or how they struggle.

    In effect, flipping the classroom allows me to move from a “sage on the stage” to a “guide on the side”, emphasizing the skill involved in assessing and designing research rather than providing definitions and rote memorization of the jargon.

    Implementing a flipped classroom is very time consuming and difficult. For every 10- to 20-minute video I made, I spent at least 3 hours writing a script (don’t think you can do this on the fly; you hem and haw, and students feel like you’re wasting their time), creating slides, recording the video, editing it, and posting it to our course management system. Sometimes I found other people’s work that was far better than what I could have done (see Ben Goldacre’s Battling Bad Science TED Talk: https://www.youtube.com/watch?v=h4MhbkWJzKk), and that saved me hours, but for the most part I made my own lectures. I wrote online quizzes and discussion forums to ensure that students watched the lectures, and on top of all that I had to create an entirely new set of in-class activities to help my students practice their skills, the entire point of this exercise (the Society for the Teaching of Psychology (http://topix.teachpsych.org/w/page/19980993/FrontPage), Teach Psych Science (http://www.teachpsychscience.org/), and others have excellent resources on their websites). Each of these activities took at least another 2-3 hours to prepare, many of them much longer. In short, between making your own videos, exploring other people’s work, writing quizzes, and developing new in-class exercises, this is a daunting undertaking, not to be entered into lightly.

    However, despite the immense amount of time and effort it took to flip my course, the outcomes were phenomenal. I hope they will be encouraging enough to motivate you to pursue flipping and, equally importantly, to motivate your students to give a flipped class a chance.

    A brief word about what I will show you here: in the Fall of 2013 I taught Methods as a traditional lecture-based course, and in the Fall of 2014 I taught the same course flipped, with 16 video lectures spread throughout the semester. Although my first flipped offering was actually in the Spring of 2014, I did not include that semester’s data because fall and spring students typically differ in systematic ways (e.g., more first-semester juniors in the fall and more second-semester sophomores in the spring); comparing the two fall semesters avoids this confound.

    I assessed three measures in both semesters: (1) applied exam questions, (2) a large APA-style research paper, and (3) student evaluations of instruction. I chose exam questions that focused on particularly difficult foundational concepts and for which there were at least two questions per topic. For the APA-style research paper, I randomly selected five student papers per class for in-depth assessment. These were scored on a scale of 1 (absent) to 6 (exceeds expectations). There was a strong correlation between these scores (r = .87) and the grading rubric I had initially used to grade the papers. Student evaluation of instruction scores ranged from 1 (strongly disagree) to 5 (strongly agree) and included a number of questions that I will discuss below. Finally, because the sample size was low, I accepted alpha values of .10.
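    To illustrate the inter-rater check described above: a Pearson correlation between the in-depth 1-6 paper scores and the original rubric grades can be computed directly from its definition. This is a minimal sketch; the five paper scores below are hypothetical placeholders, not the author’s data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 1-6 scores for five papers: original rubric vs. in-depth scoring
rubric_scores = [4, 5, 3, 6, 5]
indepth_scores = [4, 6, 3, 6, 4]
print(round(pearson_r(rubric_scores, indepth_scores), 2))
```

    A correlation near 1 indicates the two scoring methods largely agree, which is what licenses using the in-depth scores in place of the rubric.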

    T-tests revealed that students in the flipped course (F) and the traditional course (T) scored fairly similarly on most applied exam questions (Design: F: 88%, T: 90%, p = .82; Hypotheses: F: 81%, T: 76%, p = .69; Sampling/Assignment: F: 85%, T: 80%, p = .38; Reliability/Validity: F: 83%, T: 78%, p = .39), but for two of the hardest concepts, variables and causation, students in the flipped course greatly outperformed students in the traditional course (Variables: F: 90%, T: 79%, p = .06; Causation: F: 92%, T: 73%, p = .015).
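    The comparisons above are independent-samples t-tests. As a minimal sketch (the score lists below are hypothetical placeholders, not the author’s data), Welch’s t statistic for two groups of exam scores can be computed with nothing but the Python standard library:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical per-student % correct on one applied question topic
flipped = [92, 88, 95, 90, 89, 94]
traditional = [75, 70, 78, 72, 74, 69]
print(round(welch_t(flipped, traditional), 2))
```

    The t statistic is then compared against the t distribution to obtain a p value; with the small classes involved here, the author relaxed the significance threshold to alpha = .10.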

    Though this was impressive, the largest improvements showed up in the APA-style research papers. Students in the flipped course used evidence better (F: 5.2, T: 3.4, p = .02), organized their arguments better (F: 4.8, T: 3.2, p = .05), wrote stronger hypotheses (F: 6, T: 4.2, p = .03), proposed better methods (F: 5.13, T: 4.13, p = .03), discussed their predicted findings in more profound ways (F: 5.6, T: 4.35, p < .01), and wrote overall better papers than students in the traditional course (F: 5.45, T: 4.5, p = .06). Students in the flipped course were also marginally better at synthesizing information across sources (F: 5, T: 3.8, p = .11). However, it wasn’t simply that students in the flipped course were better writers (Writing style: F: 4.54, T: 4.47, ns) or better at following directions (APA Style: F: 5.13, T: 4, ns), so their improvements in these areas seem targeted and important.

    Student evaluation of instruction scores also told an interesting tale: students in the flipped course were more likely to recommend the course (T: 4.13, F: 4.70, p = .10) even though they found it a greater intellectual challenge (T: 4.40, F: 4.90, p = .06) and found the difficulty level less appropriate (i.e., they reported that the course was too hard: T: 4.67, F: 4.10, p = .01). So even though students found the course harder, they were more likely to recommend the flipped class to others than were those in the traditional course.

    While we’re talking about student evaluation scores, I will point out that my evaluation scores suffered a little the first semester I flipped the course (Spring 2014). While they dropped in some areas (e.g., students found me less available for help and thought my comments were not as useful), the overall evaluation scores stayed fairly similar (4.58 vs. 4.59). Further, this ‘hit’ to my evaluations disappeared after one semester. My interpretation is that I was frantically writing lectures and prepping in-class activities and didn’t have as much time to spend with the students and on comments. Now that all that work is done, I have more time than ever to spend on my students. Since then my evaluation scores have stayed the same or risen (average 2014-2015: 4.58; 2015-2016: 4.60; 2016-2017: 4.82). Open-ended student evaluations indicate that students very much valued the flipped experience and used it just as I would hope. For example, one representative comment said:

    Teaching this particular material in a “flipped course” was effective. The nature of the material is generally easy to understand with previous experience in psychology but it was not always as simple to apply it; therefore, practicing application in class was helpful. Overall this fostered the ability to apply the knowledge across useful areas both in this course and other courses.

    In summary, flipping Research Methods is hard, but it benefits the students. While this benefit may not show up on every exam, it shows where it counts: when students use their knowledge of methods to evaluate articles or design their own research. They are better able to think about important scientific controls, to design better experiments, and to keep their interpretations within reach of their data. In short, flipping improves their training as scientists and consumers of research, which we hope will persist throughout their lives. Though this work is hard (for both you and the students), it pays off.

    I’ll leave you here with a few quick words of advice about flipping your own course. First, you don’t need to flip your entire course all at once: consider flipping one day this semester and see how it goes, then add another next semester. Second, borrow from people who have done this already. Raid listservs and teaching websites. Email me and I will happily send you my materials (scripts, videos, quizzes, activities, etc.) or give you a pep talk. Talk to your colleagues and share with them. Third, tell your students they will be in a flipped course and, importantly, why. Give them the data I’ve given you; reassure them that their papers will be stronger, their grades will be better, and they will be happier. They will get on board. Fourth, and perhaps the scariest for junior faculty like me, accept that the first semester you flip, your teaching evaluations may take a hit. Know that you’re gambling, yes, but it’s a good bet: your evaluations will likely rise higher down the road once you’ve sold your students, once they know what they’re getting by enrolling in your course, and once you have mastered the flip.



    Kille, D. R., Forest, A. L., & Wood, J. V. (2013). Tall, dark, and stable: Embodiment motivates mate selection. Psychological Science, 24(1), 112-114.

    Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30-43.

    Peterson, D. J. (2016). The flipped classroom improves student achievement and course satisfaction in a statistics course: A quasi-experimental design. Teaching of Psychology, 43(1), 10-15.

    Wilson, S. G. (2013). The flipped class: A method to address the challenges of an undergraduate statistics course. Teaching of Psychology, 40(3), 193-199.


    Biographical Sketch

    Ellen Furlong is an Assistant Professor in Psychology and Director of the Comparative Cognition Lab at Illinois Wesleyan University. She received her B.A. in Mathematics from Transylvania University and her Ph.D. in Psychology from The Ohio State University. Before joining the faculty at Illinois Wesleyan University in 2013, she served as a postdoctoral fellow at Yale University. Ellen has taught several courses with "flipped" components, including a survey-level fully online course, a writing-intensive research methods course with flipped lectures, and a team-taught, cross-institution (Illinois Wesleyan and Transylvania Universities) May Term travel course with flipped lectures and Skype class sessions.

  • 16 May 2017 8:57 AM | Anonymous

    Teaching with Affordable Technology to Increase Student Learning

    Judith Pena-Shaff (Ithaca College)

     Amber Gilewski (Tompkins Cortland Community College)


    Last year at the APA Convention in Orlando, we participated in a symposium about the use of Open Educational Resources (OER) to increase student learning. Judith had little familiarity with OER, while Amber had been using these resources in her classes for the past two years on the recommendation of her provost, who was enthusiastic about them. A few days later, the president of Judith’s institution opened his all-faculty meeting by cautioning about the threat that one type of OER, Massive Open Online Courses (MOOCs), posed to traditional institutions of higher education. As a current participant in an Introduction to Psychology class offered through Coursera, Judith began to wonder about the educational and learning value of these resources. Will OER increase students’ learning? And if so, how? In this essay, we discuss the value of open educational resources for increasing student learning opportunities, as well as their challenges and promises.

    Open Educational Resources (OER) are “teaching, learning, and research resources that reside in the public domain or have been released under an intellectual property license that permits their free use or re-purposing by others” (Atkins, Brown, & Hammond, 2007, p. 4). Inspired by the Open Source Software (OSS) and Open Access (OA) movements of the mid-1990s (Baraniuk, 2008; Wiley & Gurrell, 2009), OER are a relatively new phenomenon that aims to (1) provide free, or at least affordable, access to knowledge and to digital educational and research resources, and (2) reduce the high cost of teaching materials. Philanthropically, it is hoped that OER will help to equalize worldwide access to knowledge and provide everyone with the opportunity to share, re-use, and re-conceptualize knowledge (Atkins et al., 2007; Baraniuk, 2008). OER include, but are not limited to, learning resources such as full online courses, courseware (e.g., syllabi, lectures, quizzes, and homework assignments), learning objects, assessment tools, software (e.g., the IHMC CmapTools program), learning management systems (e.g., Sakai), textbooks, encyclopedias (e.g., Wikipedia), simulations, and other resources or techniques used to support access to knowledge (Hylén, 2006; Downes, 2007). Some well-known open education projects are Connexions, which started in 1999; Wikipedia, launched in 2001; a series of OER projects sponsored by the Hewlett Foundation; MIT OpenCourseWare, which began in 2002; and, more recently, platforms such as Coursera, Udacity, and edX (a joint venture between Harvard and MIT), which offer MOOCs.

    There are many reasons why psychology instructors might decide to adopt OER in their traditional face-to-face or distance learning classes. First, OER allow us to provide students with affordable access to information and knowledge. For example, Gilewski provided students with the option to use an OER textbook in her general psychology community college classes (Gilewski, 2012). They could either read the book online or print it for a small fee. She found, in contrast with previous semesters, that students spent less on their class materials, their grades improved, and fewer students withdrew from the course. However, it is impossible to know whether these results were caused by students’ access to affordable reading material.

    Second, OER allow instructors the opportunity to customize their course materials, providing students with different types of learning aids that better fit the course objectives and benefit different types of learners. For example, Audley-Piotrowski and Magun-Jackson (2012) used a custom-designed DVD with different types of learning resources to increase student preparation and involvement in a Developmental Psychology course. Their study revealed that different types of learning aids engaged different types of students: non-traditional students and self-described independent learners benefited more from the ancillary materials the DVD offered than did more traditional and dependent learners.

    In addition, OER can be used to combine different tools to help students develop shared knowledge through communities of practice. Draper (2012) explored how knowledge-building activities, such as individually and collaboratively creating concept maps, helped her students develop knowledge convergence. She used Moodle, a free course management system; an asynchronous online communication system for student collaboration; and IHMC CmapTools, a concept mapping software package that can be downloaded for free at http://cmap.ihmc.us/download/. Integrating these learning resources with instructional activities increased student engagement and participation and fostered the development of complex knowledge structures in both online and blended classroom environments.

    So far, we have presented the inclusion of OER in somewhat traditional course environments. MOOCs, however, are a different species of OER. Although the first course using the name MOOC was offered in 2008, the term became a buzzword at the beginning of 2012, with the creation of Coursera, an online platform that offers entire college courses for free. This company, started by two Stanford professors, now has contracts with well-known universities that offer free, though not yet for-credit, courses through its online platform. Judith’s experience taking an Introduction to Psychology class taught by University of Toronto professor Steve Joordens has been very positive so far, although not very challenging. The lectures are 15 minutes or less and are geared to introduce a few basic psychology concepts and theories to an audience that is very diverse in age, occupation, and geographical location. At the end of each lecture there are two ungraded multiple-choice items related to the lecture, links to free online videos (usually from YouTube), and additional readings. The online discussions are lively, and some participants have been promoted to the level of teaching assistants because of the feedback they often give to others. Other participants write lecture notes and share them with the class. Judith, like others, just watches the lectures. To obtain a certificate of completion, a student must complete two multiple-choice exams with a grade of 70 or higher. These tests permit a review of the lecture and a retest on the items so the student can correct wrong responses (much like B. F. Skinner’s Programmed Instruction technique). In addition, a short, peer-reviewed argument paper can lead to a “certificate of completion with distinction.”

    From these examples we can see that OER offer instructors and students certain advantages. Students find them more affordable than commercial sources. Thus, if access to textbooks is an issue for our students, then OER become very appealing. OER also provide equal access to learning resources worldwide. For example, in the Coursera Introductory Psychology course, all participants have access to the videos and readings, no matter where they live or their levels of education. Many of the resources can be customized by instructors (e.g. editing the textbook, adding or simplifying information). They also give instructors the flexibility to combine different learning resources to better serve their students, to favor different pedagogical approaches (from memorization to knowledge construction), and to complement the textbook. They can be designed to follow a non-linear format. Instructors can link the course syllabus to the readings, videos, and Internet resources to help students gain a better understanding of the course content. All these factors sound very appealing.

    Faculty interested in infusing more OER into their own courses can draw on resources including, but not limited to, the Community College Consortium for Open Educational Resources (http://oerconsortium.org), the Carnegie Mellon Open Learning Initiative (http://oli.cmu.edu), Saylor (www.saylor.org), and OpenStax College (http://openstaxcollege.org). For the past few years, Amber has been involved with the Kaleidoscope Project (http://www.project-kaleidoscope.org), a cross-institutional collaboration for using the best existing OER. They are always looking for new adopters in this grant-funded work.

    However, there are also challenges in adopting OER. For example, increased access does not necessarily mean enhanced or increased learning or motivation. Research shows that less than 30% of psychology students read their textbooks before class and less than 70% read them before an exam (Clump, Bauer, & Bradley, 2004). Of the 60,000 individuals who registered for the Coursera-based Introduction to Psychology class that Judith is observing, 12,000 (20%) were still actively participating at the time we wrote this essay (class announcement, June 4, 2013). This was before the first assessment took place. We wonder how many participants will actually complete all the course assignments and finish the course.

    Also, research on students’ perceptions of textbooks’ pedagogical aids (Marek, Griggs, & Christopher, 1999) shows that students tend to prefer aids that directly relate to test preparation (such as chapter glossaries, boldface definitions, chapter summaries, and self-tests) rather than aids that might lead to a deeper understanding of the course material. Therefore, it was not surprising that students in Audley-Piotrowski and Magun-Jackson’s (2012) case study focused only on the readings and concepts and not on the other resources, since the test focused mainly on the readings.

    Issues also arise from our lack of familiarity with OER and concerns over their quality. Of course, this is not much different from selecting textbooks in our own area. The main difference is that we can usually get feedback from colleagues about textbooks; since OER are not as well known, we are less likely to get feedback and must figure things out on our own. Also, we must find the OER ourselves, while textbooks usually come to our offices via publishers’ representatives.

    A major challenge relates to the sustainability of OER in terms of funding (so far most OER funding has come from educational institutions’ or foundations’ grants), technical upkeep (e.g., What happens when a problem occurs? Who maintains the sites?), and content (updating the content, reliability of sources, and so on). Several models have been proposed, particularly for the sustainability of MOOCs, such as charging participants for certificates of completion, charging employers who might be given access to participants’ grades, and of course, sponsors.

    While we have different, affordable learning technologies available today, some of the problems we face as instructors are still the same. For example, Hammer (2012) discussed students’ lack of metacognitive skills and learning strategies. Basically, many of our students do not know how to study or which learning strategies work best for them. We need to teach students these strategies directly, and help them become more conscious and purposeful in their learning. One way to do this could be by creating assignments that make them reflect on how they learn, regardless of the type of learning resources or environment where learning takes place.

    Students also need to be active in learning. To encourage more active learning in her Introduction to Psychology classes, Amber has been involved with the Carnegie Mellon Open Learning Initiative, which provides a more interactive approach to learning the material. Students read material online, watch embedded videos, engage in “Learn-By-Doing” and “Did-I-Get-This?” activities that provide immediate, targeted feedback, before they go on to take graded Checkpoints after each module. She has seen a dramatic increase in her students’ success and interaction with course material, which she’ll present at a symposium at the APA’s 2013 Convention in Hawaii. 

    In conclusion, OER provide affordable access to learning resources. Integrating OER with active learning strategies might help foster complex knowledge structures. Our role is to guide our students so that they use and take full advantage of these resources.


    Atkins, D.E., Brown, J.S., & Hammond, A.L. (2007). A review of the Open Educational Resources (OER) movement: Achievements, challenges, and opportunities (Report to the William and Flora Hewlett Foundation). Retrieved June 2013 from:  http://www.hewlett.org/uploads/files/ReviewoftheOERMovement.pdf.

    Audley-Piotrowski, S. R., & Magun-Jackson, S. (2012, August). Textbook alternatives and student learning in a lifespan development course. In A. M. Gilewski and D. C. Draper (Chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Baraniuk, R. G. (2008). Challenges and opportunities for the open education movement: A Connexions case study. In T. Iiyoshi & M. V. Kumar (Eds.), The Collective Advancement of Education through Open Technology, Open Content, and Open Knowledge (pp. 229-246). Cambridge, MA: MIT Press.

    Clump, M. A., Bauer, H., & Bradley, C. (2004). The extent to which psychology students read textbooks: A multiple class analysis of reading across the psychology curriculum. Journal of Instructional Psychology, 31(3), 227-232.

    Downes, S. (2007). Models for sustainable open educational resources. Interdisciplinary Journal of Knowledge and Learning Objects, 3, 29-44. Retrieved June, 2013 from: http://www.ijklo.org/

    Draper, D.C. (2012, August), Instructional strategies to promote knowledge convergence in online communities of practice.  In A.M. Gilewski and D.C. Draper (chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Gilewski, A. M. (2012, August). Using open educational resources to improve student success in introduction to psychology courses. In A. M. Gilewski and D. C. Draper (Chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Hammer, E. Y. (2012, August). Meta-studying: Teaching metacognitive strategies to enhance student success. Paper presented at the annual convention of the American Psychological Association, Orlando, FL.

    Hylén, J. (2006, September). Open educational resources: Opportunities and challenges. Proceedings of Open Education 2006: Community, culture and context.  Utah State University (pp. 49-63). Retrieved June 10, 2013 from: http://library.oum.edu.my/oumlib/sites/default/files/file_attachments/odl-resources/386010/oer-opportunities.pdf.

    Marek, P., Griggs, R. A., & Christopher, A. N. (1999). Pedagogical aids in textbooks: Do college students' perceptions justify their prevalence? Teaching of Psychology, 26(1), 11-19.

    Wiley, D., & Gurrell, S. (2009). A decade of development. Open Learning, 24(1), 11-21.  doi:10.1080/02680510802627746.



    Judith Pena-Shaff is an associate professor and chair of the psychology department at Ithaca College. She earned her Ph.D. in educational psychology from Cornell University in 2001. Dr. Pena-Shaff’s research interest is in instructional technology. Specifically, she is interested in the knowledge construction processes students use in computer-mediated learning environments with the purpose of creating a taxonomy to help instructors assess student learning.  In addition, Dr. Pena-Shaff is highly engaged in her community, often conducting evaluations of educational programs run by schools and local organizations.


    Amber Gilewski is an assistant professor of psychology at Tompkins Cortland Community College in upstate NY. She is a Psychology Fellow on the Kaleidoscope Project, which is a Next Generation Learning Challenges grant-funded collaboration of colleges in the U.S. devoted to improving student success and retention in general education courses, through the use of OER. She earned her master’s degree in Clinical-Counseling Psychology from LaSalle University in 2002 and has been teaching at community colleges since 2004.


  • 02 May 2017 7:44 PM | Anonymous

    A Short Writing Assignment for Introductory Courses and Beyond
    Mitchell M. Handelsman
    University of Colorado Denver


    I don’t want to be a downer or anything, but I have a lot of problems in my teaching. Among them:

    • Getting students to do the readings
    • Getting students to think
    • Getting students to think about the readings they do
    • Wanting to have students write in meaningful ways
    • Having too much work to do
    • Getting bored reading papers that all say the same thing
    • Having students read without being accountable until the test, which may be weeks away (Handelsman, 2016)
    In this essay I describe an assignment that solves, or at least addresses, these problems. I have students write very short papers about their reading assignments in which they do more than summarize or question. To get a sense of the assignment, imagine that you are an introductory psychology student, and you read this in the syllabus:


    Processing and Reflecting on Psychology (PROPS)
    • Actors need props, right? If you want to act like a student, you need PROPS!
    • PROPS are short reflections on—and explorations of—your reading. They can be as short as a few sentences and no longer than 1 page. You will process (do something with, reflect on) at least 2 major concepts or key terms from the material you read. Here’s what I mean by processing:
      • You can ask and answer a question about what you’ve read.
      • You can differentiate key terms from each other, or show how you might remember them.
      • You can generate a couple of new examples of a couple of key terms.
      • You can relate the concepts to material from other modules, courses, or experiences.
      • In general, you can do anything beyond just questioning (e.g., “What does the hindsight bias mean?”) or reporting (e.g., “The psychoanalytic approach deals with unconscious material.”).
    • I assign PROPS to encourage you to:
      • do the reading (Course Goals 1 and 2) and do it actively (Course Goal 3).
      • practice active learning skills (Course Goal 3), such as self-reflection, applying, and elaborating.
      • come to class, and come prepared to work (Good for ALL course goals!).
    • Logistics
      • You will write 15 PROPS this semester. At the top of each, put your name, the date, the module covered, and the number of the PROP (e.g., the first prop you submit will be “PROP 1”).
      • PROPS need to be typed, double-spaced, 12-point font, 1-inch margins, no longer than 1 page.
      • You can hand in a PROP any day for which there is a reading assignment. The 2 (or more) concepts you process must be from the reading assigned for that day.
      • You can only hand in 1 PROP per class.
      • You have some choice about when you hand in PROPS, but I encourage you to start soon!! If you wait until the beginning of March, for example, you will have to hand in a PROP every class period.
    • Grading
      • You can earn 2 points for each of your PROPS. You will earn 2 points for showing that you’ve done the reading and are doing something more than reporting or questioning 2 concepts. You will earn 1 point if you hand in the PROP on time but have not processed or reflected actively upon 2 concepts.
      • I don’t grade PROPS on accuracy, but on activity! You are rewarded for taking risks and trying to learn.
    • Hints
      • The best PROPS are those that help you answer test questions by going beyond simple, sweeping statements or stories about your life. Take risks to see if you understand.
      • Use the language of psychology. Show that you’ve done the reading (Course Goals 1, 2, and 4).
      • If you discuss personal experiences, do more than tell a story: show explicitly how the concepts apply to your experience. For example: To say that you use coping strategies and tell a story about one of them is not enough. To show why some of your strategies are problem-focused and some emotion-focused is better. To relate your coping to some other information in the book, like speculating on some biological, social, or psychological factors in your coping, is wonderful!
      • PROPS can demonstrate that you appreciate the complexity of human behavior (Course Goal 2) by avoiding simplistic and extreme statements. For example, instead of, “I find it interesting that most fields of practice use the scientific method. This means that psychology is no different than any of the other fields of study in the world,” this might be better: “Many fields of study use the scientific method. Thus, psychology shares one characteristic with fields like biology and physics. In other ways, of course, psychology is different from other fields.”

    By the way, here are the course goals that the assignment refers to:

    I teach this course so you can:

    1. Learn major concepts and findings in psychology.
    2. Appreciate the complexity of human behavior.
    3. Develop and practice more active ways of studying and learning, including writing to learn, active reading, reflection, participating in class (individually and in groups), and more effective test-taking skills.
    4. Appreciate how psychologists think; e.g., how they use scientific methods to study behavior.
    5. Develop the ability to meet deadlines and follow directions.
    Students can earn a total of 400 points in the course; thus, these papers represent 7.5% of the final grade. Of course, the relative weight of the assignment is up to you depending on your goals. In my course, students earn 300 points for test performance and the rest for two larger papers in which they process at least three concepts across at least two chapters. One of these papers can be revised, and one can be an expansion of a PROP.

         I used to have students submit hard copies of their PROPS at the beginning of class, to encourage attendance. Recently I’ve been having students submit these types of papers on our LMS a few hours before class so I have the chance to read at least some of them before class (Handelsman, 2014). This gives me a chance to address misunderstandings and tailor exercises to incorporate students’ efforts.

         You can adapt this assignment for other courses and purposes (Handelsman, 2014). For example, you can specify additional elements for one or more of the PROPS, such as having students apply concepts from the text to an outside reading, an upcoming presentation, or previous PROPS. You can increase the number of concepts as the semester goes on. In upper-division courses you might specify the type of higher-order thinking you want students to do.

         Although the final product is short, I find it helpful to let students know that they may need to write much more than one page and then edit it to show me their best work. Here is the way I often explain it:

    “A one-page paper is like a traditional five-page paper with the extra verbiage removed. In high school (or other college courses), you sometimes spend the first four of the five pages summarizing what you’ve read. Then, you have a page to go and you don’t have anything else to summarize, so you say to yourself, ‘Let me just mess around and throw in something from the previous unit that seems to relate.’ It’s on that last page that you actually do something with what you’ve read. That’s what I want! I don’t need the summary. So you may have to write all those pages, but cut out the first ones and polish up the part where you’re thinking!”

     Of course, there are still problems with this assignment (What kind of academic would I be if I didn’t see problems?):

    1. There is not enough opportunity for students to revise their work, and I do not spend enough time on grammar, style, and other aspects of writing. In my defense, I want freshmen to have ideas. Once they have something of their own to say, they may be more motivated to learn how to share their thoughts in effective ways.
    2. I still have a lot of reading to do. However, PROP reading is more interesting than reading a bunch of summaries, and the short length makes grading easier. And, of course, the assignment fits my short attention span.
    3. Students can still read the first paragraph, or any paragraph, of a chapter and write something that would work. But I figure that even a little effort is better than nothing! They still have more of an opportunity to think and read differently (Handelsman, 2016).

    I hope you see some of the advantages of this assignment and ways to adapt it to your own course objectives. And forgive me for taking more than a page to explain it.


    Handelsman, M. M. (2014, August 19). This year I’m having my freshmen do POT: Four reasons to have students rolling in papers [Blog post]. Retrieved from http://www.psychologytoday.com/blog/the-ethical-professor/201408/year-im-having-my-freshmen-do-pot-0

    Handelsman, M. M. (2016, September 28). Reading with purpose, or purposes [Blog post]. Retrieved from http://www.psychologytoday.com/blog/the-ethical-professor/201609/reading-purpose-or-purposes




    Mitchell M. Handelsman, Ph.D., is Professor of Psychology and CU President's Teaching Scholar at the University of Colorado Denver, where he has been on the faculty since 1982.  Dr. Handelsman has won numerous teaching awards, including the 1992 CASE (Council for the Advancement and Support of Education) Colorado Professor of the Year Award, and APA’s Division 2 Excellence in Teaching Award in 1995.  He has co-authored three books, Ethics for Psychotherapists and Counselors: A Proactive Approach (2010; with Sharon Anderson), Ethical Dilemmas in Psychotherapy: Positive Approaches to Decision Making (2015; with Samuel Knapp and Michael Gottlieb), and The Life of Charlie Burrell:  Breaking the Color Barrier in Classical Music (2015, with Charlie Burrell). He is an associate editor of the APA Handbook of Ethics in Psychology (2012). 


  • 16 Apr 2017 9:18 AM | Anonymous
    OMG RU Really Going to Send That?
    Email Communication with Students

    Andrew Peck, PhD

    The Pennsylvania State University

         Electronic communication plays an important role in traditional collegiate education and online learning. In 2001, the number of email messages sent outnumbered the letters delivered by the United States Postal Service (Levinson, 2010). In 2002, Bloch reported that the typed word had begun to establish itself as the primary means of interpersonal communication, mentioning a case in which a student broke up with her boyfriend via email. In fact, email has become the most widely used instructional technology (see Wilson & Florell, 2012). Recognizing this, at least one college tells students that email is the “lifeline of [their] communication with the college” (http://www.gwinnetttech.edu/webmail/, sec. 1). Interestingly, while we are most likely to initiate electronic correspondence to send course announcements or meeting requests, students tend to use their “lifeline” to make appointments, ask questions, and offer excuses (Duran, Kelly, & Keaten, 2005).


          Email can benefit faculty members and students in a variety of ways. Email is a relatively inexpensive way to communicate with many people quickly, it fosters collaboration, file sharing (Hassini, 2004), and group problem solving (Hassini, 2004; Wilson & Florell, 2012), and it provides an electronic record or “paper trail” for later reference (Wilson & Florell, 2012). Email can also increase the accessibility of the instructor (Hassini, 2004; Wilson & Florell, 2012). We can use email to provide feedback, which can foster academic development (Duran, Kelly, & Keaten, 2005), motivation (Duran, Kelly, & Keaten, 2005; Kim & Keller, 2008), and achievement (Kim & Keller, 2008). Some have noted that email can increase student writing (Hassini, 2004), although others have expressed concerns about the quality of students’ electronic correspondence (see Bloch, 2002). Email can increase communication with students who struggle with face-to-face communication, including foreign, shy, or disabled students (see Bloch, 2002; Duran, Kelly, & Keaten, 2005). Finally, email use can improve students’ perceptions of us, especially when our responses are helpful and prompt (Sheer & Fung, 2007) and include appropriate emotional content (Wilson & Florell, 2012).


          Like other instructional technologies, email is a tool, and misuse can result in unexpected consequences. Although the option to send a message to a large group of people quickly can be helpful, email does not come with “you probably shouldn’t send that” warnings, and sometimes people will send ill-conceived electronic messages to many recipients, as these examples of public Tweets (posts on Twitter) demonstrate:

    “I can't believe my Grandmothers making me take out the garbage   I'm rich f*** this I'm going home I don't need this s***”   - 50 cent (Note: I’ve added spaces and censored the message to make it more readable and appropriate for readers)

     “With so many Africans in Greece, at least the mosquitoes of West Nile will eat homemade food.”   - Voula Papachristou, Greek triple jumper who was removed from the Greek Olympic team for posting this sarcastic comment

     Although many of us are fortunate enough to have students who don’t regularly send inappropriate mass mailings to classmates, email does provide an avenue for upset students to vent before they’ve fully considered the consequences. Furthermore, while email increases the accessibility of the instructor, it also means that students have increased expectations about our availability and personal attention. Consequently, responding to email seems to have changed the nature of our work.

         Some of us prefer to use email as little as possible because the loss of non-verbal, social, and contextual cues can increase misunderstandings (Hassini, 2004), but many of us seem to treat it as a job requirement (and sometimes it is). Nonetheless, it can be time consuming to respond appropriately to student messages (Hassini, 2004), and sometimes responding becomes “the third shift in an already overcrowded day” (Mason, 2010, para. 3). Sometimes, when it is clear that students did not take the time to read important announcements sent via email, we wonder if sending email is worth the time it takes us to compose the message.

         To make matters worse, sometimes we wonder whether the emails students send are actually written by the student who is listed as the sender. In our department, my colleagues and I have received messages from student accounts that were actually written by those students’ friends, roommates, and parents. Ironically, some of us might wish students’ parents wrote messages for their children more often, as student messages can be too casual for many educators (see Bloch, 2002). It is not uncommon for electronic messages to lack proper grammar, capitalization, and punctuation, as this example demonstrates:

     “can i come 2 ur office i need 2 meet w u b4 the test i have ?s thx”

     Faculty Member Expectations

          Faculty members vary in their expectations of student email (Biesenbach-Lucas, 2007). To help students understand specific expectations, some of us include a statement about email communication in our syllabi. Here is an excerpt from a sample syllabus that focuses on instructor accessibility and other concerns:

    Email policy: On weekdays, I check my mail once -- in the early morning. If you send me an e-mail after 6 a.m., do NOT expect an answer until the next day. I do NOT check my mail at all on weekends. So if you send me a message anytime after 6 a.m. on Friday, you will not get an answer until Monday morning. I do not open emails with attachments. I do not open emails without subject lines. I do not open emails written in languages I can’t read – so be sure if you have your email set to a non-English format that your name and information come through in English. (http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm, para. 2)

    Here is an excerpt from another syllabus that focuses on tone and style:

     …all email communication will follow the guidelines enumerated here.  Email should be composed in formal, professional language, and with attention to the propriety accorded to the position of the writer, and the addressee…(http://www.hist.umn.edu/hist3722/syllabus.html, para. 9)

     Some might worry that including these types of statements in a syllabus might cause students to view them as overly strict, but students may not be aware of how they come across in their email and may appreciate knowing their teachers’ expectations (Martin, 2011).

         While a syllabus statement can help, challenging email messages seem to come with the job. Although there are no recipes or guidelines we can use to construct the perfect email message, people have offered a number of helpful considerations. To help sort out these considerations, I have organized them below using the popular green, yellow, red color-coding scheme to reflect the potential gravity of the student’s message or the educator’s response.

     Code Green Messages

          Fortunately, we sometimes get “Code Green” messages. These messages are complimentary or positive in tone and content (I wanted to thank you for…, I enjoyed your course, are you teaching others…), ask for appropriate information respectfully, or include appropriate requests. Generally, these messages are easy to respond to professionally, so there is little need to offer strategies for responding to these types of messages.

     Code Yellow Messages

          Unfortunately, “Code Green” messages are often outnumbered by “Code Yellow” messages. These messages require us to proceed cautiously, as the message might require a considered response. Experience suggests that there are several types of “Code Yellow” messages: those that demonstrate that students misunderstand their own responsibilities, messages containing inappropriate personal information, and messages motivated by students’ anxieties (see Wilson & Florell, 2012, for an excellent review).

         Sometimes students misunderstand their own responsibilities, and deflect or request accommodations to compensate (Wilson & Florell, 2012). For example, my colleagues and I get messages from students like these:

    Dr. __ , I didn’t do well on your final exam. I am on the __ team and need an A in your class to get into my major and retain my scholarship. Please help.

     Dr. __ , I didn't realize the ___ was due yesterday. What can I do to make-up those points?

     Dr. __, I won’t be prepared for class discussion and can't do the first reading quiz because I just ordered the book. I apologize for any inconvenience.

     Dr. __ , I didn't make it to class today. Can you please send me the notes I missed?

    Sometimes students will include personal details of their lives inappropriately to justify a request. Sometimes lonely students just write to be friendly, and sometimes students seeking relationship advice confuse us with writers for the Dear Abby column. Consistent with examples provided by Wilson and Florell (2012), here are some example messages my colleagues and I received:

    Dr. __, How are you? I would like to make an appt. to meet with you. I don’t have anything specific to discuss, I just thought I would stop in to say hi and chat. I have two dogs named….

    Dr. __ , Help!…me and my friend hooked up once in the beginning of the semester and I liked her but didn't think she liked me back so I moved on, and……but now...what should I do?

    Sometimes “Code Yellow” messages are sent by conscientious and responsible students whose anxieties get the best of them.  Consistent with examples provided by Wilson and Florell (2012), here are some example messages we received:

    Dr. ___  , I am in your 11:00 am class. I completed the extra credit writing assignment in class today, but I didn't receive credit in the online grade book yet. Please get back to me right away. I really need this credit. [message sent at 1:30 pm]

    Dr. ___ , I wonder if the study guide you gave us is really everything we need to know for the final. We didn’t cover Chapter 11 in class, and it isn’t on the syllabus, but should I study it anyway? I emailed you earlier today, but I didn’t hear back yet.

    Sometimes, students send “Code Yellow” messages requesting information that is outside of the responder’s expertise. In these cases, it is appropriate to redirect the student to the appropriate resource, often an academic advisor or health services professional. However, many “Code Yellow” messages are class specific, requiring us to respond directly. In these cases, we should try to treat these moments as “teachable moments.” We should model professionalism, maintain a professional tone, and offer appropriate content (Wilson & Florell, 2012). Sometimes leading by example can help, and one never knows who will read the message, especially when technologies make it easy to share electronic correspondence with others.

         As mentioned above, students appreciate it when we include emotional content in our responses (Sheer & Fung, 2007), but it is important to balance a congenial tone with a professional one. One way to do that is to express empathy or sympathy when saying “no” (Wilson & Florell, 2012).

    Example: Thanks for letting me know. I appreciate your dilemma. I hope that you can stay on the team and keep your scholarship. I’d really like to accommodate your request, but I have to assign your grade on the basis of merit and abide by the grading policies in our course syllabus or I will…. violate departmental and college policies….create an unfair situation for other students….

    Wilson and Florell (2012) have also recommended that we provide students with perspective and encourage responsible action.

    Example: Unfortunately, you can’t make it up, but it is only worth…you can still do well in the course if you…..

    Example: Yes, you can do that. Please see the syllabus for details.

    They also recommend ending our messages with a positive and sincere tone when possible, but they recognize that a persistent student may struggle to take “no” for an answer. In these cases, it is up to us to end the conversation directly, but not aggressively, and to ignore additional email from the student about the same issue.

    Example: Thanks for following-up and providing more information. I hope you have a good weekend.

    Example: I appreciate your continued concerns, but as I said, there isn’t anything else I can do without violating college/course policies. I consider this matter closed.

    Code Red Messages

    While “Code Yellow” messages require us to slow down and respond cautiously, “Code Red” messages often require us to stop what we’re doing to construct a planned response. “Code Red” messages are highly emotional, highly critical, or aggressive in tone. Examples include pleas for help, student disclosures of abuse or suicidal inclinations, and hostile messages from irate students. While discussing strategies for responding to aggressive behavior, Tunnecliffe (2007) listed a number of potential causes of students’ anger. He noted that some aggression stems from a lack of critical knowledge, inaccurate information, unrealistic expectations, or previous rewards for aggressive behaviors. Research on the development of the teenage brain also suggests that teenagers are more likely than we are to become highly emotional, and that emotion may cloud students’ reasoning abilities (for an example, see Spinks, 2013). Regardless of the factors involved, many aggressive messages seem to be triggered by perceptions of unfairness or inequity.

         Because of the nature of “Code Red” messages, there are a number of things to consider when responding. On many campuses, when faculty members are alerted to imminent threats of harm (including student self-harm), they are required to alert their chairs or department heads and campus or local police. Many campuses also have counseling or intervention teams, other student resources, or partnerships with community programs that can help. When appropriate, we should introduce these resources to victimized students and consider facilitating contact or appointment scheduling. If nothing else, we can encourage victimized students to go to the local hospital, where hospital personnel and case workers can get involved.

         On some campuses, faculty members are instructed NOT to take on the role of detective or police officer or to ask the student specific questions about a traumatic experience. Doing so can increase feelings of victimization and make it less likely that the student will share critical details with law enforcement officials, student conduct authorities, or health professionals. Instead, we are advised to take the information the student has provided at face value, ask a few general questions (What happened? When? Where?) so that information can be passed on to authorities, reassure the student that we will do what we can to help, and then follow campus guidelines for helping.

         Dealing with aggressive students can be challenging and emotional for us. My colleagues and I have found it helpful to walk away from the computer and let some time pass before responding (usually 12-24 hours). This gives us time to cool down so that we can respond more professionally, and it gives the student time to cool down, too. Occasionally, students will realize their message contained inappropriate content or had an inappropriate tone, and they will send a follow-up apology. While there isn’t any research on successful strategies for responding to aggressive email, recommendations can be drawn from discussions about the best ways to communicate with angry students to promote de-escalation. It is important to avoid using a reprimanding tone (Tunnecliffe, 2007), which can promote defensiveness and increase perceptions of victimization. It is also important to recognize that anxiety can increase threat perceptions (Craske et al., 2009), and that anxious students are more likely to interpret ambiguous information or references to authority as more threatening than intended. A calm, jargon-free tone might be more successful (Tunnecliffe, 2007; University of Oregon Counseling and Testing Center, 2012). With this in mind, we should avoid using capitalized words or bold text for emphasis, as some students interpret these formatting cues as yelling rather than emphasis (Hassini, 2004). The University of Oregon Counseling and Testing Center recommends acknowledging the student’s emotion, and Larson (2008) recommends using content cues that facilitate an empathetic or sympathetic tone (e.g., I can see this is really important to you).
We should use the present tense, focusing on the present situation rather than rehashing the past (Tunnecliffe, 2007) and explain what we can do (Larson, 2008) rather than explaining why we can’t address the student’s concerns, even if that is nothing more than an offer to meet and discuss.

         Some of us might want to respond to criticisms from students directly. We all make mistakes, and sometimes students’ criticisms are based on something legitimate. In these cases, it might be best to agree with what is accurate and share your plan for corrective action (Tunnecliffe, 2007). If criticism is vague, it is fine to ask for clarification (Larson, 2008). Sometimes the initial criticism, or the response to your request for clarification, can be lengthy. In these cases, it might be best to address concerns globally rather than respond to individual concerns (Tunnecliffe, 2007). If none of these strategies sound appealing, we can always deflect the criticisms by simply thanking students for sharing their views (Tunnecliffe, 2007).

     Final Thoughts: Maintain Perspective

    Regardless of how we choose to respond to critical email messages, it is important to remember Alexander Pope’s “to err is human; to forgive divine” and to cut ourselves some slack (Tunnecliffe, 2007). It is also important to recognize that, while we can make the most out of “teachable moments,” we can’t get through to everyone (Larson, 2008). Research has shown that readers who are angered by email attribute the tone to the writer’s personality (Levinson, 2010). Student politeness affects our feelings toward the student, our beliefs about the student’s competence, and our motivation to help (Stephens, Houser, & Cowan, 2009; Bolkan & Holmgren, 2012). So, it is critically important to remember and apply the lessons we teach our students about the Fundamental Attribution Error and consider that situational, rather than dispositional, factors can lead a student to send inappropriate email.

         Steve Johnson, a football player for the Buffalo Bills, blamed God for a dropped pass and posted the following to Twitter:


    So, the next time you read an annoying email message from a student, take a moment to appreciate that you are in good company.


    Biesenbach-Lucas, S. (2007). Students writing emails to faculty: An examination of e-politeness among native and non-native speakers of English. Language Learning & Technology, 11(2), 59-81.

    Bloch, J. (2002). Student/teacher interaction via email: The social context of internet discourse. Journal of Second Language Writing, 11, 117-134.

    Bolkan, S., & Holmgren, J.L. (2012). ‘‘You are such a great teacher and I hate to bother you but...’’: Instructors’ perceptions of students and their use of email messages with varying politeness strategies. Communication Education, 61(3), 253-270.

    Craske, M.G., Rauch, S.L., Ursano, R., Prenoveau, J., Pine, D.S., & Zinbarg, R.E. (2009). What is an anxiety disorder? Depression and Anxiety, 26, 1066–1085.

    Duran, R.L., Kelly, L., & Keaten, J.A. (2005). College faculty use and perceptions of electronic mail to communicate with students. Communication Quarterly, 53(2), 159-176.

    Gwinnett Technical College. (n.d.). Student webmail. Retrieved from http://www.gwinnetttech.edu/webmail/

    Hassini, E. (2004). Student–instructor communication: The role of email. Computers & Education, 47, 29–40.

    Kim, C. & Keller, J.M. (2008). Effects of motivational and volitional email messages (mvem) with personal messages on undergraduate students’ motivation, study habits and achievement. British Journal of Educational Technology, 39(1), 36–51. doi:10.1111

    Larson, J. (2008). Angry and aggressive students. Principal Leadership, January, 12-15. Retrieved from http://www.nasponline.org/resources/principals/Angry%20and%20Aggressive%20Students-NASSP%20Jan%2008.pdf

    Levinson, D.B. (2010). Passive and indirect forms of aggression & email: The ability to reliably perceive passive forms of aggression over email (Unpublished doctoral dissertation). Wright Institute Graduate School of Psychology, Berkeley, CA.

    Martin, R.C. (2011, June 21). Avoiding the angry email [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/avoiding-the-angry-email/

    Martin, R.C. (2012, March 2). Responding to the angry email: A follow-up [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/responding-to-the-angry-email-a-follow-up/

    Mason, M.A. (2010, July). Email: The third shift. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/E-Mail-the-Third-Shift/66312/

    Mejia, E. (n.d.). Sample English 105 syllabus. Retrieved from http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm

    Richtmyer, E. (2007). History 3722 syllabus. Retrieved from http://www.hist.umn.edu/hist3722/syllabus.html

    Sheer, V.C., & Fung, T.K. (2007). Can email communication enhance professor-student relationship and student evaluation of professor?: Some empirical evidence. Journal of Educational Computing Research, 37(3), 289-306.

    Spinks, S. (2013). One reason teens respond differently to the world: Immature brain circuitry. Retrieved from http://www.pbs.org/wgbh/pages/frontline/shows/teenbrain/work/onereason.html

    Stephens, K.K, Houser, M.L., & Cowan, R.L. (2009). R U able to meat me: The impact of students’ overly casual email messages to instructors. Communication Education, 58(3), 303-326.

    Tunnecliffe, M. (2007). Behavioural de-escalation. Retrieved from http://www.education.nt.gov.au/__data/assets/pdf_file/0014/2318/Module7TeacherNotes.pdf

    University of Oregon Counseling and Testing Center. (2012). Strategies for dealing with angry students outside the classroom. Retrieved from http://counseling.uoregon.edu/dnn/FacultyStaff/DisruptiveThreateningStudents/DealingwithAngryStudentsOutsidetheClassroom/tabid/325/Default.aspx

    Wilson, S., & Florell, D. (2012). What can we do about student e-mails? Observer, 25(5), 47-50.

  • 03 Apr 2017 9:00 PM | Anonymous

    Submitted by William S. Altman and Lyra Stein, Editors, E-xcellence in Teaching Essays
    ___________________________________________________________________________

    Using media in the classroom: A cautionary tale and some encouraging findings

    Lynne N. Kennette
    Durham College


    Instructors should use caution when implementing new methods of teaching or assessment: just because students like something doesn’t mean their learning necessarily benefits. This was recently revealed to me in one of my classes when I tried a new activity. However, as I discovered through student comments, there is a silver lining (read on!).


    One of the key skills that instructors in psychology try to develop in their students is the identification of independent variables (IVs) and dependent variables (DVs), which form the basis of research design and analysis. The very foundation of the scientific method is identifying how changes in one variable relate to another. I wondered whether students would show a performance advantage (or any preference) for media clips over the written scenarios I typically use for identifying IVs and DVs in experiments. So, I presented students with video clips from episodes of the television series MythBusters (Discovery Channel), audio clips from National Public Radio’s Radiolab series, and my traditional written experiment scenarios.

    Burkley and Burkley (2009) reported the benefits of using MythBusters clips to illustrate experimental designs. Students enjoyed the use of these clips in class, and performed better on MythBusters-related exam questions (compared to control questions). I suspected that students would prefer the video and audio scenarios for their entertainment value, but wondered whether their performance would actually benefit. Previous research suggested that students might both prefer and benefit from multimedia formats because it would stimulate interest and thus retention (Nowaczyk, Santos, & Patton, 1998). Media may also be more engaging than a written description, and engaging content leads to better learning of information (Tobias, 1994), and as we know, students put more effort into tasks they find interesting (Renninger, 1992).

    However, it is also possible that the additional information provided by audio and video clips could distract students from the relevant information required to complete the task of identifying IVs and DVs (Walker & Bourne, 1961). This distracting information may come from the irrelevant “story-telling” details required to make these media commercially appealing (especially in the case of MythBusters). Additionally, because the learner cannot as easily control the stream of information (i.e., the speed at which information is delivered), students may experience a cost when presented with media compared to the traditional written format.


    In two sections of my advanced cognitive psychology laboratory course (and following a brief review lecture on the topic of IVs and DVs), students were presented with traditional written scenarios, video clips, and audio clips and had to identify IVs and DVs. Students were assessed multiple times: immediately following the IV/DV review lecture (Time 1), during the second-to-last week of class (Time 2), and on the very last day of class (Time 3; here I presented previously encountered scenarios to measure retention, but this timepoint produced ceiling effects and was therefore difficult to analyze). At the end of the class, I also asked students (anonymously) some qualitative questions to obtain their perceptions of the three question types (e.g., which of the three they perceived as easier).

    Results and Discussion

    After adjusting for final course grade, it was reassuring to find that students improved over the course of the semester (F(2, 252) = 50.87, p < .001, ηp² = .288). Student performance on the three formats also differed (F(2, 252) = 4.01, p = .019, ηp² = .031): students answered the traditional written scenarios more accurately than the Radiolab questions (Mwritten = 78%, MRadiolab = 68%, p = .005), but performance on the written scenarios did not differ from the MythBusters questions (p = .128). What is perhaps even more interesting is that students perceived all three formats to be of similar difficulty, but indicated a preference for the MythBusters clips over the Radiolab audio clips. In addition, many students provided unsolicited feedback about how much “fun” the video and audio clips were and said that these allowed them to finally “get” IV manipulation and DV measurement.
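As a quick consistency check, the partial eta squared values reported above follow directly from each F statistic and its degrees of freedom. A minimal Python sketch (the function name is mine):

```python
# Partial eta squared can be recovered from a reported F statistic:
#   eta_p^2 = (F * df_effect) / (F * df_effect + df_error)
def partial_eta_squared(f: float, df_effect: int, df_error: int) -> float:
    return (f * df_effect) / (f * df_effect + df_error)

# The two effects reported in the Results section:
print(round(partial_eta_squared(50.87, 2, 252), 3))  # 0.288 (improvement over the semester)
print(round(partial_eta_squared(4.01, 2, 252), 3))   # 0.031 (difference among formats)
```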

    So, does showing students video and audio clips actually benefit learning or performance on assessments? My experience with this activity is particularly interesting because it taught me that using media or multimedia for classroom assessment may not necessarily lead to better understanding, even though students expressed a preference for these formats. Student preference for these formats does, however, suggest that instructors can use multimedia as a valuable tool because they increase student engagement with course material.


    Some of the factors that instructors should consider when contemplating the use of multimedia for teaching and assessment include:

    Familiarity: The written format is a common way to expose students to IV and DV identification, which they may have encountered in previous courses. It is also the most common assessment format (tests and assignments), so students are familiar with it from high school. Instructors planning to use multimedia for assessments should give students ample time to practice with those less familiar formats.

    Superfluous information: Both types of media clips contained additional details that were not directly relevant to the experiment. These extraneous details could distract students (especially those not yet proficient enough in experimental design to suppress irrelevant information). Walker and Bourne (1961) found a linear decline in performance on a problem-solving task with each added piece of irrelevant information (see also Mayer, Heiser, & Lonn, 2001, for a more recent investigation).

    Entertainment: Students’ previous experience with MythBusters, Radiolab, or both (or perhaps television and radio more generally) as entertainment may make it difficult to focus on the relevant experimental features of the clips (i.e., IVs and DVs), leading to poorer performance than with the written experimental scenarios.

    Concluding remarks

    Instructors should use caution when implementing new technologies and new teaching strategies. As my recent experience demonstrates, just because students like an activity doesn’t mean they learn, perform, or retain material better. Similarly, these new techniques or formats (although interesting for students) may not be appropriate for assessments. However, it is encouraging to know that they can increase student engagement (e.g., MythBusters), which can in turn increase learning in class! Because student engagement is so important, instructors should use many tools to encourage student learning in their discipline, while keeping in mind the considerations outlined above.



    Burkley, E., & Burkley, M. (2009). Mythbusters: A tool for teaching research methods in psychology. Teaching of Psychology, 36(3), 179–184. doi:10.1080/00986280902739586

    Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187–198. doi:10.1037/0022-0663.93.1.187

    Nowaczyk, R. H., Santos, L. T., & Patton, C. (1998). Student perception of multimedia in the undergraduate classroom. International Journal of Instructional Media, 25(4), 367–382.

    Renninger, K. A. (1992). Individual interest and development: Implications for theory and practice. In K. A. Renninger, S. Hidi, & A. Krapp (Eds.), The role of interest in learning and development (pp. 361–398). Hillsdale, NJ: Erlbaum.

    Tobias, S. (1994). Interest, prior knowledge and learning. Review of Educational Research, 64(1), 37–54. doi:10.3102/00346543064001037

    Walker, C. M., & Bourne, L. E. (1961). The identification of concepts as a function of amounts of relevant and irrelevant information. The American Journal of Psychology, 74(3), 410–417. doi:10.2307/1419747

    Author bio

    Lynne N. Kennette, Ph.D. is a Professor of psychology and program coordinator for General Arts and Sciences programs at Durham College in Oshawa, Ontario (Canada). She is a graduate of Wayne State University (Detroit, Michigan, M.A. and Ph.D.) and the University of Windsor (Windsor, Ontario, B.A.). She teaches primarily general education courses in introductory psychology and her research focuses on the SoTL as well as how the mind processes languages. This research was conducted at Wayne State University.

  • 15 Mar 2017 3:57 PM | Anonymous

    Teaching in the Core Curriculum:
    Re-thinking our Approach to Introductory Psychology Courses

    Amie R. McKibban
    University of Southern Indiana

        “I am losing hope, Amie. Our students are being raised in a political system that is guided by economic theory. How can I teach students the value of higher education when they come to college asking ‘what job is this going to get me and how much money am I going to make?’” This was the start of a very long conversation I recently had with a former colleague. Indeed, his concerns are well founded, as higher education has been at the center of a heated debate for the last several years. Political critics and academic administrators alike have given much attention to the idea that we need more college graduates with specialized skill sets as a way to increase graduates’ employability. Harvard English professor James Engell (n.d.) laments, “an emphasis on majors believed to land a good job… appeal to ‘utility,’ to a supposedly clear-sighted appraisal of what the ‘real’ world demands of college graduates” (para. 2). As Engell further discusses, this central parable in higher education is in conflict with the reality that few entry-level jobs require four years of specialized knowledge.

        In a recent survey, the American Management Association (2012) found that over half of executives felt their employees scored average, at best, in four areas: critical thinking, communication, collaboration, and creativity. Most of the executives surveyed agreed that they need “highly skilled employees to keep up with the fast pace of change” in business (para. 3). Yes, college graduates do need a specialized skill set, but one that focuses on critical thinking and creativity rather than content-specific knowledge. As Engell (n.d.) points out, even professional schools (e.g., law and medicine) want students who have been exposed to a broad range of knowledge; students who can think critically and “look at life as a whole” (para. 3). In other words, we need to begin reemphasizing the value of a liberal arts education and the utility of the core curriculum. As many of us in higher education know, the goal of a liberal arts education is not specialized knowledge or training. Rather, a liberal arts education aims to prepare students to function as productive citizens in a diverse and complex world (Task Force on General Education, 2007). Core curricula at many institutions embrace the same philosophy. This is often asserted in declarations similar to my own institution’s, stating that the core curriculum embraces non-specialized and non-vocational learning, with an emphasis on critical thinking (the ability to analyze and evaluate information) and information processing (the ability to locate, gather, and process information).

        With this in mind, I argue that what the “real” world actually demands of our students is at the very heart of the core curriculum: a curriculum that prepares students for citizenry and productivity, regardless of major. Further, I propose that teaching Introductory Psychology from a core curriculum perspective is a step toward addressing the disconnect Engell so eloquently discusses. Although numerous instructors may currently approach the teaching of Introductory Psychology as a core curriculum class, there are just as many who take a content-based approach; that is, they structure the class with the goal of preparing students to succeed in subsequent psychology courses should they declare a major in psychology. For those of you who fall into this latter category, I encourage you to reconsider the guiding philosophy of the course. In the remainder of this essay, I offer steps (points of consideration) for restructuring the course, and reflect on my own experience teaching the class for 13 years, providing insights and examples to help guide you through these considerations. I strongly believe in academic freedom, and therefore these should be taken as general guidelines. You know your students, community, and state requirements best; hence, the content of your actual class should be tailored accordingly.

    Step 1: Develop a course that reaches the majority.

        Although many of us would prefer to receive “graduate-school-bound” students in our classrooms, the reality of teaching is that many students who cross our paths will discontinue their formal educational pursuits after obtaining a bachelor’s degree. Others discontinue before completion of their degree. The majority of students will need to be prepared, as well as possible, for the realities of the working world. A core curriculum approach best meets this reality; I structure my Introductory Psychology course accordingly. Much of my course’s focus is on application of the material to the real world (i.e., making the connections between theory and example) rather than memorization of content. I achieve this largely by telling stories, giving personal anecdotes, discussing clips from popular television shows, and analyzing articles in local and national newspapers.

        My approach is based on fulfilling two tenets of the core curriculum: critical thinking and information processing. Using content from the text to critically evaluate a news article, for example, reinforces the importance of a broad knowledge base for the students. It also models creativity, one of the four skill sets discussed by the American Management Association (2012). By making the course material relevant to their lives, students are better equipped (and more motivated) to actively engage with the content. As one student recently wrote in my evaluations, “Many of the personal anecdotes and stories that were used to help teach the concepts will be with me for a long time.” The point is this: what you do with the content is much more memorable and meaningful than the content itself. This notion brings me to the second step in re-thinking Introductory Psychology as a core curriculum class.

    Step 2: Choose content for your course based on usability.

        Oftentimes we feel pressure to cover as much material as possible. This makes sense if you are preparing students for the AP test in psychology or if the only students required to take Introductory Psychology at your institution are psychology majors. For many of us, however, this course is part of a larger curriculum, and many students (especially freshmen and sophomores) will filter through our classrooms. As such, I argue that it is not the quantity of information we cover that is important, but the quality. Cut content for the sake of experience. Although this may cause some of you to cringe, I offer this: there are many terms, definitions, and facts that we forget along the way (really, how many of you can remember everything from your intro to political science course?), but we remember the process. That is, our students may not remember the difference between a conditioned and unconditioned stimulus, but if we make the content experiential they will remember the process of classical conditioning.

        Given that many Introductory Psychology students will not become psychology majors, you should choose content by asking yourself “if this were the only course my students took, what would I want them to understand?” That is, what material (theories and concepts) will help students become more productive citizens? What do you feel is most important for them to understand and use in their everyday lives? In other words, what processes are important? For example, I always cover judgmental heuristics when discussing cognitive psychology, using current events in politics and recent findings in medicine. Indeed, understanding how humans make decisions is important in being able to make sound decisions and discover creative solutions. It is also an important process in becoming a knowledgeable consumer of information and services. What processes you feel are important to achieving the goals of the core curriculum are up to you. Choose them, and spend time on them in class. The students will remember these things. As a former student recently told me, “Every time I watch the news or read an article on Facebook, I can’t help but think of you and everything we learned in class. I find myself exclaiming ‘Darn it, McKibban!’ all of the time.”

    Step 3: Seek continual feedback from your students.

        Structuring your course in a way that promotes skill development rather than content-specific knowledge (application rather than memorization) requires continual feedback from your students. Waiting for the results of your teacher evaluations is not sufficient. I have found that having someone outside of my department come in for 20 minutes and run a focus group (while I am not there) results in the best feedback. With whatever approach works for you, ask your students, in an anonymous format, what they find effective about your teaching style, what content they have found most applicable and why, and what is and is not working for them. Tailor the questions to the individual class and discuss the results the next class period. This is something that can be done once or twice during the semester. Students will have suggestions, as well as good insights. The one “golden rule” of implementing this feedback is that you do make changes, when reasonable.

        This idea of a continual feedback loop is not only mutually beneficial, but speaks to the goals of the core curriculum. It gives your students decision making power over their education and provides them with experience in collaborating with an expert in the field when making those decisions. If we are to prepare students for the demands of the world, effectively communicating with others is a skill they must develop, especially when those “others” are people in higher positions. Again, this process is important in developing a course that promotes critical thinking and assists in the development of communication, collaboration, and creativity. Not to mention, you will learn just as much from this process as your students.

    Concluding Remarks

        The steps I have offered are meant to give you a framework in reconsidering the guiding philosophy of Introductory Psychology course development. Given the nature and breadth of the course, we have the unique opportunity to prepare our students for citizenry and productivity; for the challenge of seeing the world as a whole; and for a lifetime of critical thinking and reflection. I encourage you to ask, given your academic environment and situation, if your students would benefit from a focus on quality over quantity. I challenge all of us to find the best way possible to meet the needs of our introductory students, knowing that many of them may not finish college, or will complete a degree outside of the field of psychology. I ask you to tell your students that “whether or not you stay in college and no matter what major you ultimately choose, I promise that you will use the information learned in this class,” and then live up to that promise. After all, psychology in and of itself embraces the philosophy of a liberal arts education and the goals of a core curriculum, and what better class to demonstrate this with than Introductory Psychology? What better way can we tell students “this is the value of higher education?” I think that those of us who teach this class can relate to Engell’s (n.d.) statement that “the aims [of a liberal arts focus] are at once personal and social, private and public, economic, ethical, and intellectual” (para. 9).


    American Management Association (2012). Executive summary: AMA 2012 critical skills survey. Retrieved from http://www.amanet.org/uploaded/2012-Critical-Skills-Survey.pdf

    Engell, J. (n.d.). Professor of English James Engell offers his reflection on the value of a liberal arts education. Retrieved from http://www.admissions.college.harvard.edu/about/learning/liberal_arts.html

    Task Force on General Education. (2007). The value of a liberal arts education. Retrieved from http://www.admissions.college.harvard.edu/about/learning/liberal_arts.html

    Amie R. McKibban, professor of psychology at the University of Southern Indiana, completed her PhD in community psychology in 2009. She has presented numerous papers and published in diverse areas, ranging from attitudes toward individuals in the LGBT community, sexual health and communication, happiness, community redevelopment, academic dishonesty, and perfectionism. Using a well-known program in a way that mobilizes allies and allows for solutions at each level, she founded and directs a community and campus wide Safe Zone program. In the first few years of her tenure at the University of Southern Indiana, she has received the Willie Effie Thomas award and Phenomenal Women of USI award for her work in social justice, as well as the H. Lee Cooper Core Curriculum award for her excellence in the teaching of psychology.

  • 04 Mar 2017 5:29 PM | Anonymous

    Four Simple Strategies from Cognitive Psychology for the Classroom


    Megan A. Smith (Rhode Island College)

    Christopher R. Madan (Boston College)

    Yana Weinstein (University of Massachusetts Lowell)


    Scientists focusing on educational research questions have a great deal of information that can be utilized in the classroom. However, there is not often bidirectional communication between researchers and practitioners in the field of education as a whole (see Roediger, 2013). In this article, we describe the science behind four evidence-based teaching strategies: (1) providing visual examples, (2) teaching students to explain and to do, (3) spaced practice, and (4) frequent quizzing. Below, we provide a concise overview of these strategies and examples of how they can be implemented in the classroom before describing the science behind each strategy:


    1.      Providing visual examples
    • Relevant cognitive concepts: Dual coding
    • Description: Combining pictures with words.
    • Application examples (using social psychology topics):
      • Students can draw examples of factors determining liking or loving. For example, two people who are close vs. far away, two people who are similar vs. different, or a visual depiction of reciprocity
      • Instructors can make sure to provide video depictions of experiments where available to go with verbal descriptions (e.g., Milgram, misattribution of arousal)
    2.      Teaching students to explain and do
    • Relevant cognitive concepts: Elaborative interrogation; Levels of processing; Enactment effect
    • Description: Asking and explaining why a factor or concept is true; asking students to perform an action.
    • Application examples (using social psychology topics):
      • Students can ask and explain what factors contribute to whether one person helps another person.
      • Instructors can provide students with example scenarios of a person in need of help and ask students to describe and explain why they think a passerby may or may not help.
    3.      Spaced practice
    • Relevant cognitive concepts: Spacing; Interleaving; Distributed practice; Optimal lag
    • Description: Creating a study schedule that spreads study activities out over time.
    • Application examples (using social psychology topics):
      • Students can block off time to study for 30 minutes each day rather than only studying right before a test or exam.
      • Instructors can assign online quizzes that interleave questions from various chapters.
    4.      Frequent quizzing
    • Relevant cognitive concepts: Testing effect; Retrieval practice; Retrieval-based learning
    • Description: Bringing learned information to mind from long-term memory.
    • Application examples (using social psychology topics):
      • Students can practice writing out everything they know about a topic, for example conformity, obedience, and bystander effects.
      • Instructors can give frequent low-stakes quizzes in the classroom or online to encourage retrieval practice.
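The interleaved online quiz suggested under spaced practice (item 3) can be generated mechanically by cycling round-robin through chapters so that consecutive questions switch topics. A minimal sketch; the chapter names and question labels are purely illustrative:

```python
import itertools

# A hypothetical quiz bank keyed by chapter (names are illustrative only).
bank = {
    "ch1_methods": ["q1", "q2", "q3"],
    "ch2_social": ["q4", "q5"],
    "ch3_cognition": ["q6", "q7", "q8"],
}

def interleave(bank):
    """Round-robin across chapters so consecutive questions switch topics."""
    ordered = []
    for batch in itertools.zip_longest(*bank.values()):
        # Shorter chapters run out first; skip their placeholder None entries.
        ordered.extend(q for q in batch if q is not None)
    return ordered

print(interleave(bank))  # ['q1', 'q4', 'q6', 'q2', 'q5', 'q7', 'q3', 'q8']
```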


    Instructors can find free teaching materials for each of these strategies on the Learning Scientists website (www.learningscientists.org/downloadable-materials).

    We focus on these strategies because they were highlighted in a recent policy report from the National Council on Teacher Quality (Pomerance, Greenberg, & Walsh, 2016), which identified key teaching strategies based on evidence from the science of learning. The report found that few of the 48 teacher-training textbooks examined covered any of these learning principles well–and that none covered more than two of them (but see Thomas & Goering, 2016). These strategies also reiterate recommendations made in an earlier guide commissioned by the U.S. Department of Education (Pashler, Bain, Bottge, Graesser, Koedinger, McDaniel, & Metcalfe, 2007; also see Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013). Thus, there seems to be a gap between the research – converging evidence from controlled laboratory studies and classroom studies – and practical use of the strategies in education. While there are in-depth reviews of each of these strategies, here we provide a concise, teacher-ready overview of the strategies and how they could be applied in the classroom.


    1. Providing visual examples

    Learning can be substantially enhanced if verbal information is accompanied by visual examples. This coupling of verbal and visual information is supported by the ‘dual-coding theory’ (Paivio, 1986). This theory attributes the mnemonic benefits of providing visual examples to different cognitive processes associated with processing words and images, or even words that describe concrete ideas. This can be particularly useful when teaching abstract concepts (see Figure 1 for an example, http://www.learningscientists.org/dual-coding-example), as associating concrete and abstract terms can improve memory for the abstract information (Madan, Glaholt, & Caplan, 2010).

    Additionally, there is clear evidence that memory for pictures is superior to memory for words (Paivio & Csapo, 1969; 1973). However, this effect is fundamentally distinct from the notion of “learning styles”, where information to be learned is presented in a learner’s preferred modality. This type of differentiation is not supported by cognitive research (Rohrer & Pashler, 2012) and has often been described as a myth or urban legend (Coffield, Moseley, Hall, & Ecclestone, 2004; Hattie & Yates, 2014; Kirschner & van Merriënboer, 2013). Rather than diagnosing each student’s style and matching instruction for each individual, teachers can couple visual examples with text for all students.


    2. Teaching students to explain and to do

    One of the most effective methods to improve learning of information is to have students engage with the material more ‘deeply’, also known as elaboration (Craik & Lockhart, 1972; also see Lockhart & Craik, 1990). Elaboration has been defined in many ways, but most simply it involves connecting new information to pre-existing knowledge. Perhaps William James said it best: “The art of remembering is the art of thinking [...] our conscious effort should not be so much to impress or retain [knowledge] as to connect it with something already there. The connecting is the thinking; and, if we attend clearly to the connection, the connected thing will certainly be likely to remain within recall” (James, 1899, p. 143). Two forms of elaboration are readily applicable to classroom learning: having students explain why something is the case, and having students perform actions.

    Elaborative processing can be fostered by having students question the material that they are studying; for instance, by asking them to produce their own explanations for why a fact is true, rather than just presenting them with a complete explanation (Pressley, McDaniel, Turnure, Wood, & Ahmad, 1987). This elaboration technique is flexible enough to work in a variety of different learning situations (e.g., for students working alone or in groups, Kahl & Woloshyn, 1994). However, work on elaborative interrogation outside of the lab is just beginning (Smith, Holliday, & Austin, 2010) and we need stronger evidence from the classroom before we can confidently claim that this technique is helpful (Dunlosky et al., 2013). Another relevant technique is that of self-explanation, where students walk themselves through the steps they take during learning. This technique is helpful both when students engage in it spontaneously (Chi, Bassok, Lewis, Reimann, & Glaser, 1989), and also when teachers prompt students to produce the self-explanations (Chi, De Leeuw, Chiu, & LaVancher, 1994).

    When feasible, the most elaborative way to process information is by ‘doing’. When information could either be learned by hearing about an action, watching someone else do the action, or having the student themselves perform the action, retention was best in cases where the student performed the action themselves (Cohen, 1981; Engelkamp & Cohen, 1991). This action component can build upon the previously described dual-coding theory (Engelkamp & Zimmer, 1984; Madan & Singhal, 2012). In the classroom, this type of learning could be supported by hands-on activities (e.g., science experiments, or getting students to draw their own diagrams; Wammes et al., 2016) or field trips to museums or nature sites.


    Read Part II at: http://teachpsych.org/E-xcellence-in-Teaching-Blog/4648286

  • 04 Mar 2017 5:24 PM | Anonymous

    3. Spaced practice

    We often tell our students that cramming “doesn’t work”. That is good advice–but it is not entirely true. As many students have discovered, “cramming”–an intense study period that occurs shortly before one’s memory is to be tested–sometimes does work. Cramming often produces adequate performance on an imminent exam (Roediger & Karpicke, 2006), unless the cramming is done instead of sleep, in which case the sleep deprivation outweighs any gains from cramming (Gillen-O’Neel, Huynh, & Fuligni, 2013). The information learned through cramming, however, will subsequently be rapidly forgotten (Bjork & Bjork, 2011). In order for information to be retained more sustainably and over longer periods of time, it needs to be revisited on multiple occasions spaced out over time. This is known as distributed practice, or the spacing effect, which has been in the literature since Ebbinghaus first discovered it in the late 19th century (Ebbinghaus, 1885/1913). Despite much converging evidence over the past 100 years (see Cepeda, Pashler, Vul, Wixted, & Rohrer, 2006), this practice has not made its way into mainstream education (Kang, 2016).

    In the cognitive literature, a distinction is made between spacing and interleaving, i.e., switching back and forth between different topics or question types within a topic (Rohrer & Taylor, 2007). Indeed, Storm, Bjork, and Storm (2010) showed that interleaving produces benefits that cannot entirely be accounted for by spacing. In practice, however, it is hard to imagine an educationally relevant situation in which spacing and interleaving would be dissociated. We propose, then, that the theoretical distinction between spacing and interleaving may not be critical in terms of practical applications. Instead, teachers can focus more generally on trying to provide students with opportunities to space their studying.
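To make the spacing recommendation concrete, a review schedule can be laid out mechanically. The sketch below uses expanding intervals (gaps of 1, 2, 4, 8 days); the doubling rule and the four-review default are illustrative choices of ours, not prescriptions from the cited studies:

```python
from datetime import date, timedelta

def spaced_schedule(first_study: date, n_reviews: int = 4, base_gap_days: int = 1):
    """Return review dates with expanding gaps: 1, 2, 4, 8, ... days."""
    dates = []
    current, gap = first_study, base_gap_days
    for _ in range(n_reviews):
        current = current + timedelta(days=gap)
        dates.append(current)
        gap *= 2  # each review is twice as far out as the last
    return dates

for d in spaced_schedule(date(2017, 3, 1)):
    print(d.isoformat())  # 2017-03-02, 2017-03-04, 2017-03-08, 2017-03-16
```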

    One implementation issue is that spacing hurts performance in the short term, which makes it less appealing. Students typically feel overconfident when they cram, while spacing out learning leads them to feel relatively less confident (Bjork, 1999); but this is a “desirable difficulty”, which helps learning in the long term (Bjork, 1994). When making predictions about future performance based on different study schedules, students tend to underestimate the benefits of spacing (Logan, Castel, Haber, & Viehman, 2012). Another reason why spacing might not be used by students as often as we’d like was recently suggested by Kang (2016): this strategy may require more advance planning than simply studying one topic until a saturation point is reached. More research is necessary to fine-tune implementation of spaced study schedules, and would preferably involve teachers in classrooms.


    4. Frequent quizzing

    The use of retrieval practice to aid learning has been a major focus of the applied cognitive literature in the past decade. As with spacing, the finding that testing strengthens memory is not new (Gates, 1917). However, the message that testing helps learning is somewhat politically charged and often lost when teachers hear the word “testing” because this activates ideas related to high-stakes standardized testing. It’s important to note that frequent testing does not have to be presented as a formal quiz; any activity that promotes retrieval of target information should help (e.g., Karpicke, Blunt, Smith, & Karpicke, 2014).

    Although the mechanisms behind the retrieval practice effect are not yet fully understood, the findings are quite clear: when preparing for a test, practicing retrieving information from memory is a much more effective strategy than restudying that information (Roediger & Karpicke, 2006). This is true even when there is no opportunity to receive feedback on the quiz (Smith, Roediger, & Karpicke, 2013), as long as performance on the practice quiz is not too low (Kang, McDermott, & Roediger, 2007). The only notable exception to the retrieval practice effect is when the final test occurs immediately after study, in which case restudying can sometimes be more effective than testing (Smith et al., 2013). However, unless students are reviewing their notes before walking into the exam room, it is quite rare for students to anticipate an immediate test situation while studying. Thus, in regular exam preparation situations, a strong recommendation can be made from the literature: students ought to practice retrieval.

    A good way to integrate quizzes into regular teaching is to provide opportunities for retrieval practice during learning; quiz questions interspersed during learning produce the same benefit to long-term retention as quiz questions presented at the end of a learning episode such as a lecture (Weinstein, Nunes, & Karpicke, 2016). In addition to providing retrieval practice, this method also boosts learning by maintaining test expectancy throughout the learning experience (Weinstein, Gilmore, Szpunar, & McDermott, 2014). A combined benefit of retrieval practice and spacing can be gained from engaging in retrieval practice multiple times. Creating the specific spacing schedule for a particular educational situation is tricky because it depends on how strong the original memory is and how quickly that information will be forgotten (Cepeda, Vul, Rohrer, & Wixted, 2008). Without the use of sophisticated software to schedule spacing, a more practical suggestion may be for teachers to include quiz questions from previous topics throughout the semester, in order to facilitate a reasonable amount of spaced practice.




    There is an unending supply of suggestions on how students can learn information more effectively. Here we draw from established cognitive psychology research and distill four simple strategies to enhance classroom learning. These four strategies are: (1) providing visual examples, (2) teaching students to explain and to do, (3) spaced practice, and (4) frequent quizzing. More specifically: (1) Try to present information with both text and pictures; (2) Get students to explain the information they are learning, or if possible, have them act things out; (3) Create opportunities to revisit information over the course of a semester; and (4) Include low-stakes quizzes throughout learning to provide retrieval practice. Critically, each of these strategies is strongly supported by extant research and can be readily implemented in the classroom.




    Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe and A. Shimamura (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press.

    Bjork, R. A. (1999). Assessing our own competence: Heuristics and illusions. In D. Gopher and A. Koriat (Eds.), Attention and Performance XVII. Cognitive regulation of performance: Interaction of theory and application (pp. 435-459). Cambridge, MA: MIT Press.

    Bjork, E. L., & Bjork, R. A. (2011). Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In M. A. Gernsbacher, R. W. Pew, L. M. Hough, & J. R. Pomerantz (Eds.), Psychology and the real world: Essays illustrating fundamental contributions to society (pp. 56-64). New York: Worth Publishers.

    Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132, 354-380. doi: 10.1037/0033-2909.132.3.354

    Chi, M. T., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145-182. doi: 10.1207/s15516709cog1302_1

    Chi, M. T., De Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18, 439-477. doi: 10.1016/0364-0213(94)90016-7

    Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and pedagogy in post-16 learning: A systematic and critical review. London: Learning & Skills Research Centre.

    Cohen, R. L. (1981). On the generality of some memory laws. Scandinavian Journal of Psychology, 22, 267–281.

    Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671–684.

    Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14, 4-58. doi: 10.1177/1529100612453266

    Ebbinghaus, H. E. (1885/1913). Memory: A contribution to experimental psychology. New York: Teachers College, Columbia University.

    Engelkamp, J., & Cohen, R. L. (1991). Current issues in memory of action events. Psychological Research, 53, 175-182. doi: 10.1007/BF00941384

    Engelkamp, J., & Zimmer, H. D. (1984). Motor programme information as a separable memory unit. Psychological Research, 46, 283–299. doi: 10.1007/BF00308889

    Gates, A. I. (1917). Recitation as a factor in memorizing. New York: The Science Press.

    Gillen-O’Neel, C., Huynh, V. W., & Fuligni, A. J. (2013). To study or to sleep? The academic costs of extra studying at the expense of sleep. Child Development, 84, 133-142. doi: 10.1111/j.1467-8624.2012.01834.x

    Hattie, J., & Yates, G. (2014). Visible learning and the science of how we learn. New York: Routledge.

    James, W. (1899). Talks to teachers on psychology: And to students on some of life's ideals. New York: Henry Holt and Company. Accessible from https://ebooks.adelaide.edu.au/j/james/william/talks/.

    Kahl, B., & Woloshyn, V. E. (1994). Using elaborative interrogation to facilitate acquisition of factual information in cooperative learning settings: One good strategy deserves another. Applied Cognitive Psychology, 8, 465-478. doi: 10.1002/acp.2350080505

    Kang, S. H. (2016). Spaced repetition promotes efficient and effective learning: Policy implications for instruction. Policy Insights from the Behavioral and Brain Sciences, 3, 12-19. doi: 10.1177/2372732215624708

    Kang, S. H., McDermott, K. B., & Roediger III, H. L. (2007). Test format and corrective feedback modify the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528-558. doi: 10.1080/09541440601056620

    Karpicke, J. D., Blunt, J. R., Smith, M. A., & Karpicke, S. S. (2014). Retrieval-based learning: The need for guided retrieval in elementary school children. Journal of Applied Research in Memory and Cognition, 3, 198-206. doi:10.1016/j.jarmac.2014.07.008

    Kirschner, P. A., & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48, 169-183. doi: 10.1080/00461520.2013.804395

    Lockhart, R. S., & Craik, F. I. M. (1990). Levels of processing: A retrospective commentary on a framework for memory research. Canadian Journal of Psychology, 44, 87–112. doi: 10.1037/h0084237

    Logan, J. M., Castel, A. D., Haber, S., & Viehman, E. J. (2012). Metacognition and the spacing effect: the role of repetition, feedback, and instruction on judgments of learning for massed and spaced rehearsal. Metacognition and Learning, 7, 175-195. doi: 10.1007/s11409-012-9090-3

    Madan, C. R., Glaholt, M. G., & Caplan, J. B. (2010). The influence of item properties on association-memory. Journal of Memory and Language, 63, 46-63. doi:10.1016/j.jml.2010.03.001

    Madan, C. R., & Singhal, A. (2012). Using actions to enhance memory: Effects of enactment, gestures, and exercise on human memory. Frontiers in Psychology, 3, 507. doi:10.3389/fpsyg.2012.00507


    Paivio, A. (1986). Mental representations: A dual coding approach. New York: Oxford University Press.

    Paivio, A., & Csapo, K. (1969). Concrete image and verbal memory codes. Journal of Experimental Psychology, 80, 279-285. doi: 10.1037/h0027273

    Paivio, A., & Csapo, K. (1973). Picture superiority in free recall: Imagery or dual coding? Cognitive Psychology, 5, 176-206. doi: 10.1016/0010-0285(73)90032-7

    Pashler, H., Bain, P. M., Bottge, B. A., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing instruction and study to improve student learning (NCER 2007-2004). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ncer.ed.gov.

    Pomerance, L., Greenberg, J., and Walsh, K. (January 2016). Learning About Learning: What Every New Teacher Needs to Know. Washington, D.C.: National Council on Teacher Quality. Retrieved from http://www.nctq.org/dmsView/Learning_About_Learning_Report.

    Pressley, M., McDaniel, M. A., Turnure, J. E., Wood, E., & Ahmad, M. (1987). Generation and precision of elaboration: Effects on intentional and incidental learning. Journal of Experimental Psychology: Learning, Memory, and Cognition, 13(2), 291-300. doi: 10.1037/0278-7393.13.2.291

    Roediger, H. L. (2013). Applying cognitive psychology to education: Translational educational science. Psychological Science in the Public Interest, 14, 1-3. doi: 10.1177/1529100612454415

    Roediger, H. L., & Gallo, D. A. (2002). Levels of processing: Some unanswered questions. In M. Naveh-Benjamin, M. Moscovitch, & H. L. Roediger (Eds.), Perspectives on human memory and cognitive aging: Essays in honour of Fergus I. M. Craik (pp. 28-47). Philadelphia: Psychology Press.

    Roediger, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17, 249-255. doi: 10.1111/j.1467-9280.2006.01693.x

    Rohrer, D., & Pashler, H. (2012). Learning styles: Where's the evidence? Medical Education, 46, 34-35. doi: 10.1111/j.1365-2923.2012.04273.x

    Rohrer, D., & Taylor, K. (2007). The shuffling of mathematics practice problems improves learning. Instructional Science, 35, 481-498. doi: 10.1007/s11251-007-9015-8

    Smith, B. L., Holliday, W. G., & Austin, H. W. (2010). Students' comprehension of science textbooks using a question-based reading strategy. Journal of Research in Science Teaching, 47, 363-379. doi: 10.1002/tea.20378

    Smith, M. A., Roediger, H. L., & Karpicke, J. D. (2013). Covert retrieval practice benefits retention as much as overt retrieval practice. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39, 1712-1725. doi: 10.1037/a0033569

    Storm, B. C., Bjork, R. A., & Storm, J. C. (2010). Optimizing retrieval as a learning event: When and why expanding retrieval practice enhances long-term retention. Memory & Cognition, 38, 244-253. doi: 10.3758/MC.38.2.244

    Thomas, P. L., & Goering, C. Z. (2016, March). Review of learning about learning: What every new teacher needs to know. Retrieved from http://nepc.colorado.edu/thinktank/review-teacher-education

    Wammes, J. D., Meade, M. E., & Fernandes, M. A. (2015). The drawing effect: Evidence for reliable and robust memory benefits in free recall. Quarterly Journal of Experimental Psychology, 69, 1752-1776. doi: 10.1080/17470218.2015.1094494

    Weinstein, Y., Gilmore, A. W., Szpunar, K. K., & McDermott, K. B. (2014). The role of test expectancy in the build-up of proactive interference in long-term memory. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40, 1039-1048. doi: 10.1037/a0036164

    Weinstein, Y., Nunes, L. D., & Karpicke, J. D. (2016). On the placement of practice questions during study. Journal of Experimental Psychology: Applied, 22, 72-84. doi: 10.1037/xap0000071

    Megan Smith is an Assistant Professor at Rhode Island College. She received her Master’s in Experimental Psychology at Washington University in St. Louis and her PhD in Cognitive Psychology from Purdue University. Megan’s area of expertise is in human learning and memory, and applying the science of learning in educational contexts. Megan is passionate about bridging the gap between research and practice in education. In an effort to promote more conversations between researchers and practitioners, she co-founded The Learning Scientists (www.learningscientists.org). Her research program focuses on retrieval-based learning strategies, and the way activities promoting retrieval can improve meaningful learning in the classroom. Megan addresses empirical questions such as: What retrieval practice formats promote student learning? What retrieval practice activities work well for different types of learners? And, why does retrieval increase learning?


    Christopher Madan is a Postdoctoral Fellow at Boston College. He received his PhD in Psychology from the University of Alberta. Chris’ area of expertise is in human memory and decision making, particularly in factors that can make some information more memorable. He studies the role of factors intrinsic to the to-be-remembered information, such as emotion and reward, as well as mnemonic strategies, particularly the Method of Loci. His research program is particularly interested in how biases in memory encoding and retrieval can manifest in other cognitive domains. Chris uses a variety of methodological approaches, including cognitive psychology, neuroimaging, and computational modeling to investigate ‘what makes memories last’.


    Yana Weinstein is an Assistant Professor at University of Massachusetts, Lowell. She received her PhD in Psychology from University College London and had 4 years of postdoctoral training at Washington University in St. Louis. The broad goal of her research is to help students make the most of their academic experience. Yana's research interests lie in improving the accuracy of memory performance and the judgments students make about their cognitive functions. Yana tries to pose questions that have direct applied relevance, such as: How can we help students choose optimal study strategies? Why are test scores sometimes so surprising to students? And how does retrieval practice help students learn? She recently co-founded The Learning Scientists (www.learningscientists.org) with Megan Smith.



  • 15 Feb 2017 7:19 PM | Anonymous

    Enhancing Student Learning with Podcasting and Screencasting

    David B. Miller, Ph.D.
    Professor, Department of Psychology
    University of Connecticut

        Portable devices for media consumption became prominent in the 1950s and 1960s with the growing popularity of the transistor radio (Schiffer, 1991). Since then, there has been a cultural shift fostered by the invention of newer technologies such as the Sony Walkman in the 1980s, and in the current century, the Apple iPod and similar personal listening devices. A vast ecosystem of accessories that facilitate portability has co-evolved with these technologies (Darlin, 2006). While these devices were originally intended for listening to musical recordings, other media such as books, newspapers, magazines, movies, and podcasts have since gained popularity in the portable media market.

        Podcasts are digital recordings that can be downloaded from the Internet or from another source, such as Apple’s iTunes Store, from which they are also available for subscription, usually at no cost. Once downloaded, they can be accessed directly on a computer or transferred to a portable digital media player, such as an iPod, iPhone, or any other mobile device capable of playing audio files. (Despite the name, “podcast,” one does not need an Apple “iPod” to use these digital recordings.)

        When podcasts were first introduced around 2004, they were audio recordings. While this has remained the primary format, others have evolved. An “enhanced” podcast contains not only audio, but also a visual component, typically a series of static (i.e., no animations) Microsoft PowerPoint or Apple Keynote screens. Enhanced podcasts also contain a navigation menu. When accessed on a computer via iTunes, a new menu item appears called “Chapters.” Clicking on this unfurls a list of the “chapters” that compose the podcast, each accompanied by a small visual icon of its screen. Users can navigate to whichever chapter they want to hear, or can simply allow the podcast to play sequentially. Enhanced podcasts can be created in a variety of ways, but the most popular software packages are Apple GarageBand, which comes bundled with every Macintosh computer as part of the iLife software suite, and a shareware software package from Humble Daisy called ProfCast (http://www.profcast.com). For non-iTunes users, enhanced podcasts can be saved as .mov files playable on the Internet.

        Finally, actual video podcasts have become more prevalent. They are best used only when a video component is essential, because video can greatly increase the file size depending on how it is encoded, its dimensions, and other factors. For example, if the university cancels class because of bad weather, I upload a video podcast of that day’s lecture to keep my class on schedule. In this case, video is essential. Some podcasts, such as speeches by notable individuals, are available either as audio-only or as video. The visual aspect is appealing in such cases, but the audio alone can suffice.

    iCube: Issues In Intro   

        I began my first podcast series in the Fall of 2005, in connection with my 315-student General Psychology course. The main component of iCube: Issues In Intro is a weekly discussion of course material that I conduct with a small group of up to 20 students. The discussions, which typically last 40-50 minutes, are primarily student-driven (Sener, 2007). They ask questions and I respond. Nothing is scripted. These casual discussions take place in a seminar room near my office in which I set up eight microphones connected to an audio mixer, which, in turn, is connected to my laptop computer for capturing the audio.

        Students who participate receive no extra credit for doing so. Some students return every week, and others stop by only a few times in the semester. Because I have to identify a time when both I and a seminar room are available, there are usually many students who would like to participate but cannot due to schedule conflicts. I encourage students to send in questions via email if they are unable to attend, and we address those items in the podcast.

        The participants are highly motivated and willing to invest the extra time. Interestingly, the majority are not psychology majors, but many of them become very engaged in the course content via our podcast discussions and end up either switching majors, incorporating psychology as a double major, or pursuing a minor in psychology. As an added benefit, I’ve become the academic advisor of former podcast participants. In large classes, students and professors often have difficulty getting acquainted with one another, but podcasting greatly facilitates the kind of scholarly interactions that might otherwise not occur in large classroom settings. Having podcast participants as my advisees enables me to better serve them, and, of course, there are additional benefits to the students in terms of having at least one professor who can write somewhat detailed letters of recommendation in the years that follow.

        Perhaps most importantly, these weekly discussions provide a means of personalizing the course, making it seem psychologically “smaller.” The large class sessions are lectures with minimal opportunity for discussion; but students who participate in the podcast recordings have an opportunity to interact with me (and me with them) in a relatively informal context. Students who routinely listen to the podcasts also report a sense of having a more personal connection with me and with the student participants. While I prefer lecturing with computerized multimedia in my courses, podcasting provides an important means to incorporate active learning for those students seeking such an opportunity (McLoughlin & Lee, 2007).

        In addition to the weekly discussion, there are two other components of iCube: Precasts and Postcasts. Precasts are short, enhanced podcasts (5-15 minutes long) that I record twice weekly (because I lecture twice each week). They’re intended to provide students with important points that I’ll cover in the next lecture. I also play the Precasts before class begins for students who arrive early, which gives them yet another way of accessing the material and also provides a mechanism for “setting up” the lecture that immediately follows.

        The third component of iCube is the Postcasts, which I create sporadically. Postcasts are content modules that I record to clarify difficult concepts, or items that I feel I didn’t cover clearly in class. In recent years, I have uploaded video screencasts (see below) of full lectures to keep the class on track when school is cancelled.

        iCube is accessible via iTunes for free subscription. As is the case with participating in the recording sessions, listening to the podcasts is entirely optional. I make it available as one of several course enhancements to aid in student learning.

        Every semester, I add items to the University course evaluations to ascertain how many students are listening to iCube and whether they believe that these podcasts help them learn the material. Data gathered over the course of eight semesters, starting in Fall 2005, indicate that approximately 40% of the class listen more than occasionally to the podcasts. Of that 40%, 76% of the students report that the podcasts enhance their learning. Most of the remaining 24% report that the podcasts were only marginally helpful. The reason that most of the non-listeners give for not accessing the podcasts is that they don’t feel they have enough time to do so.

    Animal Behavior Podcasts

        In the Fall of 2006 (one year after launching iCube), I began a second podcast series for my upper-division Animal Behavior course. This course, which used to have a capacity of 50 students, now has a capacity of 150, and is also taught as a lecture. Among the 150 students, there are typically about 10 who are in the University Honors Program. Honors students at UConn may, with an instructor’s permission, convert a non-Honors course to obtain Honors credit. (Students in the upper-division Honors Scholars Program need 12 Honors credits to graduate with Honors, along with other requirements.)

        My Animal Behavior Podcasts series provides an opportunity to earn Honors credit in this course. It’s based on the iCube discussion model, but Honors students who participate are expected to attend regularly. In these 40-50-minute sessions, we discuss animal behavior course content. Like iCube, these discussions are informal and are distributed on iTunes. In recent years, there have been about 14 Honors students each semester earning Honors credit by participating in these podcasts.

    Interactive Discussion vs. Coursecasts

        In higher education, podcasting gained popularity as a means of recording and distributing entire lectures (what I refer to as “coursecasts”). Lecture recording has been around at least since the invention of affordable, portable cassette tape recorders. Today’s coursecasts are much easier to distribute because of their digital format. At some universities, coursecasts can be created by any professor at the flick of a switch when they enter classrooms outfitted with recording equipment. But one wonders about the extent to which such ease of recording has been preceded by forethought regarding course enhancement.

        Some professors fear students might skip class if coursecasts are readily available (Young, 2008). To minimize attendance problems, some professors who do coursecasting have developed counter-strategies, such as giving regular in-class assessments, recording only a portion of each lecture, waiting a week or longer before uploading the recordings, or even eliminating coursecasting altogether if attendance drops significantly.

        My own experience at UConn with both General Psychology and Animal Behavior podcasts is that students not only view these podcasts as genuine enhancements over and above the classroom experience, but also that the podcasts help the students understand the material and become further engaged with course content. Nevertheless, coursecasting appears to dominate higher education podcasts (certainly those available via iTunes U).

        Coursecasting can also be helpful on religious holidays when observant students will not be in class, and when weather conditions are not threatening enough to deter some (but not all) commuting students, yet not bad enough to result in cancelled classes. The result is that students who have legitimate reasons for being absent from a particular lecture will still have the opportunity to access the course content.

        A major issue for coursecasting is the inclusion of copyrighted material in these distributed lectures. Materials that may have been used legally in a classroom through the “fair use” provision of the Copyright Law of the United States should not be distributed in downloadable podcasts. Instructors who record and then distribute lectures are legally required to edit out such materials prior to distribution. Unfortunately, some of the automated recording systems installed in lecture halls make this difficult because the files are immediately uploaded to a server. In situations where coursecasts are editable, instructors need to acquire expertise in editing as well as a willingness to devote the time for such post-production following each lecture. Thus, routine coursecasts not only have questionable value as an educational enhancement but also potentially have legal consequences.

        Coursecasts might provide an enhancement if approached differently. For example, instead of recording in-class lectures, the actual course content could be delivered by recordings of the professor for students to access online on a regular basis. Class time could then be used for discussion, clarification, demonstrations, examples and applications that weren’t included in the recorded podcasts, and student presentations. Perhaps a better way to conceptualize the application of such media for classroom use is to use the term “coursecast” in reference to a recording of a live classroom lecture, and “screencast” as a recording intended to substitute for a live lecture, thereby providing a basis for what has come to be known as a “hybrid” or “flipped” course.


        In a sense, a screencast can be viewed as an evolutionary advance relative to podcasts and coursecasts. Screencasts are dynamic in the sense that they are produced by recording all activity on one’s computer screen with added narration, edited with sometimes powerful post-production tools, and then exported as videos to be uploaded to the Internet for viewing. Software programs such as ScreenFlow (http://telestream.net) and Camtasia Studio (http://techsmith.com) offer powerful, but user-friendly interfaces for producing screencasts.

        Screencasts can range from simple tutorials (e.g., instructions to be followed in a laboratory course), to elaborations of points made in class, to entire lectures and even entire courses, as would be the case with a hybrid or flipped course.

        In 2009, I used ScreenFlow to convert my large Animal Behavior lecture course to a hybrid course in which most of the content was delivered online via streaming video. Students were able to access the videos anytime on a password-protected server, and we met once weekly for discussion, questions, and additional course content not covered in the screencasts. The post-production editing tools enabled me to focus students’ attention on particular screen elements, which is not easily done in a live lecture. Additionally, students were able to pause the videos, replay parts if they so desired, and take thorough, high-quality notes.

        The time that it took (well over 400 hours) to produce the screencasts paid off in terms of student engagement in course material and learning. Almost half of the class of 140 students earned course grades of “A,” and not a single student failed the course the first time it was offered in Fall 2009. It’s been offered in this format every Fall since then with similar results.

        What is clear is that technology (podcasts, coursecasts, screencasts, and other innovations), when used properly, can serve as pedagogical enhancements. However, technology should not be used just for the sake of using it, or simply because it happens to be available. Pedagogy must always precede technology.


    Darlin, D. (2006, February 3). The iPod ecosystem. The New York Times, C1.

    McLoughlin, C., & Lee, M. J. W. (2007). Listen and learn: A systematic review of the evidence that podcasting supports learning in higher education. In: C. Montgomerie & J. Seale (Eds.), Proceedings of ED-MEDIA 2007 World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 1669-1677). Vancouver, Canada, June 25-29, 2007.

    Schiffer, M. B. (1991). The Portable Radio in American Life. Tucson: The University of Arizona Press.

    Sener, J. (2007). In search of student-generated content in online education. Retrieved February 15, 2013, from http://www.e-mentor.edu.pl/artykul/index/numer/21/id/467

    Young, J. R. (2008). The lectures are recorded, so why go to class? The Chronicle of Higher Education, 54, A1.


    David Miller is a Professor of Psychology, Associate Department Head, and Coordinator of Undergraduate Studies at the University of Connecticut at Storrs.  He received his Ph.D. at the University of Miami in 1973, and his research has focused on animal behavior, both in the field and in the laboratory.  He was a Postdoctoral Fellow at the North Carolina Division of Mental Health, where he did field research on parent-offspring auditory interactions of several avian species.  In 1977, he became an Alexander von Humboldt Fellow at the University of Bielefeld (Germany) in the Department of Ethology and a participant in a nine-month interdisciplinary conference on “Behavioral Development in Animals and Man” at the Center for Interdisciplinary Research. He returned to the North Carolina Division of Mental Health in 1978 as a Research Associate, where he began a long series of studies on alarm call responsivity of mallard ducklings, which continued when he joined the faculty at the University of Connecticut in 1980.  Beginning around 1990, his long-standing interest in the effective use of multimedia in the classroom expanded and has continued to evolve.  He has received several awards for teaching excellence at the University of Connecticut and, in 1989, was the recipient of The National Psi Chi/Florence L. Denmark Faculty Advisor Award “for outstanding contributions to Psi Chi and psychology.”  He received the high honor of University of Connecticut Teaching Fellow (1997–1998), and, in 1999, his work in multimedia instructional design and classroom implementation was recognized with the Chancellor’s Information Technology Award.  In 2005, he received the University of Connecticut Alumni Association Faculty Excellence Award in Teaching at the Undergraduate Level, as well as the 2005–2006 University of Connecticut Undergraduate Student Government Educator of the Year Award.  
In 2007, he received the University of Connecticut Outstanding Student Advisement and Advocacy Award, and his efforts in podcasting were recognized by the national publication Campus Technology, which awarded him the 2007 Outstanding Innovator Award in Podcasting. In 2011, he received the Frank Costin Memorial Award from the National Institute on the Teaching of Psychology for promoting quality teaching methods, as illustrated in a poster on screencasting, and, in 2012, the Animal Behavior Society Distinguished Teaching Award. He has served on several editorial boards and was Editor-in-Chief of the scholarly journal Bird Behavior for 15 years. In recent years, Dr. Miller has devoted considerable time to creating computerized, multimedia versions of his animal behavior and introductory psychology courses. Multimedia production of university-level educational material is one of his foremost activities. His most recent multimedia project involved a major transformation of his Animal Behavior course into 90 screencast movies, an effort that was also featured in Campus Technology magazine.
  • 02 Feb 2017 9:07 AM | Anonymous

    Ditching the “Disposable Assignment” in Favor of Open Pedagogy

    Rajiv S. Jhangiani

    Kwantlen Polytechnic University

    Ever since George Miller’s famous APA presidential address (1969), many others have called upon our field to “give psychology away” (e.g., Epstein, 2006; Goldman, 2014; Klatzky, 2009; Lilienfeld, Ammirati, & Landfield, 2009; Tomes, 2000; Zimbardo, 2004). There is arguably no better way to achieve this than by adopting open pedagogy to place the knowledge base of our discipline in as many hands as possible.

    With open pedagogy, students are not just consumers of educational resources but also producers of educational resources. A key aspect of open pedagogy therefore involves replacing “disposable assignments” with “renewable assignments” (Wiley, 2013). Disposable assignments are those that are typically only seen by the instructor. Students often see little point in them (and rarely revisit them) and many instructors despise grading them. David Wiley, an open education pioneer, describes them bluntly:

    They’re assignments that add no value to the world – after a student spends three hours creating it, a teacher spends 30 minutes grading it, and then the student throws it away. Not only do these assignments add no value to the world, they actually suck value out of the world. Talk about an incredible waste of time and brain power (and a potentially huge source of cognitive surplus)! (2013, para. 5)

    By contrast, renewable assignments are those in which the students’ energy and efforts are repurposed by having them generate materials and resources for the “commons,” including future students taking their course and other formal and informal learners around the world. The materials produced might include tutorials, wiki entries, or even videos posted online.

    Incorporating openness into pedagogy is simultaneously liberating and terrifying. It challenges instructors to reflect on their practices and move away from the traditional top-down model of pedagogy by assigning open-ended problems and empowering students to act as co-creators (Rosen & Smale, 2015). Although it takes a degree of courage to untether oneself from the security and predictability of the staid research essay, the benefits to the learning process are sizable. First, students and instructors work collaboratively towards creating resources for public consumption, adding tangible value to the world outside their classroom. Second, students tend to invest more effort and care more deeply about the product when they know that their work has a larger potential audience than just their instructor (Farzan & Kraut, 2013). Third, open pedagogy unleashes students’ creative potential, allowing them to ascend the rungs of the cognitive process dimension in Bloom’s revised taxonomy (Anderson & Krathwohl, 2001). Here they generate, plan, and produce instead of merely recognizing and recalling, in the process acquiring higher-order cognitive and metacognitive skills that will serve them throughout their university education and career. Fourth, depending on the specific nature of the assignment, the resource produced may serve as an enduring electronic portfolio of academic work that can be shared with others, including potential employers. In this fashion students may showcase their writing skills (e.g., blogs, wiki entries), multimedia skills (e.g., videos, websites), or even their ability to integrate and apply research findings (e.g., policy proposals or briefs). And finally, “because any one of these remixes might end up helping next semester’s students finally grasp the concept that has proven so difficult in the past, faculty are willing to invest in feedback and encouragement at a different level” (Wiley, 2013, para. 16).

    Instructors interested in experimenting with open pedagogy might, for example, design course assignments that require students to create a guide for parents on the use of rewards and punishments with young children based on principles from learning theory, design a public service announcement for a local nonprofit organization based on principles from social psychology, build and edit a wiki that might serve as an instructional resource for future students, write questions for an in-class practice quiz ahead of midterm examinations, or publish blog posts that critically analyze depictions of psychological phenomena in popular films. On a larger scale, an excellent example of an organized open pedagogy initiative is the Association for Psychological Science’s (APS) Wikipedia Initiative.

    APS Wikipedia Initiative

    Wikipedia is a free, online encyclopedia, written and edited collaboratively by those who use it. Its English language edition includes about 4.7 million articles and is the sixth most popular website in the world, with nearly 500 million unique visitors every month (“Wikipedia,” n.d.). Its incredible popularity among students, for whom it is often the first resource accessed when looking up background information for a term paper (Head & Eisenberg, 2009; Lim, 2009), is matched by its unpopularity among faculty, who strongly caution against citing its articles or even penalize their students for doing so (Waters, 2007). Some instructors may work with librarians to better instruct their students on how (and why) to access refereed articles from research databases, but this strategy is merely a weak left jab at the problem. The APS Wikipedia Initiative (APSWI), on the other hand, presents a creative and pragmatic right hook.

    Born out of a desire to “deploy the power of Wikipedia to represent scientific psychology as fully and as accurately as possible and thereby to promote the free teaching of psychology worldwide” (“APS Wikipedia Initiative,” n.d.), the APSWI serves to improve the very resource whose use psychology faculty routinely rail against.

    For context, there are currently more than 8,500 articles on Wikipedia devoted to topics in psychology. At the time of this writing, only 63% of these have been assessed through Wikipedia’s peer assessment system. Far more terrifyingly, only 9% of these have achieved “good article” status while the remaining lower quality articles are viewed in excess of 64,000 times every six months (“APS Wikipedia Initiative,” n.d.).

    These sorts of numbers are why, in 2011, then-APS President Mahzarin Banaji called upon psychology faculty to participate in the APSWI as contributors, reviewers, and especially through adopting open pedagogy:

    The likely most effective way to generate contributions, in my opinion, is to include writing for Wikipedia as part of college and graduate-level courses. In this way, professors and students in a class can begin to populate Wikipedia on the topic of the course, taking advantage of the built-in expertise that is contained in that collective, in a semester long time frame. Writing Wikipedia entries from scratch, editing entries, or evaluating them can be a worthwhile learning experience in a standard classroom. Such work can teach students so much — that even the simplest ideas are hard to communicate to general audiences; that logic, strength of argument, flow and clarity of writing, citations of the appropriate literature, and, above all, accuracy need to be mastered in order to be a member of this guild. My request is that for any course that you are about to teach this semester and beyond, that you consider adding contribution to Wikipedia as part of the course’s requirements. (para. 8)

    Many faculty have since responded to Banaji’s call. During the Fall 2011 and Spring 2012 semesters alone, 640 students across 36 classes participated in the APSWI. Collectively, they edited 840 articles – “the rough equivalent of writing a 1,200 page textbook in psychology” (Farzan & Kraut, 2013, p. 5). Participating instructors have ranged from those completely new to Wikipedia (e.g., Hoetger & Bornstein, 2012) to those with extensive experience (e.g., Marentette, 2014), and the classes enrolled have ranged from small seminars (e.g., Karney, 2012) to enormous 1,700-student sections (Joordens, 2012). The APSWI has also been incorporated into courses at all levels, displacing a research paper in an introductory psychology course (Ibrahim, 2012), a literature review in a 200-level cognitive psychology course (Munger, 2012), a research article review in an upper level course on memory (Hoetger & Bornstein, 2012), an essay for a fourth-year course on the history of psychology (Reynolds, 2011), a 15-page paper in a graduate seminar in social psychology (Karney, 2012), and a traditional final paper in a graduate course on clinical neuropsychology (Silton, 2012).

    Naturally, appropriate instruction and support must be provided and the specific assignment (e.g., adding citations, writing or revising articles, being granted “good article” status by the Wikipedia community on the basis of the quality of writing, neutrality, and appropriate sourcing, etc.) must be tailored to the level and ability of the class. For example, introductory psychology students might be best served by working in teams and focusing their efforts on a small number of articles, adding citations, images, and links where necessary, tagging them appropriately when problems are located, and incorporating feedback from their peers and the Wikipedia community. The potential benefits to students from participating in the APSWI include achieving a deeper understanding of the topic (Farzan & Kraut, 2013), learning to evaluate and defend the credibility of their sources (Marentette, 2014), learning to write more concisely and think more critically (Farzan & Kraut, 2013), collaborating with students from other universities and around the world (Karney, 2012), learning to provide as well as receive constructive feedback (Ibrahim, 2012), enhancing digital literacy (Silton, 2012), and learning how to communicate ideas to a general audience (Association for Psychological Science, 2013).

    Although some students begin a little wary of the assignment, they go on to derive excitement, meaning, and even pride from the open nature of their work, as the following instructor testimonials indicate:

    The students also realized they were a valuable asset to Wikipedia. Their thinking and writing skills as well as their access to an extensive academic library were not broadly shared. As knowledge translators, they could also provide a service to the general public by clearly communicating basic concepts about language acquisition. They wondered who their readers might be: parents? teachers? students in developing countries? One thing that the students uniformly loved about this project was the possibility of other people seeing and recognizing their work. (Marentette, 2014, p. 37)

    They felt their work was meaningful because their contributions are shared with the entire world, rather than just their instructor. They liked that their contributions will not end up in a drawer after the semester ends, but will continue to be available to many people as a useful resource. Some students even noted with pride that their contributions might have wider use than some articles published in academic journals. (Ibrahim, 2012, p. 29)

    Of course, participating in the APSWI is not without its challenges, which include developing an appropriate rubric for grading (Silton, 2012), learning the writing style and referencing standards of Wikipedia (Reynolds, 2011), managing the time frame of the assignment (Marentette, 2014), and maintaining flexibility with the assignment guidelines (Hoetger & Bornstein, 2012). Some practical strategies for instructors considering participating in the APSWI include providing a list of topics not yet covered on Wikipedia, gaining experience with posting an article, looking through the sample Wikipedia assignments provided by the APS, making use of the many articles and step-by-step guides for editing Wikipedia articles and participating in the APSWI, and enlisting the help of a campus Wikipedia Ambassador (Hoetger & Bornstein, 2012; Ibrahim, 2012).

    Concluding Thoughts

    Adopting open pedagogy can seem daunting at first but does not have to mean designing an entirely new assignment or working with new media. All that is required is for the students to work towards producing a resource that others will find useful. This could include literature reviews, evidence-based policy recommendations, or practical guides for the application of psychological knowledge (e.g., promoting environmentally responsible behavior, parenting, etc.). However, if an assignment requires students to develop and exercise a new skill, instructors will need to plan to provide instruction and support throughout the process (e.g., it takes some practice to learn how to properly edit Wikipedia articles). Depending on the nature of the assignment, instructors may also have to develop or locate an appropriate grading rubric.

    As mentioned earlier, adopting open pedagogy is simultaneously liberating and terrifying. With traditional (closed) assignments, vague guidelines, a poor design, unclear rubrics, and insufficient support remain hidden, with student evaluations and perhaps a few grey hairs being the only enduring record. With open pedagogy, on the other hand, both successes and failures with the assignment are much more public. But while this opens the instructor to more criticism, it is also an opportunity to share, collaborate, and receive constructive feedback. More importantly, it creates a foundation for our students to begin to invest more deeply, think more critically, work more collaboratively, and communicate more accessibly—exactly the skills needed to be able to “give psychology away.”


    Anderson, L. W., & Krathwohl, D. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

    APS Wikipedia Initiative. (n.d.). Retrieved from the Association for Psychological Science website: http://www.psychologicalscience.org

    Association for Psychological Science [PsychologicalScience]. (2013, May 23). 2013 APS convention video: The benefits of traditional vs. Wikipedia research assignments [Video file]. Retrieved from https://www.youtube.com/watch?v=6YBdQH0eIEQ&t=66

    Banaji, M. (2011). Harnessing the power of Wikipedia for scientific psychology: A call to action. Observer, 24(2). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2011/february-11/harnessing-the-power-of-wikipedia-for-scientific-psychology-a-call-to-action.html

    Epstein, R. (2006). Giving psychology away: A personal journey. Perspectives on Psychological Science, 1(4), 389-400. doi:10.1111/j.1745-6916.2006.00023.x  

    Farzan, R., & Kraut, R. E. (2013). Wikipedia classroom experiment: Bidirectional benefits of students’ engagement in online production communities. CHI'13: Proceedings of the ACM conference on human factors in computing systems (pp. 783-792). New York: ACM Press. doi:10.1145/2470654.2470765

    Goldman, J. G. (2014). Giving psychological science away online. Observer, 27(3), 9-10.

    Head, A. J., & Eisenberg, M. B. (2009, December 1). Lessons learned: How college students seek information in the digital age. Project Information Literacy Progress Report. Retrieved from the Project Information Literacy Website at the University of Washington: http://projectinfolit.org/pdfs/PIL_Fall2009_Year1Report_12_2009.pdf

    Hoetger, L., & Bornstein, B. H. (2012). Enliven students’ assignments with Wikipedia. Observer, 25(4), 44-45.

    Ibrahim, M. (2012). Reflections on Wikipedia in the classroom. Observer, 25(1), 29-30.

    Joordens, S. (2012). Using Wikipedia in a mega classroom: A 1,700 student case study. Wikipedia Symposium.

    Karney, B. (2012). Feedback from the whole world. Observer, 25(3), 45-46.

    Klatzky, R. L. (2009). Giving psychological science away: The role of applications courses. Perspectives on Psychological Science, 4(5), 522-530. doi:10.1111/j.1745-6924.2009.01162.x  

    Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390-398. doi:10.1111/j.1745-6924.2009.01144.x 

    Marentette, P. (2014). Achieving “good article” status in Wikipedia. Observer, 27(3), 25, 37.

    Munger, M. (2012). Improving students’ writing with Wikipedia. Observer, 25(5), 43-45.

    Reynolds, M. (2011). Wikipedia in the classroom. Observer, 24(7). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2011/september-11/wikipedia-in-the-classroom.html

    Rosen, J. R., & Smale, M. A. (2015, January 7). Open digital pedagogy = critical pedagogy. Hybrid Pedagogy. Retrieved from http://www.hybridpedagogy.com/journal/open-digital-pedagogy-critical-pedagogy/

    Silton, R. (2012). More than just a grade. Observer, 25(2). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2012/february-12/more-than-just-a-grade.html

    Tomes, H. (2000). Giving psychology away. Monitor on Psychology, 31(6). Retrieved from http://www.apa.org/monitor/jun00/itpi.aspx

    Waters, N. (2007). Why you can’t cite Wikipedia in my class. Communications of the ACM, 50(9), 15-17. doi:10.1145/1284621.1284635

    Wikipedia. (n.d.). In Wikipedia. Retrieved January 14, 2015, from https://en.wikipedia.org/wiki/Wikipedia

    Wiley, D. (2013). What is open pedagogy? Retrieved from http://opencontent.org/blog/archives/2975

    Zimbardo, P. G. (2004). Does psychology make a significant difference in our lives? American Psychologist, 59(5), 339-351. doi:10.1037/0003-066X.59.5.339 


    Biographical Sketch

    Dr. Rajiv Jhangiani is the Open Studies Teaching Fellow and Psychology Faculty at Kwantlen Polytechnic University in Vancouver, BC, where he conducts research on open education and the scholarship of teaching and learning. A recipient of the Robert E. Knox Master Teacher Award from the University of British Columbia and the Dean of Arts Teaching Excellence award at KPU, Dr. Jhangiani serves as the Senior Open Education Advocacy & Research Fellow with BCcampus, an Associate Editor of Psychology Learning and Teaching, and a faculty workshop facilitator with the Open Textbook Network. Along with the other members of the STP ECP committee, he recently co-edited the e-book A Compendium of Scales for Use in the Scholarship of Teaching and Learning. His forthcoming book is titled Open: The Philosophy and Practices that are Revolutionizing Education and Science (Ubiquity Press).

