
Society for the Teaching of Psychology
Division 2 of the American Psychological Association

GSTA Blog

Welcome to the GSTA blog! 

In an effort to keep the Graduate Student Teaching Association (GSTA) blog current, we regularly welcome submissions from graduate students as well as full-time faculty. We have recently decided to expand and diversify the blog's content: in addition to our traditional teaching tips, we now welcome posts on new research in the Scholarship of Teaching and Learning (SoTL), public interest topics related to teaching and psychology, and occasional book reviews. Blog posts are typically short, about 500-1000 words, not including references. As this is an online medium, in-text hyperlinks, graphics, and even links to videos are strongly encouraged!

If you are interested in submitting a post, please email us at gsta@teachpsych.org. We are especially seeking submissions in the following five topic areas:

  • Highlights of your current SoTL research
  • Issues related to teaching and psychology in the public interest
  • Reviews of recent books related to teaching and psychology
  • Teaching tips and best practices for today's classroom
  • Advice for successfully navigating research and teaching demands of graduate school

We would especially like activities that align with APA 2.0 Guidelines!

This blog is intended to be a forum for graduate students and educators to share ideas and express their opinions about tried-and-true modern teaching practices and other currently relevant topics regarding graduate students’ teaching.

If you have any questions you would like addressed, you can send them to gsta@teachpsych.org and we will post them as a comment on your behalf.

Thanks for checking us out,

The GSTA Blog Editorial Team:

Teresa Ober, Charles Raffaele, Hallie Jordan, and Sarah Frantz


Follow us on Twitter @gradsteachpsych or join our Facebook Group.


  • 16 Jul 2019 1:12 PM | Anonymous member (Administrator)

    Wind Goodfriend, Ph.D., Buena Vista University

    Every year, around 1.5 million students take Introductory Psychology (Intro) classes (Gurung et al., 2016). Given that only about 5% of all college students are psychology majors, the vast majority of students in those Intro classes are not particularly interested in psychology. Instead, they are taking it as a general requirement toward graduation.

    Those of us who teach Intro are lucky enough to know that we benefit from inherently interesting material. Personality, mental health issues, how memory works, close relationships, group decision making—almost every chapter in an Intro class should be fascinating and relatable to college students. That said, those of us who teach Intro also know that the ideal situation of every student sitting on the edge of their seat with excitement is, sometimes, not quite reality.

    I’ve been teaching Intro for twenty years now, and I’ve stumbled upon a few secrets that seem to help my students stay engaged. Today I want to share one of my favorite “tricks” – one that is often mentioned as a huge positive in my student evaluations at the end of the semester. I call it intermission.

    Because I know that most of the students in my class aren’t psych majors, and because I ban all electronic devices (minus 5 points each time I catch you!), I feel a responsibility to be as engaging, entertaining, and exciting as possible for my students. I want them to really love my class, despite the fact that it’s challenging at times. Most importantly, I know that I need to keep their attention throughout. In a world where attention spans seem to shrink a little each year, I’ve created a simple technique that helps students re-engage right when their attention starts to slip.

    Approximately half-way through the class, I suddenly call “intermission.” My students know it’s coming around then, so they start to perk up about 20 minutes in, waiting with anticipation for when it’s going to arrive (which means they start paying attention again). Intermission is structured to be about 60 seconds of something completely irrelevant and, frankly, a little silly. I assure the students that the intermission material is not going to be on the test; it’s honestly a time to just take a quick mental break and bond with the class.

    If you want to try this, I suggest choosing intermission topics that really speak to your own interests and personality, so they seem relatively authentic. I also suggest that you steer away from political issues or anything that might be controversial. Intermission is supposed to be light-hearted fun, during which I often use self-deprecating humor. Here are some examples from my own class:

    • Ask the class who would win in a fight: Gandalf or Dumbledore.
    • Show pictures of cute baby animals, with a funny song in the background.
    • Summarize a Shakespearean play in 60 seconds or less.
    • Ask them to turn to a neighbor and describe what country they’d most like to visit, and why.
    • Give a brief history of the “Smurfs” cartoon and why they are racist.
    • Sing a song together, like “Soft Kitty” (from the Big Bang Theory; you can put the lyrics up on the projector).
    • Ask them to turn to a neighbor and explain whether they’d rather be a vampire or a werewolf.
    • Show pictures of animals dressed up in cute Halloween costumes.
    • Debate with them about what the best superpower would be, and why.
    • Show them embarrassing pictures of you as a child.
    • Show them photos of 1800s-era Presidents and have them choose which is the “hottest.”

    Again – these are all very silly. But that’s kind of the point. You have to be willing to play along with this activity, showing some vulnerability in your own silliness. But it’s a way to show the students your sense of humor, your approachability, and your acknowledgment that Intro can be a lot of material. By breaking up a long lecture into two parts, you get their attention back after the intermission; you don’t lose them for the 10-15 minutes right in the middle of your class. And honestly, my students seem to really love the relaxed nature of the class and the fun, nerdy surprise they get each day. If you want to spice up their attention in a fun way, it’s worth a try! Even if you don’t do it every single day, peppering in intermissions every few days, at random, will give the students something to look forward to, and something that gets their brains back in the game.

    References

    Gurung, R. A. R., Hackathorn, J., Enns, C., Frantz, S., Cacioppo, J. T., Loop, T., & Freeman, J. E. (2016). Strengthening introductory psychology: A new model for teaching the introductory course. American Psychologist, 71(2), 112-124.

    Wind Goodfriend is a full professor of psychology and division chair of social sciences at Buena Vista University in Iowa. She has won the “Faculty of the Year” award there three times so far, and was the recipient of the 2001 Wythe Teacher of the Year award. Her new co-authored textbook Social Psychology won the 2019 Most Promising New Textbook award.


  • 03 Jun 2019 8:30 PM | Anonymous member (Administrator)

    By Janet Peters, Ph.D.

    The first time I taught statistics, I was intimidated by the course. I knew the reputation statistics courses have with psych majors and I didn’t want to teach a class that students thought was boring, impossible, and intimidating (the irony that I was experiencing the exact same feelings was completely lost on me). So, I was prepared for the worst – student anxiety, reluctance, or possible mutiny. The first day of school was approaching and I braced myself for impact. After a few weeks of getting the hang of the course, I realized that teaching statistics is actually WONDERFUL. No one had told me that it could be enjoyable! Or maybe they did tell me, and I just didn’t listen (and now that I think about it, that’s way more likely). Thus, this blog post is about throwing out misperceptions and infusing fun into your stats class.

    I’ve broken this post into three parts for you: (1) the basics (easiest and least investment); (2) next level applications (creating course content that uses real applications); and (3) advanced applications (wonderfully fun course projects that require more planning and commitment). You can think of it as an a la carte menu – take what you like and leave the rest.

    Just remember that there is no panacea for creating the perfect course or getting students to understand content; we can only take small, measurable steps in the right direction. I hope you can find one idea that helps refresh your teaching or inspires you to add a little zest to your class because we all love a little fun!

    Making it fun: The basics

    When I think about the easiest way to infuse fun into my own statistics courses, I think of three basic approaches: pop culture, quirky examples, and food.

    Infusing pop culture can be a lot of fun as you get to engage students with the content using readily available examples from the world around them. For example, I made a Disney-themed worksheet to help students practice the scales of measurement. Practice questions might include “A magic mirror that rates the fairest of them all on a 1-7 Likert scale” or “The length of Rapunzel’s hair.” Both questions include elements from Disney cartoons, but answering the questions is not dependent upon that knowledge. The key here is that your items should be inclusive; anyone should be able to answer the questions, even if they’ve never seen a Disney movie. Yet if they are familiar with the examples, it makes the practice more fun and less mundane.

    If you find it difficult to include pop culture, you can always find other fun, quirky examples that might interest your students. For example, when I teach students about z-scores, we do an example using Bigfoot sightings in Washington State (where we are located). This is because one of my institution’s more unique historical claims-to-fame is that we were home to the world’s foremost bigfootologist, Dr. Grover Krantz. In honor of this legacy and to learn about z-scores, we analyze data compiled by the Bigfoot Field Researchers Organization (seriously, this is real data from a real organization). We use the total number of Bigfoot sightings by county to calculate z-scores, including how many Bigfoot sightings are in the county we live in (below average, by the way!). The students love this activity because it’s unique and the research comes from our home institution.
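The arithmetic behind the county-level z-score activity is simple enough to sketch in a few lines of Python (any software the class uses would do). The county counts below are hypothetical placeholders, not the actual Bigfoot Field Researchers Organization data:

```python
# Hypothetical sighting counts by county (stand-ins for the BFRO data).
sightings = {
    "Pierce": 84, "King": 70, "Skamania": 51,
    "Snohomish": 41, "Lewis": 36, "Whitman": 5,
}

counts = list(sightings.values())
mean = sum(counts) / len(counts)
# Population standard deviation, since these counties are the whole data set.
sd = (sum((x - mean) ** 2 for x in counts) / len(counts)) ** 0.5

# z = (x - mean) / sd for each county.
z_scores = {county: (x - mean) / sd for county, x in sightings.items()}
print(z_scores["Whitman"])  # negative, i.e., below average
```

A nice property to point out in class: the z-scores across all counties always sum to zero, which makes a quick self-check for students working by hand or in software.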

    If you don’t have the luxury of having a notable bigfootologist, I’m sure there are other examples you can draw upon that are unique – any interesting research your university has produced? Any famous people or alumni from your university or area? Any campus-specific issues you could play with (for example, at a previous institution, students were heavily invested in athletic team rivalries, so I often used examples that played off the good-natured rivalry). The key here is to find examples that are special to your context: your school, your city/state, etc.

    Coming up with pop culture and quirky examples can be tough, so you can always rely on the time-tested approach of using food and candy. For example, I’ve used M&Ms to demonstrate sampling with replacement and Skittle flavors to demonstrate ANOVAs. Once, I used pizza to demonstrate the difference between samples (a single slice) compared to the actual population (the whole pizza). Granted, I tend to have small class sizes, but you could get creative (more on this later).
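As a rough sketch of what the M&M demonstration shows, here is a simulated candy bag sampled with replacement in Python (the color mix is invented for illustration; an actual bag works better in class):

```python
import random
from collections import Counter

random.seed(42)  # reproducible classroom demo

# A "bag" of candies; the proportions here are purely illustrative.
bag = (["red"] * 13 + ["blue"] * 24 + ["green"] * 16
       + ["brown"] * 13 + ["orange"] * 20 + ["yellow"] * 14)

# Sampling WITH replacement: each draw comes from the full, unchanged bag,
# so the same candy can be drawn more than once.
draws = [random.choice(bag) for _ in range(100)]
print(Counter(draws).most_common(3))
```

The contrast with sampling without replacement (e.g., `random.sample(bag, 100)`) is easy to demonstrate by showing that the bag's composition shifts after each un-replaced draw.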

    Making it fun: Next level applications

    The abstract nature of statistics can lead students to perceive the course as difficult and detached from their everyday lives and professional futures. They can generally understand how statistics relates to research, but students rarely feel an immediate and personal connection to the course content. As a result, students often underappreciate the personal and professional benefits that statistics offer. Thus, to help students understand the importance and relevance of stats, I try to create content that helps them see that statistics are all around us – from the way politicians use polling data to the way statistics have influenced the death penalty in Florida.

    One way to incorporate applications into your course is to think about creating content that is relevant and personally meaningful. In class, I use examples from wedding, housing, and job websites to show them how different markets exploit statistics to alter consumer perceptions and behavior.

    For example, we use national data to examine student loan debt using a variety of tools (depending on where we are in the semester, we might use z-scores, t-tests, correlations, etc.). You can also use less serious examples. For homework assignments, I’ve been able to find user data on companies such as Netflix, Facebook, and Tinder. The degree to which students use or are familiar with these companies varies, but they provide real data that comes from the type of news articles the average consumer is likely to encounter.

    I also have students make their own connections between their lives and course content. One way I do this is to assign pre-lecture activities (PLAs) based on the readings/content that I want students to complete before class. The students can do WHATEVER they want to show me they have done the reading or watched the lecture, as long as their submission applies the concepts they learned (the submission cannot just regurgitate definitions).

    The PLAs provide the structure for students to dig a little deeper and look at their world through the lens of statistics. It takes a little getting used to for the students, but I’ve had some amazing student work come out of this. For example, students have submitted a script for a Parks & Recreation (TV show) episode where the characters explained t-tests, an essay on how Eminem's newest album demonstrated different scales of measurement, a YouTube video with 3-D animation, cookies baked in the shape of different distributions, and a PowerPoint presentation that used the history of beauty pageants to explain descriptive statistics. I am absolutely stunned by the talent of my students, and I find myself looking forward to grading these assignments.

    It’s important to note that not all students produce creative submissions, but ALL submissions are applied examples of the concepts. In general, students seem to really enjoy the PLAs. Some students like them because it gives them the chance to be creative and some students like them because it helps them prepare for class and/or the exams. Either way, I love PLAs because they inject an element of fun (they are my very favorite thing to grade), while also meeting my desire to have students apply the material and come to class prepared.

    Making it fun: Advanced applications

    After teaching statistics for several semesters and solidifying the foundation of my course, I found myself looking for more. I wanted students to be able to work on a project they enjoyed, that increased their understanding of course content, helped them understand how to apply statistics to realistic problems, and helped them develop professional skills that would be marketable post-graduation. From this, my service-learning project was born. To date, I’ve paired with local, non-profit health clinics, homeless shelters, and youth mentoring programs.

    Each semester, someone from the organization visits our class to introduce themselves and the mission of the organization, we take a tour of the facility, and we spend one day volunteering for the organization. In class, we spend lab time each week analyzing data from the organization and writing up results. At the end of the semester, students present their findings to the organization and work together to write a white paper.

    This project is classified under advanced applications because, although these semester-long projects are some of the most fun, rewarding, and valuable work that I have done as a professor, they are also a lot of labor. Coordinating communication, meetings, data, and volunteering can be time consuming. The data provided are often messy and the findings inconclusive (though a great exercise for students!). If you’re interested in incorporating fun in this way, I encourage you to take it slowly, make sure you have a strong connection with your community partner, and only commit to a few small assignments. If you and your students enjoy the project, you can always add more elements each semester as you gain experience.

    Considerations

    There is no “one-size-fits-all” solution for teaching. You might have different interests, a different student population, different institutional support, different class sizes, or any other of a variety of factors that influence the way we teach. The key here is to adapt as necessary. I’ve presented you with some ways in which I have made my own class more fun, but you should do what makes sense for you and your students. A few things to keep in mind…

    • The work matters most. You can have the most fun class in the world, but none of that matters if you aren’t achieving your learning outcomes. So, before you go wild in trying to find fun examples and activities, set up your foundation first. Identify your learning outcomes and ensure that the content you cover reflects those outcomes. Once you’ve built that framework, you can then look up all sorts of fun stuff.
    • Finding the fun stuff. I spend a lot of time on the internet (too much?), so I’ve built myself a cache of great resources. If you’re having trouble coming up with your own examples, don’t fret. There are lots of great websites, blogs, and YouTube channels out there that already do an awesome job of coming up with fun and helpful activities. To help you get started, I’ve listed a few:

    As you explore these resources, pick examples that you find exciting and engaging. You won’t be able to convey the fun and intrigue to students if you don’t understand the meme or humor yourself.

    The reality is that statistics can be a challenging course to teach (I’m assuming that’s why you’re reading this article and have made it this far). The good news is that it’s also incredibly fun – statistics unites all areas of psychology and is present in everyday life. It’s our job to help foster those connections through well-designed examples, activities, and homework. Using fun, personally meaningful, and professionally relevant coursework is one way we can help students see those connections. 


    Dr. Janet Peters is a Clinical Assistant Professor of Psychology at Washington State University Tri-Cities. She received her Ph.D. in Industrial and Organizational Psychology from Colorado State University. Her current research interests center on effective pedagogical practices, particularly as they relate to the teaching of Introductory Psychology, Statistics, and Research Methods.

  • 30 May 2019 9:44 AM | Anonymous member (Administrator)

    By Matthew Mulvaney, Ph.D., and Rachel Razza, Ph.D., Syracuse University

    Co-teaching (or team teaching) can be an effective approach for faculty to work collaboratively to deliver new courses that reflect their combined expertise. Here we discuss the approach that we took to develop a course from a co-teaching perspective. The two of us (both faculty in a Human Development and Family Science department) wanted to ensure that our graduate students would be trained in structural equation modeling (SEM). Our context for developing this team-taught class was based on our shared belief that graduate students in our program need to learn the basics of SEM, along with our observations of the limited options available for learning SEM on our campus. We had to be creative in constructing such an opportunity, however, as neither of us are trained methodologists and thus we felt unequipped to tackle this course alone. Therefore, in order to cultivate this opportunity for our graduate students, we pushed ourselves to develop this course via a co-teaching approach.

    The idea for this course and our team teaching approach originated from a Graduate Student Research Seminar Series in our department that was initiated by one of us, but ultimately presented as a collaborative effort. The seminar series consisted of four 2-hour sessions where students explored the basics of SEM using key variables that we constructed using data from the Fragile Families and Child Wellbeing Study. The seminar series was more successful than we imagined, as we routinely had 10-12 graduate students per week who were eager to participate and learn with us! We took away three important lessons from this seminar series. First, it was clear that the students would require more advanced training in this analytic technique, especially if they planned to build more complex models in their own research. Second, it was evident that in order to provide them with this knowledge, we also required additional training. And third, we were eager to continue this journey as collaborators, as having someone to help prepare, test, and teach the material proved to be critical in the seminar series. Thus, we identified a summer grant from our university that would allow us to create and teach this course. We spent a significant amount of time consulting with faculty at peer institutions on course design and assisting each other in strengthening our understanding of the material. We met once a week for approximately 8 weeks to select readings, prepare the syllabus, build models to test, and create assignments. In addition, one of us completed additional statistical training in SEM and shared his new knowledge and skills as we prepared this course.

    The course we constructed was taught over two weeks in the summer, with 5-hour days of teaching. We both attended all sessions but alternated primary responsibility on an every-other day basis, split up the grading of the homework, and co-constructed exams. The daily course routine was consistent. Each session included a lecture based on a chapter from an introduction to AMOS textbook. After the new material was presented, students participated in a guided activity where they constructed models in AMOS and ran the analyses that were included in the textbook chapter. In preparation for class, students also read examples of journal articles that reflected the specific approach to modeling that they were learning in the chapter. These articles were chosen by us during the course-planning phase and we used class time to dissect the models and discuss the results of these current studies. The daily sessions wrapped up with time for questions and an introduction to the homework. The homework assignments paralleled the skills that were taught that day but were based on a secondary data set that we constructed with the help of our external consultant. Thus, the students were practicing on one data set during class time and transferring their skills to a different data set for the homework assignments.

    Previous work has identified critical features necessary for the development of team-teaching approaches, including a shared commitment to the co-teaching process and commitment to constructive, reflective discussion throughout the design and delivery of the course (Lock et al., 2016). We would concur that our ability to openly discuss our challenges with the content at all stages of development and delivery, while building off each other’s strengths, was essential. We could both be honest about the challenges we were having and then use the other for more effective support. This dialogue also infused itself into the classroom, where it provided students an opportunity to observe a model of a collaborative professional relationship where different techniques and teaching practices were implemented simultaneously in a supportive learning environment (Chanmugam & Gerlach, 2013). The co-teaching approach seemed to work very well and the feedback was very favorable overall for the class.

    For those who are interested in pursuing co-teaching opportunities, we would put forth the following suggestions. First, identify a need or potential area of development within your department or across departments, and then consider potential collaborators. Obviously, selecting an effective co-teacher is essential. Co-teaching is an intense process, and you need to find someone you feel comfortable working with, who will be committed to the process, and who is as excited as you are about developing the project. You should also survey the supports available for developing the course. Institutional support is critical to facilitating team-taught courses (Morelock et al., 2017). Costs to consider include those associated with course preparation and course delivery.

    While we are not aware of any instances of graduate students and faculty co-teaching together, it may be a potentially fruitful area to explore. The benefits of co-teaching extend to the instructors themselves, as it allows them to develop their knowledge and skills in a particular domain (Carpenter, Crawford, & Walden, 2007; Marshall, 2014). Thus, we think that further exploration of such models may be beneficial for developing graduate student teachers while simultaneously moving innovative curricula forward.

    Resources 

    Expanding your comfort zone handout

    References

    Carpenter, D. M., Crawford, L., & Walden, R. (2007). Testing the efficacy of team teaching. Learning Environments Research, 10, 53-65. doi:10.1007/s10984-007-9019-y

    Chanmugam, A., & Gerlach, B. (2013). A co-teaching model for developing future educators’ teaching effectiveness. International Journal of Teaching and Learning in Higher Education, 25, 110-117.

    Lock, J., Clancy, T., Lisella, R., Rosenau, P., Ferreira, C., & Rainsbury, J. (2016). The lived experiences of instructors co-teaching in higher education. Brock Education Journal, 26(1), 22-35.

    Marshall, A. M. (2014). Embedded professional development for teacher educators: An unintended 'consequence' of university co-teaching. International Journal of University Teaching and Faculty Development, 5, 17-30.

    Morelock, J. R., Lester, M. M., Klopfer, M. D., Jardon, A. M., Mullins, R. D., Nicholas, E. L., & Alfaydi, A. S. (2017). Power, perceptions, and relationships: A model of co-teaching in higher education. College Teaching, 65(4), 182-191. doi:10.1080/87567555.2017.1336610

    Author Bios

    Dr. Matthew Mulvaney is an Associate Professor of Human Development and Family Science at Syracuse University. He teaches courses in parenting, child development, and family theories. He is currently serving as the chair of the SRCD Teaching Committee.

    Dr. Rachel Razza is an Associate Professor of Human Development and Family Science at Syracuse University. She has served as the department’s Graduate Director and as a member of the Teaching Committee for SRCD. Dr. Razza has received several grants focused on curriculum development and pedagogy in higher education and was honored with the Syracuse University Teaching Recognition Award in 2014.

  • 22 May 2019 10:31 AM | Anonymous member (Administrator)

    By Jessica Hartnett, Ph.D., Gannon University

    On the very first day of my Introduction to Statistics class, I show my students the following and tell them that, upon successful completion of my course, they should add it to their resumes:

    Special skills:

    • Novice data analysis using JASP software, including descriptive statistics, t-tests, ANOVA, chi-square, regression, and correlation

    Over the course of the semester, I work with them: we talk about different statistical tests, analyze and interpret countless examples using JASP, and learn basic, regimented APA-style Method, Results, and Discussion section standards so my students can “talk” statistics. I want them to live up to that special skill claim and to feel comfortable doing statistics. I teach like this because I believe that statistics instructors are in a unique position to teach a core proficiency within our discipline that is also a specific, highly marketable skill.

    Notice that I didn’t include performing statistics by hand anywhere in that paragraph. I do very few by-hand calculations in my classes. Why? Because statisticians don’t. And if students need to understand the guts of data analysis, they will go on to graduate school in a quantitative field. And guess what? Most of our BA/BS students ARE NOT doing that. The American Psychological Association Center for Workforce Studies (APA, n.d.) counted 3.5 million people in the US who have bachelor’s degrees in psychology. Do you know how many of them have PhDs in psychology? Just 4%, and 10% have a master’s in psychology. A full 53% stopped with the bachelor's degree, and the remainder pursued post-baccalaureate degrees outside of psychology. So what skills do we need to teach to serve the majority of our students? Basic, novice stats skills, with the assumption that they will learn more in-depth statistics if they pursue graduate study that requires it.

    Teach them statistics so that they can keep up with data and research within their careers and non-quantitative MS/MA programs. Teach them research methodology along with statistics so they can be valued by their employer and trusted to run the occasional correlation or ANOVA, or so that they can help with someone else’s data collection. Teach them enough about statistics that they are not going to fall for click-bait headlines that poorly summarize research.

    Another benefit of not belaboring by-hand calculations is that it leaves you time to do other things. Like mastering analytic software, calculating alternatives to p-values, and teaching your students how to talk statistics. Rather than focusing on by-hand calculations, my students leave our class feeling confident in JASP and able to produce rough APA reports that include effect sizes and confidence intervals. Which brings me to my next point: Picking appropriate software for novice statisticians who probably aren’t going the academic route.

    Most of your students are not going to graduate school and will probably never see SPSS again if you do use SPSS. I would wager that the small proportion of students you are teaching who do go on to graduate school likely won’t see SPSS again, either. And your students aren’t rich and they are on the go, so let’s use free software options they can run on their own machines and, maybe, even tablets and mobile devices. You could use JASP, PSPP, R, Google Sheets, or Jamovi. Both JASP and Google Sheets can be used via a web browser, for added flexibility. For a free-ish option, you can teach students to conduct statistical analyses in MS Excel, especially using the data analysis add-ins. I use JASP. It is intuitive and doesn’t take up a lot of RAM. Plus, if your students are graduate school bound, they can use the free JASP/R hybrid program, Jamovi.

    In terms of my mini-APA style reports, I have them create a Methods, Results, and Discussion section for each test we run. The Methods and Discussion are only one to two sentences long (I said mini), and the Results section teaches them the basics of in-text statistical reporting. My Results sections also include effect sizes, CIs, and p-values. As I tell my students, the point of conducting statistical analysis is to share your efforts with others, many of whom do not understand statistics at all. Mini-APA style reports teach them this skill, and also allow me to assess my students’ understanding of the analysis.
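For readers who want to see the arithmetic behind such a report, here is a sketch in Python of the quantities a mini-Results section would contain. The quiz scores are hypothetical, and the critical t-value is taken from a standard table for df = 14:

```python
from statistics import mean, stdev

# Hypothetical quiz scores for two independent groups of 8 students each.
group_a = [78, 85, 90, 71, 88, 84, 80, 92]
group_b = [70, 75, 82, 68, 74, 79, 73, 77]

diff = mean(group_a) - mean(group_b)  # mean difference
# Pooled standard deviation (equal n, so a simple average of variances).
sp = ((stdev(group_a) ** 2 + stdev(group_b) ** 2) / 2) ** 0.5
d = diff / sp  # Cohen's d, the effect size

se = sp * (2 / len(group_a)) ** 0.5  # standard error of the difference
t_crit = 2.145                       # two-tailed t-table value, df = 14, alpha = .05
ci = (diff - t_crit * se, diff + t_crit * se)  # 95% CI for the difference

print(f"M_diff = {diff:.1f}, d = {d:.2f}, 95% CI [{ci[0]:.1f}, {ci[1]:.1f}]")
```

Software like JASP produces these same numbers with a few clicks; the point of seeing them spelled out is that a Results sentence reports an estimate with its uncertainty, not just a p-value.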

    *If you have any questions about teaching stats or need any help, feel free to email me: hartnett004@gannon.edu. And then re-email me a week later because I can be horrible at email. Or just DM/follow me on Twitter, @notawful.

    **Important caveats relevant to my argument: You may be teaching stats as part of a sequence. Consider what role you have in changing that sequence, and how you can work with other instructors to maintain consistency for your students. In my department, we just switched to teaching our Intro Psych class with JASP, and we’re teaching our Stats Lab, Multivariate, and Psychometrics classes with R/Jamovi. This was a big shift, but everyone was on board with the change, and we no longer have to charge our students a lab fee for our statistics class, which I think is an improvement. We also have small class sizes, 30 or fewer, and significantly smaller advanced classes, which are required for our Bachelor of Science track. Additionally, I pick my own textbook, I don’t teach as part of a stats/research sequence, and I teach mostly non-psychology majors.

    Reference

    American Psychological Association. (n.d.). CWS data tool: Degree pathways in psychology. Retrieved May 8, 2019, from https://www.apa.org/workforce/data-tools/degrees-pathways 

    Jessica L. Hartnett is an associate professor of psychology at Gannon University in Erie, PA. She enjoys studying novel methods for teaching statistics and research methods, best practices in obtaining informed consent, and positive psychology. In her spare time, she reflects upon how lucky she is to have a philosopher husband who understands the demands of an academic career and two beautiful sons who don’t care about the demands of her academic career in the least.


  • 07 May 2019 6:00 PM | Anonymous member (Administrator)

    By Maria S. Wong, Ph.D., Stevenson University

    Growing up in Hong Kong, I immigrated to Canada with my family at the age of 17. I vividly remember my first day of class as a senior in a public high school. It was a big surprise for me to find that no student really seemed to care when the instructor stepped into the classroom. It was not until the instructor started speaking that the students slowly quieted down and got ready for class. Like many international students, I was used to standing up and greeting the instructor in unison with the rest of the class. Another surprise came when it was time for class discussion. My education in Hong Kong had taught me that instructors often have the right answer in mind when they pose a question. However, my classmates were used to entertaining different points of view. Most of them also felt comfortable speaking their mind and were ready to defend their views.

    Instructors with international backgrounds have probably experienced similar culture shocks. Research on individualism and collectivism (Hofstede, 1980; Triandis, 2001) could explain some of these cultural differences, with people from individualistic cultures (e.g., European Americans) tending to value individual uniqueness, and people from collectivistic cultures (e.g., East Asians) tending to value social hierarchy and group harmony (Oyserman, Coon, & Kemmelmeier, 2002; Triandis, 1995). Indeed, it took me a few years to process and realize how my cultural background has influenced my own learning and teaching. With this blog post, I hope to share “three don’ts and do’s” with fellow instructors who are also thinking about similar issues related to their teaching.

    Three Don’ts

    1. Don’t assume disrespect right away

    Instructors who come from cultures that emphasize hierarchy and authority may have a harder time interpreting the casual demeanor of college students in the U.S. From emails that read like text messages (LOL) to in-person exchanges, instructors may automatically assume that students are being disrespectful. While it is possible that students are behaving disrespectfully, their behavior may instead reflect cohort or cultural differences. In my experience, students sometimes engage in unprofessional or immature behaviors without any bad intentions, and these can be turned into teachable moments that ultimately benefit the students.

    2. Don’t get bogged down by ESL (English as a Second Language)

    For a long time, I was very self-conscious and insecure about my spoken English. Over time, I have come to realize that my accent has no implications for how well I teach. I am now more focused on whether I am communicating information to my students clearly, and how else I could support my students’ learning, than on the proficiency of my spoken English. Interestingly, for the past few years, I have been teaching a course on Writing in Psychology for our majors. Students seemed relieved when I shared with them that English is my second language and that I also struggled with writing in college. They explained to me that finally there was an instructor who could understand their struggles. It was a beautiful moment when I felt connected with my students by sharing my vulnerabilities.

    3. Don’t get fixated on negative feedback

    I have to confess that I still have a hard time with this: I tend to focus on a single negative comment and ignore the rest of the positive comments in my student course evaluations. While we are hardwired to pay attention to threats (e.g., Shoemaker, 1996), I think it is important to put everything in context. For me, it means that I can never process student feedback the first time I read it. I have to remind myself not to put too much weight on a single negative comment, unless it is something that has been raised by multiple students.

    On a related but separate note, you may receive negative comments from students because of your race, ethnicity, gender, age, etc. Do not feel that you have to process those comments alone; share them with a trusted colleague. Nasty comments reflect nothing about you, only about the person who made them.

    Three Do’s

    1. Do use your multicultural experience as an asset

    One class that I consistently teach is Human Growth and Development. I have found that my multicultural experiences enrich the stories that I share with my students. For example, I usually begin the first class by sharing my own story: how I grew up as an only child living in a 600-square-foot apartment with my parents, maternal grandparents, and my aunt in Hong Kong. I am convinced that personal stories are a powerful tool to build rapport with students and encourage them to think about the role of culture in their own development.

    2. Do seek a trusted person as a teaching mentor

    I am a strong advocate for junior instructors to seek out teaching mentors. I have had the privilege of working closely with a mentor for the past four years. My mentor has offered me great help in processing my thoughts and emotions related to challenging teaching moments. There were also times when I was not sure whether I was overreacting because of my own biases, so it was good to have a trusted person to help me understand and interpret the situation from a different perspective. I am now serving as a mentor for a junior faculty member in Brazil and I hope to take on a supportive mentoring role through our regular Skype meetings.

    3. Do have a growth mindset toward teaching

    Stemming from Confucianism, my education experience in Hong Kong emphasized the mastery of material, which was often achieved through deliberate practice and memorization. In contrast, stemming from Greek philosophy, my education experience in North America was Socratic, which emphasized learning through the process of self-discovery (Tweed & Lehman, 2002). For my teaching to be effective in the North American context, I have learned the importance of designing and incorporating engaging class activities that help students come to knowledge via critical thinking, which is something that I have not experienced much in Hong Kong. For me, having a growth mindset—believing that my teaching ability can be developed through practice—really motivates me to reflect on my experiences and helps me strive to become a better instructor every day.

    Taken together, my multicultural experiences have become a core part of my identity. While there are certainly difficult moments when I navigate between different cultures, culture is ultimately what makes my classes come to life. Indeed, there are countless moments when my teaching is enriched as I incorporate elements of culture and diversity. While all instructors can certainly bring these elements into their teaching, I do think that those of us with multicultural backgrounds tend to be more attuned to the nuances of cultural differences, which can become a great asset for our teaching. We have so much to offer!


    References

    Hofstede, G. (1980). Culture’s consequences: International differences in work-related values. Newbury Park, CA: Sage Publications, Inc.

    Oyserman, D., Coon, H. M., & Kemmelmeier, M. (2002). Rethinking individualism and collectivism: Evaluation of theoretical assumptions and meta-analyses. Psychological Bulletin, 128, 3–72. https://doi.org/10.1037/0033-2909.128.1.3

    Shoemaker, P. J. (1996). Hardwired for news: Using biological and cultural evolution to explain the surveillance function. Journal of Communication, 46, 32-47.

    Triandis, H. C. (1995). Individualism & collectivism. Boulder, CO: Westview Press.

    Triandis, H. C. (2001). Individualism-collectivism and personality. Journal of Personality, 69, 907–924. https://doi.org/10.1111/1467-6494.696169

    Tweed, R. G., & Lehman, D. R. (2002). Learning considered within a cultural context: Confucian and Socratic approaches. American Psychologist, 57, 89-99.


    Maria S. Wong is an Associate Professor of Psychology at Stevenson University. She teaches courses such as Writing for Psychology, Human Growth and Development, Introduction to Psychology, Statistics for Social and Behavioral Sciences, and Parenting. To her students, Dr. Wong is known for her energy, enthusiasm, supportive guidance, and the use of creative learning activities. As a developmental psychologist (Ph.D. in 2011 from the University of Illinois at Urbana-Champaign), Dr. Wong has an active research program focusing on children’s social-emotional development within the family context. Her work has been published in journals such as Child Development and the Journal of Family Psychology. Dr. Wong is a member of the Society for the Teaching of Psychology (STP) and serves as an Associate Editor for the STP E-book series. She also serves on the teaching committee of the Society for Research in Child Development (SRCD) and was a Co-Chair of the SRCD 2019 Teaching Institute.

  • 01 May 2019 10:00 AM | Anonymous member (Administrator)
    By Jennifer A. McCabe, Ph.D., Goucher College

    Last year, as part of my portfolio for promotion to Full Professor, I wrote a teaching philosophy statement. As such statements are required for nearly every teaching position in academia, this was not my first draft. In fact, I had written a solid statement for my tenure case six years earlier. At first I asked myself: did anything really change in that time? I soon realized that, in a way I could not articulate at earlier points in my career, I could now identify six core principles that guide every teaching-related decision I make. I hope that by sharing these principles I can encourage others to identify and develop their own set of guiding principles as higher education practitioners.

     

    1. Strategies for Durable Learning

    My scholarship focuses on learning strategies that benefit long-term memory, and I have become more intentional about my responsibility to integrate these evidence-supported memory principles into the structure and delivery of my courses. Early in my career I was worried that some of these choices would be unpopular with students. It took more time and confidence in the classroom to commit fully.

    Now I do so transparently and unapologetically. This includes the use of frequent, effortful, low-stakes, cumulative, spaced (distributed) retrieval practice (a.k.a. quizzes), followed by discussion to encourage elaboration and connections, in all of my courses. It’s amazing how readily students get on board with these strategies even though they require more time and effort than traditional class practices.


    2. Interest-First Approach

    Memory researchers also know about the self-reference effect, that we remember information more easily if it relates to ourselves. I enact this principle by adjusting the entry point to many topics and assignments, allowing student interest to be the guiding force. If they begin work on a complex topic using a question or a problem sparked by natural curiosity, I trust that deep and durable learning of the content will follow. I give students much more agency in their assignments and approach to learning than I did in my early years of teaching.

    I also prioritize a narrative approach to course material - learning psychological science through stories. I assign popular press books and articles whenever possible to promote real-world applications of concepts, ensuring a connection between my students’ coursework and their lives. To further engage student interest, I make sure to include (and prioritize) interactive elements such as demonstrations and discussions during each class period.


    3. Integration and Connection

    I intentionally structure my classes to encourage students to make meaningful connections, drawing on the principle of elaboration. Every day in class, I emphasize how current material links with past topics, and prompt them to continue this work in assignments outside of class.

    For example, I ask students to connect various course topics at the end of the semester as part of a final exam take-home essay. They may identify connections between various aspects of the class with course themes, draw on course material in writing a narrative about each stage of cognition needed to play a game of their choosing, or write a story about how each component of memory would contribute to a day in a person’s life. I have also enjoyed the challenge of embracing college-wide Theme Semesters in my courses, including topics of Mindfulness and Storytelling.


    4. Authentic and Shared Learning

    I strive to increase the types of activities and assignments that are situated in authentic real-world issues, and that are shared beyond a submission to the instructor. I have found that students are far more engaged—and submit higher-quality products—under these circumstances.

    For example, sharing can happen among classmates in the form of each student reading a different article on a certain topic and then coming to class ready to teach (and learn from) peers in small group discussions. Or they may present mini-TED Talks, consisting of an engaging 5-minute oral presentation on a course-related topic of their choice. There is also great value in having students share their work in other settings, such as presenting at the college’s student symposium or creating resources on course topics for the public. Each semester my students in a seminar on Cognition, Teaching, and Learning complete a “Translational Project for a General Audience.” Formats include podcasts, infographics, videos, games, and one time even a children’s book. Several students who wrote posts in the style of The Learning Scientists blog were subsequently published on this site. Talk about sharing the authentic work of learning!


    5. Metacognitive Self-Reflection

    My research program also focuses on metacognition, specifically the extent to which students know about and use effective strategies. This scholarly interest permeates all my courses with the goal that students learn about, and reflect on their own, learning. The main idea is to embrace desirable difficulties: learning strategies that are initially slower and harder, but produce more durable memories. Students experience in-class demonstrations showing the memory benefits of these strategies (e.g., spacing, elaboration, testing), followed by activities and assignments to encourage an examination of their own learning beliefs and misconceptions. Then they brainstorm the best ways to communicate this information to peers, and plan for how they will utilize these strategies.

    Metacognitive development is also a natural side effect of frequent, low-stakes, cumulative quizzing. Testing is not only an effective learning strategy; it also provides metacognitive feedback about the state of one’s own knowledge. I encourage students to use testing for both purposes, knowing that the best way to avoid the fluency illusion (believing you have learned something because it seems familiar or easy) is to take a test that requires effortful retrieval from memory. Further, in classes with major tests, I administer a post-exam metacognitive debrief activity that helps students determine whether they were overconfident, reflect on how they studied, identify the reason(s) they answered items incorrectly, and develop a preparation plan for the next exam.


    6. Transparent Course Design with Intentional Scaffolding

    I have improved my communication with students regarding the goals and objectives in my courses, and the purpose behind how I structure the class and assess student learning. In other words, I think about course design in a more integrated way, connecting learning objectives to teaching strategies and assessments. I work to enhance the clarity and detail of my syllabi and assignment instructions, making them as purpose-driven and explicit as possible. To close the loop on this process, one of my favorite activities is to ask students on the last day of class to reflect on the course objectives, evaluate their progress, and identify components of the course that helped them improve.

    With regard to scaffolding, I remind myself that each of us is in a developing state of expertise (to borrow growth-mindset language). Some students in my class will need a lot of support and opportunities to get to the level of expertise I expect, and others will need less. The best remedy for this, in my opinion, is to offer early and frequent opportunities for formative feedback. Complex assignments can be broken down into scaffolded components, with feedback at every step. This approach leads to both higher-quality end products and a more positive learning experience for students. It is also a step toward a more inclusive classroom that allows for students of diverse backgrounds and abilities to grow.


    A central theme that unites all of the above is something I express to my students early and often: I care about your learning. Learning, by definition, is about change. And I am committed to nurturing that growth in my students, as well as in myself as an ever-evolving practitioner. In this way, I can maintain high standards in my courses while helping students feel informed and supported in their efforts to achieve success. In turn, they can (and should) have high standards for me, including expectations of preparation, availability, clear and consistent communication, prompt feedback, authentic and enthusiastic engagement, and—maybe most importantly—ongoing efforts to improve. After all, teaching is about change too.



    Jennifer A. McCabe is a Professor in the Center for Psychology at Goucher College in Baltimore, Maryland, where she has taught since 2008. She just completed her 15th year of full-time teaching, having also taught at Marietta College in Ohio. She earned her B.A. in Psychology from Western Maryland College (now McDaniel College), and her M.A. and Ph.D. in Cognitive Psychology from the University of North Carolina at Chapel Hill. She has taught courses including Introduction to Psychology, Cognitive Psychology, Human Learning and Memory, Statistics, Research Methods, and Seminar in Cognition, Teaching, and Learning. She has won teaching excellence awards from Marietta College and Goucher College. Her research interests include memory strategies, metacognition, and the scholarship of teaching and learning. She has been published in Memory and Cognition, Teaching of Psychology, Scholarship of Teaching and Learning in Psychology, Instructional Science, Frontiers in Psychology, Journal of Applied Research in Memory and Cognition, and Psychological Science in the Public Interest. Supported by Instructional Resource Awards from the Society for the Teaching of Psychology (STP), she has also published two online resources for psychology educators on the topics of mnemonics and memory-strategy demonstrations. She has served as a Consulting Editor for Teaching of Psychology, and is currently a Consultant-Collaborator for the Improve with Metacognition project.

  • 30 Mar 2019 12:00 PM | Anonymous member (Administrator)

    By Teresa Ober, Kalina Gjicali, Eduardo Vianna, and Patricia Brooks

    To be informed and responsible citizens, students should be able to make sense of data—and in this day and age, we live in a world with an abundance of it!  As such, developing students’ quantitative literacy (QL) has become one of the overarching goals of undergraduate education (Sons, 1994). QL is considered “an aggregate of skills, knowledge, beliefs, dispositions, habits of mind, communication, capabilities, and problem solving skills that people need in order to engage effectively in quantitative situations arising in life and work” (as cited in Steen, 2001, p. 7). QL often involves applying mathematical thinking skills to real-world data with the purpose of drawing informed conclusions about issues of personal and/or societal concern (Elrod, 2014). Students who possess strong QL skills need not have strong computational backgrounds, but should be able to identify and interpret quantitative relations (e.g., in visual graphs), organize quantitative information (e.g., in spreadsheets) and communicate effectively about the relevance of quantitative data in everyday life (Blair & Getz, 2011). It has been argued that QL is most effectively taught when embedded across the curriculum given a critical component of its application involves identifying quantitative relations in varied real-world contexts (Hughes-Hallett, 2001). Embedding QL in college classes across disciplines helps students develop skills they will need to engage effectively in work and life beyond college. In this regard, instruction around QL that uses psychology content can support development and application of practical quantitative reasoning skills outside of the classroom. Hence, psychology departments are increasingly recognizing the need to teach QL as a core cross-curricular requirement (Lutsky, 2008). This means using data and mathematical thinking in all psychology courses, and not just in statistics and research methods courses.

    In this post, we offer some perspectives on how to promote QL across the psychology curriculum. Some of the tools and ideas for integrating QL activities into psychology courses were presented during a recent GSTA-sponsored workshop held at the Graduate Center of the City University of New York on March 6, 2019. During the workshop, Kalina Gjicali, PhD candidate in Educational Psychology, presented best practices for using visual graphs to help students develop quantitative concepts and skills in interpreting data. Dr. Eduardo Vianna of LaGuardia Community College shared resources developed through the Numeracy Infusion Course for Higher Education (NICHE) / Numeracy Infusion for College Educators (NICE), a consortium of educators who share a common mission of promoting quantitative reasoning across various college-level courses. He also shared information about a new CUNY-wide project to improve college students’ QL skills and his own experiences in teaching QL in an introductory (general) psychology course. Teresa Ober, PhD candidate in Educational Psychology, described sources of secondary data and free open-source statistical programs that can be used to develop students’ data analysis skills.

    Engaged Pedagogy and Quantitative Literacy

    Educational research suggests that QL is best taught through student-centered, progressive pedagogies that promote active and inquiry-based learning (Lowney, 2008). In particular, studies have demonstrated the following strategies to be especially effective for teaching QL (Carver et al., 2016): (a) active learning, in which students are engaged learners rather than passive recipients of information; (b) inquiry-based learning, which emphasizes conceptual thinking rather than rote skills and memorization of facts as well as the use of problems and examples that are relevant to real-life situations; and (c) the use of technology to analyze actual data in real-life situations. According to constructivist learning perspectives (Cobb, 1994; Fosnot, 1996; Keeling, 2004), students learn most effectively when they explore new concepts and ideas while working out solutions to meaningful problems and considering the implications of research findings. In other words, students should figure out for themselves how new information, concepts, and ideas relate to their existing systems of knowledge and beliefs, and have opportunities to revise and expand their views in response to new knowledge.

    Strategies for Teaching Quantitative Literacy

    Interpreting Graphs

    One way to build QL skills is to focus on data and visual representations of data. Introducing effective graphic displays of data into college lectures can take the focus away from text-heavy slides that summarize information as if it were established fact (as opposed to research findings that may be in need of replication) and instead towards student-centered learning and knowledge construction. By being presented with appropriate visual representations of social science data, students can be expected to (Beaudrie et al., 2013):

    • Articulate their ideas

    • Express themselves with precision  

    • Ground their observations in evidence  

    • Test claims and hypotheses  

    • Participate in civil discourse

    • Represent what they are ill-equipped to see

    • Recognize and weigh uncertainty

    • Construct a context to attract interest and to inform critical thinking

    You can build QL with your college students using the free online feature “What’s Going On in This Graph?” created by The New York Times Learning Network in partnership with the American Statistical Association. Updated weekly, this resource features graphs of different types on varied topics, from labor and automation to teen smoking habits (see figure below), that can be used to ask students the following questions:

    1. What do you notice?

    2. What do you wonder?

    3. What’s going on in this graph?

    4. What are the implications for ________ (e.g., understanding health risks of teenagers)?


    All releases are archived, so instructors can use previous graphs anytime. Visit this introductory post and this article about how teachers use this powerful activity.

    Analyzing Data in Class

    Hands-on opportunities to work with actual data can open many doors for students, especially for those who have had limited experiences in data analysis and have anxiety about it. The concept of using secondary data to teach students about psychological science is not a novel one (see Sobel, 1981), but has received reinvigorated interest due to the vast amount of open access data currently available. In thinking through an in-class data demo, it might be useful for instructors to consider these questions:

    1. How is this data source meaningful to students within the course?

    2. What tools are available to students to help them analyze the data?

    3. What strategies/resources can be made available to students to help them interpret the data?

    4. How can we apply the findings from the data to everyday life?

    The Data

    Selecting a dataset for a data demo project is a crucial first step, and will depend on course content as well as the skills and reasoning abilities that you want your students to develop. Resources abound, with data sources including Kaggle, UNData, OECD, IES, a number of excellent OSF repositories (e.g., EAMMi2), and data provided by local and regional government agencies. In choosing the dataset, consider what topics might be of interest to students, what problems/questions they can use the data to address, and whether there is sufficient documentation to support student learning. For example, for a course covering language development, the CHILDES database (part of TalkBank) is an invaluable resource. This database contains transcripts of parent-child conversations in a variety of languages, often with accompanying audio or video, and includes datasets for children growing up in multilingual environments as well as datasets with various clinical populations (e.g., developmental language disorder, autism spectrum disorder, hearing loss). This resource includes CLAN software for analyzing conversational interactions and manuals to help you get started.

    Another option for integrating data collection into instruction is to ask students to complete a brief survey during class time. GoogleForms is a very convenient way to collect and present such data quickly, but other survey programs can work just as well. Asking students to complete a short-form survey may be an effective way to introduce them to a dataset by helping them become familiar with the actual scales used in the original study.

    The Tools

    Considering which statistical software programs are available and accessible to students is critical. While many undergraduate psychology courses use proprietary programs like SPSS or Stata to teach statistics, it is often questionable whether students have a license or off-campus access to such programs. Thus, in the long term it may be more advantageous to point students toward free and open-source programs, such as JASP or R. Sometimes sophisticated statistical programs may not even be necessary for teaching QL; in many cases, a spreadsheet application such as Google Sheets or Excel is sufficient for teaching basic statistics (DiMaria-Ghalili & Ostrow, 2009). Many students have access to Excel on their personal computers, but benefit from instruction on how to use it to make pivot tables or charts.
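    The spreadsheet-level summary described above can equally be produced with a few lines of a general-purpose language. As a sketch only, with invented survey responses standing in for real class data, the same mean/median/SD/frequency summary a student might build in Google Sheets or Excel looks like this in Python:

    ```python
    from statistics import mean, median, stdev
    from collections import Counter

    # Hypothetical survey responses: hours of sleep reported by ten students
    hours = [6, 7, 5, 8, 7, 6, 9, 7, 6, 5]

    # Descriptive statistics, as a spreadsheet's AVERAGE/MEDIAN/STDEV would give
    summary = {
        "n": len(hours),
        "mean": round(mean(hours), 2),
        "median": median(hours),
        "sd": round(stdev(hours), 2),
    }

    # Frequency table, as a pivot table or COUNTIF column would give
    frequencies = Counter(hours)

    print(summary)
    print(sorted(frequencies.items()))
    ```

    Whether this lives in a spreadsheet formula or a script matters less than the QL skill it exercises: choosing which summary answers the question at hand.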

    The Interpretation

    In preparing a lesson around the use of secondary data, it is important to consider students’ prior knowledge, skills, and interests to ensure that the instruction is developmentally appropriate. You might start by distinguishing research questions that relate to frequencies (How often?), associations (Are X and Y related?), or causal relationships (Does X cause Y?) as this can lead to a fruitful discussion of how to fit one’s analytic approach to the research question at hand. Students may need instruction to decide what sorts of graphs are appropriate for different types of data (e.g., line graphs, bar graphs, scatterplots). This can lead to further discussion of how to present findings in APA format, determine statistical significance, and interpret p-values.  Along the way, you might consider outliers, skewed distributions, and various threats to the validity of the research, such as the representativeness of the sample. As you guide your class in interpreting research findings, allow for spontaneity by offering students opportunities to test their own hypotheses and develop ideas for future research.
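    The distinction between question types above can be made concrete for students with a minimal sketch. The paired data here (study hours and quiz scores) are invented, and Pearson's r is computed by hand so the example stays dependency-free; in class, students would typically get these numbers from JASP, R, or a spreadsheet.

    ```python
    from collections import Counter

    # Hypothetical paired observations: daily study hours and quiz scores
    study = [1, 2, 2, 3, 4, 5, 5, 6]
    score = [55, 60, 58, 65, 70, 72, 75, 80]

    # "How often?" is a frequency question, answered with counts
    print(Counter(study))

    # "Are X and Y related?" is an association question,
    # answered with Pearson's correlation coefficient
    n = len(study)
    mx, my = sum(study) / n, sum(score) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(study, score))
    sx = sum((x - mx) ** 2 for x in study) ** 0.5
    sy = sum((y - my) ** 2 for y in score) ** 0.5
    r = cov / (sx * sy)
    print(f"r = {r:.2f}")
    ```

    A causal question (Does X cause Y?) cannot be answered by either computation; it requires a design with manipulation and control, which is exactly the discussion point the contrast is meant to provoke.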

    The Significance, … or rather, Relevance

    Finally, it might be helpful to consider what possible applications can come out of the findings presented in class. Even if the findings might seem intuitive, walking students through the process of analyzing and interpreting the findings should ultimately lead them to feel empowered in working with data. When findings are non-significant and hypotheses are not supported, students have opportunities to learn that this sort of “productive failure” is part of the research process. For this reason, using secondary data as opposed to artificially generated data can lead to a more practical learning experience, particularly when resources for conducting secondary data analysis are plentiful.

    Conclusions

    Strengthening QL has been recognized as an imperative of undergraduate education, with students best served when instructors use an “across-the-curriculum” approach to ensure they have sufficient opportunities to develop QL skills. Like any well-implemented curriculum, teaching QL necessitates planning. When datasets are analyzed or in-class demonstrations are conducted, instructors should take extra precautions to ensure that the lessons achieve their objectives: lack of clarity during a demonstration, improper analyses, or technical problems can greatly interfere with learning opportunities. Even so, we hope the resources described here offer at least some initial inspiration for incorporating QL instruction into all of your courses.

    Resources

    Data

    Teaching Tools

    References

    Beaudrie, B., Ernst, D., & Boschmans, B. (2013). First semester experiences in implementing a mathematics emporium model. In R. McBride & M. Searson (Eds.), Proceedings of SITE 2013-Society for Information Technology & Teacher Education International Conference (pp. 223-228). New Orleans, LA: Association for the Advancement of Computing in Education (AACE). Retrieved March 21, 2019 from https://www.learntechlib.org/primary/p/48098/.

    Carver, R., Everson, M., Gabrosek, J., Horton, N., Lock, R., Mocko, M., … Wood, B. (2016). Guidelines for assessment and instruction in statistics education: College report. Alexandria, VA: American Statistical Association.

    Cobb, P. (1994). Where is the mind? Constructivist and sociocultural perspectives on mathematical development. Educational Researcher, 23(7), 13-20.

    DiMaria-Ghalili, R. A., & Ostrow, C. L. (2009). Using Microsoft Excel® to teach statistics in a graduate advanced practice nursing program. Journal of Nursing Education, 48(2), 106-110.

    Elrod, S. (2014). Quantitative reasoning: The next "across the curriculum" movement. Association of American Colleges & Universities Peer Review, 16(3), 4. Retrieved online: https://www.aacu.org/peerreview/2014/summer/elrod.

    Fosnot, C. T (Ed). (1996). Constructivism: Theory, perspectives, and practice. New York, NY: Teachers College Press.

    Hughes-Hallett, D. (2001). Achieving numeracy: The challenge of implementation. Mathematics and democracy: The case for quantitative literacy, 93-98.

    Keeling, R. (Ed.). (2004). Learning reconsidered: A campus-wide focus on the student experience. Washington, DC: American College Personnel Association and National Association of School Personnel Administrators.

    Lowney, K. S. (Ed.). (2008). Teaching social problems from a constructivist perspective. New York, NY: W. W. Norton.

    Lutsky, N. (2008). Arguing with numbers: Teaching quantitative reasoning through argument and writing. Calculation vs. context: Quantitative literacy and its implications for teacher education, 59-74.

    Sons, L. R. (1994). Quantitative reasoning for college graduates: A complement to the standards. Mathematical Association of America. Retrieved online: https://www.maa.org/programs/faculty-and-departments/curriculum-department-guidelines-recommendations/quantitative-literacy/quantitative-reasoning-college-graduates.

    Steen, L. A. (Ed.). (2001). Mathematics and democracy: The case for quantitative literacy. Report prepared by the National Council on Education and the Disciplines. Retrieved online: https://www.maa.org/sites/default/files/pdf/QL/MathAndDemocracy.pdf

    Author Bios

    Patricia J. Brooks is Professor of Psychology at the College of Staten Island and the Graduate Center, CUNY and GSTA Faculty Advisor.  Brooks was recipient of the 2016 President’s Dolphin Award for Outstanding Teaching at the College of Staten Island, CUNY.  Her research interests are in two broad areas: (1) individual differences in language learning, (2) development of effective pedagogy to support diverse learners.​

    Kalina Gjicali is a doctoral candidate in Educational Psychology at The Graduate Center, CUNY and a Quantitative Reasoning Fellow for the University at the Quantitative Research & Consulting Center (QRCC).

    Teresa Ober is a doctoral candidate in Educational Psychology at the Graduate Center, CUNY. Teresa is interested in the role of executive functions in language and literacy. Her research has focused on the development of cognition and language skills, as well as how technologies, including digital games, can be used to improve learning.

    Eduardo Vianna, Professor of Psychology, has taught at LaGuardia since 2005. He received his Ph.D. in developmental psychology from the Graduate Center, CUNY after completing his medical studies in Brazil. Building on recent advances in Vygotskian theory, especially the Transformative Activist Stance approach, his work focuses on research with transformative agendas. His recent work includes applying critical-theoretical pedagogy to build the peer activist learning community (PALC), which was featured in the New York Times. In 2010 he received the Early Career Award in Cultural-Historical Research from the American Educational Research Association, and he is currently chief editor of Outlines: Critical Practice Studies and Co-PI on the NSF grant "Building Capacity: A Faculty Development Program to Increase Students' Quantitative Reasoning Skills."
  • 13 Mar 2019 1:03 PM | Anonymous member (Administrator)

    By David Kreiner, Ph.D., University of Central Missouri

    On a cosmological level, time may be infinite, but we constantly run out of it in our daily lives. I have seen students and faculty struggle with it. I have certainly experienced it myself:

    • When my class time is up but I haven’t finished everything I wanted to.
    • When I’m planning a 16-week class and there’s just not enough time.
    • When I thought I could finish a draft of a paper in one afternoon but I didn’t even get close.

    Similarly, your students may:

    • have trouble meeting course deadlines;
    • fail to use effective study methods because they don’t have enough time;
    • be unrealistic about allocating time for the different components of a major project;
    • or find that they are about to graduate before they have had a chance to accomplish all their goals.

    I propose that we look to the rich literature on the psychology of time in the same way that we have looked to the science of learning for more effective studying and teaching methods. I will describe one example to illustrate what I mean, but there is much more out there. If only we had time to explore it all!

    Kahneman and Tversky (1979) defined the planning fallacy as a tendency to underestimate how much time we need to complete larger tasks and overestimate the time we need for smaller tasks. We tend to be confident in these estimates – confident, but wrong (Buehler, Griffin, & Peetz, 2010). Think about how this affects your plans for a large project like your thesis or dissertation. Also think about how your students might struggle with finishing a project on time, or why they might run out of time and submit work that is less than their best.

    Fortunately, there is research on how to estimate more accurately how much time it will take to do something. One strategy is to avoid anchoring effects, in this case anchoring on the present when making a time estimate. LeBoeuf and Shafir (2009) found that people made better estimates when they identified the future date by which they thought they would finish, rather than estimating a number of days from now.

    Another way to make more accurate time forecasts is to consider how much time similar tasks took in the past (König, Wirz, Thomas, & Weidmann, 2015). It also helps to think about possible obstacles that can cause delays (Buehler et al., 2010). When your student is estimating that she can knock out that paper in three hours, she may not be considering possible interruptions, technology issues, or finding out that the key article she needs is not available full-text.
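    As a rough sketch of what consulting the past might look like in practice, the adjustment below scales a new estimate by how far off similar past estimates turned out to be. All numbers are invented for illustration; this is only one crude way to operationalize the idea, not a method prescribed by the research cited above.

```python
import statistics

# Hypothetical log of similar past tasks: how long each was estimated
# to take versus how long it actually took (hours)
past_estimated = [6, 8, 5, 7]
past_actual = [10, 14, 9, 12]

# Average ratio of actual to estimated time across past tasks
fudge_factor = statistics.mean(
    actual / estimated
    for actual, estimated in zip(past_actual, past_estimated)
)

# "I can knock out that paper in three hours," adjusted by history
raw_estimate_hours = 3
adjusted_estimate = raw_estimate_hours * fudge_factor
print(f"Adjusted estimate: {adjusted_estimate:.1f} hours")
```

Even a back-of-the-envelope correction like this makes the planning fallacy tangible: if past papers took roughly 70% longer than predicted, a three-hour plan is more honestly a five-hour one.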

    Imagining from the perspective of an observer can also improve accuracy (Buehler et al., 2010). What would your friend say about your plan to complete the literature review of your dissertation in one week?

    We might ask whether making better time estimates is that important. It doesn’t speed anything up or save time, right? But if our estimates are inaccurate, we will make mistakes in budgeting our time. Other things may fall through the cracks – sleep, for example – which could affect our well-being and success. One way to improve our relationship with time is to get a better handle on how much time we need. The research suggests that we can get better at it.

    At the upcoming APS-STP Teaching Institute, I will share a few other examples of how we can make use of the literature on the psychology of time. I hope to see you there …. if you can find the time!

    References

    Buehler, R., Griffin, D., & Peetz, J. (2010). The planning fallacy: Cognitive, motivational, and social origins. In M. P. Zanna & J. M. Olson (Eds.), Advances in experimental social psychology (Vol. 43, pp. 1-62). New York, NY: Academic Press. doi: 10.1016/S0065-2601(10)43001-4

    Kahneman, D., & Tversky, A. (1979). Intuitive prediction: Biases and corrective procedures. TIMS Studies in Management Science, 12, 313-327.

    König, C.J., Wirz, A., Thomas, K.E., & Weidmann, R.Z. (2015). The effects of previous misestimation of task duration on estimating future task duration. Current Psychology, 34(1), 1-13. doi: 10.1007/s12144-014-9236-3

    LeBoeuf, R. A., & Shafir, E. (2009). Anchoring on the “here” and “now” in time and distance judgments. Journal of Experimental Psychology: Learning, Memory, and Cognition. doi: 10.1037/a0013665

    David S. Kreiner is Professor and Chair of the School of Nutrition, Kinesiology, and Psychological Science at the University of Central Missouri, where he has taught since 1990. He earned his B.A. in Psychology and Ph.D. in Human Experimental Psychology from the University of Texas at Austin. He has taught courses including General Psychology, Orientation to Psychology, Research Design & Analysis I & II, History of Psychology, Advanced Statistics, Cognitive Psychology, and Sensation & Perception.  His research interests include language processing, memory, and the teaching of psychology.  He often collaborates with students on research projects and has coauthored publications and conference presentations with undergraduate and graduate students. 


  • 12 Feb 2019 3:32 PM | Anonymous member (Administrator)

    By Ashley Waggoner Denton, Ph.D., University of Toronto

    As an undergraduate student, I learned that being primed with the stereotype of professor could make me act smarter, that I might deplete my self-control if I refused the tempting cookies presented to me at a meeting, and that if I conducted a study whose findings were unexpected, I could just rewrite my introduction and tell a new story. Thankfully, I also learned how to learn, which prevented me from becoming trapped in a knowledge time warp. Psychological “facts” have an estimated half-life of seven years (Arbesman, 2013). Seven! This means that by the time you have completed graduate school, half of the psychological findings you learned as an undergraduate will have been updated, revised, or deemed outright wrong. Such is the nature of scientific progress. However, this helps drive home the point that one of our most important goals as teachers is to help our students develop into lifelong learners who will be able to continue learning (effectively and across a range of topics) long after they have left our classrooms. 

    The term learning how to learn comes from Fink’s taxonomy of significant learning (2013), which includes six major categories: foundational knowledge, application, integration, human dimension, caring, and learning how to learn. If you are not familiar with Fink’s model, I highly recommend checking it out (see recommended reading below). Learning how to learn takes a number of different forms, and in each of the courses I teach, at least one of these forms is emphasized. The first form is learning how to be a better student, the second form is learning how to construct new knowledge in a discipline, and the third form involves helping students become “self-directing learners” (Fink, 2013, p. 59), the key to which involves the ability to critically reflect on one’s own learning. Below I provide some examples of how I encourage these various forms of learning how to learn in different courses that I teach.

    Learning How to Be a Better Student

    Without a doubt, this form of learning how to learn gets emphasized the most in my Introductory Psychology class. In order to encourage my students to adopt better learning strategies, I don’t just teach them what psychologists have learned about effective study strategies (see links to helpful resources from the Learning Scientists below). Instead, I first let the students tell me (via a survey or in-class response system) how they typically study, and then I frame the lesson around their responses. Specifically, I address the limitations of their common habits (e.g., cramming) and study strategies (e.g., re-reading), explain why these strategies seem appealing despite their limitations, and then provide the students with more effective replacement strategies (e.g., retrieval practice), including an overview of the research that has been done on each strategy and specific tips for how to implement these strategies in Intro Psych. Rather than presenting this information in a preachy way (“everything you are doing is wrong and I know better!”), I want the students to recognize that they are not alone in using these common strategies, and that I completely understand why they use them, but that I have good reason to believe they can learn even more effectively by adopting some new strategies.

    In a similar vein, I also present students with research on the effects of technology use on learning (both in the classroom and when they are studying on their own). Again, I ultimately leave it up to the students (as self-directing learners!) to make their own decisions, but I arm them with the information that will allow them to make informed decisions about whether they take notes with a laptop or on paper, where they should leave their phone during class or a study session, and so on. A detailed slide-deck that can be used for covering this material in your own classes is available via a link below.  

    Learning How to Construct New Knowledge

    We all know that students should practice writing and get hands-on experience doing research as much as possible. Encouraging this form of learning how to learn is standard in any research methods or laboratory class. But it’s worth spending a moment to reflect on the type of inquiry and knowledge construction students are engaging in across all of your courses. Are they being pushed enough? Are they being asked to truly write and think “like a [social/cognitive/clinical etc.] psychologist,” or are they simply getting practice using some new terms and theories? As an example, students in my Intro to Social Psychology course used to complete an assignment where they analyzed an event from a social psychological perspective. It was a perfectly good assignment, but what were the students actually learning? Application is important, don't get me wrong (it has its own category in Fink’s model), but I have since replaced this assignment with an observational study project where the students must develop a hypothesis; design a study; collect, analyze, and interpret their data; and write everything up in a final APA-style report. This new assignment obviously requires a lot more scaffolding and resources, but the students walk away from the course not just being able to apply the knowledge they’ve learned, but with the ability to potentially contribute to that knowledge base. Additionally, they are in a better position to recognize the limitations of drawing conclusions from single studies and the importance of replication and reproducibility.

    Learning How to Become Self-Directing Learners

    Most of what our students do, they do because we tell them to. For example, students in my Social Psychology Laboratory class complete a research proposal because that is what they are told to do. They develop their own research question and hypothesis and design their own experiment, which all seems perfectly “self-directed.” However, the task falls short of its goal if the students fail to engage in a critical reflection of their learning throughout this process. The way that I encourage this (in this class and others) is through the use of reflective learning journals. Reflection changes everything. When students are encouraged to reflect on their learning it can improve their self-monitoring and goal-setting capabilities as well as lead to changes in study habits and other skills. It encourages students to focus more on the how and why of their learning, rather than simply on what they are learning. Students who are able to critically reflect on their learning are much more likely to develop into self-directing learners, so I do whatever I can to give my students practice with reflection. More information on how I have implemented reflective learning journals into my statistics course can be found in the Waggoner Denton (2018) article listed below.

    Self-directing learners are able to recognize gaps in their understanding and formulate plans for filling those gaps. As a developing teacher, you are likely to start noticing all sorts of gaps in your knowledge and skills (all those things that manage to go unnoticed until we actually have to explain it to someone!). The next time you go about filling in one of those gaps, take some time at the end to reflect on the process you just undertook. Who did you talk to? What did you read? Could you have done it better or more efficiently? And how did you know how to do these things? Would your students know what to do?

    Below are some resources that may be useful as you consider how to incorporate certain aspects of learning how to learn more fully within your own courses!

     

    Additional Reading/Resources:

    • Reflective Learning Journals in Statistics: Waggoner Denton, A. (2018). The use of a reflective learning journal in an introductory statistics course. Psychology Learning and Teaching, 17, 84-93. DOI: 10.1177/1475725717728676

     

    References

    Arbesman, S. (2013). The half-life of facts. New York: Penguin. 

    Fink, D.L. (2013). Creating significant learning experiences: An integrated approach to designing college courses.  San Francisco: Jossey-Bass.


    Ashley Waggoner Denton is an Associate Professor, Teaching Stream in the Department of Psychology at the University of Toronto. She received her Ph.D. in Social Psychology from Indiana University and completed her bachelor's degree at the University of Toronto. She teaches courses including Introductory Psychology, Social Psychology, Statistics, and the Social Psychology Laboratory. She also supervises undergraduate research projects that examine questions related to the social psychology of teaching and learning.

  • 17 Jan 2019 10:00 AM | Anonymous member (Administrator)

    By Teresa Ober, Elizabeth Che, and Patricia J. Brooks, GSTA Leadership

    In Fall 2018, the GSTA distributed a short survey to gather informal input about the preferences of graduate students with regard to a possible mentorship program. We were specifically interested in gauging whether graduate students would be interested in a program in which they would be mentored by early career psychologists.

    There have been past efforts to establish mentorship programs within the framework of existing professional organizations. The Society for the Teaching of Psychology recently formed a mentorship program pairing early career psychologists and advanced graduate students with more senior full-time faculty. The program was featured in a recent GSTA blog post by Dr. Diane Finley that describes some of the history and benefits of mentorship. Mentorship is thought to encourage networking, collaboration, and the sharing of instructional resources and ideas. Beyond these benefits, mentorship has also been shown to relate to decreased work-family conflict and increased job satisfaction in the long term (Tenenbaum et al., 2001).

    To date there has been relatively little systematic and quantitative research on mentorship as an evidence-based practice (Troisi, Leder, Stiegler-Balfour, Fleck, & Good, 2015), and virtually none on mentorship of graduate students in psychology. Existing research on professional mentorship between faculty and students indicates that it consists of two distinct components: instrumental and psychosocial help (Tenenbaum et al., 2001). “Instrumental help” involves coaching and training. “Psychosocial help” includes empathizing and counseling. In conducting this survey, we were particularly interested in the types of instrumental help that graduate students might seek in a mentorship program, as well as what types of mentorship models and modes of communication would be preferred. Research in this area is necessary to understand whether graduate students have unique needs and interests as potential mentees.

    Survey

    We sought to identify interests related to professional mentorship among graduate students, particularly those with a background in teaching. Last fall (October 12-November 7, 2018), the GSTA distributed a short survey to gather informal input about the preferences of graduate student instructors that would help to guide recommendations for a possible mentorship program. Graduate students were invited to participate in the survey through various STP channels of communication, including the STP and GSTA social media pages (e.g., Facebook, Twitter) and email (STP/DIV2 listserv). The survey received a total of 78 responses, summarized below.

    Sample Characteristics

    Graduate student respondents were asked various questions about their areas of specialization and years in graduate school. Approximately one out of four respondents indicated their field was social psychology (25.6%). There were equal proportions of respondents from clinical and cognitive psychology (14.1%), followed by developmental psychology (9.0%), and neuroscience (7.7%). Nearly half of respondents were in the second (23.1%) or third (25.6%) year of their program, followed by those in the first year (16.7%). Respondents in their fourth (12.8%), fifth (14.1%), sixth (6.4%) or seventh or higher (1.3%) year in the program represented about one out of three respondents.

    When asked about their post-graduation plans, more respondents indicated an interest in working at a research-based institution (61.5%) than at a teaching-based institution (41.0%); note that respondents could indicate interest in both. Respondents indicated a preference to work at a public institution (59.0%) over a private institution (42.3%). There appeared to be a negligible difference in the preference for working at a large institution (46.2%) as opposed to a small institution (44.9%). A minority of respondents indicated an interest in working at a nonprofit organization post-graduation (2.6%).

    Interest in a Mentorship Program

    Over 9 in 10 respondents indicated either a potential interest (51.3%) or a definite interest (39.7%) in being mentored by an early career psychologist. The remainder (9.0%) indicated no interest and did not provide an explanation for why.

    The survey asked respondents which topics they would like mentorship to focus on; note that respondents could indicate interest in multiple topics. Half of the respondents indicated they would like mentorship to focus on how to prepare for the job market (50.0%). Others indicated they would also like mentorship around teaching (11.4%), how to prepare work for publication (10.0%), research advisement (10.0%), engagement in service (8.6%), innovation in the field (1.4%), and jobs outside of academia (1.4%). Some indicated they were open to and interested in mentorship for all of the above topics (5.7%).

    Respondents were asked what types of mentorship models they would most prefer. Over half of the respondents indicated an interest in dyadic mentorship (60.6%), while a minority indicated interest in a group mentorship model (35.2%). Other respondents were content with either option (4.2%).

    The survey asked respondents how frequently they would like to communicate with their mentor(s). Most respondents preferred meeting about once a month (46.5%) or twice a month (36.6%). Weekly communication was less popular, though preferred by some respondents (11.3%). Even fewer respondents indicated an interest in communicating less frequently, about once every three months (5.6%).

    Respondents also indicated their preferred channels of communication for a potential mentorship program, with respondents given the option to select multiple options. The vast majority of respondents indicated a preference for email (88.7%) or in-person (85.9%) communication. About half also indicated a preference for video calls (47.9%). Other respondents indicated phone (36.6%) or text messaging (36.6%) as preferred channels of communication as well.

    Summary of Key Findings

    Mentorship opportunities may be especially beneficial for graduate students as they try to gain a professional footing. Such opportunities can connect graduate students studying psychology to others in the field, possibly leading to long-term collaborations. Because there had been no previous systematic investigation into the needs and interests of potential graduate student mentees, we distributed this survey to gather this information. The responses indicated a preference for a mentorship program structured around a dyadic mentor-mentee arrangement. The results also suggested that respondents preferred communication on an approximately monthly or twice-monthly basis. The most popular means of communication appeared to be email and in-person meetings; however, over a third also indicated a preference for video calls, phone, or text messaging. These findings shed light on effective ways to organize a mentorship program.

    With regard to the focus of the mentorship, given that we recruited through STP and GSTA communication channels, we were surprised that fewer than half of the respondents (41.0%) indicated interest in a teaching-based position post-graduation, and even fewer (11.4%) indicated interest in mentorship around teaching. Most of the respondents were in the earlier years of their program (first to third), suggesting that there is demand for a mentorship program geared towards students in the earlier phase of their doctoral studies.

    Our findings pointed towards a greater interest and need among graduate students for mentoring on issues centrally related to preparing for the job market. Recent news articles have featured the many challenges associated with entering the job market (Smith, 2019), particularly for those who are pursuing careers in academia (Smith, 2017). Given the context of such a competitive job market even for highly skilled individuals, a successful mentorship for graduate students should incorporate both aspects of help described by Tenenbaum et al. (2001), with a focus on preparing students with the instrumental knowledge necessary for applying for jobs, and the psychosocial support to buffer the challenges and inevitable rejections they will experience in the process.

    Participation in mentorship may create expectations around the education and training of graduate students as a continuous endeavor (Epstein & Hundert, 2002). Such a perspective may be particularly helpful for advanced graduate students and recent post-graduates who anticipate preparing for a competitive job market, particularly in academia. Professional mentorship opportunities may be one way to better prepare recent graduates for a long-term career, rather than forcing them to abruptly recalibrate their job ambitions. Having such opportunities beyond the formal student-advisor relationship may be one means by which institutions and organizations can promote a culture where the continual development of professional competency is held in high regard.


    References

    Epstein, R. M., & Hundert, E. M. (2002). Defining and assessing professional competence. Journal of the American Medical Association, 287, 226–235.

    Smith, N. (2017, Oct 4). Too many people dream of a charmed life in academia. Bloomberg. Retrieved from https://www.bloomberg.com/opinion/articles/2017-10-04/too-many-people-dream-of-a-charmed-life-in-academia

    Smith, N. (2019, Jan 9). Burned-out millennials need careers, not just jobs. Bloomberg. Retrieved from https://www.bloomberg.com/opinion/articles/2019-01-09/millennial-burnout-young-adults-need-careers-not-jobs

    Tenenbaum, H. R., Crosby, F. J., & Gliner, M. D. (2001). Mentoring relationships in graduate school. Journal of Vocational Behavior, 59(3), 326-341.

    Troisi, J. D., Leder, S., Stiegler-Balfour, J. J., Fleck, B. K., & Good, J. J. (2015). Effective teaching outcomes associated with the mentorship of early career psychologists. Teaching of Psychology, 42(3), 242-247.


    Teresa Ober is a doctoral candidate in Educational Psychology at the Graduate Center of the City University of New York. Teresa designed and created Manuscript Builder in completion of the certificate program in Interactive Technology and Pedagogy at the Graduate Center. She is interested in the role of executive functions in language and literacy. Her research has focused on the development of cognition and language skills, as well as how technologies, including digital games, can be used to improve learning.

    Elizabeth S. Che is a doctoral student in Educational Psychology at the Graduate Center, CUNY and the GSTA Deputy Chair. Her research interests include individual differences in language development, creativity, and pedagogy.

    Patricia J. Brooks is Professor of Psychology at the College of Staten Island and the Graduate Center, CUNY and GSTA Faculty Advisor.  Brooks was recipient of the 2016 President’s Dolphin Award for Outstanding Teaching at the College of Staten Island, CUNY.  Her research interests are in two broad areas: (1) individual differences in language learning, (2) development of effective pedagogy to support diverse learners.​
