Society for the Teaching of Psychology: Division 2 of the American Psychological Association

E-xcellence in Teaching
Editor: Annie S. Ditta

  • 16 Jul 2024 5:31 PM

    Lisa Dierker
    Wesleyan University

    My Story

    I was still in my 20s when I arrived at Wesleyan University, fresh off a 3-year post-doctoral fellowship at the Yale School of Medicine. When asked to teach a research methods course, I had what felt like a brilliant idea driving home from the grocery store one day. I would not use a textbook and I would not deliver lectures. My own classroom training had been ineffective and uninspiring. As I tell my students, I learned 20 different kinds of post hoc tests but didn’t understand when or why to actually use one. So, instead of drowning my own students in information the way I had been drowned, I decided to get them involved with large, real-world data sets and support them in conducting original research. I would teach them what they needed to know when they needed to know it and not before. Their own questions would drive the learning and I would help them to experience the research process from start to finish. Passion-Driven Statistics was born!

    Ten years later, it would become a multidisciplinary introductory statistics course at Wesleyan and a National Science Foundation-funded model serving thousands of students across disciplines and educational environments in the United States and internationally (e.g., Canada, Ghana, Nigeria, the Philippines, Peru, and the United Kingdom, and still growing). Passion-Driven Statistics is now a widely used project-based curriculum that has been implemented as a statistics course, a research methods course, a data science course, a capstone experience, and a summer research boot camp. Liberal arts colleges, large state universities, regional colleges and universities, medical schools, community colleges, and high schools have all successfully implemented the model.

    The curriculum has been found to attract higher rates of under-represented minority (URM) students than a traditional statistics course, and students enrolled in Passion-Driven Statistics are more likely to report increased confidence in working with data and increased interest in pursuing advanced statistics coursework (Dierker et al., 2018). This project-based approach also promotes further training in statistics. Using causal inference techniques to achieve matched comparisons across three different statistics courses, students originally enrolled in Passion-Driven Statistics were significantly more likely to take at least one additional undergraduate course focused on statistical concepts, applied data analysis, and/or the use of statistical software than students taking either an activity-based psychology statistics course or a math statistics course (Nazzaro et al., 2020). In more recent research, Passion-Driven Statistics has been associated, post-graduation, with a higher likelihood of holding a job in which a primary responsibility includes working with data, greater confidence in working with data, and a higher likelihood of earning more than $100K annually (Dierker et al., in press).

    A New Role

    I always thought that I understood the ingredients that make Passion-Driven Statistics so empowering, and if asked, I would have told you about the opportunity to ask your own research questions, or I would have pointed to its just-in-time and need-to-know approach to content knowledge, or even its focus on technical skills in the service of disciplinary content and critical thinking. This year, I stepped back in to teach the course after several years away from it. Seeing it with fresh eyes more than 20 years after that first spark of inspiration made me realize that so much of its power comes from the simple act of new learners teaching newer learners.

    I used to be the “new learner,” understanding exactly what it felt like to encounter and struggle with the abstract concepts, disciplinary jargon, mathematical complexity, and arcane programming syntax involved in authentic research. Two decades later, I find that my role in the course has changed. I am no longer a new learner, and as much as I try to recreate that space and those feelings in myself, the “curse of knowledge” and my hard-won expertise hold me back. Now, I am recognizing an entirely new role in supporting those former Passion-Driven Statistics students who have generously stepped in as peer mentors, warmly guiding our newest generation of students in the same empowering way that I was able to all those years ago. They are the new learners teaching our newer learners from a place of empathy, passion, patience, high expectations, and mutual support. Every day in class, I see them using their new learners’ superpowers to inspire others, to explain concepts by getting to the simpler, more digestible parts faster, and to understand students’ perspectives in a deeply genuine way. I have loved watching them hone their listening skills, adapt to the needs of individual students, and nurture those students in the same ways that meaningfully shaped their own educational trajectories when they were the newer learners.

    Working this semester with some of the current peer mentors, Joyce Sun, Erin Byrne, and Luis Perez, has reminded me that Passion-Driven Statistics is as much a culture as it is a course. It is a space where no one needs to know everything, where we can all bring our best stuff, and where moral support and compassionate engagement allow our students to become the heroes of their own learning. Together, we take students out of their comfort zone and then love them through the fallout by creating an inviting classroom and an experience that gives students a safe and supportive space to get things wrong before they get them right.

    While my role as expert in this space may continue to be necessary and even valuable on rare occasions, it is also wholly insufficient. It is only together with new learners, our newer learners, and expert voices that we hold the necessary and sufficient ingredients to change lives in the data analytics space. I know, it sounds rather dramatic, but it is! 

    And if that were not enough, I am also marveling at the chorus that I have continued to hear from peer mentors across the years: that they “learn more as a peer mentor” than they did when taking the Passion-Driven Statistics course for the first time. Though secondary and post-secondary education continues to resist the power of learning through teaching, it is the most untapped, cost-positive tool that we currently have as educators. I believe that it is stronger even than the current promises of AI. Peer mentors may serve as volunteers, be paid through student work programs or training grants, or receive course credit as teaching assistants or through course designations (e.g., a statistics education practicum). It does not have to be a promise for the future. We have everything that we need right now.

    The Next Step

    You might be interested to learn that my time away from teaching Passion-Driven Statistics has been spent designing a new project-based curriculum aimed at reimagining General Education. The goal of this new initiative is to expose students to a wide range of digital skills as they learn traditional disciplinary content. Within our digital “Introduction to Psychology” course, students explore concepts and content in the field of psychology through video storytelling, programming, data visualization, web development, design and more. This novel curriculum is aimed at solidifying new content knowledge, exposing students to modern digital tools, and providing them with the opportunity to create new learning artifacts.

    And with this, I have found myself a new learner again, not just conquering new content outside of my research subdiscipline, but learning new tools, new skills, new design principles and being useful again the way only a new learner can be. All this newness is of course accompanied by uncertainty, vulnerability, and the distinct possibility of utter failure. It is hard and that is what I love about it. I find myself feeling inspired again and eager to bound out of bed in the morning to face new challenges and to find the transformative experience that I first found in the Passion-Driven Statistics classroom all those years ago.

    I am always eager to network with passionate instructors excited about things we have not even imagined yet. Please feel free to reach out at

    Resources for Passion-Driven Statistics are available at Some that you might find particularly useful include a free e-book and translation code aimed at supporting the use of diverse statistical software. Resources for Digital Intro are available at I encourage you to take advantage of our introductory psychology lessons and project videos on our YouTube channel. I am also happy to share a new project-sharing platform, OpenLab, where students can get inspired, post learning artifacts, and share their work and learning by creating a free digital portfolio. Follow us on Instagram or check us out on LinkedIn to learn more!

  • 15 Apr 2024 6:42 PM

    Rachel T. Walker
    University of the Incarnate Word
    Click here for a link to the article with figures

    When I was an undergraduate student in biology, I decided to take a statistics course in psychology. I didn’t realize at that time that I would later be teaching this course in graduate school, and I couldn’t imagine that statistics would become one of my favorite courses to teach. Statistics can be a challenging subject for many students, but effective teaching can make a significant difference in how it's perceived (Pan & Tang, 2005). Over the years, I have taught this course using a variety of teaching strategies, depending on how the department structured the course. As I progressed, I wanted the course to be flexible and responsive to students’ needs and to create an active, effective learning experience for teaching behavioral statistics.

    Over the years, I have continued to ask questions related to effective teaching strategies. What if I embedded videos or journal articles related to the real-world application of statistics? How could formative assessments such as quizzes, discussions, and polls during the course gauge students’ understanding? How can I use hands-on applications to illustrate the concepts of the material? Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS (a software package used for the analysis of statistical data) to provide a hands-on project? How can I use scaffolded learning to break down complex statistical concepts? I will share some of the ways I have addressed these questions.

    What if I embedded videos and research related to the real-world application of statistics?

    I embed videos and research materials using a mixture of resources. Here are several examples of how I use short videos within the lecture. I show the videos during class, but students can also access them outside of class to confirm their understanding of the material.

    I incorporate various Crash Course Statistics videos into the semester, offering detailed examples that illustrate the practical applications of specific statistical concepts in our daily lives. Before the start of the semester, I reach out to students and share a Crash Course Statistics video that explains the purpose of statistics; for example, how meteorologists use statistical methods to analyze historical weather data, identify patterns, and make predictions about future weather conditions, and how companies use statistics to analyze consumer behavior, preferences, and trends. I incorporate additional Crash Course Statistics videos to provide a preview of specific statistical concepts, such as central tendency, before diving into the lecture content. For instance, before the central tendency lecture, I share a video that provides an overview of how these statistics can determine the center of both normal and skewed distributions.

    Crash Course Statistics Preview

    Mean, Median, and Mode: Measures of Central Tendency: Crash Course Statistics #3
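    The point made in the video above can also be shown in a few lines of code. Here is a minimal Python sketch (the scores are made up for illustration) of how the mean, median, and mode agree in a symmetric distribution but pull apart in a positively skewed one:

```python
# Made-up scores: a symmetric distribution and a positively skewed one.
from statistics import mean, median, mode

symmetric = [2, 3, 3, 4, 4, 4, 5, 5, 6]   # mean == median == mode
skewed = [2, 2, 2, 3, 3, 4, 5, 9, 15]     # the long right tail pulls the mean up

print(mean(symmetric), median(symmetric), mode(symmetric))  # all three equal 4
print(mean(skewed), median(skewed), mode(skewed))           # mean 5 > median 3 > mode 2
```

    The second line illustrates the classic ordering for positive skew (mode < median < mean), which is exactly the pattern students should recognize when deciding which measure of center to report.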

    In one of the classes, I cover the four levels of measurement along with fundamental definitions and a few examples. Following that, I present a brief video offering visual insights into the distinctions among the measurement scales.

    Data Science & Statistics: Levels of measurement

    Another example of a short video involves the application of bar graphs. I first instruct students on using the X- and Y-axes to depict data. Students acquire the skills to construct histograms and bar charts and to interpret their representations. Once they grasp the fundamentals of bar graphs, I introduce a video that provides real-world instances of commonly shared misleading graphs.

    How to spot a misleading graph

    In addition to videos, I also distribute sections of a journal article, giving students the chance to practice reading and interpreting a results section. I first provide students with the abstract to offer a brief overview of the article, highlighting the main objectives, methods, results, and conclusions of the work. I then share the results section to provide an overview of how results are structured for the statistic that relates to the lecture. This is usually the first time that students are introduced to reading the results of a scientific article related to psychology. This process helps students understand how statistics are reported in a journal article and the use of APA format. In other psychology courses, students will be required to summarize scientific articles and understand the methods and analyses.

    How could formative assessments such as quizzes, discussions, and polls during the course gauge student understanding?

    Quick quizzes are embedded throughout the lecture to test student understanding after each small section of content. These questions, taken from Cengage’s instructor materials for the textbook (Essentials of Statistics for the Behavioral Sciences, 10th ed., Gravetter et al.), could be multiple choice, true or false, or applied research questions. This process allows students to confirm they understand the course material before we move forward in the chapter.

    I incorporate discussion group assignments in the course to encourage active engagement among students. Throughout the semester, I offer six discussion board opportunities, where students submit their discussion topics and respond to posts from their peers.

    Here are several examples of sources that could be used for creating discussion group assignments:

    1) A majority of Americans have heard of ChatGPT, but few have tried it themselves. Integrate the information from the tables into your overall understanding of the material.

    2) How to defend yourself against misleading statistics in the news.

    Integrate the information in the video in your overall understanding of misleading statistics.

    3) Correlating Barriers to Medication Adherence With Trait Anxiety, Social Stigma, and Peer Support in College Students With Chronic Illness

    Integrate the information from the tables and results section into your overall understanding of the material.

    Directions for Response: Make sure your responses are well thought out, providing at least 3 sentences for each section. Respond to each of the following questions: Describe the topic provided by this resource. What did you find interesting? How would this relate to the real world? What did you find challenging to understand?

    Directions for Replies: Replies to colleagues should be at least 3 sentences as well. Reply to another student's post: replies can include your thoughts about the student's perception of the source or your additional thoughts on the topic related to the source.

    Moreover, I employ Poll Everywhere in diverse ways within a lecture. For example, at the beginning of a lecture on descriptive statistics, students are asked, “What type of social media is used the most in the U.S.?” Once students submit their thoughts, I show them the data related to this question, which does not support most of their responses for adults. However, I then provide data on teens' use of social media, which is closer to their responses. After the discussion, I lecture on descriptive statistics.

    Here are the links I shared from the Pew Research Center; we discussed the changes over time.

    I also use Poll Everywhere towards the end of a lecture to ensure that students understand the content. For example: rate your level of understanding of how to calculate an independent t-test. If students respond that they are struggling with this issue, it provides useful feedback, and students can ask specific questions regarding their issue.

    How can I use hands-on applications to illustrate the concepts of the material?

    Here is an example of how I utilize hands-on applications in class. First, I teach students how to read and understand a research scenario, determining information such as the alternative hypothesis, the alpha level, and the variables provided.

    During a lecture, I present how to use that information in the 4-step process for hypothesis testing.

    1. State null and alternative hypotheses.

    2. Identify the critical region based on alpha level, one or two-tailed hypothesis, and degrees of freedom.

    3. Compute the statistics by showing all calculations.

    4. Draw the distribution with the critical region and the test statistic. Conclude and report the findings in APA format.
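    To make the arithmetic behind these four steps concrete, here is a minimal Python sketch of the process for an independent-measures t-test, using made-up scores and the standard pooled-variance formulas (the critical value is read from a t table):

```python
# A sketch of the 4-step hypothesis-testing process for an independent-measures
# t-test. The data are hypothetical; formulas are the standard pooled-variance ones.
import math

group1 = [4, 5, 6, 7, 8]   # e.g., treatment scores
group2 = [1, 2, 3, 4, 5]   # e.g., control scores

# Step 1: state hypotheses (H0: mu1 == mu2; H1: mu1 != mu2).

# Step 2: identify the critical region (alpha = .05, two-tailed).
n1, n2 = len(group1), len(group2)
df = n1 + n2 - 2                    # degrees of freedom = 8
t_critical = 2.306                  # from a t table: t(8), alpha = .05, two-tailed

# Step 3: compute the test statistic, showing each quantity.
m1 = sum(group1) / n1
m2 = sum(group2) / n2
ss1 = sum((x - m1) ** 2 for x in group1)   # sum of squared deviations
ss2 = sum((x - m2) ** 2 for x in group2)
pooled_var = (ss1 + ss2) / df
se = math.sqrt(pooled_var / n1 + pooled_var / n2)  # estimated standard error

t = (m1 - m2) / se

# Step 4: compare the statistic to the critical region and conclude.
reject_h0 = abs(t) > t_critical
print(f"t({df}) = {t:.2f}, reject H0: {reject_h0}")  # → t(8) = 3.00, reject H0: True
```

    In practice, statistical software reports an exact p-value instead of a table lookup, but working through the calculation this way mirrors the step-by-step process students complete by hand.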

    After students take notes on this process, I provide them with another research scenario to solve during class. While students are working through the 4-step process, I assist them along the way. For example, if a student wants to know whether they are on the correct path, they might ask if their critical region is correct. If the student is incorrect, instead of saying no, I ask them a question: I ask them to show me how they came to that conclusion. This process allows the student to find the correct answer in most cases. Students can then proceed to complete homework questions using the hands-on applications introduced in class. In many real-world scenarios, the use of statistical software and tools has become standard due to their efficiency, accuracy, and ability to handle large datasets. Manual calculation, however, can be more effective in conveying the step-by-step process, contributing to better conceptual understanding. And in some work situations with small datasets, manual calculations can be quicker than setting up and using statistical software.

    Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS to provide a hands-on project?

    How can I alter my previous teaching of behavioral statistics? I did something I thought I would never do. I removed some of the content to provide students with the opportunity to learn how to analyze, interpret, and summarize their results by integrating technology. I use SPSS, but other types of software can be used, such as Microsoft Excel. I've excluded lectures covering paired t-tests, two-way ANOVA, and regression. While these statistics are referenced in a lecture, students won't receive in-depth information about these subjects. Our department offers an elective course in Advanced Statistics, providing students with the opportunity to delve into more intricate statistical concepts. In addition, this change allowed me to use those class times to embed a lab component into the lectures.

    I provide students with a preexisting dataset that I collected earlier, which they use in the lab component of the class. This data is employed for descriptive statistics, independent t-tests, one-way ANOVA, and correlations. I familiarize students with the broader subject of the research they will be examining, which involves personality and social networking. Subsequently, I clarify the variables and their measurements, such as gregariousness and the frequency of social media usage. In the lab, I guide them through the SPSS layout to enhance their understanding of the software's functionality and then provide a lab for each of the four types of statistics that will be analyzed in SPSS. For example, after I teach the independent t-test, I will have a lab focused on how to calculate the independent t-test in SPSS, how to interpret the output, and how to write up the findings in APA format. I provide handouts for the lab that include an introduction, the steps to complete in SPSS, an example of the output, and a paragraph summarizing the findings of the example. As I explain this process, students follow along by mimicking my steps. Subsequently, I task students with forming hypotheses derived from the measured variables. In the assignment, students are required to generate two hypotheses. I review each hypothesis before examining the analysis of the first one in the lab. Afterward, I provide feedback on the results of each student's first hypothesis before the conclusion of the lab session. Throughout the lab, I employ the Socratic method to facilitate learning and guide students in completing the assignment related to the second hypothesis outside of class.
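    The course uses SPSS for the write-up step, but the structure of an APA-style results sentence can be sketched in a few lines of code. Here is a hypothetical Python helper (the function name, numbers, and wording are illustrative, not taken from the course handouts):

```python
# Hypothetical helper: turn independent t-test output into an APA-style
# results sentence. All names and numbers are illustrative.
def apa_t_sentence(m1, sd1, m2, sd2, t, df, significant, alpha=0.05):
    """Format independent-samples t-test results in APA style."""
    verb = ("differed significantly from" if significant
            else "did not differ significantly from")
    sign = "<" if significant else ">"
    p_text = f"{alpha:.2f}".lstrip("0")   # APA drops the leading zero: .05
    return (f"Group 1 (M = {m1:.2f}, SD = {sd1:.2f}) {verb} "
            f"Group 2 (M = {m2:.2f}, SD = {sd2:.2f}), "
            f"t({df}) = {t:.2f}, p {sign} {p_text}.")

print(apa_t_sentence(6.0, 1.58, 3.0, 1.58, t=3.0, df=8, significant=True))
# → Group 1 (M = 6.00, SD = 1.58) differed significantly from
#   Group 2 (M = 3.00, SD = 1.58), t(8) = 3.00, p < .05.
```

    A template like this mirrors the "paragraph of the findings" on the lab handout: it forces students to notice exactly which pieces of the output (means, standard deviations, t, df) belong in the sentence.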

    How can I use scaffolded learning to break down complex statistical concepts?

    Teaching a course using scaffolding involves providing structured support to students as they learn new concepts, gradually removing this support as they gain mastery. Here's my step-by-step guide on how I implement scaffolding in a course:

    1) Assess prior knowledge: I use Poll Everywhere at the beginning of a lecture.

    2) Break down the information: I define terms, provide steps for analysis, and utilize quizzes.

    3) Provide guidance: I allow students individual practice in and out of the classroom.

    4) Encourage collaboration: I embed collaboration with the instructor and other students.

    5) Continuous assessment: I assess in-class calculations, Poll Everywhere responses, and quizzes.

    6) Gradual release of responsibility: I utilize the Socratic method in the lecture and lab.

    7) Applications to real-world tasks: I offer discussions on real-world situations and provide students with the opportunity to analyze, interpret, and report on existing data.

    8) Flexibility: I utilize an adaptive approach based on the various levels of support needed.

    It is essential to teach statistics according to students' needs and to foster an active and effective learning experience. This includes using active learning methods, such as hands-on activities and engaging discussions, to keep students motivated and involved in the learning process. Additionally, enhancing understanding by presenting practical situations, connecting statistical concepts to real-world scenarios, and equipping students with proactive skills and problem-solving abilities are key objectives of this approach.

    In summary, teaching statistics in a way that addresses students' needs and incorporates active learning methodologies enhances the overall learning experience, making the subject more accessible, engaging, and applicable to students' academic and professional pursuits.

    I consistently adapt and modify the course design in response to student feedback and through collaboration with fellow instructors. This ongoing process makes teaching this course a continuous and rewarding experience. This story never ends… which makes this course still one of my favorite courses.


    Pan, W., & Tang, M. (2005). Students' perceptions on factors of statistics anxiety and instructional strategies. Journal of Instructional Psychology, 32(3), 205.

  • 20 Mar 2024 2:15 PM

    Mona Corinna Griesberg
    FernUniversität in Hagen, Germany

    During my psychology bachelor’s program in Germany, classes were mostly teacher-centered lectures that allowed little student engagement. Fortunately, psychology classes at a small liberal arts college in Michigan, USA, introduced me to less hierarchical, feminist teaching formats. At the beginning of her feminist psychology course, Dr. Karyn Boatwright shared her feminist teaching philosophy, which aims to create collaborative learning communities (Enns et al., 2005; Sinacore & Boatwright, 2005). She gave me the opportunity to facilitate a social action project focused on sexism research. In the following, I describe the project and reflect on its benefits and challenges. By sharing my experience as a teaching assistant, I hope to encourage fellow educators to create more wholesome learning opportunities for students to gain research and feminist leadership experience.

    Feminist Leadership & Project Goals

    Dr. Karyn Boatwright teaches Feminist Psychology of Women every winter term. Within ten weeks, two classes of circa twenty students each meet three times a week to discuss feminist issues from a psychological perspective. An integral part of the course is students’ participation in social action projects, which makes up thirty percent of their grade. At the beginning of the term, students received a list of possible social action projects and informed Dr. Boatwright about their preferences. For example, the projects were about creating a more inclusive environment at the campus gym, a theatre performance on reproductive rights, or feminist peer-support groups for students. I offered a collaborative research group focusing on sexism research. Each social action group consisted of five to ten students and met outside of class to work on a social justice issue. The projects aimed to promote community engagement, political activism, and long-term social change. In the following, I describe the social action group that I facilitated as a teaching assistant: a collaborative research group focusing on ambivalent sexism. As the facilitator of this group, I endeavored to follow my professor’s example and apply feminist leadership principles. That meant creating collaborative learning spaces that were - contrary to my former experiences and those of many others in higher education - based on relationship building and that welcomed expressions of emotion, intuition, and vulnerability. I wanted to minimize hierarchies between the students and myself while allowing us to build trust and connection. Furthermore, the goal was to empower the students, to foster awareness of the needs of marginalized communities, and to increase understanding of how feminist research can contribute to social justice and improved well-being.

    Academia and research have long been less accessible to people with marginalized identities (DeBlaere, 2020). I hoped the project would help students feel less intimidated by research, connect research to personal experiences, and build confidence knowing that they have much to contribute to academic spaces. However, the project was not directly aimed at encouraging students to stay in academia or build a career in research. Instead, it offered opportunities for students to build their curiosity by finding and exploring topics about which they cared deeply and wanted to learn more. I wanted to open their minds to the different ways in which research can be conducted, so they would have a broader view of what research and academia can look like, could contribute to change within the fields, and could make more informed career decisions.

    Project Activities

    Since the project was carried out during the COVID-19 pandemic, it was limited to synchronous and asynchronous online learning. Our weekly online meetings gave the project the necessary structure. In preparation for these meetings, students did their individual literature review on ambivalent sexism research (Glick & Fiske, 2018). I recommended that they invest up to two hours per week in their research. In an online drive, students could view recommended scientific research papers and other materials like podcast episodes. Moreover, I encouraged students to look for alternative materials to engage with the project topic in a way that would suit their interests. If they found interesting alternative materials, they added them to our online drive.

    During our weekly meetings, we discussed their individual research of the past week. I invited students to pose questions and to share their newly gained knowledge. For example, if someone didn’t understand the statistics of a research paper they had read, we looked at it again together. We also shared observations from our daily lives that aligned with the research content we had learnt about. Towards the end of the term, we used our group meetings to plan our final online event. Even though students’ individual research was the basis for their learning, the regular meetings created a communal learning experience.

    In addition to our weekly project meeting, I organized online meetings with international researchers who worked on topics related to ambivalent sexism. Students attended on a voluntary basis and asked questions about the research and academic life. It was also an opportunity to practice networking, an important skill for career development. 

    Furthermore, I met students one-on-one online, at least twice throughout the semester. The first time was a chance to get to know each other better and discuss expectations for the research project and their first impressions. Towards the end of the project, we met again to reflect on the overall learning experience, to exchange feedback and discuss their grading.

    At the end of the term, our project group hosted a public, online event to which all students of Feminist Psychology and other interested community members were invited. At the event, the project members presented what they had learnt about ambivalent sexism and engaged the audience in a discussion. It was an opportunity to share their new knowledge, to practice presenting research in an appealing way and to get feedback from other community members about how the scientific concepts related to their personal experiences.


    The grading was based on students’ attendance at the group meetings, their independent research, and their participation in the final online event about our project. At the beginning of the project, I explained to the students that they would keep track of their engagement themselves. At the end of the term, I met with each student; we reflected on the group project, and they told me how they graded their own engagement. Unless it was very different from my impression, the grade was set and contributed thirty percent to their final grade in the Feminist Psychology course.

    Most students had been very engaged in our group project and graded themselves accordingly. There was one student whose engagement seemed low to me at the beginning of the project. After the first few weeks, we met one-on-one and discussed how she could improve her learning experience and contribute more to our group learning. If educators get the impression that a student isn’t engaged, and capacities allow, I would advise them to do the same: get in touch with the student, try to find out how they feel about their current engagement, and, depending on their interest, reflect on how they may engage more.

    Learning Through Group Facilitation

    For me as a teaching assistant, the project was a fantastic opportunity to practice feminist group leadership, project development and social justice research. I engaged in networking and community building and learned about student supervision, skills that proved useful in graduate school and my further career. Besides, it was simply fun to get to know the students and to learn from them in challenging discussions. For example, we talked about how our families’ dynamics had contributed to us internalizing gender roles. We also reflected on the image of research and how it had influenced our own academic aspirations.

    In parallel to the social action project, I was working on my bachelor’s thesis about ambivalent sexism and the engagement with my peers helped my motivation and creativity. Furthermore, the group facilitation and collaboration with Dr. Karyn Boatwright enhanced my passion for researching and teaching psychology and combining these with social justice work. The positive student feedback also encouraged her to continue working with me.

    Towards More Wholesome Learning Experiences

    The success of the ambivalent sexism research project led to another social action project that I facilitated in 2023. This time, seven students took part, and the project centered upon lesbian, gay, bisexual, transgender, queer and related (LGBTQ+) research. Because the pandemic restrictions had been loosened, the project was conducted on campus, creating new opportunities: in addition to independent research, weekly group meetings, and one-on-one meetings, I invited students to engage in further learning activities. For example, some students and I attended external events together, a game night for community building at a local non-governmental organization and a presentation of recent LGBTQ+ research at a neighboring university. We also met for a crafting evening on campus where a queer artist joined us online to teach us to craft queer zines. A few of us met for an intimate self-care morning including meditation and talking circles, which created special bonding moments between the students. These activities were all voluntary, and there was no grading penalty if students did not join.

    Lastly, I want to highlight a memorable project event: Through an LGBTQ+ organization, we got in touch with bi- and pansexual women from the local community and invited them to join one of the Feminist Psychology classes. Four women agreed to meet and share their experiences and perspectives with the class. We met in a room on campus that allows comfortable seating in a circle next to a fireplace. Providing hot chocolate, tea, and snacks contributed to the rather informal and comfortable setting. The students in our group project had prepared questions about the women's identities, wishes, experiences of discrimination, and coping strategies, and they facilitated the class session. The speakers shared many personal insights, for example, about religious communities and female sexual empowerment. Their openness allowed for an empowering experience for all attendees. Many students explicitly mentioned the positive impact of the event in their course reflections.

    Compared to the ambivalent sexism research project, the LGBTQ+ research project took a more holistic learning approach by including independent research, group meetings focusing on scientific research as well as artistic expression and self-care, one-on-one meetings, online meetings with international researchers, an in-class meeting with local community members, and further voluntary events off campus. These voluntary activities were offered to the students so they could explore their academic and personal interests, find inspiration, experience a sense of community, gain confidence in scientific discussions and in their research skills, build connections with international researchers, and more. It was the students' responsibility to decide how, and how much, they could and wanted to invest in the project and in the group. This freedom was essential to avoid emotional overload and to create positive experiences in research engagement.

    Future Directions and Considerations

    In other educational settings, similar projects may require different degrees of structure and flexibility. We implemented the group projects in Feminist Psychology of Women at a small liberal arts college in the Midwest, which has a relatively small student body with small cohorts and classes. Fewer than ten students participated in each social action project. Other lecturers and teaching assistants may not have the resources to invest this much time and effort into so few students. However, I would like to encourage lecturers in higher education to acknowledge the resources that their teaching assistants, advanced students, and interns bring to the table. Sharing teaching and leadership responsibilities can not only ease the lecturer's work but also create new learning opportunities for the group facilitators. Each social action group in Feminist Psychology was facilitated by a teaching assistant. We had the responsibility of overseeing the progress of our project group. The lecturer, Dr. Karyn Boatwright, met with us teaching assistants weekly to talk about how the projects were going and whether we needed any additional support.

    With the support of Dr. Karyn Boatwright and with few bureaucratic barriers, I had much freedom in developing and facilitating the projects. This may be different at other higher education institutions where, for example, the curricula and grading guidelines may be stricter. Other challenges may be insurance and safety when attending external events with students or inviting external guests on campus. Before the project, lecturers and teaching assistants should make sure they know about the risks and limits of working on their project off campus. They should also be mindful that working on social justice issues might come with different risks for different students, who may need assistance in navigating those risks. For example, being associated with LGBTQ+ topics might be dangerous in communities that hold strong anti-LGBTQ+ attitudes. Therefore, the local and academic environment should be considered when selecting the research topic and activities.

    Teaching assistants should also consider students' multifaceted identities and positions. Students' prior experiences and knowledge can differ, as can their expectations and needs for the project. Each group's composition will create a particular power dynamic, and certain intersecting identities will be underrepresented or less visible throughout the project. Thus, teaching assistants should consider how they can make space for those perspectives in the group discourse. For example, they can recommend research materials that center underrepresented identities and experiences. Encouraging students to bring in their own interests and alternative materials can also help to diversify the learning content. Still, lecturers and teaching assistants should be aware that conflict mediation might be needed. Since students' learning curves and their opinions on social justice issues differ, it is important to establish early on that the group should work toward a comfortable learning environment for everyone. To achieve this, it can help to collectively set some ground rules at the beginning of the project.

    I hope naming these possible challenges does not discourage educators from considering whether and how they can apply my suggestions to their work. Every educational setting has its challenges and limits. Therefore, each project and group will be different and require adaptation. Nonetheless, I see much potential in this approach of collaborative student research groups. To start small, educators might consider the following: Do they have the capacity to create multidimensional learning experiences for their students? Would the students be interested in project work on social justice topics? Are there possibilities for local or virtual community engagement? Can they share learning responsibilities with teaching assistants and students? How can they make research less intimidating and academia more accessible for a variety of students? Overall, how can they create collaborative learning spaces? I believe that answering these questions can move higher education toward creating more enjoyable and wholesome learning experiences for students and educators alike.





    Acknowledgements: I would like to thank Dr. Karyn Boatwright for her invaluable trust, support and supervision throughout the projects as well as her feedback on this essay. I would also like to thank the amazing students who participated in the projects and all researchers and community members who enabled the depth and variety of our learning.
  • 05 Feb 2024 7:47 PM | Anonymous member (Administrator)

    Brooke O. Breaux
    University of Louisiana at Lafayette

    My department’s Psychological Science course has two primary objectives: 1) for students to start building the underlying knowledge that they will need to become producers of psychology, and 2) for students to become more familiar with psychology as a major and a discipline. Psychological Science—designed for second-semester freshmen who have taken only an introductory psychology course—was integrated into my department’s 2020-2021 curriculum. Our intention was for Psychological Science to be taught as a traditional in-person course, but due to the precautions taken by my university in the midst of the COVID-19 pandemic, I taught this course first as a synchronous online course, then as a hyflex course with students deciding whether to attend class in person or online, and finally as a fully in-person course. Setting aside the complexities of teaching the course in formats different from the one we had in mind when developing it, Psychological Science itself is ambitious. At a minimum, students enrolled in this course are required to complete a pre-course and a post-course assessment, to take exams and/or quizzes, to construct an actionable plan for their professional development and career exploration, to earn a research ethics certification (i.e., Undergraduate Training on Human Subjects Research through the Collaborative Institutional Training Initiative [CITI]), to serve as a participant in actual psychological research, and to write a brief APA Style research proposal. Faculty assigned to teach this course are required to cover topics ranging from psychology as a discipline—including degrees and careers in psychology—to psychology as a science—including research methods, research ethics, and APA Style writing.
When teaching this course for the first time, I made the incorrect assumption that if my goal was to have my students write quality research proposals, all I needed to do as an instructor was to provide them with the relevant research design concepts and a clear assignment rubric. What I learned that first semester was that such an approach was insufficient for many of my students and that they needed significantly more scaffolding to produce what I would consider to be a quality product.

    I have now taught this course six times and have dramatically changed the way in which I teach research methods. The approach I have developed is highly scaffolded, involving a sequence of three assignments. For each of the assignments, I have constructed explicit instructions, aligned the delivery of course topics with the assignment deadline, and eliminated unnecessary complexity; however, before diving into a more detailed discussion of my efforts to make the writing of an APA Style research proposal a much more integral part of the course, I thought it would be useful to discuss the development of our Psychological Science course, the integral role it plays in my department’s current curriculum, and our efforts to standardize certain elements of the course.

    Curricular-Level Enhancements: How Did We Get Here?

    When I was hired as a faculty member, undergraduate psychology majors did not take our Introduction to Psychology course. They took two courses designed for majors: one focused more on the basic science of psychology and the other focused more on the applied aspects of psychology. After several semesters of teaching the basic science half of this introductory course sequence, I advocated for a change in our curriculum. This change was supported by the faculty members teaching these introductory courses for majors, who agreed that our curriculum lacked a true research methods course, that we could do a better job of preparing students for our Psychological Statistics course, and that the order in which we introduced certain topics and assessed certain learning outcomes in our curriculum could be improved. To illustrate this last point, it is helpful to know that students enrolled in our basic science of psychology course for majors were typically freshmen who were taking the course during their first semester in college. This is the same semester in which the majority of students take their first general education English writing course, which requires them to write papers in MLA Style. Then, during the same semester, our basic science of psychology course for majors introduced students to the discipline of psychology; taught them about some of the major themes, concepts, and findings related to basic science topics, such as biological psychology and cognitive psychology; and required them to write an APA Style literature review. It is no wonder, then, that many students found it difficult to be successful in this course. I knew that our department could provide students with a better introductory learning experience and that such a change could also serve to strengthen our curriculum.

    We went with a change that would require our majors to take a Psychological Science course, but only after taking our Introduction to Psychology course. The decision to have all students take our Introduction to Psychology course was supported by documents such as “Strengthening the Common Core of the Introductory Psychology Course,” in which the American Psychological Association (2014) explains that there is no evidence in the literature to suggest that having two introductory psychology courses—one for majors and one for nonmajors—is needed. The decision to create a new course for majors was supported by Stoloff et al. (2010), who suggest that departments that want a more robust Introductory Psychology course for their majors can modify other requirements and sequencing: “For example, departments that want to provide more early experiences might be better served by creating another course, such as one that addresses research methods (Stoloff et al., 2010), career preparation (Atchley, Hooker, Kroska, & Gilmour, 2012; Brinthaupt, 2010; Thomas & McDaniel, 2004), preparation for the major (Atchley et al., 2012; Dillinger & Landrum, 2002), or writing in the major (Goddard, 2003)” (American Psychological Association [APA], 2014, p. 20). To this end, we determined that students would benefit from the creation of a required Psychological Science course designed to target these specific objectives.

    Psychological Science is a critical course in our curriculum, providing students with a solid foundation in research methods and serving as a prerequisite for our required Psychological Statistics course. Because of its foundational nature in our curriculum and because it would inevitably be taught by a variety of faculty members, we determined that a minimum standardization of the course would be necessary to ensure similar outcomes across all students. Included in our standardization of this course is the requirement for all students to complete a brief APA Style research proposal, consisting of an APA Style title page, introduction with APA Style citations, method section, and APA Style reference entries; however, what we did not specify was a means by which faculty are to achieve this objective. There are two faculty members who regularly teach Psychological Science, but other faculty members are assigned to teach this course as needed. Everyone who teaches Psychological Science is considered a member of our standardization committee. The role of this committee is to address any issues a faculty member has with the standardization and resolve these issues by updating or changing the standardization.

    Course-Level Enhancements: What Am I Doing?

    Teaching psychological research methods to undergraduates who have only had an introductory psychology course is challenging, and requiring undergraduate students to complete research proposals within such a course can be overwhelming for everyone involved, especially when the class is not small (i.e., around 45 students), does not include a laboratory component, and takes place during a 15-week semester. In the context of research methods courses, project-based learning experiences, such as writing a research proposal, are generally encouraged; however, because the assignments described in the literature tend to focus on more advanced students (e.g., Chamberlain, 1986), I used trial and error to develop an approach that enables students to more effectively and efficiently produce quality research proposals. Interestingly, my intuitions ended up aligning with strategies that have been advocated by other instructors, such as reducing unnecessary complexity, especially as it relates to research design (e.g., Yoder, 1979), and offering students the opportunity to practice producing quality writing (e.g., Ishak & Salter, 2017).

    My initial approach to teaching this course was to provide lectures on the relevant topics in the order that they appear in the textbook, expecting students to incorporate this information into their research proposal document. My students found this part of the process exceedingly difficult, and this strategy resulted in research proposals that did not meet my expectations; therefore, I created a three-stage (i.e., Introduction Section, Method Section, and Appendices), step-by-step process for developing a research proposal. The instructions for each section are contained within step-by-step documents that are made available to students on our learning management system. To reduce unnecessary complexity, I reordered the course topics so that the concepts read about in the textbook and discussed in class would be directly relevant to the part of the research proposal that students are currently working on, and students are explicitly told which step in the step-by-step documents the textbook readings and lecture materials are relevant to. I also created a grading form that aligns with the step-by-step document, which enables me to provide timely feedback at each stage.

    Anyone interested in how I have aligned lecture topics, APA course objectives, and development of an introduction section, method section, and appendices can access this information in the form of a poster I presented at the APS-STP 2023 Teaching Institute (Breaux, 2023). Actual resources that I used during the Spring 2023 semester, such as the step-by-step guidelines (e.g., “Introduction Section Instructions”) and grading forms (e.g., “Introduction Section Rubric”), can be found in the main folder I created for the APS-STP 2023 Teaching Institute (Breaux, 2023). Readers are invited to use or modify the resources provided for educational purposes only.

    I have also made the literature review portion of the introduction more manageable by requiring students to cite only four empirical research articles. This approach allows students to focus on basic skills, such as integrating information from different sources and using APA Style citations appropriately. It also helps students avoid both accidental plagiarism (often due to insufficient paraphrasing skills) and intentional plagiarism (often due to issues with time management). Another change that I made was to have the whole class focus on the same topic. I always try to select a topic that psychology undergraduates can relate to on a personal level, such as the extent to which college students believe psychological myths (e.g., Hughes et al., 2015) or the extent to which college students engage in self-care (e.g., Zahniser et al., 2017). I have found that topics related to the teaching of psychology and social psychology tend to be more accessible to students at this stage in their academic careers and that topics related to biological psychology and cognitive psychology are the most difficult. Pre-selecting a topic for the semester affords two primary benefits: Students can start reading the empirical literature sooner, and I can address issues specific to the topic during class time. My current approach to teaching Psychological Science shares similarities with Passion-Driven Statistics, a project-based approach to teaching statistics that focuses on providing students with only as much information as they need to successfully complete the current tasks they have been assigned.


    These improvements have made teaching psychological research methods to undergraduates who have only had an introductory psychology course feel much more manageable. Even though my evidence is primarily anecdotal, students seem less intimidated by the research proposal process because they are more aware of my expectations and the ways in which I want them to utilize the course materials when working on their research proposal. I hope that my experience can inspire other faculty members not only to continue improving their own courses to meet the needs of students but also to advocate for broader curriculum changes in their own departments, and I hope that what I have learned along the way can be used by others to improve how we teach psychological research methods to undergraduates.


    American Psychological Association. (2014). Strengthening the common core of the introductory psychology course. Washington, DC: American Psychological Association, Board of Educational Affairs.

    Breaux, B. O. (2023, May 23-24). Benefiting from explicit instruction, content alignment, and strategic simplification [Poster presentation]. APS-STP 2023 Teaching Institute, Washington, D.C., United States.

    Chamberlain, K. (1986). Teaching the practical research course. Teaching of Psychology, 13(4), 204-207. 

    Hughes, S., Lyddy, F., Kaplan, R., Nichols, A. L., Miller, H., Saad, C. G., Dukes, K., & Lynch, A.-J. (2015). Highly prevalent but not always persistent: Undergraduate and graduate students’ misconceptions about psychology. Teaching of Psychology, 42(1), 34–42.

    Ishak, S., & Salter, N. P. (2017). Undergraduate psychological writing: A best practices guide and national survey. Teaching of Psychology, 44(1), 5–17.

    Stoloff, M., McCarthy, M., Keller, L., Varfolomeeva, V., Lynch, J., Makara, K., Simmons, S., & Smiley, W. (2010). The undergraduate psychology major: An examination of structure and sequence. Teaching of Psychology, 37(1), 4–15.

    Yoder, J. (1979). Teaching students to do research. Teaching of Psychology, 6(2), 85-88.

    Zahniser, E., Rupert, P. A., & Dorociak, K. E. (2017). Self-care in clinical psychology graduate training. Training and Education in Professional Psychology, 11(4), 283–289.

  • 27 Nov 2023 3:13 PM | Anonymous member (Administrator)

    Amanda W. Joyce
    Murray State University

    Psychological research methods can be a dreaded course for students and instructors alike.  Students report negative emotions about and negative perceptions of research, they struggle to see the relevance of research-related material, and they are concerned about the complexity of the research process, all of which can negatively impact their understanding of the course content (Balloo, 2019; Murtonen et al., 2008; Rancer et al., 2013).  Similarly, instructors broadly report concerns about student tardiness, dishonesty, inattention to material, and lack of preparation (Fazily et al., 2018; Lashley & de Meneses, 2001), which could be exacerbated in challenging courses like research methods. 

    Thus, innovative techniques are needed to improve student and instructor experiences in research methods.  Frequently, this innovation comes in the form of applied, active learning that is directly relevant to student experiences—characteristics which have long been touted as beneficial for student learning (Ball & Pelco, 2006; Etengoff, 2023).  In fact, a recent study drawing upon interviews of experienced research methods instructors heavily emphasized the benefits of allowing students to apply what they learned, particularly through hands-on research experiences (Lewthwaite & Nind, 2016).

    Involving students in hands-on research experiences, however, can present still more challenges.  Individual student projects can lead to a heavy grading burden for instructors, and partnered or group projects can be fraught with interpersonal complaints and social loafing.  The purpose of this essay is to explore an option for whole-class collaborative data collection that still allows students individually to propose, analyze, write about, and present data on a project of their own personal choosing.  The collaborative data collection process encourages accountability and teamwork.

    The Project

    Pedagogical Context

    At my university, psychological research methods and statistics are taught in a combined three-course sequence, with the third course focusing on hands-on data collection in what is generally the students’ first research project.  Enrollment for this third course is typically 15 students, all Psychology majors.  The learning objectives for this course require successfully navigating the research process (e.g., “Generate an original research question,” “Conduct a research study in accord with APA’s ethical principles,” etc.).  Thus, the learning objectives of the course, as well as the teaching technique I propose here, encourage students to navigate the research process, from idea generation to final presentation. 

    The Research Project: What Works for Me

                I have personally had great luck with an approach to teaching research methods that intermixes individual and group work while leading students through their first ever quantitative research project.  I have found it to increase individual accountability and teamwork while reducing many of the headaches associated with individual or paired data collection.  I provide a brief overview of the project below.  I am also happy to share course resources with interested readers.

    Students’ experience with hands-on active learning through research occurs through a semester-long research project that occurs in three main phases: (1) individual idea generation, (2) group questionnaire and database creation, and (3) individual data analyses and presentation.

    Individual Idea Generation

                Students begin the semester by individually generating research questions.  Research shows that students have better learning experiences when they work on projects that are personally meaningful (Andresen et al., 2020), and I have found this to be true in my classes as well.  We spend several class periods discussing the contents of a strong research hypothesis that would be testable under the constraints of a semester-long project with data collected from students at their university.  For instance, we discuss how longitudinal hypotheses or hypotheses about overly specific populations whom we are unlikely to recruit on campus (e.g., the elderly or fraternity members who have been diagnosed with schizophrenia) would not be appropriate.  I also limit students to correlational (as opposed to experimental) research designs, which work best within our collaborative data collection process that emphasizes surveys as the primary data collection method.  During the first week of classes, students submit a list of five research questions that they are interested in exploring, which means that they are generating ideas before they have had the benefit of all class discussions on the topic, but generally one or two of their ideas are appropriate, and I am able to guide them toward those ideas. 

    Then (week 3) students submit a final research question for approval before they dive into their topic of interest.  A librarian visits the course to teach students about how to use library resources to find peer-reviewed journal articles on their topics of interest, and students use this information to find five or more articles (week 5), which they summarize and later synthesize into an introduction section for their research paper (week 7).  

    Group Questionnaire and Database Creation

                Students then gather measures relevant to their individual research hypotheses.  Their topics of interest often overlap with their peers’, meaning that there is overlap, too, in the measures that they may choose.  For instance, one student may be interested in anxiety and sleep quality, while another is interested in fraternity and sorority membership and anxiety, and yet another is interested in sleep quality and religiosity.  I encourage students with overlapping topics to work together to find common measures so as to reduce their burden in working with those measures, and I find that they are happy to take this opportunity for a reduced workload.  When students do not have variables in common with their peers, I encourage them to use brief measures, such as short-form versions of scales rather than full scales, so as to reduce participant burden.

    Students submit their measures (week 6) and, after I have reviewed each of them, we spend a class period gathering each measure into a class-wide shared Google Doc that will later become the questionnaire packet that participants receive.  Combining the measures into a single document during class ensures that everyone has the ability to closely supervise the process and catch any potential errors, like missing items or typographical errors, particularly in overlapping measures, which several students are closely monitoring.

    Throughout the semester, students learn about the ethical aspects of research, and they have been working through ethical certification (CITI Training).  Thus, as soon as measures are gathered, we are ready to submit our project, as a single application, to our institutional review board (IRB) for approval.  I submit the application on students’ behalf, but I include the measures and hypotheses that they have provided to me, and we spend one class discussing the contents and importance of the IRB application and process.

    In the one to two weeks (usually weeks 7 and 8) needed for IRB approval, the class prepares for data collection.  First, we learn about the data collection process and how to write about it.  Students learn departmental policies for data collection, including how to reserve rooms, how to use our participant management system (SONA), and more, and they write drafts of methods sections for their final paper. 

    When students begin collecting data (usually week 8 or 9), they host research sessions individually, but they collect on the full research packet that was approved by the IRB.  In other words, even though students collect data individually, they collect data relevant to everyone.  This means that they have the ability to share research materials, that they can cover each other’s research sessions in case of emergency, and that they feel a personal accountability to the group to do good research.  It also means that they can have a large sample size, typically 100 or more students drawn from our department’s research participant pool.  I emphasize throughout the semester how we are a team working toward a common goal, and I find that students will often organically support one another in ways that I haven’t anticipated, such as offering up suggestions about where to find free or cheap printing for research materials.

    Similarly, we crowdsource data management.  We spend several class periods building a shared class database in Google Sheets.  Students are responsible for creating a key for their individual measures so that everyone knows how data should be entered for all measures.  Again, in a combination of individual and group efforts, each student is responsible for entering all data that they collect, meaning that they are helping to support not only their own research interests but also their peers’.  This shared data entry strategy is another way in which I find students embracing the collaborative nature of this type of work—many will offer to cover data entry for another student when they know the other student is overwhelmed with their participant workload.

    Individual Analyses and Presentation

    When students finish data collection (week 11 or 12), we can begin the data analysis process.  Students are reminded as a group how to run the most common analyses (calculating a scale score from Likert data, determining participant demographics, running a reliability analysis, correlations, and t-tests).  Then there are several in-class workdays during which students can practice these analyses on their own data.  Each student is responsible for analyzing data relevant to their own research hypothesis.  I float around the computer lab to support students with questions, but as there is only one of me, they find additional support in their classmates.  Students often answer one another’s questions and double-check analyses.  This is easily the most rewarding part of the semester: hearing students teach and encourage one another, and cheering when they see statistically significant results.
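    The article does not name a statistics package, so as one illustration, the common analyses listed above (scale score from Likert items, reliability, correlation, t-test) might look like the following minimal Python sketch, using entirely hypothetical data.

```python
# Minimal sketch of the common analyses named above, with hypothetical data.
import numpy as np
from scipy import stats

# Five-item Likert responses (1-5) from six hypothetical participants
items = np.array([
    [4, 5, 4, 3, 4],
    [2, 1, 2, 2, 3],
    [5, 5, 4, 5, 5],
    [3, 3, 2, 3, 3],
    [4, 4, 5, 4, 4],
    [1, 2, 1, 2, 1],
])

# Scale score: each participant's mean across the items
scale_scores = items.mean(axis=1)

# Reliability: Cronbach's alpha from item and total-score variances
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)        # variance of each item
total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Correlation with another (hypothetical) continuous measure
other_measure = np.array([10, 4, 12, 6, 9, 3])
r, r_p = stats.pearsonr(scale_scores, other_measure)

# Independent-samples t-test between two hypothetical groups
group_a, group_b = scale_scores[:3], scale_scores[3:]
t, t_p = stats.ttest_ind(group_a, group_b)

print(f"alpha = {alpha:.2f}, r = {r:.2f}, t = {t:.2f}")
```

    In class, the same steps would of course run against each student’s real variables rather than this toy array; the point is only that each listed analysis is a few lines once the shared database is clean.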

    Following analyses, students are responsible for sharing their results in a final research paper.  They previously submitted a draft of an introduction (week 7) and method section (week 9).  The initial method draft was written at a time when they did not know their participants’ characteristics, so in that draft, they left placeholders for these numbers.  Thus, one of their first tasks after data analyses is to write a new draft of their methods section with these placeholders replaced with actual data.  They submit this alongside their results section (week 12) with a discussion section to follow roughly two weeks later.  While writing generally can’t be completed fully in class, students have several in-class writing days so that they can consult with the instructor and their peers when questions arise. 

    Students then learn about data presentation and create a draft poster to be submitted during the last week of class. Again, because students are working on individual research hypotheses, each of these paper and poster drafts is individual, but students benefit from receiving feedback from peers and the instructor on drafts at all stages, meaning that final projects are often in phenomenal shape.

    Students submit their finished products early during finals week, and then individually present their research to the class during the final examination period.  This is another very encouraging part of the semester, as students learn more about their peers’ projects and offer encouragement for their hard work.  Furthermore, because the work was approved by the IRB, students are in a very good position to later take their research projects to other venues, such as on-campus undergraduate research conferences and/or regional professional conferences, to share their findings with a broader audience.

    The Outcome

    The structure of the class research project, intermixing group and individual components, is admittedly sometimes overwhelming, particularly if an individual student must miss class frequently, as is often the case for student athletes.  In those cases, the student’s absences have the potential to hinder everyone’s progress on the collaborative project, so a fair amount of instructor foresight and flexibility is necessary to accommodate those absences and ensure that the project can still move forward.  That said, I have found the collaboration to be worthwhile.  Grades, attendance, and course evaluations have improved since I began collaborative data collection, as have student accountability and teamwork.  As students move in and out of group and individual efforts, they see the ways in which their efforts impact themselves and others, and they embrace the process of working toward a common goal.

    More than that, students recognize the ways in which collaboration allows them to manage their time more effectively so that they are not duplicating efforts.  For instance, by pooling their data collection, they avoid saturating the research pool and have access to many more participants than they would if they had collected data individually.  Similarly, from the instructor’s perspective, students’ collaboration allows me to work with them more efficiently (for instance, reviewing one IRB application instead of 15), freeing up time to provide more detailed feedback on drafts throughout the semester, which also benefits the students.

    Teamwork makes the dream work.  Gone are the days of spending countless office hours listening to students complain about how their research partner isn’t doing their fair share of the work.  Gone, too, are the days of trying to grade results sections based on data collected from 7 participants.  Instead, I see students working together and holding themselves to a high standard, and I see their efforts resulting in extraordinary outcomes.  I hope that others can find relief and excitement in a similar approach.


    Andresen, L., Boud, D., & Cohen, R. (2020). Experience-based learning. In Understanding adult education and training (pp. 225-239). Routledge.

    Ball, C. T., & Pelco, L. E. (2006). Teaching research methods to undergraduate psychology students using an active cooperative learning approach. International Journal of Teaching and Learning in Higher Education, 17(2), 147-154.

    Balloo, K. (2019). Students’ difficulties during research methods training acting as potential barriers to their development of scientific thinking. Redefining scientific thinking for higher education: Higher-order thinking, evidence-based reasoning and research skills, 107-137.

    Etengoff, C. (2023). Reframing psychological research methods courses as tools for social justice education. Teaching of Psychology, 50(2), 184-190.

    Fazli, A., Imani, E., & Abedini, S. (2018). Faculty members' experience of student ethical problems: A qualitative research with a phenomenological approach. Electronic Journal of General Medicine, 15(3).

    Lashley, F. R., & de Meneses, M. (2001). Student civility in nursing programs: A national survey. Journal of Professional Nursing, 17(2), 81-86.

    Lewthwaite, S., & Nind, M. (2016). Teaching research methods in the social sciences: Expert perspectives on pedagogy and practice. British Journal of Educational Studies, 64(4), 413-430.

    Murtonen, M., Olkinuora, E., Tynjälä, P., & Lehtinen, E. (2008). “Do I need research skills in working life?”: University students’ motivation and difficulties in quantitative methods courses. Higher Education, 56, 599-612.

    Rancer, A. S., Durbin, J. M., & Lin, Y. (2013). Teaching communication research methods: Student perceptions of topic difficulty, topic understanding, and their relationship with math anxiety. Communication Research Reports, 30(3), 242-251.

  • 19 Jul 2023 7:12 PM | Anonymous member (Administrator)

    Daniel A. Clark, Madelynn D. Shell, & Andria F. Schwegler
    Texas A&M University--Central Texas

    *Note: For the version with the figure included, please follow this link:

    Learning about research and statistics may be a much-maligned element of any undergraduate psychology program from the perspective of students (Harlow et al., 2009), but it is also widely viewed as an important element in psychological literacy (APA, 2013). On the faculty side, teaching these courses is often cited as challenging due to the amount of material required (Ciarocco et al., 2017). Instead of both faculty and students suffering in silence while engaging in these courses, we decided to take steps to improve how we teach all of our research-oriented undergraduate courses with the goal of distributing some of the content in the research methods course across other courses leading up to it. This redistribution of the workload was intended to ensure that students have equitable preparation for research methods and that students leave the program with equivalent experiences.

    To start the process, full-time faculty in the undergraduate psychology program began meeting regularly to discuss the desired alignment across the research course sequence (i.e., writing in psychology, statistics, and research methods) and rewrite the course learning outcomes in a manner that captured what we were doing in our individual classes. As academics, we did not always agree on everything, but we were inspired by a desire to improve our teaching and our students’ learning to find common ground. Putting the students’ learning ahead of our own idiosyncratic preferences enabled us to listen to each other’s perspectives, consider multiple ways to achieve a goal, and make decisions based on research across our respective content areas to facilitate learning. Such collaboration acknowledges that each faculty member has the academic freedom to teach using the methodology that they feel is best, but it also recognizes that courses do not exist in a vacuum (for further discussion see Cain, 2014). Courses exist in the context of programs, which requires that faculty members come together at the program level to: 1) articulate the scope and quality of education we are providing to our students and 2) develop alignment across the curriculum so students acquire the same basic skills regardless of instructor, enabling them to graduate from the program with comparable knowledge and experiences. On a personal level, we were also seeking to reduce our own frustrations from teaching the research methods course with students who were not adequately prepared for it.

    Step 1: Start with the End in Mind

    We started by looking at the big picture: skills that were necessary for students to ultimately be successful in the research methods course and their psychology degree in general, rather than being bogged down by individual course outcomes and descriptions.  Consistent with previous research on teaching research methodology (Ciarocco et al., 2017; Gurung & Stoa, 2020), we found that our end goals for student performance in the course and in the program aligned quite well despite some differences in structure and content. For example, we agreed that we wanted our students to conduct IRB-approved human subjects research and collect real-world data, a high-impact practice (American Association of Colleges & Universities, 2013). The larger goal was for these research projects to provide grist for student conference presentations and graduate school applications. Our discussions regarding how we could set our students up for success led to the articulation of fairly specific skills (see Figure 1) that also resulted in clarifying some wording in the program learning outcomes. These specific skills fit our needs well, though others might find that broader, more general wording allows for individual variation between faculty.

    Figure 1. Skill alignment across three research-oriented courses

    Step 2: Back Track to the Beginning

    Our program is housed in a regional, upper-level university that offers only junior and senior level courses in partnership with 2-year colleges. The undergraduate psychology degree includes three four-credit-hour research-oriented courses that students take in sequence: writing in psychology, statistics, and research methods. Research methods is a content-heavy class, particularly when designing original research and collecting data as part of the course, so we decided to introduce some of the research methods skills in the prerequisite courses. For example, in many universities, learning APA style starts in introductory or general psychology courses (Fallahi et al., 2006; Gurung et al., 2016). Because our university does not offer introductory-level courses, we added teaching of these skills to the first course in the research sequence, writing in psychology. In addition, we added basic research design to the writing in psychology course, as evidence suggests this can improve scientific reasoning in students at the introductory level (Becker-Blease et al., 2021). These skills prepare students for critically reading research articles not only in the writing in psychology course but across the curriculum.

    In addition to shifting skills to the beginning of the program, we moved some skills to the second course in the sequence, statistics, which students take prior to research methods. For example, students often enter research methods not knowing how to write statistical analyses in APA style, create online surveys, or clean and format data in a spreadsheet. These skills are essential to successfully completing the research project in research methods. Instead of waiting to introduce these skills in research methods, we modified the lab portion of the statistics course to include instruction in these areas. Thus, students come into research methods with an introduction to many of the basic skills they will use.

    Step 3: Ground the Plan in Learning Research

    These revisions have improved consistency and quality across our program because they are aligned with current knowledge about learning. In our discussions, we brought to bear years of research that has documented learning effects that should be incorporated into education. We know that prior knowledge improves subsequent learning, likely by reducing cognitive load (Simonsmeier et al., 2021). Spacing and retrieval practice also enhance learning (Latimier et al., 2021). By introducing important skills in earlier courses, we have made more effective use of these known mechanisms to facilitate learning. For example, as can be seen in Figure 1, relevant aspects of APA style were revisited in all three of the research-oriented courses in the curriculum. Although research methods instructors teach APA style, they now know that these skills have been introduced in previous courses and are able to focus on transfer and application of these skills rather than teaching a brand-new skill. The goal of this explicit attention to introduction/encoding, spacing, interleaving, and retrieval of information is for subsequent learning in research methods to be easier and more long lasting for students.

    Step 4: Put It in Writing

    After the end skills and curriculum map were sketched out in the first three steps, it was time to put those changes into writing so we could communicate them clearly to our students. We expanded and rewrote the course learning outcomes and the course descriptions so that they directly aligned to each of the program learning outcomes and reflected the scaffolded structure of the content students were expected to demonstrate. We also reviewed course prerequisites to ensure students were acquiring the material in the order we had designed. Using required prerequisites helped ensure that students enrolled in courses to build up their prior knowledge (Lauer et al., 2006). Finally, we discussed required assessments in each course. Although these were minimized to prioritize faculty academic freedom, we identified some core assessments that needed to be included in our courses. For example, a key outcome in research methods was writing a full research manuscript in proper APA style.  


    By aligning our course learning outcomes with program learning outcomes and identifying exactly where in the program these concepts were introduced and reinforced, we know that students are exposed to basic knowledge before entering research methods. We are also assured that when students graduate from our program, regardless of the section they completed, they are all equipped with the same basic skillset. As a 100% transfer institution, our students come to us with very diverse backgrounds and preparation. Ensuring that every student has the same exposure to essential skills such as APA style, survey development, and statistical analysis before research methods facilitates the process of the data-collection project.  Importantly, this plan embeds the high-impact practice of undergraduate research into the required curriculum, creating equitable access and opportunities for all students which have been chronic problems with implementation of these experiences (Zilvinskis et al., 2022). By focusing on broader program and course learning outcomes and using these to align our research-oriented curriculum, we were able to provide our students with a better, more consistent experience, without infringing on faculty academic freedom to choose how they teach these outcomes. We found this was a satisfying blend of faculty subject matter expertise and a collective articulation of expectations and standards that benefitted both our faculty and our students.





    American Association of Colleges and Universities. (2013). High Impact Practices.

    American Psychological Association. (2013). APA Guidelines for the undergraduate psychology major: Version 2.0.

    American Psychological Association. (2011). Principles for quality undergraduate education in psychology. Washington, DC: Author.

    Becker-Blease, K., Stevens, C., Witkow, M. R., & Almuaybid, A. (2021). Teaching modules boost scientific reasoning skills in small and large lecture introductory psychology classrooms. Scholarship of Teaching and Learning in Psychology, 7(1), 2–13.

    Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not Conflict. (Occasional Paper #22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

    Ciarocco, N. J., Strohmetz, D. B., & Lewandowski, G. W. (2017). What’s the point? Faculty perceptions of research methods courses. Scholarship of Teaching and Learning in Psychology, 3(2), 116–131.

    Fallahi, C. R., Wood, R. M., Austad, C. S., & Fallahi, H. (2006). A program for improving undergraduate psychology students’ basic writing skills. Teaching of Psychology, 33(3), 171–175.

    Gurung, R. A. R., Hackathorn, J., Enns, C., Frantz, S., Cacioppo, J. T., Loop, T., & Freeman, J. E. (2016). Strengthening introductory psychology: A new model for teaching the introductory course. American Psychologist, 71(2), 112–124.

    Gurung, R. A. R., & Stoa, R. (2020). A national survey of teaching and learning research methods: Important concepts and faculty and student perspectives. Teaching of Psychology, 47(2), 111–120.

    Harlow, L. L., Burkholder, G. J., & Morrow, J. A. (2009). Evaluating attitudes, skill, and performance in a learning-enhanced quantitative methods course: A structural modeling approach. Structural Equation Modeling.

    Latimier, A., Peyre, H., & Ramus, F. (2021). A meta-analytic review of the benefit of spacing out retrieval practice episodes on retention. Educational Psychology Review, 33, 959–978.

    Lauer, J. B., Rajecki, D. W., & Minke, K. A. (2006). Statistics and methodology courses: Interdepartmental variability in undergraduate majors’ first enrollments. Teaching of Psychology, 33(1), 24–30.

    Simonsmeier, B. A., Flaig, M., Deiglmayr, A., Schalk, L., & Schneider, M. (2021). Domain-specific prior knowledge and learning: A meta-analysis. Educational Psychologist.

    Zilvinskis, J., Kinzie, J., Daday, J., O’Donnell, K., & Vande Zande, C. (2022). Introduction: When done well – 14 years of chasing an admonition. In J. Zilvinskis, J. Kinzie, J. Daday, K. O’Donnell, & C. Vande Zande (Eds.), Delivering on the promise of high-impact practices: Research and models for achieving equity, fidelity, impact, and scale (pp. 1-12). Stylus.

  • 05 Jul 2023 2:29 PM | Anonymous member (Administrator)

    Amanda Mae Woodward
    University of Minnesota Twin Cities

    Open Science, or the practice of making research transparent and accessible, is becoming more prevalent in psychology research (Santoro, 2022; van der Zee & Reich, 2018). Journals, including Developmental Science and Psychological Science, accept registered reports and award authors badges for engaging in transparent research practices. As open science becomes more widely used, educating future researchers about the values and tools available is important. Graduate students, who have prior research knowledge, may benefit from guides and recommendations as they refine their research skills (Kathawalla et al., 2021). Undergraduate students, many of whom are just beginning to learn about research, may benefit from a more structured introduction to open science.  

    Infusing open science into undergraduate courses can be beneficial to students planning to enter the field of psychology because they will be introduced to modern research methods and values. Undergraduate students who learn about open science may gain skills that will make them more competitive for graduate school, including programming and communicating research decisions effectively. Further, students may gain a deeper understanding of research workflows as well as a better appreciation of how to evaluate mixed evidence and the importance of replication.  

    Of course, many undergraduate students do not go on to graduate school (APA, 2016). An introduction to open science can also be beneficial for these students. Activities that introduce undergraduate students to open science can help them refine skills, such as critical and analytical thinking skills, familiarity with software and databases, and evaluating evidence and making decisions, that are beneficial across a wide variety of careers (Naufel et al., 2018). All students, regardless of whether they go to graduate school, will come in contact with research findings in their daily lives. By helping them learn more about transparent and accessible research, these students will be better prepared to be informed consumers.

    While introducing students to open science can increase student learning, it may feel like it adds to the instructor’s burden (many of us struggle to find space to add material given the requirements and needs of a single course!). To facilitate the inclusion of open science in the classroom, I have written below about the methods I have tried in my courses. They are listed by category, with some reflection on the ease of including them in the semester.


    Introductory Statistics Courses:

    About My Course:

    My introductory statistics course is a 4-credit course with a large lecture (~350 students) and a lab component. Students in this course learn both descriptive statistics and inferential statistics using R Programming. To introduce students to Open Science, I include the following: 


    Pre-registrations involve describing your methods and your analyses prior to collecting or analyzing your data. There are several platforms for doing this, including the Open Science Framework and AsPredicted. Prior to covering inferential statistics, students in my course are presented with several scenarios, including those where the analyses are planned before data collection, those where data points are removed, and those where they are given no information. Students then discuss which type of evidence they would find more believable and whether they think sharing research plans ahead of time was a good or bad idea. After this discussion, I provide a brief recap of the benefits and considerations of pre-registration, and students explore the AsPredicted website. Then, I tell students that they will be expected to do a mock pre-registration for the inferential statistics we cover in class.

    Students base their mock pre-registration on the prompts for the practice problems I provide in class. Specifically, students are asked to 1) identify the research question, 2) identify the variables in the prompt, 3) describe the scale of measurement used, 4) determine the independent and dependent variables, 5) write their hypotheses, 6) identify the correct statistical test and explain why, and 7) explain what information they will base their conclusion on. 


    This activity was relatively easy for me to include in my introductory statistics course. The pre-registration includes questions about all the information I typically want students to be able to identify. The main difference is that I am explicitly asking them to mention these pieces, rather than having an implicit expectation that they connect the scale of measurement and scenario to the statistic they calculate. I think pre-registrations help students in the course form connections between the wording of research questions, hypotheses, and analyses. 


    Advanced Statistics Course:

    About My Course:

    I teach a smaller, more advanced course that focuses on using R for statistical analyses. This class has approximately 20 students enrolled who meet twice a week. Students work on a final analysis project using existing data.


    For their secondary analysis project, students complete a pre-registration using the Open Science Framework template, which includes a more detailed set of questions than the mock pre-registration in my introductory course. This step of their project helps them think through their specific research questions, what data they have access to, and what analyses would be appropriate for their project.

    Pre-registration Reflection:

    This pre-registration activity takes more effort and time than the alteration I made to my introductory course. However, it makes grading their final projects much easier and has led to more student meetings about the analyses they choose. Because this step is due before their data analyses, it gives us time to discuss different approaches to analyzing their data. It also makes them think through how to use the data (e.g., what should they do if they have missing values? Do they want to use summed scores or another approach?). By including this activity, I have shifted some of the work required to grade their final project to the beginning of the semester and I have noticed that their final projects tend to be of a higher quality. 

    Data and Code Sharing:

    In this course, there are several activities related to sharing code and data. First, I model sharing code and data by making all the course notes available via GitHub. I post the “blank shell” notes at the beginning of the class, and “commit” updates of the notes as we complete each learning outcome so students can see the updates in real time. Students in this course are expected to hand in their weekly assignments on GitHub so that they get the same experience of using GitHub. As part of their final project, students are expected to share the code they create and the necessary data on either GitHub or the Open Science Framework.
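    The commit-per-update rhythm described above can be sketched with a few git commands; the repository and file names below are hypothetical stand-ins, and the sketch uses a local repository rather than the actual course repository on GitHub.

```shell
# Local sketch of the instructor workflow: post "blank shell" notes,
# then commit an update as the class completes a learning outcome.
# Repository and file names are hypothetical.
mkdir course-notes
git -C course-notes init -q
echo "# Week 1 notes (blank shell)" > course-notes/week01-notes.Rmd
git -C course-notes add week01-notes.Rmd
git -C course-notes -c user.name="Demo" -c user.email="demo@example.com" \
    commit -q -m "Post blank shell notes for week 1"
# ...class completes a learning outcome; the notes are filled in...
echo "mean <- function(x) sum(x) / length(x)" >> course-notes/week01-notes.Rmd
git -C course-notes add week01-notes.Rmd
git -C course-notes -c user.name="Demo" -c user.email="demo@example.com" \
    commit -q -m "Week 1: fill in notes after class"
git -C course-notes log --oneline
```

    Because each update is its own commit, students can browse the history to see exactly what changed after each class session, which is what makes the "real time" updates visible to them.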

    Beyond assignment submissions, students in this course are also expected to evaluate each other’s code and to provide feedback on code posted to GitHub. This helps them think through what is needed for code to be reproducible as well as ways to make their own files more easily accessible.

    Data and Code Sharing Reflection:

    Teaching students to use GitHub takes more time and an understanding of how different operating systems work. However, there is very helpful documentation for getting started on GitHub and detailed instructions for syncing RStudio with GitHub. Further, there are ways for students to take intermediate steps rather than fully integrating their work with GitHub. For instance, they can download files from GitHub and then reupload them by pointing and clicking. Overall, students have appreciated the opportunity to learn about GitHub, even when it was challenging.


    Advanced Research Methods courses:

    About the Course:

    I teach an advanced undergraduate course on open science research methods. This course enrolls approximately 10 students who meet twice a week. In this class, students learn about different aspects of open science and focus on applying what they learn to a replication project across the semester.

    Data and Code Sharing:

    Students in this course use available materials to assess the methods used in a study. For instance, they can look at the study design and see how the variables were operationalized. Shared materials also allow students to understand how researchers transform raw data into the forms we often use in analyses because they can walk through the data cleaning code and the analysis script.

    Shared materials also facilitate students completing a replication project in their research methods course. My students are currently completing a replication project through Project CREP, which offers a great set of resources on the Open Science Framework to facilitate this process. Students in my course have used these materials to 1) create a pre-registration, 2) develop a Qualtrics survey to collect data, and 3) begin analyses in R or in JASP using available data.


    Admittedly, it is very easy to include open science topics in this course because they are the nature of the course. Students in the course have mentioned enjoying the activities described above and have found connections between what they are learning here and in their other courses. As the semester has progressed, I have seen the quality of students’ evaluations of open materials improve.


    Content Courses: 

    Though I have not included open science explicitly in my content courses, I believe some activities, like discussing replications and using open data, could be beneficial. Below are two examples related to cognitive development that I plan to use in the future. 

    Discussing Replications:

    Students in this course will read an article describing a failure to replicate (Oostenbroek et al., 2016) and a response from the original study’s author (Meltzoff et al., 2016). They will be prompted to think about the evidence presented in each paper and to identify factors that could have led to the different results. The discussion will continue by having students consider what other evidence they might need and how these papers relate to theories concerning imitation and social learning.

    Open Data:

    There are several open data sources that include visuals. For instance, Wordbank is an open data source that allows you to examine children’s vocabulary growth around the world. An activity asking students to look at overall trends as well as trends in specific groups would be one way to highlight a benefit of open data.


    There are many ways to introduce students to open science as part of our courses. Introductions can be short or more in depth, depending on instructor preference and the amount of material covered in a semester. Including these activities across my courses has led to fruitful conversations about cognitive development and about the methods and statistics we use.



    APA (2016, February). By the numbers: How do undergraduate psychology majors fare? Monitor on Psychology, 47(2), 11. 

    Kathawalla, U. K., Silverstein, P., & Syed, M. (2021). Easing into open science: A guide for graduate students and their advisors. Collabra: Psychology, 7(1).

    Naufel, K. Z., Spencer, S. M., Appleby, D., Richmond, A. S., Rudman, J., Van Kirk, J., … Hettich, P. (2019). The skillful psychology student: How to empower students with workforce-ready skills by teaching psychology. Psychology Teacher Network, 29(1).

    Santoro, H. (2022, January). Open Science is Surging. Monitor on Psychology, 53(1), 1.   

    van der Zee, T., & Reich, J. (2018). Open Education Science. AERA


  • 21 Jun 2023 12:45 PM | Anonymous member (Administrator)

    Lynne N. Kennette(1) & Phoebe S. Lin(2)
    1. Durham College
    2. Framingham State University

    Since its introduction in November 2022, ChatGPT has caused a lot of chatter, especially in educational circles. ChatGPT is a software application that uses artificial intelligence (AI) to simulate human speech and/or writing. Some see it as a cause to re-think assessments or as a risk to academic integrity; others welcome it as a new teaching tool. Regardless of your view, its presence is a good opportunity to re-think our assessments and to examine whether this new technology threatens the skills we expect students to demonstrate during their studies. As such, we offer some food for thought about how faculty might re-consider their assessments in the context of this new tool.


    Although an initial reaction to ChatGPT might be one of concern, it’s important to take a step back and focus on the pedagogy. Starting with our course learning outcomes may help to re-focus and/or overcome our concerns. For example, do you have a specific learning outcome tied to writing or critical thinking that you need to ensure students (rather than ChatGPT) are demonstrating? If so, maybe those skills can be targeted in ways that depart from the standard writing assignment. Or, if detecting AI-written text in students’ assignments is relevant to your pedagogical goals, there are several AI detectors that assess the extent to which text is likely to have been written by a human rather than a machine. One example is GPT-Zero, which provides general scores for the overall product and highlights which sentences are more likely to have been written by AI (this sentence-by-sentence analysis is similar to what you see in plagiarism-detection software like TurnItIn). So, if writing is a crucial part of your course, then detecting AI-generated text may be important. If not, then perhaps writing doesn’t need to be so prominent in your assessments, rendering ChatGPT much less useful for students and consequently less concerning for faculty. Below, we provide some examples of how instructors might modify or personalize assessments in ways that make it more challenging for AI to produce useful text for students. Then, we provide some ideas for how ChatGPT can be used to support student learning, rather than trying to fight against its use.

    Assessment Modifications

                If you’re concerned about your current assessments, you can modify them. One way to circumvent this type of AI is to ask students to write about something it doesn’t know about. The data used to train ChatGPT are a couple of years old (though the corpus will certainly be updated regularly), so a very recent news story, something specific to students’ direct campus experience that is unlikely to have been written about, or a newly published empirical study will force students to do at least the majority of the work themselves. Additionally, you could make something very specific that you have done in class the basis of a writing assignment. For example, an in-class experiment in which you classically conditioned students to salivate to the word Pavlov using sour candy is something the AI likely doesn’t know about. Similarly, asking students to summarize the class’s specific talking points during a debate or group discussion would fit the criteria. Another way to prevent the use of ChatGPT is a writing assignment that leverages the self-reference effect and is specific to the institution, for example by requiring students to point to specific student services available at the college and/or buildings/offices on campus. Having students write from their own perspective also takes advantage of the self-reference effect, making their recall of the course material more likely. For instance, instructors might assign a writing prompt such as: “Our university has recently adopted an anti-racist mission statement to better support student learning. Which specific components of this mission statement do you think will have the greatest impact on the campus culture (e.g., which element(s) are you or other students most likely to be able to act upon)? Describe specific ways in which you think this mission statement will impact your experiences on our campus.” In addition to making the course material more personally relevant, this also allows students the opportunity for their authentic human voices to be heard, providing greater potential for them to grow as critical thinkers. Finally, requiring students to interpret or otherwise write about data collected in class, or about the outcome of a study that was (or was not) replicated in class, is another way to circumvent the bot by requiring things it doesn’t know anything about. 

    Another way to limit the usefulness of ChatGPT is to require that specific references be included in (or serve as the basis of) the assignment, which will also make it easier for faculty to detect inaccuracies written by AI. Currently, the AI is not very good at including specific sources as in-text citations or references, or at citing its sources in general (with a bit of work and the right prompts it can be done, though still not very effectively). Sometimes ChatGPT even invents references and sources that don’t exist at all. Other AI tools, such as Perplexity, do a much better job of referencing sources; however, students will still have some work to do because those sources are websites and may not be scholarly.

    Reflections and personal accounts of feelings are more difficult for ChatGPT to produce, as is integrating life experiences with specific course concepts. With this type of prompt, ChatGPT will generally begin its response by noting that, as an AI, it can’t really reflect, but will then provide some kind of reflection text anyway. Students can still use this, but it might not be as easy for them to obtain immediately useful text.

    Producing written work (or at least a draft or outline) in class, the old-fashioned way (pen and paper), is one way to limit the up-front contributions that ChatGPT can make. Encouraging students to collaborate in groups and produce a mind map or other visual representation of what they will be writing will also deter students from using AI to write for them. Further, implementing group work has been shown to increase the quality of communication in the classroom, establish boundaries for expectations of the amount of work each individual should contribute, and establish respectful social norms in which each group member has a valuable contribution to make in the learning process (Aronson, 1978). If these activities serve as the basis of a written assignment, students cannot easily hand them to ChatGPT (at least not at the current time) because they are not text; it would be less work to write the assignment themselves than to convert the mind map into text and then somehow feed it to the bot.

    Another example of moving away from traditionally written work would be to include more oral work from students, whether live or pre-recorded, in front of the entire class or only the instructor, as a group or individually. There are many possible permutations such as presenting the content that would traditionally be in a paper, or a less traditional format such as an interview (e.g., with a fake researcher) to learn about the topic in question. Added benefits of this teaching technique might include strengthening oral communication skills and building rapport between the students and the instructor.

    Using ChatGPT

    As the saying goes, “If you can’t beat them, join them.” It is likely that students will be able to (at least somewhat) circumvent anything you try to do to limit their use of ChatGPT, so why not include this AI tool in their assignments? One way ChatGPT can be leveraged is as a source of feedback on students’ writing, helping them produce higher-quality work. You might then assess students on how they addressed the feedback and on their explanation of how they improved their writing: require them to submit their original draft, the feedback they received from ChatGPT, and their improved version, along with a reflection explaining how they used the feedback, which feedback they found most helpful, and what they found most challenging to address. 

    Since GPT is often inaccurate (especially with references and in-text citations), students could also be tasked with asking it to write something on a particular topic and then tracking down scholarly sources to support the claims made (and if those claims cannot be supported, then re-work the text to reflect that). This will provide students the chance to practice using the library and tracking down sources as well as the mechanics of proper citations and a chance to work on their information literacy skills more broadly. This could be especially valuable in a research methods course to emphasize the rigorous process of publishing scientific research in addition to highlighting the merits of integrity in both writing and research practices.

    ChatGPT can also “grade” assignments, so students could ask it to write an assignment and then use the rubric to score it as a starting point. Students would then improve the writing in order to make it a better and more accurate version of what was produced. Again, asking students to reflect on the process and provide specific examples of where ChatGPT was the least accurate according to the rubric (for example) makes use of the tool and forces students to practice their own writing as well. This exercise could help develop stronger reading comprehension skills in addition to writing skills, as it will help them differentiate between weak and strong writing as well as weak and strong arguments. Similarly, students can use AI to summarize research articles that they could then use in a paper where they synthesize or otherwise integrate the information in some way that is appropriate for the course. Alternatively, students could compare the generated summaries to their own, verify their accuracy, identify errors, and reflect on any differences in focus (e.g., perhaps ChatGPT focused its summary on the results of the study whereas students used a lot of their summary to describe the methods).

    Transferable Skills

    The other consideration, as we try to wrap our heads around the impact of this tool, should be students’ eventual workplaces. Many workplaces are already using AI to assist in various tasks, whether overtly or covertly (Walia, 2023), and workplace expectations about how long tasks should take employees will likely adjust in light of this new technology. As such, it would be a disservice for faculty not to give students a chance to use this tool and become more efficient with it. Using AI is likely to become a new transferable skill (also known as a “soft skill”) that should be developed during their studies and then used in whatever workplace they end up in. The transferable skill current students need to develop in order to be competitive in the workforce may no longer be how to write from scratch, but rather how to critically evaluate what ChatGPT (or a similar tool) creates, assess its accuracy and quality, and/or build on that text. Much as the invention of the hand-held calculator still required students to evaluate the answer it gave using their critical-thinking skills, a similar skill is likely what needs to be developed to contend with ChatGPT. Further, with the increased efficiency of AI use, workers will have more time in their daily schedules to devote to more “human” tasks that involve original and creative thinking, such as problem solving or generating new ideas for various projects.

    One somewhat recent assessment trend and best practice has been to use authentic (as opposed to disposable) assessments (Jhangiani, 2017; Seraphin et al., 2019), that is, assessments with an audience and purpose beyond the instructor and classroom. For example, students in a writing/grammar course might proofread a local business’s website, or psychology students might create pamphlets on a particular issue (e.g., studying/memory tips) to distribute through a student services office on campus. In this way, these assessments better resemble students’ eventual workplaces and have a clear purpose. What will their eventual workplaces look like? That is the million-dollar question.


    Whether friend or foe, ChatGPT’s presence in the education landscape cannot be ignored. Moving away from written assignments, or placing more focus on work completed during class, lets instructors quickly modify their current assignments in light of the availability of ChatGPT. Although certain approaches to assessment might reduce a student’s ability to use ChatGPT to produce work for courses, perhaps the technology can instead be used to encourage critical thinking or improve students’ writing skills. Using these types of tools in time-saving ways may be the expectation of workplaces in the not-so-distant future, and students would be well served by coming to understand their functionality in our classrooms. 

    Disclosure: Although we did not use ChatGPT in any way to write this article, we did ask it for feedback on our writing once we had finished and it thought our writing was organized, clear, and a well-written piece overall (Paraphrase from OpenAI's ChatGPT AI language model, personal communication, February 12, 2023).


    Aronson, E. (1978). The jigsaw classroom. Sage.

    Jhangiani, R. (2017, February 2). Ditching the ‘‘disposable assignment’’ in favor of open pedagogy. E-xcellence in Teaching Blog.

    Seraphin, S. B., Grizzell, J. A., Kerr-German, A., Perkins, M. A., Grzanka, P. R., & Hardin, E. E. (2019). A conceptual framework for non-disposable assignments: Inspiring implementation, innovation, and research. Psychology Learning & Teaching, 18(1), 84-97.

    Walia, P. (2023, February 9).  Study finds more workers using ChatGPT without telling their bosses. Techspot.
  • 10 Apr 2023 5:01 PM | Anonymous member (Administrator)

    Jason S. McCarley (1), Raechel N. Soicher (2), & Jannah R. Moussaoui (1)
    (1: Oregon State University, 2: Massachusetts Institute of Technology)

    *Note: For the version with figures and additional resources included, please follow this link:

    The Gestalt principles of perceptual organization are a staple of undergraduate psychology. Examples like those in Figure 1 are common in Intro Psych, Cognitive, and Sensation & Perception textbooks, and discussion of the psychology behind them is important for a number of reasons. Historically, the Gestalt movement was an enormously influential school of thought (Rock & Palmer, 1990; Wagemans et al., 2012). Practically, the Gestalt principles are useful for designing displays, graphs, and lecture slides (Kosslyn, 2006; Moore & Fitz, 1993; Wickens et al., 2022). And pedagogically, the Gestalt phenomena reveal perceptual processes that students might normally take for granted.

    Figure 1.  Visual demonstrations of three familiar Gestalt principles.

    Discussion of Gestalt grouping typically focuses on visual processes, like those illustrated in the figure. But perceptual organization isn’t exclusive to vision; Gestalt processes are also necessary to organize messy sensory inputs in other senses, including touch (Gallace & Spence, 2011) and hearing (Bregman, 1990). In hearing, specifically, Gestalt processes help turn soundwaves crashing on the eardrum into a mental representation of the sound sources around us. Bregman (1990) used the term auditory scene analysis to describe the perceptual organization of sound, and auditory streams to denote the output of this analysis. An auditory stream is thus the analogue of a visual object or group.

    To accompany his book on auditory scene perception, Bregman provided demonstrations of auditory Gestalt effects. These are useful classroom demonstrations in two ways. First, they establish a unifying principle, showing students that perceptual organization operates in similar ways across different senses. Second, they make Gestalt phenomena accessible to students with visual disabilities.

    Below, we present three of Bregman’s auditory examples that can be used as classroom demonstrations in discussions of Gestalt grouping. For each one, we describe and illustrate the analogous visual effect, and explain the correspondence between the auditory and visual phenomena. We also provide links to downloadable files, made available by Bregman, that demonstrate the auditory phenomena.

    A larger set of examples is available on Dr. Bregman’s website.

    Grouping by Similarity

    In vision, the Gestalt principle of similarity says that items that look alike tend to group with one another. In theory, for example, we could see the dots in Figure 1A as forming columns, arbitrary clusters, or no pattern at all. Instead, we tend to group the dots by color, into rows.

    In his demonstration of auditory grouping by similarity, Bregman manipulates pitch to gradually segregate a series of notes into two streams. Figure 2 provides a schematic illustration. The stimulus is a well-known melody interleaved with random distractor tones. To begin, the melody and distractors are within the same pitch range, and the melody is camouflaged. As it plays repeatedly, the melody gradually moves into a higher pitch range. Eventually, it segregates from the distractor tones and becomes recognizable. Pitch here plays the same role as color does in Figure 1: sounds of similar pitch are grouped into a distinct auditory stream, standing out from sounds of dissimilar pitch.

    Figure 2.  Auditory grouping by similarity of pitch. A: When a melody is embedded amongst distractor tones from the same pitch range, it is effectively camouflaged. Here, the notes outlined in black represent the melody and the notes without outlines represent the distractors. B: When the melody and distractors are in different pitch ranges, the melody stands out and is easy to recognize.

    Grouping by Proximity

    The principle of proximity holds that items near one another are grouped together. In vision, proximity is spatial. In Figure 1B, for instance, the vertical separation between dots is smaller than the horizontal separation, and as a result, we perceive the dots as forming columns.

    In hearing, proximity is temporal. Bregman’s demonstration, illustrated in Figure 3, interleaves a series of three descending tones with a series of three ascending tones. The descending tones are in a higher pitch range than the ascending tones. To begin, the tones are played slowly, and we hear a series of notes jumping back and forth between pitch ranges. Next, the tones are played quickly. Now, temporal proximity and similarity combine to segregate the ascending and descending series into two distinct streams that seem to run simultaneously. Near enough to one another in time, the tones of similar pitch group.

    Figure 3.  Auditory grouping by proximity. A: When interleaved high and low tones are played slowly, we hear a single sequence that jumps between pitch ranges. B: When the interleaved tones are played quickly, notes of similar pitch group together. We perceive two simultaneous streams, one high-pitched and one low-pitched.
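    For instructors who want to build their own version of this demo, the proximity effect can be approximated in a few lines of Python using numpy and the standard-library wave module. This is a rough synthetic sketch, not Bregman’s own stimuli; the frequencies, durations, and output file names are illustrative choices:

```python
import numpy as np
import wave

SR = 22050  # sample rate in Hz

def tone(freq, dur):
    """A sine tone with ~10 ms linear fades at each end to avoid clicks."""
    n = int(SR * dur)
    t = np.arange(n) / SR
    edge = np.minimum(t, t[::-1])         # time to the nearer endpoint
    env = np.clip(edge / 0.01, 0.0, 1.0)  # 10 ms ramps in and out
    return 0.5 * env * np.sin(2 * np.pi * freq * t)

def interleaved(dur, repeats=4):
    """Three descending high tones interleaved with three ascending low tones."""
    high = [880.0, 784.0, 698.5]  # descending, high pitch range (illustrative)
    low = [261.6, 293.7, 329.6]   # ascending, low pitch range (illustrative)
    order = [f for pair in zip(high, low) for f in pair]  # H L H L H L
    return np.tile(np.concatenate([tone(f, dur) for f in order]), repeats)

def write_wav(path, samples):
    """Write a mono 16-bit WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SR)
        w.writeframes((samples * 32767).astype(np.int16).tobytes())

write_wav("slow.wav", interleaved(dur=0.40))  # one stream jumping between ranges
write_wav("fast.wav", interleaved(dur=0.08))  # should segregate into two streams
```

    Playing slow.wav and then fast.wav back to back lets students hear the single jumping sequence split into two simultaneous streams as the tempo increases, in the manner Bregman’s demonstration describes.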

    Grouping by Connectedness

    The principle of connectedness (Rock & Palmer, 1990) holds that items connected to one another are grouped together. Figure 1C shows the influence of visual connectedness. Dots alternate color from top to bottom, and are closer together horizontally than vertically. But because they are linked by thin vertical lines, the dots perceptually group to form columns. Here, connectedness overpowers similarity and proximity.

    Bregman’s demonstration of grouping by connectedness shows an equally powerful effect. Figure 4 illustrates. In the unconnected condition, a series of tones alternates between high (“beep”) and low (“boop”) pitch. The impression is of two distinct streams, one high-pitched (“Beep. Beep. Beep…”) and one low-pitched (“Boop. Boop. Boop…”). In the connected condition, a smoothly rising and falling tone is interposed between the low- and high-pitched tones. Now, we hear a single stream of sound, smoothly modulating between high and low (“Beeeeooooeeeeoooo…”). Just as in vision, connectedness transforms isolated fragments into a unified perceptual object.

    Figure 4.  Auditory grouping by connectedness. A: When interleaved high and low tones are unconnected, we hear separate high- and low-pitched streams. B: When high and low tones are connected by a rising and falling tone, we hear a single, undulating auditory stream.


    The Gestalt principles are foundational knowledge for psych undergrads. Teaching them with an exclusive focus on vision, though, can limit their accessibility and give students an unduly narrow view of the role they play in our mental life. Auditory demonstrations give us a way to expand the reach and impact of our lessons on perceptual organization.


    Bregman, A. S. (1990). Auditory scene analysis: The perceptual organization of sound. Cambridge, MA: MIT Press.

    Gallace, A., & Spence, C. (2011). To what extent do gestalt grouping principles influence tactile perception? Psychological Bulletin, 137(4), 538–561.

    Kosslyn, S. M. (2006). Graph design for the eye and mind. New York, NY: Oxford University Press.

    Moore, P., & Fitz, C. (1993). Using Gestalt theory to teach document design and graphics. Technical Communication Quarterly, 2(4), 389–410.

    Rock, I., & Palmer, S. (1990). The Legacy of Gestalt Psychology. Scientific American, 263(6), 84–90.

    Wagemans, J., Elder, J. H., Kubovy, M., Palmer, S. E., Peterson, M. A., Singh, M., & von der Heydt, R. (2012). A century of Gestalt psychology in visual perception: I. Perceptual grouping and figure-ground organization. Psychological Bulletin, 138(6), 1172–1217.

    Wickens, C. D., Helton, W. S., Hollands, J. G., & Banbury, S. (2022). Engineering psychology and human performance (5th edition). New York, NY: Routledge.

  • 02 Mar 2023 4:06 PM | Anonymous member (Administrator)

    Vanessa Woods
    (University of California, Santa Barbara)

    To truly create equity, a university must make sure that every student, regardless of background, can be successful. However, the reality is that students from groups that have been marginalized in higher education are entering a university that is not designed for them (e.g., needing to navigate the “hidden curriculum”; Laiduc & Covarrubias, 2022), and the responsibility for ensuring success begins with the instructor. My overall objective as an educator is to create learning opportunities that are engaging, meaningful, and motivating to students from diverse backgrounds. I strive to create inclusive learning environments in my courses and to make them relevant to students’ lives.

    Combatting the Hidden Curriculum

    There are three primary ways I create inclusive learning environments to promote motivation and engagement for all students. First, I create learning environments with elements set up to combat the hidden curriculum for students from marginalized groups (Laiduc & Covarrubias, 2022). To do so, I use explicit welcome and belonging messages in my syllabus, discuss what office hours are for, and continue messaging throughout the course to convey that this is their space and that I am committed to supporting them in their learning. The welcome message serves as a statement of community and conveys my appreciation for the strengths students bring into the space, and the belonging message explicitly conveys my belief that they can be successful in the course and major. This includes repeated messaging that:

    1) students belong in the course and major,

    2) students bring important perspectives and ideas to our classroom space,

    3) I believe they can be successful in my challenging high-work courses,

    4) there are mechanisms for growth and improvement in the course structure,

    5) it is normal to sometimes struggle with content, and

    6) I am here to guide and coach them through that process.

    My teaching messages and practices have been informed by the literature on wise interventions, which underscores the importance of attending to students’ needs in academic settings in order to support their ability to reframe inferences related to belonging. I embed these needs in my messages to motivate diverse students to engage effectively in the course (Laiduc & Covarrubias, 2022). Further, recent scholarship demonstrates that the syllabus can be an important tool for communicating instructor support for equity and inclusion, and for highlighting a student/learner-centered course design (Fuentes, Zelaya, & Madsen, 2021; Richmond et al., 2019). For example, I include the following welcome message in my syllabus:

    “Our Course Community– As participants in a required pre major course in Psychological and Brain Sciences we all share an interest in the mind and behavior. I am excited to see where you will take your knowledge of methods in Psychology when you write your research proposal for this course. I value a diverse set of viewpoints and I welcome the strengths and talents you bring to the table as part of our community in this research methods course.”

    I also include a belonging message as well that reads:

    “You as a Researcher–You belong in Psychological and Brain Sciences and you belong here in this class as an undergraduate researcher. We have complete confidence in your ability to be an active capable member of this course. We also have complete confidence in your ability to develop your research and writing skills, and we are committed to guiding you through this process. Please feel free to discuss these things with Dr. Woods.*”

    Additionally, this kind of messaging is woven into my lectures and in the ways I engage the students when talking about the course structure.

    Providing Scaffolding for Student Success

    The second way I support inclusive classroom environments is to ensure that there are appropriate mechanisms to scaffold students toward success in the course. I build in assignment scaffolding, revision opportunities, both individual and group exams, and exam retakes during finals, to ensure students are supported in their learning. These structures provide different formative opportunities to demonstrate learning and contribute to a collaborative and collegial environment for students (Ambrose et al., 2010; Boothe et al., 2018). I carefully structure the content and pace of my courses so that students’ knowledge can build incrementally. For example, when students write a research proposal, I assign five small pieces that culminate in a complete draft two weeks before the proposal is due. My assignments are designed to guide students through the material, foster autonomy, and reinforce working hard to improve through opportunities for revision (e.g., writing, peer review, feedback).

    I have also started recording “insight discussions”: small groups of students describing how they approached difficult course concepts, how they overcame the challenging content, and what insights they gained into effective learning strategies. Informal student feedback (e.g., mid-quarter feedback surveys, student discussion posts) suggests students find these discussions from their peers very useful; students like knowing that someone else had to struggle to understand a concept (helping to reframe the challenge of the course). On my exams, I ask integrative questions (e.g., authentic assessments) to test students’ ability to put together information they have learned from different sources and to apply this knowledge to a novel, real-world situation (Nolan et al., 2020). I design assignments that foster students’ ability to think and practice as psychologists and neuroscientists by using structured peer review (Adler-Kassner & Wardle, 2022; Miller-Young & Boman, 2017; Woods, Safronova, & Adler-Kassner, 2021). Cumulatively, my course practices and assessments give students multiple ways of gaining knowledge and of demonstrating their understanding of course concepts.

    Promoting a Welcoming & Engaging Classroom Environment

    The third way I support inclusive learning environments that promote belonging is to foster a comfortable, welcoming, and engaging classroom climate that encourages students to actively engage with content and learn from each other (Felten, 2020). When I set up group work, I include activities that help students get to know each other and assign clear roles within the group (e.g., the person who likes dogs the best scribes for the worksheet; the person who likes Halloween the best ensures that everyone contributed to the discussion). I try to get to know the students through informal polls and by chatting with them before class starts, to build relationships and get a sense of each unique student. I create an open and warm classroom environment so that students are encouraged to ask questions and express their points of view, using frequent in-class questions that require some discussion. I strive to show cultural competence in my communication so that students who take my courses gain confidence in their abilities and learn how to study and organize knowledge in meaningful ways (Tanner & Allen, 2017). I model tolerance and openness to opposing viewpoints so that students can feel safe expressing their ideas and opinions. For example, last year I asked the students to help me stop using the phrase “you guys” so I could work on using non-gendered terms, and the class was very supportive in “catching” me and stopping me so I could reframe my language. This also modeled for students that we can make mistakes while still being active members of the course who belong in the major.

    The strategies I have used to promote inclusive learning environments (combatting the hidden curriculum, intentionally structuring courses for learning and improvement, and creating a welcoming class climate) have been developed through many discussions with colleagues who value inclusion. My colleagues work to create classroom structures that foster belonging (Wilson et al., 2019) and use storytelling to engage students and foster knowledge application to real-world situations (Alea Albada, 2022). Further, my inclusive strategies have been influenced by reading work that emphasizes kindness, affirmation, and communal goals (Estrada et al., 2019) and the importance of validating students’ experiences (Rendon, 1994). I was also inspired by thoughtful workshops by Kimberly Tanner and Viji Sathy on best classroom practices for equity.

    My understanding of how to create inclusive classroom environments includes thinking about deep teaching practices: self-awareness, empathy, classroom climate, and leveraging the campus network of student support services (Dewsbury, 2019). The growth and energy I have gained have come in collaborative spaces and in conversation. I have realized how important it is to get to know your students as people. Consider taking an extra five minutes with a student to ask about their hobbies, goals, or passions, or starting a conversation with a colleague about what can be done to create more inclusion in the spaces you occupy. If we all add one or two small things that foster inclusion, we can change our teaching and learning practices to promote equity in higher education and create real change.

    *Feel free to wordsmith these examples to suit your perspective and your course, while citing appropriately.


    Adler-Kassner, L., & Wardle, E. (2022). Writing expertise: A research-based approach to writing and learning across disciplines. Clearinghouse.

    Alea Albada, N. (2022). Try telling a story: Why instructors share personal stories with students.

    Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How learning works: Seven research-based principles for smart teaching. Jossey-Bass.

    Boothe, K. A., Lohmann, M. J., Donnell, K. A., & Hall, D. D. (2018). Applying the principles of universal design for learning (UDL) in the college classroom. Journal of Special Education Apprenticeship, 7(3), n3.

    Dewsbury, B. M. (2019). Deep teaching in a college STEM classroom. Cultural Studies of Science Education, 15(1), 169–191.

    Estrada, M., Eroy-Reveles, A., & Matsui, J. (2018). The influence of affirming kindness and community on broadening participation in STEM career pathways. Social Issues and Policy Review, 12(1), 258–297.

    Felten, P. (2020). Critically reflecting on identities, particularities and relationships in student engagement. In A handbook for student engagement in higher education (pp. 148-154). Routledge.

    Fuentes, M. A., Zelaya, D. G., & Madsen, J. W. (2021). Rethinking the course syllabus: Considerations for promoting equity, diversity, and inclusion. Teaching of Psychology, 48(1), 69-79.

    Laiduc, G., & Covarrubias, R. (2022). Making meaning of the hidden curriculum: Translating wise interventions to usher university change. Translational Issues in Psychological Science, 8(2), 221.

    Miller‐Young, J., & Boman, J. (2017). Uncovering ways of thinking, practicing, and being through decoding across disciplines. New Directions for Teaching and Learning, 2017(150), 19-35.

    Nolan, S. A., Bakker, H. E., Cranney, J., Hulme, J. A., & Dunn, D. S. (2020). Project assessment: An international perspective. Scholarship of Teaching and Learning in Psychology, 6(3), 185.

    Rendon, L. I. (1994). Validating culturally diverse students: Toward a new model of learning and student development. Innovative Higher Education, 19(1), 33-51.

    Richmond, A. S., Morgan, R. K., Slattery, J. M., Mitchell, N. G., & Cooper, A. G. (2019). Project Syllabus: An exploratory study of learner-centered syllabi. Teaching of Psychology, 46(1), 6-15.

    Tanner, K., & Allen, D. (2007). Cultural competence in the college biology classroom. CBE—Life Sciences Education, 6(4), 251-258.

    Wilton, M., Gonzalez-Niño, E., McPartlan, P., Terner, Z., Christofferson, R. E., & Rothman, J. H. (2019). Improving academic performance, belonging, and retention through increasing structure of an introductory biology course. CBE—Life Sciences Education, 18(ar53), 1-13.

    Woods, V. E., Safronova, M., & Adler-Kassner, L. (2021). Guiding students towards disciplinary knowledge with structured peer review assignments. Journal of Higher Education Theory & Practice, 21(4).
