This page has three sections:
- Guidance on the effective use of classroom response systems (CRS)
- Good journal articles about the use of CRS
- Does the use of CRS have any impact on attainment?
If you have any questions about the use of CRS, please contact Adam Warren
Note: many resources and research papers talk about ‘clickers’, the handheld voting devices used by systems such as Turning Point. Most systems now only support voting using students’ phones, tablets or laptops – an approach known as BYOD (‘Bring Your Own Device’).
Guidance on the effective use of CRS
Clicker Resources, from the University of British Columbia, is a great set of links and an instructors’ guide (PDF). I highly recommend you read the guide, which is full of practical advice.
Classroom Response Systems from the Centre for Teaching at Vanderbilt University provides a great overview of question types, activities, benefits and challenges. Again, I highly recommend you read this when planning to teach using this technique.
Peer Instruction is a method developed by Prof Eric Mazur, a physicist at Harvard, which uses CRS and peer discussion to deliver significant research-proven learning gains in conceptually difficult subjects. The references on the Wikipedia page are a good start – and also look at the Physics Education Research pages. Note that this is an extension of the flipped learning approach.
Good journal articles about the use of CRS
There are many journal articles that describe and evaluate the use of CRS in Higher Education contexts, and this section highlights some key contributions.
Using classroom response systems for creative interaction and engagement with students
Middleditch, P., & Moindrot, W. (2015). Cogent Economics & Finance, 3(1), 1119368. https://doi.org/10.1080/23322039.2015.1119368
“The purpose of this paper is to highlight our findings that use of a CRS system during the lecture can increase student satisfaction and engagement with their taught course, also how we might react to the seismic changes we have seen in the use of technology by students, or young people in general. After a literature review, we describe our developmental path from the adoption of CRS to a more practiced and detailed use of the technology. We present evidence of the students’ reaction to the introduction of this interaction technology and demonstrate positive impact made upon student satisfaction and enjoyment, very much aligned to the re-emphasis of National Student Survey (NSS) scoring on student engagement and collaboration. Finally, we conclude and offer our recommendations to convenors considering novel tools to further student engagement or those seeking to create an interactive classroom.”
This well-written paper clearly presents arguments in favour of the use of classroom response systems and offers a useful set of references if you want to read more deeply. It describes the evolution of their use of CRS from simple polls to peer instruction and free-text comments for student feedback:
“Students began using the [feedback] tool to convey their level of understanding on the taught material. This facility proved particularly useful in reflecting on material presented and also on how certain areas might benefit from further explanation. … Further to this, students developed their own ways of using the tool: favourable comments, practical requests and even proposals for pedagogical innovation.”
Perhaps most useful is the sense the paper gives of how the academic’s use of CRS has developed over several years of use, and how student satisfaction and engagement with this large-cohort, first-year Economics course have significantly improved as a consequence.
“Clickers” as Catalysts for Transformation of Teachers
Yifat Ben-David Kolikant, Denise Drane & Susanna Calkins (2010). College Teaching, 58(4), 127–135. http://dx.doi.org/10.1080/87567551003774894
“This paper presents three case studies of instructors who used CRS in undergraduate science and math classes at a research-intensive institution in the Midwest, USA. All three instructors reported having to make significant adjustments to their teaching over time in order to transform their respective learning environments and fully realize the benefits of CRS.”
For me the value of this paper is the instructors’ reflections on their use of CRS and the ways in which they adapted their practice to obtain the classroom dynamic they valued – largely based on the peer instruction principle.
Does the use of CRS have any impact on attainment?
There have been hundreds of journal articles published in the last five years about the use of CRS in Higher Education. Most of them present positive results about ‘engagement’ based on feedback from students, but very few provide quantitative evidence of their impact on attainment.
I was therefore pleased to find Clickers in the Classroom: A Review and a Replication [Keough, 2012], which reviews 66 other studies and reports an in-depth replication that validated their findings in a Management course. I recommend you read the paper, but Table 2: Summary of Study Criteria says it all, really:
| Criterion | Number of Samples | Significant positive outcomes |
| --- | --- | --- |
| Actual performance | 34 | 22 |
| Satisfaction | 47 | 46 |
| Perceived performance | 37 | 35 |
| Attention span | 25 | 23 |
| Attendance | 24 | 19 (7) |
| Participation | 21 | 20 |
| Feedback | 15 | 15 |
| Ease of use | 8 | 8 |
Most studies focused on multiple outcomes, and the outcomes were predominantly positive. It is worth stressing that these results are due to the active teaching strategies that the technology enables, using in-class questions to facilitate thinking, discussion and feedback.
More recently, A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect [Hunsu, Adesope and Bayley, 2015] analysed the results from 53 papers that used an experimental or quasi-experimental research design to compare outcomes for a group using CRS with a control group who did not. Again, I recommend you read the paper for the detail, but to summarise the findings:
- clickers have a small positive effect on cognitive learning outcomes. The greatest effect was seen on higher-order learning outcomes such as critical thinking and knowledge application, and there was no effect on lower-order outcomes such as retention of subject knowledge;
- clickers have a significant positive effect on non-cognitive learning outcomes such as engagement, participation, self-efficacy, attendance and interest in the subject.
“Instructors who would aspire to glean the potential benefits that lie within using clicker-based technologies in the classrooms would need to attentively and strategically develop effective clicker questions and creatively facilitate thoughtful discussions and feedback around such questions. Research suggests that conceptual, and not factual, questions are most effective to aid learning. In order to optimize clicker effects, instructors would not only need to commit to encouraging peer discussion and providing feedback, but such feedback would also need to be constructive and timely.”
Finally, in What’s the Payoff?: Assessing the Efficacy of Student Response Systems [Baumann, Marchetti & Soltoff, 2015] the authors rigorously control for other factors that can affect attainment, such as students’ demographic and socio-economic background. They found a small but significant impact on students’ grades when the technology was used to facilitate peer learning rather than in-class quizzes:
“Using clickers to promote peer collaboration allows instructors to simultaneously assess current levels of understanding and enables students to use one another as resources to better understand material.”
To conclude, the research evidence strongly supports the significant positive impact of classroom response systems such as Meetoo on a wide range of valuable but non-cognitive learning outcomes such as engagement, participation, self-efficacy, attendance and interest in the subject. In contrast, the impact on cognitive learning outcomes and attainment is small, but can be maximised by using conceptual questions that facilitate peer learning discussions.