Do students in a real-world setting find audience response technology to be worthwhile? Even more importantly, do they really experience deeper learning as a result?
According to a recent study in the Journal of Chemical Education, the answer to both of those questions is a resounding yes.
The study – Tailoring Clicker Technology to Problem-Based Learning: What’s the Best Approach? – sought to measure the efficacy of using TurningPoint interactive polling software with student response clickers in chemistry classes at Keele University in the United Kingdom. Study author Russell Pearson, a lecturer in Organic and Medicinal Chemistry at Keele, tracked a cohort of 127 pharmacy students in these interactive classes for the first two years of their four-year degree program.
Project Ponder – a program designed “to encourage students to become active thinkers rather than passive observers” and to “think more deeply in-class about their chemistry knowledge” – was divided into two phases, both with the intent of testing “the pedagogical benefit of clicker technology when applied to problem-based learning.”
In phase one, first-year chemistry students individually answered multiple-choice student clicker questions and participated in peer instruction during class. Phase two followed the same students, but instead implemented a team-based approach, with each team providing a single answer via a team clicker. The use of more advanced alphanumeric audience response clickers allowed students in this phase to respond not only to multiple-choice questions but also to short-answer questions.
The results of the study show overwhelmingly how TurningPoint can be a valuable tool for instructors and students alike. Here are some of the more notable results:
Following phase one, 94 percent of students agreed that response clickers improved their learning experience. That number jumped to 100 percent after the second phase.
When asked about the main benefits of student response technology, popular student responses were that they “gave instant feedback, improved engagement and made you think.”
Students in phase one saw a 3.7 percent improvement in exam grades on the first attempt, and a 4.4 percent decrease in failure rates across the academic year compared to the previous year’s cohort. Phase two continued this trend, with a 4.9 percent improvement in overall exam grades and a 5.8 percent reduction in first-attempt failure rates on student assessments.
Notably, the exams cover not just chemistry but the pharmacy curriculum as a whole. The exam scores were measured, according to the study, to see if response clicker use in one class could make students “more inquisitive thinkers and learners” and influence overall student learning and study techniques.
In phases one and two, 96 percent and 98 percent of students, respectively, wanted to see response technology incorporated into future classes. When phase two students were asked in weeks four and ten to complete the phrase “team-based clicker sessions are…,” they filled in the blank with words like, “awesome,” “good,” “fun,” “love” and “helpful.”
Although students found benefits in both approaches, 94 percent preferred the phase two team-based approach to the techniques used in phase one. Pearson attributes this preference to the expanded peer instruction and discussion that occurred in the second phase, again foregrounding the importance of pedagogy when incorporating audience response technology into the classroom.
You can check out the full study here.