But anyway, the study in question was not an astrophysical or climatological observation -- it was a contrived long-term experiment. I know, that's how science works. The authors make a good case in the first pages about the not-very-sciencey aspects of prior research which had no control group and/or included too many variables. A well-designed experiment is almost as good as a toasted bacon sandwich (high praise!), but I have some reasons to disregard and devalue this BYU study:
- The class studied was a college-level course that met three times per week. The dynamics of a college course are wildly different from what I experience as a day-to-day middle school teacher. Furthermore, the class size was almost 60 students; I teach four groups every day, with between 19 and 24 students in each section. Again, that difference produces very different classroom dynamics.
- A major reason I started flipping was to strengthen connections with parents and with special education staff. The adults in my students' support network get quite familiar with what and how I am teaching each week. That is generally not an issue or a need for college students, who are probably not getting academic support from their parents (?!) and no longer have IEPs. The bar graphs of exam results for "nonflipped" and "flipped" appear level in this study, but I believe that if I ran a similar study, the nonflipped score average would be much lower.
- The study authors did not analyze or even comment on the length, quality, or specific nature of the professor's online videos. Garbage in, garbage out, as the IT saying goes. Some flipping lessons you find out there in YouTube-land are dull, pedantic, and unwatchable! If the professor's video lessons were not really comparable to class lectures in educational value, then it's remarkable that the "flipped" group performed as well as it did! Flipping teachers are still figuring out the best format, filming approach, interactivity, etc. for our online videos. There's more than one way to skin this cat, and this study should have been more specific about the online instruction provided to its experimental subjects.
- Note that the formative assessments for the flipped group were performed during class, presumably with supervision to ensure academic honesty. However, the chart on page 3 and the subsequent descriptions show that the nonflipped group took some assessments outside of class, where they might have been able to check their notes, a website, etc. for helpful information! That would not impact the graph of unit exam performance, but it could certainly skew some of the other data, and it could have improved those students' attitude toward the course. (Easy quizzes!)
- I am also left to wonder: How did this college professor use the extra 15-20 minutes of class time with the flipped group when he/she wasn't lecturing?! The real-life difference between the two courses is not described clearly enough for me to understand it.
So anyway, I will keep doing what I'm doing: teaching the best way I can. I don't make any cash from flipping. (In fact, I would probably make more if I spent more time tutoring and less time blogging!) If science can develop an even better way to do what I need to do, then I will jump ship faster than the speed of light ... just to see what Neil deGrasse Tyson would do about it.