Finding the right flip

It’s been over a year since I posted reflections on my first attempt at lecture flipping. On joining the introductory biochemistry course as lecturer two years ago, I inherited slides that were packed full of detail, to the extent that 50 minutes of lecture time would be filled to the brim with telling students about biochemical “stuff”. Thing is, I love lecturing, almost all of that “stuff” is important, and I didn’t feel that any of it should be dumbed down or trimmed. But delivering a set of traditional lectures just didn’t seem like a good use of time when the same content could be found in any number of places online. The idea of lecture flipping made intuitive sense- let the students work through the basic concepts in their own time, then use the lecture time slot for more interesting and useful stuff than reciting textbook knowledge. So off I went to prepare videos, homework and pub quizzes, and tried to find ways of making the “showtime” in the big lecture theatre more interactive and fun. I posted last year how that ‘flip 1.0’ worked for everyone.

In my experience, anything new that I try tends to be middling on the first attempt, worse on the second (see boring stats at the end of the post, comparing last year’s survey with this year’s), and much better after that. An enormous amount of attention goes into the first attempt, and that compensates for the lack of prior experience. Things may not work out perfectly, but if they don’t, it’s more likely down to misjudgement than to poor preparation. The second time round, there is a false sense of security about the technicalities, but the experience base is not yet big enough to compensate for technical glitches. This past semester I had another go at lecture flipping, and tried to be much more sophisticated. And that’s where I went wrong…

I decided to wrap up a sequence of short videos and exercises into a single Nearpod “homework” that students could work through at their own pace. In essence, I had turned the material into a mini-MOOC, inspired by the structure of EdX courses I liked. Below is one example for one of the more complicated topics (enzyme kinetics and enzyme classification) that I felt would benefit from self-paced study:

[Image: the Nearpod homework sequence for enzyme kinetics and enzyme classification]

Putting videos onto YouTube was a clear improvement- I’m not sure any other platform would be as widely accessible. There’s also the added perk of YouTube analytics, which I still haven’t quite got my head round:

[Image: YouTube analytics for one of the videos]

So, it seems that for this example, the single biggest spike in viewer numbers came on the day of the lecture itself (as opposed to a day or two before, in preparation for the lecture), when around 120 students watched the video. There’s another small spike on the day before the exam. About a third of those who clicked on the video stopped it after less than half a minute, but the rest stayed for at least 4-5 minutes. Small peaks in “audience retention” show where students have re-played short segments, typically to listen again to important definitions. I can see that practically all of the first spike in viewer numbers came from the Nearpod exercise (“unknown embedded player”), but the handful who watched on the day before the exam found the video by a number of routes- surprisingly, very few via the channel “biochemistry rocks”. Finally, it’s fun to see that in addition to the obvious views from the UK and a healthy number from other English-speaking countries, I had four viewers each tune in from Iraq and Russia. I’m aware that there’s nothing remarkable about that given the global reach of YouTube, but it’s amusing nevertheless. With the ability to generate all those data, it’s no wonder that the MOOC folks are still marvelling at their click analytics and are taking their time getting to the important bits- understanding how to make online learning work.

An encouraging result from the analytics was that a great majority of students on the course (over 400 of 540) watched a set of videos on “simpler” topics in the first week of the semester (the very first lecture was held on January 26th):

[Image: viewing figures for the first week’s videos]
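
For anyone who, like me, hasn’t quite got their head round the analytics dashboard, the numbers can also be pulled apart offline. Below is a minimal sketch in Python, assuming a daily-views CSV exported from YouTube Analytics; the file name and column names are illustrative, not necessarily what the real export uses.

```python
# Rough exploration of a daily-views export from YouTube Analytics.
# Assumes a CSV with columns "date" and "views"; adjust to match the actual export.
import pandas as pd

views = pd.read_csv("enzyme_kinetics_video_views.csv", parse_dates=["date"])

# Busiest days, e.g. the lecture day and the day before the exam
print(views.sort_values("views", ascending=False).head(5))

# Flag "spikes": days with more than three times the median daily view count
median_views = views["views"].median()
print(views[views["views"] > 3 * median_views])
```

Nothing here that the dashboard can’t show, but having the numbers in a table makes it easier to line up view spikes against lecture and exam dates.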

So why do I consider this a mixed success? After all, the average exam score went up from 48% to 58%, which is great. Comments from students were quite mixed. Some of the positive ones were very encouraging:

I found the idea very useful for more complicated lectures because the videos are made by the lecturer so you essentially get the same thing you would in a lecture but condensed, which [gives you] the chance to see for yourself what you don’t understand right away. Then the actual contact time is spent on specific bits of the material that actually cause problems and need explaining, rather than wasting time on simpler concepts that most of the students might already know. Those that don’t get a video can rewatch [it] as many times as they like [and have] questions to learn on their own time. I think it’s quite an efficient way of learning and very original, good job 🙂

There was some very justified criticism too:

In my opinion, the key to this method’s success is the lecturer’s confidence in it. At times during the lectures that followed the flip sessions, I sensed that the lecturer was getting frustrated and confused with the equipment and the students which led to more disorder. If the lecturers keep calm and show full confidence in what they are doing, this would hopefully reflect on the student’s behavior.

Bam! Direct hit. In the first session with a live interactive Nearpod “pub quiz”, the WiFi on my iPad cut out and I wasn’t able to show the student view. When I switched to projecting the teacher’s view (the desktop from which I was controlling the quiz), I overlooked the fact that it indicated the correct answer, so the whole thing turned into a joke. Although our e-learning guru was at hand and kindly offered to plug in his mobile for the student view, a couple of fairly chaotic minutes had passed by the time we were back up and running. There are easy ways to avoid this, for example by having separate browser windows showing the teacher and student views. This is what I meant by a “false sense of security concerning the technicalities”- I felt confident using Nearpod because I had been successful previously, but had got a bit rusty with the details.

Looking back at my reflection on last year’s flipping, the issue of managing expectations remains. Like last year, some students argued that being talked through the content for 50 minutes is their preferred way of learning (“Didn’t like the way that we had to teach ourselves everything”). One clearly justified comment is that the material for the flipped-format lectures looked different (“Flipped sessions makes it more difficult to provide a structure to my notes for revision and has been detrimental to my learning rather than helpful”). If students are used to learning “everything that’s on the slides”, having more material, even if it’s examples and exercises, must be frustrating and must look like a lot of additional work (“so complicated compared to the other modules where there are booklets with the lecture slides which are the most useful thing”). It seems that I need to provide much more explicit guidance on learning objectives, and perhaps even guidance on how to get there.

Last year I felt that replacing some of the multiple choice questions with “draw it” exercises would give me better insight into students’ understanding and possible misconceptions. To a degree this is true; the variations were extremely interesting to see. The left half of the pic below shows one “Draw it” question from a homework exercise; the right half shows my feedback on “typical errors”. This was generic feedback to the whole course, not to individual students.

[Image: a “Draw it” question (left) and generic feedback on typical errors (right)]

Preparing a detailed feedback document for these homework exercises, using the drawings students had actually submitted, identifying the various types of mistakes and annotating each of them with helpful comments, was extremely time-consuming. I talked through some of these misunderstandings and answered many of the submitted free-text questions in the first half of some of the flipped lectures; the complete, detailed annotated feedback document was available on our VLE. And here I learned what I really should have known all along:

Timely, personalised feedback matters

Now, this is of course not a new insight; the importance of feedback is highlighted in every “Best practice in higher education” list. But there were ramifications for lecture flipping that I hadn’t anticipated. Talking through the “FAQs” that arrived via Nearpod and giving general feedback on quiz responses in the lecture, for example, does not seem to be as useful as I thought:

[What would improve the structure, organisation etc:]  Less time spent in the lectures answering questions from other people – very few seem useful

The issue of timeliness came up with the self-paced Nearpod homework exercises. Because Nearpod allows students to freely move around a presentation in homework mode, it is pointless to show the answers to quizzes after the questions- students could easily cheat by skipping ahead and then back. (I know I would be tempted to do that…) Without a way to make feedback conditional on actually submitting a response, there was no feedback until the lecture. Several students felt that this made the whole homework less worthwhile.

[I did not attempt the exercises that followed the video…]  Because there was not any direct feed-back so I felt it was a loss of time writing an answer that would not have been checked.
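
What I’d want, in whatever platform I use next, is feedback gated on submission: the model answer only appears once a response has actually been committed, so skipping ahead gains nothing. That is not something Nearpod’s homework mode offered at the time; the snippet below is just a sketch of the gating logic in Python, with a made-up enzyme kinetics question rather than one from the actual homework.

```python
# Sketch of submission-gated feedback: the model answer is only revealed
# after the student has committed to an answer of their own.
questions = [
    {
        "prompt": "Which Michaelis-Menten parameter reflects the enzyme's "
                  "apparent affinity for its substrate?",
        "model_answer": "Km - a lower Km means higher apparent affinity.",
    },
]

def run_self_test(questions):
    for q in questions:
        response = ""
        while not response.strip():      # no feedback until something is submitted
            response = input(q["prompt"] + "\nYour answer: ")
        print("You answered:", response.strip())
        print("Model answer:", q["model_answer"])
        print()

if __name__ == "__main__":
    run_self_test(questions)
```

The same pattern- answer first, feedback immediately afterwards- is what I’ll be looking for in the VLE self-tests I come back to below.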

Flexibility matters

Without giving it much thought, I had parcelled up the work that I hoped students would do before the lecture into quite large chunks. At least for the more complex Nearpod homework on enzyme kinetics, it is safe to assume that a string of two or three 5-10 minute videos followed by a couple of not-so-trivial exercises would take a novice well over an hour to complete. I’m not sure how flexible Nearpod is when it comes to logging off and on again part-way through. The YouTube channel is of course easy to access directly for re-viewing, but the quizzes are straitjacketed into Nearpod.

 I think that there is slightly too much to do for the flipped lectures i.e. too many questions and it can be frustrating trying to answer them after only having watched a short video on it. 

like I see 20+ slides, knowing this might take over an hour, I know I’ll need a good cuppa coffee to get through this.

[I did not attempt the exercises that followed the video because…] Sometimes seriously not enough time on my hands between writing up lecture notes, the many many online exams, essay prep, volunteering maintaining a job 

It could be argued that students underestimate the time it takes to work through and genuinely understand the material; an hour or two per lecture might feel like a lot during the semester, but if revision time before the exam is included, that’s probably a conservative estimate. Still, with everyone’s time becoming more fragmented, having more flexibility in how, when and for how long to access the material and work with it is likely to be popular and beneficial for learning. Both the platform in this case (Nearpod) and the organisation of the material into “mini-MOOCs” reduced flexibility. The latter could be chunked into smaller units, but that would mean students having to use several different PINs for each lecture- and many already disliked the few we used this year.

Challenge number one for next year is how to structure content and materials in the clearest possible way while offering the greatest possible flexibility. It should still require active engagement rather than memorisation. And of course everything should be based on exciting, real-world material. Well, that should be easy… As for the videos, I am minded to produce even shorter, single-issue videos. Telling students about the bigger picture, conveying some enthusiasm and explaining why it all matters is something that works well in the lecture format. Explaining several dozen nitty-gritty concepts is not- some combination of short videos and plenty of practice material should work better for that, held together by a clear explanation of learning objectives. There’s some memorisation we can’t do without (e.g. amino acid structures); a couple of bespoke sets of Quizlet flash cards might do the trick and can be run on any mobile device. For the rest, I’ll have to decide on some form of worksheets and/or self-tests with automatic feedback on our VLE.

Challenge number two remains the question of how best to use the lecture slot. Last year I wrote “It is probably going to be a combination of brief summary, case study, worked example and homework discussion.” Doing all of that would be cramming too much into one session, and I know now that the homework discussion is not all that useful. But the part I haven’t spent enough time on is the case studies and worked examples. As part of his final-year project, a student worked with me to produce an interactive revision resource for enzymology based on a case study. Student users in the first-year biochem course were very positive about the scenario and found it much more motivating than isolated abstract questions. This is clearly the way to go, but it’s also far more laborious than producing conventional resources. As for worked examples- a colleague in the business school is very successful with video recordings of “pen & paper” statistical calculations. Students are asked to watch these in preparation for small-group, exercise-based tutorials. Some of the more mathematical aspects of my course (pH calculations, enzyme kinetics) would clearly benefit from “how to” videos, but not many concepts in the course lend themselves to this kind of treatment.

So many ideas, so many things to consider- not least how much time I can reasonably invest in producing ever more colourful, exciting, engaging resources. It’s an interesting journey. I’ll report back next year! Though in the meantime, I will hopefully have some reflections on my first attempt at designing an online-only course unit for our Year 2 students.

Below: student responses to a survey on their “flipping experience”. n=92 for 2014, n=63 for 2015.

[Image: survey responses, 2014 vs 2015]

Lecture flipping- part 4

Seasoned flip teachers generally agree that the online provision of lecture videos is the less important part of classroom flipping. The whole purpose is to free up contact time for more meaningful interactions, problem solving, peer discussions and the like. Small groups allow for genuine student-teacher discussions and collaborative learning with bespoke advice and feedback. In a very large lecture unit, this is more difficult- but not impossible.

In my flip teaching experiment, I’ve used Nearpod again to deliver online quizzes. These take the shape of online slideshows (originally PowerPoint slides converted to PDF and uploaded to the Nearpod server) into which “functional slides” can be inserted. These can be quizzes, polls, free-text questions and (the most fun) a “draw it” exercise. With pre-set correct answers, the quiz is clearly the easiest to deploy in a very large class because of the automated evaluation (more below).

Trouble is, multiple-choice quizzes are pedagogically quite limited tools and tell you little about students’ current conceptual understanding (or possible misconceptions). The “draw it” activity reveals more:

[Image: examples of “Draw it” responses]

Still, the majority of activities here were MCQs. One weakness of Nearpod is that the working interactive quizzes have to be in “Nearpod format”, which means slotting a limited number of words into a fixed form. The number of possible answers is not fixed, but images can’t really be included, which means that I have to ask students to look at the “question slide”, remember which answer they decided was correct, and then proceed to the fixed-format “voting slide” (the black one, bottom right). Works OK, but is tedious.

[Image: a question slide alongside the fixed-format voting slide]

Ideally, students would have watched the video(s) and attempted the voluntary homework before coming to the lecture. I then talked through the results and explained the answers. The “Post session report” was a great help because I could see which questions were easy and which needed more explanation. I also shared that report- several dozen pages long and available as a pdf- on Blackboard, so students could check their own results by scrolling down to each question. To keep confidentiality, I had asked them to sign into the homework exercise with their student ID number, but apparently that hadn’t been clear enough… Giving feedback on a “Draw it” exercise took more time (the pdf report shows each participant’s drawing, and I added a short note for each), but in the end there was only a finite number of wrong ideas.

After that, the fun started. The homework function of Nearpod is actually a more recent development; at its core is the ability to run live, interactive presentations. Tapping into the rich cultural heritage of this country, I dubbed these interactive sessions “pub quizzes”, because that’s what they were, minus the beer: students were invited to get together in teams (as far as the rigid lecture theatre setup allowed), download the Nearpod presentation using the PIN I gave out and sign in with their team name. The presentations included a bit of fun to live up to the spirit of the pub quiz. Then there were about five MCQ questions that had a real-life connection, required a bit of problem solving and were a little more challenging than what they would see in an exam. It turns out that coming up with these questions was one of the hardest things in the whole lecture flipping experiment! I explained each question and then gave them two minutes to discuss it in their team. Full-on chaos ensued- in a good way. I made them vote and, thanks to the excellent wifi capacity of the lecture theatre, got virtually instant voting results from all 100-odd teams, which I then discussed briefly.

The live sessions generated a post-session report on my Nearpod account as well, which again I made available. Half the fun of evaluating this report was in seeing the team names students had chosen. The first page alone included “1Pussypatrol”, “Add me on Grindr”, “ARAWRBOOBYBOOBY” and “asexual”. No lack of diversity there.

Lecture flipping- part 2

[Image: communication cycles in the flipped lecture design]

Moving on to the design of my flipped lecture experiment, here are the principles I wanted to apply: As a resource for self-paced learning, I would make videos available to replace the teaching of “stuff” in the classical lecture format. Having worked through the videos individually, at their own pace, students would then do simple exercises via Nearpod (next post) and submit them online. In the original lecture time slot, I would discuss the solutions to the exercises, spending more time on those that seemed more problematic. There would then be a second round of Nearpod quizzes during the lecture, this time (as in Eric Mazur’s classes) allowing peer-to-peer discussion, with answers again submitted online from mobile devices. These pub quizzes would be more challenging and more real-world, and thus hopefully more interesting or even inspiring, perhaps answering the question “why do we have to learn all that stuff?”.

The two rounds of quizzes would result in two cycles of learning, articulation of understanding and feedback, which should mean- for those who actually take part- much better learning.

This pedagogical side is summarised in the illustration above. The chart analyses the different elements of this flipped lecture according to Diana Laurillard’s Conversational Framework (Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies, Routledge 2001, or a more recent publication). Showing off what I’ve learned in my DTCE course last semester…

The left-hand side shows what Laurillard calls the Teacher communication cycle. Traditionally, the first arrow (here: video) is the lecture, and the two following arrows are the exam (“students articulating their conceptual understanding”, or just ticking boxes in MCQs) and the exam mark as feedback. In fairness, we have a lecture-by-lecture online quiz as well as an end-of-term exam here, but both are summative (marked). Additional cycles of exercises with formative feedback were something my “focus group” had also requested- last summer I asked my personal advisees what they thought might improve this lecture unit. The peer communication cycle on the right is new and hasn’t been tried in this large lecture unit before.

Nearpod: What the students thought

After the iPad & Nearpod experiment two weeks ago (see Oct 11 entry), I have only now had a chance to ask students (well, the group that was luckier with the wifi connectivity) how they felt about it. Their response was much more positive than I had expected.

The main positive comment was that they could take a more active role, even though technically this only involved scrolling around on a web page, clicking one or two links deep, and then choosing answers in the quiz. Some said they enjoyed the quiz because they could make a choice without the awkwardness of speaking up or raising a hand. Overall, the feeling of being in charge made a difference.

And yes, one student commented that this experience- mobile, interactive, online learning- felt more like the 21st century. He did not say in comparison to what- but yes, a lot of the tutorial work I do is still based on printouts, and I probably do too much talking and explaining instead of letting students get on with stuff.

Some students liked using the almost box-fresh iPad minis simply because of the allure of handling beautifully designed, slick technology- obviously part of Apple’s success (“We made the buttons on the screen look so good you’ll want to lick them” -Steve Jobs on the Aqua interface, 2000) and the reason hundreds of people are fingering the pads in Apple shops. And I suppose that’s OK, as long as we’re clear this is not the reason we use them in teaching…

This student feedback is very encouraging, for tutorial work as well as for the interactive exercises I want to do with first year biochemistry students in a lecture flipping experiment next year.

First Nearpod experiment

Some thoughts on my first experience of using iPads and Nearpod in a first-year undergraduate tutorial. One of the first things we try to teach our life science students (and indeed the topic of one of the first exercises that the distance learners in the MA DTCE course are asked to work through!) is the critical use of online resources. When I was a student 20 years ago, the information sources I could find in the library- textbooks, monographs, scientific journals- were all pre-filtered for quality and authority. Many of them were hard to digest, but I never had to question the accuracy of their content.

That’s all changed, of course, with the internet. It would be easy for us to discredit wholesale any information students find that didn’t come from a scientific database, but it would also be unrealistic. Not to mention disingenuous, when I use Wikipedia myself to get a first impression of an unfamiliar field, for example when I’m marking an essay that’s far from my scientific comfort zone.

My ideal tutorial session on this topic would have each student do their literature search “live” during the tutorial, and I would be able to share any one student’s screen with the group whenever they came across a source they couldn’t easily judge. I suppose that might take too long and would favour the fastest students. I am also not sure there is an app that allows “screen sharing” and switching like in a TV studio. The next best thing I could think of was joint web browsing starting from the same page, and a discussion of how to identify quality internet sources. I designed a Nearpod presentation with a short intro on the importance of asking “who wrote it? where is it published? what kind of text is it?” and a succession of six web pages. All were Google search results for the topic “aflatoxins” (the toxins produced by mould growing on foodstuffs, particularly peanuts). I gave students some time to look at each page and check the “who, where, what”, and then quizzed them: “Is this a reliable source?” and “Can we cite this source in an essay?”. The webpages ranged from schoolkids’ chemistry homework to top scientific journal articles.

The individual browsing was made possible by iPad minis that the Faculty’s e-learning team had bought as a pilot project. iPads are of course already widely used in schools, and many universities are getting on board, but I gather there is still debate over how to get the most out of them, and also about “bring your own device” (BYOD) versus “a free iPad for each fresher”. We came close to the latter but held off in the end, not least because wifi connectivity in our main teaching building- a massive block of concrete that efficiently shields all radiation, good or bad- is patchy. Nevertheless, our intrepid e-learning team arrived with iPad minis for all, set them up for Nearpod and kept them running smoothly as much as circumstances allowed.

I did the same session twice yesterday in two different rooms, and wifi connectivity indeed made all the difference. In the first group (poor connectivity), moving through the Nearpod presentation and browsing the web pages was slow and frequently disrupted. The technical issues distracted everyone; not being able to view the whole set of pages meant that I couldn’t really get the point across. Second session, better room: big difference. The students seemed to enjoy the browsing but needed some encouragement to discuss what they found. Some interesting points came up that I hadn’t thought about- for example, why is a ten-year-old web page not up to date while a Nature article from 1966 can be “gold standard source material”? Using the Nearpod quiz functions allowed me to see how students had voted, so I could discuss their choices with them directly. Getting two students with different answers to justify their choices worked well in one case.

I think I might have made that exercise a little too long; we did not have a great amount of wrapping-up time. There was some inevitable faffing about while I launched the presentation. Maybe four webpages would have been enough, more carefully chosen to be less obviously good or bad. I did not use the full evaluation tool; that would have taken too long, and instant feedback on the quiz they had just taken seemed to make more sense. I’ve only now realised that one student did not submit answers even though he seemed engaged. Not sure what happened- teething problems. I have sent the presentation as “homework” to the group that didn’t get the full experience. Overall an encouraging start- if the wifi works!