A Model for Student-Centered Flipped Courses Deployed in Canvas – John Thomson & Pamela Levine

Pamela Levine: What we’re going to share in
this session is an approach for deploying flipped courses in Canvas and just a quick
overview of what flipped courses are. That's where the lecture portion of a class, the part that typically happens in person, perhaps in a large tiered lecture hall with PowerPoint slides, is put online in some form, whether video, multimedia, or text content. Then students consume that before coming to
class so they’ve built up all their background knowledge. Then in class they use that in-person time
with their peers and instructors to do hands-on problem-solving and apply what they learned
in the lecture online portion. We’re going to share how we did that in Canvas. The way that we did that in Canvas involved
going to really great lengths, which you’re going to see. What we built, it worked well and the faculty
and students valued the experience. We also want to be able to do this, and we
plan to do this in more courses with more students. We need to be able to make this happen in
a more scalable way. At the same time, we wish that some of these features were built into the platform, or that we could learn strategies from the folks here in this room, in order to do this model in a way that's more supportable and more scalable. We're here just as much to share what we did
as to learn from you. What did we do? Well, first about us. Like Adam mentioned, we’re coming from Stanford’s
Graduate School of Business. We only teach graduate students. They’re coming to us with a broad array of
backgrounds. Some did undergraduate in business administration,
but many did not. They’ve also worked for at least a few years
since their undergrad in a variety of different fields. They’re coming with all these different backgrounds,
related or not to the content that they’re learning in their MBA program. Here’s what we did. We flipped two courses, three sections of
those courses. This occurred this winter. It took about a year between the first conversations and deploying the courses. I'll let you finish reading those. Okay. That's what we did. Now we want to show you what it looked like
in Canvas. John Thomson: Alright. Really just to reiterate, part of the goal
of doing these flipped courses was to really level set the students so that they’re all
coming into the classroom with a base level of knowledge, they’ve done their homework
and then the faculty can actually tell that they’ve done that. As you can see, we really focused a lot on
the visual design with video and reading and some interactive elements all on the page. Fortunately, we’ve got a real cracking design
team and they were up to the challenge that our faculty set forth. You'll see a few of the interactives here. We used things like Kaltura, we developed some in-house LTIs, we used some publisher tools, and we used an adaptive tool. Then there are also these in-page questions,
which, we feel, are the heart of what was so powerful about this model but have also presented some challenges. You'll notice that this is really different
from the typical instructional design tool where you’re more in a slide model where you
read a little bit, you click ‘next’, you answer a question, you click ‘next’, maybe
you watch a video, you click 'next', answer another question. This is all on one long-form page. Why did we do this? You'll see this is breaking down an individual
page. There are some visual elements that help the
student parse that. In some ways that takes the place of that
slide design. We found we got really strong feedback from
the students that they really appreciated this. They used it both as they were studying and
in class. They would just be able to parse through the
information and really quickly find what they needed to do. From the faculty perspective, they really
pushed our team. They said, “We don’t want our students to
think that we’re offloading any of our work, that they’re doing all this stuff online and
we’re just being lazy and not lecturing”. It’s MBAs, it’s Stanford, so there are high
expectations. We really wanted to help meet that. Just to reiterate the student perspective,
what they found was they really liked being able to just hit Ctrl-F and find the information that they need. If you think about a slide-by-slide model, they
couldn’t have done that. In class, if there was something that they
were getting called on they could quickly find that information. If they’re answering an in-page question,
again, they just hit Ctrl-F and they find that or scrub through the video and get to
exactly what they need. We found that it really helped their learning. Pamela: Those in-page questions that you saw
in the gif of our page, I’m going to show you another example of that in a minute. That was a big part of our flipped classroom
model. Why was it so important? These two words, feedback and feed-forward. Feedback is letting the students know how
they’re doing as they’re going through content. Feed-forward is letting the instructors know
how the students are doing. Those in-page questions or those interspersed
questions are a way to gather that feedback and feed-forward. Remember that this portion of the class is
happening online at home. Students are there and they don’t have their
instructor right there with them to let them know whether they’re on track. Remember also that our students are bringing
all these different types of backgrounds with them to bear as they’re going through this
content. Some have been working in finance, for example,
and they already know this stuff cold. Some have never studied finance at all. That background knowledge that they’re bringing
into the classroom influences how they’re understanding the material that our faculty
and instructional designers put together. Unless you check for understanding right there
as they’re going through the content, how else would you know whether the students understand,
whether they’re confused or whether they think they get it but they actually don’t? How do the students themselves know how they’re
doing and whether they need to actually scroll back up that page and reread some of the content
or whether they can breeze through and watch the next video on 2.5X speed? That’s the point of the in-page questions. This feedback and feed-forward are important
parts of flipped learning. That was a big part of our model. We want to share how we did this in Canvas. Okay. I’m going to show you another video and talk
you through this. This was in our finance course. Students get to a Canvas page. This one, they start off with a video. In other cases it might be an interactive
or a piece of multimedia or text. They’re learning about the content through
this. Right now they’re studying about corporate
bonds. They’re learning and then, boom, they hit
a question. This question is directly related to what
they just learned. This is their chance to apply that and to
check whether they’re understanding it and they can move on to the next bit. We gave them, working with the faculty and
instructional designers, a couple attempts for each question. They could try it the first time and then
if they got it wrong they got feedback that they got it wrong and to try again. Then they could try again and then they would
get some additional feedback, a little bit of explanation about the correct answer or
the incorrect answer. Now, this is not built-in Canvas functionality,
as you probably know. Let me talk to you about how we did this. For those of you who are not interested in
the technical stuff, tune out. For those of you who are, tune in right now. What these questions were, were Qualtrics surveys. Do any of your institutions use Qualtrics? That's a lot of you. Qualtrics is a survey tool that a lot of higher-ed institutions use. It's really flexible, it can do a lot, and it's really powerful, but it's a survey tool. If you think about surveys and you think about
checking-for-understanding surveys, there's not usually a correct or an incorrect answer. There aren't usually multiple attempts. It wasn't quite designed for the purpose that
we were using it for, but it was flexible enough that we could get it into these pages
and make it behave like we wanted it to. Each question here is a Qualtrics survey,
an individual survey. In each survey we programmed in validation and survey logic to check whether what students entered was correct, to give them another attempt if not, and to display correct-or-incorrect feedback or, on multiple-choice questions, feedback tailored to each choice that they could pick. We also embedded data into each survey so
that we could use our university sign-on to identify who that student was. That allowed us to know who was answering
what and what they put. We had a custom CSS template to style these so that they would look as much like part of the Canvas page as possible, so you couldn't really tell you were in a different tool. We added custom JavaScript to each Qualtrics question to override some of the behaviors that make more sense for a survey tool than for a checking-for-understanding tool. There's even more that we did with Qualtrics
that you’re going to hear about in a minute as we talk about feed-forward. John, go ahead and tell us about feed-forward. John: Alright. In the course of this, Pamela unlocked the
guru status badge for Qualtrics. She’s awesome. You’re probably wondering, “Why on earth
put all that work into making those crazy Qualtrics surveys?" You could just do a little JavaScript thing and have a quick self-assessment. We've seen a few tools like that. Really, feed-forward is the answer. Our faculty let us know that they wanted this
data, that they would alter the way that they’re teaching, that they would alter the way that
they maybe go into an office hours one-on-one interaction with a student. Really, it’s this counterbalance between the
feedback that the students would get and the feed-forward that the data would provide the
instructors that we found so powerful. However, yes? Woman 1: Is it an LTI integration? John: It is not. It’s just pure single sign-on, Shibboleth
or SAML. We wish it was, but we’ll get there at some
point. One thing that went with this, especially
on the point of the instructor coming into class, this is actually really a pretty big
paradigm shift for a lot of instructors. I remember this when I was teaching. You have your canned spiel. You go in and you do that. With this data you need to be ready to be
flexible. Maybe you're going to cover something in a lot more detail because you didn't know students had big misconceptions about it. Other parts that you think are really cool
and important, maybe they get it and you can just move on. The same goes for individual meetings. We found that the faculty really appreciated it. It's a business school, so there's cold calling, and being able to see which students either did really well or struggled at a point helped the faculty guide them through some of that. To get into a little bit more of the technical
detail, again, you can click off if this isn’t your thing. Each one of these questions is an individual
Qualtrics survey. All of the surveys had to be tagged so that
we knew what was what, and all the multiple attempts had to be set up. Our data analysts mapped all that data over
to Salesforce, which has a nice integration with Qualtrics and then also works well with
Tableau, which is the tool that, as you can see, we use for our visualizations. This is a place where we found that the faculty
couldn’t be as flexible as they’d like because it took so much time to actually map all these
different data fields. Really, these questions had to be set far
in advance so that everything could be ready to go. You’re probably hearing why we’re saying this
is going to be difficult to scale up and we want to try some different approaches for
our next time. Just to take a look at Tableau, this is an
example of one of the dashboards for a class; you can see it says 'Session Two'. Here are all the different questions. You can see what percentage of students got
things correct or incorrect. It really stands out. This early stuff, they get it. This later stuff, maybe not. Then again, this is something we really heard
that the faculty appreciated, the ability to drill down at the student level. You’re not seeing student names here. We cut those out. For each session, how well are they doing? If you click this, then you get the next level
of questions. Again, that next level of, “Are they getting
it right on the first attempt, the second attempt?” really being able to get to exactly the data
that they want. Tableau is such a powerful tool for that that
we knew it would be a good fit for us. It's a really powerful tool, but at the same time we didn't quite get to the dream of "Let's bring everything together". I mentioned before, we use Qualtrics at our
institution. This is an example of a Qualtrics dashboard. Pamela: Kaltura. John: Oh, Kaltura, sorry. Thank you. Pamela: Too many tools. John: I know, too many tools. We'll get back to that. This was an ID, an instructional designer, not actually a student, but you can see it gives data on how much they have watched, which video, things like that. This is one of the friction points that we'll
get to, that having multiple data sources really makes instructors not want to go check
everything out in that way. Pamela: Alright, oops. John: Go ahead. Pamela: We want to stop talking for a bit
and have you guys chat with each other about how you’re using feedback and feed-forward
in your courses, in your work or how you’re thinking about using feedback and feed-forward
after seeing the new products and tools at InstructureCon. Then we're going to circle back to those. Take a few minutes. Talk to the people around you and discuss
feedback and feed-forward. [Background conversations] Pamela: Alright, wrap up those conversations. Take another few seconds. John: Great to hear so much conversation. Pamela: Oh yes. John: I might have to break out the teacher
voice. Pamela: I know. Alright, we’ll get back to it. John: How does it go? Three claps? Pamela: One, two, three, eyes on me. [Laughter] John: Right, you’re right. Pamela: That was from the K-12 days. Okay. Hang on to those ideas because we are going
to have some open discussion afterwards. It sounded like there were a lot of good ideas
in here. I’m hoping that we can improve on our approach. Hang on to what you were just talking about. Okay. Why do we have this up here? I wanted to tell you a little bit about a
tradition at Stanford. It's called the primal scream. There's something called 'dead week'. It happens at the end of classes: a week with no classes and nothing else scheduled. It's used as a time for students to study
feverishly before finals the following week. On the Sunday of that dead week, after they’ve
been studying all week and they’re gearing up for finals, at midnight they scream from
their dorms as loud as they can. It’s a tradition of stress alleviation for
students. That is basically the exact opposite of what
we were trying to do with our interspersed questions on Canvas pages. Checking for understanding as students go
through content is not a midterm or final exam. Those are high-stakes, summative assessments. They’re designed to capture how students performed
at the end of instruction. That brings us to our problem. The Canvas tools that were available at the
time that we were building these courses are designed for summative assessments that happen
at the end of a lesson or a module. If you think about the quiz tool, it’s labeled
as a quiz. That’s a term that has higher stakes than
what we’re trying to do with these checks for understanding. If you think about the quiz tool, students
go through 10 questions or 20 questions and they don’t see how they did on all of them
until the end when they’ve answered everything. That’s why we used the Qualtrics tool and
did everything with that in order to provide this lower-stakes, check how you’re doing,
get feedback, give feed-forward experience. It’s also why we want to improve on that approach
through built-in tools that support this model. Where are we with this model? What’s next for us? John: Alright. Some of the iterations and improvements that
we’re at least thinking of, first of all, we found with that first really just visually
appealing page, it’s really prone to breaking. We’re looking at some tools to help with the
design of that so that faculty, when they go in and make that little tweak, they don’t
break it and then they have to call a designer and we roll back the changes. We think we can improve on that. Obviously, all that heavy lifting in terms
of setting up the Qualtrics questions with all the tagging and things like that and then,
again, the analysis, we really feel like we can improve on that. We also heard, pretty loud and clear, some
feedback from the students about that experience with Qualtrics. In some ways it was good but in other ways
it was bad. Sometimes if their single sign-on timed out
but they were still signed in to Canvas, they'd see a login screen, or 10 of them if there were 10
questions on the page. There's also the feedback that they got right away: the next time they come back, just because of how Qualtrics works, that feedback is gone. Again, because that feedback is so important,
we really want to try to maintain that for them. Also, another thing that may be unique to
business schools, there’s a lot of desire to reuse the content. By being so complex, it’s difficult to scale
up, but it’s also difficult to transport to other audiences. That’s something that we’re hoping to do. Then finally, we use a lot of integrations
with this. The ones we mentioned and probably a few more,
some publisher tools. We really found ourselves in that classic
place where you need learning analytics, you need to follow the standards. That’s the next step that we’re starting to
look at. How can we bring all this data together, both for the teaching dashboard that instructors use while the course is being run, and for pulling all this data together after the course is run to validate what its successes were and what else needs improving? Again, both of those things could be served
well by using learning analytics. Another saying that we have at the GSB from
one of our professors is that feedback is a gift. It just seems really apt that this is
giving the gift of feedback to students and to us. Then we’re also hoping to get some feedback
from you all. What do you think might work for you? What questions do you have? What ways can we work together and utilize
the Canvas community, which is just a super thing? We don’t feel like this is necessarily the
one answer, but together we can all find a way to bring some of these elements of this
model to other areas. Pamela: Alright, so with that we are going
to leave our contact up on the projector and take it open to questions, suggestions. What have you got? Yes, in the back. Man 1: Hi. Oh, thank you. Hey, thanks. Your presentation’s awesome. I like what you’ve been doing. I guess it’s more of a question and maybe
an idea. I see you're using Qualtrics to integrate with the class. The first thing that popped into my mind was,
why not use a SCORM-compliant authoring tool to throw those knowledge checks in there so
that your data feeds into Canvas as opposed to a third-party site? Then the other thing is, if you’re looking
to throw in the knowledge checks and you’re showing them video, you might be able to leverage
a tool like PlayPosit where they’re watching a video and it’s going to pose questions to
them as they’re watching. If you want go out to a third-party site,
maybe that’s something that you could take advantage of. Thanks. Pamela: Awesome. We heard some other suggestions about those
interactive video tools that are out there. We’re definitely looking at those. We had some specific needs about question
types, like a lot of numeric questions that can be answered with a range of accuracy and
all that kind of stuff. That can be hard to find out there. We do see that Canvas Quizzes.Next has that coming, as well as formulas, so we're really excited about that. We are also looking at the interactive video
and the SCORM authoring tool approach as well for potential round two. Good suggestions. Others? I see one way back there. You next, you first, you next. I was looking for where the microphone is. Alright, let’s go in the back and I’ll repeat
your question. Woman 2: I don’t need the microphone. Would you mind just putting the slide back
up with the long-forms page? Pamela: Sure. Okay, we are putting the slide back up with
the long-form page. This is all actually a Canvas page that scrolls
and scrolls and has everything in it for that class. Woman 2: The pages are side by side and then
it scrolls down and there’s more. Pamela: This is actually one page but our
slide wasn’t long enough. It would be like scroll down to here and then
scroll to here [Crosstalk] [24:48] and then scroll. Woman 2: Oh, I see. How [Inaudible] [24:52-24:53] over these programs
[Inaudible] [24:57]. Pamela: Are you asking whether it was static images? We had everything. We had static images. We had video. We had interactive tools. We had simulations. We had articles to read, websites linked, all of that. The pages, how long were they? It just depended week to week. One page was for each class session, and it was just however long they needed
it to be. This was something that we were a little bit
skeptical about at first too, like, “Is this really the way? Is this necessary?” but the students did
really like it for the reason that we talked about of having everything in one place and
being able to search and navigate it. Question back, oh, okay, yes. Man 2: Quick question, coming back to the
video quizzing. Since you’re a Kaltura customer, did you
guys look at the Kaltura quizzing or Camtasia? If so, how was your experience with that? John: Sure. Basically at this point our understanding
is it’s just multiple-choice. We’ve talked with our faculty a lot about
the types of questions, the way that they want to check student understanding in it. Because the two classes were finance and data,
really, the need for math was pretty high. We've looked at some other tools that might plug into Kaltura for that. Again, this was over a year ago that we were
getting ready to launch this. It wasn’t there yet. Thanks. Woman 3: I’m curious more about the format
of the flipped classroom. To what extent did the students complete the
work before they went into the classroom? How did you use the data? Were you accessing the data and feeding that
back either to the faculty or the students? John: Great question. Actually, it’s interesting because it differed
by class. In the data class, which is a little bit more
applied, they met in a computer lab. The faculty really used the data heavily to
influence how they were teaching. They would really do that typical thing that
you want the faculty to do when they're flipping their class and alter their methods. In the other class, there were some timing issues. There was a lot more content than there was time for. It's also a little bit less practical. There we found that the students struggled
to get through it. They felt like it was a pretty large time
investment. That’s one of the things that we’re taking
back for the next iteration, right-sizing it, seeing if maybe we can take elements of
that class to fit in other classes. Your question's spot-on. Pamela: I'll add just one thing. With that teacher dashboard that we showed, that was a really important piece for accountability purposes
for the students as well. Not only did the students know how they were
doing from the feedback that they saw. Not only did the teachers know how the students
were doing from the dashboard, but the students knew that the teachers knew how they were
doing because they had that dashboard. Then the instructors would enter that into
an assignment in Canvas. Each week students would see their participation score and their accuracy score. Yes, question in the front. Woman 4: Did you have any challenges with
convincing faculty of the value of formative assessments in a graduate-level program? Pamela: I have to think about that. I think they were on board with that. Nothing is coming to mind about challenges
with this. They had a lot of specific requirements that
they wanted for these questions. It was more that we had all these different rules: "It has to have this many attempts" or "It has to validate answers in this way"
and you have to give this type of feedback” and things like that. They were bought into that, the value of formative
assessment. That’s a good point that that was helpful. Question, yes. Man 3: Our instructional designers are going
to be super jealous of this because it's a really cool course and idea. I'm curious, did the faculty approach you with this idea? How did this start? I would love to be able to approach someone
at our institution about this. John: It was a little more, I’d say, a conversation
between some of the campus leadership and the faculty. As the instructional designers started talking
with the faculty and really getting in deeper on what made them passionate about teaching,
then they found that mutual interest. At least the first spark was not quite from
the faculty. Good question. Pamela: Last questions. Alright. John: We’ll be hanging out, so if you have
more questions or want a business card we’d be glad to chat with you about information
you want or ideas that you might have. Thanks again. Pamela: Thank you.
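[Editor's note] The feed-forward roll-up that John and Pamela describe, turning per-question, per-attempt Qualtrics responses into a per-question dashboard view of who got it right on the first try, the second try, or not at all, can be sketched roughly as follows. This is a minimal illustration only: the field names (`student`, `question`, `attempt`, `correct`) and the flat list-of-dicts shape are invented for the example and are not taken from their actual Qualtrics-to-Salesforce-to-Tableau pipeline.

```python
from collections import defaultdict

def summarize_feed_forward(responses):
    """Roll per-attempt answers up into a per-question summary.

    `responses` is a list of dicts with keys:
      student, question, attempt (1 or 2), correct (bool).
    Returns {question: {"first_try": n, "second_try": n, "missed": n}},
    counting, for each student, whether they answered the question
    correctly on the first attempt, the second attempt, or not at all.
    """
    # Best outcome per (student, question):
    # 1 = correct on first try, 2 = correct on second try, None = missed.
    best = {}
    for r in responses:
        key = (r["student"], r["question"])
        if r["correct"]:
            prev = best.get(key)
            # Keep the earliest attempt on which they were correct.
            if prev is None or r["attempt"] < prev:
                best[key] = r["attempt"]
        else:
            # Record the miss, but never overwrite a correct outcome.
            best.setdefault(key, None)

    summary = defaultdict(lambda: {"first_try": 0, "second_try": 0, "missed": 0})
    for (_student, question), outcome in best.items():
        bucket = {1: "first_try", 2: "second_try", None: "missed"}[outcome]
        summary[question][bucket] += 1
    return dict(summary)
```

A table like this per session is the sort of thing that could then feed a visualization layer (Tableau, in their case) for the drill-down views shown in the talk.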
