What Blackboard, Desire2Learn, and Udacity Should Learn from SJSU

e-Literate 2013-09-13

As Phil noted in his analysis of the SJSU report, one of its main messages seems to be that some of what we already know about performance and critical success factors for more traditional online courses also seems to apply to xMOOCs. But how good is the ed tech industry at taking advantage of what we already know?

Not very good, as far as I can tell.

One of the points that the report writers emphasize is that—no surprise—student effort is by far the biggest predictor of student success:

The primary conclusion from the model, in terms of importance to passing the course, is that measures of student effort eclipse all other variables examined in the study, including demographic descriptions of the students, course subject matter and student use of support services. Although support services may be important, they are overshadowed in the current models by students’ degree of effort devoted to their courses. This overall finding may indicate that accountable activity by students—problem sets for example—may be a key ingredient of student success in this environment.
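To make that finding concrete, here is a minimal sketch of how one might check whether effort measures eclipse demographic variables in a pass/fail model. This is not the SJSU team's actual model; the predictors, coefficients, and synthetic data below are invented purely for illustration.

```python
# Illustrative only: a toy logistic regression comparing "effort" predictors
# against a demographic control. All variable names and data are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

problem_sets_done = rng.integers(0, 12, n)   # effort measure (hypothetical)
video_minutes = rng.normal(300, 100, n)      # effort measure (hypothetical)
age = rng.normal(22, 4, n)                   # demographic control (hypothetical)

# Synthetic outcome in which passing is driven mostly by effort,
# mirroring the shape of the report's headline finding.
logit = 0.8 * problem_sets_done + 0.01 * video_minutes - 6
passed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Standardizing the features makes the fitted coefficients roughly
# comparable as measures of predictor importance.
X = StandardScaler().fit_transform(
    np.column_stack([problem_sets_done, video_minutes, age]))
model = LogisticRegression().fit(X, passed)

for name, coef in zip(["problem_sets_done", "video_minutes", "age"],
                      model.coef_[0]):
    print(f"{name:>17}: {coef:+.2f}")
```

On data generated this way, the effort coefficients dominate while the demographic coefficient hovers near zero, which is the pattern the report describes.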

We also know that many of the students in the SJSU MOOCs were at-risk students. They were traditional students who had failed the course for the first time, high school students in an economically disadvantaged neighborhood, and non-traditional students. What do we know about at-risk students? We know that they often need help, and we also know that they are not good at recognizing when to get it. They are not good at recognizing when they are not doing enough work, and they are not good at recognizing when they are underperforming and in danger of failing.

We certainly see signs of the latter problem in the SJSU report:

The statistical model pointed to the critical importance of effort. In Survey 3 students indicated that they recognized the need for a sustained effort and the danger of falling behind. In fact, when asked what they would change if starting the semester over knowing what they know now, one of the top choices was to “make sure I don’t fall behind.” Almost two-thirds of survey respondents (65%) in Survey 3 pointed to this change, including 82% of matriculated students in Math 6L and 75% of matriculated students in Math 8. In Stat 95, where students were less likely to fall seriously behind because of stricter adherence to deadlines (see below), 60% of both matriculated and non-matriculated students identified “not falling behind” as a change they would make.

So students in these classes had trouble staying on top of the work, and the evidence (from the statistics class) is that at least part of the problem was an inability to self-regulate rather than a pure lack of time. We also see evidence that students in these courses were not good at seeking help:

[O]ne of the top-rated changes students identified in Survey 3 was “more help with course content”. In this area, there was almost no difference between survey responses from matriculated and non-matriculated students with 80% of respondents from both groups rating “more help with content” as a “very important” or “important” change they would like to see Udacity and SJSU make.

This finding, when corroborated with input from faculty (see below), points to an area that may require additional attention, because while students did not use the opportunity to video-conference with instructors, many students did e-mail their professor, but not with questions about content. Instead, student email communications focused, across the three courses, on questions related to course requirements, assignments and other technical or process-related issues.

Some of the problem could be attributed to students’ lack of awareness that help is available, but you can’t say that about students who actually e-mailed for help and then failed to ask about course content. That is a failure of help-seeking behavior.

As I have written here before, this is exactly the problem that Purdue’s (and now Ellucian’s) Course Signals retention early warning system was designed to address. The whole system exists to prod students who are falling behind to get help. Basically, if students are falling behind (or failing), they get increasingly insistent messages pushing them to get help. At its heart, it is really that simple. And the results that Purdue has gotten are impressive. For example, they have been able to drive much higher utilization of their biology resource center.
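To make the mechanism concrete, here is a minimal sketch of that kind of escalating-nudge rule. It is not Purdue’s actual algorithm (which also weighs factors such as prior academic history); the thresholds, field names, and messages below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: a toy escalating early-warning rule in the spirit of
# Course Signals. All thresholds, fields, and messages are hypothetical.

@dataclass
class StudentSnapshot:
    grade_pct: float            # current course grade, 0-100
    assignments_missed: int     # overdue problem sets
    days_since_last_login: int  # LMS activity proxy

def signal(s: StudentSnapshot) -> str:
    """Map a student's current status to a traffic-light risk level."""
    if s.grade_pct < 60 or s.assignments_missed >= 3:
        return "red"
    if s.grade_pct < 75 or s.assignments_missed >= 1 or s.days_since_last_login > 7:
        return "yellow"
    return "green"

MESSAGES = {
    "green": None,  # on track; no intervention
    "yellow": ("You are starting to fall behind. Here are the help "
               "resources available for this course."),
    "red": ("You are at serious risk of failing. Please contact your "
            "instructor or visit the resource center this week."),
}

def intervene(s: StudentSnapshot) -> Optional[str]:
    """Return the nudge to send the student, escalating with risk."""
    return MESSAGES[signal(s)]

# Example: a student who is clearly behind gets the most insistent message.
print(intervene(StudentSnapshot(grade_pct=55, assignments_missed=4,
                                days_since_last_login=10)))
```

The point of the design is that the message goes directly to the student, at the moment it can still change behavior, rather than sitting in an instructor-facing dashboard.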

Does an increase in help-seeking behavior yield greater student success? The answer is a resounding yes:

[Chart: student success outcomes in Course Signals courses]

They are seeing solid double-digit improvements in most cases. What is most impressive to me, though, is that these results persist over time, even after the students stop using the system: 

[Chart: fourth-year retention for students who took Signals-based courses]

Students who took just one Signals-based course are 17% more likely to still be in school in their fourth year of college than those who didn’t. Students who took two or more Signals-based courses are a whopping 24% more likely to be in school in their fourth year than those who didn’t have any. In other words, the technology actually teaches the students skills that are vital to their success. Once they learn those skills, they no longer need the tool in order to succeed. So there is no mystery about what at-risk students need or how technology can help provide it to them. Purdue’s first presentation on Course Signals was in 2006.

This is why I expressed disappointment in my posts about the analytics products from Desire2Learn and Blackboard when, despite obviously following in the footsteps of Course Signals, both products focused on dashboards for teachers. We know that direct and timely intervention is what at-risk students need. We know that this intervention can have substantial and lasting effects. The Purdue model works. I’m sure that either company could sell lots of product if they could credibly claim that they could increase their customers’ four-year retention rates by double digits. But in order to do that, they need to follow Purdue’s example and support direct feedback to the students. This approach, by the way, fits rather well with the MOOC model, where direct instructor intervention is far less likely (or even impossible) on a per-student basis. But Purdue has shown that it also works quite well in a traditional course model and that faculty can get direct benefit and insight from this approach, even if they are not the primary audience.

Can we expect better from the MOOC providers? The early indications are not good. The company and the university, perhaps egged on by the Gates Foundation, rushed courses out for at-risk students with a clearly inadequate focus on getting students to ask for and find the support that they need, despite the fact that poor help-seeking is a huge known risk. Anybody with any experience teaching these populations would cite this challenge as a critical success factor, indeed as the critical success factor. “Fail fast” may be a great mantra for a software startup, but it’s not such a good approach to teaching the students who need our help the most. (Mike Caulfield has a very timely and interesting post on just this subject.) “Measure twice and cut once” might be a better idea. Better still is the saying Steve Jobs liked to quote, attributing it to Picasso: “Good artists copy; great artists steal.” Of course, in order to steal like great artists do, you have to start by recognizing that you yourself are not inventing art.
