Via a combination of thinking about ‘what makes a successful MOOC?’, and looking for a topic for my final project on the Infographics MOOC, I decided to try to pull together the various statistics floating around online about MOOC completion rates. I’m trying to see if any differences emerge on the basis of platform or the assessment methods used.
My draft graph synthesising everything I’ve found so far is here: http://www.katyjordan.com/MOOCproject.html Clicking on any of the data points will pull up a bubble with more information about that course, and a link back to the data source.
(Note: the interactive version of the chart uses JavaScript. It has problems with some versions of Internet Explorer; I’ve found it works more consistently with Firefox. If you are having problems, click the pictures below to view screen grabs, although these may not be as up-to-date as the interactive chart.)
This is off to quite an interesting start, but I need help sourcing more data and categorising courses according to their assessments.
Courses for which I have completion rates, but need more information about how they were assessed, include:
3.091x Introduction to solid state chemistry at EdX, which ended approximately 2013-01
6.00x Introduction to Computer Science and Programming at EdX, which ended approximately 2013-01
Artificial Intelligence (Sebastian Thrun), ran in Autumn 2011
Circuits and Electronics, at MITx in 2012 (first time the course ran)
Circuits and Electronics, at MITx/EdX
Computational Investing, Part 1 at Coursera, ended approx 2013-01
Software as a Service, Coursera, ended approx 2012-06
Gamification, Coursera, ended approx 2012-10
Introduction to genetics and evolution, Coursera, ended approx 2012-12
Introduction to Machine Learning (Andrew Ng), 2011
– Used a combination of MCQs and autograded problem sets
– Used a combination of MCQs and autograded problem sets
– Used a combination of MCQs and autograded problem sets
– Used a combination of MCQs and autograded problem sets
– Used a combination of MCQs and autograded problem sets
– Used a combination of MCQs and peer-graded essays
– MCQs
If you studied on any of these courses, please do post a comment here outlining how it was assessed (just MCQs? Peer graded projects? Or something else?). If you know about any other sources of data about MOOC completion rates (how many registered and how many completed) in addition to the ones already in my chart, please do post a comment too (& a link to the data source), and I’ll add them to the chart. Thanks & looking forward to seeing the picture which emerges!
Pingback: MOOCs for Global Learning? « Rebecca Frost Davis
Great post and project. Some additions to data sources to consider:
– For the Duke Bioelectricity MOOC, use this data source:
Duke_Bioelectricity_MOOC_Fall2012.pdf
– For the U Michigan IHTS, you can use this link:
http://www.slideshare.net/fullscreen/csev/internet-history-technology-and-security-grand-finale-lecture-20121001/7
Pingback: The Most Thorough Summary (to date) of MOOC Completion Rates |e-Literate
Pingback: The Most Thorough Summary (to date) of MOOC Completion Rates | MOOC Feeds from around the World
Great info, thanks Katy.
Great post – I would quickly put this data on a graph with time as the horizontal axis. It is early days, but there might be something interesting in how this is changing over time.
Hi Dr. Chuck! Thanks – looking at changes in enrollment and completion over time would be fascinating. I’m collecting start and end dates in my dataset too, so will probably post something when I have a bit more data – watch this space.
It would be interesting to know about completion rates based on students’ regions – for instance, how many from Asia, Europe, Africa, America and so on.
I only tweeted last week that, if MOOCs are truly massively open, where’s the data? And here it is (at least a lot more of it than I’ve seen before). I would see whether we can engage some of the data viz and open data Twitter godlets to support a campaign where all MOOCs must publish their completion rates, average grade, length of follow and numbers by (week? day? hour?). Participation by country? Increase/decrease from one iteration to another? I too want the data, but I only have thirty-two wonderful, unique, wise, entertaining and slightly fickle followers. I think this calls for the big hitters. So much data – PLEASE let us know the details.
Bryan – I am a small hitter but I do represent one Coursera dot in Katy’s data above. When I gathered data in my Coursera course last year, I shared the non-personal data with my students and several wrote blog posts summarizing the data. I am currently teaching a Python MOOC that I built in my own open framework and you can see its progress live – https://online.dr-chuck.com/map.php?course_id=2 (on a map) and https://online.dr-chuck.com/mapjson.php?course_id=2 (raw JSON). It has some identity data, location data, and grade data. The course is at the halfway point and the completion rate is 56 / 810 – all the data above is opt-in and each student dynamically controls how much they release. So we have our first example of a MOOC with open data and 100% open content – https://online.dr-chuck.com/files/oer/py4inf-002/ – It is a start.
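For anyone who wants to poke at that open dataset, a minimal sketch along the following lines should work. The endpoint URL is the one quoted in the comment above, but the shape of the JSON payload is an assumption, so inspect the response before relying on any particular field names.

```python
# Minimal sketch: pull the opt-in course data from the open endpoint above
# and count the records. The structure of the JSON (a list of per-student
# records, or a dict wrapping one under a "students" key) is an assumption -
# check the actual payload before using any field names.
import json
import urllib.request

URL = "https://online.dr-chuck.com/mapjson.php?course_id=2"

with urllib.request.urlopen(URL) as response:
    payload = json.load(response)

records = payload if isinstance(payload, list) else payload.get("students", [])
print("Opt-in records returned:", len(records))
```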
Pingback: Synthesising MOOC completion rates | MOOCs - sustainability and customer satisfacion | Scoop.it
Pingback: MOOC Data & MoocMoocher (Katy Jordan) | Exploring the Information Ecology
Pingback: Participation and completion of MOOCs | Center for Instructional Technology
Thanks for putting this information together. I think though that you need to look at the definition of ‘completion’ more closely and perhaps make a distinction between meaningful and not-so-meaningful completion.
Pingback: Synthesising MOOC completion rates | Connectivism and Networked Learning | Scoop.it
A few comments on one data point – A History of the World Since 1300 on Coursera… (I’m the unattributed student who figured out the completion rates of the six optional essays.) I would suggest that you remove, or at least comment out, this outlier data point for the following reasons:
1. First, it’s a reference for assignment completion totals, not course completion rate. These are two different things. The number of students completing a given assignment is not the same as the number of students completing the course proper.
2. The big confounder: all essay assignments were optional. The six long-form essays assigned in total, each 750-1000 words, could either be completed… or not. I was providing the completion rate of these assignments in order to get a sense of the active course community population. And, in fact, the #6 essay had only 504 completions (this was not directly referenced in the graph link).
3. The course not only provided no official metrics about completion – it did not define completion (perhaps simply signing up fulfilled the requirements, so we could also argue it had a 100% completion rate).
4. Finally, this course provided no certificate. Apparently this is a Princeton University administrative holdback…
In fact, at the time, we estimated that the active community population was probably somewhere between 10-20x the essay completion rate… including those students continuing to watch the online videos and participate in the forums. I would mention that our in-class estimate actually falls more in line with a linear regression of your other course completion rates.
Hi Michael, many thanks for providing these valuable contextual details 🙂 It’s interesting that Princeton are opting not to issue certificates at the moment – I wonder if they will do so in the future.
Pingback: On MOOC (Massive Open Online Course) completion rates
Pingback: How many students actually complete an online course?
Is there any way to get data on who completed the course? For instance, could a higher completion rate be associated with schools having their already-enrolled students take the course, or with some other organization requiring the course for training purposes? These groups might opt for more auto grading for their own purposes and skew completion results.
Pingback: Synthesising MOOC completion rates | [ gregg festa ] | Scoop.it
Pingback: Synthesising MOOC completion rates | greggfesta::TODAY
Pingback: Synthesising #MOOC completion rates | Voices in the Feminine | Scoop.it
Thank you for doing this. This is very important information. Would be valuable to try and ferret out any patterns that seem to lead to higher retention rates.
I hope no one decides retention rate is more important than letting people stick their toe in and decide “not for me” or “later.” I have completed 16 courses (yikes!) plus downloaded several and dropped out of 5 times that many.
Some I drop to take later, but there are a couple of different courses I want to concentrate on right now. Some I drop because they weren’t what I expected or are far more detailed than my interest in the topic.
Increasingly I’m doing whatever part of a course interests me (instead of going for the “cert”), which might be just the first half, or just the second half, or one of the peer projects, or just hanging out in the forum but skipping the lectures because the topic interests me but not the slant the prof takes.
There are occasional glitches with course design, peer projects or rubrics, quizzes not matching the lectures. Those probably smooth out by the second offering.
Just by the by: Lizzie’s description of how she approaches MOOCs is spot on for me as well; I know it’s only of tangential relevance to this work but thought I’d pipe up, as I think at some point it will be interesting to look at the different types of student taking MOOCs.
Pingback: Enseñanza Universitaria en línea, MOOC y aprendizaje divergente | Aula Magna 2.0
Pingback: MOOC stats – i12LOL!
Katy,
This is a really good piece of work you are doing. For the Autumn 2011 Udacity precursor AI course made/taught by Peter Norvig and Sebastian Thrun, there were weekly homework assignments (8 of them, from memory; multiple-choice questions, machine-marked) and two untimed exams, also multiple choice and machine-marked, one in the middle and one at the end. The final mark (for those on the “advanced track” – basically those learners who elected to do the homeworks and the exams) was based on “the average of your six best homework assignments (30%), midterm exam (30%) and final exam (40%)”. About 400 of the students put up their marks along with some demographic and personal data at https://docs.google.com/spreadsheet/ccc?key=0AsWh-4U3WvLRdFdrQ1BjcnRwY29NYVl0OUpXT0s5X1E#gid=0 and the owner of the workbook provided some views, e.g. an age profile, in other worksheets of the same workbook.
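As a quick illustration of that weighting, here is a small sketch; the best-six-of-eight homework rule and the 30/30/40 split come from the description quoted above, while the example scores are invented.

```python
# Sketch of the advanced-track grading scheme described above:
# average of the six best homeworks (30%), midterm (30%), final (40%).
# The example scores passed in below are made up for illustration.

def ai_course_final_mark(homeworks, midterm, final):
    """All inputs are percentages (0-100); homeworks is a list of 8 scores."""
    best_six = sorted(homeworks, reverse=True)[:6]
    homework_avg = sum(best_six) / len(best_six)
    return 0.30 * homework_avg + 0.30 * midterm + 0.40 * final

print(ai_course_final_mark([90, 85, 70, 95, 100, 60, 88, 92], midterm=80, final=75))
# -> 0.30 * 91.67 + 0.30 * 80 + 0.40 * 75 = 81.5
```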
In case it is of interest, there is a series of posts about the course here: http://fm.schmoller.net/ai-course/ (you have to scroll down a bit to get to the reports proper).
Seb Schmoller
Hi Seb, glad you have found this interesting and thanks for the information about the AI course – I’ve updated the chart to reflect the way it was assessed.
Glad to see your research online. I will be using your data as part of a course on “Academia and the MOOC” being offered by Canvas Network in April-May 2013. https://www.canvas.net/
Great graph. It might be useful to add the actual number of students who completed the course. Yes, it corresponds to the area between the dot and the (0,0) point, but it can sometimes be hard to compare areas.
I also wish I knew the dropout rate over time (how far people who did not complete the course got), but I understand this data is almost never available.
Gamification gave us a 20-minute post-course video with lots of graphs showing participation of various kinds over time. Yes, all that data is collected. I guess business school profs love statistics and graphs, yes?
Perhaps most profs aren’t interested in that level of detail? Others have told the forum they aren’t sure they are allowed to give any stats and won’t tell us the number enrolled, or completing, or a demographic summary, nada. Some have promised a post-course email with numbers, but I have not received those except from my first course.
Love this chart! Thanks for the intriguing info.
Hi Larry, thanks for commenting – I’ve now added the actual numbers of students as a column in the data table view (which can be found by clicking ‘browse and compare all data’). The dropout rate over time is available for a handful of the courses – some of the courses have provided quite detailed breakdowns of the data – so this can be found in some cases in the data sources linked to from the chart. Hope this helps!
More stats from the email sent to us by World Music – note that the number “still active at end of course” is significantly more than the number who did the required peer projects. Raises a question of what is “completion” – videos only? Coursework too? “certificate”? The number who won the “cert” is probably in the 2000+ range – I know from forum discussions some who did all the work missed the “certificate” by a point or two (ouch!).
Dear Listening to World Music Students,
Some have asked for stats: here are some from the Coursera site:
Course Dashboard
Total Registered Users 36295
Active Users Last Week 3859
# Unique users watching videos 22018
# Unique users submitted (quiz) 1671
Peer Assessments Total Submissions 8077
# Unique users who submitted 2731
# Unique users who evaluated 2191
[there were 6 assignments, the grading allowed one to be missed]
Thank you all!
Listening to World Music Course Staff
[PS I LOVE when courses give us lots of the stats. Most I was in last year did, this year none.]
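To make the “what counts as completion?” question concrete, here is a small sketch computing a few alternative “completion rates” from the dashboard figures quoted in the email above; the ~2,000 certificate count is only the forum estimate mentioned in the comment, not an official number.

```python
# Sketch: alternative "completion" rates for Listening to World Music,
# using the dashboard figures quoted above. The certificate count (~2,000)
# is only the rough estimate from the comment, not an official figure.
registered = 36295
active_last_week = 3859
watched_videos = 22018
submitted_quiz = 1671
submitted_peer_work = 2731
estimated_certificates = 2000  # rough forum estimate

for label, count in [
    ("active in final week", active_last_week),
    ("watched any video", watched_videos),
    ("submitted a quiz", submitted_quiz),
    ("submitted peer assessment work", submitted_peer_work),
    ("earned a certificate (estimate)", estimated_certificates),
]:
    print(f"{label}: {count / registered:.1%} of registrants")
```

Depending on which denominator and which activity you pick, “completion” for the same course ranges from roughly 5% to over 60%, which is exactly the ambiguity the comment raises.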
You don’t list Greek & Roman Mythology. The post-course Screenside Chat #5 said 55,000 students (obviously a rounded number). The post-course email said “We are very pleased to see that over 2,500 of you earned a certificate of accomplishment, and that 2200 of those certificates are with distinction. We are grateful to everyone who participated, and we hope that you learned what you set out to learn in the course.” Those are the only stats I have for that course. It used mostly weekly quizzes, also two peer projects, but a student only had to complete one of them.
Hi Lizzie, glad you have found this interesting and many thanks for the data – I’ve added Greek and Roman Mythology to the chart 🙂
Katy, I’m happy to see our exhibit visualization framework being used for such an interesting purpose. However, I see you are linking to an old version of the code (on static.simile.mit.edu). That machine isn’t all that reliable; if you want to be sure that your visualizations keep working, I encourage you to switch to the more up to date and reliable versions—either version 2.2 or version 3.0—currently being hosted at http://www.simile-widgets.org/exhibit .
Also, regarding another comment, note that exhibit does offer time-series visualizations as well, should you want to look at enrollment over time: http://simile-widgets.org/exhibit/examples/redsox/rivalry.html
Hi David, thanks for this – I’ve switched it over to 2.2 now. There doesn’t appear to be a chart extension for version 3.0 at the moment (http://api.simile-widgets.org/exhibit/3.0.0/extensions/) – is it in the pipeline? Will definitely have a play with the time series too.
Why have you struck out the names of the courses? Are you backing off the data in your original blog post and graph?
Hi cvconnell, the courses struck out in the post are ones for which I had enrollment/completion info but was looking for information about how they were assessed. I crossed each one out when I found that information – the link after each one is the source.
While I was enrolled in the Coursera Fantasy & Science Fiction course, the number of essays submitted each week dropped at a steady rate:
Week 1: 5091 essays
2: 3751
3: 2976
4: 2634
5: 2325
6: 2055
7: 1863
8: 1752
9: 1475
10: 1405
A fit (r-squared = 0.99):
Essays submitted that week = -1564.5 × ln(week #) + 4896
This MOOC’s half-life is therefore about 4 or 5 weeks.
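For anyone who wants to reproduce that, here is a rough sketch of fitting the same logarithmic curve and reading off the “half-life” (the week at which the fitted curve falls to half of the week-1 total); it assumes only the weekly numbers listed above.

```python
# Sketch: fit essays-per-week to a + b*ln(week) and estimate the week at
# which submissions fall to half the week-1 count, using the numbers above.
import math
import numpy as np

weeks = np.arange(1, 11)
essays = np.array([5091, 3751, 2976, 2634, 2325, 2055, 1863, 1752, 1475, 1405])

# Least-squares fit: essays ~= b * ln(week) + a
b, a = np.polyfit(np.log(weeks), essays, 1)
print(f"fit: essays ~= {b:.1f} * ln(week) + {a:.1f}")   # roughly -1564 * ln(w) + 4896

# "Half-life": week at which the fitted curve reaches half the week-1 value.
half = essays[0] / 2
half_life_week = math.exp((half - a) / b)
print(f"half-life ~= {half_life_week:.1f} weeks")        # roughly 4-5 weeks
```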
Hi Dan, many thanks for sharing this 🙂 Do you know how many people were enrolled on the course? Was it entirely assessed by essays – did you have to complete them all to get a certificate? Cheers, Katy
Info for the ‘Data Analysis’ course ( https://www.coursera.org/course/dataanalysis ) which recently finished at Coursera:
“There were approximately 102,000 students enrolled in the course, about 51,000 watched videos, 20,000 did quizzes, and 5,500 did/graded the data analysis assignments.”
This information came from the wrap-up email sent at the end of the course. According to its page at Coursera, it used a combination of auto and peer grading.
Completion rates are apparently not pass rates – for the MOOC Computing with Data Analysis course you list 5.4%, which is apparently the number who completed all quizzes and assignments, not the number who got certificates. Could you please indicate how many got certificates?
Hi Mary, there is some variation in the data depending on the course – I’ve used the number of certificates wherever possible, but some sources (such as the Computing with Data Analysis course) have only provided the number who completed assignments. I’ve added a filter to the chart now so the data can be filtered according to the type of completion data that is available. If I find out how many certificates were issued for the course, I will update the chart 🙂
I have completed all the classes I’ve taken so far, but I DON’T do any assessments. Are you counting those of us who are listening to everything, reading everything, but don’t care about certificates?
Hello, Katy. I’m teaching the Stat 2X series on EdX. Love the clarity of your presentation. My blog stat2x.blogspot.com has completion data for Stat 2.1X as well as a stem-and-leaf plot of the completion rates in your table, with copious references to your work. Thanks for putting the data together. Much appreciated. Cheers, Ani
Hi Ani! I’m glad that you have found the chart interesting and for your kind words on your blog 🙂 Thank-you for sharing the information about your course – I’ve added it to the chart. Could I ask, how was the course assessed? Thanks! Katy
Please fix the unreasonably high number of functional problems with the HTML and JavaScript in the graph.
Next, test said HTML and JavaScript against the major browser versions you intend to support, or list the browser versions the graph should display correctly on.
Lesser problems to fix would include the spelling and grammar.
Pingback: On (Barely) Not Being a MOOC Dropout | thehomeschooledgradstudent
Pingback: Synthesising MOOC completion rates | Creativity and learning | Scoop.it
Pingback: H817open: A Tale of Two MOOCs | littlegreycellsblog
Nice one Katy. Sobering to think that pounds to peanuts there’s an engineer at Coursera (etc) who has exactly this data sitting on their desktop and refreshing at the push of a button. I presume the platform providers have been asked (and declined) to provide the deidentified data? Let’s hope your work shames them into coughing up.
Hi Katy – Remarkable effort. Do we know what % of MOOCs are covered in this analysis? i.e. do you get a sense of whether these examples represent 10% or 90% of current MOOC offerings? Do let me know? Dom
Hi Dom, Thanks for your comment 🙂 Last time I checked (March 14th), according to http://www.class-central.com/#pastlist , 100 Coursera courses, 11 EdX courses (including one MITx course), and one Udacity course (although this is a bit of an anomaly, as Udacity courses operate on a continually-running basis so don’t ‘finish’ as such, I think) had fully run and finished. So, excluding the one course in the chart which finished after March 14th, the dataset includes 25% of completed MOOCs on the major platforms up to that point (28 out of 112). Hope this helps!
Thank you very much for compiling and posting this information!
Since enrolling is free, MOOC ‘enrolment’ figures are unsurprisingly huge, but I think ‘enrolment’ should be taken more as a mere ‘expression of interest’ than enrolment in the classic sense of the term. To better evaluate MOOC completion rates, I think it would be useful to take the number of students completing week 1 as a reference point too. Otherwise, some courses elicit huge expressions of interest just because of the reputation of the providing institutions or the attractiveness of the subjects.
I notice that a good number of students in my MOOC have explored 1 or 2 particular modules and done some discussion there but ignored the rest of the content. They would be seen as not completing the course, but I suspect that they may have completed exactly what they wanted from the course.
I think that we need to examine not only what completion might mean, but also whether that is really a goal for many students registering for MOOCs, and if we should be measuring courses by measuring it.
As an educator, I have also registered for MOOCs in subjects I teach with no intention to complete assignments or tests, but with an interest in seeing what others are doing in their courses. “Lurker” is also a term that may not apply to MOOCs in the same way we have used it in the past for online courses or forums.
Good point, Ken. I agree. ‘Completion’ in MOOCs has its own meaning. Several metrics would help to paint a more complete picture of the enrolment-to-completion journey; the completion/enrolment rate is only a partial view of it.
I’m also interested in what’s in it for universities. I see MOOCs as a way for universities to raise awareness of their name and profile among a global audience. In that sense, much is achieved just by being out there on a MOOC platform, let alone by catching the attention that enrolments demonstrate; having good numbers of students completing the courses is the cherry on the cake.
On a separate note, once all measuring caveats are taken into account, completion rates also demonstrate the attractiveness and success of individual courses, whether this is due to the subject itself and/or the course design and the instructor’s ability to engage. In a way, MOOCs also serve as testing grounds for universities.
Hard to know how to measure completion in any sense other than “certificate.” Some courses I download and will watch “later” (does later ever come?). There’s no way to measure which downloads are actively watched and which not.
In one course I was interested in only a third of the lecture topics. If I dropped into a classroom to listen to a third of the lectures, I wouldn’t be thought of as completing the course, even though I had completed the part that I cared about. Personal use of a MOOC can certainly be less than “doing the whole course”, but that doesn’t mean it should be counted as “completing the course.”
The real issue is that no one should get hung up on “completion rates” as the only measure of the value of a MOOC. This amazing course structure invites broader kinds of partial participation than expensive paid/credit instruction does, and comparisons need to remember that “it’s all fruit but it’s not all apples” to stay meaningful.
I agree about not getting too hung up on completion rates for courses not associated with any certificate, credit or advancement.
I think that initial registration should require some demographics data and ask about intent. If you knew at the outset that 60% of the registrants did not plan on completing all the assignments or modules, I would include that in my data analysis at the end.
Katy – there is, I think, a wrong link in the main post under the Thrun/Norvig 2011 AI course. I just finished Keith Devlin’s “Introduction to Mathematical Thinking” Coursera course. Here is some data (I can send you a 23-page PDF of the page from which I am deriving it):
1. It uses a combination of automarked assignments and peer-based assessment.
2. Over 27,000 enrolled on the course – obviously a lot of these were speculative enrolments.
3. By week 8 there were 4,410 learners still active.
4. 878 learners submitted the final exam.
Seb Schmoller
seb@schmoller.net
Hi Seb, I’d been having trouble tracking down the origin of the often-cited AI course statistics, but have just found the transcript of Peter Norvig’s talk on your blog which is a great source and have updated the link to this. Many thanks for the information about the Introduction to Mathematical Thinking course – I’ve added it to the chart. I would be interested in reading the full document too – my email address is katy dot jordan at open ac uk. Thanks again! Katy
Stats just now given for A Beginner’s Guide to Irrational Behavior
(Dan Ariely). Total registration: 142839 (https://docs.google.com/spreadsheet/ccc?key=0AgaKfhaKCg2OdG9WWWk2YjZLbW1uWFVSS0JOeFdCenc#gid=3)
Total certificates 3829 (from chart posted by prof at https://class.coursera.org/behavioralecon-001/forum/thread?thread_id=4900)
The level required for a “certificate” was 85%; there was a lot of required reading (research papers); and the peer project grading approaches reported by some to the forum suggest rubric problems resulted in a number of active participants being kicked out of the certificate game late in the course.
I just completed the xMOOC “A Beginner’s Guide to Irrational Behavior”, which was offered by Dan Ariely (Duke University) on Coursera, https://www.coursera.org/course/behavioralecon. It comprised auto grading for quizzes and peer grading for writing assignments.
I took the numbers below from two forum posts. Unfortunately they’re not available openly (as far as I know).
These numbers are from the final statistics:
==
Total users graded: 142,369
Users who received a Statement of Accomplishment: 3,892
==
These numbers are from a post about two weeks before the deadline for the final exam (some people seem to have un-enrolled since then, because the number of users graded is a little lower than the following number of total registered students):
==
142,839 Total Registered Students
the number of (unique) students currently registered for this session, excluding those who have unregistered
82,008 Total Active Students
the number of (unique) students who have ever logged on to the session site
63,238 Number of Participants
the number of (unique) students who’ve watched at least one video since the start of the class (either streaming or downloaded)
29,849 Number of Participants (quiz)
the number of unique users submitting at least one quiz
42,222 Number of Participants (video)
My note: I think this refers to the number of people who answered an in-video non-graded quiz
6,030 Number of Participants (submitting)
the number of (unique) students who’ve submitted work for at least one peer assessment since the start of the session
5,778 Number of Participants (evaluating)
the number of (unique) students who’ve provided at least one evaluation for another student’s work since the start of the session
5,638 Number of Participants (posting)
the number of (unique) users who have posted at least one forum post once since the start of the session
2,906 Number of Participants (commenting)
the number of (unique) users who have commented on a forum post at least once since the start of the session
5,823 Number of Participants (voting)
the number of (unique) users who have voted on a forum post at least once since the start of the session
==
I hope you can use these numbers.
Best Regards
Oliver
Hi Lizzie and Oliver, many thanks for sharing the information about the Irrational Behaviour course – I’ve just added it to the chart 🙂
I’ve also added all of the University of Edinburgh Coursera courses which ran this Spring, as they released a very detailed report last week, which can be found here: http://tinyurl.com/cbznws3
Thanks again! Katy
Hi Katy
Open2Study (www.open2study.com) is a new Australian platform offering free online education and we’ve got some exciting completion rates that we’d like to share with you.
The worldwide average for MOOC completion is 7%; our first ten classes have recently delivered completion rates up to 25%, which is more than triple the average.
I am Paresh Kevat, Learning Analytics Specialist at Open2Study. Please contact me on paresh.kevat@open.edu.au so we can provide you with more details.
Cheers
Paresh
From the Announcements page of the just-completed Coursera course Mathematical Biostatistics Boot Camp, which started April 16, 2013 : “Of the 21,916 enrolled, 319 of you earned a statement and 1,768 of you earned a statement with distinction.” (I was a student in this class).
Hi Rebecca! Many thanks for sharing this information 🙂 Could I ask, how was the course assessed? I’ve got in my records that when it ran previously it just used auto grading and no peer grading – was it the same this time? Thanks!
Hi Katy,
The class was entirely auto graded this time too. Thanks.
Pingback: Me and my MOOCs - Geocaching Librarian
I recently completed Pattern-Oriented Software Architectures, taught by Doug Schmidt of Vanderbilt University. Quizzes and the final exam were auto-graded, essays and programs were peer-graded. Students could earn either of two Statements of Accomplishment; their criteria were:
Standard Track (auto graded):
8 quizzes worth 90% of the grade
1 final worth 10% of the grade
Distinction Track (auto graded and peer graded):
8 quizzes worth 35% of the grade
10 essay questions worth 20% of the grade
5 programming assignments worth 35% of the grade
1 final worth 10% of the grade
We needed to achieve 70% or above to pass either of these tracks.
Source: https://class.coursera.org/posa-001/wiki/view?page=syllabus
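As an illustration of how those Distinction Track weights combine, here is a small sketch; the weightings and the 70% pass mark are taken from the criteria listed above, while the example scores are invented.

```python
# Sketch of the POSA "Distinction Track" weighting described above:
# quizzes 35%, essays 20%, programming 35%, final 10%; pass at >= 70%.
# The example scores below are made up for illustration.

def distinction_track_grade(quiz_avg, essay_avg, programming_avg, final_exam):
    """All inputs are percentages (0-100)."""
    return (0.35 * quiz_avg + 0.20 * essay_avg
            + 0.35 * programming_avg + 0.10 * final_exam)

grade = distinction_track_grade(quiz_avg=85, essay_avg=70,
                                programming_avg=75, final_exam=65)
print(f"{grade:.1f}% -> {'pass' if grade >= 70 else 'fail'}")  # 76.5% -> pass
```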
Dr. Schmidt was commendably open with his students about the class progress, and posted these final statistics:
Total Registered Students 30979
Total Active Students 20180
Students who met the “Distinction Track” criteria = 592
Students who met the “Standard Track” criteria = 1051
Total Threads on the Discussion Forum 1250
Total Posts 4334
Total Comments 3424
Source: https://class.coursera.org/posa-001/class/index
Hi William! Thank-you for sharing this detailed information – I have added it to the chart 🙂 Many thanks! Katy
Gamification spring 2013. From the post-course video.
Overall, 5,592 students received a passing score, out of 66,438 enrolled in this session. 8.4% got the certificate. (9% did the final peer project.)
Note: the announcements page says “62,373 enrolled in this session”, while the post-course video about two weeks later says 66,438.
Hi Lizzie! Thanks once again, I’ve added the 2013 Gamification course to the chart 🙂 The statistics in Professor Werbach’s video are also interesting, as it’s the first time I’ve seen mention of numbers for the signature track option – http://www.youtube.com/watch?v=E8_3dNEMukQ&feature=youtu.be .