[pulledquote]I announced a few weeks back that I would try to quantify the increase in engagement I have noticed when gamifying my Moodle courses. I have just started teaching two similar groups: one has been exposed to the regular course, and the other to a gamified version of it.[/pulledquote]
I will post my findings every week, and will draw a summary at the end of the experiment. This post retraces what has happened in the first week of the experiment.
Note: Don’t expect a huge study. This is an experiment involving two groups of 23-24 students, aged 12 to 13, over an 11-week period. Please keep this in mind when reading my interpretation of the results, and I recommend you think twice before quoting it in an essay 😉
- The course was duplicated
- One course per teaching group
- One group is exposed to the gamified course, the other is exposed to the regular course
- The resources/activities are exactly the same, are presented in the same order and are available at the same time on both courses
- The look & feel of both courses is exactly the same
- In short, both courses are identical, except that one is gamified (i.e. students can collect badges when completing activities) and the other isn’t
[Screenshots: the gamified course (left) and the regular course (right), side by side]

The course on the left has badges for students to unlock; the course on the right doesn’t.
The two teaching groups are relatively similar in terms of gender, age, and academic achievement. I currently work at a non-selective/DSS secondary school in Hong Kong, where most of the 720 students’ first language is not English. All of the students involved in this experiment have access to a computer at home with a high-speed Internet connection. The experiment is carried out on Form 1 students, which equates to Year 7 in the UK system, or 7th grade in the US system.
| Group 1 – Gamified course | Group 2 – Regular course |
|---|---|
| 23 students (14 boys, 9 girls) | 24 students (14 boys, 10 girls) |
Findings from week 1
I am trying to record as much information as possible and, although I have a pretty good idea of what I want to find out, the list of metrics I am analysing is growing organically. Here are some questions I have asked myself so far, some with early answers.
Do some students cheat?
The age-old question! In order to unlock badges, students need to complete activities and view resources. Some of those activities/resources are marked as complete automatically (e.g. when a student views a webpage, a tick is automatically placed next to that resource); others must be marked as complete by the student themselves. The latter opens the door to cheating: a student can tick an activity as ‘complete’ without ever opening it.
Answer: In the first week, 5 students from each group marked activities as done even though they had never visited them. So yes, some students cheat, whether the course is gamified or not.
TODO: I need to look at the length of time between each click. This leads to a question for the future: do some students go on a ‘clicking spree’ to mark as many activities complete as possible?
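For anyone curious what that log query might look like, here is a minimal sketch. The log format below is a simplification I made up for illustration (real Moodle logs carry timestamps, IP addresses and more), but the idea is just a set difference: the activities a student marked complete, minus the activities they actually viewed.

```python
# Hypothetical, simplified log events: (student, action, activity).
# The names and the two-action format are made up for this sketch.
log = [
    ("amy", "viewed", "quiz1"),
    ("amy", "marked_complete", "quiz1"),
    ("ben", "marked_complete", "quiz1"),   # ben never viewed quiz1
    ("ben", "viewed", "page2"),
]

def suspected_cheats(log):
    """Return {student: activities marked complete but never viewed}."""
    viewed, marked = {}, {}
    for student, action, activity in log:
        bucket = viewed if action == "viewed" else marked
        bucket.setdefault(student, set()).add(activity)
    return {s: acts - viewed.get(s, set())
            for s, acts in marked.items()
            if acts - viewed.get(s, set())}

print(suspected_cheats(log))  # {'ben': {'quiz1'}}
```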
Are some students forgetful?
If some students mark an activity as complete even though they haven’t completed it, then the reverse probably happens too: students completing an activity without marking it as done. I queried the logs to find out.
Answer: In group 1, 18 students viewed at least 1 resource without marking it as complete. In group 2, 20 students did the same. This is a lot more than I expected, and again there seems to be little difference whether the course is gamified or not.
TODO: I clearly explained to my students that marking an activity as done implies not only that they have viewed it, but also that they have understood it. I need to find out whether students left activities unmarked because one of those criteria wasn’t met, or because they simply forgot to tick them as done.
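The reverse query is the mirror image of the cheating check: activities viewed, minus activities marked complete. A sketch under the same made-up log format, returning the students who have at least one such activity:

```python
def forgetful_students(log):
    """Students who viewed at least one activity without marking it complete."""
    viewed, marked = {}, {}
    for student, action, activity in log:
        bucket = viewed if action == "viewed" else marked
        bucket.setdefault(student, set()).add(activity)
    return {s for s, acts in viewed.items() if acts - marked.get(s, set())}

# Hypothetical events, same simplified format as before.
log = [
    ("amy", "viewed", "page1"),            # viewed, never marked complete
    ("ben", "viewed", "page1"),
    ("ben", "marked_complete", "page1"),   # viewed AND marked: fine
]
print(forgetful_students(log))  # {'amy'}
```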
Do students spend more time on the gamified course?
I love statistics, so there will be a detailed analysis at the end of the 11 weeks, along with the whole data set in case you want to perform your own analysis. For now, here is a quick breakdown of what happened in week 1, using the ‘Course dedication’ block.
| Time spent on site (hh:mm) | Group 1 | Group 2 |
|---|---|---|
Answer: As you can see, students exposed to the gamified course spent longer on the Moodle course than students exposed to the regular course (+23% on average). It will be interesting to see how this evolves over time.
TODO: I want to have a better overview of what the students are actually doing on the site. Time to find a way to analyse the logs in a meaningful way.
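For the record, the average figure comes from parsing the per-student hh:mm dedication times into minutes, averaging per group, and taking the percentage difference. A sketch with made-up values (not the real week-1 data), just to show the arithmetic:

```python
def to_minutes(hhmm):
    """Convert an 'hh:mm' string to a number of minutes."""
    h, m = hhmm.split(":")
    return int(h) * 60 + int(m)

# Made-up per-student dedication times, NOT the real data.
group1 = ["1:10", "0:50", "1:30"]  # gamified course
group2 = ["0:55", "0:45", "1:05"]  # regular course

avg1 = sum(map(to_minutes, group1)) / len(group1)  # 70.0 minutes
avg2 = sum(map(to_minutes, group2)) / len(group2)  # 55.0 minutes
print(f"Group 1 spent {100 * (avg1 - avg2) / avg2:.0f}% more time on average")
```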
Do students click on more resources/activities on the gamified course?
In light of the previous answer, the answer to this one might sound obvious, but I checked anyway. I simply looked at the Moodle logs and counted the number of clicks in each course, excluding my own, making sure the timeframe was correct.
| Number of clicks | Group 1 | Group 2 |
|---|---|---|
| Average per student | 74.9 | 56.2 |
Answer: Yes, they do, and quite a bit more in fact. On average, a student exposed to the gamified course performed 33% more clicks than their counterparts exposed to the regular course.
TODO: Again, it would be quite interesting to dig deeper into what students are actually doing. I’d also like to see if this levels off over the next few weeks.
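The 33% figure falls straight out of the table above: (74.9 − 56.2) / 56.2. As a quick sanity check:

```python
gamified_avg = 74.9  # average clicks per student, group 1
regular_avg = 56.2   # average clicks per student, group 2

increase = 100 * (gamified_avg - regular_avg) / regular_avg
print(f"{increase:.0f}% more clicks in the gamified course")  # 33% more clicks...
```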
There is probably a lot more to look at, so please write a comment below if there is anything you’d like to find out before this experiment is over, for example any specific data analysis. If you have any questions for my students, I am happy to ask.