I’d met some of the group last year at EduWiki 2012 (my thoughts on that are at: EduWiki 2012), and my talk built heavily on the work I did at Cambridge on the ORBIT project – creating a platform for OER on interactive teaching, particularly in STEM subjects – as well as more recent work related to my PhD. In particular, I talked about some things I’ve covered in previous blog posts on:
I’ve put some thoughts below on particular aspects of the event. In the long run I think there are some interesting questions around how Wikimedia sets and meets its targets. One thing I was thinking about yesterday was whether we need to start treating the MediaWiki platform as a classroom tool in the same way Google has pushed Google Docs – it’s a good way to encourage brand affiliation and familiarise people with your tools (and get them off Microsoft’s – who of course do exactly the same thing). Tools like MediaWiki – particularly given that they are open source, very flexible, and allow a lot of interesting pedagogic and analytic things to happen – might be especially amenable to the sort of ‘technology for pedagogy‘ things I talked about not so long ago.
Learning Analytics for Learning Wikipedia
There are interesting times ahead with the development of a Wikimedia VLE for training, and The Wikipedia Adventure (also for training). One thing I was keen to suggest was that if the VLE training modules had learning outcomes that could be operationalised into activities within a wiki – either a training environment on the VLE, or linked to Wikipedia contributions themselves – then we could engage in some learning analytics on that data, and perhaps even develop a badging system. That’d be cool because, for example, we might see what sorts of pages a user interacts with – perhaps primarily Commons, or articles in the main Wikipedia namespace, etc. – and what sorts of activities they’re doing (minor edits, updating references, adding media, etc.), and build on that resource knowledge and user knowledge to make suggestions for further training, and to identify areas of strength and weakness.

The primary target for this sort of thing is noobs, but if we want genuinely user-contributed stuff, I think engaging more experienced Wikipedians as users is crucial too. If they come in and think “oh, well I can do this and that”, or try modules and find they’re bored, the fatigue dropout will be high. So much the better if people could be “pre-accredited” – not by completing the modules on the VLE, but by checking the learning outcomes for particular modules (granularity will matter) against their user contributions. This might also encourage more experienced users to learn new skills (for example, I’m a competent contributor, but I know nothing about templates – perhaps I should learn), and could flag cases where people think they have a skill, but their editing suggests they might actually be missing something.
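As a rough sketch of what checking learning outcomes against contributions might look like: the MediaWiki API exposes a user’s edit history (via `list=usercontribs`), and a first pass could simply bucket those edits by namespace and edit type, then compare the result against what each training module is meant to evidence. To be clear, the module names, the bucketing heuristics, and the `profile_contributions` / `suggest_modules` helpers below are all hypothetical illustrations, not existing Wikimedia tools; only the shape of the contribution records (namespace number `ns`, `minor` flag, edit `comment`) mirrors the real `usercontribs` output.

```python
# Hypothetical sketch: profile a user's MediaWiki contributions to suggest
# which (made-up) training modules their editing shows no evidence of yet.
# Each contribution record mimics the shape of MediaWiki's
# list=usercontribs API results: namespace number 'ns', a 'minor' flag,
# and an edit summary 'comment'.

from collections import Counter

# Standard MediaWiki namespace numbers (these are real).
NAMESPACES = {0: "article", 4: "project", 6: "file", 10: "template", 14: "category"}

def profile_contributions(contribs):
    """Bucket edits by namespace and edit type; return activity counts."""
    activity = Counter()
    for c in contribs:
        activity[NAMESPACES.get(c.get("ns", 0), "other")] += 1
        if c.get("minor"):
            activity["minor_edit"] += 1
        # Crude heuristic: edit summaries mentioning refs suggest sourcing work.
        if "ref" in c.get("comment", "").lower():
            activity["referencing"] += 1
    return activity

def suggest_modules(activity, thresholds):
    """Suggest (hypothetical) modules whose evidenced skill falls short."""
    return [module for module, (skill, needed) in thresholds.items()
            if activity.get(skill, 0) < needed]

# Fabricated example contributions, for illustration only.
contribs = [
    {"ns": 0, "minor": True, "comment": "typo fix"},
    {"ns": 0, "minor": False, "comment": "added <ref> to support claim"},
    {"ns": 6, "minor": False, "comment": "uploaded diagram"},
]

# Hypothetical module -> (evidenced skill, minimum edit count) mapping.
thresholds = {
    "Working with templates": ("template", 3),
    "Citing sources": ("referencing", 1),
    "Uploading media": ("file", 1),
}

activity = profile_contributions(contribs)
print(suggest_modules(activity, thresholds))  # → ['Working with templates']
```

In practice the contribution records would come from a call like `api.php?action=query&list=usercontribs&ucuser=SomeUser&format=json` rather than being hard-coded, and the heuristics would need to be far richer than keyword-matching on edit summaries – but even something this crude shows how “pre-accreditation” could be driven by contribution data rather than module completion.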
Learning Analytics for Learning in Wikipedia
Of course, I’m also interested in how we can develop learning analytics for learning in Wikipedia (or, at least, in MediaWiki environments), and I’m starting to think about how we could set up some experimental environments to teach critical evaluation skills, and to explore people’s epistemic commitments in both MediaWiki and more structured (e.g. EvidenceHub) environments. More on that another time though! Exciting times ahead.