Tag Archives: definition

What is Learning Analytics and what can it ever do for me?

Putting up definitional fences and looking for connections

#DALMOOC’s week 1 competency 1.2 gave me an excuse to explore some definitions.

As a scientist, I found the insistence on using the term “analytics” as opposed to “analysis” intriguing… The trusty Wikipedia explained that analytics is “the discovery and communication of meaningful patterns in data” and has its roots in business. It is something wider (but not necessarily deeper) than the data analysis/statistics I am used to. Much of it is focused on visualisation to support decision-making based on large and dynamic datasets – I imagine producing visually appealing and convincing PowerPoint slides for your executive meeting would be one potential application…

I was glad to discover that there are some voices out there that share my concern over the seductive powers of attractive and simple images (and metrics) – here is a discussion of LA validity on the EU-funded LACE project website; and who has not heard about the issues with Purdue’s Course Signals retention predictions? Yet makers of tools such as Tableau (used in week 2 of this course) emphasise how little expertise one needs to look at the data via their “visual windows”… The old statistical adage still holds – “garbage in, garbage out” (even though some evangelists might claim that in the era of big data, statistics itself might be dead;). That’s enough of the precautionary rant…;)

I liked McNeill and co.’s choice of Cooper’s definition of analytics in their 2014 learning analytics paper (much of it based on CETIS LA review series):

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data (my emphasis)

It includes the crucial first step in looking at any data in applied contexts – simply asking yourself what you want to find out and change as a result of looking at it (the “problem definition”). And then the “actionable insights” – rather offensive management-speak to my ears – but nonetheless doing something about what you find seems an essential step in closing any learning loop.

The current official definition of Learning Analytics came out of an open online course in Learning and Knowledge Analytics in 2011 and was presented at the 1st LAK conference (Clow, 2013):

“LA is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.”

This is the definition used in the course – the definition we are encouraged to examine and redefine as this very freshly minted field is dynamically changing its shape.

Instantly I liked how the definition is open on two fronts (although that openness seems to be largely in the realm of aspiration rather than real-life practice – not surprising, given the definition’s origins):

1. It does not restrict data collection to the digital traces left by the student within Learning Management Systems/Virtual Learning Environments, so it remains open to data from entirely non-digital contexts. Although in reality the field grew out of and, from what I can see, largely remains in the realm of big data generated by ‘clicks’ (whether in VLEs or games or intelligent tutoring systems). The whole point really is that it relies on data collected effortlessly (or economically) compared to traditional sources of educational data, such as exams. What really sent a chill down my spine is an idea fleetingly mentioned by George Siemens in his intro lecture for this week – extending the reach outside of virtual spaces via wearable tech… So as I participate in the course I will be looking out for examples of marrying data from different sources. I will also look out for the dangers/side effects of focusing on what can be measured rather than what should be measured. I feel that the latter danger may be exacerbated by limiting the range of perspectives involved in the development and application of LA (to LA experts and institutional administrators). So keeping an eye out for collaborative work on defining metrics/useful data between LA and educational experts/practitioners, and maybe even LDs, is also on the list (I found one neat example already via the LACE SOLAR flare UK meet, which nicely coincided with week 1 – Patricia Charlton speaks for 3 min about the mc2 project, starting at 4.40 min. The project helps educators to articulate the many implicit measures used to judge students’ mathematical creativity. Wow – on so many levels!).

2. It is open as to who is in control of data collection (or ownership) and use – the institution vs the educator and the learner. I was going to say something clever here as well, but it’s been so long since I started this post that it’s all gone out of my head (or just moved on). I found another quote from McNeill and co. which is relevant here:

Different institutional stakeholders may have very different motivations for employing analytics and, in some instances, the needs of one group of stakeholders, e.g. individual learners, may be in conflict with the needs of other stakeholders, e.g. managers and administrators.

It sort of touches on what I said under point 1 – the need for collaboration within an institution when applying LA. But it also highlights students as voices of importance in LA metric development and application. It is their data after all, so they should be able to see it (and not just give permission for others to use it), and they are supposed to learn how to self-regulate their learning and all… Will I be able to see many IRL examples of such tools made available to students and individual lecturers (beyond simple warning systems for failing students such as Course Signals)? There was a glimmer of hope for this from a couple of LACE SoLAR flare presentations. Geoffrey Bonin talked about the Pericles project and how it is working to provide a recommendation system for open resources in students’ personal learning environments (at 1 min here). More radically, Marcus Elliot (Uni of Lincoln) is working on a Weapons of Mass Education project to develop a student organiser app that goes beyond the institution giving students access to the digested data, and involves a research project around student perceptions of learning analytics and which institutional and student-collected data they find most useful – data analytics with students, not for students (at 25 min here).

(I found Doug Clow’s analysis of LA in UK HE re: institutional politics and power play in learning very helpful here and it was such a pleasant surprise to hear him speak in person at the LACE Solar flare event!)

The team’s perspective on the LA definition was presented in the weekly hangout (not surprisingly, everybody had their own flavour to add) – apologies for any transcription/interpretation errors:

  • Carolyn (the text miner of forums): defined LA as distinct from other forms of data mining in its focus on the learning process and learners’ experiences. Highlighted the importance of correct representation of the data/type of data captured for the analysis to be meaningful in this context vs e.g. general social behaviours.
  • Dragan (social interaction/learning explorer): LA is something that helps us understand and optimise learning, and is an extension (or perhaps replacement) of existing practices in education and research – e.g. end-of-semester questionnaires may no longer be necessary as we can see everything ‘on the go’. Prediction of student success is one of the main focuses of LA, but it is more subtle than that – aimed at personalising learning support for the success of each individual.
  • Ryan (the hard-core data miner who came to the DM table waaay ahead of the first SOLAR meet in 2011; his seminal 2009 paper on EDM is here): LA is figuring out what we can learn about learners, learning, and the settings they are learning in, to try to make it better. LA goes beyond providing automated responses to students; it also focuses on including stakeholders (students, teachers and others) by communicating the findings to them.
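Dragan’s point about student-success prediction can be illustrated with a toy sketch of a Course-Signals-style “traffic light” early-warning indicator. To be clear: the feature names, weights and thresholds below are entirely my own invention for illustration – they are not Course Signals’ (or anyone’s) actual method.

```python
# A toy, entirely hypothetical sketch of a "traffic light" early-warning
# indicator built from simple engagement signals. All weights and
# thresholds are invented for illustration only.

def risk_score(logins_per_week, forum_posts, assignments_submitted,
               assignments_due):
    """Combine simple engagement signals into a 0-1 risk score."""
    engagement = min(logins_per_week / 5, 1.0)       # cap at 5 logins/week
    participation = min(forum_posts / 3, 1.0)        # cap at 3 posts/week
    completion = assignments_submitted / max(assignments_due, 1)
    # Weighted average of "healthy" signals, inverted into a risk score.
    health = 0.3 * engagement + 0.2 * participation + 0.5 * completion
    return round(1.0 - health, 2)

def traffic_light(score):
    """Map a risk score onto the familiar red/amber/green signal."""
    if score >= 0.6:
        return "red"
    elif score >= 0.3:
        return "amber"
    return "green"

disengaged = risk_score(logins_per_week=1, forum_posts=0,
                        assignments_submitted=1, assignments_due=4)
print(traffic_light(disengaged))  # a low-engagement student flags "red"
```

Even this crude sketch shows why the “garbage in, garbage out” worry matters: the choice of features and weights silently encodes a theory of what engagement *is*, which is exactly the kind of assumption that ought to be negotiated with educators and students rather than buried in code.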

So – a lot of insistence on the focus on learners and learning… implying that there are in fact some other analytics foci in education. I just HAD TO have a little peek at the literature around the history of this field to better understand the context and hence the focus of LA itself (master of procrastination reporting for duty!).

Since I have gone beyond the word count that any sensible co-learner may be expected to read, I will use a couple of images which illustrate the key points rather more concisely.

Long and Siemens’ Penetrating the fog – analytics in learning and education provides a useful differentiation between learning analytics and academic analytics, the latter being closer to business intelligence at the institutional level (this roughly maps onto the hierarchical definition of analytics in education by Buckingham Shum in his UNESCO paper – PDF):

Learning Analytics vs Academic Analytics

I appreciate the importance of such “territorial” subject definitions, especially in such an emerging field, with the potential of being kidnapped by the educational economic-efficiency agenda prevailing these days. However, having had the experience of running courses within HE institutions, I feel that student experience and learning can be equally impacted by BOTH the institutional administration processes/policy decisions AND the quality of teaching, course design and content. So I believe that joined-up thinking across analytics “solutions” at all scales should really be the holy grail here (however unrealistic;). After all, there is already much overlap in how the same data can be used at the different scales. For that reason I like the idea of a unified concept of Educational Data Sciences, with four subfields, as proposed by Piety, Hickey and Bishop in Educational data sciences – framing emergent practices for analytics of learning organisations and systems (PDF). With one proviso – it is heavily US-focused, esp. at >institution level. (NOTE that the authors consider Learning Analytics and Educational Data Mining to belong in a single bucket. My own grasp of the distinction between the two is still quite shaky – perhaps a discussion for another post.)

Educational data analysis taxonomy

I would not like to commit myself to a revised LA definition yet – I shall return to it at the end of the course (should I survive that long) to try to integrate all the tasty tidbits I collect on the way.

Hmm – what was the assignment for week 1 again? Ah – the LA software spreadsheet….oops better get onto adding some bits to that:)


Headline image source: Flickr by Pat Dalton under CC license.

Firming up my PLN definition and the shifting institutional sands

Another Place, a photo by hehaden on Flickr.

With the week 3 twitter #xplrpln chat looming, I think it is time for me to pull a personal definition of PLN together. Here goes:

A PLN is a dynamic and open network of relationships of varied strength and reciprocity which I actively choose, to inspire me, learn from, and share my knowledge with, around topics and projects I am profersonally™ interested in. It stretches across institutional, national and online-offline divides, but its reach and richness are particularly well supported by online social media tools.

Now for the organisational context.

I have lived most of my professional life in Higher Education, most recently supporting students and staff on an online distance Masters programme. So this would be a natural point for me to start thinking about PLN “implementation” (I know I share this interest with some other participants, so maybe I could contribute to the collaborative ‘artifact’:).

Immediately I wonder which group of people should be involved. Are we talking about the tutors, the academic authors, or the admin staff? They would most certainly all benefit from learning about online learning and teaching and expanding their digital literacies – and the PLN way may be the best and most sustainable way to do it. But do they all have the same needs? What about students? Should we not impart the PLN wisdom to them? Is it appropriate in the context of the course subject matter? Would it have an impact on how we design or redesign the courses? Does the openness and connectivism of the PLN approach clash with the team’s teaching philosophy?

Then there is the question of scale – would it even work if we restrict ourselves to the programme team? Or should I think at the scale of all of the online distance programme staff, or all teaching staff? Or just – all staff?

I have also been interested in how the PLN concept can be applied within scientific research environments in academia. This encompasses research academics as well as PhD students. Related – but not quite the same as PLNs in teaching: different structural problems, risks and benefits. But since research has greater power within HE institutions, maybe seeding the PLN ideas there would trickle through to the teaching side?

Phew – already the institution seems like quite a complex beast, where it would be very difficult to apply one PLN “solution” even if we managed to agree on a single definition!

And then I am tempted to venture outside the institutional boundaries (a temptation I seem to share with Helen Crump, who spoke of work as a service in her blog for the week) – boundaries which in any case are becoming increasingly porous. From the student’s perspective, there is certainly much talk of the “unbundling” of higher education – moving from an LP (a degree) to a remixable and personalisable mp3 (a course or a seminar) paradigm, as has happened in the music industry (although I think much of this is really just ed-tech hype). From the staff perspective, the trend towards casualisation of the workforce is abundantly clear – at least on the teaching side. I have been one of those permanently short-term contract employees myself. In this context, what does an organisation even mean, I ask with Helen, and what is its interplay with our PLNs? Does the organisational citizenship (PDF) concept, mentioned by Kimberley Scott in the live session this week, even apply? What are the benefits and drawbacks, for both parties, of plugging the ‘external’ impermanent contractors’ PLNs into the institutional hierarchies?

Now off to put my hand up for the HE group on Google+…it’s always more fun doing things together!

Building a PLN definition in the ambiguity sandbox

Stylin’ in his Sandcastle, a photo by timsackton on Flickr.

This week’s readings have had us explore a wonderful melee of PLN definitions. Add all of the ideas emerging in our discussions and the confusion is now reaching a delightfully productive level.

Our task this week is to come up with a generic definition of PLN and recast this definition in language relevant to our own professional/organisational context. This is of course the first natural step towards completing our summative task for this seminar – an artifact to convince our CEO (or equivalent) who has become enamored with the idea of implementing PLNs that this is in fact either a great or a foolish idea for our organisation.

As Jeff Merrell pointed out during the live session this week – unless we have a general definition we will not be able to tell if the PLNs and organisations are in “structural conflict with each other”. Nor will we be able to determine the kind of changes required in the organisation to effectively incorporate the PLNs to create a business advantage.

So do I believe that PLNs definitions are absolutely “personal” or do I find some commonalities which form “clear defining attributes”? And can I think of disadvantages or barriers to introduction of PLN thinking into organisations?

I largely agree with the list of attributes presented by Jeff in the live session:

  • It is all about relationships/connections. For me this does set the PLN apart from the related concepts of the Personal Learning Environment (PLE), which is a toolset, and Jarche’s Personal Knowledge Management (PKM), which is a set of processes. Jeff asks which relationships – how do we differentiate social from professional, for example? Others asked about the requirement for reciprocity and even mutual awareness (I for one follow many on Twitter or blogs who would not know I even exist – although they can find out if they want to!). Perhaps we should not think of the PLN as a uniform creature. I like Jarche’s conceptualisation of it as a continuum of relationships – ranging from the strong social ties required for collaboration, typical of work teams in organisations (needed to get things done), to the weak and diverse ties of informal social networks focused on cooperation (needed for innovation/getting fresh ideas). He also neatly sneaks in Communities of Practice (CoP) – another professional network-related concept. The importance of, and differentiation between, strong and weak ties is echoed in Rajagopal et al., who attribute the concept to Grabher and Ibert (2008). In real life the model can become more messily dynamic – in a productive PLN the weak ties have a tendency to turn into collaborative projects, in defiance of institutional or even CoP boundaries – a wonderful example of this arising from #etmooc participation was shared by Janet Webster in her #xplrpln blog post today.

Both collaborative behaviours (working together for a common goal) and cooperative behaviours (sharing freely without any quid pro quo) are needed in the network era

Jarche’s model of social links in PKM
  • PLNs require networking AND toolset skills. Rajagopal et al. define networking skills as “an ability to identify and understand other people’s work in relation to one’s own, and to assess the value of the connection with these others for potential future work”. To fully capitalise on the potential of online connections, Alison Seaman points out that “both technical skills and an understanding of the social elements of the Web […] are required for productive social networks—and Personal Learning Networks (PLNs)”. These are often referred to as digital literacies. I think Alison mentions another important skill – a “capacity for self-direction, which requires higher level of learning maturity”. Indeed, Rajagopal et al.’s study confirmed that it is the mature learners who “reflect on their work and learning in a broader perspective than their day–to–day practice” who develop the proactive attitude required for effective PLN-building.
  • PLNs are driven by attitude and intention. This is well illustrated in the Rajagopal et al. networking model. They argue that it takes an attitude to learning and working that sees “each contact as a person to learn from or collaborate with”. I think this is close to what Alison refers to as “understanding the social elements of the web” – it is this open attitude to learning and creating WITH others which is key to success in using online social media for this purpose. The attitude leads to “a professional who intentionally builds, maintains and activates her strong, weak and very weak ties with contacts within her personal network for the purpose of improving her learning”. To the proactive attitude and intention leading to behaviours “activating” your network for help, I would add the willingness to “share” – a key component of Harold Jarche’s PKM. One important point here is that the workplace culture must support such self-direction and the openness to asking for advice/admitting the need for help. Openness to sharing, not just between departments within an institution but especially outside of it, may be a real – and very legitimate – issue for many organisations.

I would like to add one key point to this – which for me embodies the essence of the “personal” aspect of the PLN:

  • Control and ownership. For the PLN approach to be fruitful, “learners need to have a high level of control on tools they use and the way they use them”, according to Rajagopal et al. I would think this also goes for the choice of relationships to develop. Harold Jarche argues in the Knowledge Sharing Paradox that “people will freely share their knowledge if they remain in control of it.” Ownership of the network and its artifacts allows the learner to make it truly personal – but more often than not, large chunks of PLNs remain locked up in institutional enterprise systems, to be left behind as soon as the links with the institution are severed (is it surprising that students make no more than a token effort on their institutional portfolios? Gardner Campbell speaks of this more than eloquently in his piece on personal cyberinfrastructure within the HE context). The choice of audience that comes with sharing beyond institutional walls can also motivate extra effort – just look at the heartwarming story of the incredible improvement in children’s literacy at a deprived-area school in Point England, New Zealand, when they increased their audience from one teacher for in-class assignments to the whole world for their blogs! In agreement with Harold Jarche, I see the issue of control and ownership as the point of highest structural tension between institutional networks and PLNs.

And two minor points (my last gasp – promise;):

  • It takes time, effort and trust (not just between people in the network but also from the employer to workers and vice versa!). Time and effort are invested in building trust and reputation, trimming and reshaping the network, and also in sharing with others. Again – institutions tend to be project-focused and may not see the value of real long-term investment in PLNs.
  • PLNs span the ‘down the hallway’ AND online worlds. Most of our readings specify PLNs to be purely online creatures. But the concept itself arose in the late 1990s, before the internet’s world domination (one possible source is Dori Digenti’s work). To me the offline PLNs are equally important – a feeling shared by at least one other PLN Explorer, Mitra Emad. We should not repeat the mistake that educational institutions made by separating elearning from face-to-face learning; they are now frantically trying to fix the damage by applying the blended learning idea. Well – for me it is all just learning, using different toolkits, each with its own set of advantages and problems. Just like it is all just plain PLN. Certainly, the internet has made it easier to connect and share at a distance. And perhaps social media has promoted a more open, contributory and networked approach. But just because learning and networks are more visible online does not mean they are more valuable than those in the analogue world. Perhaps we should add the ability to blend the online with the offline to the skillset of an efficient PLN-er?
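Jarche’s continuum of ties mentioned above can be made concrete with a tiny sketch. The categories and cut-offs here are my own invented illustration of the idea (strong collaborative ties, CoP-ish middle ground, weak cooperative ties), not Jarche’s actual model.

```python
# A toy sketch of a continuum of ties, from strong collaborative
# work-team ties to weak cooperative network ties. The categories and
# cut-offs are invented for illustration, not Jarche's actual model.

from dataclasses import dataclass

@dataclass
class Contact:
    name: str
    interactions_per_month: int  # rough frequency of contact
    shared_projects: int         # active collaborations

def tie_type(contact):
    """Classify a tie along the collaboration-cooperation continuum."""
    if contact.shared_projects > 0:
        return "strong (work team: collaborate)"
    if contact.interactions_per_month >= 4:
        return "medium (community of practice)"
    return "weak (social network: cooperate)"

pln = [
    Contact("close colleague", 20, 2),
    Contact("CoP member", 6, 0),
    Contact("Twitter follow", 1, 0),
]
for c in pln:
    print(c.name, "->", tie_type(c))
```

Of course, real ties refuse to sit still in such buckets – which is exactly the point made above about weak ties turning into collaborative projects.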

Gosh – I think this is rather enough. It has turned into a monster of a post – and rather theoretical. But rather useful in digesting the ideas…

This post is licensed under Creative Commons Attribution-NonCommercial Share Alike