
Posts Tagged ‘Adam Cooper’

  1. #Learning Analytics – It’s not what you’ve got, it’s what you do with it that counts.

    July 15, 2014 by Robin Englebright

    This is the first in a series of posts on Learning Analytics, prompted in part by a session at the CETIS conference which investigated the creation of an HE learning analytics policy.

    The session took a practical approach, whilst recognising that this does *not* mean that “ethical, cultural, epistemological, or pedagogical concerns will be brushed to one side, as these are surely essential considerations for an effective strategy.”

    Image: considerations when planning an HE Learning Analytics policy

    To my mind “Learning Analytics” and “Big Data” go hand in hand.
    I’m not interested in whether you believe the data we now have such easy access to is truly “Big”; I think that’s a red herring introduced by statisticians who are miffed at folk trampling over their turf.
    Regardless of how big it is, there is definitely “More” data, more readily available to tutors and students.

    Neither am I interested in the aspect of learning analytics that uses data to justify questionable business practices focussed on making courses financially efficient.

    I *am* interested in ways that technology can be applied to gather relevant emergent metadata, paradata, etc., and to provide tools to analyse the data and identify “Actionable Insights” that improve learning.
    “Actionable Insights” [as defined by Adam Cooper: http://elearning.jiscinvolve.org/wp/2012/12/03/actionable-insights/ ] are what make learning analytics a practical and pragmatic activity rather than a bit of self indulgent graph making.

    When I was at JISC, the collective brain power of the old Innovation directorate identified 9 areas which would inform any activity in learning analytics. I acted as a graphic facilitator in these sessions (I drew pictures about stuff folk said).
    I’m going to use these 9 areas to review opportunities to implement some practical LA tools here at Brighton.


    It’s not what you’ve got, it’s what you do with it that counts.

    So what data have we got?

    We run a hosted version of Blackboard Learn 9.1.April2014 (catchy name), which, like all VLEs, is basically a bunch of web pages fed from a database, or databases.
    In this case it’s a hoofing great Oracle stack.
    Direct access to the live data isn’t allowed, so we have to use the ASR (Advanced Statistics Reporting) database, or “stats”. This is potentially useful for looking at historic data, although it only holds 180 days, making long-term reports infeasible. Short-term reporting isn’t much better, unfortunately, as the data is only refreshed nightly and so is always out of date; a source of frustration for colleagues who need to quickly look up student/course information in order to act in a timely manner.
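    To make that concrete, here’s a minimal sketch of the kind of query you might run against the nightly ASR snapshot. The connection details and the table/column names are assumptions for illustration rather than the real ASR schema, so treat it as pseudocode with a Python accent.

    ```python
    # Sketch: count page hits per course from the nightly ASR snapshot.
    # The DSN, credentials, and the activity_accumulator table/columns are
    # illustrative assumptions, not a documented Blackboard schema.
    import oracledb  # pip install oracledb

    QUERY = """
        SELECT course_pk1, COUNT(*) AS hits
        FROM activity_accumulator
        WHERE event_time >= SYSDATE - 180  -- the snapshot only keeps ~180 days
        GROUP BY course_pk1
        ORDER BY hits DESC
    """

    with oracledb.connect(user="stats_ro", password="...", dsn="asr-host/stats") as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY)
            for course_pk1, hits in cur.fetchall():
                print(course_pk1, hits)
    ```

    Because the snapshot is refreshed nightly, anything built on it is at best a day behind: fine for trend reports, not for the “act today” use cases colleagues keep asking about.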

    There are other routes to the data:

    Directly through the web interface Blackboard provides. This includes tools for ‘instructors’ such as the System Reports, like the “Course Activity Overview”, which displays overall activity within a single course, sorted by student and date, including total and average time spent per user and the total amount of activity each user had in the course. There are also newer tools like the Retention Centre, which applies rules based on student performance to give ‘instructors’ indicators of likely concerns.
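    As a toy illustration of what a Retention Centre style rule boils down to, here’s a short Python sketch that flags students whose activity is well below the course average. The data shape and the 25% threshold are my own assumptions, not how Blackboard actually implements its rules.

    ```python
    # Toy retention rule: flag students whose activity count falls below
    # a fraction of the course mean. The threshold and data shape are
    # illustrative assumptions, not Blackboard's actual rule engine.
    from statistics import mean

    def flag_at_risk(activity_by_student: dict[str, int],
                     threshold: float = 0.25) -> list[str]:
        """Return students with activity below threshold * course mean."""
        course_mean = mean(activity_by_student.values())
        return [student for student, hits in activity_by_student.items()
                if hits < threshold * course_mean]

    # e.g. page hits over the last fortnight
    print(flag_at_risk({"abc1": 42, "def2": 5, "ghi3": 38}))  # -> ['def2']
    ```

    The hard part, of course, isn’t the arithmetic; it’s choosing rules and thresholds that actually correlate with students at risk.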

    Using BIRT, “an open source technology platform used to create data visualizations and reports that can be embedded into rich client and web applications.” I’ve installed the BIRT Eclipse variant but as yet haven’t had time to look at it. In theory it builds queries which can be packaged as .WAR files and plopped into Blackboard as Building Blocks.

    Through web services, but they look SOAPy… and try as I might, I can’t find many redeeming features for SOAP; it all seems needlessly complex and arcane. However, there seem to be some helpful posts out there, mainly from Bruce Lawson, so I will persevere.
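    For what it’s worth, libraries like zeep take most of the envelope-wrangling pain out of SOAP from Python. The sketch below is hedged: the WSDL URL and the operation name are hypothetical placeholders, and Blackboard’s real web services also require their own session/login handshake before anything useful works.

    ```python
    # Sketch: calling a SOAP service with zeep (pip install zeep).
    # The WSDL URL and the 'initialize' operation are hypothetical
    # placeholders; the real Learn services need a session handshake first.
    from zeep import Client

    client = Client("https://vle.example.ac.uk/webapps/ws/services/Context.WS?wsdl")

    # zeep reads the WSDL and generates Python methods for each operation,
    # so a call looks like a plain function call rather than hand-built XML.
    result = client.service.initialize()
    print(result)
    ```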

    What do we do with it?
    Not as much as we could. We do run reports, and have a number of scripts that look up stuff, but my experience is that much of the use is admin and end-of-year board reports.
    We have plenty of data, and a number of tools, so the time is pretty ripe to explore the opportunities to use data in a more timely manner.

    Next time I’ll look a little closer at access to the data, and the types of data.

    For more information on Cetis’ work in learning analytics, consider investigating the Learning Analytics Community Exchange project [http://www.laceproject.eu], run in association with partners including the Open University (UK) and Oslo and Akershus University College.


  2. Analytical engines

    December 11, 2013 by Robin Englebright

    Babbage never built his Analytical Engine… there’s a lesson right there.


    The costs were too high, the technology too primitive, and frankly the proposed end uses were really rather dubious.

    We are in a similar sort of situation with Learning Analytics: the costs of implementing solutions are high, and the technology promises a lot but won’t provide information in a format or structure that lets data/stats-illiterate users make better-than-random decisions… and it certainly provides the potential to justify unethical decisions on the grounds of “financial prudence”.

    I attended the #CDEInFocus Learner analytics and Big data event in Senate House at the University of London yesterday. If you want an insightful review of the topics discussed read Myles’ blog: http://myles.jiscinvolve.org/wp/2013/12/10/740/

    Highlights for me were of course Adam Cooper of Cetis, who gave a practical overview, and Doug Clow of the OU, who talked faster than me. Adam’s SlideShare set says pretty much all you need to know:

    “Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.”

    Doug looked at analytics through the experiences of MOOC participation and drop-out, with useful figures and pretty background pics.

