
Posts Tagged ‘cetis’

  1. #Learning Analytics – It’s not what you’ve got, it’s what you do with it that counts.

    July 15, 2014 by Robin Englebright

    This is the first in a series of posts on Learning Analytics, which has in part been prompted by a session at the CETIS conference that investigated the creation of an HE learning analytics policy.

    The session aimed at a practical approach, whilst recognising that this does *not* mean that “ethical, cultural, epistemological, or pedagogical concerns will be brushed to one side, as these are surely essential considerations for an effective strategy.”

    [Image: considerations when planning an HE Learning Analytics policy]

    To my mind “Learning Analytics” and “Big Data” go hand in hand.
    I’m not interested in whether you believe the data we now have such easy access to is truly “Big”; I think that’s a red herring introduced by statisticians who are miffed at folk trampling over their turf.
    Regardless of how big it is, there is definitely “More” data, more readily available to tutors and students.

    Neither am I interested in the aspect of learning analytics that uses data to justify questionable business practices, focussed on making courses financially efficient.

    I *am* interested in ways that technology can be applied to gather relevant emergent metadata, paradata and the like, and to provide tools to analyse the data and identify “Actionable Insights” that improve learning.
    “Actionable Insights” [as defined by Adam Cooper: http://elearning.jiscinvolve.org/wp/2012/12/03/actionable-insights/ ] are what make learning analytics a practical and pragmatic activity rather than a bit of self-indulgent graph making.

    When I was at JISC, the collective brain power of the old Innovation directorate identified 9 areas which would inform any activity in learning analytics. I acted as a graphic facilitator in these sessions (I drew pictures about stuff folk said).
    I’m going to use these 9 areas to review opportunities to implement some practical LA tools here at Brighton.


    It’s not what you’ve got, it’s what you do with it that counts.

    So what data have we got?

    We run a hosted version of Blackboard Learn 9.1.April2014 (catchy name) which, like all VLEs, is basically a bunch of webpages fed from a database, or databases.
    In this case it’s a hoofing great Oracle stack.
    Direct access to the live data isn’t allowed, so we have to use the ASR (Advanced Statistics Reporting) or “stats” database. This is potentially useful for looking at historic data, although it is limited to just 180 days, making long-term reports infeasible. Short-term reporting isn’t much better, unfortunately: the data is only refreshed nightly, so it is always out of date, a source of frustration for colleagues who need to look up student or course information quickly and act in a timely manner.
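
    As a rough illustration of the sort of question even a nightly snapshot can answer (“which courses saw the most activity this week?”), here’s a minimal Python sketch. The connection details are placeholders, and the activity_accumulator table and column names are assumptions based on my reading of the Blackboard stats schema, not anything verified against our install:

        # Count activity events per course over the last 7 days from the
        # nightly "stats" snapshot. Connection details are placeholders,
        # and the table/column names are assumptions about the stats schema.
        import cx_Oracle

        conn = cx_Oracle.connect("stats_reader", "secret",
                                 "statsdb.example.ac.uk/BBSTATS")
        cursor = conn.cursor()
        cursor.execute("""
            SELECT course_pk1, COUNT(*) AS events
            FROM activity_accumulator
            WHERE timestamp > SYSDATE - 7
            GROUP BY course_pk1
            ORDER BY events DESC
        """)
        # Top 20 busiest courses this week
        for course_pk1, events in cursor.fetchmany(20):
            print(course_pk1, events)
        cursor.close()
        conn.close()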

    There are other routes to the data:

    Directly through the web interface Blackboard provides. These include tools for ‘instructors’, such as System Reports like the “Course Activity Overview”, which displays overall activity within a single course, sorted by student and date, including total and average time spent per user and the total amount of activity each user generated in the course. There are also newer tools like the Retention Centre, which applies rules based on student performance to give ‘instructors’ indicators of likely concerns. (I’ve sketched how those per-user figures fall out of raw event data below, after this list.)

    Using BIRT, “An open source technology platform used to create data visualizations and reports that can be embedded into rich client and web applications.” I’ve installed the BIRT Eclipse variant but as yet haven’t had time to look at it. In theory it builds queries which can be packaged as .WAR files and plopped into Blackboard as Building Blocks.

    Through web services, but they look SOAPy… and try as I might, I can’t find many redeeming features in SOAP; it all seems needlessly complex and arcane. However, there seem to be some helpful posts out there, mainly from Bruce Lawson, so I will persevere.
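
    In the spirit of persevering, a first step that doesn’t involve writing any SOAP by hand is simply asking the service what it offers. A minimal sketch using the Python suds library; the hostname is a placeholder, and real calls would also need Blackboard’s session initialise/login handshake, which I’ve not shown:

        # Inspect what a Blackboard SOAP service advertises.
        # Hostname is a placeholder; real calls also need the session
        # initialise/login handshake, not shown here.
        from suds.client import Client

        wsdl = "https://blackboard.example.ac.uk/webapps/ws/services/Context.WS?wsdl"
        client = Client(wsdl)
        print(client)  # suds prints the methods and types the WSDL declares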
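
    And to make the “Course Activity Overview” figures mentioned above concrete: given an export of raw activity events, the per-user totals and averages that report shows are a simple aggregation. A sketch assuming a hypothetical CSV with user_id and minutes columns:

        # Recreate the per-user figures from something like the Course
        # Activity Overview, given an assumed CSV of activity events
        # with columns user_id and minutes (both hypothetical).
        import pandas as pd

        events = pd.read_csv("course_activity.csv")
        # Total, average and count of session minutes per user
        summary = events.groupby("user_id")["minutes"].agg(["sum", "mean", "count"])
        print(summary.sort_values("sum", ascending=False))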

    What do we do with it?
    Not as much as we could. We do run reports, and have a number of scripts that look up stuff, but in my experience much of the use is admin and end-of-year board reports.
    We have plenty of data and a number of tools, so the time is pretty ripe to explore opportunities to use data in a more timely manner.

    Next time I’ll look a little closer at access to the data, and the types of data.

    For more information on Cetis’ work in learning analytics, consider investigating the Learning Analytics Community Exchange project [http://www.laceproject.eu], in association with partners the Open University (UK) and Oslo and Akershus University College.


  2. Teaching Technology Timeline

    June 20, 2014 by Robin Englebright

    At the #Cetis14 conference the final keynote, by Audrey Watters (@audreywatters), looked at the history of learning technology, and how it is shaped by folk to suit their point of view.
    It reminded me of a session I ran at the 2012 Jisc Online Conference called “Looking back to shape the future: The History of learning technology in 100 objects…”, which managed to riff on both the popular 80s film AND a (then) popular BBC series. The premise of the session was that sometimes technology changes the way we can work, and the way we can learn.
    The session aimed to record some of the landmarks in teaching technology by creating a collaborative teaching technology timeline using timeline.js (a fantastic bit of scripting).
    I tweeted a link to the timeline and several folk seemed interested, so I thought it might be useful to share the link to the timeline, the Google form, and a recording of the presentation.
    We had a number of submissions, and in the conference session we discussed what it was that made the difference.
    You can add to the timeline using this Google form.
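
    For anyone who’d rather skip the Google form and feed timeline.js directly, it can also read a JSON file. A minimal Python sketch of the sort of structure involved; the field names are my recollection of the classic TimelineJS format (an assumption, so check the docs for your version), and the entry is purely illustrative:

        # Build a minimal timeline.js-style JSON feed by hand.
        # Field names follow my recollection of the classic TimelineJS
        # format (an assumption); the entry is illustrative only.
        import json

        timeline = {
            "timeline": {
                "headline": "Teaching Technology Timeline",
                "type": "default",
                "date": [
                    {
                        "startDate": "1972,1,1",
                        "headline": "PLATO IV",
                        "text": "Plasma touch screens in a computer-assisted instruction system.",
                    },
                ],
            }
        }

        with open("timeline.json", "w") as f:
            json.dump(timeline, f, indent=2)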

