There are three further events planned, in Birmingham, Bristol and Newcastle, and there are extensive supporting resources online.
The day was crammed with activity and engaging discussion, guiding the delegates through the process of identifying challenges and strategies for change. I took extensive notes with the aim of turning them into a blog summary, but there was so much going on that I realised I wasn’t going to do it justice.
I tweeted some highlights from the event using the #jiscassess hashtag and got a number of responses; probably the most thought-provoking was from @ahiggi:
“@Rob_work not that you need reminding but raise accessibility in assessment. alternative arrangements usually change the nature assessment”
This opened up a huge area of personalisation, with deep discussions around the capacity to provide alternative resources and whether they can really be counted as equivalent. It was echoed by Dr Mark Russell as he talked about the links between assessment and learning.
Mark asked: what sort of things would you do if you wanted to fail more students?
There were many suggestions:
- ask them to work in collaborative groups without guidance on their…
- assess their grasp of a subject based on their spelling and grammar
- don’t explain assessment criteria
- write the assessment in Sanskrit
- provide feedback the day before an exam…
What struck me was how easy it was to identify ways to get students to fail assessments, and how they seem to map to solutions in the JISC view of the benefits of technology:
- Greater potential for dialogue (teacher:learner, learner:learner)
- Immediacy and contingency (interactive tests)
- Authenticity through filmed, simulated or virtual-world scenarios
- Speed and ease of data processing
- Opportunities to allow students to fail productively
- Enabling students to participate in actively assessing themselves.
I’ve rattled on here before about WHY I started using technology in my teaching, but it may be worth reiterating briefly. I used to teach the Entry level provision at an agricultural college, and coordinated assessment on a two-year National Proficiency Test Council Vocational Foundation Certificate scheme. The course was very flexible, aiming to provide a range of routes for students who hadn’t had a great time at school, and was built around a wide range of assessment opportunities.
The assessments were competence-based activities, with performance criteria taken from NVQ units.
They were ENTIRELY observation-based, so the learner could get on with the activity while the burden of assessment sat with the tutors; the feedback was direct and personal.
As the students progressed to full NVQ courses there was an obligation upon them to keep a portfolio of evidence, which they had to map to a full set of elements and performance criteria.
If you can’t write, or read, this is tricky.
It doesn’t mean you can’t effectively milk a cow or plough a field, but writing the evidence was a real issue.
The technological intervention that changed the process for these students was the Sony Mavica camera. It used floppy discs which the students could take out of the camera, meaning one camera could serve many learners, and the discussions it prompted about WHAT they needed to record to prove they could do a task changed the way they viewed the task.