50 years of assessment in legal education – liveblog

Am liveblogging the conference as much as I can.  Julian and I up first, slides on the Slides tab.  Whirlwind tour of past & present on the theme of the title, ‘Of tails and dogs: Standards, standardisation and innovation in assessment’.

First up, Craig Newbury-Jones and Nigel Firth, Plymouth U Law School, on ‘Digital assessment for the YouTube generation: reflective practice in 21st century legal education’.  The presentation gave us an overview of innovation and technology-enhanced assessment, and contextualised it in ongoing curriculum changes.  Year 2 students operate as a virtual firm, using f2f and online dialogue and meetings to progress their client’s case.  Allocated tutors act as clients and supervisors.  The module is a prelude to their work-based learning options in the Clinic.  The emphasis is on transferable skills rather than law/procedure (60% or so of their students don’t enter legal practice).  Blogs and file/firm updates lead to incremental development of their file, which case tutors monitor and comment upon.  Part of the environment is SANSspace, where students upload their reflections on their virtual practice.  Originally Sony language lab software — http://sansinc.com.  Interesting use of this software for reflection online — but Craig said the interface was poor and not liked by students.  ELGG was a much better bridge between academic and practical legal learning.  As they say, oral reflection is an employability skill; and it helps to signpost to students what they are actually doing.  I agree.  It’s kind of like a constructed debrief outside the sim, which is so important for learning.  Students love the casework, the combination of lecture and practical sessions has engaged them well, and assessment results have been excellent.

Next up, Dan Hill, head of learning & development, U of Law, on ‘Digital assessment evaluation’.  Assessment is labour-intensive for academics and admins: could digital help streamline the activities, eg the marking process and other pinch-points?  UoL used ExamSoft (ES), which gives students feedback based on assessment criteria.  Advantages — gives students an indication of their performance and areas for improvement.  Feedback is generated from comments made during the marking process, therefore no additional work for markers.  UoL also uses QuestionMark (QM).  It provides Coaching Reports, roughly the same idea as ES.  Also shows a student’s metrics against other students.  That aspect is neat; but worrying, too.  There’s also an assessment summary report for faculty.  What did students think about both systems?  They liked ES better than QM — clearer feedback, and level of detail.  But students seemed to prefer working with paper-based assessments: most agreed that both systems drained the battery life on their laptops, and wanted paper copies of the assessment, rather than having it in digital form only.  33% preferred to type, 67% preferred to write.  So issues there to think about.

Next, Rachel Dunn and Richard Glancey, Northumbria U, on ‘Assessing students through engagement with legal policy’.  No slides.  They talked about a Student Law Thinktank.  Students compose policy papers and responses to consultations put out by Parliament, Select Committees and others (eg on whether we need a new Magna Carta).  Gives students skills such as legal research (different from client-based legal research) and legal writing, more developed here in a policy domain.  Different student groups work together vertically, eg undergrads, LPC students, doctoral students.  Richard wondered if assessment could be developed out of this experience.  So in Civil Liberties students were asked to compile a group-written report, 70%, plus an oral assessment, 30% (if I have that right).  Eg on the Terrorism Act, privacy, compensation for miscarriages of justice.  Students identify an area of law, critique it and propose changes.  If the submissions are high quality, Richard sends them off to the relevant body.  Richard introduced it three years ago — disappointed because students couldn’t work together in groups.  He then introduced PBL to address the issue, as the basis of a group learning method.  The results were ‘unbelievable’ — all the submissions were good enough to send off to the bodies concerned (unlike the first year, when none were sent off by Richard).  This is a real assessment for students, said Richard.  When at Northumbria Richard and I talked about this — it’s an impressive project, and well worth being further developed and disseminated.

Alison Bone is keeping us all to time well.  Jessica Guth, U of Bradford now, on ‘Thinking critically about law: Book reviews in law and society’.  Her aim was to evaluate different theoretical approaches to the study of law.  Assessment via oral comms, written comms and advanced research & analytical skills.  She allowed the students to critically examine a range of books, eg a textbook, an extract from a textbook, a revision guide, etc.  There was an individual presentation of no more than five mins, introducing the text they’d chosen to analyse and outlining its core argument, and a written critical analysis of the same text.  Almost no module fails — just the non-attendees.  Eg of student work — a review of Glanville Williams’ Learning the Law, entitled ‘Learning the Law: What Law’ (pointing out the narrow focus of Williams’ text).  Eg — an analysis of a land law textbook, called ‘This is not my house: Female Asians and land law’.  Nothing about extended families, granny flats, moving flexibly from one house to the next.  Students are achieving good marks.

Nigel Duncan up next, on ‘Prepared for practice? Assessment for the Bar 1975-2015’.  An overview of vintage Duncan stuff.  Legal knowledge, acknowledged by the Bar to be not enough — Hurrah.  He also analysed what was required for General Papers 1 & 2 — lots of cramming, lots of basic work.  3 hour closed book exams in 1975, unseen, mostly requiring application of substantive and procedural law, very little application & skills.  Contrast with the BPTC in 2015.  Closed book assessments are there, but are multiple choice and short answer questions, centrally-set, no question-spotting.  There are open-book assessments in Opinion Writing and Drafting, 3.5 hours, dealing with uncertainty and contingency.  Quality of the drafting can be stunning.  Simulated assessments in Advocacy (civil submissions, examination in chief and cross-examination), Conference skills and two option assessments, eg the Free Representation Unit, where students take on specific types of case.  Constructive alignment?  In 1975 there was a diet of lectures and tutorials, plus compulsory practical exercises in Advocacy, and there was constructive alignment.  In 2014, there’s a varied diet of assessments, and considerable constructive alignment there too.  But the difference in quality between the work students did in 1975 and the work they are doing in 2015 is remarkable.  I like this, because it answers one of the criticisms I have of legal education research, namely the lack of historical comparison between what we do now in legal education and what was done in the past.  Nigel has more confidence in what students are doing now as being quality learning and engagement with quality assessment.  Thoughtful, thorough piece.

Pat Feast next, from Portsmouth U., on appraisal: ‘Appraisal as an effective method of assessment’.  Based on a module called The Practical Lawyer, a 40 credit unit that replaces the dissertation at level 6.  Students work in a number of settings offering legal advice, are trained over a two-year period, and commit 100 hours to a particular project.  Before they go into the project, the students are assessed to ensure they have the necessary skills.  Two assessments in the level 6 project: the work students do for clients, and then an appraisal that incorporates a reflective essay.  The appraisal replicates what the student can expect when employed, asking them to reflect on their performance and think about their future.

Now it’s Clare Sanford-Couch and Jonathan Bainbridge, from Northumbria, talking about a module on legal history where students are involved in setting their assessment, and in marking those assessments.  The Legal History module is a half-module for students in the upper school, a mix of lectures & two hour seminars, assessed by coursework and oral assessment.  Students don’t like formulating the question.  Why was this done?  Because, quoting LETR, ‘it was widely recognised that legal research skills were not sufficiently acquired by the end of the academic stage’, p.44.  And for general transferable skills: analysis, synthesis, report writing, time management, self-monitoring and goal setting.  The oral assessment carries 30% of the mark, 15% from the module tutor, 15% from students’ peers.  This assessment is about presentation skills.  Students seem to like this oral assessment — peer pressure, but also peer support, is part of the deal.  Why do this?  The student is an active ‘doer’, it helps to instil autonomy in learners, interactive classes, transparency in assessment.  But possible pitfalls: doubts about the validity and reliability of peer assessment; student dislike of innovation; student concerns about peer assessment, friends marking each other’s work; inconsistent/arbitrary marking; the role of the tutor.

Final morning session (puh, I’m wilting… but hey, I see from The Guardian live update blog from Melbourne that Andy Murray’s into the final…), Elle Dagilyte, Buckinghamshire New U. & Peter Coe, Aston U., on ‘Developing professionalism via take-home exams: Assessment for learning in law studies’.  What has been developed and why?  Take-home assessment for LLM & LLB students.  Because Law is both an academic and a practical discipline.  There’s an emphasis on professionalism: their institutions and LETR, too.  Their definition of professionalism was interesting, quite comprehensive, too.  Take-homes are used in Canada (McGill), Sweden (Uppsala) and Australia (Melbourne).  In the UK, only LSE uses it.  Challenges and solutions — see table below.  Small cohort on the LLM — 4 students.  Larger on the LLB — 30 students, both international and home.  Students comment that it feels like an exam, but the results showed better performance compared to a prior mock exam.

And that’s the morning session.  I feel like the liveblogger on the Andy Murray game…  Lunch now, then back into the blogging & thinking about the Lord Upjohn this evening.  Busy or what.