George Brandes now, talking about Making Sense of Assessment Data, and in particular Learning from Online Assessment. George is Exec Director of Concord Law School of Kaplan University, LA. Interesting analysis of designing at programme level versus designing at the more detailed, learning-outcome level. He sees the issue as one of moving up and down the scale, from top to bottom and back again. What he was saying reminds me of the Jacob's Ladder sort of diagram in which I tried to express similar ideas, as here.
He did the shut-eyes fold-paper activity — fun. Showed the problems of instruction, etc. Gave examples of personalisation of learning from Kaplan's site, including problem workouts. Assessments — students have a lot of quizzes, MBE-style questions. They get explanatory answers from the online quizzing, with quite detailed feedback too. After the quiz they have a reading assignment, with further quizzes that reinforce what they supposedly read, so they switch between reading and quizzing.
Essays… Feedback is given on 'common problems' and model answers. Three essays with developmental feedback, plus timed essays. They map the outcomes to assessment items so that there is alignment between them. Easy rating tools for staff. Then there is analysis that staff can do on essay outcomes, showing what students have, over the piece, understood of the material to be learned. He noted that the Standards Review process was creating new opportunities.
Next up, Aaron N. Taylor, on Using Existing Data. He's Director of the Law School Survey of Student Engagement (LSSSE). Aaron asked some key questions about ethics, and whether we are fostering environments that enhance respect and democratic engagement. Existing data, he said, is key, and we should use it as much as possible. Law schools have lots of data on students, but don't use it — law schools are data averse. In terms of assessment, schools are also focused on discrete assessments rather than stepping back to a more planned approach. What is student engagement? Time on task, activities, ethical exposure and interactions are what make up engagement. Also, how our inputs are affecting our outputs — resources affecting learning.
LSSSE is a 100-item study, administered online in the spring; 270,000 students at 169 law schools in the US have taken part, with a 50% response rate. Aaron went through aspects of the form with us. I've always admired the line of questions and the research (George Kuh, etc) that's gone into LSSSE. I dearly wish we had LSSSE in the UK rather than the National Student Survey. The feedback to law schools is exceptional. Eg schools can select who their peers are, eg competitors, or even their aspirational peers (ie schools they'd aspire to be like), and compare their data with those schools. The compiled data makes for absorbing reading, and I'm not even faculty at the (anonymised) school in the example that Aaron was using.
The gains that students made were analysed, eg gains in critical and analytical thinking ability, and gains in solving complex real-world problems. Great question: would LSSSEville Law 3L students choose the same law school again? Aaron pointed out how common a positive response was. There will be a new question — knowing all you do now, would you have gone to law school? Very good presentation. Reinforces my belief in the LSSSE process and output.