Next up was the session on the Content Validity Study (CVS). David Boyd introduced the subject by focusing on validity — assembling evidence that justifies the decisions made on the basis of test scores, and evidence that the test actually measures what it intends to measure. Bar exams are licensing tests, he pointed out; and according to the Standards, licensing tests focus on a candidate's skill and knowledge in a particular domain. The standard of the Standards, as it were, is the focus on the knowledge and skills of an entry-level practitioner. Who could fault that? But in other jurisdictions the arguments go further than this, to attitudes and values — more of that in my session on globalization.
The content domain for the Bar Exam is not what law schools teach, but what new lawyers need to know in order to practise. There is an overlap, obviously. How is this determined? By job or practice analysis, followed by decisions on question formats for the exam. I think Boyd missed out a few steps between the job analysis and the question analysis, and rather presumed that the Bar Exam format would remain unaltered by the results of the consultation — which in other jurisdictions is not the case at all (eg the consultation in Scotland).
Diane Bosse spoke, and pointed out that exam content was defined by experts, drafting committees and expert reviews — but never by practitioners. The job analysis included newly practising lawyers, recording race, ethnicity and gender; questions covered knowledge, skills, deficiencies observed, etc. Participants logged their activities for one week, resulting in over 1300 activities. A draft survey was constructed and piloted: participants were given lists of general lawyering tasks, tasks specific to particular areas of practice and other areas, and asked to rate them according to categories of significance.
Diane gave us information on the final survey and its general lawyering task categories — comms, research, investigation, etc. Interesting results: many areas of law and skills were untested in the current Bar Exam, eg information literacy and IP. The survey also asked respondents to categorise skills according to significance — eg oral/written comms, e-researching, issue spotting, listening, decisiveness, advocacy, resource management. Part 5 consisted of demographic and biographic questions to facilitate analysis.
Results are in today, so analysis is still to be conducted. They expect to find substantial confirmation of what they currently test and how they test it. Changes? Legal research, civil procedure, and analysing the appropriate test specs for the exam programme, eg test content, item type, test format, and weighting of components. Diane ended by saying that the changes will be measured in their implementation, and that protection of the public was a key issue throughout.
There were interesting issues and contrasts here with what appeared in the Scottish Law Society consultation.
At question time Susan Case made good points comparing this with changes in medical education and practice assessment in the 1990s — eg the Uniform Medical Examination, which she was involved in — and drew a parallel between the CVS and that earlier process in medicine. She took questions well on the size of the survey population (not that critical), and on a sceptical question about the nature and content of change. She observed (and this is true of most jurisdictions and all professions) that protection of the public was an essential focus for an evaluation such as the Bar Exam, and that particular attention therefore had to be given to newly-qualified sole practitioners, rather than those newly-qualifieds who entered large law organizations and received training on the job. The same was true of medical sole practitioners.