The Sutton Trust has published a fascinating study of the value of an aptitude test (the SAT Reasoning Test™) as a predictor of academic success. The report is of critical interest to anyone concerned with access to higher education and to the graduate professions, such as the law. It is particularly interesting in the context of current debates within both the Bar and the Law Society about the value of aptitude tests. The study is sizeable: it looked at 2,754 students graduating in 2009. Furthermore, it is fair to say the Sutton Trust were hoping that an aptitude test might be a vehicle for improving access to higher education for disadvantaged groups. Similar hopes have been voiced with regard to aptitude tests for the profession.
The primary aim of the study was:
“to examine whether the addition of the SAT® alongside A levels is better able to predict HE participation and outcomes than A levels alone. Two specific issues were also to be addressed, namely:
“Can the SAT® identify students with the potential to benefit from higher education whose ability is not adequately reflected in their A level results because of their (economically or educationally) disadvantaged circumstances?
“Can the SAT® distinguish helpfully between the most able applicants who get straight A grades at A level?”
The study provides detailed, multi-level modelling of data on 2,754 students. It examines their A-level and GCSE attainment, their performance on the SAT® Reading, Writing and Maths tests, and data on their schools (type of school, average performance of those schools). In relation to these particular tests the authors conclude:
“There is no evidence that the SAT® provides sufficient information to identify students with the potential to benefit from higher education whose ability is not adequately reflected in their prior attainment.”
“The SAT® does not distinguish helpfully between the most able applicants who get three or more A grades at A level. The SAT® Reading and Writing components do add some predictive power for some classes of degree at highly selective universities, but add very little beyond the information provided by prior attainment, in particular prior attainment at GCSE.”
In other words, SATs add little to GCSE and A-level grades even at the margins. There is more troubling news for proponents of aptitude tests (if the results for SATs are typical): they may prejudice particular groups:
“Regression analyses showed that female students, some ethnic minorities, students with special educational needs (SEN) and students learning English as an additional language (EAL) appeared to perform less well on the SAT® than would be expected from their GCSE and A level attainment.”
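To make the kind of analysis concrete: the report’s multi-level models treat students as nested within schools and ask whether SAT scores add predictive power once prior attainment is controlled for. The sketch below is my own minimal illustration of that approach – the variable names, scales and synthetic data are mine, not the report’s:

```python
# A minimal sketch of a multi-level (mixed-effects) model of the kind the
# report describes: students nested within schools, degree outcome regressed
# on prior attainment and SAT score. All names and data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, per_school = 50, 20
school_effect = rng.normal(0, 0.5, n_schools)  # school-level random intercepts

rows = []
for s in range(n_schools):
    for _ in range(per_school):
        gcse = rng.normal(6.0, 1.0)           # average GCSE score (toy scale)
        alevel = gcse + rng.normal(0, 0.8)    # A-level score, correlated with GCSE
        sat = gcse + rng.normal(0, 1.5)       # SAT score, also correlated with GCSE
        # Degree outcome driven by prior attainment plus a school effect;
        # note the SAT carries no independent signal in this toy data.
        degree = 0.6 * gcse + 0.3 * alevel + school_effect[s] + rng.normal(0, 1.0)
        rows.append((s, gcse, alevel, sat, degree))

df = pd.DataFrame(rows, columns=["school", "gcse", "alevel", "sat", "degree"])

# Random intercept per school captures the clustering of students in schools.
model = smf.mixedlm("degree ~ gcse + alevel + sat", data=df, groups=df["school"])
print(model.fit().summary())
```

In real data the question is simply whether the sat coefficient differs from zero once gcse and alevel are in the model; the report’s answer, as quoted above, was essentially no.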
There are some interesting points of detail to which universities and firms that look at A-level scores should pay attention. In particular:
- Average A-level and average GCSE attainment are better than total A-level points at predicting attainment at university (see the sketch after this list).
- Male and female students do equally well when prior attainment is taken into account (and we know that female students do far better on undergraduate law courses).
- Students attending more selective universities are less likely to achieve as high a class of degree as students with similar attainment at less selective universities.
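A quick worked illustration of the first of these points, using the old (pre-2017) UCAS tariff (A = 120, B = 100, C = 80) and two invented applicants:

```python
# Total vs average A-level points: a student with many middling grades can
# out-score a student with fewer, better grades on the total, while the
# average ranks them the other way round. Old UCAS tariff: A=120, B=100, C=80.
TARIFF = {"A": 120, "B": 100, "C": 80}

students = {
    "four A-levels, BBCC": ["B", "B", "C", "C"],
    "three A-levels, AAB": ["A", "A", "B"],
}

for name, grades in students.items():
    points = [TARIFF[g] for g in grades]
    total, average = sum(points), sum(points) / len(points)
    print(f"{name}: total={total}, average={average:.0f}")

# four A-levels, BBCC: total=360, average=90
# three A-levels, AAB: total=340, average=113
```

Total points reward quantity, average points reward quality – and the report finds that the average is the better predictor.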
The last of these points is worth dwelling on:
“In almost all subject areas (14 out of 17), the Sutton Trust ‘Top 30’ universities actually gave out proportionally more first class degrees within our sample than the ‘other’ universities, so the UNIdiff finding is not simply due to fewer firsts being awarded at these highly selective universities. It suggests that due to the large number of very able students competing for first class honours, it is more difficult to obtain this classification in highly selective universities than in less selective institutions… To what extent students are already aware of this when applying to universities is unclear and whether they are nevertheless prepared to join such highly competitive environments due to the ‘market value’ of the degrees they obtain when they graduate.”
This finding has echoes for me of US research looking at the importance of ‘prestige universities’ as a predictor of success in the professions.
Given the current debate about the influence of socio-economic class on entry to the profession, it is notable that the research finds state school pupils, once they enter higher education, do significantly better than their A-level grades predict, relative to public (and grammar) school pupils. One way of putting this is that A-level grades exaggerate the abilities of public school students. At least part of the ‘leg up’ that such students receive as a result of the school they attend disappears during university education.
“…although independent and grammar school students disproportionately go to more selective universities they still perform less well than their peers. In one of the models reported later (section 5.1 and model 19 in appendix 3), students from the most highly selective universities (the Sutton Trust Top 30) were excluded, yet the grammar school and independent school coefficients were still significant. In other words students from independent and grammar schools are performing below expectations in other universities, not just the highly selective ones.”
The results suggest that students from state schools with the same A-level grades as those from public schools are likely to perform better at university. Universities, firms and chambers should bear this in mind when they admit students or recruit trainees and pupil barristers. The report writers put it like this: “a comprehensive student with grades BBB is likely to perform as well at university as an independent or grammar school student with grades ABB or AAB.” It is not uncommon to hear recruiters say that public school students get the benefit of a better education and that this endures in their performance. These results suggest not.
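Purely as a sketch of what acting on this might look like in recruitment, here is one hypothetical way of building the report’s finding into a points comparison. The size of the credit, and everything else here, is my assumption – chosen to sit between the ABB and AAB equivalences quoted above – not a recommendation from the report:

```python
# An illustrative (not prescriptive) school-type adjustment when comparing
# applicants, based on the report's suggestion that a comprehensive student
# with BBB performs like an independent or grammar school student with ABB
# or AAB. Tariff values follow the old UCAS tariff.
TARIFF = {"A*": 140, "A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

# Assumed credit of 10 points on the three-subject average for comprehensive
# applicants: this sits between the one-grade (ABB) and two-grade (AAB)
# equivalences quoted in the report. The exact figure is my own assumption.
SCHOOL_CREDIT = {"comprehensive": 10, "grammar": 0, "independent": 0}

def adjusted_average(grades, school_type):
    """Average A-level points plus a hypothetical school-type credit."""
    average = sum(TARIFF[g] for g in grades) / len(grades)
    return average + SCHOOL_CREDIT[school_type]

print(adjusted_average(["B", "B", "B"], "comprehensive"))  # 110.0
print(adjusted_average(["A", "B", "B"], "independent"))    # ~106.7
print(adjusted_average(["A", "A", "B"], "independent"))    # ~113.3
```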
Thinking about the context in which students have been educated is also very important given the tendency of the profession to recruit from elite institutions. The research indicates that it is harder to get a 2:1 or a first from these institutions, but also that “those in independent schools tended to achieve places on more prestigious courses”. Their tendency to achieve such places at prestige universities is an effect independent of their attainment: a student with AAA from a state school is less likely to go to a prestigious institution than someone with AAA from a public school (even though the state school student is more likely to get a 2:1 or a first!). State school pupils are less likely to apply. Furthermore, “More affluent students were more likely to be studying on courses with high entry point requirements than might be expected from their attainment.”
This research is a fascinating indication of what a detailed look at how well student selection works can tell us. The results suggest that educators and professional recruiters genuinely interested in selecting the best candidates should pay close attention to school performance and/or school type. In particular, student performance has to be placed in a context which better predicts likely performance. One possibility is that students from state schools should benefit from more generous entry requirements than students from grammar and independent schools if the best students are to be recruited.
It is important to note that although SATs did not appear to predict future performance in this context, other aptitude tests may yet prove useful in particular contexts. As the report writers note:
“Despite considerable variation in SAT® scores between HE applicants with similar A level attainment, this variation has not proved to be useful in predicting degree outcomes. Other aptitude or admissions tests may also differentiate between applicants in terms of test scores but such differentiation may not be helpful in predicting undergraduate success. Aptitude tests or other admissions tests may well be useful in contexts where other attainment data is lacking. However, the use of any admission test needs to be validated; supported by evidence that it is a reliable and valid measure of HE performance.”
It will be interesting to see how rigorous the Bar’s current testing of aptitude tests is, and also how the Law Society decides to proceed in the light of its current investigations. The Sutton Trust report shows the benefits of proper testing. In the context of university admissions, I would say the view of this report is pretty clear: these aptitude tests are a waste of money.
A key point for me is what the SAT was designed for: to predict success in the first year of an American undergrad education. And it’s very good at doing that. But British education requires a somewhat different skillset from American education. American students are used to SAT-style multiple-choice questions, so there’s less of an issue of getting to grips with exam-taking technique and more about the content of the exam. And remember that there are no other national exams in the US equivalent to A levels/GCSEs. So American universities don’t have that data when they’re selecting their student intake.
A lot of the content of this post strikes me as the inevitable consequence of the British system trying to pretend that everything is equal. *Of course* “better” schools prepare students for exams better than average schools. They’re judged by their results. So it makes sense that there is effectively grade inflation at grammar schools and public schools: students who go there do better on exams than they would if they went to the local comp. If these schools didn’t prepare students to do better, what would be the point in having them?
Similarly, pretending that the standard of education is the same at all UK universities is disingenuous at best. Better universities attract better students and better staff. Even when good staff teach at sub-standard universities, they quickly learn that their students are not up to the level of analysis they undertook as undergrads. You can start off with high expectations, but if they’re not met, you tend to drop them to meet your students’ abilities.
So *of course* a first from a top university represents a higher standard than a first from an average university. Again, if it didn’t, what would be the point of having top quality, internationally recognised universities teaching undergrads?
I have little doubt that many people getting low 2:1s or 2:2s at Oxbridge could get a first or a high 2:1 at an ex-poly with the same amount of effort. But I have equally little doubt that they get more education out of their lower degree result.
Where does this leave the question of opening access (which is clearly important)? I believe the existence of selective schools or at least some form of streaming in secondary school is helpful. Age 11 is probably too early to really identify people’s academic strengths and weaknesses. But getting children into groups of a similar academic ability helps all to thrive (the able children are not held back from reaching their potential and the students who need a bit more help are not frustrated). Some people are late bloomers, and the streaming should be regularly re-examined to ensure people remain at the right level.
And affordability is always going to be a barrier to entry. But that’s a subject for another day.
Thanks very much for the post. My understanding of US SATs is the same as yours – the main point being the absence of good alternative predictors like A-levels. They need something like SATs.
I’m not expert enough to know what the implications of better streaming or selection at 11 would be. My initial reaction, though, is: why bother – why not give lower offers to state school kids? If a comp kid with AAB is likely to do better than a public school kid with AAA, then what’s the problem with saying they’re the better applicant? A meritocracy demands such an approach, doesn’t it?
The point about elite schools being ‘obviously’ better is an interesting and common-sense one. If law schools rely on A-level grades simpliciter – or worse, on variants like total points – then the chances are they are not picking the best students, as the Sutton Report suggests. If this is so, one limb of their claim to ‘eliteness’ fails. There is a second problem. Research in the States suggests that it is performance on your course, rather than the status of your law school, that predicts success in practice. Of course things may be different here, but if it is not so different, this too casts doubt on the common-sense assertion. See here.
Thanks for the follow-up.
From my perspective, part of the benefit of streaming is to keep bright kids engaged. One of my (many) other hobby horses is the myth that “bright kids will do well everywhere so you can ignore them and concentrate on the rest”. But it’s also easier to justify keeping the entrance requirements the same for all students — particularly in areas like London where there is a significant race differential between students attending state and independent schools.
Thanks for the link to the research on US law students. One major difference that I can see is that all US law schools, even the “poorer” ones, are more selective than lower-ranked British universities (e.g. South Bank, where you can study for an LLB with BCC at A level).
I also think the data used is a bit suspect. The ABA Journal article says that it represents students from “40 public law schools”. This would exclude almost all the most elite schools (Harvard, Yale, Columbia etc. all being private). Seven of the top 10 law schools in the US are private, not public. So this would be like looking at results in the UK but omitting Oxbridge because it wasn’t part of your data set.