Ofsted Data Dashboards: Spot the Difference


In an earlier post I criticised the then recently introduced Ofsted Data Dashboard for punishing Science departments that hold high ambitions for their students:


'The issues raised here have been put to Ofsted and this blog is aware of their response. Ofsted chose not to clarify how the Science percentages published on the dashboard were calculated, so it is still not entirely clear whether they are reporting on outcomes for every unique GCSE entry or for every student entered for GCSE Science. Ofsted confirmed that only EBacc qualifications were included in their calculations for Science and explained that it would be ‘misleading’ to publish Science outcomes based on entire school cohorts instead of exam entries because ‘pupils don’t have to take Science’. It clearly hasn’t occurred to Ofsted that it is also misleading to publish inflated Science GCSE outcomes by effectively discounting from the data large numbers of lower-ability students who are taking GCSE-equivalent Science.

In closing, the Ofsted dashboard is a missed opportunity. If Sir Michael Wilshaw wants to help parents make informed choices and governors improve the Science provision in their schools, it would help if the dashboard looked at the types of courses on offer to students and the amount of curriculum time and resources a school allocates to Science teaching. Instead Ofsted have produced yet another simplistic, target-based accountability tool that will do nothing except encourage faculty heads to do the wrong thing: Biology, Chemistry and Physics will only be offered to the most able children while everyone else will have to be content with an equivalent qualification like BTEC.'

Therefore it is heartening that Ofsted have made some changes to the way Science results are presented on the dashboard:



Although I think that Ofsted need to provide more information if they want parents to make an informed choice about Science provision (for example curriculum time, which varies massively between schools), this is a step in the right direction. It is now harder to present sky-high examination results to savvy parents without anyone discovering the extent to which those results were achieved by entering lower-ability students for BTEC. In other words, 100% pass rates in Biology, Chemistry and Physics are actually not particularly impressive when 60% of the cohort were never entered for EBacc Science in the first place:


Methods for Attaining League Table Altitude


In my last blogpost I complained about the self-aggrandising nature of schools that boast about their headline GCSE results data. My distaste for this practice stems from the obvious limitations of the data as a comparative tool and the ease with which the data is manipulated by ambitious but untalented SMT.
At school level the obsession with headline GCSE data is passed on to subject leaders, who are given a series of numerical targets that have to be met. The most important targets for Science seem to revolve around national averages (usually the % of examination entries gaining grades A*-C) and the prior performance of the department:

1. % of entries achieving grades A*-C in the separate sciences (school prestige and organisations like RAISE online)

2. % of entries achieving at least 2 grades A*-C in two EBacc science qualifications (the Ofsted data dashboard uses this target)

3. % of cohort achieving at least 2 grades A*-C in science GCSEs or equivalent qualifications (league tables)

4. % of students achieving grades A*-C in other single science qualifications such as Science A or Additional Science (organisations like RAISE online compare this data)

For pernicious marketing purposes, data that shows continual improvement is vital, while national averages are built into RAISE online and other accountability tools. Subjects deemed to have 'under-performed' against national averages or previous attainment are therefore likely to be subject to further managerial or regulatory interference. The point is that ever-increasing pass rates and above-average performance are desirable. The problems start when the numbers required are beyond the capabilities of the local system; here subject leaders are faced with three choices: cheat, tell the truth and face the consequences, or chase the target while, arguably, doing the wrong thing by the students. What follows are the three most common methods used by science departments to chase targets:

Method 1: BTEC Applied Science and equivalent qualifications

What is it?

Entering lower-ability students (or, increasingly, any student who isn't guaranteed to achieve a C grade) for GCSE-equivalent qualifications such as BTEC.

Why does it work?

Simply put, it removes lower-ability students from a department's GCSE data and so inflates the percentage of entries gaining A*-C grades. Three quarters of BTEC assessment is coursework, so it is, to use the accepted language, a 'more appropriate' course for some students, and BTEC Science still counts towards whole-school targets (% of students gaining 5 A*-C grades), so it will contribute to ever-improving figures there as well.
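The arithmetic behind this is easy to sketch. The cohort sizes and predictions below are invented purely for illustration, not taken from any real school's data:

```python
# Hypothetical cohort: 200 students, of whom 120 are predicted A*-C.
cohort = 200
predicted_good_passes = 120

# Everyone entered for GCSE: the pass rate reflects the whole cohort.
pass_rate_all = predicted_good_passes / cohort          # 60%

# Move the 80 weakest students onto BTEC instead of GCSE.
btec_entries = 80
gcse_entries = cohort - btec_entries
pass_rate_after = predicted_good_passes / gcse_entries  # 100%

print(f"All entered: {pass_rate_all:.0%}; after BTEC switch: {pass_rate_after:.0%}")
# Meanwhile the BTEC passes still count towards the whole-school
# '5 A*-C grades' measure, so that headline figure rises too.
```

Nothing about the students' actual attainment has changed; only the denominator of the published percentage has.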

What is the evidence of widespread use?

The following photo is from a report submitted to parliament on behalf of the Fischer Family Trust. It illustrates a steady decline in the proportion of students entered for EBacc Science, from 81.8% in 2004 to 63.4% in 2010.

Method 2: Quasi options

What is it?

Selecting KS4 science options for students with national averages for the five science GCSEs (Biology, Chemistry, Physics, Science and Additional Science) in mind.

How does it work?

A Science department's GCSE results are looked at collectively (see targets 2 & 3 above) and separately (targets 1 & 4). Biology, Chemistry and Physics are still considered to be 'elite' courses, and there is pressure on teachers to achieve high pass rates (ideally 100%) for the prestige of the school and to match the corresponding national averages. A comprehensive school is far more likely to meet its targets for all five GCSEs by restricting the separate sciences to the most able and pushing some middle- to high-ability children into double science to bump up the pass rate. The table below illustrates this trick: if four students are at risk of underachieving in Triple Science they can simply be entered for the double award instead, and given that triple award students have to take the same examinations as double award students anyway, this can be done very late in the year. Option A is far more ambitious than option B, where students have been directed to particular courses.

The effects of this are evident in the numbers (see below); more targets were achieved using option B.

Even though there is no difference in the overall proportion of the cohort able to achieve good passes in science, and the overall results for a department are worse using option B, departments will use it anyway because the national averages are the de facto target.
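The trade-off can be sketched in a few lines. All the numbers below are hypothetical (the original table is not reproduced here); they simply show why the separate-science figures improve while the department's overall tally of good grades falls:

```python
# Hypothetical year group: 24 students capable of triple science
# (3 GCSEs each) and 36 taking double award (2 GCSEs each).

# Option A (ambitious): everyone capable of triple takes it.
good_grades_a = 24 * 3 + 36 * 2   # 144 good grades if all pass

# Option B (target-driven): 4 borderline students are moved into
# double award, where a C is considered safe.
good_grades_b = 20 * 3 + 40 * 2   # 140 good grades

print(good_grades_a, good_grades_b)
# Under option B the separate-science pass rates can be reported as
# 100% (no risk of a D in Physics from the borderline four), yet the
# department awards fewer good grades overall and each of those four
# students leaves with one GCSE fewer.
```

The per-subject percentages go up; the students' actual outcomes go down, or at best stay the same.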

What is the evidence of widespread use?

The relatively high good-pass rates at national level for the separate science awards, in comparison to the double award qualifications (see above), suggest that, generally speaking, schools select students for triple science. For AQA Physics, students are only required to cover three more topics than double award students – 13 identical topics are covered by both double award and Physics students. Physics is not innately harder than the double award, which suggests that selection for the subject is motivated by targets more than by concern for students.

Method 3: Limited entry

What is it?

Not entering lower-ability students, or students who fail to achieve a good pass in Science/Core Science/Science A (the first half of the double award GCSE) in Year 9/10, for Additional Science in Year 10/11. This is either planned in advance, so Science/Core Science/Science A is taken over two years, or is a response to examination results, so students repeat a course (or shift into BTEC) rather than being moved onto a new one.

How does it work?

Students who are unlikely to achieve a good pass in Additional Science are removed from the data for that particular subject, which inflates the pass rate and increases the chances of a school beating the national average.
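In effect the entry list is filtered on the previous year's results. A minimal sketch, with an invented set of Core Science grades:

```python
# Hypothetical Core Science grades for ten students.
core_grades = ["A", "B", "C", "C", "C", "D", "D", "E", "B", "C"]

# Only students with a good pass (A*-C) in Core Science are entered
# for Additional Science; the rest resit Core or shift to BTEC.
good_pass = {"A*", "A", "B", "C"}
additional_entries = [g for g in core_grades if g in good_pass]

print(f"{len(additional_entries)} of {len(core_grades)} entered")
# The three predicted fails vanish from the Additional Science data
# entirely, so that subject's pass rate is protected.
```

The students most in need of a second science qualification are exactly the ones the filter removes.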

What is the evidence of widespread use?

There is a huge difference between the entries for Science/Core Science (451,433) and Additional Science (283,391). That said, it is difficult to tell how many of the Core Science entries are Year 10 students and how many are Year 11 students re-taking the course having failed to achieve a grade C the previous year. The culture of re-sits in Science is almost a method in its own right.


When any measure becomes a target there will always be unintended consequences. In Science the national averages are such that it is risky to enter anyone bar the most able for triple science; one cannot risk widespread failure in triple science these days, and some higher-ability students are needed in double science anyway to boost the pass rate. The most ruthless, results-driven departments enter large numbers of lower-ability children for BTEC and choose GCSE options for the rest with the headline figures in mind. This behaviour inevitably contributes to grade inflation at national level, which creates pressure for harder examinations, which in turn forces more people to game the system in the ways described above. The result of all this is a system that overwhelmingly rejects things like rigour and choice – the things that are supposedly important.