Quality Assurance or Quality Control?

Just after 7am this morning I was telephoned by a researcher from BBC 5 Live to ask what I thought about the new ‘tables’ tests for Year 4 pupils. Not a great deal at that time of the morning was my first and honest thought. However, early morning phone calls are an occupational hazard for anyone prepared to comment on issues of public interest, and that response wouldn’t do. Some calls of this nature develop into big stories and make headlines; others disappear onto the modern equivalent of the editor’s spike, either dumped or relegated to a footnote in a news bulletin.

Sometimes you don’t get the promised call back but a text message saying that the item isn’t proceeding, due to other stories taking precedence or some similar phrase, as happened this morning. You then wonder whether the point of view you expressed to the researcher was too similar to those everyone else was expressing, and whether what they were looking for was a different view to balance the debate.

On the story about multiplication tests http://www.bbc.co.uk/news/education-43046142 – or ‘checks’ as they are being called – my view is that they should be scrutinised through the lens of whether they are a quality control or a quality assurance measure. If the former, then they are likely to be required of all teachers at the same time. The results then tell us how well that age group is doing on that day. We would possibly expect summer-born children to do less well than those with a longer exposure to schooling, and those that have remained in the same school to do better than those pupils that have already been subject to changing school one or more times. Pupils with a poor attendance record, for whatever reason, might also do less well.

A quality assurance check would allow the DfE both to set an expected level and to help teachers diagnose why those pupils that don’t reach the expected level fail to do so. The DfE might then provide some research into what will work with these pupils to help them reach the standard expected of most children at that point in their education. Such a developmental approach, aimed at helping the system improve, is more expensive than a simple check that will allow Ministers to blame failing schools, and by implication their teachers, through the medium of the Ofsted inspection. If I were in charge of Ofsted, I might want to take the DfE to task for making the job of improving our school system a bit harder if it further reduced trust in the inspection system.

I guess that the DfE cannot afford to spend money on diagnostic tests and a simple pen and paper exercise to be marked by teachers in their own time looks more profitable in terms of political capital.

Take this new test when a pupil is ready; collect the data electronically; and then let the results tell the DfE whether its choice of Opportunity Areas is the correct one, or whether key areas such as South East Oxford City have been consistently overlooked for intervention and extra resources. In this technological age, we need to harness the resources at our disposal to help both teachers and their pupils to learn effectively, not just impose more burdens on everyone.


Clarity ahead of Select Committee – but still not good news

What has become clear this afternoon is that the DfE may have faced a dilemma last autumn. With the national roll-out of School Direct being enthusiastically taken up by schools, it could either have effectively wiped out the university-based PGCE courses by meeting the demands of schools, or it could have denied schools the places they were asking for in School Direct. The DfE’s targets for secondary subjects did not allow a third option of satisfying both the schools applying for School Direct places and keeping the PGCE going while still staying within the targets. The extent of the problem can be seen by comparing Table 2b in the underlying data of Statistical Bulletin 32/2013, issued by the DfE on 13th August, with Figure 1 of the School Direct management information published this afternoon by the National College for Teaching and Leadership. In practice, the DfE seems to have chosen a third way: creating inflated ‘allocations’ to try to keep higher education going while still satisfying the demand from schools for places. This exercise risked substantial over-recruitment against the real targets.

So what happened? Looking just at the STEM subjects, Chemistry had an allocation of 1,327 in the Statistical Bulletin, but a target of 820 places in Figure 1 of today’s document – a difference of 507. To date, recruitment has been 900 according to Figure 1, so the subject is over-recruited against target, but significantly under-recruited against allocations. School Direct, where bids totalled 422 places last November, and reached around 500 by the time all bids had been collected, apparently recruited just 260 trainees, leaving higher education to recruit the other 640.
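The chemistry arithmetic above can be reconciled in a few lines; a minimal sketch (the variable names are mine, the figures are those quoted from the Statistical Bulletin and today’s Figure 1):

```python
# Chemistry PGCE figures quoted above, 2013 recruitment round
allocation = 1327      # 'allocation' in Statistical Bulletin 32/2013
target = 820           # target in Figure 1 of today's management information
recruited = 900        # total recruits to date
school_direct = 260    # trainees recruited through School Direct

print(allocation - target)        # 507: gap between allocation and target
print(recruited - target)         # 80: over-recruitment against the target
print(recruited - school_direct)  # 640: places left to higher education
```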

Sadly, in Mathematics, Physics, and Biology, despite the target being well below the allocation figure, the target has not been met. In Physics the shortfall is 43% against the target; and in Mathematics, 22%. In Biology it is just 6%. However, these percentages do not reflect the actual numbers who have started courses; that number may be greater or smaller than those released today.

Indeed, in no subject was the allocation met, although in business studies it was missed by just one recruit. However, the target in this subject is apparently higher than the August allocation, although that may have something to do with classification. Less clear is the Religious Education position, where the target is shown as 450 but the August allocation was 434 for postgraduate courses. Somewhere another 16 places have been added since August, whereas in most other subjects they have been subtracted.

I have suspected for some time that the allocations were above the level required by the DfE’s model, and have hinted as much in earlier posts. More than 40,000 trainees did seem an excessive number to train.

More interesting is how successful School Direct has been.

| SUBJECT | Target | School Direct | School Direct % of Target |
|---|---:|---:|---:|
| ENGLISH | 1500 | 850 | 57% |
| HIST | 540 | 290 | 54% |
| PE | 780 | 350 | 45% |
| CHEMISTRY | 820 | 260 | 32% |
| MUSIC | 390 | 90 | 23% |
| GEOG | 620 | 140 | 23% |
| MATHS | 2460 | 510 | 21% |
| MFL | 1550 | 320 | 21% |
| BIOLOGY | 740 | 150 | 20% |
| ART | 340 | 60 | 18% |
| OTHER SUBJECTS | 1200 | 200 | 17% |
| PHYSICS | 990 | 130 | 13% |
| IT/CS | 570 | 70 | 12% |
| RE | 450 | 50 | 11% |
| BUS STUD | 230 | 20 | 9% |
| SOC STUD | 180 | 10 | 6% |
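The final column can be reproduced from the first two; a minimal sketch in Python, using a sample of the rows above (subject names abbreviated as in the table):

```python
# (target, School Direct recruitment) by subject, from the table above
figures = {
    "ENGLISH": (1500, 850),
    "CHEMISTRY": (820, 260),
    "PHYSICS": (990, 130),
    "SOC STUD": (180, 10),
}

# School Direct recruitment as a share of the target, rounded as in the table
shares = {subject: round(100 * sd / target)
          for subject, (target, sd) in figures.items()}

print(shares)  # e.g. ENGLISH -> 57, SOC STUD -> 6
```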

School Direct works in subjects where there are plenty of high-quality applicants looking to train as teachers. At the other end of the scale are subjects where either schools didn’t bid for many places, as in Art & Design, or recruitment is a real challenge, as in Physics.

These are the subjects where School Direct faces its greatest challenges for 2014, and where the DfE/NCTL seemingly still cannot do without higher education.

What is also clear is that the DfE cannot repeat the same exercise this autumn for 2014 recruitment. It will have to make clear how many trainees are needed according to the model. Otherwise students will be paying £9,000 in fees without knowing whether their place counts towards a target or merely an allocation, and totally uncertain about their chance of securing a teaching post. That won’t attract many takers in an improving graduate job market: the risks are too high.