Last summer 29.6% of students taking A level Physics gained an A* or A grade in the examination. However, just 10.6% of students taking Media, Film and TV Studies achieved the same grades. It’s worth recalling these figures when reading the reports of grade inflation in universities, with more students than ever achieving First class honours degrees. (Source for A level data: http://www.bstubbs.co.uk/a-lev.htm; source for university data: HESA.) Agreed, the extra 4,000 students studying Physics at A level in 2016 compared with 2010 may be partly responsible for the 3.5% decline in A* and A grades over the same period, but that is to be expected with a widening of the pool of entrants into the examination. However, the top grade is open to all. Maybe there is some degree of selection here, with only those needing the subject for university traditionally taking it at A level.
So, does the increase in student numbers at universities mean there is grade inflation, and should more students mean greater numbers of lower grades? In the end it depends upon what you want the marking system to achieve. Traditionalists may want a normal distribution curve of outcomes, with a bunching around the middle grades and only limited numbers expected to achieve the highest grades or to fail. This system is great for identifying the really high flyers, but does it disincentivise everyone else? Should degree class reward hard work, and are students working harder now that they have to bear the cost of their university education through the fee system? Has a competitive job market through the years of the recession also signalled to students that outcomes, rather than just the university experience, matter? This takes us back to the A level results. Are there too many A* and A grades in Physics? Of course not.
Perhaps students are becoming pickier at choosing both courses and even modules within courses, with a view to outcomes? To what extent does ‘drop out’ among students affect the outcomes of those that remain? Do students who realise they selected the wrong course, perhaps during clearing, quit in larger numbers? We know students from poorer backgrounds are more likely to quit. Is this because they received poorer advice about which course to pick at which university and ended up doing the wrong subject?
There is much more to explore behind the simple headline data. But maybe there has been some grade inflation, and university quality control mechanisms need to ensure that outcomes keep pace with learning. After all, that is what the external examiner system was supposed to achieve. What do these figures also say about the claim that A levels were being dumbed down and that students were arriving at university knowing less and being less well equipped for university life? Interestingly, I had a conversation on LinkedIn about this point with a teacher in Essex recently.
Personally, I think the outcomes are a tribute to our students, but universities do need to ensure that they monitor their learning outcomes to keep pace with changes elsewhere.