28 June 2022
Reflections on the Edexcel GCSE Python on-screen test
Over 79,000 students have just finished their GCSE exams in Computer Science: 9% more than five years ago. The majority of those students sat two traditional exams on paper, but students sitting the Edexcel Computer Science GCSE had an on-screen practical Python exam.
There’s more to Computer Science than programming alone, but the practical skills of designing, writing, testing and debugging code play a pivotal role alongside the theory of logic, data representation, networks, security and the ethical impact of computing.
The way that a subject is assessed has a significant impact on the way that the course is taught, which then affects how students engage with the content covered.
One of the main reasons that I wanted students at my school to sit Edexcel’s Python on-screen test alongside their theory paper was to prioritise curriculum time for practical problem solving with code, so that programming consolidates and supports the theory rather than being a tag-along extra.
I know that many schools have been waiting to see how those who’ve been involved in Edexcel’s pioneering Python programming paper have fared, so I wanted to share my thoughts in the hope that I can persuade more schools to consider making the switch.
The context
I teach in a non-selective state academy in York with three specialist computing teachers. This year we had three classes of Y11s with a total of 67 students. Any student can opt for Computer Science GCSE, with 5 hours per fortnight throughout Y10 and Y11.
The logistics
Edexcel have provided a free, fully resourced scheme of work, with topics categorised as either Computational Thinking (CT) or Principles of Computer Science (P). We teach the CT and P strands in parallel so that students have one lesson of practical programming each week, one lesson of theory and one lesson per fortnight for assessment, reflection or exploration.
Before their final exams, students had two mock Python assessments under exam conditions: one in Y10 and one in Y11. Restrictions limit how close together students can sit for on-screen tests, but we have enough computer rooms for all students to sit the exam at the same time. All students log in with restricted exam accounts (no internet access).
Students are given a digital and paper copy of a Python reference guide and a folder of Python files to use as a starting point for each question in their printed exam question booklet. They then have two hours to work through the questions. Whilst students are welcome to annotate and highlight their question paper, the only thing submitted at the end of the exam is a zipped folder with all of the students’ code. There’s no justification, explanation, analysis or discussion in paper 2: it’s all about code comprehension, debugging, sequencing instructions and solving problems with code.
The advantages
Just before attending an online “debrief the exam series” event with other Computing teachers, I asked my current Y10 students if they’d like to switch to another exam board where they’d sit two paper-based exams. They’d just sat a challenging mock exam so I wasn’t expecting the unanimous enthusiasm for remaining with Edexcel’s on-screen approach.
Whilst there’s some benefit to being able to write out algorithms on paper (some software job interviews require this), being able to write actual code on a computer allows students to test their code and fix their own mistakes.
The structure of the tests has been designed to be as accessible as possible: the six questions start with simple code comprehension or code completion activities, progress to debugging and re-ordering code, and build up to complex challenges that stretch the most able, involving manipulating data with files, 2D lists and string formatting.
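To give a flavour of what that looks like (this is my own illustrative sketch, not an actual exam question), a later question might ask students to complete a program that reads data from a file into a 2D list and prints a formatted report:

```python
# My own illustrative sketch of a stretch-style task (files, 2D lists,
# string formatting) - not a real exam question.

# Create a small data file so the example is self-contained
with open("scores.txt", "w") as file:
    file.write("Asha,34\nBen,27\nCara,41\n")

# Read each line into a 2D list: [[name, score], ...]
records = []
with open("scores.txt") as file:
    for line in file:
        name, score = line.strip().split(",")
        records.append([name, int(score)])

# Print a formatted report with aligned columns
for record in records:
    print("{0:<10}{1:>5}".format(record[0], record[1]))
```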
The disadvantages
Some schools may struggle with the logistics of running the on-screen tests, although it is possible to stagger start times if you agree a schedule with the exam board.
Uploading student work after their exam was not a pleasant experience, with the predictable and preventable system crashes leading to an anxious time for many exams officers. Hopefully these teething problems will be resolved before next year.
Python is the most popular programming language at GCSE, but schools that buck that trend will not be able to enter students for the Edexcel course, as all questions have to be answered using a subset of Python 3. Edexcel has its own quirky way of laying out Python code which takes some getting used to, but it’s consistent, and I can see the benefits and rationale behind the Pascal/Delphi-like convention of declaring constants and variables at the top of your code, despite that not being strictly necessary in Python.
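To illustrate the convention (my own sketch of the style rather than an extract from Edexcel’s materials), code is laid out with constants and variables declared before the main program:

```python
# A sketch of the declare-everything-up-front layout (illustrative
# only, not copied from Edexcel's published materials)

# Constants
PASS_MARK = 40

# Variables
scores = [34, 27, 41]
passes = 0

# Main program
for score in scores:
    if score >= PASS_MARK:
        passes = passes + 1
print(passes, "student(s) reached the pass mark of", PASS_MARK)
```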
The decision
We won’t know until GCSE results day if the decision to run with Edexcel’s on-screen python test was the right one but I’d encourage any other school to look seriously at the differences and consider switching. I’d love to hear your thoughts and reflections – do let me know via the CAS discussion forum.
Discussion
Our technicians are brilliant: underpaid and underappreciated (hopefully not by me, although they might argue otherwise!) - they’re worth their weight in gold. They wouldn’t want to be involved in invigilating beyond setting up the accounts and copying across the files (remotely, as a bulk job via script rather than individually), being on call during the exam (remotely) to resolve any problems logging in (hasn’t happened yet) and zipping up all of the files (remotely) at the end of the exam. Our next mock is Tuesday so I’m hoping it all goes to plan!
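For anyone curious, here’s a minimal sketch of the kind of bulk job I mean, assuming each exam account has a folder on a network share (the paths and layout here are hypothetical; our technicians’ actual script differs):

```python
# Minimal sketch of a bulk setup/collect job (hypothetical paths and
# layout; not our technicians' actual script).
import shutil
from pathlib import Path

EXAM_ROOT = Path(r"\\server\exam_accounts")       # hypothetical share
STARTER_FILES = Path(r"C:\secure\starter_files")  # hypothetical source

# Before the exam: copy the starter .py files into every account folder
for account in list(EXAM_ROOT.iterdir()):
    if account.is_dir():
        shutil.copytree(STARTER_FILES, account / "exam_files",
                        dirs_exist_ok=True)

# After the exam: zip each account's folder ready for upload
for account in list(EXAM_ROOT.iterdir()):
    if account.is_dir():
        shutil.make_archive(str(account), "zip", root_dir=account)
```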
It does not have to be a technician, but someone (maybe two people, in case one is absent) who has been well briefed, follows all protocols as set out by Edexcel, and can call the technicians if/when needed. The technician just needs to be informed that they are on call for any eventualities. That is my recommendation, but you might have your own system of course.
That’s all well and good, but the issue is where you get these “specialist invigilators” from. If it’s suggesting appointing a technician to do it, we’ve got one for an entire school of ~1100 students, so it would be difficult to redeploy them for this.
We could do with more technicians, yes, but the one we’ve got we only managed to recruit after several rounds of advertising. The pay we can offer is just not competitive for the skill set required, especially in Oxfordshire.
In the FAQs on the Edexcel exams page, they recommend (see below) appointing special invigilator(s) with IT skills who have carefully read the instructions on how to conduct the exam. You can contact them (see link below) for support with invigilation, as Pete mentions above.
Edexcel FAQs re exams page here
Invigilator’s checklist from Edexcel
Pearson’s customer support portal
Eek. That sounds really stressful! To be fair to Edexcel, they provide a checklist for IT technicians, exams officers and invigilators which spells out really clearly what’s expected. We ran a mini briefing for invigilators before the exam so they knew what they needed to do (and what they weren’t expected / allowed to do). It was well worth going through the process for mocks first. There’s not really much that can go wrong on the day during the exam: students do all the work offline. Submitting the work afterwards was unnecessarily stressful (it will be better this year now that we know what to do), but the exams themselves went smoothly. I completely understand why some people would prefer not to do this though.
Some good points! For me it will depend on the computer suite in a school too. My previous school had laptops which were quite temperamental, so I wouldn’t risk it with those. Just something to potentially consider in the future.
I’m not planning to move to Edexcel for GCSE. The reason is that we’re currently with AQA for A-level (on-screen paper 1) and, despite me breaking it down into the simplest possible steps and explaining it to them multiple times, the invigilators have still administered it incorrectly on multiple occasions. In fact, this is part of the reason I’m currently considering moving to OCR for A-level. The invigilators I get allocated for exams just cannot seem to cope with running an on-screen test. I’d rather they did a paper exam than have their chances wrecked by an improperly conducted exam.
I really like the look of the assessment, with easier and more difficult tasks to cater for everyone, and the fact that they can use IDE tools like any programmer can. The limitation to Python might be an issue for a few schools but, to be honest, most schools I know teach GCSE students Python. I agree that doing it on a Friday afternoon doesn’t help.
Are there many schools moving to Edexcel from OCR or AQA?
Thank you for your response! Completely understand not liking to post results online, I wouldn’t either. This was very helpful. I just wanted an overview really as I’ve not known of many who opted for Pearson but the idea of testing python skills is quite appealing. Thanks again!
We had our best results yet this summer: students and staff across the department worked really hard, but I do think that the practical aspect of the course (and assessment) was a big factor. In June 2022 our paper 2 results were better than our paper 1 (both in terms of raw scores and when compared to national averages), which suggests that not every school had the same experience. I do ask the students every now and then if they’d prefer a paper assessment for algorithms and code, and the responses are pretty adamant that they prefer being able to write (and edit, debug and test) real code rather than do an additional exam on paper. It’s a big part of what the students signed up for, and we’ve worked hard to support and resource practical programming throughout the GCSE course alongside the theory. Happy to talk through any questions / specifics with anyone individually: sounds silly but I don’t like publishing grade analysis online.
Thank you this is very helpful. Can I ask how results went? Has it had a positive impact? Or do you think students prefer and/or do better in paper assessments?
We also ran this exam this year and I agree that it was an excellent way to assess pupils’ programming skills. The six-question structure was very effective in that all our pupils knew that they could access the first four questions and expect to score well on them. The final two questions were more demanding, but our most able pupils all said they were able to complete most if not all of the questions. The mark scheme (at least for the sample exam) is fair, and even a limited attempt at the last two questions can gain a number of marks.
My only complaint was that it was the Friday afternoon before half term, which put pressure on the exams officer to ensure all the work was correctly sent to the exam board before leaving - this was not Edexcel’s fault though. In addition, because of when it was sat, our technicians left before the work was submitted, and the exams officer does not have the ability to zip files (network restrictions). An urgent phone call to Edexcel confirmed we could upload without it being zipped in this instance; they were very helpful.
Obviously, we await marks and grade boundaries to know how successful our pupils were but I am confident that it will be better than for other exam boards because of this component.
I just had a look at the sample paper! I like it, I really do.
Now I know where my diagnostic tests will come from.
Students are encouraged to use any of the tools that an IDE provides, as long as they are not in any way collaborative. So the Live Share feature of Visual Studio would have to be blocked, but the whole point is to equip students to use the tools at their disposal; for example, one of the requirements is that students must have an IDE which shows line numbers. The red squiggles and code hints are only useful if a student has been taught to interpret and understand them.
I did enjoy your question about CS teachers being trusted! Can you imagine?! Here’s the checklist for technicians which explains the process. Essentially, the exams officer gets a secure download on the morning of the exam. The technician then copies it directly to the network shares for each exam account and zips them up after the exam. I wasn’t involved at all on the day and wouldn’t want to be. I did turn up to the exams officer’s room at the end of the day to offer moral support (and chocolate), predicting that the upload system would crash (it did).
I can think of ways in which a teacher could conceivably access the code files an hour or two before the exam but they wouldn’t have access to the question paper and they’d surely know that they could expect the same malpractice dismissal as they’d get for stealing a physical exam paper protected by lock and key.
I haven’t spoken to all the Y11s. There’s a huge variety in scores for Y10 mocks. In previous specs I always used to get some students who would leave pseudocode questions blank regardless of how many times we’d go through exam technique (probably says a lot about my teaching…). I think those same students could now feel confident that they could at least do some of questions 1-3. For the most recent Y10 mock (44 students), the lowest score was 1 mark (long term absence and complex issues) and the highest was 69/70 with a fairly linear distribution of marks in between.
At least next year there’ll be some grade boundaries to work with. This year it’s been very difficult to predict outcomes. We shall see!
Thanks, Pete. Gosh! I had a look at the sample paper, and I’m impressed. This is a bold innovation, but - IMO - this is definitely the right direction to go. I liked the broad range of question types, and the number of them - giving you six non-connected programming questions, each with several parts. You obviously don’t know the results yet, but what was the post mortem (as it were) view of the pupils who took it?
A couple of questions for my clarification, please:
Are there any restrictions placed on the tooling used in the exam? I happened to open the .py files in Visual Studio and it immediately added a red squiggle at various points in the code where there were (deliberate) syntax errors - with very helpful tooltips to show what was needed e.g. ‘Expected indent’, “Expected :”, “Expected expression”, “( was not closed”. That could be a significant advantage in an exam over, say, using IDLE. However, even with such help there is still value just in getting pupils to fix these errors (I’ve sketched the kind of errors I mean after my second question below).
What are the logistics, and specifically the timings, for the distribution of the code files prior to the exam - and their installation on the exam PCs? Is this done by CS teachers on trust (that the pupils won’t be shown it ahead of the exam); is it the direct responsibility of the exams officer; or is the code distributed in a password-locked .zip file, and the password revealed on the paper exam (I’m just trying to imagine how it could be done)?
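To illustrate the kind of deliberate errors I mean (a made-up fragment, not taken from the actual paper), with the fixes shown alongside:

```python
# Made-up illustration of the deliberate-error style (not from the
# paper). The broken lines appear as comments; corrected code follows.

# if total > 10          <- "Expected :" (missing colon)
# print("big")           <- "Expected indent" (body not indented)
# result = (3 + 4        <- "( was not closed" (unbalanced bracket)

total = 12
if total > 10:           # colon added
    print("big")         # body indented
result = (3 + 4)         # bracket closed
print(result)
```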
(off by one error!) Yes, the program layout is a bit unusual, but I can see why they’ve gone that route. It certainly helps the markers, and (even though, like you, I don’t agree with the idea of up-front variable declarations) I imagine that it could be helpful to pupils also - encouraging them to work within a standard structure.
Thanks Richard - good question. A criticism of Edexcel’s previous spec was that it had the smallest difference between the number of marks required for a 9 and the number required for a 1, which implied that it wasn’t set up well for weaker students. I think the new style of exams is much better (in terms of language used, range of question difficulty, structure of both papers), but we’ll have to see after results day.
You can see sample assessment papers here if you’re interested. The actual papers were in the same style.
If a school doesn’t want to teach Python at GCSE or hasn’t got the infrastructure to run the live exam then the Edexcel course obviously isn’t for them, but I’m struggling to think of another reason why people might prefer OCR / AQA’s model, other than perhaps continuity of exam board from KS4 to KS5. I’d like to hear a contrasting opinion though.
I’m sorry I’ve not responded to your email yet… I’m working on it
Thanks for taking the time to write this up, Pete. I look forward very much to hearing the feedback, especially once the results are known! Will this result in a repetition of the OCR vs. AQA debate at A-level - i.e. will people end up saying that it works very well for the best pupils and not well for the less able? Or will Edexcel’s initiative have got it right, i.e. found a better way to test practical programming on screen? I haven’t seen the paper (or any mocks) - but your description of it sounds intriguing to me.