
Assessment data and report writing in geography

‘Data only becomes effective if it stimulates questions about the actual learning that is taking place and how it can be developed further.’

National Foundation for Educational Research (NFER) p 8 

Topics on this page:

  • The use of data in schools
  • Key information
  • Published data
  • Finding out about the use of data
  • ‘Closing the gap’
  • Writing reports
  • Student portfolios
  • Reading
  • Reference

The use of data in schools

Over the last few decades schools have become increasingly data-rich. They have developed highly sophisticated software systems to handle assessment data, monitor students’ progress, set targets and track pupil data throughout their time at school. 

It has become common for schools to require teachers to provide progress and/or attainment information for a centralised school database. To do this, a geography department needs to carry out some form of assessment from which to generate the data.

This drive for collecting data to track students came from a desire to improve standards of achievement. Gardner et al (2015) explain how this developed:

‘Originally developed in core subjects, these practices spread to non-core subjects and were the result of the relentless pressure to raise standards and for all staff to be accountable for students’ progress. It resulted in the entrenchment of data-driven systems in schools’ management and culture. Unfortunately, under pressure from the needs of Ofsted and successive governments to show standards are rising, many schools were seduced by the availability of ‘big data’ and software packages to track progress by recording student progress through levels and sub-levels.’ (p 7)

However, subsequent research has not proved that using data to monitor and track student attainment means that students learn more. Data management systems often expect attainment to follow an upward, linear path, otherwise students are judged not to be making the required progress. 

We have seen that geography is a subject that encompasses many different aspects and this makes student assessment complex. Progress in geography is rarely linear or made in precise ‘steps’; most students make uneven progress in different aspects of the subject.

As well as failing to make the intended positive impact on students’ progress, it was realised that attempts to create ‘standardised’ data across subjects were flawed because of the differences between disciplines. More importantly, data collection and analysis significantly increased the workload of teachers without the looked-for learning gains. 

In Making data work (2018), the Teacher Workload Advisory Group recommended that schools should not have more than two or three data collection points a year, and that these should be used to inform clear actions. It was also concluded that short classroom tests are not precise enough to measure annual changes in student progress and attempting to identify ‘expected progress’ numerically is not particularly meaningful.

Every school has a slightly different system for collecting and using data. Most schools expect teachers to regularly provide assessment data on the progress of individual students, but approaches are changing. Increasingly, schools are moving away from attempting to use data to provide a grade or to say whether pupils are on track to meet a certain grade.

Schools are re-thinking how they track and report progress. Ofsted no longer requests internal data and schools are using a less centralised approach, leaving departments to use assessment systems that work best in their subject. However, some schools continue to expect teachers to take registers and keep a mark book online to give central access to attendance and progress records for every student.

  • Refer to Biddulph et al (2021) pp. 239-242 ‘Using data to support teaching and learning’ and the chapter by Weeden (2017), which provides a good overview of the use of assessment data in different contexts.

Key information

Every teacher needs some key information about students they are going to teach as a starting point: e.g. information about SEND or disadvantage, information about prior learning and any specific curriculum targets. Some schools use standardised data such as cognitive ability tests (CATs). Read Hoare (2023) pp. 36-7, which outlines how he used such data as a tool when designing his year 8 geography curriculum.

Go through any such class data available for the students you teach with your geography mentor. Discuss how to get access to this data and how you should interpret it. Remember that you are expected to contribute to this information about individual students and need to know about the school expectations and requirements for doing so. This, again, should be discussed with your mentor. It is generally expected today that any data you record about how a student is progressing should be shared with the student and their parents.

Consider how you will manage your time in collecting and analysing data, so you do not become overwhelmed by gathering excessive amounts of information. A good principle is to collect only data that you will use to influence a student’s learning and progress, i.e. data you can make formative use of. 

You should expect to be asked two or three times a year to provide a rounded, professional judgment of student attainment, from which specific development actions can be taken. It is helpful to remember that data should be used to raise questions about students, rather than draw conclusions.

Published data

The DfE publishes data on school performance, including league tables for secondary qualifications with Progress 8, Attainment 8 and EBacc scores. Progress 8 is a value-added measure of the progress pupils make between key stage 2 and their GCSE results. 

It is combined with Attainment 8, which measures a student’s achievement across eight different subjects (geography can be included). Progress 8 is part of a secondary school’s accountability system; it informs parents and students about school performance and is also used by Ofsted.
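As a simplified sketch of the arithmetic involved: Attainment 8 double-weights English and maths, so a pupil’s eight subjects fill ten weighted ‘slots’, and Progress 8 compares the resulting score with an expected score and divides the difference by ten. Note that the real DfE calculation uses published point-score tables and national averages for each key stage 2 prior-attainment group; the grades and expected score below are hypothetical illustrations only.

```python
# Simplified illustration of Attainment 8 / Progress 8 arithmetic.
# Grades are GCSE 9-1 points. English and maths are double-weighted,
# so eight subjects fill ten weighted "slots".

def attainment8(english, maths, ebacc_slots, open_slots):
    """ebacc_slots and open_slots each hold the pupil's best three grades
    in EBacc subjects (geography can be one) and other approved subjects."""
    assert len(ebacc_slots) == 3 and len(open_slots) == 3
    return 2 * english + 2 * maths + sum(ebacc_slots) + sum(open_slots)

def progress8(pupil_attainment8, expected_attainment8):
    """The expected score is the national average Attainment 8 for pupils
    with the same key stage 2 prior attainment (published by the DfE).
    The difference is divided by 10, the number of weighted slots."""
    return (pupil_attainment8 - expected_attainment8) / 10

# Hypothetical pupil with grade 6 in every subject:
a8 = attainment8(6, 6, [6, 6, 6], [6, 6, 6])   # 2*6 + 2*6 + 18 + 18 = 60
p8 = progress8(a8, 55.0)                       # positive => above expected progress
print(a8, p8)
```

A positive Progress 8 score indicates a pupil (or, averaged across a cohort, a school) achieving above the expected level; a negative score indicates the opposite.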

National figures for GCSE and AS/A level results are published annually. See them on the GA webpages for GCSE and A level results together with some analysis of the figures.

The use of statistical data on students and their performance is part and parcel of life in most schools, so you need to get to grips with what will be asked of you. As a new teacher, you will be expected to both use and contribute data for your students.

  • With the data manager in your school, discuss what data is collected on students and how it is used currently. Do they identify any issues in doing this?
  • With your geography mentor, discuss how the geography department contributes to and uses school data.
  • With your mentor or head of geography, discuss how they use the examination results tables as national benchmarks with which to compare their school results.
  • Refer to the NFER quote at the top of this page. How is assessment data used in the geography department to question the effectiveness of the learning and how can it be developed?

Gardner et al (2015) provide some good advice about the realistic use of data:

‘There is considerable evidence of teachers and policy-makers inferring too much from assessment data, or using such data in ways that are neither valid nor reliable. Good assessment requires sound professional judgement on your part: if you know, largely, where your students are ‘at’. It requires skill and competence – as well as a measure of confidence – to adjust marks that have previously been awarded to students as indicators of their levels of attainment. You must tailor your expectations of assessment systems to realistic parameters, and maintain a healthy scepticism about what objective assessment ‘evidence’ seems to be telling you!’ (pp 15-16)

‘Closing the gap’

The original thinking behind using data to improve performance was to close the gap between current attainment and desirable performance. The attainment gap between children from rich and poor backgrounds is detectable before they go to school and widens throughout the education system. For example, students from the lowest-income homes are half as likely to get five good GCSEs and go on to higher education.

An important message from Black and Wiliam (1998) ‘Inside the Black Box’ was that ‘standards can be raised only by changes that are put into direct effect by teachers and pupils in classrooms’. They strongly believed in ‘stimulus and help for pupils to take active responsibility for their own learning, the particular help needed to move pupils out of the trap of “low achievement”, and the development of the habits necessary for all students to become lifelong learners’.

In the decades since Black and Wiliam wrote this, raising student attainment has been imperative for schools. Teacher determination to work at this with effective assessment and good target setting can ‘close the gap’; it does not have to be data-related.

  • Read some school case studies about assessment and target setting to identify ways in which this can be most successful: see Hamson and Sutton (2000), Leat and McGrane (2000) and Howes (2003), each of which describes a department’s approach.

Although all these case studies were developed to meet the requirements of the level descriptions in use at the time, the assessment strategies can equally be applied to teacher-designed benchmarks. 

Hamson and Sutton (2000) describe how an 11-14 school department had considerable success in raising attainment through the use of formative assessment and target setting. They designed assessments that were challenging and open-ended, included a wide variety of assessment methods across the year to cater for all students, and included diagnostic and formative assessment.

Peer assessment and target setting played a key role. The outcome was students who accepted responsibility for their own learning and did this effectively while teachers concentrated on giving formative feedback. Recording was important, but the department changed from a system that was excessively time-consuming and inefficient to one that was more manageable for the teachers.

Leat and McGrane (2000) explored how to assess students’ thinking and set targets to improve it. The authors were keen to avoid making target setting into a data generation and analysis process that was disconnected from teaching and learning. 

They also recognise that for target setting to work you need to translate the desire for better grades into achievable steps for students; and a teacher must use interventions that will encourage improvements in students’ performance. This example clearly illustrates that the point of AfL is not to compare each student’s performance with that of his or her peers, but to determine what the next educational steps should be so that they can make progress.

The problem with target-setting in schools prior to 2014 was that schools were often obsessed with numeric data and with setting students targets to achieve the next level. These geography examples illustrate how departments employed good assessment practice within that context. The removal of levels meant that teachers could move away from numeric targets and focus on setting targets that more clearly articulate students’ next steps in learning, helping to ‘close the gap’.

Writing reports

New teachers need to be able to report on and communicate a student’s progress to parents, both orally and in writing. As a trainee teacher you should arrange to shadow your geography mentor and/or a class teacher when they meet parents of students in a class you teach to discuss students’ progress. 

You should also use your assessments of students’ progress to practise writing reports in the format used in the school for some students you teach. Discuss these with your geography mentor.

Student portfolios

A portfolio is a sample of students’ work that exemplifies standards in geography. Evidence portfolios allow teachers and students to carefully consider evidence for progress. Some schools ask students to keep an evidence portfolio of important pieces of work that demonstrate their achievements. This creates a longitudinal sample of their work to show progress between 11 and 14.

Biddulph et al (2021) encourage you to really look at students’ work, to share this with other teachers, and to consider what kind of geographical knowledge, understanding and skill students demonstrate in the different types of work they produce.

  • See Biddulph et al (2021) p. 235, Task 8.2 ‘Assembling and moderating a portfolio’ and Box 8.6 ‘Compiling an evidence portfolio: a checklist of key points’.


Reading

  • Biddulph, M., Lambert, D. and Balderstone, D. (2021) Learning to Teach Geography in the Secondary School: A Companion to School Experience, 4th edition. Abingdon: Routledge, pp. 265-8.
  • Gardner, D., Weeden, P. and Butt, G. (2015) Assessing progress in your key stage 3 Geography Curriculum (eBook), Geographical Association.
  • Hamson, R. and Sutton, A. (2000) ‘Target Setting at key stage 3’, Teaching Geography, January.
  • Hoare, C. (2023) ‘Well-designed assessment leads to better feedback’, Teaching Geography, Spring.
  • Howes, N. (2003) ‘Setting targets for students’, Teaching Geography, April. This describes a department’s approach to target setting based on end-of-key stage assessment at key stage 3. Although this was written when level descriptions were statutory, it provides useful insights into how a geography department used targets.
  • Thompson, L. (2006) ‘Target setting and target getting in geography‘ in Balderstone, D. (ed) Secondary Geography Handbook, Sheffield: Geographical Association.
  • Weeden, P. (2017) ‘Assessing Geography’ in Jones, M. (ed) The Handbook of Secondary Geography, Sheffield: Geographical Association, chapter 14.



Reference

  • Kirkup, C., Sizmur, J., Sturman, L. and Lewis, K. Schools’ Use of Data in Teaching and Learning, National Foundation for Educational Research (NFER) for the Department for Education and Skills.