The holiday season has just ended. For many education professionals, it offered a brief moment to pause: to catch their breath after intense exam periods, or to prepare for the next round of assessments that quickly follows the holidays.
For me, the start of the new year feels like a natural moment for reflection. Much like a Plan-Do-Check-Act (PDCA) cycle, it invites us to look back and look ahead at the same time. What steps have we taken? What challenges did we encounter? And which good intentions can help us organise assessment even better in 2026?
In conversations with assessment coordinators, examination boards, teachers, and educational specialists, the same themes keep returning. You may recognise them too.
1. Improving the assessment development process
Many education teams experience their assessment development process as fragmented. Questions live in separate documents, test blueprints go missing, and it is not always clear which version is the most up-to-date. This not only causes uncertainty but can also lead to inconsistencies in assessment quality.
A valuable intention for 2026 could be to take a fresh look at the development process. What works well? Where does it get stuck? And which steps could help create more clarity, calm, and collaboration? Think about agreements on feedback moments, responsibilities for review, or version management. Minor process improvements can make a big difference.
2. Structurally safeguarding assessment quality
The desire to strengthen assessment quality is something I hear often. Teams are well aware that assessments should be reviewed and updated regularly, but in practice, this does not always happen. Time pressure, team changes, or other priorities often get in the way.
This year is a good time to schedule regular review sessions to go over assessment questions together, not only from a content perspective, but also to check whether they still align with learning objectives. Student performance data can also be a valuable input. Analysis results help you better understand why a question does or does not work. Discussing this together not only improves assessment quality but also strengthens the sense of shared responsibility within teams.
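To make the idea of "analysis results" concrete, here is a minimal sketch of a classical item analysis. The response data is entirely hypothetical, and the upper/lower-half discrimination method is just one of several common approaches; it computes, for each question, a difficulty value (proportion of students answering correctly) and a discrimination value (how well the question separates stronger from weaker students).

```python
# Minimal item-analysis sketch. The response matrix is hypothetical:
# rows are students, columns are questions, 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

n_students = len(responses)
n_questions = len(responses[0])
totals = [sum(row) for row in responses]  # each student's total score

results = {}
for q in range(n_questions):
    scores = [row[q] for row in responses]
    # Difficulty: proportion of students who answered this question correctly.
    difficulty = sum(scores) / n_students
    # Discrimination: difference in success rate between the top and bottom
    # half of students, ranked by total score on the whole assessment.
    ranked = [s for _, s in sorted(zip(totals, scores), reverse=True)]
    half = n_students // 2
    discrimination = (sum(ranked[:half]) - sum(ranked[-half:])) / half
    results[f"Q{q + 1}"] = (difficulty, discrimination)
    print(f"Q{q + 1}: difficulty={difficulty:.2f}, discrimination={discrimination:+.2f}")
```

A question with very low difficulty (almost nobody answers it correctly) or near-zero discrimination is a natural candidate for discussion in a review session: it may be poorly worded, miskeyed, or no longer aligned with the learning objective.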
3. Organising and updating the question bank
Many institutions have built extensive collections of questions over the years. Without a clear structure, however, this wealth of content can turn into a maze. You know the good questions are there, but you cannot always find them.
A meaningful intention could be to maintain the question bank actively. This does not have to be a large-scale project all at once; even cleaning up a single theme or module can make life easier for you and your colleagues. By agreeing on how questions are categorised, updated, and managed (for example, using additional metadata), you gradually build a reliable and future-proof question bank.
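What "additional metadata" might look like is easiest to show with a small sketch. The fields and values below are purely illustrative assumptions, not a prescribed schema; the point is that agreeing on a handful of shared fields makes questions findable and their review status visible.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical metadata record for one question in a question bank.
# Field names and statuses are illustrative, not a standard.
@dataclass
class QuestionRecord:
    question_id: str
    text: str
    module: str               # theme or module the question belongs to
    learning_objective: str   # which objective the question assesses
    question_type: str        # e.g. "multiple_choice", "open", "case"
    status: str = "draft"     # e.g. "draft", "reviewed", "approved", "retired"
    last_reviewed: Optional[date] = None
    tags: list = field(default_factory=list)

q = QuestionRecord(
    question_id="ANAT-101-042",
    text="Which structure ...?",
    module="Anatomy 101",
    learning_objective="LO3: identify major structures",
    question_type="multiple_choice",
    status="approved",
    last_reviewed=date(2025, 11, 3),
    tags=["anatomy", "exam-ready"],
)
print(q.question_id, q.status)
```

Even in a spreadsheet rather than code, the same few agreed columns achieve the goal: anyone on the team can see at a glance what a question covers, when it was last reviewed, and whether it is ready for use.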
4. Exploring new assessment formats
More and more teams are exploring assessment formats that better reflect what candidates need to do, not just what they need to know. Applied assessment, case-based questions, and multimedia can support this, but they also require practice and experimentation.
2026 is the right year to pilot one new assessment format as a team. Even a small-scale pilot can provide valuable insights: Does this format work for our candidates? What kind of guidance do they need? And what does it require from teachers? Starting small allows you to learn and grow carefully.
5. Taking a more critical look at assessment security
With the rise of digital assessment and hybrid education, assessment security is becoming increasingly important: not only to prevent fraud, but also to ensure fairness and reliability.
It may be worthwhile to revisit your assessment policies together. What agreements are in place? What works well? Where are the bottlenecks? And which technical or organisational measures could help? Regularly evaluating security keeps it an integral part of quality assurance.
6. Exploring the possibilities of AI
AI has made significant strides this year, and within assessment, there are growing opportunities to support processes. Think of improving question wording, identifying patterns in analysis data, or speeding up the marking of open-ended questions.
Many institutions want to explore this, but are still looking for safe and responsible ways to do so. This could be a good intention: choose one concrete area in which to explore AI, such as rewriting questions or analysing assessment results. Start small, reflect together, and see what it brings.
And you?
What are your good intentions for assessment in 2026?
You may want a clearer overview of your processes, more insight from data, or more room to experiment.
Assessment is never finished. It is a continuous cycle in which improvements can always be made, and new ideas can be "tested." I am curious which steps you want to take next year. Feel free to let me know; I am happy to think along with you.