Part of Job Evaluation Audit and KPI data collection technical guidance
Annex 2: Phase 2 - assessment of job evaluation practices audit
Overview
The Assessment of Job Evaluation (JE) practices audit forms Phase 2 of the Job Evaluation Audit and KPI data collection.
This phase builds on the work set out in the NHS Employers guidance Action Needed on Job Evaluation Outcomes for Nursing and Midwifery Staff (June 2025). While that work focuses on nursing and midwifery roles, the Assessment of JE Practices Audit has been designed to provide assurance on local JE arrangements across all staff employed under the NHS Agenda for Change (AfC) pay system.
The audit questions have been updated to align with the forthcoming national Job Evaluation Enabling Agreement, which has been agreed in principle by the NHS Staff Council and is expected to be published as an Annex to the NHS Terms and Conditions of Service handbook in Spring 2026. Our approach reflects this developing framework and supports the ongoing commitment to robust JE practice and assurance.
The mandated annual audit begins in April 2026, with returns submitted via the Data Collections Framework (DCF) portal alongside board readiness and Nursing and Midwifery JE metrics for the relevant reporting quarter.
All organisations within scope are expected to complete the audit.
Purpose and principles of the audit
The Assessment of Job Evaluation (JE) practices audit has been developed and agreed by the NHS Staff Council to support NHS organisations in reviewing the robustness, governance and partnership delivery of their local job evaluation arrangements.
The audit is intended to provide assurance that the principles and standards set out in the NHS Job Evaluation Handbook are being applied fairly and consistently. It reinforces the need for transparent, well-governed JE practice that is aligned with national requirements and supported by appropriate systems, resources and training.
The audit has been developed to help organisations to:
- understand the extent to which their current JE arrangements reflect national expectations
- identify gaps or areas requiring improvement
- monitor progress over time
- strengthen governance and partnership working
Submission details
The submitted return should accurately reflect local JE practices and demonstrate that the assessment has been completed in partnership and shared and discussed with the appropriate staff-side representatives and local forums.
Confirm that the return has been compiled with your staff-side JE lead (or staff-side chair where there is no staff-side JE lead) and has been shared and discussed with your local partnership forum/Joint Negotiating Committee (JNC).
Response options: Yes / No
Assessment of job evaluation practices audit
The audit consists of 21 questions covering organisational governance, partnership arrangements, training and resources, panel capacity, timeliness, monitoring arrangements and record-keeping.
Questions are structured to provide an evidence-based assessment of whether organisations have the key components of a robust, fair and sustainable JE system in place.
Questions
1. Our board receives a regular report on job evaluation (application and outcomes) and issues are raised on the corporate risk register as appropriate, including but not limited to, equal pay risk
| Red | No |
| Amber | Yes, but not routinely |
| Green | Yes, a report is made at least annually including an assessment of performance/risk |
Explanation: Regular board oversight supports transparency, monitoring of resourcing and activity levels, and appropriate escalation of JE-related risks in line with the expectations set out in national guidance.
2. There is an identifiable lead for JE in our people/HR team and designated resources* for JE activity (*could be a separate funding line)
| Red | There is no identifiable HR lead nor designated resources |
| Amber | There is an identifiable HR lead but no designated resources, or there is no identifiable HR lead but we do have designated resources |
| Green | There is an identifiable HR lead and designated resources |
Explanation: Organisations are expected to have an operational lead and sufficient resource to support timely, consistent and partnership-based JE delivery. This reflects core principles in the NHS Job Evaluation Handbook.
3. There is a staff side lead for JE
| Red | There is no staff side lead for JE |
| Green | There is a staff side lead for JE |
Explanation: Having a nominated staff-side JE lead is central to partnership working and ensures staff-side visibility and involvement in key JE matters.
4. The partnership forum/joint negotiating consultative committee receives regular reports on JE activity and performance including training and resources
| Red | No |
| Amber | Only when requested |
| Green | Yes regularly (at least quarterly) |
Explanation: Regular and shared JE reporting helps ensure shared understanding of pressures, activity, risks and training needs. It supports open dialogue and stronger partnership working.
5. There is an up-to-date JE policy that has been agreed in partnership that outlines all local processes and practices and is in line with the national NHS Job Evaluation Handbook
| Red | No – or the policy is over 5 years old |
| Amber | Yes, but it needs reviewing |
| Green | Yes, and it is reviewed at least every 3 years |
Explanation: A current partnership-agreed JE policy ensures local practice reflects national standards, including roles, processes, governance and consistency requirements.
6. The NHS Staff Council recommends that the end-to-end process for determining pay banding is no longer than 12 weeks unless mutually agreed (not including time taken for role holders and line managers to agree job information and/or job analysis questionnaires)
| Red | We do not have any JE activity targets, or less than 50% is turned around within 12 weeks |
| Amber | Over 50% of our JE activity is completed within 12 weeks and we have a plan to improve |
| Green | 90% of our JE activity is completed within 12 weeks (from the date agreed information is submitted for JE to delivering the outcome to the role holder/manager) |
Explanation: Timeliness is a core expectation within the JE process. Monitoring performance helps ensure fairness, prevent delays, and mitigate equal pay risks.
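Organisations tracking this KPI locally could compute the share of cases completed within the 12-week window along the following lines. This is a minimal sketch only: the function name and the record layout (pairs of submission and outcome dates) are illustrative assumptions, not a prescribed method or system.

```python
from datetime import date, timedelta

def within_target_share(cases: list[tuple[date, date]], weeks: int = 12) -> float:
    """Return the fraction of JE cases completed within the target window.

    Each case is a (submitted, outcome) pair: the date agreed job information
    was submitted for JE, and the date the outcome was delivered to the role
    holder/manager, matching the measurement window described above.
    """
    if not cases:
        return 0.0
    target = timedelta(weeks=weeks)
    on_time = sum(1 for submitted, outcome in cases if outcome - submitted <= target)
    return on_time / len(cases)
```

A result of 0.9 or above would correspond to the green rating for question 6; above 0.5 (with an improvement plan) to amber.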
7. Systems are in place that allow JE leads to monitor the interaction between panels – for example, if there are frequent misunderstandings over the same issue/factor or regular over/under-evaluation by panels, so that remedial action can be taken or further training arranged
| Red | No – there’s no feedback from panels other than their reports |
| Amber | We review panel reports on an ad hoc basis or when there is a complaint/grievance |
| Green | Yes, and there is evidence to prove this |
Explanation: Monitoring panel trends supports consistency, identifies training needs, and aligns with good practice in quality assurance.
8. JE leads are involved in service reconfiguration/redesign at an early stage
| Red | No, or only after the organisational change has happened |
| Amber | Our JE teams are made aware when this is happening so they can plan panels |
| Green | Our organisational change policy recognises the need to assess the JE implications of service reconfiguration/redesign at an early stage, and we can evidence that JE advice and expertise is available to advise managers, for example if changes to roles have banding implications |
Explanation: Early JE lead input ensures that role changes are understood, and implementation impacts are managed appropriately.
9. JE leads and JE practitioners keep up to date with relevant matters for example, any changes in national profiles or the JE handbook
| Red | No idea – no mechanism |
| Amber | JE leads subscribe to the workforce bulletin |
| Green | JE leads are active members of the national JE CoP and there is a formal/regular mechanism to update all local practitioners |
Explanation: Maintaining current knowledge supports consistent decision-making and reflects expectations set out in national JE guidance.
10. Our agreed JE policy specifies how to identify and determine how the organisation will assess and deal with any temporary capacity issues or backlogs
| Red | No – there is no plan |
| Amber | We occasionally assess capacity and put on more panels if we can |
| Green | We regularly assess our capacity and have a range of options to deal with temporary issues/backlogs |
Explanation: Proactive capacity planning prevents delays and supports the sustainability of JE delivery.
11. Do you ever outsource your JE work to a private, third-party consultancy (such as a non-NHS organisation)?
| Red | Yes – most or all of our JE work is done by a private company |
| Amber | Only occasionally in line with requirements of the JE handbook |
| Green | Never |
Explanation: National guidance emphasises that JE should be delivered in partnership. External provision should be limited to exceptional circumstances on a temporary basis and only by local partnership agreement, with a clear plan to address bringing job matching and evaluation back in house.
12. All JE panels (including consistency checking) are conducted in partnership
| Red | No |
| Green | Yes |
Explanation: Partnership-based panels are a core requirement of the JE system.
13. We have sufficient practitioners to ensure that every matching or evaluation panel is made up of between 3 and 5 trained practitioners.
| Red | No, some panels sit as 2 or sometimes sit without staff side member present |
| Amber | All panels sit with at least 3 practitioners with at least one staff side and one management member |
| Green | All panels sit with at least 4 members – with at least 2 staff side and 2 management side panellists |
Explanation: Ensuring and maintaining sufficient capacity to hold panels in partnership is key to ensuring timely decision making and to maintaining confidence in the panel process.
14. How many panel practitioners do you have that are active and available to sit on JE panels?
Enter number (see validation rules).
Explanation: Maintaining an adequate pool of trained JE panel practitioners and convening panels at appropriate intervals is essential to ensure fairness, consistency, and robustness in JE outcomes. Sufficient numbers and regular panel activity help prevent delays, reduce bias, and support compliance.
For the purposes of this return, an active panel practitioner is defined as someone who has participated in at least one JE panel in the past 12 months and remains available to continue doing so.
15. Of those, how many are staff side panellists (i.e. staff side job evaluation panellists should be nominated by, and be accountable to, a local union or staff side)?
Enter number (see validation rules).
Explanation: Panels should include trained staff-side practitioners, nominated through recognised union structures. This question helps assess whether staff-side capacity is sufficient to support partnership-based delivery.
16. We ensure that we have panellists from across all parts of the organisation and all occupational groups to ensure panels are representative of the workforce.
| Red | We don’t currently have panellists from across the organisation/occupational groups |
| Amber | We don’t currently have panellists from across the organisation/occupational groups, but we are developing an action plan to address this |
| Green | Yes, we ensure that we have panellists from across the organisation/occupational groups |
Explanation: Representative panels help ensure fair and knowledgeable evaluations, as recommended in national guidance and the NHS Job Evaluation Handbook.
17. We make sure that trained practitioners get sufficient paid time off to undertake JE work (this should be separate from any facilities time agreed for TU representatives)
| Red | We don’t monitor this |
| Amber | We expect managers to release staff, but we don’t enforce it |
| Green | Our policies require managers to release practitioners, and we monitor and enforce this to ensure that all practitioners of any staff group can be released |
Explanation: Protected time is essential to maintain an effective and sustainable pool of trained practitioners, supporting timely JE delivery.
18. Refresher training is offered regularly for trained job evaluation practitioners (every 3 to 5 years)
| Red | We do not provide any refresher training |
| Amber | We provide refresher training but do not mandate attendance or monitor take up |
| Green | Yes, we have a programme of refresher training that ensures all active panellists receive refresher training at least every 5 years (and records to prove it) |
Explanation: Regular refresher training helps maintain practitioner competence and is recommended in the JE handbook.
19. All staff have an up-to-date job description that is reviewed at least every 3 years
| Red | Some do but we have no mechanism to monitor this |
| Amber | Some do but we have an action plan in place to address those that don’t |
| Green | Yes, they do, and we have a process to ensure this happens, including re-banding when required |
Explanation: Accurate job descriptions are essential for fair and valid JE decisions.
20. There is a robust system in place for recording all JE activity and outcomes
| Red | We have a paper-based system or JE outcomes are not stored |
| Amber | We use a spreadsheet to record information |
| Green | Yes, we have a secure system that records all job information and outcomes/panel activity and keeps records indefinitely, such as CAJE or IJES |
Explanation: Complete, accurate and accessible JE records support audit, equal pay assurance and transparency.
21. If a member of staff asked for the job evaluation report for their job, we would be able to provide it
| Red | No |
| Amber | We could do this for some roles, but not all |
| Green | Yes |
Explanation: Being able to provide JE reports enables transparency and staff confidence in the JE process. It is an expectation under the JE handbook and will be a requirement of the new annex to the NHS Terms and Conditions of Service handbook, expected to be published in Spring 2026.
Data validation rules
The following validation rules apply to Phase 2 - Assessment of job evaluation practices audit responses submitted through the DCF portal:
1. Rated questions (R/A/G or R/G)
Each rated question allows one response only.
Depending on the question, the available options will be either:
- red, amber or green
- red or green
Only one rating may be selected per question.
Rated questions cannot be left blank.
2. Yes/No questions
Only one option may be selected.
A response is required for each Yes/No item.
Questions cannot be left blank.
3. Numerical questions (questions 14 and 15)
All numerical values must be whole numbers.
Values must be equal to or greater than zero (≥ 0).
The number of staff-side panellists recorded in Question 15 must not exceed the total number of active panel practitioners recorded in Question 14.
If the value in Question 15 is greater than Question 14, the system will flag an error and prevent submission.
Questions cannot be left blank.
Last edited: 5 February 2026 10:06 am