Read through our playbook for further inspiration on how to conduct a performance review calibration!
Many companies use performance review calibration meetings to standardize manager scoring behavior and ensure fair, bias-free assessments. With our powerful review analytics capabilities, Leapsome can simplify your calibration process.
Getting started
Please refer to this article to familiarize yourself with general review analytics options (heatmaps, distribution charts, box grids, radar charts, multiple-choice questions, and timelines).
Timing
Calibration meetings should happen after managers have finalized their assessments, but before they share them with their employees. This way, managers still have the opportunity to make changes to their assessments if the calibration committee deems it necessary.
Please note: To prevent assessments from being signed and keep calibration possible, leave the "The manager and the reviewee need to sign the review after sharing it to permanently lock them." box unchecked on the Scope tab of the review settings. Once calibration is complete, this box can be checked to allow signatures, if desired.
Analyzing data on the heatmap and distribution chart
Certain views are particularly useful when running a calibration meeting.
Please note: Results cannot be calibrated or changed once the manager has signed and finalized the review. Hence, it is best to hold calibration talks before development talks.
On the heatmap, you might want to:
- Filter by 'Quick-filter: Latest manager reviews' to retrieve the latest review for each employee, regardless of the date or review cycle. This is useful if you have decentralized reviews and want to see the most recent data on employee performance rather than an aggregate of all past reviews.
- Filter by 'Review cycles' to see the specific (open) review you wish to calibrate
- Filter by 'Perspectives' to see 'Manager assessments'
- Segment results by 'Manager of the reviewee' to see the average ratings given by each manager for their direct/indirect reports (this is what you would be comparing; see the sketch after this list)
- Group results by category or total score to see the big picture
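As a rough illustration of what this segmentation surfaces, here is a minimal sketch of computing per-manager average ratings from exported scores. This is not Leapsome code; the names and data are hypothetical, and it assumes a simple numeric rating scale.

```python
# Minimal sketch (hypothetical data, not the Leapsome API): compute the
# per-manager average rating that the heatmap segmentation would surface.
from collections import defaultdict

# Each tuple: (manager of the reviewee, score given in the manager assessment)
manager_scores = [
    ("Alex", 4), ("Alex", 5), ("Alex", 3),   # Alex tends to rate generously
    ("Kim", 2), ("Kim", 3), ("Kim", 3),      # Kim tends to rate more strictly
]

totals = defaultdict(list)
for manager, score in manager_scores:
    totals[manager].append(score)

# These per-manager averages are what the calibration committee compares.
for manager, scores in totals.items():
    print(f"{manager}: average {sum(scores) / len(scores):.2f} across {len(scores)} reports")
```

Comparing these averages side by side is what makes differences in manager scoring behavior visible during calibration.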
Making changes during the calibration meeting
By default, only managers can calibrate scores for their team members. In the admin settings, super admins can enable calibration rights for direct and indirect managers as well as for (review) admins and HR Business Partners. These admins and HRBPs will then be able to calibrate scores as well.
In the review analytics ('Analytics' > 'Reviews'), you will find the 'Calibrate' button under 'Actions', which prompts you to select a cycle for the calibration. Once a cycle is selected, you will have the option to calibrate scores for all individual questions / skills:
- given by the manager
- not yet signed by the manager
- and in cycles that are not yet closed
To do so, click on the corresponding score in the heatmap. Each change will be tracked, and you will see the pre-calibrated score in brackets beneath the new score, as well as who calibrated the score. Changing the score will trigger a recalculation of the overall manager score. The reviewee will only see the final / calibrated score.
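To illustrate the recalculation, here is a small worked sketch. It assumes the overall manager score is a plain average of the individual question scores; the actual weighting in your review template may differ.

```python
# Rough illustration (assumes the overall score is a plain average of
# question scores; the actual weighting may differ).
original_scores = {"Communication": 4, "Ownership": 3, "Craft": 5}
overall_before = sum(original_scores.values()) / len(original_scores)   # 4.00

# The calibration committee lowers "Craft" from 5 to 4; the pre-calibrated
# value (5) would be shown in brackets beneath the new score.
calibrated_scores = {**original_scores, "Craft": 4}
overall_after = sum(calibrated_scores.values()) / len(calibrated_scores)  # ~3.67

print(f"Overall manager score: {overall_before:.2f} -> {overall_after:.2f}")
# The reviewee only ever sees the calibrated scores and the recalculated total.
```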
Calibration in the box view
As an admin or manager, you can also calibrate in the box view. Navigate to 'Actions' > 'Calibrate' and choose the review cycle for calibration. The view will automatically be filtered by 'Perspective' > 'Manager Assessments'.
On the right, you can choose the questions to be calibrated for the x- and y-axis. If you only want to calibrate one question, you can choose 'None' for one of the axes.
You can also apply filters in the box view to find the employees you want to calibrate or display together in the same view.
Within this view, you can set the names of the individual boxes as well as the target percentage of respondents falling into each box, so that the grid best reflects your internal processes and workflows. To do this, navigate to 'Actions' > 'Open calibration box settings', which opens a menu where you can make these changes.
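For example, once target percentages are configured, you might sanity-check the actual distribution against them along these lines. This is a hypothetical sketch, not Leapsome functionality; the box names and targets are made up.

```python
# Hypothetical sketch: compare the share of employees per box against the
# target percentages configured in the calibration box settings.
from collections import Counter

targets = {"Low performer": 10, "Core performer": 70, "Top performer": 20}  # in %
placements = ["Core performer"] * 14 + ["Top performer"] * 4 + ["Low performer"] * 2

counts = Counter(placements)
total = len(placements)
for box, target_pct in targets.items():
    actual_pct = 100 * counts[box] / total
    print(f"{box}: {actual_pct:.0f}% of employees (target {target_pct}%)")
```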
Please note: Calibration is blocked if a review is signed, if more than one answer contributes to the score, or if the score does not originate from a manager review. Calibration is also blocked if the two dimensions use different scales (in this case, results will still be displayed in a box) or if the scale is normalized.
To move an employee during calibration, hover over their picture or initials bubble and simply drag it to the desired box. Any changes will be auto-saved. Each box also shows how many users it contains and the percentage of users from the review cycle that this represents. Clicking a bubble opens a pop-up that provides past feedback context on both questions being calibrated, as well as the calibration history for those questions or skills; you can open these by clicking them.
Having trouble calibrating scores?
- Please make sure the person conducting the calibration is:
- A super admin, admin, or Reviews Module admin (if enabled in the Reviews admin settings).
- An HRBP for the person(s) for whom the calibration is taking place (if enabled in the Reviews admin settings).
- The primary manager for the (in)direct report(s) being calibrated.
- Check to confirm that the question/score being calibrated only has one input factor.
- Inputs such as 'Total Manager (Average)' cannot be calibrated.
- When multiple questions are selected, ensure that they have the same scale.
- When questions are normalized because they use different scales, calibration will not be possible.
- Check that the scores have not yet been calibrated.
- Confirm in the 'Current Status' dashboard that manager assessments are not signed.
- If a manager has signed their direct report's assessment, scores cannot be calibrated.
- Please check that the cycle of the score you are trying to calibrate is ongoing.
- Scores of a closed review cycle cannot be calibrated unless you re-open the cycle.