Read through our playbook for further inspiration on how to conduct a performance review calibration!
Many companies use performance review calibration meetings to standardize manager scoring behavior and ensure fair, bias-free assessments. With our powerful review analytics capabilities, Leapsome can simplify your calibration process.
Getting started
Please refer to this article to familiarize yourself with general review analytics options (heatmaps, distribution charts, box grids, radar charts, multiple-choice questions, and timelines).
Timing
Calibration meetings should take place after the manager has completed their assessment, but before it is shared with the employee. This way, managers still have the opportunity to adjust their assessments if the calibration committee deems it necessary.
Analyzing data on the heatmap and distribution chart
The following views are particularly useful when running a calibration meeting.
Please note: results cannot be calibrated / changed if the manager has signed and finalized the review. Hence, it is best to hold calibration talks before development talks.
On the heatmap, you might want to:
- Filter by 'Quick-filter: Latest manager reviews' to retrieve the latest review for each employee, regardless of the date or review cycle. This is useful if you run decentralized reviews and want to see the most recent data on employee performance rather than an aggregate of all past reviews.
- Filter by 'Review cycles' to see the specific (open) review you wish to calibrate
- Filter by 'Perspectives' to see 'Manager assessments'
- Segment results by 'Manager of the reviewee' to see each manager's average ratings for their direct reports; these averages are what you compare during calibration (see the sketch after this list)
- Group results by category or total score to see the big picture
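To illustrate what the 'Manager of the reviewee' segmentation surfaces, here is a minimal, hypothetical sketch (not Leapsome code; the data and field names are invented) that computes each manager's average rating so you can spot unusually lenient or strict raters:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export of manager assessment scores (illustrative data only).
assessments = [
    {"manager": "Alex",  "reviewee": "Sam",  "score": 4.5},
    {"manager": "Alex",  "reviewee": "Kim",  "score": 4.0},
    {"manager": "Priya", "reviewee": "Lee",  "score": 2.5},
    {"manager": "Priya", "reviewee": "Noor", "score": 3.0},
]

# Group scores by manager, then average them. This mirrors what the heatmap
# shows when you segment results by 'Manager of the reviewee'.
scores_by_manager = defaultdict(list)
for a in assessments:
    scores_by_manager[a["manager"]].append(a["score"])

for manager, scores in sorted(scores_by_manager.items()):
    print(f"{manager}: average rating {mean(scores):.2f} across {len(scores)} reports")
```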
Making changes during the calibration meeting
By default, only managers can calibrate scores, and only for their direct reports. In the admin settings, super admins can enable calibration rights for admins and HR Business Partners (HRBPs), who will then be able to calibrate scores as well.
In the review analytics ('Reviews' > 'Analytics'), you will find the 'Calibrate' button under 'Actions', which prompts you to select a cycle for calibration. Once a cycle is selected, you can calibrate scores for all individual questions / skills that are:
- given by the manager
- not yet signed by the manager
- and in cycles that are not yet closed
To do so, click on the corresponding score in the heatmap. Each change will be tracked, and you will see the pre-calibrated score in brackets beneath the new score, as well as who calibrated the score. Changing the score will trigger a recalculation of the overall manager score. The reviewee will only see the final / calibrated score.
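As a rough illustration of that recalculation (a minimal sketch with invented data and a simple average; Leapsome's actual aggregation and weighting may differ), calibrating one question replaces that question's score before the overall manager score is recomputed:

```python
from statistics import mean

# Hypothetical manager assessment for one reviewee (illustrative only).
question_scores = {"Communication": 4.0, "Ownership": 3.0, "Craft": 5.0}

# The calibration committee adjusts a single question's score. In the UI, the
# pre-calibrated score (3.0) would remain visible in brackets beneath the new one.
calibrated = {"Ownership": 4.0}

# The overall manager score is then recomputed from the calibrated values.
effective = {q: calibrated.get(q, s) for q, s in question_scores.items()}
print(f"Overall before calibration: {mean(question_scores.values()):.2f}")
print(f"Overall after calibration:  {mean(effective.values()):.2f}")
```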
Calibration in the box view
As an admin or manager, you can also calibrate in the box view. When you navigate to 'Actions' > 'Calibrate', you can choose the review cycle for calibration. The view will automatically be filtered by 'Perspective' > 'Manager Assessments'.
On the right, you can choose the questions to be calibrated for the x- and y-axis. If you only want to calibrate one question, you can choose 'None' for one of the axes.
You can also apply filters in the box view to narrow down the employees you want to calibrate or view together.
Within this view, you can set the names of individual boxes as well as the target percentage of respondents falling into each box, so the grid best reflects your internal processes and workflows. To do this, navigate to 'Actions' > 'Open calibration box settings', which opens a menu where you can make these changes.
Please note: Calibration is blocked if:
- the review has already been signed
- more than one answer contributes to the score
- the score does not originate from a manager review
- the two dimensions use different scales (in this case, results will still be displayed in the box grid, but cannot be calibrated)
- the scale is normalized. This means that the 'Default Scale' defined under 'Admin Settings' > 'Reviews' must match the scale of the cycle you're aiming to calibrate.
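Purely as a way to summarize those rules (a hypothetical sketch, not Leapsome's actual logic or API; all field names are invented), the conditions can be thought of as a single eligibility check per score:

```python
from dataclasses import dataclass

# Hypothetical representation of one score cell in the box grid (illustrative only).
@dataclass
class ScoreCell:
    review_signed: bool
    contributing_answers: int
    from_manager_review: bool
    x_scale: tuple        # e.g. (1, 5)
    y_scale: tuple
    scale_normalized: bool

def can_calibrate(cell: ScoreCell) -> bool:
    """A score is only calibratable if none of the blocking conditions above apply."""
    return (
        not cell.review_signed
        and cell.contributing_answers == 1
        and cell.from_manager_review
        and cell.x_scale == cell.y_scale   # both dimensions use the same scale
        and not cell.scale_normalized
    )

# Example: a signed review cannot be calibrated; an unsigned one can.
print(can_calibrate(ScoreCell(True, 1, True, (1, 5), (1, 5), False)))   # False
print(can_calibrate(ScoreCell(False, 1, True, (1, 5), (1, 5), False)))  # True
```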
To calibrate an employee, simply drag their picture or initials bubble into the desired box; any changes are auto-saved. Each box also shows how many users it contains and what percentage of users in the review cycle that represents. Clicking a bubble opens a pop-up with past feedback for context on both questions being calibrated, as well as the calibration history for those questions or skills; you can open these items by clicking them.