Who can see review analytics?
Please watch this video to learn how to interpret the visibility settings of a review.
Super admins have full access to review analytics across all review cycles unless the visibility of a cycle has been restricted in its visibility settings (see the video above).
Admins will, by default, have access only to the results of the cycles for which they are designated cycle owners. If the option 'Admins can access all reviews' is enabled in the admin settings by the super admin, admins will be able to access all analytics.
- If they are not the owner of a cycle and the above super-admin setting is not turned on, they will not have access to cycle analytics. This is true even if the 'Visibility for admins' setting allows admin access, as this setting only impacts cycle owners (see the video linked above to learn more about the 'Visibility for admins' settings).
- If the super-admin setting is active, the 'Visibility for admins' settings of a specific cycle or template will still apply to all admins. This means that if you set the visibility to 'No visibility' for each assessment, admins will not have access to any analytics for that cycle.
Managers will, by default, have access to the analytics of their direct reports (as well as indirect reports if cascading access is enabled in the super-admin settings). You can change what managers see in the 'Visibility for the Manager' settings of the review.
Employees will have access to their results depending on the 'Visibility for the Reviewee' settings.
Filters
By default, you will see an aggregate of all results across all review cycles. Any teams/groups/custom attributes you create for your organization will immediately be available to use as a filter under analytics. On most views, you can use (and combine) the following filters. These are especially useful during calibration meetings:
- Teams: Filter to see the results of members of the selected teams
- Managers: Filter to see the results of the direct reports of the selected managers
- Employees: Filter to see the results of specific employees
- Perspectives: Filter to see the results of particular assessment types (e.g., only see the analytics for self or manager assessments)
- Contributor type: Filter to see the results of managers or individual contributors
- Review cycles: Filter to see the results from specific cycles
- Timeframe: Filter to see analytics from any reviews completed in the specified timeframe
- Skills and categories: Filter to focus on particular skills, questions, or categories
- Custom attributes: Filter to see the results for employees who have a particular custom attribute (defined in the 'Users & Teams' section by the super admin)
Please note: Filters remain in place if you switch from one view to another.
In the heatmap, the filtering logic works as follows: if you select two or more values within the same filter (e.g., two teams), Leapsome applies OR logic, meaning it will show people who are in either Team 1 or Team 2. If you combine different filters (e.g., a team and an office location), Leapsome applies AND logic, meaning you will only see answers from people who are, for example, in Team 1 and work in the London office.
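To make these combination rules concrete, here is a minimal sketch of the OR/AND behavior. The sample data, field names, and `matches` helper are hypothetical illustrations, not Leapsome's actual implementation.

```python
# Minimal sketch of the heatmap filter logic: values within one filter
# combine with OR, while different filters combine with AND.
# The field names and sample data are invented for illustration.

people = [
    {"name": "Ada",  "team": "Team 1", "office": "London"},
    {"name": "Ben",  "team": "Team 2", "office": "Berlin"},
    {"name": "Cara", "team": "Team 3", "office": "London"},
]

def matches(person, filters):
    """filters maps a field to the set of accepted values.
    A person matches if, for every field (AND), their value
    is one of the accepted values for that field (OR)."""
    return all(person[field] in accepted for field, accepted in filters.items())

# Two teams in the same filter -> OR: Ada and Ben both match.
print([p["name"] for p in people
       if matches(p, {"team": {"Team 1", "Team 2"}})])

# A team plus an office -> AND: only Ada matches.
print([p["name"] for p in people
       if matches(p, {"team": {"Team 1"}, "office": {"London"}})])
```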
Demographics and attribution
Attribution is based on the team membership and demographic data current at the time the review is submitted. For example, if a member of Team 1 starts a review and then moves to Team 2 before submitting it, their results will be counted as part of Team 2.
Aggregation levels ('Group results by')
On most views, scores can be aggregated at one of the following levels (a sketch of the averaging logic follows this list):
- Total score: This generates an unweighted average score across all questions
- Category: An unweighted average across all skills in a category
- Skill: As the most granular view, this allows you to compare skill scores one by one
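To illustrate, the three levels can be read as unweighted averages taken at increasing granularity. The sketch below is a hypothetical reconstruction with invented sample ratings; it is not Leapsome's actual code.

```python
# Hypothetical sketch of the three aggregation levels as unweighted averages.
# Ratings are grouped by category, then by skill; all sample data is invented.

ratings = {
    "Communication": {"Listens actively": [4, 5, 3], "Provides feedback": [5, 4]},
    "Execution":     {"Meets deadlines":  [3, 4]},
}

def mean(values):
    return sum(values) / len(values)

# Skill level: one average per skill (the most granular view).
skill_scores = {skill: mean(vals)
                for skills in ratings.values()
                for skill, vals in skills.items()}

# Category level: unweighted average across the skills in each category.
category_scores = {cat: mean([mean(vals) for vals in skills.values()])
                   for cat, skills in ratings.items()}

# Total score: unweighted average across all questions/skills.
total_score = mean(list(skill_scores.values()))

print(skill_scores)     # {'Listens actively': 4.0, 'Provides feedback': 4.5, 'Meets deadlines': 3.5}
print(category_scores)  # {'Communication': 4.25, 'Execution': 3.5}
print(total_score)      # 4.0
```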
Normalization of results
If you have used the same scale throughout all review cycles, you will also see that scale in the review analytics. If you use different scales across reviews, you can still view the analytics; however, Leapsome will normalize all results to a 0-100 scale.
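The exact normalization formula isn't documented in this article; a common approach is linear min-max scaling, sketched below under that assumption with invented example scales.

```python
# Hypothetical sketch: mapping a rating onto a 0-100 scale via linear
# min-max scaling. Leapsome's exact formula is not documented here.

def normalize(score, scale_min, scale_max):
    return (score - scale_min) / (scale_max - scale_min) * 100

print(normalize(4, 1, 5))   # a 4 on a 1-5 scale  -> 75.0
print(normalize(7, 1, 10))  # a 7 on a 1-10 scale -> ~66.7
```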
Working with the heatmap
If you want to compare the skill, question, or aggregate category score given by managers to different teams or employees, the heatmap is ideal.
Checking 'Show deactivated and cycle-specific skills' will display any custom questions you have added to the cycle alongside the skill ratings. To see these question scores, you will need to group the results by skill.
Segmenting the results changes the groupings you see on the left-hand side (the default is to show individuals and their corresponding scores). You can segment the results by:
- Team: Compare results across teams
- Reviewee: Compare individual employees (this is the default)
- Manager of the reviewee: Compare the aggregated scores of the direct reports of managers. This is particularly useful to see if there are differences in scoring behavior between managers
- Perspective: Compare the average scores from self, peer, direct report, and manager assessments
- Gender: Compare differences between genders. This is particularly useful to see if there are structural differences in feedback given to different gender groups
- Custom attributes: Comparison depends on the attributes defined by the super admin in the Users & Teams section (read more here about custom attributes)
Many companies use the heatmap for calibration meetings: after running a review round, leadership and HR teams typically check whether any teams or employees were rated exceptionally high or low to ensure a fair standard across all participants. To find the calibration option, head to 'Actions' > 'Calibrate'.
Working with the 9-Box
The 9-box view is beneficial for understanding which employees are outstanding at a specific skill or combination of skills. You can also apply all of the same filters described above.
You can set the X- and Y-axes in the top-right corner. Once you've determined which combination you want to look at (or if you want to see everyone's total score in a line), the box will then display each individual (after any filtering) as a dot on the chart. Hover over each dot to see the name of the employee and their score. You can view the chart as a schematic 9-box, which groups the dots into their respective boxes, hiding their exact placement on the chart.
Only 40 employees will be visible in each box; however, the total number of employees who fall into each box is shown in its upper left-hand corner.
Example use cases
This view can also help you find the best people for a lead role in a certain project requiring a specific set of strengths. For example, you may need a project lead for a new project that requires strong listening and presentation skills. You can set the X-axis to 'Listens actively' and the Y-axis to 'Providing feedback' (or similar skills). You would then choose from the people in the top-right corner, as they have the highest ratings in both skills simultaneously (see the screenshot below).
Maybe you want to know which managers receive the best ratings from their direct reports to understand how they lead their teams. In this case, you would filter to see Teams = 'Manager (auto)' and Perspectives = 'Direct report assessments.' You would then want to set the X- and Y-axes to 'Total average.'
Some companies also like to plot employees on a 'potential v. performance' matrix (explained in the graphic below). To achieve this, you would add custom questions to your reviews that require a rating of these two aspects. You would then set 'potential' as the Y-axis and 'performance' as the X-axis. The 9-box then places each employee into one of the nine boxes, as in the sketch below.
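To illustrate how such a grid could group employees, here is a hypothetical sketch that buckets two 0-100 scores into low / medium / high thirds. The equal-thirds cut-offs are an invented assumption, as Leapsome's actual box boundaries aren't documented here.

```python
# Hypothetical sketch of 9-box bucketing: each axis score (0-100) is cut
# into low / medium / high thirds, yielding one of nine boxes.
# The equal-thirds thresholds are invented for illustration.

def bucket(score):
    if score < 100 / 3:
        return "low"
    if score < 200 / 3:
        return "medium"
    return "high"

def nine_box(performance, potential):
    # Returns the (X, Y) box for one employee.
    return (bucket(performance), bucket(potential))

print(nine_box(90, 85))  # ('high', 'high') -> top-right box
print(nine_box(40, 70))  # ('medium', 'high')
```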
Working with the distribution chart
If you aim to achieve a particular score distribution during a calibration meeting, the distribution chart will be helpful.
Please note: By default, the view will show aggregate scores. In many cases, you'll want to filter for a specific question, a specific review cycle, and 'Perspectives' = Manager assessment.
The chart will show absolute counts of aggregate ratings (as opposed to their share).
Working with the timeline
The timeline view helps you see how your company, teams, or employees change in ratings over time.
Each line shows the aggregate score for a specific skill or category (depending on which grouping you've chosen in the 'Group results by' drop-down) or displays the total score.
Each dot or moment in the timeline represents when a review was submitted. For example, if you're looking at the timeline for the entire company and everyone went through a review in September and February, you would see a dot for those two months on the graph. You can filter by 'Timeframe' to specify a time window and narrow down the results.
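As an illustration of this grouping, the sketch below builds timeline points by averaging scores per submission month; the sample data and the monthly granularity are invented assumptions, not Leapsome's actual logic.

```python
# Hypothetical sketch: building timeline points by averaging scores
# per submission month. Sample data is invented.

from collections import defaultdict

reviews = [
    {"submitted": "2023-09", "score": 3.8},
    {"submitted": "2023-09", "score": 4.2},
    {"submitted": "2024-02", "score": 4.5},
]

by_month = defaultdict(list)
for review in reviews:
    by_month[review["submitted"]].append(review["score"])

# One dot per month in which at least one review was submitted.
timeline = {month: sum(scores) / len(scores)
            for month, scores in sorted(by_month.items())}
print(timeline)  # {'2023-09': 4.0, '2024-02': 4.5}
```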
Example use cases
You may decide to use this view to see how much an employee is improving over time. Consistent improvement indicates high potential and a strong ability to learn, and you may then mark them for a promotion.
Conversely, if you notice a certain skill rating (for an individual or the whole company) is going down over time, you may want to invest in training for that particular skill.
Working with the radar chart
In the radar chart, you can compare skills between employees, teams, managers, and/or the entire company, as well as see the specific strengths or areas for growth of a certain group.
For example, you could check how your People and Culture team is performing compared to the rest of the company and evaluate any strengths or weaknesses for the team based on this graphic.
To do so, you will want to click 'Enable comparison group.' In the example above, you would filter for 'Teams' = People and Culture in the first filter section. In the second filter section (the comparison group), you would filter to see all teams (which you do by selecting all teams apart from the People and Culture team).
There will then be two different colored lines representing the first group (People and Culture) and the comparison group (the rest of the company). You can hover over a point on the chart to see which color represents which group and the scores for each skill or category. The first group will be referred to as 'All assessments' and the second group as 'Comparison: All assessments'.
Analytics for multiple-choice answers
If you run review processes based on multiple-choice questions, you can find data on the answers to these questions in the 'Multiple-Choice' tab. You will see each answer listed alongside the number of participants who chose it. For each answer, you can also see the individual names of the participants who chose it, unless responses are anonymized.
For example, you may want to analyze active and past review cycles to understand which skills are being selected across the company and build training processes based on this.