- Overall Participation
- View Options
- Participation Breakdown
Engage's reports section provides both high-level and detailed reporting. Reports play a key role in displaying metrics such as employee voting patterns on various question types, favorability scores, and comments. They can also be a key tool for action planning and implementing next steps to increase employee satisfaction. To access a survey report, click "See report" next to the survey of your choice within the module.
Other key information:
- Survey reports present data for the entire survey and cannot be broken into custom date ranges to view reporting over time. To achieve that level of reporting, we recommend utilizing the dashboard. Filters in the dashboard allow you to set custom or preset date ranges and will aggregate survey reporting data (which can be filtered even further).
- The data segmentation a manager/admin user can see is fully customizable depending on which dimensions the user has visibility into. For more information on permissions, see this article. Setting specific dimension permissions for managers and admins is useful because it allows them to see only the data they need (e.g., employees from the locale they oversee, specific department(s) they manage, etc.) and provides greater confidentiality in the reporting.
- Tenure is calculated once every 24 hours in the application and is reflected accordingly in survey reporting and segmentation. Tenure is determined by the "Employee lifecycle" settings in App Settings.
Participation shows how many employees have taken the survey. This number is present on all reporting tabs. The number of people who took the survey appears in parentheses next to the total number of users in the audience (e.g., out of 4995 employees included in the survey's audience, 2999 took the survey, which makes up 60% of the audience).
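The participation percentage in the example above is simple arithmetic; here is a minimal sketch in Python (the function name is illustrative, not part of Engage):

```python
def participation_rate(respondents: int, audience: int) -> float:
    """Percentage of the survey audience that submitted the survey."""
    return respondents / audience * 100

# Using the example figures: 2999 respondents out of a 4995-person audience.
rate = participation_rate(2999, 4995)
print(round(rate))  # → 60
```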
Additionally, you can add filters to the reports using the filter option, which will bring up the "Add a Dimension" button where you can select which dimension(s) to filter by (for more information on that topic, see this article).
The results in this tab reflect only the filters that are selected. Additionally, the options within the filters will only reflect the participants of that particular survey.
Example: An organization has 10 departments. A survey only included 4 of those departments. This means that only 4 departments will be displayed in the drop-down menu for the "Department" dimension.
The "View options" button allows you to compare the results from this survey with the company average, or benchmark.
Checking the "Show comparison" checkbox will show an additional percentage next to each average on this page, comparing the averages from this survey with whichever comparison option you have selected.
"Company's average" aggregates the data from every user within the company across all survey reporting in Engage and uses it as a comparison metric for the survey data. "Benchmark" comes from predefined global percentages for these categories. For more information on benchmarks, see this article.
Lastly, you can export the overview report in PDF or CSV using the export option.
The "Overview" tab is selected by default when you open a report.
This tab has several sections:
eNPS (Employee Net Promoter Score)
This score will only be present in the reporting if there is at least 1 eNPS question in the survey.
This important metric is used to measure employee loyalty and is built on the Net Promoter Score (NPS). It captures how willing employees are to recommend your workplace to their family, friends, and/or network of colleagues.
eNPS is calculated based on the following formula:
eNPS Score = % of Promoters - % of Detractors
The score is based on the survey question “On a scale of 0-10 how likely are you to recommend our organization to your family or friends?” Afterward, the answers submitted are divided into the following three categories:
0-6: Detractors — Employees who score within this range are dissatisfied with the organization and may spread negative word of mouth to peers.
7-8: Passives — Employees who score 7 or 8 have neither positive nor negative sentiment about your company and are considered passive or neutral.
9-10: Promoters — Employees who score 9 or 10 are very positive and enthusiastic about the company.
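Putting the formula and the three categories together, the calculation can be sketched as follows (the scores below are made-up illustration data, not from Engage):

```python
def enps(scores):
    """eNPS = % of promoters (9-10) minus % of detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

# 4 promoters, 3 passives, 3 detractors out of 10 responses:
print(enps([10, 9, 9, 10, 7, 8, 8, 5, 3, 6]))  # → 10.0
```

Note that passives count toward the total number of responses but do not add to or subtract from the score, which is why the result can range from -100 to +100.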
This section measures the favorability percentage by taking the aggregate count of favorable/very favorable votes cast by survey participants and dividing it by the total number of votes received. For example, if we received a combined total of 5 Strongly Agree/Agree votes out of 11 total votes, the system divides 5 by 11 and multiplies the result by 100 to yield the favorability percentage (approximately 45%).
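The worked example above can be sketched in a few lines (the function name is illustrative, not part of Engage):

```python
def favorability(favorable: int, total: int) -> float:
    """Favorable (Agree/Strongly Agree) votes as a percentage of all votes."""
    return favorable / total * 100

# 5 Strongly Agree/Agree votes out of 11 total votes:
print(round(favorability(5, 11), 1))  # → 45.5
```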
The bar graph splits up the favorability sentiment in a visual, color-coded display. The following colors represent what particular votes were cast:
- Dark blue — Strongly Agree
- Lighter blue — Agree
- Grey — Neutral
- Lighter red — Disagree
- Darker red — Strongly Disagree
Satisfaction Areas vs Opportunity Areas — This section splits out the top- and bottom-performing categories, showing which scored more favorably and which scored more negatively. Every question within the survey is associated with a specific category. By default, you will see the top 3 and bottom 3 categories based on their respective favorability scores.
When employees leave comments, the AI embedded in the Engage system aggregates the key topics commonly raised by employees, and within each topic calculates the positive, negative, and neutral sentiment based on those comments. Each topic is represented in a circular graph.
Clicking "View Topics Report" will take you to the Topics reporting tab. For more information on that topic, see this article.
The final section of the Overview report is divided into 2 parts and shows which segments scored more favorably versus those that did not.
- Strongest segments — Displays the top 3 segments that had high favorability for the survey.
- Weakest segments — Bottom 3 segments with low favorability.
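The strongest/weakest split is a simple ranking by favorability score; a sketch with made-up segment scores (illustrative only, not real Engage data):

```python
# Hypothetical favorability percentage per segment (illustrative only).
segments = {"HR": 90.0, "Engineering": 82.0, "Support": 74.0,
            "Finance": 68.0, "Sales": 61.5, "Marketing": 55.0, "Ops": 49.0}

ranked = sorted(segments, key=segments.get, reverse=True)
strongest = ranked[:3]   # top 3 segments by favorability
weakest = ranked[-3:]    # bottom 3 segments
print(strongest)  # → ['HR', 'Engineering', 'Support']
print(weakest)    # → ['Sales', 'Marketing', 'Ops']
```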
This tab aggregates all of the survey answer data into one visual, color-coded heatmap. This representation gives admin users a quick overview of positive and less favorable areas without having to dig through the more granular data in the other tabs.
In the upper-right area, there are 5 color codings linked to different sentiments:
- Strongly Agree - blue
- Agree - lighter blue
- Neither Agree nor Disagree - grey
- Disagree - light red
- Strongly Disagree - red
Various segments/dimensions can be filtered in the drop-down menu next to "Segment by", and clicking the arrow next to a category will show the questions under that category.
This tab shows favorability sentiments visually, broken down by the category types within the survey questions.
Hovering your mouse cursor over each color will show the percentage/vote breakdown of those who voted in that favorability field. Additionally, hovering over a row will display a "See details" link to the right.
Clicking that link will take you to an overview report of that category.
For more information on category favorability trends, see this article.
The questions report shows the highest- and lowest-scoring questions.
Every question within the survey has its own favorability scores, which are shown on this page within the color-coded bar graph. The color coding works much like those in other tabs of the reporting, e.g. blue is more favorable, grey is neutral, and red is negative.
Clicking "See details" next to a question will take you to a more in-depth report of the favorability metrics for that question.
Within this tab, the AI integrated into the Engage system compiles the most frequently discussed key points from comments left by employees on the survey. For each topic, the AI also assesses the sentiments expressed, categorizing them as positive, negative, or neutral. Topics are presented in a circular graph format.
By default, you will be on the "Bubble Cloud" visualization. Clicking "Word Cloud" will show the topics as a word cloud instead.
At the bottom of this page, you are also able to search within the AI topics to see what results appear.
This tab takes all the comments entered for any open-ended questions in the survey, or any other questions that allowed open responses, and aggregates them into favorability and sentiment metrics and color codings to give you a top-level view of how your employees feel.
This overview will not only show the sentiment color codings, but also commonly used words in the survey responses to show what your employees entered most. If any other questions also collect open-ended comments, the application will give an overall sentiment (positive, negative, or neutral) for each of those questions. You can also search within the comments for any desired words or sentences to drill down to specific reporting.
Our application also utilizes artificial intelligence to single out comments that contain suggestions. These comments are then copied over to "Suggestions". This is another tool that helps managers and HR Teams quickly focus on topics that employees are concerned about.
This section shows how many users within a segment participated in the survey.
All available dimensions in your account will appear in the drop-down menu next to "Segment by". For more information on dimensions, see this article. Filtering by a dimension will show all the relevant segment data in the reporting, as well as the number of users who completed the survey out of the total number of users within the segment.