Engage: Survey Reports



The reports feature in Engage is where you can view both high-level and granular reporting. This part of the UI is where you can see metrics on how employees voted on each question type, get favorability scores, and read comments. From there, you can do any necessary action planning or execute next steps to ensure you meet your employees' satisfaction. To view the report for a survey, click "See report" next to the desired survey within the module:


That will then take you to this page:


The following sections will give a breakdown of each section within the "Reports" tab.


This section shows the overall participation of the employees within the survey, and is present on all reporting tabs. The number of people who took the survey appears in parentheses, next to the total number of users within the audience:


Additionally, in the right corner you can add filters to the reports using the filter option, which will bring up the "Add a Dimension" button where you can select which dimension(s) to filter by:


The reporting in this tab will then update to reflect the information solely for the dimensions that are filtered. The "View options" button allows you to compare the results from this survey with a previous survey, company average, or benchmark:


Checking the "Show comparison" checkbox will show an additional percentage next to each average on this page, comparing the averages from this survey with whatever comparison option you have selected.
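The comparison value works like a simple signed difference: the current survey's score minus the score from the selected comparison source. A minimal sketch (the function name and formatting are illustrative, not part of Engage):

```python
# Hypothetical sketch of the comparison figure shown next to each average
# when "Show comparison" is checked: current score minus comparison score,
# rendered as a signed percentage-point difference.
def comparison_delta(current: float, comparison: float) -> str:
    """Return the delta formatted the way a report UI might display it."""
    delta = current - comparison
    return f"{delta:+.0f}"

print(comparison_delta(72, 65))  # +7  (this survey scored 7 points higher)
print(comparison_delta(60, 68))  # -8  (this survey scored 8 points lower)
```

The same logic applies whether the comparison source is the previous survey, the company average, or a benchmark.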

"Previous survey" will compare your results with the last closed survey. "Company's average" derives the total averages from all survey reporting in the system. "Benchmark" comes from predefined world percentages for these categories. For more information on benchmarks, see this articleLastly, you can export the overview report in PDF or CSV using the export option:



Clicking the "Overview" tab will take you here:


This section is divided into 4 main parts:

  • Participation — This displays the overall number of people who have completed the survey. There are also various filter/view options in this section. 
  • Overview — eNPS, Favorability, Linked Survey trends, and top/worst-performing categories are displayed here.
  • Comments Overview — The total number of comments entered by employees, along with various metrics, is shown in this section. 
  • Segments — This section displays which segments (i.e. users, employee titles) have the strongest/weakest/biggest improvements and participation.

This section has 3 main features:

eNPS (Employee Net Promoter Score)


This score will only be present in the reporting if there is at least 1 eNPS question in the survey.

This metric is used to measure employee loyalty, and is built around the Net Promoter Score (NPS). It is a key measurement in the system, as this score captures how willing employees are to recommend your workplace to their family, friends, and/or network of colleagues. 

eNPS is calculated based on the following formula:

eNPS Score = % of Promoters - % of Detractors

The score is based on the survey question “On a scale of 0-10 how likely are you to recommend our organization to your family or friends?” Afterward, the answers submitted are divided into the following three categories:

0-6: Detractors — Users that fall within this range are dissatisfied with the organization, and may potentially be spreading negative word of mouth to their peers. A survey taker is a detractor when they give a score between 0 and 6.

7-8: Passives — Those that have no positive or negative sentiment about your company. A respondent who has scored you between 7 and 8 is considered to be passive or neutral.

9-10: Promoters — Very positive and enthusiastic about the company. Those that score 9 or 10 on the eNPS question fall within this category. 
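The formula and the three categories above can be sketched in a few lines; the function name is illustrative, not part of Engage:

```python
def enps(scores: list[int]) -> int:
    """Compute eNPS from a list of 0-10 responses.

    eNPS = % of promoters (9-10) minus % of detractors (0-6);
    passives (7-8) count toward the total but neither percentage.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    total = len(scores)
    return round(100 * promoters / total - 100 * detractors / total)

# 4 promoters, 3 passives, 3 detractors out of 10 responses:
# 40% - 30% = 10
print(enps([10, 9, 9, 10, 7, 8, 7, 5, 3, 6]))  # 10
```

Note that the score can range from -100 (all detractors) to +100 (all promoters).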

Overall Favorability


This area sums the “Strongly Agree” and “Agree” votes from the survey and turns them into a favorability score. Essentially, the total share of people who voted favorably across all questions within the survey is presented here. No matter how many rating questions the survey has, the score displayed here is the combined average across all of those questions.
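A minimal sketch of that calculation, assuming each rating answer is one of the five labels shown in the bar graph (the function and label names are illustrative):

```python
# Hypothetical sketch of an overall favorability score: the percentage of
# "Strongly Agree" and "Agree" votes across all rating questions combined.
FAVORABLE = {"Strongly Agree", "Agree"}

def overall_favorability(answers: list[str]) -> int:
    """Percentage of favorable votes across all rating answers."""
    if not answers:
        raise ValueError("no answers")
    favorable = sum(1 for a in answers if a in FAVORABLE)
    return round(100 * favorable / len(answers))

votes = ["Strongly Agree", "Agree", "Neutral", "Disagree", "Agree",
         "Strongly Disagree", "Agree", "Strongly Agree", "Neutral", "Agree"]
print(overall_favorability(votes))  # 60
```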

The bar graph splits up the favorability sentiment in a visual, color-coded display. The following colors represent what particular votes were cast:

  • Dark blue — Strongly Agree
  • Light blue — Agree
  • Grey — Neutral
  • Light red — Disagree
  • Dark red — Strongly Disagree

If a prior survey is linked to the one you're viewing reports for, the "Linked Surveys Trend" section will show a line graph comparing the favorability trends between the two surveys:


Hovering your mouse cursor over one of the dots on the line graph will display the survey and favorability info:


Satisfaction Areas vs Opportunity Areas — This section splits up the top and bottom performing categories, showing which ones scored favorably versus more negatively. Every question within the survey is associated with a specific category. By default, you will see the top 3 and the bottom 3 categories based on their respective favorability scores:



This section provides various sentiments and metrics based on the comments and responses submitted by your employees for the survey. The overall "Sentiment" can be Positive, Negative, or Neutral, and is calculated by subtracting the percentage of negative comments from the percentage of positive comments:
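That calculation can be sketched as follows, assuming each comment has already been classified as positive, negative, or neutral (the function name is illustrative):

```python
# Hypothetical sketch of the overall comment sentiment: the sign of
# (positive % - negative %) determines the label shown in the report.
def overall_sentiment(labels: list[str]) -> str:
    """Return Positive, Negative, or Neutral from per-comment labels."""
    if not labels:
        return "Neutral"
    total = len(labels)
    positive_pct = 100 * labels.count("positive") / total
    negative_pct = 100 * labels.count("negative") / total
    diff = positive_pct - negative_pct
    if diff > 0:
        return "Positive"
    if diff < 0:
        return "Negative"
    return "Neutral"

# 50% positive - 25% negative = +25, so the overall sentiment is Positive
print(overall_sentiment(["positive", "positive", "neutral", "negative"]))
```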


The most popular question categories also have their own sentiment circles under "Popular categories". Within the circle, blue represents positive comments, grey is neutral, and red is negative. Hovering your mouse cursor over each of those categories will display the percentage breakdowns of each of those color codings:


Clicking on "View Comments Report" will take you to the Comments tab of the reporting.


The final section of the Overview report is divided into 4 sections, and shows which segments scored more favorably versus those that did not:


  • Strongest segments — Displays the top 3 segments that had high favorability for the survey.
  • Weakest segments — Bottom 3 segments with low favorability.
  • Biggest improvements — This section is displayed only when another survey is linked to the current one. It shows the top 3 segments that had the highest increase in favorability from the previous linked survey.
  • Biggest decline — Only displayed when there's a linked survey. This shows the top 3 segments with the highest decline in favorability from the previous linked survey.


This tab aggregates all of the survey answer data into one visual, color-coded heatmap. This representation can give admin users a quick overview of positive and less favorable areas without having to dig through the more granular data in the other tabs:


Various segments/dimensions can be filtered in the drop-down menu next to "Segment by", and clicking the arrow next to a category will show the questions under that category:



This tab shows favorability sentiments in a visual manner, broken down by the category types within the survey questions:


Hovering your mouse cursor over each color will show the percentage/vote breakdown of those that voted in that favorability field. Additionally, hovering over a category will display a "See details" link to the right: 


Clicking that will take you to an overview reporting of that category:


For more information on category favorability trends, see this article.


The questions report shows the highest and lowest scoring questions, as shown below:


Every question within the survey has its own favorability score, which is shown on this page within the color-coded bar graph. The color coding works much like it does in the other tabs of the reporting, e.g. blue is more favorable, grey is neutral, and red is negative. 

Clicking "See details" next to a question will take you to a more in-depth report of the favorability metrics for that question:


This will take you here:


The above is for an open-ended question.


This tab takes all the comments entered for any open-ended questions in the survey (or any other questions that allowed open responses) and aggregates them into favorability and sentiment metrics with color coding, giving you a top-level view of how your employees feel:


This overview not only shows the sentiment color coding, but also the commonly used words in the survey responses, to show what was entered most by your employees. If any other questions also collect open-ended comments, the system will give an overall sentiment of the comments as positive, negative, or neutral for each of those questions. You can also search within the comments for any desired words/sentences to drill down to specific reporting.


This section shows how many users within a segment participated in the survey:


All available dimensions in your account will appear in the drop-down menu next to "Segment by". For more information on dimensions, see this article. Filtering by a dimension will show all the relevant segment info in the reporting, as well as the number of users that completed the survey out of the total number of users within the segment. 


  • Survey reports present data for the entire survey, and cannot be broken up into custom date ranges. If you'd like to view reporting over time, the best way to achieve that is to utilize the dashboard and the filters present there. More info on the dashboard can be found in our help center article here. Filters present in the dashboard allow you to set custom or preset date ranges and will aggregate survey reporting data (which can be filtered even further).
  • Visibility into what data segmentation a manager/admin user can see is fully customizable depending on what dimensions the user has visibility into. More information on setting those permissions in User Management can be found here. Setting specific dimension permissions for managers and admins is useful because it can allow them to see only the data they need to see (e.g., employees from the locale they oversee, specific department(s) they manage, etc.) and allows for greater confidentiality in the reporting.
  • Tenure is calculated once every 24 hours in the system, and will reflect appropriately in the survey reporting/segmentation as well. Tenure is determined by the "Employee lifecycle" settings in App Settings. More info on that can be found here.