Student Survey of Teaching Task Force Final Report - 2024-25
Executive Summary
Background and Purpose. The AY24-25 SST Task Force conducted a comprehensive review of KU's Student Survey of Teaching tool following three years of implementation since its 2021 launch. The review included gathering input from faculty, GTAs, unit leaders, and graduate and undergraduate students; analyzing KU SST data; and reviewing relevant research on student surveys of teaching.
Key Findings
- Most stakeholders found the SST questions acceptable but requested an expanded rating scale, clearer language, and additional questions to gather specific information.
- Response rates over the last three years averaged 31% for small courses and 23% for large courses. Both stakeholder input and research on response rates indicate that administration methods, perceived value, and timing are key factors.
- Input from instructors and chairs highlighted technical and usability issues, pointing to opportunities to enhance dashboard functionality and utility.
Major Recommendations
Survey Structure and Questions: Expand the rating scale from 3 points to 4, consolidate the separate course and instructor surveys into a single survey per instructor, center the student experience in the question language, and add questions on learning outcomes and teaching assistants.
Administration and Response Rates: To ensure effective administration that supports higher response rates, provide instructors with clear communication, training, and ongoing support. Instructors should allocate dedicated class time for students to complete evaluations, consider offering incentives to encourage participation, and explain and model how student input is used to support the instructor's continuous improvement of the course.
Dashboards/Reports: Enable longitudinal trend analysis, tailor dashboards to the distinct needs of instructors and chairs, investigate AI-powered comment summaries, and provide multiple ways to export comments.
KU Student Survey of Teaching Task Force AY24-25: Charge and Membership
Project Sponsor: Provost Barbara Bichelmeyer
Project Customer/Client: VP Faculty Affairs Amy Mendenhall
Partnering Units: CTE – subject matter expert; AIRE – technical service delivery; Faculty Affairs – end user representative
Work Group Facilitator/Chair: Dea Follmer Greenhoot
Timeline: Launched September 2024, recommendations report submitted June 2025
Overview
This year the Office of Faculty Affairs, AIRE, and CTE are partnering to launch a Student Survey of Teaching Task Force. With an overarching charge set by Provost Bichelmeyer, this group will work to conduct a comprehensive review of all aspects of the current University Student Survey of Teaching.
The Office of Faculty Affairs will serve as the overall sponsor of the group, providing support and important contextual framing for the work, and Dea Follmer Greenhoot, Director of CTE, will serve as the group’s chair.
Our goal is to have the committee membership reflect a broad range of perspectives from across the university. Each committee member brings a unique perspective and experiences that will contribute to this important work. We are including some members of the 2021 ad hoc committee that developed the current SST so that we have an opportunity to build on that work.
KU SST Background
Student surveys of teaching are a measure of teaching effectiveness that provide an important modality for hearing students’ perspectives about their experiences in a course and with an instructor. Though KU policy notes that multiple sources of evidence should be used in assessing instructors’ teaching, student surveys of teaching are a required component per KBOR and KU policy. As such, it is critical for KU to have a student survey of teaching that provides students with an opportunity to share their perspectives, while also providing instructors and chairs with relevant data that informs ongoing teaching assessment and improvement.
Launched in 2021, following the work of an ad hoc university task group, the current Student Survey of Teaching represents a major shift in survey questions and rating scales, as well as in the modality of survey administration. After three years of administering the current version, it is clear that some aspects are an improvement over previous versions (e.g., removing questions that could lead to biased responses), while there are also notable challenges that limit the usefulness of the resulting data for instructors and unit leaders (e.g., low response rates). These limitations have led to a fragmentation of the university survey: some units have developed their own surveys, others are considering discontinuing use, and instructors and unit leaders are relying on data from only a small sample of students to make assessments of teaching for annual evaluation and promotion.
Task Force Charge
The guiding principles for the work of the group are:
- The SST needs to be developed and administered in a way that encourages student completion.
- The SST must meet the baseline need of having some comparative, quantitative data for individual faculty and unit leaders (as required by some disciplinary accrediting bodies).
- To most effectively solicit relevant, informative, and unbiased feedback, the SST should focus on areas that students are qualified to evaluate based on their experiences in class.
Work Group Deliverable
Informed by user feedback, research on best practices for teaching assessment, and system/platform capabilities, the work group is charged with developing a report outlining recommendations for improving the University Student Survey of Teaching, to make it an instrument that is user friendly for students, meets the needs for units across the university, and provides data that can be confidently incorporated into assessments of teaching.
The full recommendation report should be completed and delivered to the Provost for review by March 1, 2025.
SST Task Force Members
- Dea Follmer Greenhoot (task force chair), CTE, Psychology (Director, Professor)
- Meagan Patterson, Faculty Affairs Fellow, Education (Professor)
- Jason Koepp, AIRE
- Marta Caminero-Santangelo, English (Chair, Professor)
- Shahnaz Parsaeian, Economics (Assistant Professor)
- Brian Lagotte, Global & International Studies (ATP)
- Alesia Woszidlo, Leadership Studies/Communication Studies (Associate Professor, Director)
- Caroline Bennett, Civil, Environmental, & Architectural Engineering (Chair, Professor)
- Jason Matejkowski, Social Welfare (AD, Professor)
- John Bricklemyer, Professional Studies & KU Edwards Campus (AD, Professor of Practice)
- Colin Roust, Music (AD, Associate Professor)
- Pat Downes, Business (Associate Professor)
- Olakunle Akinniyi, Student Senate member
Summary of SST Task Force Process
The SST Task Force undertook a comprehensive multi-phase approach to review and refine the Student Survey of Teaching (SST) over the 2024-2025 academic year. The process combined analysis of existing materials, reviews of the empirical literature on student evaluations, and information gathering from various stakeholders including faculty, academic leaders and students.
- Initial Assessment and Data Analysis
- Tool Review. The task force began by conducting a thorough review of materials and survey instruments developed by the original SST committee, which created the current tool in 2020.
- Policy Review. We gathered KU and KBOR policies regarding student evaluations of teaching (Appendix A).
- Response Rate Analyses. To understand usage patterns and effectiveness, the task force asked AIRE to analyze response rates for the existing SST tool. Additional analyses were performed to identify predictors of response rates and to pinpoint courses that achieved higher-than-expected participation. This enabled us to survey the KU faculty teaching those courses to gather information about implementation strategies that could potentially be replicated.
- Literature Reviews
- Research on Student Evaluations. We reviewed the literature review created for the 2020 SST committee.
- Performance Management Research. We conducted a review of the performance management literature from the business sector, bringing external perspectives on evaluation methodologies and their effectiveness in organizational contexts.
- Research on Online Evaluation Response Rates. We also examined research on response rates to online student surveys to identify evidence-based insights into factors that influence student participation and strategies for improving engagement.
- Stakeholder Engagement
The task force prioritized gathering input from multiple stakeholder groups through various engagement methods.
- Faculty conversations. Faculty perspectives were collected through small group conversations. Major questions were: What do you need/want out of the tool and the reports? What are you using them for? How do you usually feel about looking at your results? Are there particular items that would be especially useful for your teaching development? For summative evaluation? What is your process for administering evaluations?
- High responder survey. We also conducted a survey of instructors of “high response” courses to gather information about their methods of administration to inform recommendations to improve response rates.
- Chair Feedback. Department chairs provided input through both small group sessions and email communications. We asked unit leaders: What information do you need/want from the tool and the reports? What is your unit’s process for administering evaluations?
- Student Focus Groups. We gathered perspectives from undergraduate and graduate students through separate focus groups conducted by a CTE staff member with qualitative research expertise. Students were asked: What factors influence whether you complete the SST, and whether you add open-ended comments? What incentives would encourage you to complete it? What do you think happens with the information you provide? Do the questions enable you to provide the feedback you want to give instructors about your course experience? Are there other avenues besides the SST for providing feedback to your instructors?
- Synthesis and Recommendations
We then synthesized and organized the information gathered from these multiple sources to generate a set of recommendations for improving response rates, the tool itself, and the way the data are represented to instructors and unit leaders/supervisors. The task force’s recommendations for administration/improving response rates were distributed to instructors for the Spring 2025 administration of the SST.
Synthesis of Information Gathered on Student Surveys of Teaching
Note: all documents referenced are available in the Supplemental Documents folder.
Questions and Survey Construction
- SST literature review (2020, Doug Ward) - See Appendix A. We reviewed the literature review developed by Doug Ward for the SST Committee in 2020. The major takeaway is that SSTs are an opportunity for students to provide feedback on their own experience, but not to rate all aspects of teaching effectiveness.
What students are qualified to judge:
- What occurred in the class, including organization, use of class time, and approaches an instructor took to help students learn
- Clarity of goals, expectations, and presentation
- Timeliness and clarity of feedback
- Availability of the instructor outside class
- Sense of class climate
- Sense of workload compared with other classes
- How often a class engaged in discussion
- Quality of an instructor’s presentations

What students are NOT qualified to judge:
- Quality of course content
- Instructor’s knowledge of subject matter
- Effectiveness of course design or effectiveness of the instructor
- Appropriateness of course goals
- Quality of the instructor’s assessment of students
Sources: Clayson (2020); Benton and Young (2018); Task Force on the Assessment of Teaching and Learning (2007); Frey (1974)
- Review of Performance Management Literature. Task force member Pat Downes conducted a review of the performance management literature. See Appendix B. Major findings:
- Employees react more favorably to systems that emphasize learning and development, but as long as the system is perceived as fair and accurate, employees do not react adversely to evaluative performance management systems.
- Multi-source feedback is particularly useful for employee development
- Absolute ratings as opposed to relative rankings are preferable (rating systems should focus on the quality of instruction, not how an instructor stacks up against peers)
- GTA feedback/recommendations
- Want more open-ended questions
- Would like midsemester feedback for real time course adjustments
- Include questions about learning outcomes or that ask students to reflect on what they have learned
- Faculty feedback/recommendations
- Value non-numerical feedback. Add comment boxes after each section/question instead of just a single point at the end
- Expand the response scale to allow more variation (between "Throughout" and "Sometimes")
- Improve question clarity
- Consider a tiered question structure with university-wide questions, program-specific questions, and course-type-specific questions
- Include yes/no questions about engagement, attendance, Canvas use
- Add midsemester evaluations for formative feedback
- Chair feedback/recommendations
- Value numeric feedback
- Chairs who have large numbers of instructors to evaluate would like a concise standardized evaluation that could serve as a quick diagnostic tool
- Would like questions to help identify egregious behaviors so that they can intervene
- Need language in questions that students will understand
Administration and Response Rates
- The 2020 SST literature review (Appendix A) indicated target response rates of 15-25% for large courses and 40-53% for small courses (30 or fewer students); below these thresholds there are concerns about the validity of the feedback/data.
- Brief Review of Research on Response Rates (2024) - Andrea Follmer Greenhoot conducted a literature review of recent research on response rates, particularly for surveys administered online. See Appendix C. Major factors affecting response rates:
- Perceived value and impact - will responses lead to actual changes? Students are even more motivated if they think they will benefit personally. Explain how results will be used and model use of results.
- Survey Design – shorter surveys are better; for online delivery, mobile devices increase access but cut back on open-ended feedback, while a laptop or desktop is best for open-ended questions
- Timing - Avoid busy times in the semester, create dedicated time during class to administer the survey (first 15 minutes), and provide reminders to complete it
- Anonymity - Students fear that they will be identified and that it will affect their grades, especially if they will have the same instructor later
- Incentives - Incentives increase participation (e.g., prizes or points for completion); class-wide incentives for meeting overall response thresholds work as well as individual incentives, especially if the threshold is high (80% or more)
- Undergraduate student feedback (from focus groups) - Recommendations:
- Make it clearer to students when the survey is available, and send reminders
- Set aside class time
- Explain why and how it is used. Students do not believe their responses matter
- Provide opportunities for earlier feedback that can affect them personally
- Graduate student feedback - Recommendations:
- Do not administer during such a busy time of the semester
- Explain why and how feedback is used
- Address concerns about anonymity, which appear to be heightened among graduate students given their smaller class sizes and faculty ability to identify their unique writing styles
- Faculty feedback - Low response rates are a major concern. Faculty recommendations for process and administration:
- Provide students extra credit for completion
- Provide multiple reminders
- Share survey links/embed in Canvas as assignment
- Share response rates with class
- Chair feedback - As with faculty, low response rates are a major concern. Privacy and retaliation fears potentially impact response rates and candor. Administration suggestions include:
- Timing: Administer at start rather than end of class sessions
- Strategic Announcements: Faculty actively promote survey importance
- Incentives: Recognize that students may wait for grade feedback before completing evaluations; course credit increases response rates
- Integrate survey into Canvas as assignments
- High Responder Survey: After identifying courses with higher-than-expected response rates (based on a predictive model developed on the full set of response data from the last 4 years by task force member Pat Downes; an illustrative sketch of a residual-based approach of this kind follows this list), we conducted a survey of instructors of those courses to find out about their administration practices. Key insights:
- Almost all who teach synchronous courses reported that they set aside class time for students to complete the SST, typically during one of the last two weeks of the semester.
- Most allocate time at the beginning of class to avoid students leaving early.
- Most inform students in advance of the survey through Canvas, in-class verbal reminders, and emails. Some include it in their syllabus.
- Many emphasize the importance of the survey by linking it to course improvement, instructor evaluation, and university processes like promotion to demonstrate that they will use it constructively.
- Almost half used midsemester surveys to gather formative feedback from students, another strategy to model the value of student input.
- Only a small number (14%) used incentives like extra credit or treats to boost completion.
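For concreteness, below is a minimal sketch of how "higher-than-expected" courses could be flagged with a residual-based model. This is an illustration only, not the model the task force developed; the file name, column names (`enrollment`, `level`, `modality`, `responses`, `course_id`), and choice of predictors are hypothetical placeholders.

```python
# Illustrative sketch (not the task force's actual model): flag courses whose
# SST response rates are higher than course characteristics would predict.
# The file name, column names, and predictors are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sst_response_data.csv")
df["response_rate"] = df["responses"] / df["enrollment"]

# Model the expected response rate from observable course characteristics.
model = smf.ols("response_rate ~ enrollment + C(level) + C(modality)", data=df).fit()

# Large positive residuals mark courses that outperformed their prediction;
# their instructors are natural candidates for a follow-up survey.
df["residual"] = model.resid
high_responders = df.sort_values("residual", ascending=False).head(25)
print(high_responders[["course_id", "response_rate", "residual"]])
```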
Reports and Use of Data
- Research Literature
- See the Questions and Survey Construction section above for information on what students can and cannot report on
- Student surveys are best used as one of multiple sources of information given the bias and limitations associated with this tool
- GTA feedback: Concerns about awareness and timing of feedback and use for improvement
- They need more info about when and how to retrieve their feedback.
- They would like the feedback sooner, so they have time to make course improvements
- They would like more mentorship and feedback from faculty (e.g., review student survey results together and discuss implications for teaching)
- Faculty feedback: Concerned about technical issues with the reports/dashboards and limitations on what they show (e.g., questions are truncated, it is hard to look at data over time and locate prior data, it is unclear when data are available, and some would like Excel format). Technical and process recommendations:
- Provide orientation to new faculty on the system
- Make sure faculty are notified when results are available, and that they know how to get prior semester results as well
- Provide results in multiple formats (PDF and Excel)
- Provide AI tools to summarize comments to make open-ended feedback more useful
- Provide mechanism to look at progress over time
- Create clearer criteria for evaluating performance with the system so that faculty have a better understanding of whether they are meeting expectations and how to improve
- Chair feedback: Also concerned about difficulty of finding the reports and gleaning the information they need from them. Recommendations:
- Make reports easier to process and more diagnostic
- Develop tools that enable cross-course and longitudinal tracking
- Consider AI based analysis of open-ended comments
- Provide implementation resources that connect feedback to improvement strategies and support adoption of better practices
Recommendations: Refining the SST Questions and Structure
The task force found that most stakeholders were generally satisfied with the current SST questions and wanted refinements rather than major overhauls. The literature review reinforced that students are qualified to assess their classroom experience, clarity of instruction, and course climate, which suggests keeping a focus in the questions on the student experience and student perceptions.
The primary improvements suggested were:
- Expanding the rating scale to allow more nuanced responses (particularly between "Throughout" and "Sometimes")
- Enhancing question clarity using language students can more easily understand
- Providing more open-ended questions for qualitative feedback
- Adding questions about student learning outcomes and reflection (it was not clear to many instructors why there is just one question on learning and it is about critical thinking)
- Adding one or more questions that could help chairs identify concerning behaviors requiring intervention
- Adding question(s) about teaching assistants
- Organizing the questions into teaching dimensions that will facilitate interpretation and use of the resulting data by the instructor and supervisors
A. Rating Scale Revision
The task force recommended revising the rating scale from a 3-level scale to a 4-level scale. For the frequency ratings, the current scale is (Did not do this, Did this Sometimes, Did this Throughout the Course). The task force recommends the following alternative:
- Hardly ever
- Sometimes
- Often (or Frequently)
- Almost always
B. Structural Revision
The task force also recommended dropping the separation between “course items” and “instructor items,” which necessitated two separate survey links for students to complete an evaluation of a course taught by a single instructor (and more with multiple instructors). The large number of survey links is cited as one barrier to high response rates. The task force felt it was possible to distinguish between course qualities and instructor behaviors without two separate surveys. Thus, we recommend:
- Students will complete a survey attached to each instructor of their course
- Aggregation of items at the course level across instructors can be done “on the back end” by AIRE. AIRE has confirmed that this is possible and has already made this revision to the Spring 2025 survey administration (an illustrative sketch of such aggregation follows below).
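As an illustration of what this back-end aggregation could look like (the actual AIRE implementation is not described in this report), here is a minimal sketch that weights each instructor's item means by the number of respondents. All file and column names are hypothetical.

```python
# Illustrative sketch of course-level aggregation across instructor surveys.
# This is NOT the actual AIRE implementation; file and column names are
# hypothetical.
import pandas as pd

# Assumed columns: course_id, instructor_id, item_id, n_responses, mean_rating
responses = pd.read_csv("instructor_item_responses.csv")

# Weight each instructor's item mean by its number of respondents so that
# sections with more responses contribute proportionally to the course mean.
responses["weighted_sum"] = responses["mean_rating"] * responses["n_responses"]
totals = responses.groupby(["course_id", "item_id"])[["weighted_sum", "n_responses"]].sum()
totals["course_mean_rating"] = totals["weighted_sum"] / totals["n_responses"]

print(totals[["course_mean_rating"]].head())
```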
C. Proposed Items
This proposal eliminates 2 items, adds 4 items, makes minor revisions to 9 items, and keeps 5 items the same.
Table 1. Crosswalk of Teaching Dimensions, Original Questions, Proposed Questions and Rating Scale
| Dimension of Teaching (to be used in reports) | Prior Questions | Proposed Questions | Response Scale |
| --- | --- | --- | --- |
| Time spent on course | Outside of scheduled class meetings and exams, about how much time did you spend on this course per week, on average? | KEEP | 0-4 hours per week / 5-10 hours per week / 11-15 hours per week / 16-20 hours per week / 21 or more hours per week |
| Time spent on course | Compared with other classes you have taken at this level, how much time per week did you spend on this course? | DROP | |
| Course Administration | | ADD: Class meetings started and ended on time. | Hardly Ever / Sometimes / Frequently / Almost Always / NA (for asynchronous online) |
| Course Administration | | ADD: Class time was well utilized. | Hardly Ever / Sometimes / Frequently / Almost Always / NA (for asynchronous online) |
| Course Administration | The instructor made deadlines clear | The course policies (attendance, late work, etc.) and deadlines were clear. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Course Administration | The instructor was clear about how I would be graded | The grading criteria were clear. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Course Goals, Content and Alignment | The instructor helped me understand what I was expected to learn | I knew what I was expected to learn and do in this course. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Course Goals, Content and Alignment | Course materials were useful in my learning | The course materials were helpful to my learning. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Course Goals, Content and Alignment | The instructor explained the purpose of work I did in the course (things like discussions, assignments, exams, class activities) | I understood the purpose of the work I did in the course (things like discussions, assignments, exams, class activities). | Hardly Ever / Sometimes / Frequently / Almost Always |
| Teaching Practices (feedback) | | ADD: The instructor was available to answer questions or provide assistance. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Teaching Practices (feedback helpfulness) | The instructor provided feedback that helped me learn | KEEP | Hardly Ever / Sometimes / Frequently / Almost Always |
| Teaching Practices (engagement) | The instructor used approaches that encouraged me to participate in class activities (in person or online) | The instructor’s approach encouraged me to participate and engage in class meetings or activities. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Teaching Practices | | PROPOSED (no consensus): I had opportunities for engagement with my instructor(s) and peers. The task force did not reach consensus on this item; peer engagement may not be relevant in all courses, and other items may be sufficient to assess engagement with the instructor. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Class Climate | The instructor helped create an environment in the class (whether in person or online) that motivated me to learn | DROP (redundant with two others) | |
| Class Climate | The instructor responded respectfully if I had questions | The class climate was respectful, open, and welcoming. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Class Climate | The instructor helped me feel that I could succeed in the class | KEEP | Hardly Ever / Sometimes / Frequently / Almost Always |
| Self-assessment of learning | The course helped me improve my critical thinking | I acquired the knowledge and skills that were identified in the syllabus as course learning outcomes. | Hardly Ever / Sometimes / Frequently / Almost Always |
| Self-assessment of learning | The instructor helped me understand different ways to apply what I learned | I acquired knowledge and skills that I can use in the future (in other courses, work, life). | Hardly Ever / Sometimes / Frequently / Almost Always |
| Open-ended feedback | What aspects of this class were most helpful to your learning? | KEEP | Open response |
| Open-ended feedback | What aspects of this class need improvement? | KEEP | Open response |
| GTAs | | ADD: Did this course have one or more teaching assistants (in addition to the instructor)? (Yes/No) If yes, how effective were the TAs in helping you learn the course material? | Yes/No, then open response |
Final Proposed Survey Text:
Student course surveys play an important role in enhancing the quality of instruction at KU. Your responses to this survey are anonymous. Please give them careful attention. The results will be made available to the instructor, the department chair and other appropriate administrators (after all final grades are turned in).
We value your input about the class. Unfortunately, studies have revealed that responses on course surveys can be influenced by unconscious and unintentional biases about race and gender.
KU is committed to access, success, respect, and belonging, so we encourage you to avoid comments about personal characteristics of the instructor or other students. The survey should take 10 minutes or less, and you will have opportunities to elaborate on your responses.
Time Spent on Class
- Outside of scheduled class meetings and exams, about how much time did you spend on this course per week, on average?
- 0-4 hours per week
- 5-10 hours per week
- 11-15 hours per week
- 16-20 hours per week
- 21 or more hours per week
This section focuses on course administration, materials, and expectations in this class.
(Rating Scale: Hardly Ever, Sometimes, Frequently, Almost Always, NA (for asynchronous online))
- Class meetings started and ended on time.
- Class time was well utilized.
- The course policies (attendance, late work, etc.) and deadlines were clear.
- The grading criteria were clear.
- I knew what I was expected to learn and do in this course.
- The course materials were helpful to my learning.
- I understood the purpose of the work I did in the course (things like discussions, assignments, exams, class activities).
This section asks about the instructor’s teaching approaches and the class environment they created. (Rating Scale: Hardly Ever, Sometimes, Frequently, Almost Always)
- The instructor’s approach encouraged me to participate and engage in class meetings or activities.
- The instructor was available to answer questions or provide assistance.
- The instructor provided feedback that helped me learn.
- The class climate was respectful, open, and welcoming.
- The instructor helped me feel that I could succeed in the class.
These questions ask you to reflect on your learning in this course. (Rating Scale: Hardly Ever, Sometimes, Frequently, Almost Always, NA (for asynchronous online))
- I acquired the knowledge and skills that were identified in the syllabus as course learning outcomes.
- I acquired knowledge and skills that I can use in the future (in other courses, work, life).
Open-Ended Feedback
- What aspects of this class were most helpful to your learning?
- What aspects of this class need improvement?
- Did this course have one or more teaching assistants (in addition to the instructor)? (Yes/No)
- If yes, how effective were the TAs in helping you learn the course material?
Recommendations: Administering the SST and Improving Response Rates
A. Recommendations for the University
- Consolidate Course and Instructor Surveys. Address logistical barriers to completion by consolidating the surveys so that there is only one link per course/instructor. Students indicate that they are overwhelmed by the number of survey links in their dashboards, which may depress response rates. Additionally, some faculty report that students complete the first survey on the course items but not the instructor survey (even though they are taken to the instructor survey upon completing the course survey).
- Develop More Comprehensive Communication to and Scaffolding for Instructors. This includes:
- Clarity on the Timing of Administration. Improve communication about when the SST is available. This may include simplifying and adjusting the existing administration timetable: https://kansas.sharepoint.com/teams/StudentSurveysofTeaching. The task force recommends adjusting administration dates for short courses so that SSTs are administered in a timely manner for courses offered in the first half of the semester as well as for winter term courses. Currently, the administration dates for short courses are timed at the end of the 16-week semester, which is appropriate for second 8-week courses but creates a lag of weeks or months for short-term courses that finish in the first half of the semester.
- Information on How Students Can Find the SST. Instructors do not have access to the student view of the SST. Develop video tutorials/visual materials for faculty to see how students navigate to the SST so they can better coach students in completing it.
- Sharing Best Practices for Administration. Maintain regular communications to faculty with information about when the survey will open, recommendations for how to administer it to enhance response rates (see instructor recommendations and sample email below), a script for administration, and when and where the results will be available.
- Sharing Data and Examples. Provide response rate data from the Spring 2025 pilot of these new recommendations, along with examples of effective practices shared in the High Responder Survey, in communications to instructors about administration.
- Allow Incentives. Research on response rates suggests that incentives for participation can drive higher completion rates. Typically, these are enacted by instructors at the course level.
- Consider providing an option for units to distribute the standard SST in paper form, even if the unit is responsible for managing the logistics of the paper administration (the absence of this option is part of why so many alternate forms are being used, e.g., in CEAE and Business). If this option is provided (and/or if units are able to continue to use alternate tools as they do now), providing an option for units to opt out of the university-wide SST would facilitate improved tracking of the courses utilizing the standard form and response rates among those courses. Currently, the data include many courses in which students are being asked to complete alternate forms.
- Consider Support for Early- or Mid-semester Surveys. Students indicated that a major factor influencing whether they complete SSTs is whether they believe their instructors will use the data. More consistent use of early or midsemester surveys for students to provide input on the course and their learning would help create a culture of responsiveness to student feedback. Thus, the university might consider how to provide more robust support for this option (but it is important to note that if midsemester surveys were administered for all courses without systematic use of them, this practice could reinforce student views that their input is not used).
B. Recommendations for instructors
- Use class time. Set aside 10-15 minutes at the beginning of a class period (for synchronous courses) for students to complete the survey. Be prepared with the information students will need to find the survey and complete it (see the script below). Leave the room while students complete the survey; ask a student to let you know when the class is finished.
- Schedule it for students. Add completion of the SST as a no-credit assignment or an event on the Canvas course calendar/to-do list and/or include it on your syllabus.
- Don’t wait until the very end of the semester. Consider using a class session that falls before the final week of the semester, when student and instructor schedules are particularly full.
- Explain and model how you use student feedback. One major reason why students do not complete SSTs is that they do not believe their feedback makes a difference. Take time to explain to your students why their feedback is important and how you will use it. Identify particular dimensions of the course about which you are eager for their feedback. If possible, model use of student feedback by sharing examples of course features you have changed in response to prior student feedback.
- Gather and use student input earlier in the course. Demonstrate the value of student input by using an early- or midsemester survey or reflection assignment(s) to gather student feedback earlier in your course, and use it to make adaptive adjustments to your course. See this page on the CTE website for more information about gathering student feedback.
- Consider an incentive. Offer extra credit or another incentive for completion. Two strategies that both keep student responses confidential and have been shown by research to be effective in improving completion rates:
- All students receive extra credit if the class hits a target completion rate (e.g. 80% or higher). Update the class on completion levels until the criterion is hit (you can view completion rates in your Qualtrics dashboard here).
- Provide individual incentives; students receive extra credit for sending you a screenshot showing that they have completed the survey.
C. Draft Email to Instructors (note, this was the email distributed in Spring 2025)
Dear Instructors,
In this email you will find some recommendations for how to administer the University Student Survey of Teaching. These recommendations are aimed at promoting response rates and getting the most useful feedback from your students. The survey will open to students in all full semester courses on April 21. The last day for students to complete the survey is Stop Day, May 9. Your results will be available to you one day after the grade submission deadline in your Qualtrics dashboard.
Recommendations for Instructors
- Use class time. Set aside 10-15 minutes at the beginning of a class period (for synchronous courses) for students to complete the survey. Be prepared with the information students will need to find the survey and complete it (see the script below). Leave the room while students complete the survey; ask a student to let you know when the class is finished.
- Schedule it for your students. Add completion of the SST as a no-credit assignment or an event on the Canvas course calendar/to-do list and/or include it on your syllabus.
- Don’t wait until the very end of the semester. Consider using a class session that falls before the final week of the semester, when student and instructor schedules are particularly full.
- Explain (and model) how you use student feedback. One major reason why students do not complete SSTs is that they do not believe their feedback makes a difference. Take time to explain to your students why their feedback is important and how you will use it. Identify particular dimensions of the course about which you are eager for their feedback. If possible, model use of student feedback by sharing examples of course features you have changed in response to prior student feedback.
- Consider an incentive. Offer extra credit or another incentive for completion. Two strategies that both keep student responses confidential and have been shown by research to be effective in improving completion rates:
- All students receive extra credit if the class hits an 80% or higher completion rate. Update the class on completion levels until the criterion is hit (you can view completion rates in your Qualtrics dashboard here).
- Provide individual incentives; students receive extra credit for sending you a screenshot showing that they have completed the survey.
Recommended script for introducing SST during class time: Today you’ll be completing the Student Survey of Teaching for this course. This survey gives you the opportunity to provide feedback about your learning experience in this class.
Your input is valuable because it helps me improve course content and teaching methods for future students. Courses evolve and improve directly because of student feedback like yours. For this feedback to be most useful, we need everyone to participate and provide comments about your learning experience that are specific, constructive, and actionable. Please complete the survey individually.
[Insert information about any incentive you are offering for completion.]
I want to assure you that your responses are completely confidential and anonymous and survey results will only be available to me after the final grade submission deadline. Please take a few minutes now to log into Canvas and complete the survey. To access the survey, click on “account” in the left column; from the menu choose Course Evaluations and then click on the first link for this course. The survey will begin with questions about the course, followed by questions about the instructor(s). I'll step out of the room to ensure your privacy while you complete this process.
Recommended script for Canvas announcement for asynchronous classes:
Greetings students. It's time for the Student Survey of Teaching for our class, and I'd really value hearing about your experience. This survey gives you the opportunity to provide feedback about your learning experience in this class.
This is your chance to share what worked well for you and what could be better next time around. Your feedback helps me improve course content and teaching methods to make this course even better for future students. For this feedback to be most useful, we need everyone to participate and provide comments about your learning experience that are specific, constructive, and actionable. Please complete the survey individually.
[Insert information about any incentive you are offering for completion.]
I want to assure you that your responses are completely confidential and anonymous and survey results will only be available to me after the final grade submission deadline.
Please take a few minutes to complete the survey by [insert deadline]. To access the survey, from your Canvas home page click “account” in the left column; from the menu choose Course Evaluations and then click on the first link for this course. The survey will begin with questions about the course, followed by questions about the instructor(s).
Thanks so much for taking the time to complete the survey. I genuinely appreciate your thoughtful input.
D. Preliminary Response Rate Data
In Spring 2025, the email above was distributed to faculty when the SST was distributed for 16-week courses. Although anecdotally a number of faculty reported following the above recommendations and observing a significant uptick in their individual course response rates, analysis of all Spring 2025 data in comparison to Fall 2024 data does not reveal a meaningful shift:
- 2024 Fall: Overall response rate was 26%, 38,574/147,647
- 2025 Spring: Overall response rate was 25%, 33,265/131,833
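As a rough check on these figures (an illustration added here, not an analysis performed by the task force): with samples of this size, even a difference of under one percentage point is statistically detectable, so the practical size of the change matters more than statistical significance. A minimal sketch using the reported counts:

```python
# Quick check of the reported overall response rates (illustrative only).
# With samples this large, even a sub-point difference is statistically
# detectable, so practical size matters more than significance.
from math import sqrt

fall_resp, fall_n = 38_574, 147_647   # Fall 2024
spr_resp, spr_n = 33_265, 131_833     # Spring 2025

p_fall = fall_resp / fall_n           # ~26.1%
p_spr = spr_resp / spr_n              # ~25.2%

# Two-proportion z-statistic with a pooled standard error.
p_pool = (fall_resp + spr_resp) / (fall_n + spr_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / fall_n + 1 / spr_n))
z = (p_fall - p_spr) / se

print(f"Fall 2024: {p_fall:.1%}  Spring 2025: {p_spr:.1%}")
print(f"Change: {p_spr - p_fall:+.1%} (z = {z:.1f})")
```

The roughly one-point decline is detectable at these sample sizes but small in practical terms, consistent with the reading above that no meaningful shift occurred.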
Nonetheless, there are multiple other recommendations summarized above that have not yet been implemented, so it will be important to continue to track changes as more recommendations are implemented. Additionally, it may take time and additional faculty training to get the word out about the instructor administration recommendations. Finally, a number of schools and departments included in the analysis are using their own separate surveys, so faculty in those units would be unlikely to attend to these recommendations for administering the university-wide tool. A school- or department-level analysis may be more illuminating.
Recommendations: Dashboards and Reports
A. Representing the Data for Instructors
The main purposes of the dashboard are to support instructors in (a) interpreting and using student feedback about their courses, and (b) generating concise, readily interpretable reports on their teaching that can be submitted as evidence of teaching effectiveness, along with other material, in annual reviews or promotion, tenure, or reappointment reviews. To this end, we recommend the following:
- Establish Teaching Categories/Dimensions. Use categorization of items (different dimensions of teaching) in the dashboard to keep the items organized and to quickly highlight areas of strength or areas needing attention. This will be particularly important if the number of items increases.
- Enable Trend Analyses. Develop functionality that allows instructors to track and visualize changes in SST performance over time, with flexible filtering options by course, course type, or semester. This would enable instructors to examine longitudinal patterns in specific teaching areas, such as tracking student perspectives on the class climate across multiple offerings, or comparing student engagement responses between upper-level and lower-level courses that they teach (see the illustrative sketch following this list).
- Improve Interface with Open-Ended Comments. Investigate tools to facilitate summarization of open-ended comments (e.g., AI-powered tools) to make qualitative feedback more interpretable and actionable. Make individual comments available at the bottom of the dashboard and exportable as an Excel file. Keep comments separate from PDF report generation to streamline the report.
- Enable Report Generation. Enable instructors to generate PDFs (using the functions described above) that concisely represent the student voice that they can submit as part of their materials for evaluation purposes.
- Improve User Experience and Navigation. Provide an orientation training for new faculty on dashboard functionality and data interpretation. Provide notifications/reminders to faculty when SST results become available, with clear instructions for accessing both current and historical data. Consider developing video tutorials and examples.
- Consider Implementation Support. Connect dashboard insights directly to teaching improvement resources and professional development opportunities. Develop quick-reference guides linking common feedback themes to specific improvement strategies
- Consider a New Strategy for Identifying Courses and Instructors. The stakeholder conversations revealed that we need a more comprehensive and effective strategy for identifying the instructors and courses about which students should be surveyed, beyond using information in the academic timetable. In some departments, there are instructors/GTAs who have a significant teaching role but do not have a course code/line number associated with their name. Conversely, there are learning experiences such as dissertation, research hours, and independent study that have course codes/line numbers but do not follow formal course structures, so the survey is not well suited to gathering student input on those experiences.
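To make the trend-analysis recommendation concrete, below is a minimal sketch of such a view. It is illustrative only; the export file, column names, and course identifier are hypothetical placeholders, and it assumes the semester column sorts chronologically (e.g., "2023-1", "2023-2", "2024-1").

```python
# Illustrative sketch of the proposed instructor trend view (not AIRE's
# implementation); file, column names, and course ID are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

scores = pd.read_csv("sst_item_scores.csv")
# Assumed columns: semester, course_id, dimension, mean_rating

course = scores[scores["course_id"] == "PSYC 104"]  # flexible course filter

# Mean rating per teaching dimension per semester, one line per dimension.
trend = (
    course.groupby(["semester", "dimension"])["mean_rating"]
    .mean()
    .unstack("dimension")
    .sort_index()
)

trend.plot(marker="o")
plt.ylabel("Mean rating")
plt.title("SST ratings by teaching dimension across semesters")
plt.tight_layout()
plt.show()
```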
B. Representing the Data for Unit Leaders/Supervisors
The main purposes of the Unit Leader dashboard are to enable unit leaders to (a) assess student perspectives on their learning experiences in the unit as a whole (or among different categories of classes, such as undergraduate or graduate classes), (b) support quick diagnostics that highlight areas of teaching or instructors needing attention or intervention, and (c) review and generate course-level reports on multi-section courses. To these ends we recommend the following:
- Represent Data in Dashboard. We recommend returning to a dashboard presentation of the data rather than a PDF report, to provide supervisors flexibility in how they look at their data and to streamline data processing so that results can be delivered to unit leaders earlier each semester.
- Establish Teaching Categories/Dimensions. Use categorization of items (different dimensions of teaching) in the dashboard to keep the items organized and to quickly highlight areas of strength or areas needing attention.
- Develop Performance Standards. Establish and communicate explicit criteria for evaluating SST data (including how to contextualize SST data with other forms of evidence). Create guidance documents linking specific feedback patterns to improvement strategies.
- Enable Trend Analyses. Develop functionality so that unit leaders can track and visualize changes in SST performance over time while filtering by course(s). This would enable unit leaders to examine how responses in specific areas of teaching have changed over time within a course (or course type, such as undergraduate courses). Below is a mock-up of a trend analysis for the Unit Leader Dashboard (based on the current version of the SST questions), showing ratings in individual areas over three semesters in the same course.
- Improve Interface with Open-Ended Comments. Investigate tools to facilitate summarization of open-ended comments (e.g., AI-powered tools) to make qualitative feedback more interpretable and actionable (an illustrative sketch follows this list). Make individual comments available at the bottom of the dashboard and exportable as an Excel file.
- Encourage Use of Instructor-Generated Reports for Individual Instructor Evaluation. Teaching evaluation should incorporate multiple forms of evidence, with SST data serving as one component. We recommend that departments and evaluation committees request that instructors submit their own SST dashboard reports as part of annual reviews, promotion and tenure, and reappointment processes, alongside other required evaluation materials. This process will reduce the burden on unit leaders in generating reports on individual faculty while facilitating a more holistic or integrated review process.
- Improve User Experience and Navigation. Provide hands-on training for unit leaders on dashboard functionality, data interpretation and data use. Provide notifications/reminders to unit leaders when SST results become available, with clear instructions for accessing both current and historical data. Consider developing video tutorials and examples.
- Consider Implementation Support. Provide chairs with tools that translate SST data into actionable development plans for the department and/or instructors. Develop quick-reference guides linking common feedback themes to specific improvement strategies.
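To illustrate what an AI-powered comment summary could look like, here is a minimal sketch using one LLM API (the OpenAI Python client; the model name and prompt are placeholders, and any comparable tool could be substituted). This is an illustration of the idea to be investigated, not a vetted design; production use would require careful review of privacy, anonymity, and accuracy.

```python
# Illustrative sketch only: summarizing open-ended SST comments with an LLM.
# The model name and prompt are placeholders; this is not a vetted design.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize_comments(comments: list[str]) -> str:
    """Return a thematic summary of anonymous student comments for one course."""
    joined = "\n".join(f"- {c}" for c in comments)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "Summarize these anonymous student course comments "
                        "into 3-5 themes with representative examples. "
                        "Do not identify any individual."},
            {"role": "user", "content": joined},
        ],
    )
    return response.choices[0].message.content
```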
C. Managing the Transition
The reporting recommendations summarized above can be applied retroactively to the first four years of data (as well as to data collected moving forward). But if the recommended changes to the SST items and scale are implemented (presumably in AY25-26), it will not be possible to include data from before and after those changes in the same representation/trend analysis. AIRE has indicated that they can create separate tabs in the report dashboards for pre-AY2025-26 and post-AY2025-26 data. Thus, the way the data are represented and the filtering/sorting capabilities will be the same, but there will need to be separate tabs in the dashboard if any of the items and/or metrics are changed. The Task Force felt that this was a reasonable approach to managing the transition to a refined tool.
D. Platform Considerations
The Task Force felt that any change in the platform for gathering and representing the SST data should be made once final decisions were made about the survey construction, administration and dashboards/reports. Our understanding is that the Qualtrics platform does constrain the way the data can be represented in the dashboards. The AIRE representative on the task force (Jason Koepp) was able to develop some approaches to address several of the report/dashboard recommendations, but it may be worth considering a platform with expanded representational/analysis capabilities (e.g., Blue, formerly Explorance Blue) once there is a final wish list to drive selection. If this step is of interest, a subset of the Task Force could potentially request some demos from other platforms in Summer 2025.