Institutional Effectiveness and Course Evaluations with Cloud-Based Systems
Student feedback through course evaluations is an important component of improving institutional effectiveness. But evaluation processes that require on-campus resources can be upended by crises like the current pandemic. One solution is a cloud-based course evaluation system, which can also bring time savings and improved response rates. In this article, I share our experience of moving from a locally based to a cloud-based course evaluation system. As a result, students, faculty, and academic leaders are more satisfied with the tool, response rates have increased, and institutional research staff spend less time administering the surveys. I close with suggestions for higher education leaders considering the switch.
How do you engage with students when face-to-face instruction no longer occurs? What happens to the ability of the college or university to assess student learning and quality of instruction? How do support staff perform their tasks remotely? In the spring of 2020, colleges and universities worldwide experienced a dramatic paradigm shift. The global pandemic required immediate cessation of in-person student interactions and challenged the maintenance of student-centered learning environments. Technology shapes how effectively institutional researchers can evaluate quality of instruction through instructor/course evaluations.
Course Evaluations During Hurricane Recovery – Lessons Learned
Located in southeastern North Carolina, Fayetteville Technical Community College (FTCC) is no stranger to campus disruptions. Many institutions in the southeast keep hurricane preparedness plans ready to go, and there is usually a lot of warning before a hurricane hits. In 2010, FTCC began requiring every division and department to create Continuity of Operations Plans (COOP). Hurricanes and other events had impacted access to campus resources in the past, and FTCC, like many institutions in the southeast, was becoming all too familiar with disaster preparation and recovery.
Anticipating the loss of valuable instructional time and ability to continue operations, the College mandated disaster recovery planning – in advance of the next issue. We considered questions like:
- What is the best way for institutional effectiveness to carry out daily functions if we cannot access any campus location?
- How can we securely access the student information system to extract data for reporting purposes?
- What method can we deploy for course evaluations if students are not on campus?
In an effort to be more proactive, my department actively sought answers to these questions using technology. The technological solution did not emerge within a few months. We closely examined our assessment priorities and met with curriculum and information technology leadership many times. We considered whether our course evaluations were measuring what we intended. Adopting technical solutions prompts such larger discussions and analyses. We wanted the technology to meet our evaluation needs, not vice versa. When technology dictates business solutions, the result is often a level of analysis that does not support a student-centered learning environment.
Student Success and Course Evaluations
Course evaluations represent a singular, but important, aspect of assessing learning. Quality of instruction is one data point that provides a snapshot of how closely the learning environment meets the needs of each student. When evaluation does not occur, academic leaders have a limited ability to identify successes and challenges in the learning environment. Further, periodic evaluation does not lend itself to substantial analyses. In the past, it was not uncommon to evaluate a single term. With course evaluations, a best practice is to evaluate as many courses as possible over an entire academic year. The more snapshots you can produce, the higher the quality of the overall picture.
Student Learning Environment Disruptions and Reactions
In a crisis, stakeholders’ emotions can range from an extreme sense of urgency to fear. The uncertainty of desired outcomes, student distress, disruption of learning environments, and lack of support and resources can cause a sense of isolation. We tend to feel the impact of local disruptions more acutely than state or national crises, and we tend to create our action plans for such localized contingencies. However, the recent pandemic illustrates that learning environments and support systems in higher education must consider longer periods of disruption on a national or global scale. Historically, community colleges have been adept at creating faster, more creative solutions to learning environment disruptions because responding quickly to workforce training demands is in their nature.
Challenges with Course Evaluations for Students Online
FTCC enrolled over 19,000 students in more than 1,800 course sections over two terms. This made the challenge of how to remotely evaluate quality of instruction a clear priority. Even a decade ago, the evaluation system did not lend itself to deployment under extreme circumstances like those we face today. With limited virtual private network (VPN) capability, connecting to our campus PCs was not an option at that time.
Every course had to be manually entered into the software package, which was not web-based. The time-consuming data entry of course sections, enrolled students, instructors, and varying start and end dates inhibited our ability to carry out these operations remotely and in a timely manner. The most logical answer involved a web-based or cloud-based solution that could be accessed regardless of location. However, before we could embark upon a solution, we needed to re-evaluate our internal workflows. We also had to consider the demands of the end user to support a student-centered learning environment.
Evaluating the Course Evaluation System
To determine a new solution, we took an honest look at the existing course evaluation system. Here’s what we found:
- Staff spent dozens of hours entering course names, sections, and instructor information. This wasn’t a good use of talent. With multiple start and end dates, these data entry tasks were constant.
- Students navigated a tedious system of drop-down menus and often did not complete the surveys.
- Tabulating and creating visuals of results for academic leaders took weeks.
- Low response rates and frustration among students were not uncommon.
FTCC needed to find a web-based platform that offered simple front-end operations (such as securely loading course and enrollment files). The software had to be easy to navigate to facilitate survey completion, and present instant, visually informative results for the end user. Also, we wanted to engage students to communicate the importance of completing these evaluations.
Gathering key stakeholder input was critical. As I interacted with academic leadership, I spoke with students as well about their perceptions of course evaluations. Some felt they were not taken seriously. But most wanted an easier way to complete them from their mobile device(s). We discussed how course evaluations were used at the College and that their voice informed academic leaders about the quality of instruction. Students said:
- Explain in the survey invitation that College leadership uses student feedback to make course improvements.
- Provide clear deadlines for survey completion and send reminders as needed. Engage faculty and technology to help remind students, as well.
The College’s culture of assessment supported a web-based solution that was not tied to a specific campus location. Disaggregation by modality, location, and division offered data that evaluated the quality of instruction across a broad spectrum of categories. As courses were added and deleted, the software would need to allow for secure batch uploads throughout the semester and filter out students who withdrew. Students needed reminders and the ability to access surveys via mobile devices. Further, under the amended Americans with Disabilities Act, the evaluations had to be accessible to students with disabilities.
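As an aside, the kind of disaggregation described above can be sketched in a few lines of Python. The record fields and ratings below are illustrative, not FTCC's actual data or the vendor's format:

```python
from collections import defaultdict

# Hypothetical evaluation records: (modality, rating on a 1-5 scale).
records = [
    ("online", 4), ("online", 5), ("seated", 3),
    ("seated", 4), ("hybrid", 5), ("online", 3),
]

def mean_rating_by(records):
    """Average the ratings within each disaggregation category
    (here modality; location or division would work the same way)."""
    groups = defaultdict(list)
    for category, rating in records:
        groups[category].append(rating)
    return {cat: sum(r) / len(r) for cat, r in groups.items()}

print(mean_rating_by(records))  # → {'online': 4.0, 'seated': 3.5, 'hybrid': 5.0}
```

The same grouping logic extends to any category the institution tracks; in practice a reporting platform or a tool like a spreadsheet pivot table performs this step.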
Survey Response Rates
Once we evaluated the business rules, the next issue became response rates. According to Chapman and Jones (2017), “Multiple studies reported that while response rates for online Student Evaluations of Teaching (SETs) initially average near 60%, they soon drop off to the 30 to 40 percentile range.” Under the old course evaluation system, the College’s overall response rate was 15%, and it needed to increase in order to provide more robust data regarding quality of instruction. As distance education grew in convenience, students became adept at using their mobile devices in a learning environment. Electronic surveys delivered via e-mail became more manageable.
It is important to note that the College does not require students to complete course evaluations. But FTCC strongly encourages course evaluations to enable leaders to assess instructional quality. Professional development activities sparked discussion among faculty and other support staff regarding how to encourage students to complete these evaluations. Further, the college engaged faculty and departmental leaders throughout the software selection process to ensure that the platform chosen would benefit both faculty and students. As a result of this careful planning, end-of-course evaluation response rates through the new cloud-based system quadrupled to 60 percent within the first year.
Forced Student Evaluation Concept
The online survey platform has the capability to hold student grades for ten days, or until the student completes the survey. However, this feature has not been activated. FTCC’s culture of assessment does not lend itself to forced survey completion. So, we rely on other strategies to encourage completion, such as monitoring responses in real time during the evaluation period and sending reminders to students who have not completed the evaluations.
When incorporating any technical solution to a business problem, it’s important to be mindful of automatic settings. The culture of assessment at FTCC involves educating stakeholders about the importance of end-of-course surveys, and involving stakeholders in finding creative ways to increase response rates.
The cloud-based software solution used by FTCC allows for secure upload of course and enrollment data files throughout the semester. My department uploads seven data files, much like attaching documents to an e-mail. Data quality checks automatically occur to identify missing data fields and allow for timely correction. Two weeks before the course ends, the student receives an automatic e-mail with a unique link to the course(s) in which they are enrolled. And the software integrates with our learning management system so that survey links appear within their respective course(s).
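To illustrate the kind of pre-upload data quality check described above, here is a minimal Python sketch. The file layout, required field names, and sample data are assumptions for illustration, not the vendor's actual format:

```python
import csv
import io

# Hypothetical required fields for an enrollment upload file.
REQUIRED_FIELDS = ["course_id", "section", "instructor", "student_email", "end_date"]

def check_enrollment_file(csv_text):
    """Return (row_number, missing_fields) pairs for rows that would
    fail a missing-data quality check before upload."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for row_num, row in enumerate(reader, start=2):  # row 1 is the header
        missing = [f for f in REQUIRED_FIELDS if not (row.get(f) or "").strip()]
        if missing:
            problems.append((row_num, missing))
    return problems

# Example: the second data row is missing its instructor.
sample = (
    "course_id,section,instructor,student_email,end_date\n"
    "ENG-111,01,J. Smith,student1@example.edu,2020-05-08\n"
    "ENG-111,02,,student2@example.edu,2020-05-08\n"
)
print(check_enrollment_file(sample))  # → [(3, ['instructor'])]
```

Catching gaps like this before upload is what allows the timely correction the platform's automatic checks provide.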
Students automatically receive reminders until they either complete the survey or the course ends. Within 24 hours of the course ending, the results are automatically tabulated and the academic leader of the appropriate division receives a link to the results. With their unique login, they are presented with a dashboard view of response rates and the ability to drill into individual instructor evaluations. Further, the software stores data going back several semesters, which allows for longitudinal analyses.
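The reminder logic described above can be sketched as a simple date check. This is not the vendor's implementation; the two-week window matches the article, but the reminder interval and dates are illustrative assumptions:

```python
from datetime import date, timedelta

def should_remind(today, course_end, completed,
                  reminder_interval_days=3, window_days=14):
    """Decide whether to send a survey reminder. The evaluation window
    opens two weeks before the course ends; reminders stop once the
    student completes the survey or the course ends."""
    window_open = course_end - timedelta(days=window_days)
    if completed or today < window_open or today >= course_end:
        return False
    # Remind only every few days within the window, not daily.
    days_into_window = (today - window_open).days
    return days_into_window % reminder_interval_days == 0

end = date(2020, 5, 8)
print(should_remind(date(2020, 4, 24), end, completed=False))  # window opens → True
print(should_remind(date(2020, 4, 25), end, completed=False))  # off-interval day → False
print(should_remind(date(2020, 5, 9), end, completed=False))   # after course end → False
```

A production platform would run a check like this on a daily schedule against its enrollment records, but the stopping conditions are the point: completion or course end silences the reminders.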
Assessment of the Course Evaluation System
As the software became operational, I worked closely with the vendor, information technology, and academic leaders to ensure smooth implementation. Within one academic year, the overall response rates to course evaluations quadrupled. At the division level, response rates rose as high as 85%. Students responded positively to the technology, and academic leaders were pleased with the dashboard view of their data. My department was elated with the time saved and the seamless tabulations the software quickly produced. We still watch the evaluation results to ensure there are no glitches. However, the results have been impressive.
Considering a Change to Cloud-Based Course Evaluations?
As one semester closes and our College prepares for another, like you, I am working remotely due to the pandemic. Here are some things to think about as you move your institution forward for course evaluations:
- Can you access the student information system, extract the necessary files via secure VPN access, and upload them into the software remotely? Cloud-based technology does not require an in-office presence for this process. Batch uploads take only a few minutes and eliminate the need for time-consuming manual data entry. Evaluations deploy automatically into student e-mail accounts and our LMS.
- What are your current practices for course evaluation and their frequency? Some colleges choose to evaluate a random sample of courses during a specific term. Others may choose to evaluate all courses during the Fall term only.
- Could you gain a significant return-on-investment (ROI) by moving to a cloud-based assessment platform? Movement to a cloud-based evaluation system can be quite costly. Smaller colleges that offer a few hundred course sections may not benefit as much as larger institutions with thousands of course sections.
- How will a cloud-based assessment platform integrate with the current student information system (SIS) and LMS? Involve your Information Technology division in these discussions to ensure that you’re following all institutional security protocols, if you decide to move to a cloud-based platform.
- Do you need to simply “tweak” the current system? Engage academic leaders in discussions regarding whether the current system delivers information to support academic quality.
Forward Thinking for Course Evaluations at Your Institution
The COVID-19 pandemic has forced almost every institution to rethink the way standard processes function. But this gives us an opportunity to prepare ourselves for future, more localized, disruptions in learning environments. Supporting student learning goes beyond the movement to an online platform for delivering instruction and support resources. Course evaluations, as an integral part of the culture of assessment, must continue as well. Web-based technologies offer the most efficient delivery method, particularly when remote working becomes necessary. Cloud-based solutions also remove the burden of housing data on local servers. Further, while costly, these alternatives allow for the continuation of business practices in a crisis to keep the momentum of student learning and support. It’s important for institutional researchers to become familiar with technologies like these cloud-based evaluation options to keep our work operational during times of crisis.