Quality Assurance for Blended and Online Courses

2018/19 — San José State University

— A Focused Quality Assurance Learning Community

Proposal Summary: At San José State University, we proposed to provide training along with a faculty-mentor support structure for the faculty participants in the program. We completed the goals of this project through a combination of Quality Matters workshops, campus-based webinars, and faculty team leaders. The campus needs to work toward offering guidelines that faculty can use as they develop their online courses. The quality assurance program provides extensive training, faculty expertise, and guidance on redesigning a course in line with the best practices identified in the Quality Matters (QM) rubric.


Campus QA Goals

Campus Goal for Quality Assurance

The EOQA goals align with the CSU’s Graduation Initiative 2025, specifically the measure of ensuring that effective use of technology is part of every CSU student’s learning environment. Additionally, our approach follows guidelines proposed in SJSU’s Four Pillars of Student Success that promote the development of:

  • richer and more readily accessible online supplemental study materials;
  • more elaborate and interactive homework and self-check instructional materials;
  • and more engaging in-class teaching strategies.

This proposal focused on developing a standard that faculty teaching online and hybrid courses can use to reflect on their current course design and make the revisions needed to reflect best practices. The campus has multiple needs in supporting quality assurance efforts:

  • 1: Develop materials and resources that faculty can access and use to guide course design.
  • 2: Provide professional development opportunities to increase faculty awareness regarding quality assurance.
  • 3: Build a group of faculty that can become experts in quality assurance and provide mentoring for new faculty.

Quality Assurance Lead(s)

  • Jennifer Redd, Project Facilitator
  • Debbie Weissmann, Faculty Quality Assurance Team 1
  • Ravisha Mathur, Faculty Quality Assurance Team 2

Supporting Campus Partners

  • Yingjie Liu, Lead Instructional Designer
  • Bethany Winslow, Instructional Designer
  • Emily Chan, Interim Associate Dean for Research and Scholarship
  • Ashour Benjamin, Course Reserves/Leganto Coordinator

Campus Commitment Toward Sustainability of QA Efforts

  • Developed multiple Canvas course templates based upon Quality Matters Principles
  • Encourage quality assurance principles in instructional design consultations

Summary of Previous QA Accomplishments

This quality assurance program was the sixth iteration on campus. The previous cohort participated during the 2017-18 academic year. A variety of quality assurance efforts continue to expand on campus.

  • Effort 1: A faculty cohort completed two Quality Matters trainings: Applying the Quality Matters Rubric and Improving Your Online Course. A previous cohort completed the Peer Reviewer Course. Additional information about last year's effort can be found in the Quality Assurance ePortfolio.
  • Effort 2: Increase awareness through outreach activities. This includes posting resources on the eCampus website and through participation in informational webinars. It also includes promoting workshops and encouraging attendance through flyers and presentations at campus events.
  • Effort 3: The rubric is provided as a resource for faculty in a password-protected Canvas course. Also, it serves as a guide when instructional designers consult with faculty members on course design.
  • Effort 4: Encourage faculty and staff who have completed Peer Reviewer Training to become Quality Matters Peer Reviewers.
  • Effort 5: A Canvas course template that adheres to Quality Matters Standards is available to all faculty.

Course Peer Review and Course Certifications

  1. Peer Review: Faculty members were assigned a partner. Using a rubric, they provided a peer review for each other's course.
  2. Faculty Lead Review: Faculty members were equally divided into two teams. The faculty lead for each team provided each faculty member with an informal course review.

Faculty Participants

Name                 Course Number    Course Name
Roxanne Cnudde       COMM 20          Public Speaking
San-hui Chuang       CHIN 101B        Chinese Culture
Jennifer Morrison    COMM 101C        Junior Seminar: Theorizing Communication
Sarah Prasad         ENGL 2           Critical Thinking and Writing
Mary Sunseri         COMM 100W        Writing Workshop

Accessibility/UDL Efforts

  1. One of the webinars during the yearlong program focused on accessibility. This included document preparation, presentations, and LMS features.
  2. One of the webinars during the year-long program focused on Universal Design for Learning. This included tips and examples.
  3. Workshops throughout the year that introduce faculty to new teaching methods and to the LMS include time for discussions about developing accessible instructional materials.

Feedback was gathered following each of the webinars that were part of the year-long cohort. Themes specifically related to the accessibility and universal design for learning sessions are noted below:

  • Webinar III. Academic Integrity and Universal Design for Learning
    • Chunking assignments (drafts)
    • Include more student-centered activities
  • Webinar VI. Accessibility
    • Canvas accessibility features
    • Google Slides Captioning
    • Accessible Syllabus Template 

Next Steps for QA Efforts 

  • Expand the number of courses that are Quality Matters certified
  • Provide guidance to faculty interested in becoming a Peer Reviewer
  • Offer an on-campus workshop and/or online opportunities for faculty to attend Quality Matters Trainings
  • Provide a faculty-team lead Quality Assurance Training program that incorporates Quality Matters workshops, webinars, and peer-guidance/feedback

Quality Assurance Results

Training Completions

The following table summarizes all of the SJSU faculty and staff Quality Assurance training completions that occurred during the 2018/19 academic year.

Training                                                                  Number of Completions
Applying the Quality Matters Rubric                                       18
Applying the Quality Matters Rubric Face-to-Face and Online Facilitator   1
Connecting Learning Objectives and Assessments                            1
Improving Your Online Course                                              11
Introduction to Teaching Online Using the QLT Instrument                  2
Master Reviewer Recertification                                           4
Reviewing Courses Using the QLT Instrument                                1


Student Quality Assurance Impact Research 

Student Survey Results

The CSU QA Student Online Course Survey was distributed via Qualtrics to the classes taught by the 2018-2019 EOQA participants. The survey was completed by 56 students in four classes, three of which were upper division courses. Seventy-three percent of respondents were female, 25% were male, and 2% were “Other.” Nearly half of respondents (48%) were freshmen, and for 48% of respondents this was their first online course. Twenty-five percent of respondents had taken one online course before, 9% had taken 2-3 online courses, and 18% had taken 4 or more online courses. Forty-three percent of respondents were Asian, 21% were Hispanic or Latino, 14% were Caucasian, 10% were “Two or More Races,” and there were low percentages of African American (7%), Alaska Native/Native American/Pacific Islander (2%), and “Other” (2%) respondents.

In addition to the questions pertaining to course details and student demographics, there were 25 questions asking students to rate their agreement with a statement on a six-point scale from Strongly Disagree (1) to Strongly Agree (6). There were 4 questions pertaining to Course Overview and Introduction, 5 questions pertaining to Assessment and Evaluation of Student Learning, 4 questions addressing Instructional Materials and Resources Utilized, 3 questions addressing Student Interaction and Community, 2 questions pertaining to Facilitation and Instruction, 2 questions pertaining to Technology for Teaching and Learning, 2 questions addressing Learner Support and Resources, and 3 questions addressing Accessibility and Universal Design.

The average response for each question was greater than 5.0, falling between the ratings of Agree (5) and Strongly Agree (6) on the scale. The average responses for the Course Introduction and Overview questions ranged from 5.36 to 5.48 (Figure 1). The average responses for the Assessment and Evaluation of Student Learning questions ranged from 5.02 to 5.43 (Figure 2). The lowest average rating was for the question addressing the student’s understanding of how the learning activities (including the assignments and ungraded activities) helped him or her achieve the learning objectives each week. The highest average rating was for the question pertaining to whether the student was given opportunities to receive feedback from the instructor and self-check progress in the course.

Figure 1. Mean responses to questions about course overview and introduction. In all graphs, n = 56 and error bars depict the standard deviation. 

Figure 2. Mean responses to questions about assessment and evaluation of student learning. 

For the questions addressing Instructional Materials and Resources Utilized, the average responses ranged from 5.14 to 5.30, with the lower average responses for questions addressing the variety of course material types and perspectives and the instructor explaining how the materials supported the course objectives/competencies (Figure 3). 

Figure 3. Mean responses to questions about instructional materials and resources utilized. 

For the questions addressing student opinions on Student Interaction and Community, the averages ranged from 5.07 to 5.45 with the lowest average for the question asking whether learning activities helped the student build fundamental concepts and skills useful in the real world (Figure 4). 

Figure 4. Mean responses to questions about student interaction and community. 

There were two questions on Facilitation and Instruction (Figure 5). The average response was 5.04 for the question addressing the instructor’s clarity on how long feedback on assignments would take and whether feedback was provided in a timely fashion. The average response was 5.48 for the question asking whether the instructor sent reminders of due dates and other information to help keep the student on task. 

Figure 5. Mean responses to questions about facilitation and instruction. 

The average ratings for the two questions pertaining to Technology for Teaching and Learning were 5.11 for the variety of technology tools used to engage the class and encourage interaction, and 5.30 for providing clear information on how to access/acquire the required technologies (Figure 6). Regarding Student Support and Resources (Figure 7), students more strongly agreed that the syllabus and/or website linked to technical support (average 5.25) than that the syllabus and/or website linked to academic support services and resources (average 5.12).

Figure 6. Mean responses to questions about technology for teaching and learning.

Figure 7. Mean responses to questions about student support and resources. 

Finally, average ratings for the Accessibility and Universal Design items ranged from 5.04 to 5.37, with the lowest average for the question asking whether the course syllabus or website provided or linked to the campus policy on accommodating students with disabilities (Figure 8). Respondents more strongly agreed that course materials were in an accessible format and that it was easy to navigate the online components of the course.

Figure 8. Mean responses to questions about accessibility and universal design. 

Faculty Interview Summary

Four of the five faculty participants in the 2018-2019 EOQA program were interviewed upon completion of the program. They were asked several questions about the changes they had made to their courses based on various components of the EOQA program. Overall, the program was viewed favorably, with participants enthusiastically reporting that they had made many positive changes to their courses as a result of the training received, and that they are planning to make additional changes. Modifications made thus far include effective design and placement of information on Canvas, adding module overviews, more clearly conveying course objectives to students, changing the home page frequently, providing effective grading rubrics, and applying the QM template. Participants reported that the program increased their desire to learn new things, learn best practices, be more aware of effective learning and teaching in online courses, and understand the student perspective in online courses.

The UDL/DI webinar was reported to be informative in that it emphasized how students learn and different ways for students to demonstrate learning and complete assignments, but the information was not new for some participants. Improving accessibility is a goal for most participants, and many already knew the information presented in the Accessibility webinar or were already using accessible materials. Most participants felt that the Affordable Learning Solutions and Copyright webinars were useful, especially the information on Permalinks, library resources, and providing affordable materials to students. Some participants are already using lecture capture or were already knowledgeable, but others are planning to implement it or attempt to use it more effectively in their courses.

Overall, participants were enthusiastic about what they had learned, the ideas they plan to implement, and continuous professional development and improvement in curriculum delivery to improve the student learning experience. Participants would recommend the program to colleagues, although they suggest enrolling when one has ample time to complete the extra work.

Key Findings 

  • Overall favorable feedback about the program.
  • Significant, positive changes made to courses as a result of EOQA training.
  • Increased desire to learn more about best practices and various aspects of online teaching.
  • Information in the UDL/DI and Accessibility webinars was a review for most participants.
  • Several suggestions for improving the effectiveness of the peer review process.
  • Participants would recommend and have recommended the EOQA program to other faculty but emphasize the additional workload incurred.

Development of Campus QA Resources

Canvas Course Templates are available to SJSU faculty