Quality Assurance for blended and online courses

2020/21 — San José State University

— A Focused, Peer Review Approach to Quality Online Course Design

Proposal Summary: At San José State University, we proposed to provide training, along with a faculty-mentor support structure, for the faculty participants in the program. We completed the goals of this project through a combination of Quality Matters workshops, campus-based webinars, and faculty team leaders. The campus continues to expand its support with guidelines that faculty can use as they develop their online courses. The quality assurance program provides extensive training, faculty expertise, and guidance on redesigning a course to reflect best practices as identified in the Quality Matters (QM) rubric. Because this academic year was conducted primarily online, the program assisted faculty with the design of those courses.

Campus QA Goals

Campus Goal for Quality Assurance

The EOQA goals are in line with the CSU’s Graduation Initiative 2025, specifically addressing the measure of ensuring that effective use of technology is part of every CSU student’s learning environment. Additionally, our approach follows guidelines proposed in SJSU’s Four Pillars of Student Success that promote the development of:

  • richer and more readily accessible online supplemental study materials;
  • more elaborate and interactive homework and self-check instructional materials;
  • and more engaging in-class teaching strategies.

This proposal focused on developing a standard that faculty teaching online and hybrid courses can use to reflect upon their current course design and make the revisions necessary to align with best practices. The campus has multiple needs in supporting quality assurance efforts:

  • 1: Develop materials and resources that faculty can access and use to guide course design.
  • 2: Provide professional development opportunities to increase faculty awareness regarding quality assurance.
  • 3: Build a group of faculty that can become experts in quality assurance and provide mentoring for new faculty.

Quality Assurance Program Team and Participants

Quality Assurance Lead(s)

  • Jennifer Redd, Project Facilitator
  • Debbie Weissmann, Faculty Quality Assurance Team Leader
  • Ravisha Mathur, Faculty Quality Assurance Team Leader

Supporting Campus Partners

  • Yingjie Liu, Lead Instructional Designer
  • Bethany Winslow, Instructional Designer
  • Valin Jordan, Equity and Accessibility Educator
  • Emily Chan, Interim Associate Dean for Research and Scholarship
  • Ashour Benjamin, Course Reserves/Leganto Coordinator

Campus Commitment Toward Sustainability of QA Efforts

  • Developed multiple Canvas course templates based upon Quality Matters Principles
  • Encourage quality assurance principles in instructional design consultations

Summary of Previous QA Accomplishments

This quality assurance program was the eighth iteration on campus. The previous cohort participated during the 2019-20 academic year. A variety of quality assurance efforts continue to expand on campus.

  • Effort 1: A faculty cohort completed one or two Quality Matters trainings: Applying the Quality Matters Rubric and Improving Your Online Course. A previous cohort completed the Peer Reviewer Course. Additional information about last year's effort can be found in the Quality Assurance ePortfolio.
  • Effort 2: Increase awareness through outreach activities. This includes posting resources on the eCampus website and through participation in informational webinars. It also includes promoting workshops and encouraging attendance through flyers and presentations at campus events.
  • Effort 3: The rubric is provided as a resource for faculty in a password-protected Canvas course. Also, it serves as a guide when instructional designers consult with faculty members on course design.
  • Effort 4: Encourage faculty and staff who have completed Peer Reviewer Training to become Quality Matters Peer Reviewers.
  • Effort 5: A Canvas course template that adheres to Quality Matters Standards is available to all faculty.

Course Peer Review and Course Certifications

  1. Peer Review: Faculty members were assigned a partner. Using a rubric, they provided a peer review for each other's course.
  2. Faculty Lead Review: A faculty leader provided each faculty member with an informal course review.

Faculty Participants

Name | Course Number | Course Name
William B Andreopoulos | CS 147 | Computer Architecture
Avizia Long | SPAN 201 | Modern Spanish
Christine Ma-Kellams | PSYC 1 | Introduction to Psychology
Linda Mitchell | ENGL 103 | Modern English
Charles Park | PH 99 | Introduction to Public Health
Matthew Spangler | COMM 123I | Performance of Ethnodrama

Resources and Program Efforts

Accessibility/UDL Efforts

  1. One of the webinars during the yearlong program focused on accessibility. This included document preparation, presentations, and LMS features.
  2. One of the webinars during the year-long program focused on Universal Design for Learning. This included tips and examples.
  3. Workshops offered throughout the year that introduce faculty to new teaching methods, as well as to the LMS, include time for discussions about developing accessible instructional materials.

Feedback was gathered following each of the webinars that were part of the year-long cohort. Participants answered two questions in the Discussions section of the course.

  1. Did the ________ webinar help you to identify something you would like to include, change, or develop in your course?
  2. Share 1 question or comment regarding the ________ webinar.

Development of Campus QA Resources

Canvas course templates are available to SJSU faculty.

Next Steps for QA Efforts 

  • Expand the number of courses that are Quality Matters certified
  • Provide guidance to faculty interested in becoming a Peer Reviewer
  • Offer an on-campus workshop and/or online opportunities for faculty to attend Quality Matters Trainings
  • Provide a faculty-led Quality Assurance Training program that incorporates Quality Matters workshops, webinars, and peer guidance/feedback

Quality Assurance Training Completions

Training Completions

The following table summarizes all of the SJSU faculty and staff Quality Assurance training completions that occurred during the 2020/21 academic year.

Training | Number of Completions
Advanced QLT Course in Teaching Online | 8
Applying the Quality Matters Rubric (APPQMR) | 2
Designing Your Online Course (DYOC) | 4
Improving Your Online Course (IYOC) | 19
Introduction to Teaching Online Using the QLT Instrument (QLT1) | 12
Reviewing Courses Using QLT | 1


Student Quality Assurance Impact Research: Student Survey Results

The CSU QA Student Online Course Survey was distributed via Qualtrics to the classes taught by the six 2020-2021 EOQA participants. The survey was completed by 31 students in 2 classes, one lower division and one upper division. Fifty-five percent of respondents were female, 39% were male, and 6% identified as other. The majority of respondents were seniors (58%) or juniors (29%). Ninety percent of respondents reported having previously taken 4 or more online courses, and 10% reported that this was their first online course. These numbers are consistent with the mostly online delivery of courses at the university since the middle of the Spring 2020 semester. Thirty-two percent of respondents were Asian, 32% were Hispanic or Latino, 26% were Caucasian, 7% were African American, and 3% were “Two or More Races.” No respondents self-reported as Alaska Native/Native American/Pacific Islander.

In addition to the questions pertaining to course details and student demographics, there were 25 questions asking students to rate their agreement with a statement on a six-point scale from Strongly Disagree (1) to Strongly Agree (6): 4 questions on Course Overview and Introduction, 5 on Assessment and Evaluation of Student Learning, 4 on Instructional Materials and Resources Utilized, 3 on Student Interaction and Community, 2 on Facilitation and Instruction, 2 on Technology for Teaching and Learning, 2 on Learner Support and Resources, and 3 on Inclusivity and Accessibility.

Descriptive statistics are presented in the Table. The average response for each question was greater than 5.0, falling between the ratings of Agree (5) and Strongly Agree (6) on the scale.  Almost all of the averages were above 5.5. The average responses for the Course Introduction and Overview questions ranged from 5.87 to 6.00 (Figure 1). The average responses for the Assessment and Evaluation of Student Learning questions ranged from 5.87 to 5.97 (Figure 2).  

Figure 1.  Mean responses to questions about course overview and introduction. In all graphs, error bars depict the standard deviation.

 

Figure 2.  Mean responses to questions about assessment and evaluation of student learning.

For the questions addressing Instructional Materials and Resources Utilized, the average responses ranged from 5.55 to 5.90 (Figure 3).

Figure 3.  Mean responses to questions about instructional materials and resources utilized.

For the questions addressing student opinions on Student Interaction and Community, the averages ranged from 5.48 to 5.52 (Figure 4).

Figure 4.  Mean responses to questions about student interaction and community.

There were two questions on Facilitation and Instruction (Figure 5). The average response was 5.87 on each question.

Figure 5.  Mean responses to questions about facilitation and instruction.

The average ratings for the two questions pertaining to Technology for Teaching and Learning were 5.39 for the use of a variety of technology tools to engage the class and encourage them to interact, and 5.54 for providing clear information on how to access/acquire the required technologies (Figure 6).

Figure 6.  Mean responses to questions about technology for teaching and learning.

For the questions on Learner Support and Resources (Figure 7), the average was 5.53 for the question pertaining to the syllabus referring to academic support services and resources, and 5.71 for the question addressing whether the syllabus refers to technical support provided by campus.

Figure 7.  Mean responses to questions about learner support and resources.

Finally, average ratings for the Inclusivity and Accessibility items ranged from 5.69 to 5.90 (Figure 8).

Figure 8.  Mean responses to questions about inclusivity and accessibility.


Table.  Descriptive statistics for the 25 items in the Student Impact Survey completed in AY 2020-2021, n = 31
Question | Minimum | Maximum | Mean | Std Deviation

Course Overview and Introduction
The instructor provided clear and detailed instructions for how to begin accessing all course components, such as syllabus, course calendar, and assignments. | 5 | 6 | 5.97 | 0.180
Detailed information about the instructor was available and included multiple ways to contact him/her, times s/he was available, a brief biography, and a picture or welcome video. | 5 | 6 | 5.87 | 0.341
The course description included the purpose and format (e.g., fully online, blended; schedule/calendar specifies dates/times) of the course, as well as any applicable prerequisite knowledge (e.g., prerequisite course). | 5 | 6 | 5.97 | 0.180
The instructor clearly defined academic integrity and/or provided a “code of ethics” and provided institutional policies and/or links to those policies (e.g., academic dishonesty, cheating, and plagiarism). | 6 | 6 | 6.00 | 0.000

Assessment and Evaluation of Student Learning
The instructor provided specific, well-defined, and measurable learning objectives. I understood what I was supposed to accomplish both weekly and by the end of the course. For example, each week there were specific learning goals and I knew exactly what I was supposed to learn/accomplish (e.g., there was a bulleted list of activities to complete each week). | 5 | 6 | 5.97 | 0.180
I understood how the learning activities (including the assignments and ungraded activities) helped me achieve the learning objectives each week. For example, I understood how a discussion forum could help me prepare to develop a “reaction paper” on a topic. | 5 | 6 | 5.87 | 0.341
The instructor made it clear how individual papers, exams, projects, and/or group contributions would be evaluated. For example, I was given grading sheets or detailed descriptions of how points were distributed for major assignments. | 4 | 6 | 5.87 | 0.428
The instructor provided a course grading policy that clearly defined how much each assignment or category of assignments contributed to my overall course grade. | 5 | 6 | 5.90 | 0.301
I was given opportunities to receive feedback from my instructor and to self-check my progress in the course. For example, my instructor posted grades regularly, provided comments on my work, had us self-grade assignments, allowed us to submit drafts of projects for comments, and offered discussion forums for feedback and practice tests. | 5 | 6 | 5.94 | 0.250

Instructional Materials and Resources Utilized
The instructor gave me adequate notice and time to acquire course materials. For example, I received information on how to obtain the course textbook/materials prior to the start of the course via email, or the instructions for how to acquire the materials were in the syllabus or elsewhere in the course. | 5 | 6 | 5.90 | 0.301
The instructor offered a variety of course material types (such as audio, video, and readings) and perspectives. S/he did not over-rely on a single way to deliver content such as via text or from a single source/textbook or author. | 3 | 6 | 5.55 | 0.768
The materials supported the content of what I was learning in the course. For example, the textbook, articles, audio recordings, and videos were all tied to the course topics and objectives. | 4 | 6 | 5.90 | 0.396
The instructor provided a good explanation to show how the instructional materials (e.g., textbook, videos organized by topics) support the course objectives or competencies. | 4 | 6 | 5.74 | 0.514

Student Interaction and Community
The instructor provided an opportunity at the beginning of the course for students to introduce themselves. This created a sense of community among course participants. | 4 | 6 | 5.52 | 0.724
The learning activities (e.g., discussions and activities) encouraged me to log on and interact with my fellow classmates often. | 4 | 6 | 5.48 | 0.724
The course learning activities helped me understand fundamental concepts and build skills that will be useful in the real world. For example, the activities made connections with real-world problem solving, and involved real-world scenarios. | 4 | 6 | 5.52 | 0.677

Facilitation and Instruction
The instructor was clear on how long it would take to receive feedback on assignments. I received feedback about my coursework and progress in a timely fashion. | 5 | 6 | 5.87 | 0.341
The instructor sent reminders of due dates (email, weekly announcements) and other information and instructions to help keep me on task. | 3 | 6 | 5.87 | 0.562

Technology for Teaching and Learning
The instructor used a variety of online technology tools to engage me and encourage me to interact with others in the course, and I felt the tools used supported the course objectives. Examples include, but are not limited to, web meetings, online discussions (e.g., VoiceThread), online collaboration tools (e.g., Google Docs), and social media tools (e.g., Twitter). | 2 | 6 | 5.39 | 0.919
The instructor provided clear information about how to access or acquire the technologies required to successfully complete the course. Examples include, but are not limited to, web authoring software (web pages, blogs, wikis), proctoring software, printers, scanners, browser plug-ins, or media players. | 4 | 6 | 5.54 | 0.637

Learner Support and Resources
The course syllabus listed and/or the course website linked to a clear explanation of the TECHNICAL support provided by my campus and provided information about when and how I can access it. For example, the syllabus had links to the technical support website, Help Desk contacts, and online tutorials. | 4 | 6 | 5.53 | 0.571
The course syllabus listed and/or the course website linked to ACADEMIC support services and resources, such as Supplemental Instruction, Writing Center, Math Center, Tutoring Center, testing services, and library resources. | 5 | 6 | 5.71 | 0.461

Inclusivity and Accessibility
The course syllabus or course website provided or linked to the campus policy regarding accommodating students with disabilities. | 5 | 6 | 5.69 | 0.471
The course materials (whether created by the instructor or from external sources) were in accessible formats (e.g., videos were captioned and/or had text transcripts). | 5 | 6 | 5.82 | 0.390
It was easy to navigate the online components of the course. For example, the module or weekly organization was easy to follow and course headings and links were clear and easy to understand. It was easy for me to locate respective course resources/components. | 5 | 6 | 5.90 | 0.305


Student Quality Assurance Impact Research: Faculty Interview Summary

All six of the faculty participants in the 2020-2021 EOQA program were interviewed upon completion of the program. They were asked about the changes they had made to their courses, or planned to make, based on various components of the EOQA program. The results of the interviews differed from those of previous years, when most participants were new to online teaching. Overall, participants were enthusiastic about what they had learned in the program and about the positive changes they had made to their courses as a result of the EOQA training as well as the Quality Matters (QM) courses they had completed in Fall 2020. They were complimentary of the eCampus trainings in general. Participants plan to make additional changes to their courses next year, continue to improve the online learning experience for students, and continue their own professional development. Participants appreciated that they developed a better understanding of the student perspective and of effective structuring and delivery of online content. They reported success in defining learning outcomes and/or expected student achievement, and in aligning learning activities and course structure with those outcomes.

In terms of specific webinar topics, the UDL/DI webinar was useful, and some participants incorporated UDL principles into their courses. Generally, participants were already cognizant of accessibility considerations but still learned from this webinar. Student concerns and complaints could be mitigated with good implementation of UDL and accessibility practices.

The Lecture Capture/Storytelling webinar provided many good ideas that participants were able to implement, with more changes planned as well. This seems to be a change from previous years, in which participants were inspired but not yet using lecture capture. The webinar also offered good technology and equipment recommendations.

The Copyright webinar was helpful and prompted good discussion, but some participants felt it was too detailed.

The facilitators were reported to be responsive, accessible, and good role models who provided excellent guidance and reminders. However, participants noted that some assignments had to be completed in a specific, detail-oriented way to earn credit and progress, even when those details did not necessarily apply to the courses being taught. Participants also suggested more feedback or check-ins from facilitators, and said it would have been helpful if all materials had been posted at the start of the program for those who wanted to work ahead when they had time in their schedules. Peer reviews were useful, though more critical feedback would have been welcome; it was beneficial to see how peers’ courses are structured and taught.

Participants would highly recommend the program to colleagues who have time to complete it. The fall semester required a large time commitment; the spring was less demanding.

There weren’t any suggestions for major changes to the program.