The Student Assessment of Learning Gains (SALG) instrument is designed for instructors from all disciplines who wish to learn more about how students evaluate various course elements in terms of how much they have gained from them. Feedback from the instrument can guide instructors in modifying their courses to enhance student learning. It may be used at any point during a course (for formative feedback) as well as at the end. The Web-delivered instrument allows you to:
Step 1: Modify the SALG instrument to fit your class.
Step 2: Implement the SALG.
Step 3: View and analyze the data.
Type of Material:
Quiz/Test for course feedback and assessment.
Because prior versions are not saved on the SALG server, it is very important to save the previous questions and student responses before changing a questionnaire.
Identify Major Learning Goals:
"The SALG instrument is designed for instructors from all disciplines who wish to learn more about how students evaluate various course elements in terms of how much they have gained from them." This is a free site for instructors who would like feedback from students about how course elements are helping them learn. Such feedback can help instructors gauge student perceptions of the efficacy of a particular teaching approach (thus reducing risk by immediately identifying some of the consequences of particular "teaching experiments").
Target Student Population:
The site is offered as a service to the college-level teaching community: higher-education faculty who want students to evaluate their courses.
Prerequisite Knowledge or Skills:
Students: very basic ability to navigate web pages. Faculty: understanding of the statistical analysis provided, and the ability to design Likert-scale items that discriminate student perceptions relevant to specific objectives.
Evaluation and Observation:
SALG is designed to help instructors collect periodic feedback from students about the quality of course components. SALG is unlike other online assessment tools in two important respects: (1) responses are anonymous, and (2) instructors can deliver course-specific (activity-specific, even week-specific) feedback items.
The sub-topic categories are excellent (aspects that help your learning, class activities, graded activities, resources, information, the way the class was taught over-all, understanding, skills, gains, impact). Many universities provide course evaluations for students to complete at the end of each semester or quarter. These evaluations are usually extremely generic and provide little, if any, information instructors can use to improve subsequent course offerings. This tool allows instructors to customize every question for a particular course, and a statistical analysis of the students' responses is made available.
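To illustrate the kind of statistical summary such a tool can provide, the sketch below computes the mean and the rating distribution for one Likert-scale item. The data, field names, and 5-point scale here are hypothetical; SALG's actual export format and analysis may differ.

```python
from collections import Counter
from statistics import mean

# Hypothetical 5-point Likert responses for one item
# ("no gain" = 1 ... "great gain" = 5); values are illustrative only.
responses = [4, 5, 3, 4, 5, 2, 4, 4, 3, 5]

item_mean = mean(responses)       # average gain rating for the item
distribution = Counter(responses) # how many students chose each rating

print(f"mean = {item_mean:.2f}")                      # prints: mean = 3.90
print(dict(sorted(distribution.items())))             # prints: {2: 1, 3: 2, 4: 4, 5: 3}
```

Even this minimal summary (a mean plus a frequency distribution) is more actionable per course element than a single end-of-term global rating.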
Among the sample items are some that measure student comfort with complex material. Simply measuring whether approach X helps a student achieve comfort with complex outcome Y makes it easier for the instructor to emphasize the importance of managing complexity as a goal. This is a boon to faculty who are struggling against the typical student response, "make it simple and just tell me what I'm supposed to know," with its underlying assumption that the instructor is not doing the job if the student must think hard. The option to collect comments relevant to specific sub-categories is a tremendous strength! This tool makes it easy to gather individual student comments anonymously.
General Comments: Instructors who truly wish to improve their teaching and course content will find this an invaluable tool.
One sub-topic, "class and lab activities," may not be the best descriptor for a class that has no lab component. The fixed-format questions are not always appropriate (e.g., asking how the class activities, labs, readings, and assignments fit together may not suit a seminar class).
General Comments: One wonders whether some of the fixed questions might be aimed at collecting data from our courses for use in some other study. Alongside the statement that "any information you place on this site is for your benefit only, and will not be revealed … without your permission," clarity is needed about whether this covers the information students put on the site. Faculty would appreciate clarification regarding how the results of student responses will or will not be used by the authors. If student responses will not be used, then why fix any question items? Fix the categories, allow instructors to define new categories, and then let all question items be designed by the instructor. It would also be helpful if more of the items or categories could be matched to our departmental student opinion questionnaires.
Potential Effectiveness as a Teaching Tool
SALG supports creativity: by shortening and facilitating the feedback/improvement cycle, it reduces the risk inherent in trying out any new instructional approach.
General Comments: This is a very valuable enhancement to any course, particularly those offered online. THANKS to the authors for providing this important resource. We hope that needed funding will continue to provide access and improvement of SALG.
The meaning of each sub-category is not entirely clear. Sub-categories might be matched more closely to learning goals, such as achievement of information competence, understanding the nature of science, and communication and writing skills.
Ease of Use for Both Students and Faculty
The questionnaire is set up so that the questions cannot be accidentally altered once students have begun responding. This provides excellent protection against the temptation to "improve" poor questions halfway through the process, which would destroy the meaning of the results if the responses gathered before the correction were not distinguished from those gathered after.
This site has limited visual appeal and is not very engaging, but perhaps the simplicity is warranted in this case. The student responses should download together with the question text (not just item numbers), since it can be difficult to track which responses match which version of the questionnaire when the instrument is used repeatedly throughout the semester.
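The version-tracking difficulty above can be worked around locally: a minimal sketch, assuming the instructor keeps a copy of each questionnaire version's question text so that downloaded responses (which carry only item numbers) can be matched back to the wording students actually saw. The data structures and version labels here are hypothetical, not SALG's export format.

```python
# Locally stored question text, keyed by (questionnaire version, item number).
# Versions "week3" and "week8" and the wording are illustrative only.
questions = {
    ("week3", 1): "How much did the readings help your learning?",
    ("week8", 1): "How much did the revised readings help your learning?",
}

# Downloaded responses: item numbers only, so the version key is essential.
responses = [
    {"version": "week3", "item": 1, "rating": 4},
    {"version": "week8", "item": 1, "rating": 5},
]

for r in responses:
    text = questions[(r["version"], r["item"])]  # recover the wording shown
    print(f'{r["version"]} item {r["item"]}: "{text}" -> {r["rating"]}')
```

Keeping such a local record is exactly the precaution noted earlier: since prior versions are not saved on the SALG server, the instructor's own copy is the only way to interpret item numbers from earlier in the semester.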