The guiding principles of Southeast's assessment program clearly specify that assessment is to be motivated, informed, and evaluated in terms of its contribution to continuous program improvement. Though it is understood that the results of assessment will typically indicate that programs are functioning satisfactorily, it is also expected that changes in curriculum, instruction, and practices will result from assessment efforts. The process of assessment can itself improve the quality of teaching and learning by bringing faculty together to articulate shared standards and expectations (for example, when a department agrees on rubrics for scoring a performance-based assessment procedure) or by improving communication between faculty and students (for example, when a department conducts exit interviews). The results of assessment can help to identify components of the curriculum that need to be strengthened (for example, when a subscore on a nationally normed exam indicates that students are relatively weak in that area) or demonstrate that an effort to improve has succeeded (for example, when data show that retention rates have increased or failure rates on a writing exam have decreased).
Longitudinal assessments have been used to demonstrate that in certain skill areas students at Southeast improve over time. The average writing proficiency score of cohorts tracked through the University's three-tiered writing assessment program increases significantly, and comparison of freshman and upper-division student scores on the Academic Profile suggests that students improve in key areas of general education at a level equivalent to students at comparable institutions. But, like Peter Ewell ("Back to the Future," Change, November 1991, pages 4-8), the Assessment Review Committee has found that the use of these and other assessment data for program improvement has been neither dramatic nor extensive. Nevertheless, we can cite several instances in which assessment has led to positive changes. When early results from the writing assessment program suggested that students in EN 140 were not receiving focused instruction on essay writing in writing-on-demand situations, the English department conducted a full-day workshop on holistic scoring for all composition instructors. Subsequently, a 32.6% failure rate on the Writing Proficiency Exit exam was reduced to 20.7% and then to 17%. Results from the seventy-five-hour writing proficiency test led to the creation of a portfolio-oriented upper-division composition course for students requiring additional instruction in basic writing skills. Based on an analysis of subscores on licensure exams, Nursing and Human Environmental Studies revised curricula within their departments. The Department of Mass Communication revised its curriculum based on information collected in senior exit interviews. The Department of Art reexamined drawing components of its curriculum in response to the evaluations of an external judge for a senior art exhibition.
Evaluations by a faculty workgroup of student writing in General Education interdisciplinary courses led to the publication of student and faculty guidelines for writing assignments in all such courses. A major thrust in the revision of departmental assessment plans is to increase emphasis on the collection and use of information that supports continuous program improvement.