There are many programs at LMU that use assessment to better understand student learning and make changes for improvement. Here we present the stories of a few of these programs. Please join us in celebrating their success.
The Environmental Science major is a rigorous interdisciplinary program involving course work in biology, chemistry, physics, and environmental science and engineering. Given its interdisciplinary nature, it relies on faculty affiliates to make decisions about the program and drive assessment of program-level student learning. Nicole Bouvier-Brown, Associate Professor of Chemistry & Biochemistry, recently presented at a workshop in the Center for Teaching Excellence (CTE) on the program’s assessment strategies and successes. Hosted by the Office of Assessment, the workshop was geared toward highlighting different models of program-level assessment and openly discussing program-level assessment challenges.
Professor Bouvier-Brown described the work the program has undertaken over the last year to examine learning outcomes focused on disciplinary knowledge and effective communication. The program engaged multiple faculty affiliates to determine a manageable timeline, designate who would work on each outcome, and choose courses that would be appropriate sources of evidence for analysis. They decided to use standardized tests and exam items developed by department faculty to examine students’ understanding of core concepts, and student work such as lab reports assessed with rubrics to examine students’ communication abilities. In some cases they developed the assessment measures themselves; to support this work, they won two internal assessment grants offered through the Office of Assessment.
Regular discussion helps build support for and understanding of assessment, and given the Environmental Science program’s affiliate-based structure, this practice is especially crucial. Faculty affiliates meet regularly, and the program has built assessment into these meetings, taking advantage of the time together to discuss assessment plans, report on findings, and make decisions about actions for improvement. They also address practical concerns, such as keeping assessment on track with an affiliate-heavy faculty and accounting for the many students in cross-listed courses who are pursuing majors in other programs.
The program has created a sustainable, robust model for how to approach assessment work in an interdisciplinary program.
Assessment has been embedded in the everyday workings of the Math Department for many years. In the fall of 2017, Suzanne Larson, Professor of Mathematics, shared details about the Department’s ongoing assessment activities in a workshop in the CTE. Hosted by the Office of Assessment, the workshop served to bring departments of different sizes and compositions into dialogue about how they assess their program learning outcomes.
Professor Larson described multiple tools the Math Department utilizes in its assessment of student learning, including questions embedded in exams to ensure achievement of content proficiency, rubrics applied to student papers to ensure achievement of communication skills, and – as a complement to these sources of direct evidence – senior surveys and exit interviews to gauge students’ perceptions of their learning.
The department keeps the assessment process manageable by assessing an area of content proficiency and one or two additional program learning outcomes per year. For example, during the 2016-2017 academic year, the department focused on learning outcomes having to do with the ability to use computing tools to solve problems. To assess these outcomes the department administered a knowledge survey and, as a result of what it learned from students’ responses, determined that student skills in this area could be improved. The department is writing a formal proposal to modify program curricula so that computing skills are addressed at several points across the curriculum.
The Math Department focuses on a multi-method approach to assessment, takes action when it perceives an area of student learning could be improved, and approaches assessment as a valuable fixture of departmental activity. Through these practices, which LMU as an institution holds in high regard (see LMU’s Guiding Principles for Assessment), the department demonstrates dedication to continually improving student learning.
Inspired by the realization that their program learning outcomes focused more on what their program would offer students than on the learning their students would demonstrate, the Animation Program undertook a full revision of its program learning outcomes in 2011. With the support of an internal assessment grant secured by Assistant Professor Adriana Jaroszewicz, the Animation faculty worked collaboratively to develop new outcomes that state what students will know, be able to do, and value upon graduation; define success in a dynamic industry; and set a strong foundation for meaningful and manageable assessment.
Adriana also used grant funding to lead Animation faculty in assessment of their new program learning outcomes. In order to assess the program’s collaborative learning outcome, she developed an exercise that prompted students to reflect on their ability to work effectively in groups. She also developed a senior exit interview, later revised by the faculty as a whole, that asked students to reflect on their achievement of all program learning outcomes. Using qualitative methods she learned through workshops offered by the Center for Teaching Excellence (CTE), Adriana worked with her colleagues to analyze the results. They found that students appeared to be achieving the collaborative learning outcome, but student achievement of the technological foundations outcome did not meet expectations. Not all students were able to recognize that technical skills acquired in one course could apply to other courses and across software programs. Adriana noted that students need to learn “foundations rather than specific tools” given that software programs constantly evolve and fall in and out of use by the industry.
This insight has led Animation faculty to discuss plans for ensuring that the technological foundations outcome, and all program learning outcomes, are effectively reinforced and advanced as students move throughout the curriculum. Next steps include creating a curriculum map that indicates the level at which each program learning outcome is addressed in each course, developing assessment rubrics, and implementing portfolio reviews in students’ sophomore and senior years.
The Animation program offers a model for how to use quality program learning outcomes as the basis for curricular decision-making. Faculty are happy to report that doing so has made many program processes simpler and more effective. This is especially true of assessment. According to Adriana, “Having focused and specific learning outcomes makes assessment goals easier to set.”
Faculty in the Chemistry and Biochemistry programs in the Seaver College of Science and Engineering have initiated two major assessment projects in recent years, developing a rubric to assess students’ oral communication skills and administering a standardized subject matter test. When the initial results of both assessments did not seem to accurately reflect their students’ learning, faculty devoted time to improving their assessment methods rather than implementing changes to pedagogy or curriculum based on potentially faulty findings.
According to Jeremy McCallum, chair of the faculty committee in charge of program assessment, faculty realized after the initial administration of the standardized subject matter test that students’ motivation and performance had been affected by the fact that the test was not formally incorporated into a course. Similarly, when they first applied a rubric they had developed to junior students’ presentations, they noticed scoring disagreement among raters.
Intent on capturing a more accurate picture of their students’ learning, faculty turned their focus to improving and re-administering both assessments: they brainstormed ways to ensure that students performed to the best of their ability on the standardized test, clarified the language of the rubric, and discussed how to apply the presentation rubric more consistently. In line with their ongoing commitment to quality assessment, they plan to use the results of these subsequent, more accurate assessments to inform changes that will have a direct impact on students.
The Chemistry and Biochemistry programs’ commitment to meaningful assessment, exemplified by these activities, positions the programs for continual improvement. In the words of Professor McCallum, “Our department hopes these minor adjustments to our assessment methods will give us more accurate measures to better evaluate our learning outcomes and ultimately improve student learning.”
For four years, Professor Teresa Heiland of the Dance Program has been engaged in Scholarship of Teaching and Learning (SoTL) research toward the goal of improving Dance majors’ writing skills. Because her objective has been not only to advance scholarship in this area, but also to improve student learning in her own program, assessment of her students’ learning has played an important role in her research.
By capturing information about Dance majors’ writing skill through rubrics and surveys, Teresa discovered that, while creative with source materials and adept at using scholarly voice, students demonstrated less facility with grammar, syntax, and paragraph structure. As a result, the Dance Program moved to incorporate strategically scaffolded, expository writing lessons into its capstone writing assignment. Teresa continues to research the effects of this change – which have been significant – and the program intends to weave more analytical writing into the dance curriculum.
Teresa’s project brings to light the relationship between assessment and SoTL work, and the common goals they share. Furthermore, the effect her project has had on the culture of writing in the Dance Program is an example of the unforeseeable but powerful changes assessment work can bring about. Teresa explains, “I believe the project, by providing more opportunities to focus on writing and offering detailed feedback, has helped guide Dance majors toward a deeper understanding of how writing helps them reflect upon themselves and dance-related issues.”
Since 2010, the Accounting Program has incorporated program assessment measures into course exams. This efficient method, which builds upon existing materials to capture information about multiple program learning outcomes at once, has allowed the program to make informed changes for improvement without placing overwhelming demands on faculty’s time or energy.
According to Assistant Clinical Professor Nancy Coster, the approach has also encouraged faculty dialogue. When faculty met to develop measures, they were brought into conversation about the types of knowledge, skills, and values students should demonstrate at a given point in the Accounting curriculum. Explains Nancy, “The process created an opportunity for discussion about what we teach in our sections and the knowledge we expect students to be able to demonstrate… It was good to talk with each other, to explore our similarities and differences and share our various approaches to teaching.”
After reaching consensus on an assessment measure, the program engaged in further discussion about how to use results to inform action. In the first year, faculty decided to refine and re-administer their measure; in the second year, they will deliberate how to advance students’ achievement of program outcomes if the evidence points to a need. In the words of Dr. Mahmoud Nourayi, Associate Dean of the College of Business Administration and Chair of the Department of Accounting, “Learning Outcome Assessment activities have provided the faculty the opportunity to view and evaluate our pedagogy from a new perspective. The collaboration has brought about more clearly defined instructional goals.”
We know that there are more assessment success stories out there. If your program has an assessment story that you’d like to share, please contact our office at firstname.lastname@example.org.