There are many programs at LMU that use assessment to better understand student learning and make changes for improvement. Here we present the stories of a few of these programs. Please join us in celebrating their success.
Since 2010, the Accounting Program has incorporated program assessment measures into course exams. This efficient method, which builds upon existing materials to capture information about multiple program learning outcomes at once, has allowed the program to make informed changes for improvement without placing overwhelming demands on faculty’s time or energy.
According to Assistant Clinical Professor Nancy Coster, the approach has also encouraged faculty dialogue. When faculty met to develop measures, they were brought into conversation about the types of knowledge, skills, and values students should demonstrate at a given point in the Accounting curriculum. Explains Nancy, “The process created an opportunity for discussion about what we teach in our sections and the knowledge we expect students to be able to demonstrate… It was good to talk with each other, to explore our similarities and differences and share our various approaches to teaching.”
After reaching consensus on an assessment measure, the Program engaged in further discussion about how to use results to inform action. In the first year, they decided to refine and re-administer their measure, and in the second year they will deliberate how to advance students’ achievement of program outcomes if evidence points to a need. In the words of Dr. Mahmoud Nourayi, Associate Dean of the College of Business Administration and Chair of the Department of Accounting, “Learning Outcome Assessment activities have provided the faculty the opportunity to view and evaluate our pedagogy from a new perspective. The collaboration has brought about more clearly defined instructional goals.”
Inspired by the realization that their program learning outcomes focused more on what their program would offer students than on the learning their students would demonstrate, the Animation Program undertook full revision of its program learning outcomes in 2011. With the support of an internal assessment grant secured by Assistant Professor Adriana Jaroszewicz, the Animation faculty worked collaboratively to develop new outcomes that state what students will know, be able to do, and value upon graduation; define success in a dynamic industry; and set a strong foundation for meaningful and manageable assessment.
Adriana also used grant funding to lead Animation faculty in assessment of their new program learning outcomes. To assess the program's collaborative learning outcome, she developed an exercise that prompted students to reflect on their ability to work effectively in groups. She also developed a senior exit interview, which the faculty as a whole revised, that asked students to reflect on their achievement of all program learning outcomes. Using qualitative methods she learned through workshops offered by the Center for Teaching Excellence (CTE), Adriana worked with her colleagues to analyze the results. They found that students appeared to be achieving the collaborative learning outcome, but student achievement of the technological foundations outcome did not meet expectations. Not all students were able to recognize that technical skills acquired in one course could apply to other courses and across software programs. Adriana noted that students need to learn “foundations rather than specific tools” given that software programs constantly evolve and fall in and out of use by the industry.
This insight has led Animation faculty to discuss plans for ensuring that the technological foundations outcome, and all program learning outcomes, are effectively reinforced and advanced as students move throughout the curriculum. Next steps include creating a curriculum map that indicates the level at which each program learning outcome is addressed in each course, developing assessment rubrics, and implementing portfolio reviews in students’ sophomore and senior years.
The Animation program offers a model for how to use quality program learning outcomes as the basis for curricular decision-making. Faculty are happy to report that doing so has made many program processes simpler and more effective. This is especially true of assessment. According to Adriana, “Having focused and specific learning outcomes makes assessment goals easier to set.”
Faculty in the Chemistry and Biochemistry programs in the Seaver College of Science and Engineering have initiated two major assessment projects in recent years, developing a rubric to assess students’ oral communication skills and administering a standardized subject matter test. When the initial results of both assessments did not seem to accurately reflect their students’ learning, faculty devoted time to improving their assessment methods rather than implementing changes to pedagogy or curriculum based on potentially faulty findings.
According to Jeremy McCallum, chair of the faculty committee in charge of program assessment, faculty realized after the initial administration of the standardized subject matter test that students’ motivation and performance had been affected by the fact that the test was not formally incorporated into a course. Similarly, when they first applied a rubric they had developed to junior students’ presentations, they noticed scoring disagreement among raters.
Intent on capturing more accurate pictures of their students’ learning, faculty turned their focus to improving and re-administering both assessments, brainstorming ideas to ensure that students performed to the best of their ability on the standardized test, clarifying the language of the rubric, and discussing how to make the application of the presentation rubric more consistent. In line with their ongoing commitment to quality assessment, they plan to use the results of subsequent, more accurate assessments to inform changes that will have a direct impact on students.
The Chemistry and Biochemistry programs’ commitment to meaningful assessment, exemplified by these activities, positions the programs for continual improvement. In the words of Professor McCallum, “Our department hopes these minor adjustments to our assessment methods will give us more accurate measures to better evaluate our learning outcomes and ultimately improve student learning.”
Chicana/o Studies is a relatively small department at LMU, with a yearly average of about 20 majors and minors, plus additional Liberal Studies students who have selected it as their area of emphasis. When it comes to conducting their department’s assessment, Chair Dr. Karen Mary Davalos says that because they are small, “we have to be strategic.”
One example of their strategic approach is in their longitudinal assessment of their three student learning outcomes. Faculty regularly collect data on each of the outcomes, and conduct a formal analysis every spring. Their plan allows faculty to see growth from freshman to senior year. It also allows the possibility of aggregating data from multiple years so that the formal analysis is based on a large enough number of students that faculty feel they can really understand learning in the department.
In addition to the formal analyses, Chicana/o Studies faculty have regular discussions about student learning, including how the outcomes are evidenced in capstone presentations given by all Chicana/o Studies students. All Chicana/o Studies faculty observe these presentations each year, and they noted that students who had taken CHST 360 as sophomores or juniors more clearly evidenced the outcomes than those who had not. As a result, Chicana/o Studies faculty decided to advise all students to take a specific course sequence prior to the capstone. Faculty share this information one-on-one with advisees, have hosted an information session, and have visited student club meetings to connect with students.
Dr. Davalos indicates that the key to approaching assessment as a smaller department is to, “focus on one thing,” and work from there. By focusing on what’s meaningful and what can be manageably accomplished, all programs can plan for successful assessment.
For four years, Professor Teresa Heiland of the Dance Program has been engaged in Scholarship of Teaching and Learning (SoTL) research toward the goal of improving Dance majors’ writing skills. Because her objective has been not only to advance scholarship in this area, but also to improve student learning in her own program, assessment of her students’ learning has played an important role in her research.
By capturing information about Dance majors’ writing skill through rubrics and surveys, Teresa discovered that, while creative with source materials and adept at using scholarly voice, students demonstrated less facility with grammar, syntax, and paragraph structure. As a result, the Dance Program moved to incorporate strategically scaffolded, expository writing lessons into its capstone writing assignment. Teresa continues to research the effects of this change – which have been significant – and the program intends to weave more analytical writing into the dance curriculum.
Teresa’s project brings to light the relationship between assessment and SoTL work, and the common goals they share. Furthermore, the effect her project has had on the culture of writing in the Dance Program is an example of the unforeseeable but powerful changes assessment work can bring about. Teresa explains, “I believe the project, by providing more opportunities to focus on writing and offering detailed feedback, has helped guide Dance majors toward a deeper understanding of how writing helps them reflect upon themselves and dance-related issues.”
For a number of years, the Department of Economics has used the Major Field Test (MFT) to assess two of its student learning outcomes. The test is administered near the end of the senior year, and the scores provided allow the Economics faculty to make decisions about how well their students are accomplishing the outcomes by the time they graduate. The MFT is not just about assessment, however, because preparing for the test means students have another reason to review essential knowledge and skills for life after graduation.
A few years ago the faculty concluded that scores on the MFT were not truly reflective of how well they felt their students knew the material. The faculty decided to make a few changes for improvement so that the test results provided a more accurate measure of the true abilities of students. Each year small changes were made, including instituting a set of topics in their principles courses to make sure that all of the needed information was covered, developing a review packet and session, using classical terminology to familiarize students with language used on the MFT, and requiring that students pass the test by scoring above a fair but low bar. These changes, combined with a number of other improvements in the Department, resulted in improved MFT scores that the faculty feel are more representative of the capabilities of their students.
Dr. Jennifer Pate of the Department of Economics says that by adding another piece to the puzzle of improving student learning each year, she has seen how small changes can make a big difference. Closing the assessment loop does not require major action, but can be done with small and thoughtful changes.
The Environmental Science major is a rigorous interdisciplinary program involving course work in biology, chemistry, physics, and environmental science and engineering. Given its interdisciplinary nature, it relies on faculty affiliates to make decisions about the program and drive assessment of program-level student learning. Nicole Bouvier-Brown, Associate Professor of Chemistry & Biochemistry, recently presented at a workshop in the CTE on the program’s assessment strategies and successes. Hosted by the Office of Assessment, the workshop was geared toward highlighting different models of program-level assessment and openly discussing program-level assessment challenges.
Professor Bouvier-Brown described the work the program has undertaken over the last year to examine learning outcomes focused on disciplinary knowledge and effective communication. The program engaged multiple faculty affiliates to determine a manageable timeline, designate who would work on each outcome, and choose courses that would be appropriate sources of evidence for analysis. They decided to use standardized tests and exam items developed by department faculty to examine students’ understanding of core concepts, and student work such as lab reports assessed with rubrics to examine students’ communication abilities. In some cases they developed the assessment measures themselves, and, to assist with this process, won two separate internal assessment grants offered through the Office of Assessment.
Regular discussions about assessment help build support for and understanding of assessment, and given the structure of the Environmental Science program, this practice is especially crucial. Faculty affiliates meet regularly, and the program has built discussion about assessment into these meetings, taking advantage of their time together to discuss assessment plans, report on findings, and make decisions about actions for improvement. They also discuss very practical concerns about keeping assessment on track given an affiliate-heavy faculty, as well as other issues that arise in such programs; for example, how to deal with the fact that many students in their cross-listed courses are pursuing majors in other programs.
The program has created a sustainable, robust model for how to approach assessment work in an interdisciplinary program.
Each fall the reference librarians introduce 900 to 1,000 freshmen to the library and the research process in a ‘one-shot’ class session. Through a partnership with the English 110 faculty, the librarians have worked to make sure that students gain key information literacy skills in this session. Students arrive at the session having completed two library research modules, and they work through three more modules during the session.
This past year the reference librarians developed a rubric to apply to the completed modules to determine whether students were achieving the information literacy learning outcomes that the session is intended to address. A random sample of 100 worksheets was selected, and all of the reference librarians applied the rubric to an equal share of worksheets.
Reference Librarian Elisa Slater Acosta reports that the rubric scores have helped the librarians plan modifications to their teaching materials and the worksheet, with the goal of improving student learning. Closing the loop like this is exciting, but Ms. Acosta became most animated when she discussed the assessment process itself: “Calibrating the rubric led to the most serious discussion about library instruction we’ve ever had.” In fact, reaching agreement on the rubric led the librarians to reach agreement on how they teach. For example, all librarians will give students more examples and include more practice exercises.
Ms. Acosta said that one of the keys to their success in designing instruction to help students achieve learning outcomes is the partnership they have with the instructors of English 110. She looks forward to working on future assessment projects aimed at understanding and improving students’ information literacy within majors, and hopes that faculty will contact the library to collaborate.
Assessment has been embedded in the everyday workings of the Math Department for many years. In the fall of 2017, Suzanne Larson, Professor of Mathematics, shared details about the Department’s ongoing assessment activities in a workshop in the CTE. Hosted by the Office of Assessment, the workshop served to bring departments of different sizes and compositions into dialogue about how they assess their program learning outcomes.
Professor Larson described multiple tools the Math Department utilizes in its assessment of student learning, including questions embedded in exams to ensure achievement of content proficiency, rubrics applied to student papers to ensure achievement of communication skill, and – as a complement to these sources of direct evidence – senior surveys and exit interviews to gauge students’ perceptions of their learning.
The department keeps the assessment process manageable by assessing an area of content proficiency and one or two additional program learning outcomes per year. For example, during the 2016-2017 academic year, the department focused on learning outcomes having to do with the ability to use computing tools to solve problems. To assess these outcomes the department administered a knowledge survey and, as a result of what it learned from students’ responses, determined that student skills in this area could be improved. The department is writing a formal proposal to modify program curricula so that computing skills are addressed at several points throughout.
The Math Department focuses on a multi-method approach to assessment, takes action when it perceives an area of student learning could be improved, and approaches assessment as a valuable fixture of departmental activity. Through these practices, which LMU as an institution holds in high regard (see LMU’s Guiding Principles for Assessment), the department demonstrates dedication to continually improving student learning.
The Department of Modern Languages and Literatures believes that assessment of student learning is an important activity because it helps them to give their students the best possible education. Both the Spanish and French programs rely on a combination of direct measurements of learning, such as applying rubrics to student work, and indirect measurements of learning, such as asking students to reflect on what they learned in the program.
Recently the Spanish program used their assessment findings to guide their selection of a new textbook for SPAN 321: Composition and Stylistics, since the current textbook was out of print. The textbooks selected for pilot testing in the course were chosen based on two assessment findings. First, the rubric assessment of student writing indicated that students made some grammatical errors in writing. Second, in their capstone reflection on the major, students reported that they wanted to learn more grammar.
Dr. Mónica Cabrera, Chair of the Assessment Committee in Modern Languages and Literatures, reports that, “assessment has been good in that we pay attention in a systematic way to what our students say they need,” and that in this example paying attention has “resulted in better writing in my courses.” Dr. Cabrera stresses that programs should incorporate existing student work and activities, such as the capstone reflection, into their assessment plans to avoid adding to the faculty and student workload.
Dr. Cabrera has a lot of great tips and ideas for keeping assessment manageable and effective. One point that she stresses is that to get the most out of assessment it should be incorporated into the program’s regular business. As an example of this, Modern Languages and Literatures spends time at each faculty meeting discussing assessment.
We know that there are more assessment success stories out there. If your program has an assessment story that you’d like to share, please contact Dr. Chris Schmader, Research Associate in Assessment.