Celebrating Assessment at LMU

Assessment Success Stories 

There are many programs at LMU that use assessment to better understand student learning and make changes for improvement. Here we present the stories of a few of these programs. Please join us in celebrating their success.

Accounting
Animation
Chemistry and Biochemistry
Chicana/o Studies
Dance
Economics
Library Instruction
Math
Modern Languages and Literatures: Spanish

 

Accounting
February 2012

Since 2010, the Accounting Program has incorporated program assessment measures into course exams. This efficient method, which builds upon existing materials to capture information about multiple program learning outcomes at once, has allowed the program to make informed changes for improvement without placing overwhelming demands on faculty’s time or energy. 

According to Assistant Clinical Professor Nancy Coster, the approach has also encouraged faculty dialogue. When faculty met to develop measures, they were brought into conversation about the types of knowledge, skills, and values students should demonstrate at a given point in the Accounting curriculum. Explains Nancy, “The process created an opportunity for discussion about what we teach in our sections and the knowledge we expect students to be able to demonstrate… It was good to talk with each other, to explore our similarities and differences and share our various approaches to teaching.” 

After reaching consensus on an assessment measure, the Program engaged in further discussion about how to use results to inform action. In the first year, they decided to refine and re-administer their measure, and in the second year they will deliberate how to advance students’ achievement of program outcomes if evidence points to a need. In the words of Dr. Mahmoud Nourayi, Associate Dean of the College of Business Administration and Chair of the Department of Accounting, “Learning Outcome Assessment activities have provided the faculty the opportunity to view and evaluate our pedagogy from a new perspective.  The collaboration has brought about more clearly defined instructional goals.”

back to top

Animation
September 2013

Inspired by the realization that their program learning outcomes focused more on what their program would offer students than on the learning their students would demonstrate, the Animation Program undertook full revision of its program learning outcomes in 2011. With the support of an internal assessment grant secured by Assistant Professor Adriana Jaroszewicz, the Animation faculty worked collaboratively to develop new outcomes that state what students will know, be able to do, and value upon graduation; define success in a dynamic industry; and set a strong foundation for meaningful and manageable assessment.

Adriana also used grant funding to lead Animation faculty in assessment of their new program learning outcomes. In order to assess the program’s collaborative learning outcome, she developed an exercise that prompted students to reflect on their ability to work effectively in groups. She also developed a senior exit interview, later revised by the faculty as a whole, that asked students to reflect on their achievement of all program learning outcomes. Using qualitative methods she learned through workshops offered by the Center for Teaching Excellence (CTE), Adriana worked with her colleagues to analyze the results. They found that students appeared to be achieving the collaborative learning outcome, but student achievement of the technological foundations outcome did not meet expectations. Not all students were able to recognize that technical skills acquired in one course could apply to other courses and across software programs. Adriana noted that students need to learn “foundations rather than specific tools” given that software programs constantly evolve and fall in and out of use in the industry.

This insight has led Animation faculty to discuss plans for ensuring that the technological foundations outcome, and all program learning outcomes, are effectively reinforced and advanced as students move through the curriculum. Next steps include creating a curriculum map that indicates the level at which each program learning outcome is addressed in each course, developing assessment rubrics, and implementing portfolio reviews in students’ sophomore and senior years.

The Animation program offers a model for how to use quality program learning outcomes as the basis for curricular decision-making. Faculty are happy to report that doing so has made many program processes simpler and more effective. This is especially true of assessment. According to Adriana, “Having focused and specific learning outcomes makes assessment goals easier to set.”

back to top

Chemistry and Biochemistry
August 2013

Faculty in the Chemistry and Biochemistry programs in the Seaver College of Science and Engineering have initiated two major assessment projects in recent years, developing a rubric to assess students’ oral communication skills and administering a standardized subject matter test. When the initial results of both assessments did not seem to accurately reflect their students’ learning, faculty devoted time to improving their assessment methods rather than implementing changes to pedagogy or curriculum based on potentially faulty findings.

According to Jeremy McCallum, chair of the faculty committee in charge of program assessment, faculty realized after the initial administration of the standardized subject matter test that students’ motivation and performance had been affected by the fact that the test was not formally incorporated into a course. Similarly, when they first applied a rubric they had developed to junior students’ presentations, they noticed scoring disagreement among raters.

Intent on capturing more accurate pictures of their students’ learning, faculty turned their focus to improving and re-administering both assessments, brainstorming ideas to ensure that students performed to the best of their ability on the standardized test, clarifying the language of the rubric, and discussing how to make the application of the presentation rubric more consistent.  In line with their ongoing commitment to quality assessment, they plan to use the results of subsequent, more accurate assessments to inform changes that will have a direct impact on students.

The Chemistry and Biochemistry programs’ commitment to meaningful assessment, exemplified by these activities, positions the programs for continual improvement. In the words of Professor McCallum, “Our department hopes these minor adjustments to our assessment methods will give us more accurate measures to better evaluate our learning outcomes and ultimately improve student learning.”

back to top

Chicana/o Studies
March 2010

Chicana/o Studies is a relatively small department at LMU, with a yearly average of about 20 majors and minors, plus additional Liberal Studies students who have selected it as their area of emphasis. When it comes to conducting their department’s assessment, Chair Dr. Karen Mary Davalos says that because they are small, “we have to be strategic.”

One example of this strategic approach is their longitudinal assessment of the department’s three student learning outcomes. Faculty regularly collect data on each outcome and conduct a formal analysis every spring. This plan allows faculty to see growth from freshman to senior year. It also allows data from multiple years to be aggregated, so that the formal analysis is based on enough students for faculty to feel they can truly understand learning in the department.

In addition to the formal analyses, Chicana/o Studies faculty have regular discussions about student learning, including how the outcomes are evidenced in the capstone presentations given by all Chicana/o Studies students. All Chicana/o Studies faculty observe these presentations each year, and they noticed that students who had taken CHST 360 as sophomores or juniors evidenced the outcomes more clearly than those who had not. As a result, the faculty decided to advise all students to take a specific course sequence prior to the capstone. Faculty share this advice one-on-one with advisees, have hosted an information session, and have visited student club meetings to connect with students.

Dr. Davalos indicates that the key to approaching assessment as a smaller department is to “focus on one thing” and work from there. By focusing on what’s meaningful and what can be manageably accomplished, all programs can plan for successful assessment.

back to top

Dance
February 2012

For four years, Professor Teresa Heiland of the Dance Program has been engaged in Scholarship of Teaching and Learning (SoTL) research toward the goal of improving Dance majors’ writing skills. Because her objective has been not only to advance scholarship in this area, but also to improve student learning in her own program, assessment of her students’ learning has played an important role in her research. 

By capturing information about Dance majors’ writing skills through rubrics and surveys, Teresa discovered that, while creative with source materials and adept at using scholarly voice, students demonstrated less facility with grammar, syntax, and paragraph structure. As a result, the Dance Program moved to incorporate strategically scaffolded, expository writing lessons into its capstone writing assignment. Teresa continues to research the effects of this change, which have been significant, and the program intends to weave more analytical writing into the dance curriculum.

Teresa’s project brings to light the relationship between assessment and SoTL work, and the common goals they share. Furthermore, the effect her project has had on the culture of writing in the Dance Program is an example of the unforeseeable but powerful changes assessment work can bring about. Teresa explains, “I believe the project, by providing more opportunities to focus on writing and offering detailed feedback, has helped guide Dance majors toward a deeper understanding of how writing helps them reflect upon themselves and dance-related issues.”

back to top 

Economics
September 2009

For a number of years, the Department of Economics has used the Major Field Test (MFT) to assess two of its student learning outcomes. The test is administered near the end of the senior year, and the scores allow the Economics faculty to make decisions about how well their students are achieving the outcomes by the time they graduate. The MFT is not just about assessment, however: preparing for the test gives students another reason to review essential knowledge and skills for life after graduation.

A few years ago the faculty concluded that scores on the MFT did not truly reflect how well their students knew the material, so they decided to make changes so that the test results would provide a more accurate measure of students’ true abilities. Each year small changes were made, including instituting a set of topics in their principles courses to make sure that all of the needed information was covered, developing a review packet and session, using classical terminology to familiarize students with the language used on the MFT, and requiring that students pass the test by scoring above a fair but low bar. These changes, combined with a number of other improvements in the Department, resulted in improved MFT scores that the faculty feel are more representative of their students’ capabilities.

Dr. Jennifer Pate of the Department of Economics says that by adding another piece to the puzzle of improving student learning each year, she has seen how small changes can make a big difference. Closing the assessment loop does not require major action, but can be done with small and thoughtful changes.

back to top 

Library Instruction
August 2010

Each fall the reference librarians introduce 900 to 1,000 freshmen to the library and the research process in a ‘one-shot’ class session. Through a partnership with the English 110 faculty, the librarians have worked to ensure that students gain key information literacy skills in this session. Students arrive at the session having completed two library research modules, and they work through three more modules during the session.

This past year the reference librarians developed a rubric to apply to the completed modules to determine whether students were achieving the information literacy learning outcomes that the session is intended to address. A random sample of 100 worksheets was selected, and all of the reference librarians applied the rubric to an equal share of the worksheets.

Reference Librarian Elisa Slater Acosta reports that the rubric scores have helped the librarians plan modifications to their teaching materials and the worksheet, with the goal of improving student learning. Closing the loop like this is exciting, but Ms. Acosta was most enthusiastic about the assessment process itself: “Calibrating the rubric led to the most serious discussion about library instruction we’ve ever had.” In fact, reaching agreement on the rubric led the librarians to reach agreement on how they teach. For example, all librarians will now give students more examples and include more practice exercises.

Ms. Acosta said that one of the keys to their success in designing instruction to help students achieve learning outcomes is the partnership they have with the instructors of English 110. She looks forward to working on future assessment projects aimed at understanding and improving students’ information literacy within majors, and hopes that faculty will contact the library to collaborate.

back to top


Math
August 2009

The Math Department at LMU is using assessment on an ongoing basis to make certain that math students are getting the best possible education. Dr. Curtis Bennett, Chair, recently shared a story of closing the assessment loop.

Using a knowledge survey to assess their content proficiency outcome, the faculty discovered that at the end of Calculus II and the beginning of Calculus III students did not feel very confident about their abilities with sequences and series. To address this, the faculty decided to spend a bit more time during Calculus II on sequences and series; however, to do this they chose to cut time spent on examples of applications of calculus. A formal assessment of the results of this change has yet to be conducted, but the faculty feel that while students have improved with sequences and series, there seems to be a corresponding drop-off in their ability to apply calculus. At this time, the faculty are considering other ways to approach the situation and are carefully weighing the relative importance of these topics to their students as they make their decisions.

The Math Department is doing a great job of using what they learn from assessing their learning outcomes to make relatively small changes for big improvements, and as a conversation starter for faculty about what’s most important for their students. They see closing the assessment loop as an ongoing process, aware that not every change will be perfect and some may need tweaking for best effect. Dr. Bennett called these small changes ‘baby-steps,’ and said that most faculty make these kinds of assessments of student learning and changes for improvement all of the time; essentially, faculty are already doing assessment without realizing it.

back to top 


Modern Languages and Literatures: Spanish
June 2010

The Department of Modern Languages and Literatures believes that assessment of student learning is an important activity because it helps them to give their students the best possible education. Both the Spanish and French programs rely on a combination of direct measurements of learning, such as applying rubrics to student work, and indirect measurements of learning, such as asking students to reflect on what they learned in the program.

Recently the Spanish program used their assessment findings to guide their selection of a new textbook for SPAN 321: Composition and Stylistics, since the textbook then in use was out of print. The textbooks selected for pilot testing in the course were chosen based on two assessment findings. First, the rubric assessment of student writing indicated that students were making some grammatical errors. Second, in their capstone reflection on the major, students reported that they wanted to learn more grammar.

Dr. Mónica Cabrera, Chair of the Assessment Committee in Modern Languages and Literatures, reports that, “assessment has been good in that we pay attention in a systematic way to what our students say they need,” and that in this example paying attention has “resulted in better writing in my courses.” Dr. Cabrera stresses that programs should incorporate existing student work and activities, such as the capstone reflection, into their assessment plans to avoid adding to the faculty and student workload. 

Dr. Cabrera has a lot of great tips and ideas for keeping assessment manageable and effective. One point that she stresses is that to get the most out of assessment it should be incorporated into the program’s regular business. As an example of this, Modern Languages and Literatures spends time at each faculty meeting discussing assessment.  

back to top 

We know that there are more assessment success stories out there. If your program has an assessment story that you’d like to share, please contact Dr. Laura Massa, Director of Assessment.