Congratulations to all of our past Summer Assessment Grant Recipients on their hard work and dedication to improving assessment efforts in their programs. Below you will find summaries of the work completed by the grant recipients from 2009 to 2018.
Summer Assessment Grant 2018 Recipients
Associate Professor Adriana Jaroszewicz developed an assessment plan for the Animation Department to evaluate two program learning outcomes, and a rubric to gather data on students’ thesis projects from ANIM 495/496. The plan sets discussion and analysis goals for two existing survey measures; rubric data will be collected at the end of the Spring semester during the screening of the thesis projects. The plan is being presented to the Animation faculty, and upon approval, it will be carried out to inform the direction of the program.
Asian and Pacific Studies
Yanjie Wang in the Department of Asian and Asian American Studies created a rubric to assess the twenty-three senior thesis projects completed in Spring 2018 by the majors and minors in the Asian and Pacific Studies program (ASPA). The goal is to examine students’ accomplishment of the ASPA learning outcomes. The assessment identified students’ areas of interest as well as the effectiveness of the knowledge and skills imparted to them. The outcome of the assessment was presented at the departmental faculty retreat in Fall 2018 to help the department further refine its interdisciplinary curriculum and consider its hiring priorities.
The Counseling Program is creating new opportunities for bilingual counseling training and assessment. The current project, led by Dr. Fernando Estrada, focused on the development of materials to help instructors evaluate oral and written Spanish language proficiency. Students are now able to translate from English to Spanish several forms that are common in counseling, and receive formal assessment of their translation performance. Stimulus materials and rubrics were piloted; they align with counseling-based competencies as well as the standards set by the American Council on the Teaching of Foreign Languages.
The History Department’s Curriculum and Assessment Committee assessed our Department Learning Outcome #5 (“Create historical arguments and narratives”). We solicited papers from three upper-division courses. Members developed a rubric and engaged in an exercise to coordinate criteria for scoring. We selected fifteen papers from each of the three courses; each member read and scored two-thirds of the total, and we averaged the scores for each paper and then across all papers. Our results show that students are doing a “good” to “very good” job of addressing Department Learning Outcome #5. We will discuss these results with the Department and assess another Learning Outcome in 2019.
This assessment project entails planning and implementing the assessment of an LMU Marketing Program learning outcome (oral communication) for an innovative teaching method: development of a marketing plan in the form of videos for the most fundamental marketing course, which is required for all business majors. Recent teaching literature provides substantial evidence that millennials’ preferred teaching and learning processes differ significantly from those of previous generations. Learning outcomes based on teaching methods incorporating video deliverables in business courses have yet to be assessed. This project specifically assesses the ‘students should develop effective oral and written communication skills’ learning outcome using the Oral Communication VALUE Rubric.
Associate Professors Anna Bargagliotti and Christina Eubanks-Turner developed an exit survey for the Master of Arts in Teaching Mathematics (MAT) program. The survey will inform MAT faculty and staff of how students perceive the curriculum and its relation to their specialized content knowledge, where specialized content knowledge, as defined by Ball, Hill, and Bass (2005), is a deeper understanding of mathematics that allows teachers to explain new ideas, work problems in multiple ways, and analyze student solutions. Survey results will be used to inform future curriculum design and create a more cohesive curriculum that serves to strengthen students’ mathematical content knowledge.
The Mathematics Department is currently undergoing the Academic Program Review process and has decided to focus on four primary areas of inquiry. Specifically, we are (1) reviewing our Master of Arts in Teaching program, (2) examining the diverse teaching styles of our faculty, (3) reflecting on the use of computing in our teaching, and (4) investigating the ways in which our faculty engage in service at multiple levels. We collected data in multiple forms about each of these areas during the previous academic year, and this summer we analyzed the data. This upcoming academic year, we will decide as a department how to use these data to draw conclusions about our future directions.
Stephanie Limoncelli and Rachel Washburn assessed sociology students’ ability to use sociological information to critically analyze contemporary social issues. A rubric was developed, evidence was collected from Sociology Seminar courses taught in AY 2017-2018, and the rubric was applied to a sample of papers from this course. Rubric scores were quantitatively analyzed to characterize student achievement on the selected learning outcome. This evidence was presented in a short report to the Sociology Department, along with recommendations for improving student learning, at the annual fall retreat. This report will be used to make curricular decisions during the 2018-2019 academic year and the process employed to generate the data will serve as a model for future assessment activities.
I have created a rubric for skills-heavy classes which can be adapted as needed. This rubric assesses, in a non-quantitative yet objective manner, students’ abilities and learning within skill-based techniques and the use of materials. The next steps in this process are already underway. I have shared this rubric with my colleagues and asked for feedback to help tweak and “standardize” it. I will then implement it in THEA 124, Costume Crafts, in Fall 2018, and encourage my colleagues to do so in their classes. We can use this as a way to continue to examine how well our students meet the demands of our major and prepare them for their futures.
The Master of Arts in Yoga Studies program faculty and staff developed a yearly assessment plan comprising review of both individual courses and the overall program outcomes. The process of ongoing annual program assessment will ensure accountability and integrity between individual course work and overall program goals. This year, as a result of this reflection, the Yoga Studies M.A. program refined its overall program outcomes. Next year the faculty and staff will assess an additional three courses in light of the revised program outcomes. The results will inform future curricular decisions.
Summer Assessment Grant 2017 Recipients
The Counseling program is in the process of creating new opportunities for bilingual counseling training. The current project, led by Assistant Professor Fernando Estrada, focused on the development of assessment tools to help instructors evaluate language proficiency among students conducting Spanish-language counseling in their fieldwork sites. Three rubrics were created and piloted. The rubrics align with several fieldwork counseling competencies, and also reflect the standards set by the American Council on the Teaching of Foreign Languages. Content expertise was provided by the Center for Equity for English Learners. The program looks forward to using the rubrics and additional bilingual counseling competency descriptors to assess and recognize candidates with valuable bilingual skills.
In an effort to assess its program learning outcomes, the Economics department has been using its own version of the MFT (Major Field Test), administered as a required one-unit course. Prior to administration, the department allows the relevant instructor(s) to offer a review session for students. However, time constraints preclude a comprehensive review of the material. Associate Professor Zaki Eusufzai created a set of nine tutorials covering the relevant statistics and econometrics material using Snag-It, a screen-casting application. These tutorials are now available online and allow students to prepare at their own pace, on their own time.
Associate Professor Nicole Bouvier-Brown and Professor Martina Ramirez created an exam to assess an Environmental Science program learning outcome focused on discipline-specific knowledge. The exam will be given in the ENVS 491 capstone course and responses will be evaluated using a rubric. Responses and evaluations will be shared with the ENVS Advisory Board over the next academic year to see if modifications are needed for next year’s assessment tool. Ongoing discussion within the program will help shape assessments for other program outcomes as well as ensure that tools such as this exam continue to be developed for years to come.
Professor Lambert Doezema and Associate Professor Michelle Lum developed an assessment plan for an Environmental Science program learning outcome focused on discipline-specific knowledge. They determined the core concepts to assess as well as the courses to be used for the analysis. A portion of the assessment will involve the use of standardized tests; however, they also identified student products to collect and developed accompanying rubrics. The assessment plan is being presented to the Environmental Science faculty affiliates, and upon approval, the collection of student work will begin in fall 2017.
Health and Human Sciences
Associate Professor Sarah Strand, Assistant Professor William McCormack, and Clinical Assistant Professor Stephanie Perez worked to create an implementable assessment plan for the department. They edited and combined seven program outcomes to create four new ones, created performance indicators for each of the new learning outcomes, set proposed benchmarks, and determined the frequency of measurement. The work will be done primarily via assignments and evaluations that currently exist in classes, with some changes to rubrics in order to better capture information about student learning. As a result of what they learned through this project, the faculty will be recommending that a current upper division lecture and lab be changed to a required course for all majors.
Marital and Family Therapy
Assistant Professor Louvenia Jackson developed instruments to assess the impact of incorporating instruction about the concept of cultural humility in the MFT/art therapy curricula. A combination of pre/post measures, offered before and after community-based art engagement field excursions, and examination of students’ art pieces, art statements, and written reflections will be used to assess development of students’ understanding and valuing of the concept of cultural humility. The project offers a qualitative and a quantitative methodology to assess student learning and establishes evidence for the effectiveness of the department’s evolving pedagogical practices and its core commitment to community-based engagement and cultural awareness.
Professor Brian Leung, Professor Terese Aceves, Associate Professor Emily Fisher, and Lecturer Janet Fulton-Jackson engaged in a comprehensive review and analysis of six assessments related to candidate performance. These assessments include (1) results from a national exam; (2) signature assignments from all program courses; (3) candidate performance from second year fieldwork; (4) candidate performance from third year fieldwork, as assessed by field supervisors; (5) candidate performance from fieldwork, as assessed by faculty; and (6) interns’ positive impact on students and family, as assessed by faculty. Program improvements have been derived from analysis of these assessments, and a report summarizing the assessments is being prepared for continued national accreditation of the School Psychology program.
Summer Assessment Grant 2016 Recipients
Nicole Bouvier-Brown and Lambert Doezema developed an entrance exam for Chemistry students going into the department’s freshmen general chemistry courses. The exam is administered online through Blackboard, which means that future students could potentially take it prior to the summer freshmen orientation session. The exam will allow the department to better understand the level of knowledge with which students are entering the program and – through administration of another assessment at the end of the course – what types of gains in knowledge the course is facilitating. The department also plans to explore whether course performance can be predicted from certain exam questions.
Humanities faculty members Alexandra Neel and Aine O’Healy developed a rubric for the Humanities program’s capstone project, as well as a rubric to assess the program’s oral presentation final. Three faculty members from BCLA applied the capstone rubric to three years of capstone projects, collecting invaluable data about their students’ level of learning in several key areas. The Humanities Program is going through the Academic Program Review process beginning fall 2016. They will include the information they collected as direct evidence of their students’ achievement of program outcomes, and will use it to inform future actions.
In concert with a revision to the undergraduate Finance curriculum, David Offenberg initiated a review of the major’s learning outcomes. Following a review of the literature and a study of learning outcomes at peer and aspirational universities, a new set of learning outcomes was proposed. The finance faculty then met to further refine the learning outcomes to ensure that they were realistic, clearly stated, achievable, assessable, and stemmed from overarching program goals. They also worked to create a curriculum map that shows where in the new curriculum the new learning outcomes are introduced, reinforced, and assessed.
Led by Stuart Ching and Michael Datcher, the English Department’s Assessment Committee developed an assessment plan comprising two strands: an annual spring assessment targeting student writing in specific areas of the English curriculum, and periodic assessments informing or emerging from Academic Program Review (APR) cycles. The annual writing assessment will ensure departmental accountability and continuity among different departmental administrations, and the periodic assessment will enable the department to connect assessment to program/curricular changes and pedagogical questions arising from APR cycles. Both will help the Department realize and sustain an ongoing culture of assessment.
Led by Vanessa Newell, faculty in the Production department applied a rubric to films from two sections of a graduate course – one offered prior to a pilot change in the design of the course and one offered after. The department wanted to examine whether allowing students to incorporate dialogue into their first-year film impacted the films’ visual quality. Faculty involved in the project engaged in a norming session prior to scoring to promote consistent application across assessors, and will present their findings to the faculty as a whole in fall 2016. The department will take what they learned from the project into consideration in deciding whether to make the change permanent, investigate a different curricular change, or revert to the original curriculum.
Summer Assessment Grant 2015 Recipients
Charles Peterson led the Art History program in a major revision of the two knowledge surveys used for assessment in their two cornerstone lower-division courses. Improving understanding of student learning in these courses was important to the program given that the courses fulfill several core requirements, attract students from a range of backgrounds and disciplines, and account for the first part of many Art History majors’ and minors’ program curriculum. The surveys were integrated into Qualtrics, LMU’s online survey platform, so that students could access them with computers and mobile devices. The revised surveys were first administered at the start of the fall 2015 semester.
Kathleen Norris and Judy Battaglia developed an online survey to assess Communication Studies Academic Advising. The survey includes a variety of questions related to their Academic Advising student learning outcomes and will be administered for the first time in November 2015, after advising for the spring 2016 semester is complete. Results will be shared with both the Communication Studies faculty and the College of Communication and Fine Arts’ Office of Academic Affairs. The results will be used to improve Communication Studies academic advising practices as needed. Communication Studies plans to distribute the survey and use the results on an annual basis.
In response to university strategic planning initiatives focused on strengthening academic rigor and graduate education, the Counseling Program recently implemented a multiple-choice comprehensive exam that tests students’ knowledge in several content areas. Fernando Estrada led an analysis of exam data in order to determine performance trends, low- and high-scoring domains, and shifts over time, and worked collaboratively with both full-time faculty and adjunct professors in the program to make changes based upon the findings. Changes included modifications to the exam and the creation of additional items.
Electrical Engineering, Graduate Program
Lei Huang and Stephanie August led their colleagues in the department of Electrical Engineering and Computer Science to establish a program assessment framework and process. They revised program educational objectives and refined student learning outcomes for several graduate-level courses, and identified possible rubrics and other techniques that could be used to assess each of the student outcomes addressed in each course. A pilot implementation of this framework and process will be carried out in the graduate courses offered in the fall 2015 semester.
English, Accelerated Masters
Dermot Ryan, the Director of the graduate program in English, worked to develop a program proposal for an “Accelerated Masters” (AM) that would provide the best undergraduates at LMU with the ability to earn a combined Bachelor’s and Master’s degree in five years. He developed the learning outcomes and a curriculum map for the proposed program. He also researched similar programs at comparator schools and interviewed a number of the program directors to assess the potential costs and benefits of offering an AM. A proposal to develop an AM was approved by the English Department and they plan to submit their proposal to the APRC for review in fall 2015.
Health and Human Sciences
For the past nine years, LMU students who have completed a yearlong sequence of Human Anatomy and Physiology have taken a nationwide comprehensive exam created by the Human Anatomy and Physiology Society. Todd Shoepe analyzed the exam data in order to see what percentage of all LMU students who take the exam – and what percentage of LMU Natural Science/Health and Human Sciences students who take the exam – matriculate to graduate or professional school. He also compared LMU students’ scores to the national average. The resulting database will serve as a scaffold to maintain an annual collection of data as part of an ongoing departmental assessment plan.
Velitchka D. Kaltcheva in the Department of Marketing and Business Law researched literature on program assessment strategies and developed a plan for assessing the Marketing program’s eight learning outcomes. Once direct and indirect measures for assessing all eight outcomes have been developed by the department, the plan will ensure the completion of one assessment cycle every two years. At this time, the Defining Issues Test (DIT) has been identified as a suitable measure for assessing Marketing students’ moral reasoning ability.
In order to assess the department’s learning outcomes related to written communication and critical thinking, Jennifer Ramos and other members of the Political Science assessment committee evaluated senior seminar research papers using a rubric that had been established last year in conjunction with other department faculty. After ensuring inter-coder reliability, they presented their results at a faculty retreat and made recommendations for how the department might further refine its program assessment; for example, through following a pre-/post-assessment model. The committee also made suggestions for coordinating course policies.
The Psychology Department administers case study questions to every graduating Psychology cohort, but the results had never been analyzed. Nora Murphy examined the data from 2008 to present, analyzing item reliability and the score distributions by content area and cohort. She created a master SPSS file, an SPSS syntax scoring file, and a step-by-step guide for entering and scoring case study data and conducting analyses in the future. She will present findings and offer recommendations for future assessments during a department meeting in the fall 2015 semester.
Brian Leung led School Psychology faculty in creating an online folio of evidence to facilitate an upcoming reaccreditation review. The work involved summarizing performance data on students who have completed the program in the last three years, including candidates’ performance in program coursework and their impact on P-12 students during school internships. The program also reflected on continuing improvement strategies, deciding upon a number of modifications over the next several years. These include realigning course content to accreditation needs and refining methods for assessing candidate dispositions.
Saeri Cho Dobson from the Studio Arts program created a new rubric for Graphic Design Faculty to apply at the Annual Graphic Design Junior Portfolio Review. The rubric will be used to assess students’ ability to analyze and present three potential thesis ideas, as well as their capacity to contextualize these topics within their own contemporary design practice. She also developed a rubric that can be applied across courses to assess progress towards the development of a strong design portfolio. Over the next year, faculty will analyze the rubric evidence they gather to pinpoint areas of strength and areas in need of improvement.
Summer Assessment Grant 2014 Recipients
Adriana Jaroszewicz conducted qualitative analysis of the Animation program’s annual learning outcomes survey and led Animation faculty in the development of a self-assessment that students will fill out at two points in the curriculum: sophomore year and senior year. In addition, at both points students will be required to compile portfolios of their work. Adriana created a rubric for assessing demonstrated learning in the portfolios. The portfolio assessment will generate evidence of student learning and facilitate faculty-student dialogue about students’ strengths, areas in need of improvement, and career options.
Amanda Herring led Art History faculty in modifying their assessment of their 200-series lower-division courses. They created a new assignment template for each class that will address key learning outcomes, notably the ability to write critically and the ability to analyze textual and visual sources within their historical context. They also developed a rubric that can be applied across courses to examine students’ achievement of the outcomes. Over the next year, faculty will analyze the rubric scores to pinpoint areas of strength and areas in need of improvement.
Casa de la Mateada Study Abroad Program
Jenifer Abe from the Psychology Department set out to examine the impact of the Casa de la Mateada study abroad program on student learning. She conducted and transcribed pre- and post-program interviews with two cohorts of students and has begun the process of administering six-month follow-up surveys. She will continue to use qualitative methods to identify themes in the interviews and surveys, and will use what she finds, along with student work from classes, to understand students’ achievement of learning outcomes in the program’s first two years.
Chemistry & Biochemistry
The Chemistry and Biochemistry Department decided to implement an updated standardized exit exam to assess seniors’ content knowledge. Professors Jeremy McCallum and Emily Jarvis, in consultation with the other faculty members in the department, rated the 60 items on the new exam according to problem difficulty and content area, and mapped the topics onto the department’s curriculum map. They statistically analyzed student answers and, for questions that the majority of students answered incorrectly, examined the related course content. They presented their results at the program’s annual assessment retreat and will discuss actions for improvement at an upcoming retreat.
Classics and Archaeology
Katerina Zacharia in collaboration with Maire Ford from the Psychology Department developed assessment tools and a department assessment plan, which Classics and Archaeology faculty unanimously approved. The tools consist of an online senior exit survey to assess students’ experiences in the department and their perceptions of their learning, and a rubric that can be applied to student presentations from the annual departmental research symposium to assess two learning outcomes common to all majors and minors. The department also decided to add a 6-unit required capstone for all students to promote disciplinary writing, oral presentation, and research skills.
Jennifer Ramos and her colleagues in Political Science engaged in assessment of four program learning outcomes. To assess students’ ability to think critically and communicate effectively about politics, they revised an existing rubric, agreed on its use, and applied it to student work. To understand the extent to which students value active citizenship and a just society, they analyzed responses to a senior assessment survey. They discussed the results of the assessments at a department retreat and were pleased to have objective data to help them with current and future department decisions.
The Production program revised its program learning outcomes to better articulate what they expect students to know, be able to do, and value upon graduation. Mikael Kreuzriegler and Vanessa Newell led the effort, drafting clear and measurable outcomes and a map that shows where in the curriculum the outcomes are addressed. Upon approval of the proposed program-level outcomes, Mikael and Vanessa will lead faculty in the review and revision of course-level outcomes. Through this process they hope to standardize and strengthen course-level outcomes and to ensure that course-level and program-level outcomes align.
Summer Assessment Grant 2013 Recipients
Marilyn Beker, in discussion with colleagues and after careful review of existing syllabi in the prerequisite screenwriting courses SCWR 220 and FTVS 210, developed a knowledge survey to assess learning in these courses. This fall, survey results will be tabulated; based on these results, the Screenwriting faculty will determine how to standardize syllabi and whether and how to augment the courses. They will also consider the development of a new course to directly address areas of knowledge deficiency in film aesthetics as they apply to screenwriting.
Upon the graduation of its first cohort, the Entrepreneurship Program decided to revisit its program’s learning outcomes. Led by David Choi, with guidance from Associate Professor Linda Leon and Director of Assessment Laura Massa, faculty worked to update program and course-level learning outcomes based on best practices at other schools, review of academic research, and their current expectation of what students should know and be able to demonstrate when they graduate. They also worked together to create a curriculum map to evaluate how well courses address and reinforce the new outcomes. Finally they designed a rubric to assess student achievement of two of the new outcomes, and planned to implement the rubric during the 2013-2014 academic year.
Fernando Estrada and William Parham worked collaboratively with other full-time and adjunct professors in the Counseling Program to develop a comprehensive exam that will be administered at the end of graduate training and will assess content knowledge across the eleven core domains of the curriculum. The project is intended not only to help faculty understand learning in each cohort, but also to help them identify significant shifts in learning over time and support their efforts to design evidence-driven interventions.
In the spring semester of 2013, Professor Mikael Kreuzriegler devised new rubrics to assess a different set of Production program learning outcomes than had been assessed in previous years. He then piloted those rubrics at the final screenings of the undergraduate and graduate Production capstone films. During the summer Mikael tabulated the completed rubrics and prepared a PowerPoint presentation of the results and conclusions, which he will share with Production Faculty this fall in order to prompt discussion of students’ performance, the pilot measures, and which learning outcomes the assessment process should focus on in the coming year.
Educational Support Services
Brian Leung led faculty and staff in the Department of Educational Support Services in a summer retreat to identify specific activities and assessments to support the School of Education’s newly adopted strategic plan. The faculty and staff, who welcomed the opportunity to share their opinions and expertise through group work, successfully completed their retreat goals. With the support of the program’s directors, they intend to implement the plans they agreed upon and will document their progress in the program’s annual report for the Dean of the School of Education.
Holli Levitsky and the Jewish Studies Program assessment team examined several aspects of the now six-year-old program. Evidence of student learning was used to revise program learning outcomes. In addition they planned to collect evidence over the next one to two years that will allow them to determine if their summer study abroad program should be a requirement for the minor. Finally, they identified and developed courses to be offered as part of the new Core Curriculum.
During the program review process the Production Department identified curricular issues with students’ postproduction sound knowledge and abilities. As a result, the Department instituted a pilot approach to sound mixing during the 2012-2013 academic year. This pilot approach paired experienced sound mixers with students in the final mixing of students’ capstone films. Vanessa Newell’s project assessed the value of this approach. To do this she surveyed the mixing experts and the students who participated. Her findings exposed gaps and successes in the curriculum and will be used by the Dean and Production Faculty to determine whether to continue this approach to sound mixing.
Summer Assessment Grant 2012 Recipients
Jennifer Ramos – Political Science
The goal of Jennifer’s project was to develop a plan for assessing the Political Science program learning outcomes, which were recently modified in light of significant curricular and personnel changes. Over the course of the summer, she created measures for the seven learning outcomes, including rubrics and knowledge survey questions. She presented the assessment plan at the first fall department meeting, and the department has decided to move forward with piloting the proposed measures over the course of the 2012-2013 academic year.
Dean Scheibel – Communication Studies
Dean has begun building the foundation for what he hopes will become an interdisciplinary minor in Public Relations and Strategic Communications. He drafted and sought faculty feedback on a mission statement, program goals, and student learning outcomes, and continues to work on developing faculty goals and objectives. In the coming year, he intends to meet with Public Relations faculty to discuss the mapping of course content onto student learning outcomes and the sequencing of public relations courses that might eventually comprise a minor.
Karol Hoeffner – Screenwriting
Karol and her colleagues are in the process of creating and testing assessment tools to measure students’ achievement of the learning outcomes for the Bachelor of Arts in Screenwriting. In the spring of 2012, members of the Department agreed that a faculty-developed rubric would effectively serve as its primary direct assessment tool. Karol felt that it would be useful to supplement the rubric findings with an indirect assessment, so she worked over the summer to develop a survey. The survey, which serves to capture seniors’ perceptions of their learning and the educational environment, will be piloted this fall.
Sarah Strand – Health and Human Sciences
The Athletic Training Program’s accrediting body, CAATE, recently devised new standards and guidelines for accreditation, so Sarah and other members of the Athletic Training faculty took on the project of re-evaluating where each competency is being taught and assessed within the program. By August 2012, they had ensured that each competency was covered in at least one didactic or clinical course. In order to improve student learning and ensure the success of their upcoming accreditation visit in the 2013-2014 academic year, they also determined better ways to assess their students for each of the CAATE-designated outcomes.
Dermot Ryan and Kelly Younger – English
Dermot and Kelly revised the mission statement and learning outcomes and developed a new curricular model for the Undergraduate Program in English. Their project was informed by research on larger trends in English studies and curricular revision, the English Department’s views on transformative education, and a synthesis of models previously proposed by English faculty. They presented their work to the English Department at its first meeting of fall 2012. Should the Department vote to accept the changes, Dermot and Kelly’s efforts will guide the Program’s assessment of student work.
Summer Assessment Grant 2011 Recipients
Jill Bickett – Educational Leadership
In 2010, the Doctoral Program (EdD) in Educational Leadership for Social Justice decided that, in alignment with best practices, doctoral dissertations should be subject to external review by experts in the field. In summer 2011, Jill worked with faculty members Antonia Darder, Franca Dell’Olio, Karen Huchting, Magaly Lavadenz, Brian Leung, Emilio Pack, and Tony Sabatino to draft an assessment rubric for this purpose. Future steps in Jill’s plan include working with her colleagues to survey doctoral faculty about dissertation elements, refining the draft rubric based on the results, pilot testing the rubric internally, identifying external reviewers, and distributing dissertations for review.
Adriana Jaroszewicz – Animation
Adriana’s project focused on revising the Animation Department’s outcomes to better define the knowledge, skills, and values it deems essential, and on developing measures to assess them. In summer 2011, she presented revised outcomes and rubric drafts to fellow faculty and incorporated their suggestions for improvement. In the 2011-2012 academic year, she plans to coordinate development of two surveys to supplement evidence of learning captured by rubrics – a senior exit survey, and a survey to gauge students’ perceptions of group collaboration.
Holli Levitsky – Jewish Studies
Holli set out to gather evidence of student learning in the Jewish Studies Program because, even though she had a sense of which aspects were most effective and which were in need of improvement, she knew that it was crucial to validate her assumptions before proposing changes. Holli worked with colleagues Jeffrey Siker and David Greenfield to create an exit survey and conduct interviews with graduating seniors, as well as to initiate the development of a measure to solicit information from alumni. These activities have empowered the Jewish Studies program to make evidence-informed decisions. Holli’s future plans include developing direct measures of student learning to complement the program’s survey and interview evidence.
Chan Lu – Modern Languages and Literatures
Chan’s goal was to review and refine assessment materials already in use in the Chinese minor program, consolidate them in an assessment package that could be shared among faculty, and develop assessment rubrics for areas in which adequate materials did not exist. She collected existing materials and developed several new rubrics in summer 2011, and will continue to work with her colleagues to review, pilot, and refine assessment materials throughout the coming year. Additionally, she has designed an exit interview and questionnaire, which she plans to pilot in spring 2012 and administer to the first cohort of students graduating with a Chinese minor.
Elizabeth C. Reilly – Educational Leadership
Elizabeth, a professor in the Institute for School Leadership and Administration (ISLA), proposed that a signature assignment in the Business of Education course was needed to close a gap in course and program assessment. She developed the assignment in summer 2011, as well as an assessment rubric to measure achievement of student learning outcomes. Throughout the 2011-2012 academic year, she will work with fellow ISLA faculty to revise and pilot test the assignment and rubric, and will post program-approved versions on LiveText for faculty and student use.
Summer Assessment Grant 2010 Recipients
Marilyn Beker - Screenwriting
The goal of Marilyn’s project was to create a rubric to assess knowledge of feature film screenwriting elements that could be applied to projects produced in the graduate feature film screenwriting course. Through working with Department Chair Jeffrey Davis, talking with professionals in the field, and consulting faculty at other universities, Marilyn was able to break the learning outcome down into seven elements for the rubric. This rubric was created not only to help assess the learning outcome, but also to act as a learning tool for students. Students will be provided with the rubric along with the assignment it will be applied to, and students will get a chance to apply the rubric to their peers’ screenplays in a workshop. Marilyn hopes that this rubric will also act as a template for creating other rubrics in the Screenwriting department.
Stuart Ching - English Department
In support of their ongoing program assessment and for their APRC program review, Stuart and English Department faculty members Aimee Kilroy-Ross and Juan Mah Y Busch set out to collect assessment data. To assess one of their student learning outcomes – “English majors are able to write creatively and effectively” – two rubrics were created. One rubric focused on effective communication, assessing components such as focus and use of sources, while a creative thinking rubric assessed components such as innovative thinking and making connections. Writing samples were collected in the spring from 300-level literature and theory courses, and the rubrics were applied to 61 writing samples. The results from applying the rubrics will be shared with the English Department and used in completing their program self-study.
Mikael Kreuzriegler - Production
The Department of Film Production collected rubric data from their graduate capstone course for the second time in December 2009. At the end of this course students screen their films and all SFTV faculty are invited to apply the rubric to the screened films for the purpose of assessment. The goal of Mikael’s project was to compile the data from the second collection and compare the results to the previous collection. Mikael then created a presentation for the Production faculty to discuss and a report that can be utilized in their upcoming program review. In addition, Mikael developed suggestions for improving the assessment process in the future.
Jeremy McCallum - Chemistry and Biochemistry
The goal of Jeremy’s project was to revise the department’s goals and outcomes and to create an assessment plan that aligns with these revisions. Further, he focused on using new technologies to simplify the department’s data analysis strategies. After revising the department’s goals and outcomes, Jeremy created a document suggesting methods for assessing these goals both directly and indirectly. To improve current assessment methods, Jeremy suggested revisions to the senior survey and developed new test and survey forms that are compatible with Remark Office optical mark recognition software, a program the department uses to score student work. The work completed thus far has established a strong foundation for continued assessment.
Anna Muraco - Sociology
Anna chose to focus on one component of the Sociology Department’s assessment plan – the senior exit interview. Each graduating senior is given the exit interview to measure the quality of their learning experience. The goal of Anna’s project was to conduct qualitative analysis on the past five years’ worth of interview data. By completing this analysis, Anna was able to learn what graduates of the program think about advising, availability of faculty, coursework, and more. One of the ways Anna utilized her findings was to develop suggestions for improving the senior exit interview, such as possibly using an exit survey and including more questions to assess student learning outcomes.
Nora Murphy - Psychology
As part of the department’s ongoing assessment plan, all graduating Psychology majors complete a senior survey. This online survey includes a variety of questions related to student learning outcomes, academic performance, future plans, department services, Psychology courses and faculty, and LMU’s mission as it relates to Psychology majors. Nora’s project was to analyze the data collected from seniors in 2009 and 2010. The results will be helpful to the program as they conduct their self-study for program review. Beyond the results of the survey, Nora was able to identify specific recommendations for easier analysis in the future, such as reordering questions and changing the format of certain questions. This project will enable the department to analyze and use the information from the senior survey more effectively in the future.
Vanessa Newell - Production
In preparation for the Production Department’s Academic Program Review, Vanessa, along with Mikael Kreuzriegler, worked to create a curriculum map for the program. Their goal was to determine where and at what level the department’s student learning outcomes were being addressed in the curriculum. By reviewing syllabi and assignments and talking with faculty members, the team created a new curriculum map. This new curriculum map will be used as a starting point for reviewing the undergraduate curriculum and student learning outcomes.
Judy Pollick - School Psychology
The School Psychology Program is approved by the National Association of School Psychologists (NASP), with its next review process beginning in Fall 2013. In order to prepare, Judy began the process of collecting three years of student data focusing on students’ growth in knowledge, skills, and dispositions. Judy reviewed the NASP assessment requirements and determined what the School Psychology Program is already doing to meet them. Creating a map to show what evidence of the standards is being collected also revealed what additional assessment data are needed. In particular, the program now knows what evidence of learning the faculty will need to develop in order to be ready for the next NASP approval. Judy also attended the American School Counselor Association conference to learn more about assessment and data analysis to help sustain assessment in the School Psychology Program.
David Ramirez - Natural Science
David Ramirez, along with Hawley Almstedt, Todd Shoepe, and Sarah Strand, set out to create the mission statement, vision, goals, and learning outcomes for a proposed new ‘Health Sciences’ department in the Seaver College of Science and Engineering. The team also created a proposed curriculum and curriculum map so that they could be sure that the learning outcomes would be sufficiently addressed. Faculty are working to complete the remaining elements of a new program proposal, including an assessment plan, so that the proposal can be submitted in the Fall 2010 semester.
Aimee Ross-Kilroy - English
The goal of Aimee’s project was to continue the ongoing assessment of the Freshman English Program. Assessment of the beginning of the sequence, English 110, was completed last year, leaving the second half of the program to be assessed this year. Final papers were collected from the three courses in the second half of the sequence and a rubric was applied to a random sample of the papers. Aimee gave a presentation of the data at the Freshman English Program orientation for new and returning faculty. Faculty discussed the findings, and made suggestions for improvement that were used to inform changes to the Freshman English Program courses.
Jeffrey Siker - Theological Studies
The Theological Studies program is in the process of carrying out its 2010-2011 self-study for re-accreditation review with the Association of Theological Schools. The primary goal of Jeff’s project was to begin gathering and analyzing student work portfolios and assessing the work based on the department’s student learning outcomes for the Master of Arts programs in Theology and Pastoral Theology. After compiling the student work portfolios, DropBox, an online file sharing tool, was utilized to make the various materials accessible to other members of the department. Working with other members of the department, Jeff reviewed the current assessment rubrics and curriculum maps and revised them as needed. The department will now work together to assess the assembled portfolios and write the self-study report.
Summer Assessment Grant 2009 Recipients
Holli Levitsky – Jewish Studies
The Jewish Studies Program completed its first year of the academic minor in 2009, so Holli’s goal for this project was to develop an assessment plan for the program and develop rubrics to assess student work. As a first step, Holli created an assessment committee for the program. The committee developed a plan to involve faculty teaching Jewish Studies courses in aligning course and program outcomes, and identifying student work that reflects achievement of program-level outcomes. The committee also examined capstone projects from graduating seniors and developed rubrics to assess program outcomes. These initial steps have helped to establish a manageable course of action for meaningful, ongoing assessment of the Jewish Studies Program.
Edmundo Litton – Specialized Programs in Urban Education
The LMU/LA CAST Program in the Department of Specialized Programs in Urban Education developed a laptop initiative in 2007 aimed at integrating teaching and learning strategies with technology both inside and outside of the classroom and promoting digital equity. Edmundo’s project examined this initiative by assessing a student learning outcome of the LMU/LA CAST program. He conducted phone interviews with alumni and current students and used a rubric to examine examples of work students developed for use with their own K-12 students. Using the findings from both sets of evidence, he was able to identify ways to enhance students’ ability to integrate teaching and technology in the program. Edmundo plans to present his findings to program faculty to discuss how the program can close the assessment loop for this outcome by making changes for improvement.
Vanessa Newell – Production
The goal of Vanessa’s project was to analyze collected evidence of student learning in the Production Department and facilitate a dialogue between faculty on strategies for improvement. Vanessa analyzed data from a rubric applied to over 50 undergraduate and graduate capstone films and data from a pilot curriculum-wide knowledge survey administered in several sections of an introductory course. She plans to present her findings from both sets of data to program faculty in order to build a shared understanding of student learning in the program and develop strategies for improvement. As a final step in her project, she developed a central storage place for program assessment data, helping to make assessment of student learning in the program a much more manageable and sustainable process.
Jennifer Ramos – Political Science
Jennifer examined the alignment between the Political Science curriculum and the program learning outcomes. She worked with John Parrish and Richard Fox to create a curriculum map for the program by collecting and reviewing the learning outcomes on course syllabi, assignments, and other course related work. With these materials, they were able to determine where and at what level program learning outcomes are being addressed in the curriculum. After conducting a detailed analysis of the curriculum map, they were able to provide the program with baseline data from which faculty can now begin a conversation on how to ensure that students have ample opportunity to achieve program outcomes.
Aimee Ross – Freshman English
The Freshman English Program’s ongoing assessment plan uses multiple assessment tools. Aimee’s project focused on one component of this plan – the analysis of portfolios to assess several of the program outcomes. She collected scored rubrics from a sample of student portfolios and analyzed the data. Aimee then worked with K.J. Peters to interpret the findings and develop small, immediate steps to improve student learning. She presented her findings at the orientation for Freshman English Instructors and stressed the areas of the curriculum that should be addressed to improve student learning, helping the Freshman English Program to close an assessment loop.
Seth Thompson – Political Science
The goal of Seth’s project was to create a system for organizing and analyzing assessment data collected as part of the Political Science assessment plan. Each year the program administers a multiple choice exam in two introductory courses and one senior seminar to assess two of the program’s student learning outcomes. This pre-post testing allows the measurement of improvement over the course of the program. Seth developed procedures for preparing test data for analysis, documented procedures for conducting the analysis, and designed an archive system for the data. His project has ensured that collecting and analyzing data for this component of the assessment plan will not be dependent on any one individual.