ASSESS THIS! What, How and Who Cares? Presented by the Delaware Valley Chapter of the ACRL
April 16, 2010, 10:00–3:30
Speakers: Peter Hernon and Stephen Spohn
Panel discussion by Nonny Schlotzhauer (Penn State); Melissa Gold and Scott Anderson (Millersville University); Ruth Perkins, Krista Prock and Karen Wanamaker (Kutztown University)
Dr. Hernon, a prolific speaker and author of 45 books, brought fresh perspectives to the assessment conundrum: examine program-level learning (not course-level learning), i.e., learning specific to each discipline; consider direct evidence rather than indirect evidence such as self-reporting; reflect on outcomes, not outputs; and identify the stakeholders (parents, accrediting bodies, taxpayers, school boards, students, etc.). For more information, read Assessing Service Quality: Satisfying the Expectations of Library Customers (2010) by Peter Hernon and Ellen Altman.
Innovative ideas for gathering direct evidence include asking faculty to write a quality story, having students generate portfolios, and having students participate in internships. Assessment is a continuous process of data collection, not a one-time survey. The San Francisco Public Library offers a good example: an online survey that patrons can complete at any time (http://sfpl.org/ "Take Our Survey"). Counting Opinions, a commercial company recommended by Dr. Hernon, provides a customizable instrument for continuous data collection.
The planning documents at Sawyer Library, Suffolk University, serve as an excellent model of university assessment, comprising an Institutional Accountability Plan, Student Learning Outcomes, and a Long-Range Plan.
Consider the following question, which should be answered collaboratively with other departments of the university: What impact does the library have across the university?
Stephen Spohn’s afternoon presentation reinforced one aspect of Peter Hernon’s concepts: decide who the stakeholders are and solicit feedback from them. Additional planning includes identifying trends and long-term goals and generating actionable data.
Millersville University is collecting student artifacts such as videos, research papers, and tests. The information literacy (IL) rubric from the Association of American Colleges and Universities will be useful in analyzing them. One measure is to document the percentage of resources in student bibliographies that came from the library. Penn State convened a Library Assessment Metrics Council, and Kutztown University engaged external reviewers from a library that Kutztown wanted to emulate.
[Thanks to Carol Videon for guest writing this post.]
This workshop was partially funded with Federal Library Services and Technology Act (LSTA) funds administered by the Office of Commonwealth Libraries and would not have been possible without the help of the College and Research Division of PaLA.