Thursday, April 22, 2010
ASSESS THIS! What, How and Who Cares? Presented by the Delaware Valley Chapter of the ACRL
April 16, 2010 10:00-3:30
Speakers: Peter Hernon and Stephen Spohn
Panel discussion by Nonny Schlotzhauer (Penn State); Melissa Gold and Scott Anderson (Millersville University); Ruth Perkins, Krista Prock and Karen Wanamaker (Kutztown University)
Dr. Hernon (a prolific speaker and the author of 45 books) brought fresh perspectives to the assessment conundrum: examine program-level learning (not course-level learning), i.e., learning specific to each discipline; consider direct evidence (not indirect evidence such as self-reporting); reflect on outcomes (not outputs); and identify the stakeholders (parents, accrediting bodies, taxpayers, school boards, students, etc.). For more information, read Assessing Service Quality: Satisfying the Expectations of Library Customers (2010) by Peter Hernon and Ellen Altman.
Innovative forms of direct evidence include asking faculty to write a quality story, having students build portfolios, and having students complete internships. Assessment is a continuous process of data collection, not a one-time survey. The San Francisco Public Library offers a good example: an online survey that patrons can complete at any time (http://sfpl.org/ "Take Our Survey"). Dr. Hernon also recommended Counting Opinions, a commercial company that provides a customizable instrument for continuous data collection.
The planning documents at Suffolk University's Sawyer Library serve as an excellent model of university assessment; they include an Institutional Accountability plan, Student Learning Outcomes, and a Long Range Plan.
Consider the following question, which should be answered collaboratively with other departments of the university: What impact does the library have across the university?
Stephen Spohn’s afternoon presentation reinforced one aspect of Peter Hernon’s concepts: decide who the stakeholders are and solicit feedback from them. Additional planning includes identifying trends and long-term goals and generating actionable data.
Millersville University is collecting student artifacts such as videos, research papers, and tests; the information literacy (IL) rubric from the Association of American Colleges and Universities will be useful in analyzing them. One measure is to document the percentage of resources in student bibliographies that were obtained through the Library. Penn State convened a Library Assessment Metrics Council, and Kutztown University engaged external reviewers (from a library that Kutztown wanted to emulate).
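The bibliography measure above is straightforward to compute once each citation has been coded by where the student found it. A minimal sketch, using entirely hypothetical data and source-category labels (the post doesn't specify how Millersville codes its citations):

```python
# Hypothetical bibliography data: each entry is coded by where the
# student reported finding the source.
bibliography = [
    {"title": "Article A", "source": "library_database"},
    {"title": "Book B", "source": "library_catalog"},
    {"title": "Website C", "source": "open_web"},
    {"title": "Article D", "source": "library_database"},
]

# Assumed categories counting as "derived from the Library".
library_sources = {"library_database", "library_catalog"}

n_library = sum(1 for entry in bibliography if entry["source"] in library_sources)
pct = 100 * n_library / len(bibliography)
print(f"{pct:.0f}% of cited sources came through the Library")  # prints "75% ..."
```

The interesting design work is not the arithmetic but agreeing on the coding scheme: whether, say, a Google Scholar hit resolved through the library's link resolver counts as a Library source.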
[Thanks to Carol Videon for guest writing this post.]
This workshop was partially funded with Federal Library Services and Technology Act (LSTA) funds administered by the Office of Commonwealth Libraries and would not have been possible without the help of the College and Research Division of PaLA.
Monday, April 19, 2010
Best Practices in Library Instruction
Doug is a Reference Librarian/Professor at Shippensburg University, and Ryan is Instructional Technology/Information Literacy Librarian/Assistant Professor at California University of Pennsylvania. If you weren't able to attend Friday's program, you can sign up for their preconference, Practical Pedagogy for Library Instructors, which will be held from 1:00 to 4:30 p.m. on Friday, June 25, at the ALA Annual Conference.
Following the keynote speakers was a Best Practices Panel. Larissa Gordon shared how Arcadia University Library used mini-grants to foster faculty-librarian collaboration; Margaret Montet and William Hemmig shared how they enhance an embedded eBrarian program at Bucks County Community College; and Kelley Beeson shared how the Allegheny County Library Association used 23 Things-n'at to create a non-threatening environment for library staff to learn about Web 2.0 technologies.
A second panel, on assessment, followed the lunch break. In addition to Hedra Packman, who spoke about how the Free Library of Philadelphia uses a variety of methods to assess the variety of instruction programs it provides, there were two presentations on assessment in academic libraries. Tom Reinsfelder, from the Mont Alto Campus of Penn State, explained how PSU used the SAILS test with incoming students at select PSU campuses. They administered the test before any information literacy instruction had been done in order to establish a baseline for students' knowledge of information literacy, which enabled them to identify the most important skill set for librarians to teach. When asked about testing upper-level students, Tom indicated that it would be difficult to re-test the same students, since the original test was administered through faculty class time. Olga Conneen presented a rubric that the library is using to assess student learning outcomes for a library assignment in the "Achieving the Dream" program at Northampton Community College. Through active learning, she demonstrated how the librarians were able to improve this assignment by evaluating interrater reliability.
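Interrater reliability, as evaluated in the Northampton rubric work, asks how consistently two scorers apply the same rubric. It is commonly quantified with a statistic such as Cohen's kappa, which corrects raw agreement for agreement expected by chance; the post doesn't say which statistic Northampton used, so the following is a generic sketch with made-up rubric scores:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores on the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items the raters scored identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal score distribution.
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[c] * cb[c] for c in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical rubric scores for 8 student papers
# (1 = beginning, 2 = developing, 3 = proficient).
librarian_1 = [3, 2, 3, 1, 2, 3, 2, 2]
librarian_2 = [3, 2, 2, 1, 2, 3, 3, 2]
print(round(cohens_kappa(librarian_1, librarian_2), 2))
```

A low kappa is itself actionable data: it usually signals that a rubric criterion is ambiguous and needs rewording or norming before the scores can say anything about student learning.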
Bonnie Imler, Altoona Campus of Penn State, the day's final presenter, compared the features of four screen capture software options and included some tips for using this type of software for online tutorials.
This workshop was partially funded with Federal Library Services and Technology Act (LSTA) funds administered by the Office of Commonwealth Libraries and would not have been possible without the help of the College and Research Division of PaLA.