Note: This Best Practice uses the Good, Better, Best, Aspirational terminology.
As shared print programs become widespread and achieve some degree of longevity, it becomes important to create review mechanisms that ensure programs continue to provide value to their membership and are working toward their agreed-upon vision.
This document suggests criteria for program assessment under the categories Good, Better, Best, and Aspirational. Each level presumes that the criteria of the previous level have been met.
Good
Good Practice would involve
- holding meetings or conferences to elicit feedback from members
- providing opportunities for focus group feedback
- creating program structure and governance documents that include a formal decision-making process, specifying details such as leadership teams, subcommittees, and working groups
- hosting occasional webinar member meetings to inform members of action plans and gather further feedback
- considering the following assessment components:
  - program alignment with vision/mission
  - ongoing communication with members
  - value of participation for program members
Better
Better Practice would involve
- holding annual webinar member meetings to inform members of action plans and gather further feedback
- considering the assessment components noted in Good Practice, plus the following:
  - member retention data (to gauge the program's accessibility and value to members)
  - the program's engagement with national/international programs, including memberships in them
Best
Best Practice would involve
- creating a strategic plan
- distributing and collating member surveys and creating an annual action plan that closes the feedback loop
- considering the assessment components noted in Good and Better Practice, plus the following:
  - participation cost for members (both human and financial)
  - member and collection diversity
Best Practices in Program Assessment: Appendix
General Information
For Best Practices in Program Assessment, we talked to representatives of CI-CCI, HathiTrust, MCLS/MI-SPI, Colorado Alliance, ConnectNY (which previously had a shared print program), WRLC, EAST, MSCC, COPPUL, and WEST. Most did not conduct both formal and informal assessment, but among those that did, most formal assessment occurred during annual meetings or planning retreats. At these events, the previous academic year's goals are assessed and new goals are set for the upcoming year. The types of data assessed included Interlibrary Loan, collection development, and acquisitions data. In addition, most shared print programs' leadership teams meet frequently by conference call, either monthly or quarterly, and some, like EAST, hold regular web-based membership meetings and webinars. At these meetings they review action items and cross-reference them against stated goals to determine whether the program remains on track.
Some shared print program leadership teams have used, or plan to use, member surveys as well as focus group conversations (often during annual meeting break-out sessions). ConnectNY used survey results to create its MOU. Many shared print programs also use subcommittees (both standing and ad hoc) and/or working groups, which report back to leadership teams, usually Executive Committees and/or Operations Committees (EAST also has an ongoing Program Team, while ConnectNY has a Board of Directors). ConnectNY established a Shared Print Trust Management Committee, which assessed the effectiveness of the shared print trust program, tracked developments and emerging trends in the field, and shared new ideas about best practices; it reported back to the Board of Directors. ConnectNY also created a Strategic Plan 2015-2020, established formal Bylaws, and produced a white paper identifying strategies for sharing and marketing special collections and archives via physical loan, digitization, and digital repository. In late 2020, EAST embarked on a program assessment process, partly informed by the WEST assessment work, which remains the most robust and detailed formal program assessment.
Survey Examples
WEST, HathiTrust, and EAST presented their respective program assessment work at the January 2021 Print Archive Network (PAN) Forum. More information about this event is available at https://www.crl.edu/events/pan-ala-midwinter-2021-online; specific details appear below.
WEST
WEST conducted its third program assessment in 2019 to inform strategic planning and future directions for the program. The 2019 assessment included a membership-wide survey, focus groups, and data monitoring covering archival decisions, deselection statistics, and WEST's cost share model. The third survey was designed with the previous surveys in mind to assess change over time in key areas. Based on feedback from the first two surveys, WEST sought collaborative opportunities with other programs across the country to further leverage and scale its efforts. The 2019 assessment focused on four major themes:
- the value of WEST
- member satisfaction and areas for change
- potential new services
- archive building
More information about the WEST Program Assessment, including reports, presentations, and more recent assessments, can be found at https://cdlib.org/west/about-west/documents-presentations/. See also the WEST Program Assessment Plan Template.
HathiTrust
More information about the HathiTrust Shared Print Program assessment can be found in the HathiTrust Shared Print Program Assessment document, and general program information can be found at https://www.hathitrust.org/shared_print_program. See also the HathiTrust Strategic Vision work from 2024.
EAST
The EAST 2020 Program Assessment page includes instruments used and results.
Last Updated May 2024