Hands hover over a red button reading “subscribe” with academic journals in the background

CREDIT: Katina Magazine

In a Tough Environment, How Should We Make—and Communicate—Collections Decisions?

In this interview, Madeline Kelly discusses the Western Washington University Libraries’ annual collections evaluation process and what she’s learned about sharing news of subscription cancellations.

By Maria Collins


Academic library spending continued to decrease between 2014 and 2022, with per-student spending dropping almost 20 percent (Jung and Choi, 2025). Combined with recent events such as the reduction in federal research spending, this trend leaves many of us in libraries wondering how we can continue to provide our faculty and staff access to the resources they need.

For insight into one library’s approach to adding and cutting titles in this environment, I reached out to Madeline Kelly, interim dean of libraries at Western Washington University (WWU). We corresponded via email about the Western Libraries’ responsive and transparent annual collections evaluation process, which includes both qualitative and quantitative metrics and incorporates faculty feedback and librarian expertise, and the lessons she’s learned as she’s communicated these decisions to stakeholders. Our conversation has been edited for length and clarity.

What should we know about your institution?

Western Washington University is a regional comprehensive university in Bellingham, WA, with about 14,700 students (full-time equivalent). Our libraries provide traditional services like collections, interlibrary loan, special collections, and archives, and we also house a writing center, a tutoring center, and archives focused on the Pacific Northwest region. Our collections include a special focus on children’s literature, Mongolian studies, and the Pacific Northwest. WWU is also a member of the Orbis Cascade Alliance, a consortium of 38 academic libraries across Washington, Oregon, and Idaho.

What were the initial goals of your subscription review project? What drove you to pursue a review?

We started developing our subscription review process in 2019 out of a desire to manage our collections more proactively. Subscriptions make up such a large portion of our collections budget, yet they weren’t being evaluated except reactively, during periodic budget crises. We also wanted a process that would allow us to add new subscriptions, even with a flat budget. By introducing an annual review process, we opened the door to new subscriptions, allowing ourselves to shape our subscriptions portfolio more actively over time.

What were the recent campus challenges that you all faced?

WWU has faced many of the same challenges as other regional comprehensives these past few years, including post-pandemic enrollment and revenue pressures, student protests and campus tensions around divestment, and anxiety over federal policy and funding. In Washington state, we’re experiencing a state-wide budget shortfall that is likely to trickle down to higher education in a bad way. And WWU has institution-specific challenges, including our own budget gap and rapid turnover in library leadership that has made it difficult to feel strategic.

Can you provide a brief description of your subscription review process and why it was important to include all types of subscription resources?

Our subscription review process is an annual cycle that includes four major stages. First, we gather data throughout the summer and early fall. We try to go beyond just cost-per-use for our subscriptions so that we get a fuller picture of value. Our metrics have evolved over time as we’ve discovered which pieces are most useful for decision-making, which pieces are prohibitively labor intensive, etc. Each subscription ends up with a numeric score based on all the metrics. The goal is to provide good starting data for discussion.
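For readers unfamiliar with the baseline metric, cost-per-use is simply a subscription’s annual cost divided by its recorded uses (for e-resources, often COUNTER usage counts). The sketch below is a minimal illustration with made-up figures, not WWU’s actual data or tooling:

def cost_per_use(annual_cost: float, annual_uses: int) -> float:
    """Annual subscription cost divided by recorded uses (e.g., COUNTER usage counts)."""
    if annual_uses == 0:
        # No recorded use: flag for closer review rather than dividing by zero.
        return float("inf")
    return annual_cost / annual_uses

# Hypothetical journal package: $4,800 per year with 320 recorded uses
print(f"${cost_per_use(4800, 320):.2f} per use")  # -> $15.00 per use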

The second stage of the process is the development of our draft review list, which includes all the titles under serious consideration to be cancelled, as well as any new subscriptions we are considering adding to the collection. In this stage, our subject teams and a small group of collections personnel review all the data and recommend specific subscriptions to go on the list. Sometimes subscriptions make it onto the list because of their score, but sometimes there are other reasons. For example, a database may be very similar to a much less expensive resource, in which case both subscriptions (the existing database and its potential budget replacement) would make it onto the list. We use our expertise to assemble the most actionable and strategic draft review list we can—one that presents options to maximize access, align subscriptions with Western’s curriculum, and establish more sustainable spending moving forward.

The third stage of the process is the feedback stage. In February, we post the draft subscription review list and invite the university community to give input via survey. At the same time, we meet with departments and governance groups to talk through any questions.

Once the feedback stage is complete—usually around mid-March—we move into decision-making and implementation. During this stage, we review all the feedback and decide which subscriptions to cancel and which subscriptions to add.

When we designed our process, it was important to include all types of subscriptions in the process so that we could manage our portfolio as holistically and strategically as possible. It’s not perfect—we still struggle with how to put individual journals on equal footing with massive publisher or aggregator packages—but the goals of being strategic and holistic still motivate us as we refine the process.

Your evaluation process (detailed on your website) uses a scorecard with categories for both evaluation and format (see Table 1). What can you tell us about how this scorecard was developed?

TABLE 1: Subscription review scorecard categories for evaluation and format

We developed our initial rubric over a series of meetings with subject teams and collections personnel. We talked about values that were important to us in the context of subscriptions—things like cost effectiveness, WWU curriculum, user privacy, and accessibility. Then, we tried to figure out metrics for each of these values. The last step was to work with a subscriptions task force to finalize the balance of factors. The result was a complex scorecard that incorporated quantitative and qualitative factors in a standardized way. In the past five years, we’ve streamlined our process considerably and the rubric is a bit simpler than it was at the outset.
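To make the scorecard idea concrete, here is a minimal sketch of how a weighted rubric might combine qualitative and quantitative factors into a single standardized score. The factor names, weights, and 0–5 scale are illustrative assumptions, not WWU’s actual rubric:

# Illustrative weighted-rubric scoring: each factor is rated on a 0-5 scale
# (qualitative ratings assigned by subject teams, quantitative metrics binned
# into the same scale), then combined using agreed-upon weights.
WEIGHTS = {  # hypothetical weights that sum to 1.0
    "cost_effectiveness": 0.35,
    "curriculum_alignment": 0.30,
    "accessibility": 0.20,
    "user_privacy": 0.15,
}

def rubric_score(ratings: dict[str, float]) -> float:
    """Weighted average of 0-5 ratings; in this sketch, higher scores favor retention."""
    return sum(WEIGHTS[factor] * ratings.get(factor, 0.0) for factor in WEIGHTS)

# Hypothetical database under review
print(round(rubric_score({
    "cost_effectiveness": 2,    # high cost per use
    "curriculum_alignment": 4,  # supports several programs
    "accessibility": 3,
    "user_privacy": 5,
}), 2))  # -> 3.25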

What were your strategies for campus engagement in this process?

Since we instituted the process, we’ve tried to incorporate communication and engagement at every stage of our subscription review. This has included sharing information with governance and leadership groups like Faculty Senate, Senate Library Committee, and Council of Deans. It has also included outreach to department chairs and departmental outreach by our subject teams. We try to engage with the university community as we develop, revise, and implement the process—so, multiple times per year. We work internally to develop a common set of talking points that include local factors (why we’re doing the review, how much we might need to cancel) as well as information about the broader subscriptions and scholarly publishing landscape. These talking points form the basis of conversations we can have across the university. We try to include a variety of passive and active communication mechanisms as well, since we never know exactly which mechanism or message will resonate with people.

Describe your communications with campus and how they are coordinated.

As I mentioned above, we try to communicate with the university community at multiple points throughout the process, based on a shared set of talking points. We use a variety of mechanisms—news announcements, direct emails to faculty, messages and presentations to governance groups, meetings with departments—in the hopes of reaching as many people as we can.

What has proved to be successful during these exchanges? Have there been any surprises or challenges?

We’re most successful when we’re able to speak with smaller groups of stakeholders, like in department meetings. There is a small subset of faculty who staunchly disagree with the subscription review process, but it seems like most people just need a chance to ask questions and learn more about the situation. The more of these small, informative conversations we can have, the better the process goes. Unfortunately, we don’t always have the bandwidth to go to every department meeting—and not every department has the bandwidth to meet with us. So, we do what we can to connect on as personal a level as possible, recognizing our limitations. It can also be challenging to engage with that subset of faculty who disagree with the process. As the stewards of the university’s collections budget, we’re caught between stakeholders who depend on the subscriptions they use and administrators who are grappling with deep institutional budget challenges.

What lessons have library staff learned from developing and implementing this process?

I think the two big lessons we’ve learned are:

1. Streamline the work as much as possible based on how decisions are made. We included a lot of data points in our original rubric, but not all of that data contributed to final decisions the way we thought it would. Over the course of five years, we’ve worked hard to ask ourselves: Why are we including this data? Does it help us make tough choices, or do we find ourselves discounting it in favor of other evidence?

2. Assert the expertise of library personnel, even while seeking input. The subscriptions landscape is complex, and subscription decisions have to factor in more than just the needs of each stakeholder community. Our job is to look at the big picture—across the university, across our consortia, across the publishing landscape—and make decisions based on that specialized knowledge. We need to be clear with stakeholders that their input is critical to the process, but that there’s not a straight line from their input to the final outcomes.

Would you recommend your process to other libraries?

I would recommend aspects of our process to other libraries. I recommend incorporating a variety of metrics into decision-making—whatever metrics feel most important to your stakeholders. I recommend outlining a clear, year-long communications plan to accompany any subscription review, including major talking points and communications mechanisms. I recommend that collections personnel work closely with subject librarians throughout the process, from identifying data points to developing the draft review list to engaging the university for input. I recommend reviewing and revising the process each year.

References

Jung, Y. J., & Choi, J. (2025, June 2). Academic libraries’ spending matters for college student success. Information Matters. https://informationmatters.org/2025/06/academic-libraries-spending-matters-for-college-student-success/
