Supporting Strategic Planning with Project and Data Management
CHRISTA WERLE is Public Services Project Manager, Sno-Isle Libraries, Snohomish and Island Counties, WA, and SAMANTHA LOPEZ is Project Coordinator for the Public Library Association. Contact Christa at cwerle@sno-isle.org. Contact Samantha at slopez@ala.org. Christa is currently reading Heart & Brain: Gut Instincts by Nick Seluk. Samantha is currently reading Lilac Girls by Martha Hall Kelly.
No matter the size of the library or the population it serves, all public libraries are working toward a common goal—providing relevant and impactful services in areas most important to patrons. As we strive to be a data-driven organization at Sno-Isle Libraries in Snohomish and Island Counties, WA, it is our job to make sure we are allocating the right amount of resources to our highest-priority services and that our programs are addressing the needs and interests of our communities. And we need the data to show it.
Data-Driven Change
Sno-Isle Libraries started its outcome data work in 2015 with project management. We began adopting project management methodologies, including project charters, the role of project sponsor, and adherence to planning before execution. We considered how to evaluate quality control and user experience, and we were introduced to the outcomes-based logic model by Moe Hosseini-Ara and Rebecca Jones.1 Continuing to look at systems and processes that would keep our organization moving forward, we saw that our programming services needed better support and evaluation. Of course, all of this needed to happen without any additional resources (sound like a familiar problem?). We needed to learn how our patrons were benefiting from our programs, and we needed to do so quickly, easily, and inexpensively. With the release of the Public Library Association's (PLA) Project Outcome2 in June 2015, it was as if the stars had perfectly aligned.
Sno-Isle Libraries started engaging with Project Outcome in Fall 2015, as the work of the Programming Support and Evaluation Project began, and was invited by PLA to participate in the 2016 pilot-testing of the Project Outcome Follow-Up Surveys. For us, the power of using Project Outcome is in its consistency and standardization. It is our belief that libraries don’t need nine thousand different ways of doing evaluation. Again, we’re all working toward that common goal, so why keep reinventing the wheel? Project Outcome is a free service available to all US and Canadian public libraries. It provides standardized outcome measures in seven core library services areas, measuring four key outcomes: knowledge, confidence, application/behavior change, and awareness. As a result of participating, Sno-Isle Libraries has shifted its programming purpose statements to align with Project Outcome’s seven core service areas and has fully adopted the key outcomes into our planning and surveying.
The Evaluation Plan
Knowing our evaluation plan and having clear, standardized outcome metrics (Project Outcome) helped us define the deliverable of our audit and evaluation: a quarterly State of Programming Report for our organization.
This was our first time attempting a report like this, so in Spring 2016 we took the opportunity to pilot-test the Project Outcome Follow-Up Surveys as a way to pilot-test our own evaluation process. Following the testing, we moved fully into the audit with one-month snapshots in July and October. The auditing included collecting inputs, outputs, and the outcome-based surveys. We were able to gather enough big-picture data to understand the state of our programming services.
What we learned from the audit is that we need to level the playing field where resource availability is scarce or inconsistently applied. We also need to support staff with a core curriculum in our strategic priorities and provide feedback to resource managers to allow progress in priority areas. This revealed the need for the State of Programming Report.
The complication, not surprisingly, is that multiple people are accountable for multiple services and programs, and they're all vying for the same resources to move their service plans forward. The audit showed that Sno-Isle Libraries invests approximately $1.13 million annually in programming services, including staff time and financial resources. We need to take this seriously and know exactly how those programs are contributing to our strategic plan.
We entered into a new strategic plan process for 2017–19 following the initial audit. Our new strategic plan is outcome-focused, including the core service: to present programs addressing community needs and interests. While all of these plans are in motion—strategic planning, writing outcome-based service plans, auditing and evaluating programs, and creating the State of Programming Reports—they all tie back together through the core Project Outcome measures.
Supporting the Strategic Plan
Sno-Isle Libraries practices a one-page strategic plan. Each core service and strategic priority of the plan has its own service plan, which includes:
- How do we identify the need/demand for the service?
- Who is accountable?
- What is the impact?
- Who is the target audience?
- How are we using outcomes to get where we want to be (over the three-year strategic plan period)?
The quarterly State of Programming Report summarizes the audit data drawn from several operational reports (such as the average cost of staff time per programmer, per location, and by programming service area) alongside the Project Outcome metrics.
The operational reports reflect changes made to our programming purpose statements, which were adjusted in the 2016 audit to directly align with the Project Outcome areas of service. The past programming purpose statements were already similarly aligned to Project Outcome, so there was minimal resistance to implementing this change. Now our data can be compared across future strategic plans rather than being tied to specific strategic planning or “seasonal” periods. Measuring programming consistently across long periods of time, using consistent language and metrics, allows us to make data-driven decisions, track change over time, and better inform future strategic planning.
Another key area measured in the operational reports is the program purpose. Shifting our focus to quality control and the user experience means shifting the way we think about the programs themselves, not just how to evaluate them.
We need our programmers to understand what the desired outcomes of the program are, and if the program isn’t aiming to impact our patrons’ lives, is it worth keeping? The operational report asks programmers, “Is the purpose of this program to improve customer skills or knowledge or change a specific behavior?” In the initial audit, a quarter of responses came back neutral or negative. This data should inform decisions about resource shifting or program elimination when the results show that a program is not designed to make a change for the customer.
Of course, we can’t forget about outcomes! Our operational reports provide our snapshot survey data demonstrating our impact in key outcome areas like knowledge, confidence, and behavior change. We conducted both Immediate and Follow-Up Surveys using the Project Outcome measures. Immediate Surveys work best for quick, snapshot assessments and are easily administered after a program is complete, while the Follow-Up Surveys capture more robust outcome data but require more resources and staff time to follow up with patrons.
As most libraries have experienced, convenience sampling and snapshot survey data often lead to overly positive results, and Sno-Isle Libraries is no different. We aim to focus on the question “Where are the opportunities?” in a sea of positive datasets and move the needle where patrons feel most neutral about our programs and are responding “neither agree nor disagree.” We’ve also been able to leverage the outcome surveys as an opportunity to cross-pollinate with other non-programming departments. Surveying allows us to collect additional data, such as marketing channel reach to determine the effectiveness of our efforts, and audience demographics to help us become more intentional about determining the target audiences of our services.
Maintaining Momentum
The State of Programming Report and program auditing will continue to happen quarterly (January, April, July, October) and become business-as-usual for our organization. The outcome surveys will be available for programmers to administer at any time, but will only be required during the audit months. Ideally, we would like both the Immediate and Follow-Up Surveys to be used throughout the year, but until we allocate more resources, we will only follow up on Educational/Lifelong Learning and Digital Learning programs occurring during the audit months and on Economic Development programs throughout the year.
The more auditing we do and the more patron surveys we collect, the clearer the big picture will become and the better informed we’ll be for strategic, organizational decision making, like resource allocation and program expansion. While the pilot year of survey data is statistically significant for big-picture assessment, we need more data to represent the micro-levels of our organization. By the end of our strategic plan (2019), we will have enough meaningful datasets to inform the same decisions by supervisors and managers at the community library level.
But that doesn’t mean we have to wait to start learning from our results and take action. Don’t take the immediate impact of the survey data for granted. Our programmers have reported how much they enjoy and find meaning in their patrons’ feedback and how quickly they can learn and make adjustments to their programs as a result. We’ve been able to make quick changes like speaking more loudly during instruction and more long-term changes like adding more robust and challenging digital literacy classes.
With any change comes some resistance. Some staff members were hesitant at first, but most have seen the benefits of having those immediate results right in front of them. The more they interact with the surveys, the more they want to use them (even during non-audit months). At the management level, we can show that each staff member’s time contributes to something larger than their own work. Our strategic plan is our atlas, the ongoing audit and evaluation provide the roadmap, and the survey feedback is building the roads to take our programming services where our customers need them to be.
Not only are staff members contributing to something larger than their own programs, but Sno-Isle Libraries is then able to contribute to something larger than itself. Participating in Project Outcome means our data is aggregated with hundreds of other libraries across the United States and Canada. Aggregating data from all participating libraries allows us to see how we compare statewide and nationally in the seven library service areas. More importantly, it means libraries of various sizes and capacities, serving diverse populations and community needs, finally have a shared vocabulary and practice through outcomes-based measurement.
References
1. Moe Hosseini-Ara and Rebecca Jones, “Overcoming Our Habits and Learning to Measure Impact,” Computers in Libraries 33, no. 5 (June 2013).
2. Project Outcome, Public Library Association, https://www.projectoutcome.org/.