A Publication of the Public Library Association Public Libraries Online

News & Opinion

Work Smarter, Not Harder

by Daniel Hensley on November 25, 2020

How One Library Incorporated Project Outcome into their Strategic Plan to Better Serve Their Community

by Daniel Hensley, Adult Programming Coordinator, Carnegie Library of Pittsburgh, hensleyd@carnegielibrary.org.

By making Project Outcome tools central to a broader shift toward an outward-facing approach to library services, Allen County Public Library has, in a relatively short time, strategically incorporated logic models and outcome measurement into the planning and evaluation of its programs and services.

Judging by Allen County Public Library’s (ACPL) reputation as a national leader in using strategic metrics to improve library services, it would be easy to assume that they were among the earliest public library adopters of outcomes measurement. But, in fact, outcomes have only been a part of the library’s strategy for about three years.

“Not unlike other public libraries, ACPL had not used outcomes-based measurement prior to 2017,” reports Denise Davis, who, as ACPL’s Director of Strategic Initiatives, oversees data collection efforts. “The library relied almost entirely on outputs and anecdotes — customer feedback and quotes — to describe the value of public programming.”

This changed with ACPL’s 2018-2022 Strategic Plan, which established an outcomes-based framework for the development of all library services. According to Davis, this newly central role of outcomes required a great amount of commitment from staff at all levels of ACPL: “This has been a sea change for staff, and a good deal of education has been needed to help staff adjust to this outward-facing approach to planning, service delivery, and evaluation.”

ACPL has been able to quickly integrate outcomes measurement by making a commitment to outcomes at all levels of the organization, applying a strategic approach to planning and evaluation that includes a number of data inputs, and putting the data to work in ways that impact library users right away.

A Marathon, Not a Sprint

Outcomes feature prominently in ACPL’s 2018-2022 Strategic Plan. Tellingly, the document resembles a logic model in its structure – broad goals are connected to demonstrable outcomes, which in turn relate to “investments,” or activities that support the goals of the organization.

The ACPL Director at the time of this interview was Greta Southard. She saw the 2018 Plan, her first as Director of ACPL, as an opportunity to shake up the strategic planning process. “When I came, the strategic plan was not a strategic plan. It was a laundry list of to-do items that we did because you had to submit something to the state library.” Southard’s vision was to use the strategic planning process as a catalyst for organizational change and a way to focus services on community outcomes.

Project Outcome tools play a central part in ACPL’s commitment to using outcomes to improve services, but the surveys and visualization tools are only one part of a larger investment of time, resources, and energy required to make lasting organizational change.

ACPL began using Project Outcome in 2017 and continued training staff in outcomes-based measurement, the Project Outcome portal, and logic models in 2018 and 2019. To put the training into practice, staff had six months to work with program planning and logic models, survey a program, and then use the results to improve the program.

The big picture goal of using logic models, according to Davis, is to encourage professionals to “take the emotion out of program planning.”

“We know that our staff knows how to plan a program — we don’t question that. What we want people to question is their process for identifying topics, and their fallback if a program fails. How will you know if the topic resonates with the community, and how will you adjust it so that it meets community needs over time?”

Davis encourages staff to be strategic about what programs they survey. Rather than surveying every program, she advises staff to be selective when identifying programs to evaluate, using guiding questions such as “Is this a cornerstone program that has gotten stale?” and “Is this a new program that isn’t getting the traction you expected?”

For support, staff can contact Davis directly to discuss evaluation; in addition, three managers who have received more in-depth training in logic models and outcomes-based measurement serve as point people for staff. Time is also set aside on regular meeting agendas to guide staff through logic model planning and evaluation processes.

Davis also speaks “almost daily” with staff who want to add or change questions on Project Outcome surveys. “I ask back what they plan to do with that information.” Planning, evaluation, and development are all parts of the same process, and every decision in that process ultimately ties back to the intended impact on the community.

Avoiding A Colossal Waste of Time

Davis speaks plainly when it comes to the “measure everything” approach to surveying program participants. “[That] is a colossal waste of time.” Instead, Project Outcome surveys are just one of a number of methods that ACPL uses to gather data about programs and services, each of which adds to a larger picture of progress on the Strategic Plan.

The experience of using Project Outcome surveys for programs across the library system has taught ACPL some lessons in getting a good response rate, though Davis admits that survey response is always a challenge. It helps to be selective about what programs to survey so that regular patrons do not get “survey fatigue.” Paper surveys continue to be the most effective for in-person programming. But perhaps the best way to encourage feedback is to show patrons that you are listening. ACPL’s Genealogy department regularly sees response rates of 80% or higher, and Director of Special Collections Curt Witcher credits that to a strong connection between customers and staff.

“Carefully crafted questions focusing on how the customer is benefiting from our programs invite responses that are more meaningful and better guide us in our offerings. Team members hearing directly from those experiencing our programs about their needs is powerful in both motivating and guiding our programming work.”

While Project Outcome surveys give valuable insight into the effectiveness of targeted programs, these surveys alone do not show the whole picture. To get more real-time feedback, staff are also encouraged to regularly use informal methods, such as posting a flip chart in the lobby with one question to get feedback about a program, service, or space change. Outreach events are also seen as opportunities to get feedback about what community members would like the library to do. Both of these methods provide real-time feedback, and give people who may not otherwise be vocal a chance to have their voices heard.

To complete the data picture at ACPL, staff have access to a highly developed warehouse of output data. ACPL also uses feedback cards to get customer satisfaction information, and questionnaires administered in print and through Survey Monkey to periodically get topical feedback. The responses to these targeted surveys can be impressive – a recent online-only survey, which was only open for a week, yielded nearly 7,000 responses. All of this is supplemented by reports from Gale Analytics (formerly Analytics on Demand), which help staff understand customer behavior and trends.

ACPL’s experience offers a good example of how to make the most of Project Outcome. Project Outcome surveys are most effective as part of a more holistic program of measurement. Outputs show attendance trends and inform staffing decisions; regular customer surveys provide a baseline of community attitudes and expectations; informal customer feedback gives frontline staff real-time data at a local level. In this context, ACPL’s targeted Project Outcome surveys are used to assess the quality of programs and services by measuring them against outcomes that are clearly defined within a logic model.

As a result of these coordinated efforts, staff at all levels have access to data to help guide decision making, improve services, and track progress on the strategic plan.

Working Smarter, Not Harder

Adopting Project Outcome and the other elements of ACPL’s evaluation strategy has been a slow process that required a major investment in training and support, but that investment has already shown returns in service improvements and community relations.

The Summer Learning Program is perhaps ACPL’s largest annual programming initiative, and so has been a major focus in efforts to plan, evaluate, and adjust programming using outcomes and other metrics. The program is supported by a significant local foundation in Fort Wayne. As a result of ACPL’s move to a more intentional planning process, the relationship with the foundation has strengthened; the foundation has offered longer grant cycles and has invited ACPL to apply for additional funding. Southard believes that this improved relationship played a big role in securing millions of dollars in funding.

Countless smaller changes have been made by staff across the system as they learned to use logic models and outcomes in their planning practice. Project Outcome results have become part of Board communications, too: visualizations, details, and quotes from Project Outcome are included in quarterly Board updates, which are also made available to all staff.

Southard has found that using outcomes has helped board members to better understand the library’s impact. Many board members come from the world of business and are accustomed to seeing profits or other quantitative statements; outcomes provide a relatable way to track progress. This approach can also help situate the library’s work in the context of greater community goals.

“One of the goals for the region is growth of population – we can link increased skill and knowledge to be good for the community in general for future economic development. We are helping build that pipeline for the workforce of the future. Helping make those linkages about the work that we’re doing, from preschoolers up, we can show how we’re feeding into the longer-term aspirational elements.”

To Davis, however, the biggest impact is more subtle.

“Overall, it would not be an overstatement that staff are using the feedback to understand how best to focus their time on program development and delivery to ensure that we are making the most of the limited capacity we have for program delivery – working smarter not harder. As with any evaluation tool, adoption and integration happens slowly. We continue to learn where it makes the most sense to apply the surveys and where not, and the why of evaluation. This is not meant to be a ‘do you like us’ survey. It is meant to guide us in program development and delivery. Sometimes we learn simple but critical things – is the program at the best time, is the room the best location for the program, was the presenter effective. And, we do this in a neutral way through the outcomes surveys.”

As proud as Southard is of the work that has been done, she sees it as only the beginning of an ongoing process. “We are a learning organization and we have to continue learning what the community wants and continue applying that learning.” By using Project Outcome tools in the context of a larger strategy that is based on community impact, Southard is confident that her team will be energized as they see the impact of work that has been developing over the past few years.

“I think people are finally starting to understand that you have to have building blocks, and it takes time to put those building blocks in place. Things may not happen in the exact sequence that you would want, and you can’t necessarily predict when all of the variables will happen. But, with a plan, once the variables come into place, you can see the greater impact of your hard work.”

Advice to a Newcomer to Strategic Measurement

According to a recent survey of Project Outcome users, many libraries indicated that they used, or hoped to use, Project Outcome as a one-stop data collection tool for library services. While you may be able to add custom questions and make this work in theory, this approach is not likely to yield much valuable information, and it is sure to cause survey fatigue among staff and patrons alike.

Project Outcome is best used as part of an overall measurement strategy – an important part, but not the only part. Denise Davis, ACPL’s Director of Strategic Initiatives, offers some advice for libraries looking to get more out of Project Outcome by using it within the context of a larger planning and evaluation strategy.

Don’t survey everything. Spend time thinking about what you need to learn to improve a service, and pick one program to start.

Use a logic model to think through the process. They really do work.

Have a specific learning goal in mind for the surveys. Don’t burden staff and attendees with unfocused surveys.

Give it time. Don’t be discouraged if you have a slow start. This is new to your customers as well as staff, so they need to understand why you are asking for feedback and how you will use the information.

Consider an incentive. Don’t go overboard, but you may want to give out some chocolate or another inexpensive “thank you” for helping the library improve programs.

Look at the results…especially the open-ended responses. This is your baseline to guide program development.

Make a plan about what you will do. What can you act on now? What more do you need to know?

Follow up if you need to. If you need more information, find an easy way to get it, such as a flip chart or a sign inviting attendees to speak with staff to give more input.

Share the results with the community. Let them know what you changed, and thank them for helping the library to provide better service.

