A Publication of the Public Library Association Public Libraries Online

Getting the Most ROI from Customer Surveys

by Galen Schurlein & Tracy Strobel on April 26, 2013

Marketing is defined as “the process or technique of promoting, selling, and distributing a product or service.”1 Clearly it is a critical part of getting the word out to customers about the various services, products, and happenings at your local library. But there is a step before marketing that can significantly increase your chance for success: making sure you use the right messages with the right audiences. This means that the messages must be properly researched and tested in order to discover which message moves someone from indifference or opposition to approval and support. The best way to test a message is to conduct a statistically accurate poll. While some balk at this expense, putting resources into this initial step will pay dividends, particularly in this climate where competition for an individual’s attention has never been tougher.

Statistically accurate polls are an excellent way to shape your message and course of action. While they can be costly, if they define a course of action that results in success, they are a wise investment. Statistically accurate polls take a snapshot in time of perceptions in a defined area. A professional pollster can tell you what sample size will be statistically accurate given your situation and current positioning.
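The sample sizes pollsters quote come from a standard margin-of-error formula for a proportion. As a rough illustration only (a professional design also accounts for clustering, weighting, and response rates), the arithmetic can be sketched in Python:

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, proportion=0.5):
    """Estimate respondents needed for a simple random sample.

    Uses the standard formula n = z^2 * p * (1 - p) / e^2, with the
    most conservative proportion p = 0.5 when nothing better is known.
    The default z of 1.96 corresponds to 95 percent confidence.
    """
    n = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n)

# About 385 completed responses yields a 5-point margin of error
# at 95 percent confidence for a large population.
print(sample_size(0.05))  # → 385
```

For a small service area, a pollster would also apply a finite-population correction, which reduces this number.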

Objectivity is key to getting honest answers. It is important that questions are not worded in a way that leads to your desired answer. Make sure that the questions are clear and in layperson’s terms. Avoid the urge to ask too many questions, to ask leading questions, or to ask about things most respondents already agree on. Be judicious with your questions because participants’ time is limited. Before you include a question, ask yourself what you will do with the information. It might be interesting to know what percentage of respondents like to read mysteries, but if genre preference is unrelated to your messaging then you have wasted time and will have irrelevant data.

Using Surveys to Determine Effective Messages

Cuyahoga County (Ohio) Public Library (CCPL) has tested its messages by polling for the last thirty years. Ignorance is not bliss. In this day and age, everyone is barraged with messages. E-mail, television, radio, friends, family, phones, texting, instant messaging, Facebook: all of them deliver constant information. You now have less time to get your audience’s attention or to get your message across. So when you get their attention, you had better get the right message (an effective message) to the right audience. So, how do you find out what the most effective messages are?

Start with baseline questions that you ask on every survey. These will become testing benchmarks through time. For example, customer satisfaction ratings can be tested and tracked and then used as a marketing point. Next determine needed demographic information. You may already have this from your sample data or you may need to ask. Here you are likely looking for age, gender, income, education level, and so on. You will already have place of residence and voting history from your sample data. You may also wish to ask things like “how often do you use the library?” and “which library branch do you use?” Why is this important? It helps you drill down and determine which individuals are moved by a message and which are not by combining multiple data points and analyzing crosstabs. This crosstab analysis allows you to draw a conclusion such as, “Females between the ages of twenty-six and thirty-nine with college degrees who are frequent users of the Smithville Branch will support a bond issue for a new building if it includes an enhanced preschool play area.” What course of action does this lead you to take? When marketing to demographically similar groups, you know that your message should emphasize the preschool play area.
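The crosstab itself is simple arithmetic: group respondents by combinations of their demographic answers, then count outcomes within each group. A minimal sketch in Python, using invented respondent records (the field names and values here are illustrative, not CCPL's actual data):

```python
from collections import Counter

# Hypothetical respondent records; in practice these would come from
# your pollster's raw data file.
respondents = [
    {"gender": "F", "age": "26-39", "branch": "Smithville", "support": "for"},
    {"gender": "F", "age": "26-39", "branch": "Smithville", "support": "for"},
    {"gender": "M", "age": "26-39", "branch": "Smithville", "support": "against"},
    {"gender": "F", "age": "55+",   "branch": "Smithville", "support": "against"},
    {"gender": "M", "age": "55+",   "branch": "Downtown",   "support": "for"},
    {"gender": "F", "age": "26-39", "branch": "Downtown",   "support": "against"},
]

def crosstab(rows, group_keys, outcome_key):
    """Count outcomes within each combination of the grouping variables."""
    table = {}
    for r in rows:
        group = tuple(r[k] for k in group_keys)
        table.setdefault(group, Counter())[r[outcome_key]] += 1
    return table

table = crosstab(respondents, ["gender", "branch"], "support")
for group, counts in sorted(table.items()):
    total = sum(counts.values())
    pct_for = 100 * counts["for"] / total
    print(group, f"{pct_for:.0f}% for")
```

Adding more grouping keys (age band, education, usage frequency) narrows each cell, which is exactly how a conclusion like the Smithville example above is isolated, though smaller cells also mean larger margins of error.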

After the initial baseline questions and demographic information, move on to set a scenario and assess initial support. Then test possible messages — both positive and negative — to determine which of the messages increase, maintain, or decrease level of support.

Here are sample questions that set up a scenario and then test messages:

  1. Your library may have a tax issue on the ballot in November. Assume it would be a replacement issue with a small increase. If the election were held today, would you vote for or against this replacement and increase?
  2. I would like to read you a list of things the tax increase for the library would be used for. Please tell me if you think each is a very good reason, just a good reason, or not a good reason to support the tax increase:
    a. To buy more books, magazines, music CDs, and movies for library customers.
    b. To offer more special programs for children like storytimes and homework help.
    c. To open the library branches for longer hours and on Sundays.
    d. To renovate and expand library buildings so that they better meet customer needs.
  3. I would like to read you some reasons people may have for opposing the levy. Please tell me if you strongly agree, agree, disagree, or strongly disagree with each statement:
    a. The library has enough money. It doesn’t need any more.
    b. I would like to support the levy but I just can’t afford more taxes.
    c. I vote against all new taxes.
    d. With everything available on the Internet, I don’t think libraries are important anymore.
  4. What if you knew that this levy was the only source of local library funding? Does knowing this make you more or less likely to vote for the library levy?

CCPL used this strategy to help shape its 2008 levy campaign messaging. Through polling we learned that voters were not at all supportive of a levy that promised to improve our aging and inadequate library facilities. This was very unexpected. Voters were, however, eager to support a levy that maintained and improved services, hours, and collections. Had we gone with our assumption that everyone would be excited about the prospect of new buildings and vote to raise tax dollars for them, we would have wasted hundreds of thousands of dollars on ineffective messaging. Using questions similar to those listed here, we knew exactly what to put on our campaign literature, what to focus on during speaking engagements, and who to target in order to move voters from opposition or indifference to support.

Shaping Successful Marketing Campaigns

Surveys can be used to justify and then shape marketing campaigns. From 2004 to 2007, three national studies served as a wakeup call to those of us in the reading business. The National Endowment for the Arts (NEA) released results of a 2004 survey called “Reading at Risk.”2 This study documented a dramatic decline in literary reading. It reported that fewer than half of American adults read literature, with an overall decline of 10 percentage points in literary readers from 1982 to 2002. In 2005, OCLC published “Perceptions of Libraries and Information Resources.”3 Across all the regions OCLC surveyed, respondents associated libraries (the library brand) first and foremost with books. In 2007, NEA followed up its 2004 survey with the study “To Read or Not to Read,”4 which showed startling declines in how much and how well Americans read.

These results were eye-opening to CCPL staff. We reacted by launching the Reconnect with Reading campaign. This was an all-out effort to ignite the childlike passion for pleasure reading in adults who may have let busy lives and competing entertainment sources lead them astray. Among many efforts, we conducted a countywide survey that was a hybrid of the three national surveys mentioned previously. Our goals:

  1.  Obtain a benchmark measure of reading levels among adults in our service area.
  2. Examine reasons for not reading and learn what else people do in their free time.
  3. Determine possible intervention strategies the library might engage in to increase reading.

What we asked:

  1. With the exception of books required for work or school, have you read a book during the last twelve months?
  2. Respondents were asked whether or not they regularly participate in nine different leisure-time activities including reading.
  3. Where do you usually get the books you read?
  4. What is the most frequent reason for limiting your reading? Respondents were asked to indicate reasons from a list.
  5. What might the library do to encourage people to read more? Respondents were asked to indicate helpfulness level from a list. They were also given an open-ended prompt for ideas that were not included on the list.

What we learned:

  1. Reading levels are much higher in our service district than they are nationwide (78 percent locally versus 57 percent nationally).
  2. Younger adults (18–24) are just as likely to read for pleasure — just not as much as those 55 and older.
  3. Only two other activities were rated higher than reading: watching TV and watching movies.
  4. Nearly two-thirds of our residents read a newspaper or magazine every day.
  5. The top three reasons people don’t read are:
    a. Lack of time (68 percent)
    b. They get news and information from other sources (63 percent)
    c. Would rather do other things (56 percent)
  6. Readers get their books from a number of places but the library leads the list. Forty-two percent said they usually get their books at the library.
  7. Respondents indicated that encouraging parents to bring their children to the library for storytime is the most helpful way to get people to read more.
  8. Analyzing crosstabs, we discovered that females (83 percent) were more likely than males (73 percent) to have read a book in the last 12 months.
  9. Only 55 percent of those who almost never go to the library said they read a book in the last 12 months, versus 87 percent of those who visit the library every week. This discredits the claim that people use libraries only to borrow DVDs or use computers.
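Crosstab gaps like the 83 versus 73 percent split in finding 8 raise the question of whether a difference is real or just sampling noise. A minimal two-proportion z-test sketches the check a pollster would run; the group sizes of 400 below are invented for illustration, since the article does not report the actual counts:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 83 percent of 400 women vs. 73 percent of 400 men read a book
# in the last year (group sizes are assumed, not from the survey).
z = two_proportion_z(0.83, 400, 0.73, 400)
print(f"z = {z:.2f}")  # a |z| above 1.96 is significant at the 95 percent level
```

With samples that large, a ten-point gap is comfortably beyond chance; with much smaller crosstab cells, the same gap might not be.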

There are many interesting conclusions to be drawn from these survey results. We were pleased to find out that people in our service area read more than the national average and value pleasure reading more than the average American. We capitalized on these results in our marketing campaign and were able to generate pride in reading in a region that can struggle to find reasons to be proud. We also used this information to support a shift in priorities of our staff from a concentration on reference and information services to a new focus on readers’ advisory. A community of readers needs librarians who read and know books. This survey fueled the Reconnect with Reading campaign and proved that we had a receptive audience. The result? Our percentage of circulating print materials has increased. A recent snapshot indicates that 58 percent of the materials currently checked out are print, up about six percentage points from prior to the campaign. Our next step is to replicate this survey in 2011, after three years of aggressively promoting pleasure reading, in an effort to isolate the most successful efforts of the campaign and make decisions about resource allocation in the future.

Affordable Alternatives

For the best results, it is important to use a reputable polling firm that will use clean data, proper sampling, geographic grouping, and objective questions. But if you don’t have thousands of dollars to spend on a professional pollster, there are affordable ways to gain good data. These tools are not as dependable as a statistically accurate poll, but they are valuable as long as the data collected is interpreted with an awareness of its limitations.

There are a number of online survey resources to choose from. If you have a solid e-mail list, a robust social media presence, and strong community partners, you can build enough participation to provide valuable information. In addition, online surveys can include open-ended questions that may be too cumbersome for statistically accurate polls. It is important to ask demographic questions so you can understand the self-selected respondent group. For the best results, be sure to proactively solicit participation from non-library users. Don’t just post a link to an online survey and then assume you are getting a broad cross section of responses from users and non-users.

Along those same lines, in-branch customer surveys are an effective way to gather data as long as it is acceptable to hear only from library users. If you are looking for customer satisfaction data, or input on service improvements or modifications, this may well suffice.

Whether you are testing messages for a levy campaign or conducting market research to determine the direction of a new promotional effort, surveys can play a significant role in your success. Making assumptions about the way people feel or relying on national data to forecast local behavior can take you down a costly and wasteful path that may derail your intended outcome. If you take the time and invest the dollars you will reap the many benefits that a good survey provides.


  1. Merriam-Webster Online, s.v.“marketing,” accessed Jan. 21, 2011, www.merriam-webster.com/dictionary/marketing.
  2. Tom Bradshaw and Bonnie Nichols, “Reading at Risk: A Survey of Literary Reading in America,” National Endowment for the Arts, 2004, accessed Dec. 12, 2010, www.nea.gov/pub/ReadingAtRisk.pdf.
  3. Cathy DeRosa et al., “Perceptions of Libraries and Information Resources,” OCLC Online Computer Library Center, 2005, accessed Dec. 12, 2010, www.oclc.org/reports/pdfs/Percept_all.pdf.
  4. Sunil Iyengar et al., “To Read or Not to Read: A Question of National Consequence,” National Endowment for the Arts, 2007, accessed Dec. 12, 2010, www.nea.gov/research/ToRead.pdf.