Show simple item record

Search, Report, Wherever You Are: A Novel Approach to Assessing User Satisfaction with a Library Discovery Interface

dc.contributor.author: Vacek, Rachel
dc.contributor.author: Smith, Craig
dc.contributor.author: Varnum, Kenneth J.
dc.date.accessioned: 2020-12-01T03:19:51Z
dc.date.available: 2020-12-01T03:19:51Z
dc.date.issued: 2020-11-30
dc.identifier.uri: https://hdl.handle.net/2027.42/163532
dc.description: This is a virtual presentation given at the Library Assessment Conference on January 21, 2021. The presentation is in the Measurement & Methods track.
dc.description.abstract:

PURPOSE AND GOALS
In the summer of 2018, the University of Michigan Library launched a new discovery interface, Library Search (https://search.lib.umich.edu), for discovering the library's resources, collections, spaces, and expertise. Following our assessment plan for Library Search, we have iteratively measured the success of Search since its launch across a variety of measures using a mixed-methods approach. We have tested for accessibility, monitored system performance, and conducted considerable usability testing as part of the design and development process. In general, our assessments have indicated that Search works well according to many of our metrics, including accessibility, usability, design, and the general pattern of searching for items, narrowing result sets, and accessing materials. There have been, however, a number of concerns about the catalog search in particular, and a general anecdotal sense that this important part of the interface is not quite meeting users' needs. We therefore created a tool that we could use first to get a baseline measure of overall user satisfaction, and second to use again after we have made changes to Library Search, to understand the degree to which our changes improved satisfaction. Our paper will report on the launch of our rather unique data collection tool and its use with multiple groups of stakeholders on campus. We will also detail how we plan to use the tool to gauge user satisfaction over time and to regularly gather actionable data on the strengths and shortcomings of our catalog interface.

DESIGN, METHODOLOGY, OR APPROACH
An initial round of data collection focused on the search experiences of library employees whose work involves using Library Search to assist members of our campus community. Focusing on this group allowed us to survey people who had clear expectations of how catalog searching should function for library staff and users, and also to ensure that our data collection tool worked well before we used it to collect data from large numbers of faculty members and students on campus. The first full round of data collection, with faculty and students, is taking place in the winter of 2020; the results will be included in our paper and presentation.

The invitation to participate in the initial round of data collection was sent in December 2019 to 96 library employees. As an incentive, participants were given a chance to enter a drawing for one $50 gift card. Participants took part in the study online, via a survey on the Qualtrics platform. Forty people provided enough data to be included in some analyses (a 41% response rate); of those, 36 completed the whole survey. The final sample had good variability with regard to respondents' library division and the number of years they had worked in the library.

A unique aspect of our data collection approach was to ask participants to conduct searches, to report on those searches, and to share the URLs associated with their search results. This allowed data from the survey to be interpreted while seeing exactly what participants were seeing as they did their searching. Thus, the first three sections of the survey asked participants to keep the survey tab open in their browsers while conducting specific types of catalog searches in a separate tab. The types of searches (known item, known set, and exploratory) were derived from another recent investigation of our library search interface. Participants used the Qualtrics survey to answer questions about those three search experiences. Specifically, participants were asked to report on their satisfaction with the relevance of results, the speed with which results were returned, and the adequacy of various pieces of information contained in item records. When participants encountered unexpected results in their searches, they were given an opportunity to share more about what they expected to see, in relation to what they saw.

A final section of the survey asked people to provide more global ratings and comments related to recent uses of Library Search (not limited to catalog searching; this could also include focused searches for articles, databases, etc.). For those who remembered using Search a year prior, a small set of questions also asked them to compare their current satisfaction with Search to what they remembered feeling a year ago. These final questions, about recent experiences and comparisons to a year ago, were answered by most participants.

FINDINGS
We asked questions about three broad areas of search interactions: known item searches, known set searches, and exploratory searches. Known item searches are for specific, individual items. Known set searches are for collections of items (plays by Shakespeare, sonatas by Mozart, jazz CDs, etc.), from which the searcher would be more or less satisfied with any specific item. Exploratory searches are subject- or topic-related.

For known item searching, respondents were asked two questions about what they saw in the results: did the item appear in the results as expected, and was the position of the item in the result set satisfactory. Seventy-five percent saw the item in the results as expected, and the majority (92%) were either very or moderately satisfied with its position. We asked several additional questions for known item searches (we did not ask these questions for the other search categories, as we felt the responses would not be substantially different). When asked about the speed of search results, most (95%) expressed some level of satisfaction. In terms of the ease of determining availability of print or online access to the items found, most were moderately or very satisfied (85%), but a notable minority were dissatisfied. Most people were also moderately or very satisfied (86%) with identifying where physical items were located, again with a notable minority expressing dissatisfaction.

For known item searches, and for the other two search types, participants who saw unexpected search results were asked to share what they expected to see and what they did see. Comments touched on concerns such as the relevance of results and the way that holdings were displayed. These findings serve as guides for the continued fine-tuning of Library Search. In the paper we will present examples of how such comments were paired with recreated searches in order to guide the work of our developers.

For known set searches, just over half (58%) of the 36 participants who did this search saw what they expected; respondents were satisfied with the ranking of the results, split roughly evenly between very and moderately satisfied. For exploratory searches, just over half (56%) saw what they expected in the results. Most (80%) were moderately or very satisfied, and a sizable minority (20%) reported some level of dissatisfaction. As noted, where participants saw unexpected results, they provided comments that shed additional light on their closed-ended survey responses.

In the final section of the survey, participants were asked about their recent experiences with Search (not limited to catalog searching) and their views on whether Search had improved compared to a year ago (for those with memories of Search at that time). Thirty-three participants had used Search within the previous two weeks. Of these, roughly three-quarters were moderately or very satisfied, with the rest expressing dissatisfaction. When asked about their satisfaction with the recent relevance of Search results, few were very satisfied (12%); most were moderately satisfied, and 30% expressed dissatisfaction. The same results were obtained when people were asked to rate their overall level of satisfaction with their recent experiences with Search. When asked to compare their current satisfaction with the speed of Search to what they remembered from a year prior, most (81%) were somewhat or much more satisfied currently. When asked to compare their current satisfaction with the relevance of Search results to what they remembered from a year prior, 72% were somewhat or much more satisfied currently. Finally, when asked to compare their current overall satisfaction with Search to a year ago, 82% were somewhat or much more satisfied.

PRACTICAL IMPLICATIONS OR VALUE
This study provides an example of how libraries can use an online data collection tool to reach key stakeholders when evaluating satisfaction with a web-based library interaction. The general categorization of "known item", "known set", and "exploratory" searches, itself based on user research into kinds of searching, could easily be extended to other kinds of library interactions in which a user is seeking something. The general method of the survey allows disintermediated user research to take place, with the efficiency of gaining detailed user feedback about specific interactions without the investment of a commensurate amount of staff time. Another key advantage of our methodology is that it facilitates the repeated evaluation of the search interface over time. The inclusion of both library staff and campus users enables us to identify high-priority issues via staff insights and also to understand how users with a wide range of search expertise experience a core library discovery interface.
dc.language.iso: en_US
dc.rights: Attribution-NonCommercial-ShareAlike 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.subject: assessment
dc.subject: discovery
dc.subject: survey
dc.subject: libraries
dc.subject: library catalog
dc.title: Search, Report, Wherever You Are: A Novel Approach to Assessing User Satisfaction with a Library Discovery Interface
dc.type: Presentation
dc.subject.hlbsecondlevel: Information and Library Science
dc.subject.hlbtoplevel: Social Sciences
dc.contributor.affiliationum: Library, University of Michigan
dc.contributor.affiliationumcampus: Ann Arbor
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/163532/1/Search, Report, Wherever You Are.pdf
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/163532/2/Catalog Search Satisfaction Survey.pdf
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/163532/3/Results of 2019 Survey on Catalog Search Satisfaction.pdf
dc.identifier.orcid: https://orcid.org/0000-0003-2574-2230
dc.identifier.orcid: https://orcid.org/0000-0002-0091-1037
dc.identifier.orcid: https://orcid.org/0000-0001-6215-5161
dc.description.depositor: SELF
dc.identifier.name-orcid: Vacek, Rachel; 0000-0003-2574-2230
dc.identifier.name-orcid: Varnum, Ken; 0000-0002-0091-1037
dc.identifier.name-orcid: Smith, Craig; 0000-0001-6215-5161
dc.owningcollname: Library (University of Michigan Library)




Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 4.0 International.
