
Great Library UX Ideas Under $100


Abstract

In June, the LITA President’s Program Planning Team partnered with Weave to hold a contest for great, affordable UX ideas for libraries. The winning entry received some fabulous prizes, but the committee had trouble choosing just one of the entries to recognize. And so we present the winner and the first two runners-up of the 2015 Great Library UX Ideas Under $100 contest.



Winner: Guerilla Sketch-A-Thon

From the Robert E. Kennedy Library at California Polytechnic State University

As part of our efforts to redesign our website, the Robert E. Kennedy Library has solicited feedback from our users through quantitative analytics and surveys, as well as qualitative usability studies and interviews. But I also wanted to explore new ways for users to share their ideas.

I partnered with our Student Library Advisory Council, a volunteer student “think tank” that advocates for student needs, consults with the library on students’ behalf, and promotes the library’s integration of new perspectives. The group runs an annual survey with more than 800 participants, and that year it included questions about the website. The results revealed which library web pages students consider most important: hours, article search, book locations and maps, computer availability, course reserves, and printing guidelines.

To further define user needs and to better understand the ideal information flow for online visitors, I also designed a fun guerilla method for gathering user feedback: I printed 200 sketch papers and distributed them throughout the library, prompting student visitors and library employees to “Sketch your ideal Kennedy Library website!” Sketches were returned to a colored cardboard box at the checkout desk, which let participants submit anonymously if they wished. As an incentive, I purchased three gift cards for the library café and raffled them off among submitters, bringing the total cost of the project to $15 (the sketch papers were printed in-house on regular paper).

One week later, 20 individual sketches had been returned to the box (a 10 percent return rate), all of them providing detailed insight and guidance for the website redesign. Several participants said they had fun with the exercise.

I reviewed the sketches with a library-wide committee. Based on their content, we decided to relabel and enlarge our search box, add icons to navigation items, use more colors from the university palette, and de-clutter the marketing slider. To make library materials and resources easier to find, we combined the previous page’s two search boxes into one. We made chat easily accessible on the main page, and reorganized and restructured the (massively condensed) website content, taking the proposed labels and information architecture into consideration.

Figures 1 & 2. Sample sketches from the Sketch-a-Thon.

Recent comparative analytics and usability studies show an overall increase in user interaction and mobile use between academic years 2013 and 2015. We registered a 26 percent increase in pages per session, 55 percent longer average sessions, and a 20 percent drop in bounce rate, confirming the positive impact of the improved website on all users.
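For readers who want to run this kind of comparison themselves, the short sketch below shows one way to compute year-over-year percentage changes from analytics exports. The 2013 and 2015 values are hypothetical placeholders chosen only to illustrate the calculation, not the library’s actual data.

```python
# Hypothetical year-over-year comparison of web analytics metrics.
# The 2013 and 2015 values below are illustrative placeholders,
# not the Kennedy Library's actual analytics data.

metrics_2013 = {"pages_per_session": 2.7, "avg_session_secs": 140, "bounce_rate": 0.55}
metrics_2015 = {"pages_per_session": 3.4, "avg_session_secs": 217, "bounce_rate": 0.44}

for name, before in metrics_2013.items():
    after = metrics_2015[name]
    change = (after - before) / before * 100  # percent change, 2013 -> 2015
    print(f"{name}: {change:+.0f}%")

# pages_per_session: +26%
# avg_session_secs: +55%
# bounce_rate: -20%
```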

Many students requested custom features and ways to tailor the library website to their needs, a project the library web team will be working on during the summer. We are planning more Sketch-A-Thons for our future web developments.

First Runner-up: Wayfinding in Main Library

From University of Arizona Libraries:

Editor’s note: Shoshana Mayden is the copyeditor for Weave: Journal of Library UX.

For our visitors, wayfinding in our five-story Main Library has always been a challenge. The issue comes up regularly in interviews with staff and end users, and we often observe users wandering around the library with puzzled looks on their faces.

In spring 2015, Rebecca led an initiative to fix this problem quickly and with minimal expense. Our team did a complete audit of directional signage and discovered that directional posters and signs were mostly out of date or missing entirely. Directions to study rooms were nonexistent, and floor maps were only available online. Aungelique and Beau conducted wayfinding usability testing of our space, which confirmed our fears: finding books, rooms, and services in the building was next to impossible.

Using these findings as guidance, we implemented simple and inexpensive solutions to make things better for visitors. We took down all outdated signage. Aungelique created low-tech paper signs directing visitors to different call number ranges and collections, and posted them in clearly visible locations on each floor. This cost us only ten cents per sign. Nattawan, with feedback from our UX staff, created a simple, understandable digital floor map. We put it up on the third floor by the elevators using a large monitor that wasn’t in use. We gathered feedback using a comment box and by conducting in-person usability testing with students, asking them to locate books by call number and rooms by room number. We made final changes based on our findings, and then printed and posted a large floor map to replace the digital sign outside of the elevators. This cost just $33, which came out of a signage budget managed by our marketing department.

We also knew that other common questions from visitors were, “Where are the computers?” and, “Where do I check out equipment?” We therefore created elevator directories on regular paper for ten cents a sheet. We then did lightning-style usability testing on these with students in the elevators, made adjustments, and finally laminated them for $3 each.

When testing and feedback proved the third-floor map successful, we developed a process to do the same on the remaining four floors. We now have tested, laminated, mounted maps on every floor. This summer, we plan to do the same at our other five-story building, the Science-Engineering Library.

Follow-up wayfinding usability testing shows that visitors are finding it much easier to locate materials, rooms, and services throughout the buildings. With millions of visitors through our doors each year, this has a huge impact on our user experience.

Second Runner-up: Applying a Hierarchical Task Analysis Method to Discovery Tool Evaluation

From Purdue University Libraries:

There is a large body of literature on usability testing of discovery tools in libraries. These tests focus primarily on the search interface, and the testing tasks often do not match users’ real search scenarios. Understanding how well a discovery tool supports users’ search goals and workflows remains a challenge.

In this project, we conducted a hierarchical task analysis (HTA) to evaluate how Ex Libris Primo supports eleven search cases. The search cases involved different formats (article, print book, and e-book) and availability (not available, available in print, available online, and available both in print and online), combinations that can present users with frustrations and obstacles.

HTA is a workflow-centered analysis method that requires no test participants. We (two usability researchers) used a desktop computer and free mind-mapping software (XMind) for our analysis, so the budget was zero, not counting staff time.

We broke the search cases into subtasks and actions, allowing us to visualize the workflow and cognitive decision points. All cases involved four sub-goal processes:

  1. start search

  2. find relevant results

  3. view the desired item

  4. retrieve, locate, or request the item

The first two sub-goals offered nearly identical experiences across all search cases. The third and fourth sub-goals, however, presented different workflow issues depending on the item searched and its availability.
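To make the decomposition concrete, the sketch below shows one way an HTA tree can be represented and its cognitive decision points counted programmatically. The task names and decision flags are hypothetical illustrations, not the actual XMind analysis from this project.

```python
# A minimal sketch of an HTA decomposition as a recursive data structure.
# Task names and "cognitive" flags are illustrative placeholders, not the
# actual XMind analysis from this project.

from dataclasses import dataclass, field

@dataclass
class Task:
    """One node in a hierarchical task analysis tree."""
    name: str
    cognitive: bool = False   # True if this step is a user decision point
    subtasks: list = field(default_factory=list)

    def cognitive_steps(self) -> int:
        """Count decision points in this task and all of its subtasks."""
        return int(self.cognitive) + sum(t.cognitive_steps() for t in self.subtasks)

# Hypothetical decomposition of one search case: an article available online.
case = Task("Find an article available online", subtasks=[
    Task("Start search", subtasks=[
        Task("Enter the citation in the search box", cognitive=True)]),
    Task("Find relevant results", subtasks=[
        Task("Scan the result list for a match", cognitive=True)]),
    Task("View the desired item", subtasks=[
        Task("Verify title, author, and date", cognitive=True)]),
    Task("Retrieve the item", subtasks=[
        Task("Follow a full-text link", cognitive=True)]),
])

print(case.cognitive_steps())  # -> 4, the step count reported for this case
```

Running the same count over a tree for each of the eleven cases makes the cross-case comparison in the next paragraph straightforward.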

Article searches received the least guidance from Primo’s interface, and searching for an article not available in Primo involved more cognitive steps (13) than searching for one available in print (9), online (4), or both in print and online (8).

For book searches, it was challenging to verify the right book when there were many similar results or results in different locations. The steps to place a request for a book in the interlibrary loan system (ILLiad) from Primo were also a challenge. The full task analysis cases and results are available in the Purdue University Research Repository. We have also published a detailed discussion of the HTA methodology (Promann & Zhang, 2015).

Following our analysis, the web team at Purdue Libraries redesigned the Primo interface, eliminating confusing information in search results and unnecessary actions. For example, users no longer need to click the obscure “multiple versions available” link in the brief item description to see different versions of the same book; instead, they can consistently click the book title in the search results to view the single or multiple versions. It is also much easier now to specify an exact publication date on a timeline facet than with the previously inconsistent time ranges. For items that are not available, the redesigned interface displays possible search and request options such as interlibrary loan, UBorrow, WorldCat, and Google Scholar. Our latest user tests showed smoother search workflows and improved user satisfaction with the redesigned interface.

The HTA helped us identify potential workflow issues not typically surfaced by usability tests, as well as new user requirements that discovery tools need to support. Our HTA could also offer a comparable baseline and a low-cost assessment method for different discovery tools at other institutions.

Reference:

Promann, M., & Zhang, T. (2015). Applying hierarchical task analysis method to discovery layer evaluation. Information Technology and Libraries, 34(1), 77–105.