
Library User Experience Research Blog

When less is more in a discovery service UI

by Heather Westerlund on 2019-10-07

When things are allowed to evolve organically, sometimes you fail to take a step back and look at the whole picture. It wasn't until I started taking stock of all the help content we had embedded in our EBSCO Discovery Service (aka EDS, aka Thoreau) that it hit me just how much content was there. And I had put it all there (over the course of several years). I suspected that it was too much and that not all of it was being used. Since this is our most-used research tool and where a lot of students begin their research, it seemed like a good time to measure what students were actually using and to identify helpful use patterns.

So how do you measure this anyway? Google Analytics campaign tracking is an excellent and simple way to track and measure use. Adding a few campaign parameters to the end of each help URL lets you see precisely how these pages are used, including users, sessions, pageviews, average time on page, and more. You can even tie these to specific goals configured in Google Analytics, such as the number of emails submitted or chat sessions initiated through Ask a Librarian. For example, by viewing a goal next to a specific campaign, we can see how many students submitted an email to Ask a Librarian after clicking the "Ask a Librarian" menu link within EDS. This helps us measure the impact and value of these help links.
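To make that concrete, here's a minimal sketch of what campaign tagging can look like. The utm_source, utm_medium, and utm_campaign names are Google Analytics' standard campaign parameters; the helper function, URL, and values below are hypothetical placeholders, not our actual naming scheme.

    // Append Google Analytics campaign (UTM) parameters to a help link so that
    // clicks from EDS can be attributed to a specific placement and purpose.
    // All names and values here are illustrative.
    function tagHelpUrl(baseUrl: string, campaign: string, placement: string): string {
      const url = new URL(baseUrl);
      url.searchParams.set("utm_source", "eds");       // traffic originates in the discovery service
      url.searchParams.set("utm_medium", placement);   // e.g. "top-menu" or "sidebar"
      url.searchParams.set("utm_campaign", campaign);  // e.g. "ask-a-librarian"
      return url.toString();
    }

    // Example: the "Ask a Librarian" link placed in the EDS top menu
    const askLink = tagHelpUrl("https://library.example.edu/ask", "ask-a-librarian", "top-menu");
    console.log(askLink);
    // https://library.example.edu/ask?utm_source=eds&utm_medium=top-menu&utm_campaign=ask-a-librarian

In Google Analytics, traffic tagged this way is reported by campaign, source, and medium, which is what makes the per-placement comparisons below possible.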

What did we find? First, we've only been measuring for 3 full months, so these are just initial findings from two summer months and the busy fall-term-start month of September. I'll re-run the numbers at the end of November once we have a full term of data, but I don't expect the results to change much given that the dataset already covers 1.2 million sessions. One thing to note is that we're defining help broadly as anything that assists students in their research process. This includes links to FAQs, robust library skills guides, our Ask a Librarian service, and even "next step" research tools like our Databases A-Z list. This help is embedded in three locations: the top menu, the search results, and the right sidebar (screenshot for visual context).

A few call-outs:

  • A help link is clicked in 1.6% of EDS sessions. That may sound insignificant, but it translates to 19.6K sessions in which a student sought help. It also aligns with the percentage of students who reported on a recent post-course evaluation survey that they used library help during their course. In other words, students seek out help in EDS at about the same rate they report using library help overall.
  • Help is viewed an average of 3.4 times per user for an average of 4 minutes. Students are revisiting the same help content (fascinating!) and may actually be reading it.
  • A help link located in the top menu is clicked by 7 times as many users as a help link in the right sidebar. Seven times! Here we were, focusing energy on developing the sidebar help (which offers a lot more customizability), when we should have been focusing on perfecting the top menu options. Mind you, a majority of the menu clicks go to the Databases A-Z link, but generally, most of the menu links perform better than any of the sidebar links. I'll concede that this is not terribly surprising; the top of the page is more visible and an expected place to look for assistance.

    Bar graph: menu navigation help receives 22.3K pageviews, compared to about 1K pageviews for sidebar help.
  • Students who click "Ask a Librarian" in the EDS menu rarely contact us. Only 7.5% submit an email and 5% initiate a chat. Why? Are they hoping to chat, but chat is offline? Are they expecting to find something completely different on this page? They certainly aren't all leaving voicemail messages or making appointments. Maybe they're using the Quick Answers FAQ search box. We just don't know.
  • Within the sidebar, the FAQs are 2.5 times more likely to be clicked than the other sidebar content. This is likely due to visibility, as they're positioned at the top. The first FAQ, about using EDS/Thoreau (the one we decided is most relevant), is not the most viewed. Students are more interested in learning how to find scholarly, peer-reviewed articles and how to access the full text. Contrary to what you'd expect, students also view the brief FAQs for twice as long as the robust help guides, suggesting they're reading these and keeping them open in the browser for reference.

    Bar graph: the peer review and full text FAQs receive more views than the Thoreau FAQ.

Overall, only the help located at the top of the page is really being used. And there seems to be a preference for FAQs.

Based on these findings, the broad recommendation is to reduce help within EDS to the locations students actually see (the top menu) and to the content that's actually being used. This might mean removing the sidebar completely and reworking the top menu items to better align with the findings. I think a good approach is to funnel instruction: rely on FAQs as the first point of instruction, and make sure students can easily reach the relevant guides from within each FAQ for more detailed instruction if they're interested in really building their library skills. This approach aligns better with what we already know about students: most don't have a lot of time and most don't want to read a lot of extra material. After re-analyzing the data in a couple of months, we'll see whether these recommendations need to change.

While these findings are really helpful for determining what's being clicked and for how long, they don't give us a complete picture of whether students actually found the content helpful after they opened it. We've just started trying to measure this within EDS. When a student clicks a help guide in EDS (and only from EDS), they see a message in the guide below the page navigation: "How useful did you find the information on this page?" Every time a student sends us feedback by clicking one of the options (very useful, somewhat useful, not useful), a click event is fired by Google Tag Manager. We can use the event label generated by the triggered event to measure how useful students find this content in the context of EDS. I'm not confident the message is visible enough in the guide to gather much data, but we'll see how it goes and adjust if needed. This is our next step, along with giving some thought to entirely different approaches to embedding help within our discovery system.

Screenshot of a guide showing how the usefulness question is displayed
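For the curious, the snippet below is a rough sketch of the kind of wiring this involves, assuming a hand-written dataLayer push. The event name, category, and selector are hypothetical stand-ins, and a real setup could just as well rely on Tag Manager's built-in click trigger instead of custom code.

    // Hypothetical sketch: report a student's answer to "How useful did you find
    // the information on this page?" as a Google Tag Manager dataLayer event.
    // Names and selectors are illustrative, not our production configuration.
    type DataLayerEvent = Record<string, unknown>;

    const w = window as unknown as { dataLayer?: DataLayerEvent[] };
    // GTM normally creates window.dataLayer when its container snippet loads.
    const dataLayer: DataLayerEvent[] = (w.dataLayer = w.dataLayer ?? []);

    function recordGuideFeedback(rating: string): void {
      dataLayer.push({
        event: "guide_feedback",          // a GTM custom-event trigger would listen for this
        eventCategory: "EDS help guide",
        eventLabel: rating,               // "very useful" | "somewhat useful" | "not useful"
      });
    }

    // Attach the handler to each feedback option (selector is hypothetical).
    document.querySelectorAll<HTMLElement>(".guide-feedback-option").forEach((option) => {
      option.addEventListener("click", () => recordGuideFeedback(option.textContent?.trim() ?? ""));
    });

In Tag Manager, a Google Analytics event tag fired on that trigger would then carry the label (very useful, somewhat useful, not useful) through to the reports we use to gauge how helpful the guides are from within EDS.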
 

I recently presented on this work at the ERMN conference in St. Paul, MN, so that others could learn about embedding point-of-need help in a discovery service and measuring its use. I encourage you to look at the slides for a visual snapshot of where we've embedded help, how we measure it, and what we've learned, including some additional highlights not noted above.

 

