When things are allowed to evolve organically, sometimes you fail to take a step back and look at the whole picture. It was only when I started taking stock of all the help content we had embedded in our EBSCO Discovery Service (aka EDS, aka Thoreau) that it hit me just how much content was there. And I had put it all there (over the course of several years). I suspected that it was too much content and that not all of it was being used. Since this is our most-used research tool and the place where a lot of students begin their research, it seemed like a good time to measure what students were actually using and identify helpful use patterns.
So how do you measure this anyway? Google Analytics Campaign tracking is an excellent and simple way to track and measure use. Adding a few parameters to the end of help URLs allows you to see precise use of these pages, including users, sessions, pageviews, average time on page, and more. You can even tie these to specific goals configured in Google Analytics, such as the number of emails submitted or chat sessions initiated through Ask a Librarian. For example, by viewing a goal next to a specific campaign, we can see how many students submitted an email to Ask a Librarian after clicking the "Ask a Librarian" menu link within EDS. This helps us measure the impact and value of these help links.
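To make this concrete, here's a minimal sketch of what a campaign-tagged help URL could look like. The FAQ address and the utm_* values below are made up for illustration; the point is that each placement gets its own parameters so Google Analytics can report on it separately.

```typescript
// A minimal sketch of campaign tagging. The base URL and parameter values are
// placeholders, not our actual links or naming scheme.
const baseUrl = "https://libanswers.example.edu/faq/12345";

const campaignParams = new URLSearchParams({
  utm_source: "eds",             // where the link lives (our discovery service)
  utm_medium: "top-menu",        // which placement within EDS
  utm_campaign: "embedded-help", // the campaign name reported in Google Analytics
});

// The tagged link that would go into the EDS top menu:
const taggedUrl = `${baseUrl}?${campaignParams.toString()}`;
console.log(taggedUrl);
// -> https://libanswers.example.edu/faq/12345?utm_source=eds&utm_medium=top-menu&utm_campaign=embedded-help
```

Giving each placement its own value (for example, a different utm_medium for the top menu, the search results, and the sidebar) is what makes it possible to compare those locations side by side in the reports.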
What did we find? First, we've only been measuring for three full months, so these are just initial findings covering two summer months and the busy fall-term-start month of September. I'll re-run the numbers at the end of November once we have a full term of data, but I don't expect the results to change much given that our dataset is already 1.2 million sessions. One thing to note is that we're defining help broadly as anything that assists students in their research process. This includes links to FAQs, robust library skills guides, our Ask a Librarian service, and even "next step" research tools like our Databases A-Z list. This help is embedded in three locations: the top menu, the search results, and the right sidebar (see the screenshot for visual context).
A few call-outs:
Overall, only help located at the top of the page is being used, and there seems to be a preference for FAQs.
Based on these findings, the broad recommendation is to reduce help within EDS to the locations students actually see (the top menu) and to the content that is actually being used. This might mean removing the sidebar completely and reworking the top menu items to better align with the findings. I think a good approach is to funnel instruction: rely on FAQs as the first point of instruction, and make sure students can easily reach the relevant guides from within each FAQ for more detailed instruction if they're interested in really building their library skills. This approach aligns better with what we already know about students: most don't have a lot of time, and most don't want to read a lot of extra stuff. After re-analyzing the data in a couple of months, we'll see whether these recommendations need to change.
While these findings are really helpful for determining what's being clicked and for how long, they don't give us a complete picture of whether students found the content helpful after they opened it. We've just started trying to measure this within EDS. When a student clicks a help guide in EDS (and only from EDS), they see a message in the guide just below the page navigation: "How useful did you find the information on this page?" Every time a student sends us feedback by clicking one of the options (very useful, somewhat useful, not useful), a click event is fired by Google Tag Manager. We can use the event label generated by the triggered event to measure how useful students find this content in the context of EDS. I'm not confident that the message is visible enough in the guide to gather enough data, but we'll see how it goes and adjust if needed. This is our next step, along with giving some thought to entirely different approaches to embedding help within our discovery system.
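For illustration, here is a rough sketch of how a feedback click like this could end up as an event label. In our case Google Tag Manager's own triggers capture the click without custom page code, so the function, selector, and field names below are assumptions for the sake of the example, not our actual setup.

```typescript
// Hypothetical sketch only: in practice a GTM click trigger can capture this without
// custom code. This just shows how the chosen rating becomes the event label we report on.
type Rating = "very useful" | "somewhat useful" | "not useful";

// GTM exposes the data layer as a global array; reuse it if present, otherwise create it.
const w = window as unknown as { dataLayer?: Record<string, unknown>[] };
const dataLayer = (w.dataLayer = w.dataLayer ?? []);

function recordGuideFeedback(rating: Rating): void {
  dataLayer.push({
    event: "guide-feedback",          // a GTM trigger listening for this event fires the tag
    eventCategory: "EDS help guide",
    eventAction: "usefulness rating",
    eventLabel: rating,               // the label we read in the Google Analytics event reports
  });
}

// Example wiring, assuming the three feedback options are buttons with a data-rating attribute.
document.querySelectorAll<HTMLButtonElement>(".guide-feedback button").forEach((btn) => {
  btn.addEventListener("click", () => recordGuideFeedback(btn.dataset.rating as Rating));
});
```

Whatever the wiring, the useful part is the label: once each rating shows up as its own event label, Google Analytics can break down how often guides opened from EDS are rated very, somewhat, or not useful.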
I recently presented on this at the ERMN conference in St. Paul, MN, so that others could learn about embedding point-of-need help in a discovery service and measuring its use. I encourage you to look at the slides for a visual snapshot of where we've embedded help, how to measure it, and what we've learned, including some additional highlights not noted above.