Back in October I wrote about an analysis we conducted on measuring the use of point-of-need library help within our discovery service interface. That analysis was based on 3 months of data, and you can see a visual summary of the initial findings in my ERMN conference session slides from last fall.
It's February, and I've rerun the numbers with 6 full months of data to see if anything has changed. With the ebb and flow of the semester, it's reasonable to think that use patterns could shift significantly as students get deeper into their courses and library research.
Overall, the results have stayed mostly the same. Some of the charts I shared previously were based on pageviews; I've updated these to report sessions, using Google Analytics campaign data, because sessions are a more accurate reflection of use. I've briefly summarized the updated findings below, and I recommend reading the original post for context and commentary.
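For readers curious how campaign-based tracking like this is typically wired up, here is a minimal sketch of tagging point-of-need help links with Google Analytics UTM campaign parameters, so that clicks surface in GA as campaign sessions rather than raw pageviews. The URL, parameter values, and helper function are hypothetical placeholders for illustration, not our actual configuration.

```python
# Sketch: build a help link tagged with standard UTM campaign parameters
# so Google Analytics attributes the resulting session to a named campaign.
from urllib.parse import urlencode


def tag_help_link(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM campaign parameters to a point-of-need help link."""
    params = urlencode({
        "utm_source": source,      # where the link lives, e.g. the discovery interface
        "utm_medium": medium,      # the kind of placement, e.g. a sidebar widget
        "utm_campaign": campaign,  # the initiative being measured
    })
    return f"{base_url}?{params}"


# Illustrative values only -- not our real URLs or campaign names.
print(tag_help_link(
    "https://library.example.edu/ask",
    source="eds",
    medium="point-of-need",
    campaign="discovery-help",
))
```

With links tagged this way, the Campaigns report in Google Analytics can show sessions per campaign, which is the session-level view the updated charts are based on.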
What this boils down to is:
Based on these findings, here are the recommended adjustments to our EDS interface to provide more effective point-of-need help without distracting from the research experience:
Of course, these recommendations will be shared with library staff, who will have the opportunity to provide feedback. Usage data analysis, if done properly, is an excellent way to assess web content and design and to identify glaring issues, but it rarely provides the full picture, the "why", or the nuance of usability testing. For this project, we won't be able to do usability testing to further inform these recommendations (there just isn't time or resources to test every change we make to our systems), so we'll plan to continue measuring and making iterative adjustments with the goal of providing a stellar research experience.