Authors respond to “One Box to Search Them All” review (part III) | Federated Search Blog

[ Editor’s note: This article continues part II of the authors’ response to my review of the pre-print One Box to Search Them All; it is the second half of the three authors’ response to the points I raised in my review.

The “One Box” pre-print is available to Emerald Insight subscribers in pre-print form and free to the public from Ian Gibson’s website.

Please note that this is a pre-print version: it is not the final version of the paper, and the authors may revise it. ]

What happened after the article

Starting in September 2008 we placed a big Single Search box on our homepage to search the three databases above. We also placed Single Search boxes on all our Article Indexes by Subject pages and inside our ‘Explore a Topic’ subject guides. Typically, each box searched ~5 indexes. It didn’t take long for problems to emerge. First, there was a wave of connectors that were out of date, weren’t functioning properly, or simply didn’t exist. These were dealt with as soon as they could be. For the most part we were accepting of these issues, because connector maintenance is a well-known issue with most federated search products. But as students started to use the product more, real issues began emerging.

In my final report I categorized the students’ problems into five major categories:

  1. Searching in the wrong place – e.g., using the search box on the front page when it is not an appropriate choice.
  2. Inability of users to get to full text.
  3. Inability of users to easily narrow their searches.
  4. Technical problems with connectors.
  5. Typical federated search problems: long response time, no deduping or relevancy ranking.

The first three on that list were the biggest (and most consistent) problems; the final category covered technical issues with Single Search that we couldn’t rectify. I won’t go into the gory details, but some of the students’ frustration could have been addressed by information literacy. Most problems, however, can be attributed to the poor layout of the results display. For instance, when most indexes were searched, one would see the basic bibliographic information and then, on the right, a link labeled ‘More Information’ followed by our link resolver button, which shows up in almost every database we own. ‘More Information’ takes you to the full record in the index’s native interface, which may be helpful but proved to be quite confusing.

Even though every library instruction (LI) session stresses that to find the full text, regardless of index, you must click the ‘Get It @ Memorial’ button, students seemed to be drawn, like moths to a flame, to any other button they thought might get them full text (we see this behavior in other databases as well). Many of them clicked on the More Information link and were stunned to find that they weren’t looking at the full text. These misunderstandings required a lot of staff time to untangle; sometimes as much as 30 minutes of one-on-one staff time went into figuring out what the student had done and why it didn’t work. In a few cases it turned out that Single Search wasn’t feeding the link resolver information properly, and the student ended up having to search the native interface of the database to get to the full text.

The other big problem was that once you had completed a search and gotten your results, there was no way to revise your search from the results screen. You could go back to the main search screen and retype your search with the new words, but then you would have to wait for the search to complete all over again. This was a major irritant to both students and the librarians who had to help them. Our version of Single Search was older and had none of the bells and whistles (clustering, facets, etc.) one would expect to find in a modern product.

The experience for front-line staff was extremely trying. Not only were they spending more time helping patrons understand how to use the system, they were also in the frustrating position of seeing the system do something odd and then being unable to replicate the problem for systems staff. Sometimes when a problem was replicable, systems staff would start working on it, only for the issue to vanish as mysteriously as it had first appeared. By the middle of November, most front-line staff were steering people away from Single Search as often as they could.

As bad as all that was, the back end was worse. Obviously, we wanted some quantitative data on Single Search’s performance. Once again, Single Search let the side down. Although there was a statistical reports module, reports could only be initiated by a person sitting in front of the server, which was a problem since the server was a few thousand kilometers away in New Brunswick. Moreover, the reports were of questionable value, as they had to be broken up (again by hand) between the different schools in the consortium. For instance, looking at average response times, one would see the data for Academic Search Premier at Acadia, then at Mount Allison, then at Memorial, then at St. Mary’s. There were other problems with the reports caused by the lack of storage capacity on the server.

In December, I filed a report outlining the difficulties we were facing and the issues inherent in Single Search. In the wake of a special meeting to discuss the report, it was decided that Single Search was not a service we wished to continue offering our patrons and that, for the time being, we would not replace it. Many of the other schools in the consortium have recently committed to OCLC’s WorldCat Local; it remains to be seen where MUN may choose to go.

—————————–

Here are the links to the review and to previous parts of the response:


This entry was posted on Friday, February 27th, 2009 at 11:21 am and is filed under viewpoints.

