[ Editor's note: This article is a continuation of part I of the authors' response to my review of the pre-print One Box to Search Them All. It is the second part of the three authors' response to the points I raised in my review.

The "One Box" pre-print is available to Emerald Insight subscribers in pre-print form and free to the public from Ian Gibson's web-site.

Please note that the article is a pre-print: it is not the final version of the paper, and the authors may revise it. ]

Response to Sol’s comments

The Memorial experience with Single Search is over (and frankly not a moment too soon). I will discuss your comments and then give some more general information about what happened after we wrote the article.


Background on Memorial’s Setup

SirsiDynix Single Search was purchased by the Atlantic Scholarly Information Network (ASIN) and hosted on a server at the University of New Brunswick. (Single Search, for those of you not familiar with it, is the same as the MuseGlobal federated search product, just with the MuseGlobal branding replaced by SirsiDynix logos.) Most of the server administration, including updating connectors and maintaining the general look and feel, was done remotely by staff here at MUN. For the first few years this setup worked fine, as there was very little use of Single Search anywhere in the consortium. In June 2008, as a follow-up to our report, we decided that starting in September we would place Single Search boxes all over our web presence to give it a thorough testing and, more importantly, to decide what we wanted in our next-generation federated search tool. Unfortunately, our setup proved to be quite problematic.

Comments on Sol’s Comments

On buy-in

Skipping the buy-in phase was absolutely disastrous. Frontline staff did not appreciate having to support a product they had not been consulted about. The lack of consultation had a considerable impact on the promotion of Single Search at the reference desk and in the classroom. A little consultation would have gone a long way toward getting people to cut Single Search a little more slack.

On the obstacles encountered

I can’t begin to summarize here the obstacles that others have faced. I recommend reading Susan Elliott’s report, Metasearch and Usability: Toward a Seamless Interface to Library Resources. Even though it is older (2004), it contains a lot of really good information.

On speed

Anecdotally, from my time on the reference desk, I can tell you that the students I talk to commonly believe that library systems just take longer to do anything. Until very recently, bandwidth was a major problem across campus. There are bandwidth issues in the province generally as well, one of the drawbacks of living on a big, sparsely populated island with limited links to the mainland.

On number of sources

I am not a systems librarian, so I can’t comment on why it takes more time, but I know for a fact that the more sources Single Search was asked to search, the longer it took. Also, while it sounds all well and good to time out slower sources, in some areas this simply was not possible, as the slowest resource was also the key resource. Below is a graph of the distribution of response times for three of our connectors: Academic Search Premier, Web of Science, and Wilson Omnifile. Academic Search Premier and Wilson were both around the median ‘average connector response time’.
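To make that trade-off concrete, here is a minimal sketch (in Python) of the fan-out-and-timeout pattern described above. This is not Single Search's actual implementation: the connector names are real, but the response times, the timeout value, and the helper functions are illustrative assumptions only.

    # A sketch of federated search fan-out with a global timeout.
    # All timings are hypothetical, not measured Single Search values.
    from concurrent.futures import ThreadPoolExecutor, wait
    import time

    # Hypothetical per-connector response times in seconds.
    SOURCES = {
        "Academic Search Premier": 2.0,  # near the median
        "Wilson Omnifile": 2.5,          # near the median
        "Web of Science": 8.0,           # slow, but the key resource
    }

    def query_source(name, delay):
        # Stand-in for a real connector call to a remote database.
        time.sleep(delay)
        return "results from " + name

    def federated_search(timeout):
        # Fan the query out to every source in parallel and keep
        # whatever has returned when the timeout expires.
        pool = ThreadPoolExecutor(max_workers=len(SOURCES))
        futures = {pool.submit(query_source, n, d): n
                   for n, d in SOURCES.items()}
        done, not_done = wait(futures, timeout=timeout)
        pool.shutdown(wait=False)  # don't block on the stragglers
        returned = [f.result() for f in done]
        dropped = [futures[f] for f in not_done]
        return returned, dropped

    # With a 4-second cutoff the search feels fast, but Web of Science,
    # the key resource in some subject areas, is silently dropped.
    returned, dropped = federated_search(4.0)
    print("returned:", returned)
    print("timed out:", dropped)

The point of the sketch is the last few lines: any global cutoff short enough to make the search feel fast will drop the slowest connector, and when that connector is the one your users actually need, timing it out is not an option.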


Stay tuned for the third and final part.
