Tulane manages information overload at the library | Federated Search Blog

Tulane Reference Librarian Paul St-Pierre presents a compelling case for federated search technology in a 31-minute video.

While the video is largely about Tulane's experience with MetaLib, the first ten minutes or so articulate the problems Tulane was seeing that motivated the search for a technology solution, and that piece of the video is vendor-neutral.

St-Pierre explains that the problem at Tulane is "too much information." Nothing new here. But with 500 indexes and databases and 30,000 e-journals, managing that information is a bigger challenge for Tulane than for many other organizations.

Before federated search, Tulane had many search tools and many user interfaces, and navigating among them was complicated, especially with documents in many formats. St-Pierre described the situation as having many paths to get to the text.

Tulane’s competition is Google. It’s easy to use and it brings back lots of information. But, as St-Pierre shows in three graphs, things are not as simple as they appear on the surface.

The first graph – growth of information over time – illustrates the following:

  • High-quality (scholarly) information grows at an exponential rate.
  • Low-quality information grows at an even higher exponential rate.
  • The signal-to-noise ratio is therefore getting worse.
  • It’s becoming harder to sift out the junk. Google isn’t so great after all.
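The trend in that first graph can be sketched numerically. Assuming, purely for illustration (the growth rates a and b below are made-up, not from the video), that quality content grows as e^(a·t) while junk grows faster, as e^(b·t) with b > a, the signal-to-noise ratio e^((a−b)·t) decays exponentially over time:

```python
import math

def signal_to_noise(t, a=0.05, b=0.08):
    """Hypothetical rates: signal grows at rate a, noise at faster rate b."""
    signal = math.exp(a * t)   # high-quality content at time t
    noise = math.exp(b * t)    # low-quality content at time t
    return signal / noise      # equals e^((a - b) * t), shrinking when b > a

for year in (0, 10, 20):
    print(f"year {year}: S/N = {signal_to_noise(year):.2f}")
```

Even though both curves grow, the ratio shrinks, which is exactly the "worsening signal-to-noise" point St-Pierre makes.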

The second graph – library’s curve – shows the following:

  • Content grows exponentially.
  • But the signal-to-noise ratio is manageable and not getting worse as fast as Google’s.

The third and final graph – complexity of search – reveals the following:

  • Using Google appears to be less complex than using federated search.
  • Using federated search is less complex than using the native resources directly.

The interesting thing, though, is that if you analyze what is happening across all of the graphs, you realize that the complexity of using Google is actually increasing, because you have to dig ever deeper into the results to get the information you need.

The video is worth watching just for the interesting perspective on the complexity issue.

If you enjoyed this post, make sure you subscribe to my RSS feed!


This entry was posted on Monday, July 5th, 2010 at 9:22 pm and is filed under viewpoints. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site.
