Earlier this month ReadWriteWeb reported on a mechanism Google is creating for real-time indexing:

Google is developing a system that will enable web publishers of any size to automatically submit new content to Google for indexing within seconds of that content being published. Search industry analyst Danny Sullivan told us today that this could be “the next chapter” for Google.

And here’s an interesting comment from the article:

Last Fall we were told by Google’s Brett Slatkin, lead developer on the PubSubHubbub (PuSH) real time syndication protocol, that he hoped Google would some day use PuSH for indexing the web instead of the crawling of links that has been the way search engines have indexed the web for years.

If PuSH is as widely used as Google hopes it will be then this is a major paradigm shift for the search giant. No, Google won’t stop crawling the Web but if a critical mass of Web publishers get Google (and presumably other search engines) to index their content very quickly then the real-time Web will take a giant leap forward.
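To make the shift concrete: under PuSH, a publisher doesn’t wait to be crawled; it actively pings a hub the moment new content goes live, and the hub fans the update out to subscribers (such as an indexer). Here is a minimal sketch of that publisher-side ping in Python, following the PubSubHubbub "publish" notification format. The hub and feed URLs are hypothetical placeholders, not real endpoints.

```python
# Sketch of a PubSubHubbub (PuSH) publisher ping. When new content is
# published, the publisher POSTs a form-encoded "publish" notification to
# its hub, naming the feed (topic) that changed. The hub then fetches the
# feed and pushes the update to subscribers -- no crawl delay.
from urllib.parse import urlencode
from urllib.request import Request


def build_publish_ping(hub_url: str, topic_url: str) -> Request:
    """Build the HTTP POST that tells the hub a feed has new content."""
    body = urlencode({"hub.mode": "publish", "hub.url": topic_url})
    return Request(
        hub_url,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )


# Hypothetical URLs for illustration; a real hub would typically answer
# a well-formed ping with 204 No Content.
req = build_publish_ping(
    "https://pubsubhubbub.example.com/",
    "https://example.com/feed.atom",
)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) is all a publisher needs to do per update, which is why the protocol scales down to small sites.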

It will be interesting to see how PuSH impacts the federated search community. Clearly the real-time Web can move scientific information very quickly. Perhaps this new technology and paradigm will nicely augment the flow of scientific papers found by federated search applications in the deep Web.



This entry was posted on Friday, March 26th, 2010 at 12:14 pm and is filed under technology, viewpoints.
