The table above gives the Showdown Estimate along with each search engine's recent claims of how many millions of Web pages it has indexed and included in its database. The estimates are based on exact counts obtained from Fast and Northern Light on the date of the comparison. Those counts are multiplied by the percentage of each search engine's total hits on the Relative Size Showdown searches as compared to the number found by Fast and Northern Light. The Showdown Estimate is then the average of those two numbers and aims to give the searcher a very approximate estimate of the effective size of the database -- the part of the database from which the searcher may actually see results. While the terms used for the Relative Size Showdown searches were not chosen completely at random, they were drawn from a variety of subject areas and countries so as to meet the criteria outlined in the methodology.
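The arithmetic behind the Showdown Estimate can be sketched as follows. All figures in this example are hypothetical, chosen only to illustrate the calculation; the function name and parameters are my own labels, not part of the Showdown methodology's published terminology.

```python
# Illustrative sketch of the Showdown Estimate arithmetic described above.
# All numbers below are hypothetical, not actual Showdown data.

def showdown_estimate(fast_total, nl_total, hits_vs_fast, hits_vs_nl):
    """Estimate an engine's effective database size (millions of pages).

    fast_total, nl_total: exact database sizes reported by Fast and
        Northern Light on the comparison date.
    hits_vs_fast, hits_vs_nl: the engine's total hits on the Showdown
        searches as a fraction of the hits found by Fast and Northern
        Light, respectively.
    """
    # Scale each exact count by the engine's relative hit ratio...
    est_from_fast = fast_total * hits_vs_fast
    est_from_nl = nl_total * hits_vs_nl
    # ...then average the two projections.
    return (est_from_fast + est_from_nl) / 2

# Hypothetical example: an engine that found 80% as many hits as Fast
# and 110% as many as Northern Light.
print(showdown_estimate(575, 350, 0.80, 1.10))  # → 422.5
```

An engine that consistently returns fewer hits than Fast and Northern Light on the test searches is thus projected to have a proportionally smaller effective database, regardless of its claimed total size.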
Northern Light offers a technique that can be used to obtain an up-to-date count of the Web pages in its database. Limit the search to the World Wide Web only and enter a query of the form search OR NOT search. The resulting number should be the current size of its Web database; the technique works with most common terms. The OR NOT operation finds every record that contains the term as well as every record that does not. Fast provided me with a similar technique (which unfortunately I am not permitted to disclose) that gives an exact count of the records in its database.
So why the discrepancies between claimed sizes and the Showdown Estimates? Bear in mind that these are very rough estimates and that they are based on actual search results. Beyond the limitation of basing the estimates on a small number of searches and on only Fast's and Northern Light's reported numbers, several factors may help explain the results.
The Inktomi-based search engines, such as iWon, are run on clusters of computers. According to Inktomi, at any point in time some of those computers may be down for backup or other maintenance. Consequently, the entire database may not be searchable at any given moment. My estimates thus reflect what was available to be searched at the time the searches were run.
AltaVista can time out on some searches and deliver only partial results. Since my numbers are based on the actual number of hits found, that may cause AltaVista's size to be under-represented. On the other hand, if Inktomi and AltaVista do not have their full databases available to searchers, what is the use of that extra size if it is inaccessible? These estimates may well give a better sense of the size of the accessible portion of the search engine databases.
Google claims over 1.3 billion pages in its index, but only about 700 million of those are fully indexed; the rest are non-indexed URLs. These rarely show up in search results, and while the numbers here include them, they account for less than 1% of Google's results. The claim listed above is therefore only for the fully indexed portion of Google's database.
This April 2001 total size comparison only covers the top six search engines (as measured by database size) that were also included in the Relative Size Showdown.
While decisions about which Web search engine to use should not be based on size alone, this information is especially important when looking for very specific keywords, phrases, and areas of specialized interest. See also the following statistical analyses:
©1999-2007 by Greg R. Notess, all rights reserved