A search engine is essentially a system of databases that stores vast amounts of data about web sites. These data points are organized in a variety of ways, and each search engine employs its own methods for retrieving and ranking web sites based on the search terms or keywords you enter.
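A toy sketch of the idea, assuming a tiny hardcoded "database" of pages: the index maps each keyword to the pages containing it, and pages are ranked by how many of your query keywords they match. The page contents and URLs here are made up for illustration; real engines use far more signals than keyword counts.

```python
from collections import defaultdict

# Hypothetical mini "database" of pages, just to illustrate the idea.
PAGES = {
    "example.com/a": "search engines rank web sites by keywords",
    "example.com/b": "crawlers visit web sites and update databases",
    "example.com/c": "keywords help search engines rank results",
}

def build_index(pages):
    """Map each word to the set of pages containing it (an inverted index)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def rank(index, query):
    """Score pages by how many query keywords they contain, highest first."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, set()):
            scores[url] += 1
    return sorted(scores.items(), key=lambda kv: -kv[1])

index = build_index(PAGES)
print(rank(index, "search keywords"))
```

Running it shows that the two pages mentioning both "search" and "keywords" outrank the page that mentions neither.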
Crawlers, spiders, or robots are programs that search engines use to keep their databases up to date with the most current web site listings.
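The core step a crawler repeats is fetching a page and extracting its links, which then get queued for the next pass. A minimal sketch using only Python's standard library, with a hardcoded HTML snippet standing in for a fetched page so the example stays self-contained:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real crawler this HTML would come from an HTTP fetch of a queued URL.
html = '<p>See <a href="https://example.com/">example</a> and <a href="/about">about</a>.</p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # discovered links become the next URLs to visit
```

A real crawler adds a frontier queue, a visited set to avoid loops, and politeness rules such as honoring robots.txt, but link extraction is the engine that keeps the listings current.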