Quote:
Originally Posted by xris
This is how online search engines (such as Google) get their data indexed. Spiders are programs that trawl the internet (they are called spiders because they crawl around the web, i.e. the WWW) and build up the index for the search engines. They download each web page, index every word they find, and note the URL of the page. They also look for other URLs on the page so as to build a network of links (and so find more pages to search). If you look at a site's log file to see who visits, spiders are common visitors.
|
Thanks for the information
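For anyone curious, the loop xris describes (download a page, index its words, note the URL, follow the links) can be sketched in a few lines of Python. This is only a toy illustration: instead of real HTTP requests it uses an in-memory dictionary as the "web", and the names (`crawl`, `fetch`, the example URLs) are made up for the example.

```python
from html.parser import HTMLParser
from collections import defaultdict

class LinkAndTextParser(HTMLParser):
    """Collects href links and visible words from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(start_url, fetch):
    """Breadth-first crawl: index every word, queue every link found."""
    index = defaultdict(set)   # word -> set of URLs containing it
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:       # dead link, skip it
            continue
        parser = LinkAndTextParser()
        parser.feed(html)
        for word in parser.words:
            index[word].add(url)
        queue.extend(parser.links)   # follow links to find more pages
    return index

# Tiny in-memory "web" standing in for real HTTP fetches.
PAGES = {
    "http://example.com/a":
        '<p>hello spider</p><a href="http://example.com/b">next</a>',
    "http://example.com/b":
        '<p>hello again</p>',
}

index = crawl("http://example.com/a", PAGES.get)
```

A real spider would fetch pages over HTTP, respect robots.txt, and rate-limit itself, but the index-words-and-follow-links core is the same.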