Each time an indexing source such as a search engine or online directory service is asked to review your web site and include its contents in its listings, it sends out automated user agents to analyze every aspect of your site.
These agents are commonly referred to as "spiders", "bots", and "crawlers". While some argue that the different names carry specific meanings, the name used mostly depends on where the agent comes from; their basic function is always the same.
Spiders, bots, and crawlers are computer applications that travel the Internet's various resources in search of web sites for their host database to list and offer to information seekers. A program of this type reads and copies text and other informative content from your site, then assigns each URL, or page, a ranking based on the relevance of certain keywords, phrases, and subject matter. These automated applications are programmed to review page construction and coding, then add (or remove) that page in a database according to the rank it achieves.
On their initial visit to your web site, these agents start at your site's front door, the index page, and look for where they can go from there. By following text links, they are able to compose a map of your entire site and its linked contents. Once this map of your site has been created, the agent can begin analyzing the contents of your various pages to see what information is offered.
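The link-following step described above can be sketched with a few lines of code. The example below (a simplified illustration, not any particular search engine's crawler; the page markup and URLs are invented) uses Python's standard-library HTML parser to pull the link targets out of a page, which is how a crawler discovers where it can go next:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of anchor tags, the way a spider
    discovers which pages it can reach from the one it is reading."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

# A toy "index page" with two links (hypothetical site)
page = '<a href="/about.html">About</a> <a href="contact.html">Contact</a>'
parser = LinkExtractor("http://example.com/index.html")
parser.feed(page)
print(parser.links)
# → ['http://example.com/about.html', 'http://example.com/contact.html']
```

Repeating this step for every newly discovered URL, and remembering which pages have already been visited, yields the site map the article describes.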
Using the map as a guide, the bot crawls every resource it knows is available. Focusing on the text, alt tag attributes, and links of each page, it retains a running record of all of the information for further analysis. When the bot has analyzed and recorded a page in its entirety, it then searches through its findings for words, phrases, and themes that are repeated in the page's contents.
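The "repeated words and phrases" tally can be illustrated with a simple frequency count. This is only a sketch of the idea (the sample text is invented, and real engines apply far more processing, such as ignoring common stop words):

```python
import re
from collections import Counter

def repeated_terms(text, top_n=3):
    """Tally word occurrences in a page's text, the kind of running
    record a spider keeps while looking for repeated themes."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(top_n)

# Hypothetical page text
page_text = ("Search engine spiders read page text. "
             "Spiders record the text and rank each page.")
print(repeated_terms(page_text))
# → [('spiders', 2), ('page', 2), ('text', 2)]
```

The words that surface at the top of such a tally are what suggest the page's theme to the indexer.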
A rank is then assigned to each page based on the criteria these automated applications are programmed to look for; this set of criteria is what is referred to as the search engine's algorithm. For a more complete understanding of how page rankings are composed and used, please review the ranking section of this web site.
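To make the ranking step concrete, here is a deliberately tiny scoring sketch. It is not any real engine's algorithm; the pages, query, and scoring rule (fraction of a page's words matching the query terms) are all invented for illustration:

```python
def score_page(page_words, query_terms):
    """Toy relevance score: the fraction of a page's words that
    match the query. Real algorithms weigh many more signals."""
    terms = {t.lower() for t in query_terms}
    words = [w.lower() for w in page_words]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in terms)
    return hits / len(words)

# Hypothetical pages already recorded by the spider
pages = {
    "index.html": ["welcome", "to", "our", "garden", "supply", "store"],
    "roses.html": ["roses", "garden", "roses", "care", "guide"],
}
query = ["roses", "garden"]

# Order pages by score, highest first, as a listing would
ranked = sorted(pages, key=lambda p: score_page(pages[p], query), reverse=True)
print(ranked)
# → ['roses.html', 'index.html']
```

Whatever the actual criteria are, the principle is the same: each page gets a numeric score, and the listing order follows the scores.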
Click any of the links below to see more about search engine secrets:
• Search Engine Traffic
• Google Page Rank
• Search Engine Ranking
• Search Engine Optimization