How does a web site get indexed by the search engines in the first place? Search engines make use of programs commonly called spiders, crawlers, or robots, which follow links from page to page and download a copy of each page they find.
The pages are analyzed, indexed, and (with luck and good SEO) added to the search engine's database. The search engines generally do this on a periodic basis, visiting some sites more often than others.
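To make that crawling process concrete, the sketch below shows the follow-the-links behaviour in Python. It is only an illustration of the general idea, not a description of any real search engine's crawler; the start URL and the page limit are placeholder assumptions.

```python
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: download a page, queue its links, repeat."""
    seen = {start_url}
    queue = deque([start_url])
    pages = {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to download

        pages[url] = html  # a real engine would now analyze and index this copy

        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same site and avoid revisiting pages
            if urlparse(absolute).netloc == urlparse(start_url).netloc \
                    and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return pages


if __name__ == "__main__":
    copies = crawl("https://example.com", max_pages=5)  # placeholder site
    print(f"Downloaded {len(copies)} pages")
```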
How does the indexing of pages work? How do page-ranking routines operate? Only the search engine operators know the true answers, and that information is closely held. The processes are also refined constantly, which makes them difficult to predict.
The information that the operators do see fit to pass on, combined with analysis of the results of specific keyword searches, helps us discover how indexing takes place.
Drawing on this information, we can form reasonable theories about which methods will increase a web site's visibility and improve its ranking in specific search results.
A thorough analysis of an existing web site lets us recommend and implement changes that can greatly improve its ranking for specific searches.
Some of the recommended changes involve the text content contained within individual pages. Others do not affect what is shown in the browser at all, but help the search engine spiders to correctly view and index the site.
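As one example of a change that never shows up in the browser, a site can publish a robots.txt file and an XML sitemap so that spiders can find and index every page. The sketch below shows how those two files might be generated; the example.com page list, file names, and URLs are hypothetical placeholders, not recommendations for any specific site.

```python
import xml.etree.ElementTree as ET

# Hypothetical list of pages on the site being optimized.
PAGES = [
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/contact/",
]


def write_sitemap(urls, path="sitemap.xml"):
    """Write a minimal XML sitemap listing every page the spiders should index."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)


def write_robots(sitemap_url, path="robots.txt"):
    """Write a robots.txt that allows crawling and points spiders at the sitemap."""
    with open(path, "w") as fh:
        fh.write("User-agent: *\n")
        fh.write("Disallow:\n")  # empty Disallow means nothing is blocked
        fh.write(f"Sitemap: {sitemap_url}\n")


if __name__ == "__main__":
    write_sitemap(PAGES)
    write_robots("https://example.com/sitemap.xml")
```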