Dynamic sites are sites that retrieve data from a database and display it in the user's browser. Generally, a single CGI (Common Gateway Interface) page or script can retrieve one or many pieces of data from the database dynamically, meaning that the page/script is told through certain "parameters" which pieces of information it should retrieve and display.

Let's take a PHP page/script as an example. If we had a database of multiple products, we might only want to display one product at a time. We could create a page called "product.php" and pass parameters to it through the URL "query string". With each request for a product, the product "ID", as defined in our database, can be specified in the query string. This is the setup used by the vast majority of dynamic sites.

An example of such a request might be as follows:

http://www.example.com/product.php?ID=25

As you can see in this example, the product to be read from the database is identified in the URL ("25" in this case), and the page will know to look in the database for a product with an ID of 25. Once the product is found, all of its associated information can be displayed to the user in whatever format the page defines. Multiple parameters can be passed in the query string (separated with the "&" character), and there is almost no limit to how these parameters can be used.
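On the server side, the page reads these parameters and uses them to look up the matching record. Here is a minimal sketch of that parsing step, written in Python for illustration (in PHP the same values simply arrive in the $_GET array); the URL and the extra "lang" parameter are hypothetical:

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical request URL in the same form as the example above.
url = "http://www.example.com/product.php?ID=25&lang=en"

# Everything after the "?" is the query string.
query = urlparse(url).query

# parse_qs maps each parameter name to a list of values,
# since the same name may legally appear more than once.
params = parse_qs(query)

# The script would use this ID to select the product from the database.
product_id = int(params["ID"][0])
print(product_id)  # 25
```

The same pattern applies regardless of language: the server splits the query string on "&", decodes each name/value pair, and uses the values to build its database query.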

For search engines, these "Dynamic URLs" pose somewhat of a problem: there is practically no limit to how many versions of any given page there might be, since each parameter can take any value and there is practically no limit to the number of parameters. What's more, sometimes a parameter is a random string of data representing the user's "session", and since this session value changes upon each visit, there exists the potential for an "infinite loop" if the search crawler is not highly intelligent. Since most current search engine crawlers are not very intelligent about such matters, certain precautions must be observed:

Dynamic Site Precautions:

  1. Limit the number of parameters for any given query string to 3 or less. Less is more!
  2. Never use session IDs which change upon each visit to your website.
  3. Use structured query string formats at all times.
  4. Never use non-urlencoded characters in query strings or page names.
  5. Never vary character cases. Stick to lower case unless you have a justified reason for doing otherwise.
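Precaution No. 4 is easy to satisfy if you build query strings with a library rather than by hand. A brief Python sketch (the parameter names and values here are made up) showing how reserved characters such as "&" and apostrophes get percent-encoded:

```python
from urllib.parse import urlencode

# Hypothetical parameter values containing characters that are not
# safe to place in a query string verbatim.
params = {"q": "red shoes & boots", "cat": "men's"}

# urlencode percent-encodes each name and value, so the resulting
# query string contains only URL-safe characters.
query = urlencode(params)
print(query)  # q=red+shoes+%26+boots&cat=men%27s
```

Hand-built query strings that skip this step can confuse both browsers and crawlers, since an unencoded "&" inside a value looks like the start of a new parameter.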

As stated in No. 3, query strings should follow a single structure in all instances. If you reference a page with the query string "?A=1&B=2", then you should never reference it with "?B=2&A=1". The two are technically equivalent, but from a search engine's perspective they produce "Duplicate Content".
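One way to enforce a single structure is to canonicalize the parameter order before generating any link, so the same set of parameters always yields the same URL. A small Python sketch of that idea (the helper name is our own):

```python
from urllib.parse import parse_qsl, urlencode

def canonical_query(query):
    """Rebuild a query string with its parameters sorted
    alphabetically by name, producing one canonical form."""
    pairs = sorted(parse_qsl(query))
    return urlencode(pairs)

# Both orderings reduce to the same canonical query string.
print(canonical_query("A=1&B=2"))  # A=1&B=2
print(canonical_query("B=2&A=1"))  # A=1&B=2
```

If every internal link on the site is generated through a routine like this, crawlers will only ever see one URL per set of parameters.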

Following these precautions can create a bit more work for you as a Webmaster, but we are confident that doing so will help improve your rankings in virtually all search engines.