Be Found:
Getting dynamic pages indexed

How to design findable web sites and get ranked with the best

The basics: understanding search engines and spiders

Index or directory?

Setting the right keywords

Using robot protocols to prevent unwanted indexing

Using core HTML tags more efficiently

Optimizing tricks to avoid

Getting dynamic web pages indexed

The problem of unreadable content

The commercial options for SEO

Using htaccess for efficient redirection

The final word on SEO

Dynamic pages

If your site is entirely (or even just largely) dynamic - that is, served from a database rather than being a collection of ready-made HTML documents in folders on the server - you can run into problems getting it indexed by a spider. This is a particular problem when multiple parameters are used, pulling data from more than one source to build the final page. The simplest fix is to avoid referencing multiple data sources for dynamic pages where possible. Another is to redesign the way queries are sent to the server and handled there, so that URLs don't contain characters such as ? or $, working instead with ordinary URLs (or what appear to be ordinary URLs, but which the server still recognises as carrying query parameters). Finally, consider providing static content where possible to make life easier for the spider-driven indexing process.
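On Apache servers, one common way to present ordinary-looking URLs while still running a dynamic site is the mod_rewrite module, configured via an .htaccess file. The sketch below is a minimal example, assuming a hypothetical products.php script that takes an id parameter; the path and script names are illustrative, so adjust them to match your own site.

```apache
# Hypothetical sketch: let spiders request /products/123
# while the server quietly runs /products.php?id=123.
RewriteEngine On

# Capture the numeric part of the path and pass it as the id parameter.
# [L]   = stop processing further rules for this request
# [QSA] = append any extra query-string parameters the visitor supplied
RewriteRule ^products/([0-9]+)/?$ /products.php?id=$1 [L,QSA]
```

With a rule like this in place, internal links can point at the clean /products/123 form, so the spider never sees a ? in the URL at all.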

