1) Screwing up your robots.txt file
Make sure your robots.txt allows crawling, i.e. it reads:
User-agent: *
Disallow:
It should not read:
User-agent: *
Disallow: /
The latter blocks your entire site from crawling and can drop your organic rankings drastically.
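As a concrete sketch, here is a minimal robots.txt that allows crawling of the whole site; the /private/ directory in the comment is a hypothetical example of how you would block a single path instead:

```
User-agent: *
# An empty Disallow means "crawl everything".
Disallow:

# To block only one directory, you would instead write e.g.:
# Disallow: /private/
```

Note that a bare "Disallow:" and "Disallow: /" differ by a single character but have opposite effects, which is why this mistake is so easy to make.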
2) Too many variables or parameters in your URLs
Although search engines are getting better at crawling long, ugly-looking links, they still don't like them. Short URLs also get clicked more often in the result pages, and they are better for both crawlability and clickability.
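One common way to shorten parameter-heavy URLs is to rewrite them at the web server. A minimal Apache mod_rewrite sketch (assuming a hypothetical product.php script that takes an id parameter):

```
RewriteEngine On
# Serve /product/123 from product.php?id=123 internally,
# so visitors and crawlers only ever see the short URL.
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L]
```

The same idea works with any server or framework that supports URL routing; the point is that the clean URL is the only one you link to and let search engines index.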
3) Session IDs in your URLs
Search engines flatly dislike session IDs in URLs. If you are using session IDs, store them in cookies rather than including them in your URLs. Session IDs can cause a single page's content to be visible at multiple URLs, creating duplicate content and obstructing the crawler.
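A minimal sketch of the cookie approach in Python, using only the standard library (the sessionid cookie name and example URL are hypothetical):

```python
import secrets
from http.cookies import SimpleCookie

# Generate a random session ID server-side.
session_id = secrets.token_hex(16)

# Send it to the browser as a cookie instead of a URL parameter.
cookie = SimpleCookie()
cookie["sessionid"] = session_id
cookie["sessionid"]["path"] = "/"
cookie["sessionid"]["httponly"] = True
header = cookie.output()  # the Set-Cookie header line to emit

# The URL stays clean and canonical, with no session parameter:
clean_url = "https://example.com/products/widget"
```

Because the session travels in the Set-Cookie header, every visitor and every crawler sees the page at one and the same URL.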
4) Your site suffers from code bloat
Even though spiders are normally good at separating code from content, that doesn't mean you should make their job harder with so much code that the content is tough to find.
5) Your navigation and internal linking
Designers and developers can be pretty creative when designing a website, but this creativity may get in the way of search engine crawlers. Navigation built with complicated DHTML, JavaScript, Flash, or Ajax can stop a crawler in its tracks.