Use a robots.txt


Big influence | Hard to implement

In a robots.txt file you can instruct crawlers to exclude certain areas of your website from the index or not to follow the links on your pages.

Additionally, you can add a link to your sitemap in the robots.txt file to make sure that crawlers will find it.

You can use the Robots Generator to create a robots.txt file and then upload it to the root directory of your website.
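A minimal sketch of what such a robots.txt file might look like — the blocked path and the sitemap URL are placeholders for illustration, not values from this guide:

```
# Apply the following rules to all crawlers
User-agent: *

# Keep the admin area out of the crawl (example path)
Disallow: /admin/

# Point crawlers to the sitemap (replace with your own URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow prevents crawling of the listed paths; a blocked URL can still appear in search results if other sites link to it.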

Tip: In Google Webmaster Tools, you can check which URLs you have excluded from the index by clicking on Crawling > Blocked URLs.

Helpful Links

Bing Webmaster Help: How to Create a Robots.txt file

Google Webmaster Help: Block or remove pages using a robots.txt file

Google Webmasters: If I don’t need to block crawlers, should I create a robots.txt?