Use a robots.txt


Influence: big · Difficulty: hard to implement

In a robots.txt file, you can instruct crawlers to exclude certain areas of your website from the index or not to follow the links on your website.

Additionally, you can add a link to your sitemap in the robots.txt to make sure that crawlers will find it.
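A minimal robots.txt combining both ideas might look like this (the paths and the sitemap URL are placeholders for illustration, not values from this article):

```text
# Allow all crawlers, but keep them out of /private/
User-agent: *
Disallow: /private/

# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers; more specific groups (e.g. `User-agent: Googlebot`) can be added above it to give individual bots their own rules.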

You can use the Robots Generator to create a robots.txt file and upload it to the root directory of your website.

Tip: In Google Webmaster Tools, you can check which URLs you have excluded from the index by clicking Crawling > Blocked URLs.
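You can also test your rules locally before uploading. A sketch using Python's built-in `urllib.robotparser` (the rules and URLs below are illustrative, not from this article):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, fed to the parser line by line
rules = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A URL under the disallowed path is blocked for all crawlers
print(rp.can_fetch("*", "https://www.example.com/private/page.html"))  # False

# Everything else stays crawlable
print(rp.can_fetch("*", "https://www.example.com/public/page.html"))   # True
```

This mirrors the check in Webmaster Tools: given a user agent and a URL, it tells you whether your robots.txt would block the crawl.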

Helpful Links

Bing Webmaster Help: How to Create a Robots.txt file

Google Webmaster Help: Block or remove pages using a robots.txt file

GoogleWebmasters: If I don’t need to block crawlers, should I create a robots.txt?