There are two ways to keep your pages out of search engine results:
- You can add a meta tag to the generated HTML telling search-engine crawlers not to index the page (the first sketch after this list shows one way to emit it server-side):
<head>
<meta name="robots" content="noindex">
...
</head>
- Put a file named robots.txt at the root of your site, that is, at /robots.txt. In that file you can mark sections of your site that crawling bots should skip (the second sketch after this list shows how to test the rules):
# The rules below apply to all robots
User-agent: *
Disallow: /path/of/files/or/documents/to/deny
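
If your pages are generated server-side, the meta tag can be emitted conditionally. Here is a minimal sketch assuming Flask; the routes, template, and page content are hypothetical and only illustrate the mechanism:

from flask import Flask, render_template_string

app = Flask(__name__)

# Hypothetical template: the robots meta tag is rendered only when
# the view asks for it.
PAGE = """<!DOCTYPE html>
<html>
<head>
{% if noindex %}<meta name="robots" content="noindex">{% endif %}
<title>{{ title }}</title>
</head>
<body>{{ body }}</body>
</html>"""

@app.route("/private")
def private_page():
    # Crawlers are asked not to index this page.
    return render_template_string(PAGE, noindex=True,
                                  title="Private", body="Not for search engines.")

@app.route("/public")
def public_page():
    # No noindex tag here, so the page stays indexable.
    return render_template_string(PAGE, noindex=False,
                                  title="Public", body="Indexable content.")

if __name__ == "__main__":
    app.run()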
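
It is easy to get robots.txt rules wrong, so it pays to test them. This sketch uses Python's standard urllib.robotparser with the same placeholder path as the robots.txt above; the page names are made up:

from urllib.robotparser import RobotFileParser

# The same rules as in the robots.txt above; the path is a placeholder.
rules = """
User-agent: *
Disallow: /path/of/files/or/documents/to/deny
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed section is blocked for every user agent...
print(parser.can_fetch("*", "/path/of/files/or/documents/to/deny/doc.html"))  # False
# ...while the rest of the site remains crawlable.
print(parser.can_fetch("*", "/some/other/page.html"))  # True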