There are two ways to prevent search engines from indexing your pages:
- You can add a meta tag to the generated HTML telling search-engine crawlers not to index the page:
<meta name="robots" content="noindex">
- You can put a file named robots.txt at the root of your site, i.e. /robots.txt. In that file you can tell crawling bots to skip certain sections of your site:
# All robots will spider the domain, except the /private/ section
# (the path is just an example; list whatever directories you want skipped)
User-agent: *
Disallow: /private/