The Robots.txt
User-agent: *
Disallow: /
The above directive tells every search engine spider to stay away from all pages and files on the website. Say, however, that you simply want to keep search engines out of the folder that contains your administrative control panel. You'd code:
User-agent: *
Disallow: /administration/
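If you want to double-check how a compliant crawler would read a rule like this, Python's standard urllib.robotparser module can parse the directives and answer per-URL questions. This is only an illustrative sketch; the example.com URLs and paths are placeholders, not part of the directive itself.

from urllib import robotparser

# Parse the rule exactly as it would appear in robots.txt.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /administration/",
])

# The administration folder is off-limits to every crawler...
print(rp.can_fetch("*", "https://example.com/administration/settings"))  # False

# ...while the rest of the site remains crawlable.
print(rp.can_fetch("*", "https://example.com/blog/latest-post"))  # True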
Or, if you wanted to allow every spider except Google's GoogleBot, you'd code:
User-agent: googlebot
Disallow: /
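The same kind of sanity check works here: because there is no catch-all User-agent: * group in this file, any spider other than GoogleBot falls through to the default and may fetch everything. Again, this is just a sketch with placeholder URLs, and Bingbot is simply an arbitrary stand-in for any other spider.

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: googlebot",
    "Disallow: /",
])

# GoogleBot matches the group above and is shut out of the whole site...
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False

# ...but a spider with no matching group (Bingbot here) is allowed by default.
print(rp.can_fetch("Bingbot", "https://example.com/"))  # True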