How to block search engines from...
Is there a way to make a page invisible to the search engines? I am sure there must be a way, I am just not certain how. I know just enough to be dangerous!
Check out these links.
http://www.robotstxt.org/robotstxt.html
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156449
CoffeeCup... Yeah, they are the best!
You could add the meta tag below to each page you do not want listed; it must be placed just before the closing </head> tag:
<meta name="robots" content="noindex,noarchive">
However, not all robots will comply with the request.
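To show where it goes, a bare-bones page would look something like this (the title is just a placeholder):

<html>
<head>
<title>Example Page</title>
<meta name="robots" content="noindex,noarchive">
</head>
<body>
...
</body>
</html>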
You could also create a robots.txt file. An example would look like this:
User-Agent: *
Disallow: /fox.html
Disallow: /cat.html
This would tell any compliant robot not to crawl fox.html or cat.html.
You would save the above code as a plain ASCII text file called robots.txt and upload it to your root directory, so that it is reachable at http://www.yoursite.com/robots.txt.
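Incidentally, if you want to keep crawlers out of a whole folder rather than listing files one by one, a sketch like this would do it (the folder name is hypothetical):

User-Agent: *
Disallow: /private/

The trailing slash matters; it blocks everything under that folder.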
Both examples would work but are not foolproof.
The safest way
If you have files on your web site that you don't want unauthorized people to access, configure your server to do authentication and set up appropriate authorization. Basic Authentication has been around since the early days of the web (and is trivial to configure in, e.g., Apache on UNIX). Modern content management systems support access controls on individual pages and collections of resources.
In other words, put the files you do not want just anyone to see in a separate folder and use .htaccess to password protect it; that way only people you allow will be able to view them. Relying on the robots methods alone is not really the best idea for anything sensitive.
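For example, a minimal .htaccess for Basic Authentication might look like the sketch below; the account path is just a placeholder, and your host's actual server path will differ:

# .htaccess inside the folder you want to protect
AuthType Basic
AuthName "Restricted Area"
# Full server path to the password file (placeholder)
AuthUserFile /home/youraccount/.htpasswd
Require valid-user

You would create the password file with Apache's htpasswd tool, for example htpasswd -c /home/youraccount/.htpasswd jim, and keep it outside the public web root so it cannot be downloaded. This also assumes your host allows .htaccess overrides (AllowOverride AuthConfig).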
Jim