Ok, so we’ve seen how to password-protect directories to keep the web crawlers out, but I don’t want to go through all that. I want to keep the page open to visitors; I just don’t want it spidered and indexed by the bots.
There are ways of doing this too. In fact, there are several. The most commonly accepted and respected way of telling a bot not to crawl certain areas of a website is with what’s called a robots.txt file. Usually this is put in the same folder as your main site index and looks like this:
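A minimal robots.txt that asks every bot to stay out of everything looks like this:

```
User-agent: *
Disallow: /
```

The `*` matches any bot, and `Disallow: /` covers the whole site.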
The above will keep all robots out of your site. This might be too heavy-handed, though. Let’s say the msnbot has been a bit too voracious with your downloads area:
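A sketch of a rule aimed just at msnbot (assuming your downloads area lives at /downloads/):

```
User-agent: msnbot
Disallow: /downloads/
```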
That should be enough to keep it out of that folder. Here’s another example:
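For instance, keeping every bot out of one directory while leaving the rest of the site open (the path is just an illustration):

```
User-agent: *
Disallow: /images/
```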
You can get more complicated than this if you need to.
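As a sketch of something more involved, a robots.txt can hold several records, one per bot, each with its own list of Disallow lines (the paths here are hypothetical):

```
# Keep msnbot out of the whole site
User-agent: msnbot
Disallow: /

# Keep everyone else out of two folders only
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

Bots read the record whose User-agent line matches them best, so msnbot follows the first block and everything else follows the second.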
Here’s a link to Google’s own robots.txt file: https://www.google.com/robots.txt
To exclude a specific file from being indexed, you might try the following meta tag in your document:
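The standard robots meta tag, placed inside the page’s `<head>`, looks like this:

```html
<meta name="robots" content="noindex, nofollow">
```

`noindex` asks bots not to add the page to their index, and `nofollow` asks them not to follow the links on it.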
You can also use index and follow to fine-tune what you want to restrict or allow.
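For example, the values mix and match:

```html
<!-- index this page, but don't follow its links -->
<meta name="robots" content="index, nofollow">

<!-- keep this page out of the index, but do follow its links -->
<meta name="robots" content="noindex, follow">
```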
I’m not certain that respect for the meta tag is as widely held; robots.txt is more likely to be followed.
To exclude just the googlebot, you might try this:
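Google documents a bot-specific variant of the tag that names googlebot instead of all robots:

```html
<meta name="googlebot" content="noindex">
```

Other bots ignore a `name="googlebot"` tag, so they can still index the page.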
According to Google’s page on removing pages from the index, Google will respect that tag, and it would still allow other bots through.