Google search engine crawling experiment



Recently I ran an experiment with the way Google crawls a site. I had a client site that had not been spidered, in spite of having been submitted to Google a good while back. I looked at the site and saw nothing amiss: there was plenty of text on the page and everything looked good. I found their page in Google, but there was no cached text.


After a bit of searching I found others with similar problems, and the speculation was that the site had been banned for “invisible text”. Some developers have used that technique to load keywords into pages to take advantage of the search engines. Now, I hadn’t done anything of the sort, but I did write the pages in PHP, and I had a few PHP comments to remind myself what I had done and why I had done it.
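As an aside, it’s worth knowing the difference between the two comment styles you might have in a PHP page, since only one of them is actually visible to a crawler. A minimal illustration (the variable and markup are just placeholders, not the client’s actual pages):

<?php
// A PHP comment like this one is processed on the server and never
// appears in the HTML that Googlebot downloads.
$note = 'reminder to self'; // also stripped before the page is sent
?>
<html>
<body>
<!-- An HTML comment, by contrast, is delivered with the page and is
     visible to anything that reads the source, crawlers included. -->
<p>Page content goes here.</p>
</body>
</html>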

I removed the comment text, and within a few days the main index page was spidered with searchable text and the summary was showing up in Google. So I went about building a sitemap and submitting it to get the rest of the pages crawled. Now, the pages were all in the main directory with .php file extensions. I submitted the sitemap, waited, and watched. A couple of weeks went by and the sub-pages still hadn’t been crawled. In that time I had been posting frequently on this site and found Google spidering all over the place. WordPress uses easy-to-remember permalinks that look like directory paths, for instance http://www.averyjparker.com/2005/08/15/pcbsd-configuration/ instead of something like http://www.averyjparker.com/20050815pcbsd-configuration.php. Google seemed to like this: it was spidering and caching the various posts with about a two- or three-day delay.
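If you want to put together a sitemap the same way, here is a rough sketch of a PHP script that generates one. The URL list is made up purely for illustration, and the schema namespace should be checked against Google’s sitemap documentation before you submit:

<?php
// sitemap.php - rough sketch of generating a Google Sitemap file.
// The URLs below are hypothetical; list your own pages instead.
$pages = array(
    'http://www.example.com/',
    'http://www.example.com/contact/',
    'http://www.example.com/aboutus/',
    'http://www.example.com/skills/',
);

header('Content-type: text/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
// Check Google's sitemap docs for the current schema namespace.
echo '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">' . "\n";
foreach ($pages as $url) {
    echo "  <url>\n";
    echo "    <loc>" . htmlspecialchars($url) . "</loc>\n";
    echo "    <lastmod>" . date('Y-m-d') . "</lastmod>\n";
    echo "  </url>\n";
}
echo "</urlset>\n";
?>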

About this time I found an interesting writeup on search engine optimization, and specifically about Google. Among the things the author noted was that he had seen this same behavior. No one at Google could explain it to him, but it seemed that directories got spidered more quickly than specific document files. So contact.php, aboutus.php, and skills.php might not get indexed before domain.com/contact/, domain.com/aboutus/, and domain.com/skills/.

I tested this theory on the site in question. I moved each subpage of the site to its own directory and renamed it index.php (so that viewing the directory would automatically display the page), then updated my sitemap (and the site’s menu) to reflect the change. Within two days the Googlebot had spidered each of the sub-directories, after a wait of several weeks prior.
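One thing I’d suggest if you try the same restructuring: leave a permanent redirect behind at each old URL, so existing links (and whatever Google has already seen) end up at the new location. A minimal sketch, assuming the old page was contact.php and the new copy lives at /contact/index.php (the domain here is just a placeholder):

<?php
// contact.php - stub left at the old location after the real page
// moved to /contact/index.php. The 301 tells browsers and search
// engines that the move is permanent.
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.example.com/contact/');
exit;
?>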

So my best advice is this: if you’re eager to get Google spidering your site, go ahead and plan on giving separate pages their own directories with relevant names.
