
Google’s Site Review Hangout Provides Trends, Not Details

Posted By Michelle Silverstein On November 22, 2013 @ 11:25 am In Featured, News, SEO | 5 Comments


Google recently hosted a hangout [2] to provide feedback on sites that webmasters submitted for review. Although the hosts, John Mueller and Pierre Far, didn’t review the sites individually, they highlighted example issues they found within the submissions and offered generalized advice on resolving some of the more common problems they discovered.

In describing their review process, the hosts first listed the areas they would examine, areas that often cause problems for webmasters. They also made a point of mentioning the areas they would NOT examine: issues they no longer consider problematic, thanks to Google’s improved ability to assess and ignore them. The list of elements Google said to ignore was rather surprising.

We’ve listed a few of the highlights below, but it’s important to keep in mind the caveat that accompanies each of them, typified by the word “usually.” For instance, lack of HTML validation “usually” isn’t a problem for the Google bots, and Google can “usually” compensate for mixed-up 301 and 302 redirects. However, “usually” by definition isn’t “always,” so webmasters should continue their optimization efforts in these areas whenever possible, adhering to best practices and common sense.

With that caveat in mind, what does Google say you don’t need to worry about?
  1. HTML validation: Google can usually pick up any page that has HTML on it – it doesn’t have to be valid.
  2. Missing robots.txt files: Forgetting to add a robots.txt file is not a problem; Google will simply continue crawling normally. Note: a robots.txt file is still the place for important crawler instructions, such as blocking directories from being crawled (see the first sketch after this list).
  3. Duplication from www/non-www or http/https sites: Google can usually reconcile duplicate content resulting from www/non-www or http/https URLs, unless the site is exceptionally large (a canonicalization sketch follows the list).
  4. 301 and 302 redirects: If you mix up 301 and 302 redirects, Google can usually figure out which you meant; this comes up most often with device detection and redirection (see the redirect sketch below).
  5. IDs in URLs: If you already have IDs in your URLs, you don’t have to go out of your way to rewrite them, as long as they are crawlable. Google still recommends “clean” URLs (a rewriting sketch closes out the examples below).
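
To make the robots.txt point concrete, here is a minimal sketch using Python’s standard urllib.robotparser module, which interprets robots.txt rules much the way a well-behaved crawler does. The blocked directory and the example.com URLs are hypothetical placeholders.

    from urllib.robotparser import RobotFileParser

    # A minimal robots.txt that blocks one directory from being crawled
    # while leaving the rest of the site open. "/private/" is a
    # hypothetical placeholder -- substitute your own directory.
    ROBOTS_RULES = [
        "User-agent: *",
        "Disallow: /private/",
    ]

    parser = RobotFileParser()
    parser.parse(ROBOTS_RULES)

    # The blocked directory is off-limits; everything else stays crawlable.
    print(parser.can_fetch("Googlebot", "http://www.example.com/"))          # True
    print(parser.can_fetch("Googlebot", "http://www.example.com/private/"))  # False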
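
The www/non-www and http/https issue is, at bottom, a canonicalization problem: four URL variants that should count as one page. A small sketch, assuming https without www is the preferred form (pick whichever variant your site actually serves):

    from urllib.parse import urlsplit, urlunsplit

    def canonicalize(url: str) -> str:
        # Collapse http/https and www/non-www variants of a URL into a
        # single canonical form. The https/no-www preference here is an
        # assumption for illustration, not a Google requirement.
        scheme, netloc, path, query, fragment = urlsplit(url)
        netloc = netloc.lower()
        if netloc.startswith("www."):
            netloc = netloc[len("www."):]
        return urlunsplit(("https", netloc, path or "/", query, ""))

    # All four duplicates collapse to https://example.com/page
    variants = [
        "http://www.example.com/page",
        "http://example.com/page",
        "https://www.example.com/page",
        "https://example.com/page",
    ]
    assert len({canonicalize(u) for u in variants}) == 1

This is the same reconciliation Google says it can usually do for you; handling it yourself with consistent redirects or canonical tags simply removes the guesswork.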
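
On the redirect point, the practical difference is that a 301 marks a move as permanent (ranking signals should transfer to the target), while a 302 marks it as temporary (the original URL stays in play), which is why device-detection redirects to a mobile URL are a common place to see 302s. Here is a minimal sketch of a permanent redirect to a canonical host, using Python’s standard http.server; the hostname and port are placeholders:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CanonicalRedirectHandler(BaseHTTPRequestHandler):
        # Redirect every request on this host to the canonical domain.
        # "example.com" is a hypothetical placeholder.
        CANONICAL_HOST = "example.com"

        def do_GET(self):
            # 301 = permanent move; use 302 instead for temporary
            # redirects such as device detection.
            self.send_response(301)
            self.send_header("Location", f"https://{self.CANONICAL_HOST}{self.path}")
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), CanonicalRedirectHandler).serve_forever()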
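
And for IDs in URLs, one common pattern keeps the existing ID (so old links still resolve without a lookup table) while adding a human-readable slug, which is roughly what “clean” URLs mean here. The route shape and function name below are illustrative assumptions, not a prescription from the hangout:

    import re

    def clean_url(product_id: int, title: str) -> str:
        # Turn an ID-based URL like /product?id=42 into a crawlable,
        # readable path such as /product/42-blue-widget. Keeping the
        # numeric ID first lets the server resolve the page even if
        # the slug changes later.
        slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
        return f"/product/{product_id}-{slug}"

    print(clean_url(42, "Blue Widget!"))  # /product/42-blue-widget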
If anything, by indicating certain elements that webmasters usually don’t need to worry about, Google provides insight into some of the incremental improvements that the crawling system has made over the years. These changes make it easier for Google to identify common technical issues, potentially removing some of the burden from webmasters who are optimizing their sites. Check out the archived recording of the hangout [2] to get the whole story and to learn from some of the common mistakes made by other sites.  The hosts also indicated that they planned to hold more site review hangouts in the future where they might delve into individual sites to provide a deeper level of insight.

Article printed from The Search Agents: http://www.thesearchagents.com

URL to article: http://www.thesearchagents.com/2013/11/googles-site-review-hangout-provides-trends-not-details/

URLs in this post:

[1] Image: http://www.thesearchagents.com/wp-content/uploads/2013/11/site-review-1.jpg

[2] hangout: https://plus.google.com/events/cgdq2mh2e66esokgg2jspr10bj0

Copyright © 2009 The Search Agents. All rights reserved.