First off, Vanessa clarifies the difference between a duplicate content filter and a duplicate content penalty. Basically, the good word is that Google can happily deal with these issues in-house, but people can use Google Sitemaps to define pages for crawling / listing. There's no penalty for sites that may display duplicate content (through, for example, a dynamic CMS that serves the same content at different URLs).
From a pure webmaster point of view, it doesn't matter if you are facing these issues – you can just let Google grab all the URLs and decide which one to display in the SERPs, and the rest will be ignored. However, from an SEO point of view it does make sense to rectify these issues – you might have inbound links pointing at different URLs for the same content, so effectively you lose link weight if Google only takes one URL into account.
Basically, while it won’t hurt you to have duplicate content, you won’t be reaching your full SEO potential.
However, if your duplicate content issue is down to having similar pages (as opposed to the same page being accidentally duplicated), then you should work on making these pages as unique as possible (because you want them indexed and ranking).
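If the duplication is accidental (say, the same page reachable both with and without a www prefix), a 301 redirect consolidates inbound link weight onto one URL. A minimal sketch for Apache's mod_rewrite – the domain here is a hypothetical example.com, and the assumption is that you want the www version to be canonical:

```apache
# Hypothetical example: force www.example.com as the canonical host.
# Requests to example.com/page get a 301 (permanent) redirect to
# www.example.com/page, so links to either URL count towards one page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The same idea applies to any pair of duplicate URLs: pick one as canonical and permanently redirect the rest to it.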
Code to text ratio!
This point I’ve seen crop up so many times, and each and every time I say: it does not matter! One of my first sites was created in FrontPage with absolutely shocking code and it ranks fine, even for searches with 100 million+ results.
The good word = Google ignores code to text ratio.
Not really big news, but I remember having an argument with a guy on a business forum who was adamant that content / code positioning was a deciding factor. Ha! I told you so!
Nice to get clarification on these small things, even if we knew them to be true before.
Google Sitemap format preference?
No big deal – whatever makes the most sense for you. In terms of setting a “priority”, it doesn’t affect the end importance of the page – Google still decides on its side how to approach the crawl priority.
It can be worth spending time accurately defining priorities, but it isn’t necessary. There’s no impact on rankings, other than getting pages crawled.
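For reference, a minimal Google Sitemap in the standard XML format – the URLs and <priority> values here are hypothetical, and as noted above priority only hints at the relative importance of pages within your own site rather than influencing rankings:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical site: homepage flagged as highest priority -->
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <!-- An older archive page, marked lower priority -->
  <url>
    <loc>http://www.example.com/archive/old-post.html</loc>
    <priority>0.3</priority>
  </url>
</urlset>
```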
PageRank within Webmaster Tools
For verified sites only, and more up to date – basically a small bonus for verified sites.
On PR for link building, Vanessa says that you should look for links from relevant pages and not focus on Toolbar PR. Most of us know that, but I do know lots of “SEOs” who adhere strictly to the “links must come from PR4+ sites only” rule. Hehe.
On DaveN & Sitemaps being a poor choice to get indexed
DaveN recently posted about how he felt sitemaps were a poor choice for getting a site indexed (because newly indexed pages may not be found naturally by Google and therefore have little or no inbound linkage – making an SEO’s life more difficult).
The response – why not do both?
Vanessa offers the example that sitemaps can be useful when a site is new – it takes time for Google to find every page of a new site (from external links) – so why not let Google index all the URLs early on.
Certainly though, if there are issues where a page isn’t accessible by Google, then fix that issue as well as submitting a sitemap.
Nothing that will blow your socks off, but as always it is good to get some of the small stuff clarified and certainly a lot of these points will be useful for small site owners who aren’t familiar with SEO.