In the past, on this forum and other sites, I've been reading a lot about duplicate content and duped articles and how they can hurt you in search results. Even Google's Matt Cutts talked about this in a video: their main goal is to deliver unique content to their visitors and cut down on dupe content, and the Gbot does this now and will keep getting better at it.
Just doing some research this evening, I typed a long-tail sentence into Google... and in the results, the 2nd and 3rd sites on the first page had an identical article - word for word!
The page layouts were only slightly different. One used a smaller font size than the other, and one used plain paragraphs where the other had a couple of tables here and there. The meta title tags were the same (except one had one less word)... otherwise the pages were completely identical.
So... is this duplicate content rule a myth? Of note, there were 250K results for my sentence.
Interested to know about this.
Cheers