I am currently working on an experiment that I thought you might find interesting.

There is a lot of debate about duplicate content. The question usually goes like this: can Google determine when content on your site has already been published on another site? If they can determine that you are publishing copied content, does that mean that your page containing copied content will suffer an SEO penalty? For that matter, will your entire site be penalized?

I don't think anyone outside of Google actually knows the answer to this question. The problem is that Google really needs to be able to identify sites that are built on copied content in a way that adds no additional value for the user. If the top results in Google for particular keywords are simply spam sites, people will stop using the Google search engine.

A simple experiment

My gut feeling is that Google cannot actually tell the difference between legitimately syndicated content and scraped duplicate content. After all, there are lots of really good reasons to legitimately syndicate content.

To prove this one way or the other, I have created a site targeting a particular niche with a nice AdSense payout per click. I am using WordPress for this experiment. I did some keyword research and selected a domain name and six related keywords to target for my website. I then went to eZineArticles.com and found one good article for each keyword that I was targeting.

I posted each article on a WordPress blog that I set up on the test domain. This resulted in a total of six posts on the new WordPress blog. The only modifications I made to the syndicated article content were the addition of an introductory paragraph in each case and a keyword-optimized title tag.

Once the blog was created, I noticed that my standard tagging plug-in had automatically created lots of tags and therefore lots of tag pages. Since I only had six posts, this created lots of duplicate content within the blog. I felt this would contaminate the experiment, so on the tag pages I added random auction items relevant to the keywords I was targeting.

Then I used three methods to promote the new blog. I syndicated 60 blog posts containing non-unique content and do-follow back links. The back links pointed to the six posts targeting my six keyword phrases. I also registered the blog with Jonathan Leger's 3-way links service. Finally, I submitted the blog to 30 or more social networking sites using automated submission software.

Now I plan to let the blog sit and see what happens.

What do you think?

While we are waiting for the results, I am very interested in your predictions. My prediction is that the blog will rank for the keywords targeted on the six pages due to the back links that I have created as part of my standard promotion package. I also believe that my content will rank for keywords that I am not targeting, and the result of all this will be organic search engine traffic. I believe that traffic will translate into ad clicks and revenue.

If I am correct, this will mean that there is no such thing as a duplicate content penalty.

What do you think about this experiment? What outcome do you predict? Don't be shy! Leave a comment below and get on record with your prediction. When I publish the results, if you got it right I'll credit you with the correct answer. Be specific if you want to win.
