Wednesday, September 24, 2008

How to Avoid the Google Duplicate Content Filter?

More and more webmasters are building websites with publicly available content (data feeds, news feeds, articles). The result is many sites with similar content on the Internet, and in the case of news or data feeds you can even find sites that match each other 100%, excluding the design. Multiple copies of the same content do a search engine no good, so Google apparently decided to weed some of it out in order to deliver cleaner and better search results. Plain copies of whole websites were hit hardest: if a webmaster published the same content on several domains, all of the domains in question were removed from Google's index. Many websites based on affiliate programs suddenly took a big hit in traffic from Google.com. Shortly afterwards, webmasters in various forums saw the same complaints and stories over and over, and putting 1 + 1 together produced a clear picture of the situation: a duplicate content filter was being applied.

Duplicate content is not always bad, and it will always exist in one form or another. News sites are the best example of similar content, and nobody expects them to be removed from Google's index. So how can webmasters avoid the duplicate content filter? There are several things a webmaster can do to use duplicated content of some sort and still build unique pages around it. Let me explain a few of the options here.

1) Add unique content to pages with duplicated content. On pages where duplicated content is used, add exclusive content of your own - and I do not mean just a few words or a link/navigation menu. If you (the webmaster) can add 15% - 30% unique content to such pages, the proportion of duplicated content relative to the whole page goes down. This reduces the risk of a page being flagged as duplicate content.

2) Randomize the content. Ever seen those "quote of the day" snippets on some websites? A small script adds a random quote to the page each time it loads, so whenever you return, the page looks different. With a few changes to the code, such scripts can be used for much more than showing a quote of the day. With a little creativity, a webmaster can use a script like this to create the impression that pages are constantly updated and always different. This can be an excellent tool to keep Google's duplicate content filter away (see the first sketch below).

3) Add exclusive content. Unique content is still king, but sometimes you simply cannot avoid duplicated content at all. That is fine - but then add unique content to your site as well. If the overall ratio of unique content to duplicated content is well balanced, the chances of the duplicate content filter hitting your site are much lower. Personally, I recommend that a site offer at least 30% exclusive content (I admit it - I sometimes find it hard to reach that level myself, but I try). The second sketch below shows a rough way to estimate that ratio.

Does this guarantee that your site stays in the Google index? I do not know. But to be successful, a website should be unique. Unique content is what draws visitors to a site. Everything else can be found elsewhere, and visitors have no reason to visit a particular website if they can get the same thing somewhere else.
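First sketch: the article mentions quote-of-the-day scripts without showing one, so here is a minimal Python version of the idea. The file name quotes.txt and the function name are my own assumptions for illustration, not something from the article.

```python
import random
from pathlib import Path

# Hypothetical setup: one quote per line in a plain text file named
# "quotes.txt" (the filename is an assumption for this sketch).
QUOTES_FILE = Path("quotes.txt")

def random_quote() -> str:
    """Return one randomly chosen quote so each page load looks different."""
    lines = QUOTES_FILE.read_text(encoding="utf-8").splitlines()
    quotes = [line.strip() for line in lines if line.strip()]
    return random.choice(quotes)

if __name__ == "__main__":
    # On a real site this string would be injected into the page template
    # at render time; printing it stands in for that step here.
    print(random_quote())
```

The same rotation pattern works for any small block of text - tips, headlines, related links - which is what the article means by using such a script for much more than a quote of the day.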
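Second sketch: points 1 and 3 both come down to what share of a page is unique. Below is a rough self-audit heuristic of my own devising (not how Google's filter works): it counts how many of a page's words do not appear in the syndicated source text.

```python
import re

def unique_content_ratio(page_text: str, syndicated_text: str) -> float:
    """Estimate the share of the page's words not drawn from the feed.

    A crude word-overlap heuristic for self-auditing a page; it only
    illustrates the 15% - 30% rule of thumb, it is not Google's method.
    """
    def tokenize(text: str) -> list[str]:
        return re.findall(r"[a-z0-9']+", text.lower())

    page_words = tokenize(page_text)
    feed_words = set(tokenize(syndicated_text))
    if not page_words:
        return 0.0
    unique = [w for w in page_words if w not in feed_words]
    return len(unique) / len(page_words)

if __name__ == "__main__":
    # Made-up example texts for illustration only.
    feed = "Widget Corp announced record earnings for the third quarter."
    page = feed + (" Our take: here is what this means for small"
                   " webmasters who resell widgets in Colorado.")
    print(f"Unique share: {unique_content_ratio(page, feed):.0%}")  # about 58%
```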
About the Author
Christoph Puetz is a successful entrepreneur and international book author. Websites currently run by Christoph include http://www.realcreditrepair.info (help with credit problems) and http://www.highlandsranch.us (Highlands Ranch, Colorado). SEO and PPC services from the sponsor can be found at http://www.netservicesusa.com (U.S. Net Services LLC). Note: This article may be published by anyone as long as the resource box (About the Author) appears on the web page, including the links, and those links remain clickable. The article may not be published without this author resource box (the last paragraph, in italics).
