22 Mar 2016

SEO and the Implications of Duplicate Content

A few weeks ago, we wrote about reasons sites fall in search rankings, intending to address the most common culprits. We recently realized we overlooked one, so we're taking the time to expand on another common reason a website might see its rankings take a downward turn.

The issue at hand: duplicate content.

Duplicate content is largely an issue because of individuals or sites that use the copy-and-paste method for building content, in hopes of doing nothing more than generating more site traffic. While this is the exception rather than the norm, it happens, and the brains behind the search engine algorithms are acutely aware.

Malicious or not, if your site reuses information or makes the same information available in multiple places, search engines get a little concerned. The algorithms employed by these search engines can't determine which page or information source should be displayed, so they either avoid displaying the pages altogether or rank them lower because they are competing with one another.

It all seems pretty easy to comprehend and avoid, right? Well, it's a bit more complicated than it appears at face value, because the presence and source of duplicate content often go overlooked. Luckily, there are some measures you can take to ensure your site is not unintentionally reusing content. The most common are:

  • Consolidating multiple pages that are similar or contain a lot of the same information into a single page.
  • Avoiding recycled boilerplate copy by making small tweaks in the wording or structure so that it isn’t always identical.
  • Including links back to the original article or content if used on other pages to keep Google from having to figure out which is the original source of the information.
  • Eliminating “printer-friendly” versions of pages that contain the same content.

There are times when there is no way to avoid site content being displayed across multiple URLs. When this occurs, edits can be made on the backend to make sure that traffic is directed accordingly and your site is not penalized. The easiest methods for addressing these issues are:

  • Rel=”canonical” – this tag lets search engines crawl the duplicate URLs, but tells them which version of the page is the preferred (canonical) one, so ranking signals are consolidated rather than split.
  • 301 Redirect – a permanent redirect that sends visitors (and search engines) from the URL with the duplicate copy to the original page.
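
To make the two fixes above concrete, here is a minimal sketch; the URLs are placeholders, and the redirect example assumes an Apache server using an .htaccess file (other servers have equivalent directives). The canonical tag goes in the <head> of the duplicate page and points at the original:

```html
<!-- On the duplicate page, e.g. example.com/products/widget?print=1 -->
<!-- Tells search engines that the URL below is the preferred version -->
<link rel="canonical" href="https://example.com/products/widget" />
```

A 301 redirect, by contrast, removes the duplicate URL from circulation entirely:

```apacheconf
# .htaccess (Apache) – permanently redirect the duplicate URL
# to the original page; "example.com" paths are placeholders
Redirect 301 /old-duplicate-page https://example.com/original-page
```

Use the canonical tag when both URLs need to stay accessible (such as a tracking-parameter variant), and the 301 redirect when the duplicate page can go away for good.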

Those concerned about the ramifications of duplicate content rarely make these mistakes knowingly or with malicious intent, but mistakes are made. Armed with the knowledge to successfully combat these issues, you can now deftly maneuver around these obstacles without feeling the sting.

If you have questions about your website, or would simply like to consider a more proactive SEO approach, we can help! Holland Advertising: Interactive consistently brings its creative expertise to the realm of marketing, with strategic planning and experienced execution. For more information, contact Bryan Holland at 513.744.3001.
