Rise and Shine

Content Rewrites Are Dead


This is not your father’s Google. The Google you once knew and loved is gone; in its place is a new Google with moods as changeable as the sea, one that is constantly searching for another corner of SEO to penalize.

We’ve had Google Panda, which hit content farms; we’ve had the de-indexing of blog networks; we’ve had the new ‘above the fold’ algorithm penalizing ad-heavy layouts; and we’ve had Penguin, which targeted over-optimization and link building. Now Google has its eyes set on another SEO practice it deems no longer kosher – content rewrites.

What Are Content Rewrites?

A content rewrite is pretty much what it sounds like: an article that is simply another article, rewritten. Not a spun article (those bit the dust ages ago), not duplicate content (likewise) – these are articles that reproduce another article’s substance but reword it completely (you know – the way kids do their homework from Wikipedia).

In a recent interview with Stone Temple’s Eric Enge, Matt Cutts announced that, along with infographics, content rewrites would be coming under fire – not because the practice is necessarily ‘wrong’ or immoral, but because it offers no new value over what already exists. Said Cutts:

Those other [rewritten] sites are not bringing additional value. While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank. -Matt Cutts

 


Matt Cutts has lately been the primary source of information about recent and upcoming Google algorithm changes.

What Does This Mean?

Basically, then, if you tend to write articles by looking at another page and rewording exactly what it says, you’re probably going to see your site lose traffic as a result. This is a common practice for people who want to build large numbers of sites quickly, or for people who don’t have strong English or deep knowledge of their subject – so if that’s you, it’s time for a rethink.

On a more general note, what this means – and what most of Google’s recent changes have meant – is that you need to focus on writing original, high-quality content that brings something new to the table and is written first and foremost for your visitors rather than for the search engines. Think long-tail business plans, and think good-quality articles that you write because you have something to say on the subject.

How to Avoid Being Penalized

If you’re worried you might be inadvertently creating article rewrites, the solution is to avoid relying on a single reference source when you research – and not to have any sources open while you write. Instead, read around the web on your chosen topic and see what different sites are saying about it. Then, using the knowledge you’ve absorbed, sit down and write something from scratch with nothing in front of you to guide the process (write your own outline if it helps).

Better yet, try to write about topics that haven’t been covered yet – which means either staying on the lookout for breaking news or coming up with completely original topics of your own. Don’t be afraid of using ‘I’ or ‘we’, or of injecting a little opinion, either. Subjective opinion is a workaround large content producers have used for years to avoid wasting manpower on stories with insurmountable competition in search.

Either way, Google is working every day to make black hat techniques obsolete. Never before has producing content like a normal person been such a great content strategy.

Todd Ramos is currently working full-time as a search consultant for Atlantic.net.

Todd Ramos

Todd Ramos is a founding partner of PenTech Consulting, a web design and SEO services company based in Connecticut.


6 comments

  1. This is fine and dandy as a goal for Google, but until IBM’s Watson is rolled into Google (if ever) I don’t see the search engine being smart enough to detect thoroughly rewritten content, if it is well done. We’re talking about needing a search engine that has true Artificial Intelligence, and is able to understand the true underlying meaning of articles rather than just the mere words that are used. Maybe someday, but not anytime soon.

    • Tim, actually what you’re thinking of is the Semantic Web (also referred to as Web 3.0), and it probably isn’t as far off as you think. Certain information databases (like professional medical databases and psychology references) use markup languages made specifically for data, such as RDF (Resource Description Framework) and OWL (Web Ontology Language), and they do a good job of returning the exact result on the first query.

      Although semantic indexing of the web as a whole is a long way off, smaller datasets have already been semantically indexed. It’s hard to explain how the RDF ontologies I’ve looked at work, but basically the data is linked so that, for every query, probabilities eliminate all but one possible answer. Terms are linked to their own data pages, so a search follows the links until it reaches a dead end – which is hopefully the data you’re looking for, in full. The ontology itself usually resolves vagueness, and when it can’t, the algorithm uses a type of fuzzy logic to identify the best fit.
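      The link-following lookup described here can be sketched as a toy triple store in plain Python – an illustrative simplification with made-up data and predicate names, not a real RDF/SPARQL setup (real stores use URIs and a query language rather than plain strings):

```python
# Toy linked-data store: facts are (subject, predicate, object) triples,
# and a query follows a chain of links until it reaches a literal value.
TRIPLES = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "classifiedAs", "nsaid"),
    ("nsaid", "mechanism", "cox_inhibition"),
    ("cox_inhibition", "description", "blocks cyclooxygenase enzymes"),
]

def follow(start, path):
    """Follow a chain of predicates from `start`; return the final object,
    or None if the chain hits a dead end with no matching data."""
    node = start
    for predicate in path:
        matches = [o for (s, p, o) in TRIPLES if s == node and p == predicate]
        if not matches:
            return None  # dead end: no data linked under this predicate
        node = matches[0]
    return node

# "How does aspirin work?" becomes a walk through linked terms:
print(follow("aspirin", ["classifiedAs", "mechanism", "description"]))
# -> blocks cyclooxygenase enzymes
```

      A real ontology would also disambiguate vague terms and rank candidate matches, rather than simply taking the first link as this sketch does.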

      If you weren’t aware of Semantic search and existing semantic capabilities, you really made a great deduction Tim! Watson is a good parallel. Look at Web 3.0, study up, and come back and predict Web 4.0 for us!!

      The next step, I think, is an engine that deconstructs articles and reconstructs them into hybrid web pages that combine the best (not necessarily the most accurate or relevant, but the best) data from several pages.

  2. Tray

    Once upon a time, when I was learning programming languages, I would buy two, three, maybe four books teaching the same language. I had my favorite authors, but often, when I couldn’t fully understand a concept, I would read from one of the other books. Even though the logic might be exactly the same, another author would say it in a slightly different way. Without these multiple styles of teaching it would have been much harder.

    Recent drops in search quality caused by algorithm and filter updates make me less than optimistic that Google will improve the situation with these impotent attempts.

  3. Bye Bye Content Farms

    Looks like Google wants to crush folks like Examiner.com, mop the floor with them, and throw them away forever, never to be seen again on the internet. It’s not as if writers at Examiner.com get royalties.

    • They do get royalties, don’t they?

      Rewriting is lazy writing, though. A ton of content posted every day doesn’t add anything new – it just rehashes something else, and not just on Examiner. Google isn’t saying you can’t rewrite content; it’s just going to rank original content higher, and I think it has been doing that for a while anyway – the age of a post will often stand up to superior offsite SEO.

