SEOdamian's Blog

Helping others understand & share about technology, search

I Got a Free Blog – That’s Good Link Juice, Right? (part 1)

There are many clichés warning us that there is no free lunch. But that does not mean we understand where the cost of the lunch really lies. If we are used to paying the waiter, but the real cost is elsewhere, it can be confusing.

The challenge is similar in technology, marketing, and other areas that are changing rapidly. In the world of free blogs, the value and the costs are sometimes hard to find. Someone sold DRKent.Com a free blog that looked like a good value. It was bundled with other services, so it did not look like a scam. The creators of this blog generator may not have even known they were selling year-old bananas, but they were. Here are my comments to Dr. Kent about what is happening with his free blog.

I would take down your blog ASAP, or at least all of its links to DRKENT or any other site you want to rank well. It is a clone site, as you mentioned – something I did not know until your email.

Google penalizes links from this site because it is the equivalent of spamming: there are thousands of copies of the site. When I copy your first sentences into a Google search, it returns:

Results 1–100 of about 5,320 for “I can’t believe how much my low back hurts! I don’t know if I can go to work with it like this!” Does this sound familiar? (0.32 seconds)

Over time, Google learned that real content seldom has the same sentences as other sites. Even when pages quote the president, they add enough analysis or commentary to make each page unique. But if the words are exactly the same, Google interprets all the sources as junk. And if those sites are junk, then all the sites they recommend are junk. Basically, if you hang around criminals, statistically you are a criminal. (Of course that leaves an interesting situation for the police and the FBI, and there are a few dozen TV shows and hundreds of movies and books exploring that correlation.) So if you have ‘cheap, duplicate content’ pointing at your site, then you probably don’t have quality content on your site. While Google cannot really read (understand what you are actually intending to communicate), it can statistically analyze what you have shared. It knows that if your friends are bad, there is a greater chance you are bad – just like a background check by the FBI.
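To make the statistical idea concrete, here is a toy sketch of how duplicated text can be detected. This is not Google’s actual algorithm – it is one common, well-known technique (word shingling plus Jaccard similarity), and the sample sentences are illustrative only:

```python
# Toy near-duplicate detection: break text into overlapping n-word
# "shingles" and compare the sets. Identical text scores 1.0; original
# writing scores near 0.0. NOT Google's algorithm, just the general idea.

def shingles(text, n=3):
    """Return the set of overlapping n-word chunks in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "I can't believe how much my low back hurts! I don't know if I can go to work."
clone    = "I can't believe how much my low back hurts! I don't know if I can go to work."
unique   = "Lower back pain is a leading cause of lost work time, and it is treatable."

print(similarity(original, clone))   # identical text scores 1.0
print(similarity(original, unique))  # original writing scores near 0.0
```

With 5,320 pages scoring 1.0 against each other, it is easy to see how a search engine flags the whole cluster as duplicates.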

From Google’s own blog:

There are some penalties that are related to the idea of having the same content as another site—for example, if you’re scraping content (automatically copying the ‘good’ parts) from other sites and republishing it, or if you republish content without adding any additional value. These tactics are clearly outlined (and discouraged) in our Webmaster Guidelines:

  • Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
  • Avoid… “cookie cutter” approaches such as affiliate programs with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

As a way to ‘prove’ that the process works, let’s test it and look at the results. This does not truly follow the scientific method, but as an experiment it has more value than the thought experiments so often passed off as truth on the Internet:

  • When I search for ‘back pain’, none of the 5,320 cloned blogs show up in the top 200 listings.
  • When I search for ‘lost work time’, none of the 5,320 blogs show up in the top 200 listings except the one whose title tag and page name were changed explicitly to include ‘lost work time’. It appears that site was the first of about 70 done with the new format, on August 5, so it got a higher listing. The other 5,320 – including you – lose. Google will sometimes rank the first copy it finds and count the rest as duplicates. So yes, you can game the system, but it may take as much effort as simply following the rules.

When I search for ‘Neck and Arm Pain – The Herniated Disk?’ – the title and heading of another page – one site makes the top 200, at position 12, which is the second page for most searchers. (The click-through rate on page 2 is about 1% of what the top 3 positions get, so unless it is a high-volume search term, that listing is useless.) Ben’s page may have been a test or a mistake, since it has extra characters in the page name and no address or phone number on it – which probably makes it unique.

So the moral of the story is: check whether your free website is worth taking. It may even be costing you (extra calories or wasted time) in ways you do not realize. Typically, for something to be given away free, it needs to be lower quality. On the Internet, quality is easier than ever to determine – a few clicks for a human, even less effort for a computer.


February 15, 2012 | Blogging, copywriting, SEO tools, tools, Uncategorized

Want Traffic? – Don’t Try to Boil The Ocean With Your Website

The concept behind HTML and the web is lots of small chunks.

If you look up the ‘correct length’ of a blog post, it is often listed at 400-800 words. That is typically 5-8 paragraphs covering a single idea in a bite-sized chunk. It is a single idea, and blogs are set up so that each post is its own page.

So when you are designing your site, map it out by focusing on what you are trying to accomplish. Then outline your site with a separate page for each idea. Each page should have a clear purpose. That makes it easy for your visitors to understand what you are trying to communicate, and it keeps them in sync with what you are trying to accomplish.

Of course, your ‘number 1’ visitor is the Google spider – so these ‘rules’ for your human visitors also apply to Google’s spider. If you design for good human readability, then more often than not you will have good Google readability, and Google will reward you with higher rankings accordingly. If your page is focused on a single idea, Google is more likely to spot your keyword phrases, understand what the page is about, and rank it higher than a page that covers 5-6 ideas and is crammed with assorted keyword phrases – Google will ‘read’ that cluttered page and rank it lower for every one of those phrases. There are exceptions, but trying to ‘trick’ Google these days is a hard way to build traffic, and you run the risk of Google shifting its formulas and your site bouncing way down.

So don’t try to boil the ocean with one fire. Create separate pages with single purposes. It is easier for your readers, it is easier for Google, and it will get you more traffic.

Break up your Page

James Michener wrote novels that were great for those wanting a single summer read. They were long and full of detail.  They carried a lot of ideas interwoven together.  They had great plots that kept you following along for hours and hours, page after page. You got great value from all the details painting the complex pictures of his topics and themes.

But the web is not designed for reading long stretches of text sequentially. It is designed for chunking – lots of breaks. Those breaks are headlines and pages. In fact, that is how Google determines what is important: if you label something with an H1 heading – that is your headline – it assumes those words are more important than the little footnote at the bottom of the page. Use a real H1 element rather than just a relatively larger font, for a few reasons:

  • It is sloppy coding that will often come back to haunt you.
  • Google prefers an H1 element that clearly identifies the heading of the page.
  • It displays more consistently across the various browsers including mobile browsers.
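To see why the real H1 element matters to a crawler, here is a small sketch using only Python’s standard-library `html.parser`. The sample page is made up; the point is that a visually large `<span>` never registers as a heading, while a true `<h1>` does:

```python
# A crawler-style pass over a page: collect the text of every <h1>
# element. Styled look-alike text (a big <span>) is invisible here,
# which is the machine's-eye view of the "use a real H1" advice.
from html.parser import HTMLParser

class H1Extractor(HTMLParser):
    """Collects the text content of every <h1> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
            self.headings.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.headings[-1] += data

page = """
<h1>Neck and Arm Pain</h1>
<span style="font-size:32px">This big text is invisible as a heading</span>
"""

parser = H1Extractor()
parser.feed(page)
print(parser.headings)  # ['Neck and Arm Pain'] – the span never appears
```

A human visitor sees two big lines of text; the parser sees exactly one heading.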

Putting a single idea on a page makes it easier for the reader to plan their reading – they can see how big the page is, see the topic, and judge how deep you will go on it. Google also has an easier time ranking your site’s page for that specific topic.

Thanks to for inspiring this idea.

June 18, 2009 | How To, HTML Issue, Internal Organization, Uncategorized

Don’t Be Stingy With Your Pages – We Are Not in the Print World Anymore

One of the challenges for many people trying to create a website is a sense of scarcity of space on the Internet. This goes back to the days of printing, when each impression on each sheet of paper was a significant effort.

The world of the web is different. Space is just about free. Sure, some hosting plans charge slightly more for additional pages, but if that is preventing you from effectively communicating your messages, then you need to reevaluate your hosting (a topic for another post). The amount of server space taken up by most well-designed pages is minimal (if your pages are big enough to measure, they will be too slow to load – shrink your graphics or optimize your code).

So unlike a book or article printed on paper, where ‘white space’ feels like a waste, on a website ‘white space’ is a sign of focus and professionalism. It also makes it easier for the reader to absorb one idea at a time. Use the page as a way to segment ideas. You will notice that complex manuals use this approach, breaking ideas into separate pages, because it keeps each concept cohesive and improves comprehension.

And guess what – Google and other search engines do well with this. They understand and appreciate a page that communicates one idea better than a page that covers 15 ideas and is all over the place. Search engine analysis programs like WebPosition talk about the precise number of words, how often to repeat keyword phrases, the percentage of keyword phrases to total text, and further formulas. But when you get down to it, if you write at a reasonable grade level and cover a single idea rather than a bunch, those numbers magically fall into place. Search engine analysis tools are great for tweaking, but following one idea per page saves a ton of tweaking work later.

So, don’t be stingy with your use of individual pages. Don’t create one long page that covers it all – that is confusing for your visitors and for Google. Create lots of pages; there’s plenty of room in web-space!

June 17, 2009 | copywriting, SEO tools