SEOdamian's Blog

Helping others understand & share about technology, search

New World order – customer perspective 1st, Company perspective 2nd

Next Door Chicago

Which business do you think is behind this site?

Check out this website and their whole approach to marketing. Let me know how long it is before you figure out who’s actually the big company behind the totally different approach to marketing.

Here is what stands out in looking at this as an effective marketing tool:

  • It is fun and colorful.
  • It has movement – both in the rotating graphics and in the varied typefaces.
  • It is inviting, both in its graphics, its ability to share with others, and the ease of finding information.
  • It states what it can do for me, in a non-sales format, well before I ever reach the point of finding out what I can buy from them.
  • It focuses on community and how we can interact locally, rather than with a big mega-corporation.
  • It's quick, concise, and clear; the messaging text is easy to figure out, so I can get on, get off, and move on to my next task at hand.
  • The navigation is easy to follow. While I usually don't like drop-down, chase-the-cursor websites, this one is easy, because the target areas are large and easy to click on, with a single-layer drop-down.

I think the key here is that they are starting from a customer perspective, rather than from a corporate perspective, which is vital for any business these days, especially in working with the younger generation.

This site may or may not be the best for search engine optimization – although it really is not clear exactly what terms they would be trying to optimize for anyway. They do rank at the top for "next door Chicago", which, if that is their brand focus, is a good result. But I imagine SEO was lower on their list of site objectives, and they are more successful at the others.

I would love to have your perspectives on this.

April 1, 2012 Posted by | Chicago, Community, How To, local marketing, Uncategorized

Types of Traffic – It is all the same, right?

A customer is a customer is a customer, right?  No. There is the customer who asks 200 questions, buys your loss leader, and returns it used and unsellable. And there is the customer who comes in and asks, "Can you ship 200 of your most profitable item? Oh, and can you auto-ship if I pay in advance for the next 5 years?"  Which would you prefer more of?

Visitor traffic is similar: not all traffic has the same value to your organization, today or tomorrow. There are different types of traffic to look at when trying to create a successful site, and different dimensions of traffic – what type it is, what quality it is, what it cost to get, and others. But let's just look at one dimension: where the traffic is coming from.

Referral Traffic

This is the traffic created by other sites referring / linking to your site, or by your emails driving traffic to your site.  This is great 'free' traffic.  Referrals from other websites also support your SEO traffic and rankings, because the search engines (SE) respect other sites sending traffic to yours. The more you are listed on other sites, the better your SE (search engine) ranking.  The more related the sites, the better your SE ranking. For example, a site about guns linking to your site on trees is not as helpful as a site about arboretums linking to your tree site.

A way to quantify your ranking is to look at your PageRank (named after Google founder Larry Page). There are a number of tools, including browser add-ons, that will automatically show you your PR (PageRank) on a 0–10 scale (higher is better). Here is one –
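The idea behind PageRank can be sketched in a few lines of code. The tiny link graph below is entirely made up, and real PageRank involves far more, but the core loop – pages passing a share of their own score to the pages they link to – looks roughly like this:

```python
# Simplified PageRank sketch: pages earn score from the pages linking to
# them, weighted by how important those linkers are. The domains are
# hypothetical, chosen to echo the trees/arboretum example above.
links = {
    "trees.example":     ["arboretum.example"],
    "arboretum.example": ["trees.example", "news.example"],
    "news.example":      ["trees.example"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
damping = 0.85  # probability a surfer follows a link vs. jumping anywhere

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
```

Note that `trees.example` ends up on top because two sites link to it – the same reason links from related sites help your ranking.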

SEO – Search Engine Optimization, or Natural Traffic

This is the traffic that comes from your site ranking well in the search engines – people looking for answers to their searches.  The 1st key to creating this traffic is to create content that answers your visitors' questions. The key here is to look at your site from the perspective of your visitors. What are they looking for? Not what you want to tell them. You may shift the answers toward what you want to share, but 1st look at what would be the perfect answer for a visitor coming to the site.

Of course, if you collect the most common questions, you can create a FAQ (Frequently Asked Questions) page – or, better yet, a section of pages (depending on your answers, it may be appropriate to have a single page for each question).  The challenge is to come up with the best answer to meet everyone's needs. But like all good communication, allow the reader to choose how much to read by putting the most important/simplest information first. DO NOT follow the mystery-novel format and keep the best for last.

SEM – Search Engine Marketing

Search engine marketing is when you pay to get traffic to your site.  This can include a variety of different strategies such as:

  • Banner ads on other sites – here you can pay per impression (which may or may not register in the viewer's mind), or per click-through to your site.
  • Email marketing via services like Constant Contact or iContact (a whole other post), to someone else's email list (make sure you are not spamming) or to your own.
  • Google AdWords (buying ads on the search engine)
  • Google AdSense (Google placing text ads on other websites – advertisers buy these placements through AdWords' content network) – this overlaps with banner ads, but with the 600-pound gorilla Google acting as the ad man for you.
  • Ads on other search engines
This traffic can be very profitable or a total waste of money. The great part of this type of traffic is that it is easier to test, and testing is a big part of what you need to do to be successful. Consider it the equivalent of practice in sports, music, or dating: you don't expect to get it perfect the 1st time, and you need to keep improving as your competition keeps getting better.
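As a sketch of what that testing looks like in practice, here is a comparison of two hypothetical ad variants – every number here is invented for illustration:

```python
# Comparing two paid-traffic tests by conversion rate and cost per
# conversion. Both ads and their results are hypothetical.
def cost_per_conversion(spend, conversions):
    return spend / conversions if conversions else float("inf")

ad_a = {"spend": 200.0, "clicks": 400, "conversions": 12}
ad_b = {"spend": 200.0, "clicks": 380, "conversions": 23}

for name, ad in (("A", ad_a), ("B", ad_b)):
    rate = ad["conversions"] / ad["clicks"]
    cpc = cost_per_conversion(ad["spend"], ad["conversions"])
    print(f"Ad {name}: {rate:.1%} conversion, ${cpc:.2f} per conversion")
```

With equal spend, the ad with more conversions wins on cost per conversion – the kind of comparison that lets you keep improving round after round.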

LMT – Local market traffic

Local market traffic is visitors looking for the types of businesses and solutions typically found in a yellow pages directory.  These are usually local businesses, like restaurants and cleaners – businesses that are not trying to market across the country, only to the local community. This traffic is in some ways easier to get, because there is less competition (only a few real estate agents in Downers Grove, compared to millions in the whole country). You don't have to rank higher than the real estate broker in Texas if you are in Wisconsin. But you do need people looking for Wisconsin real estate.  That is where focus is important.  It also requires looking outside the 'normal' spots.  Don't try to rank high for 'real estate' on the search engine.  Instead, try to rank for 'Bloomington real estate' or, even better, 'Bloomington vacation real estate'. And look beyond the big 3 search engines at local sites like the different yellow pages.

What kind of traffic you want will determine what kind of strategies you should follow.

SEO is more about content.

SEM is more about converting the visitor. Of course, you have to define what a conversion is. It may be a sale. It may be a signup for a newsletter. It may be a request for a consultation and setting an appointment. Or it may be a request for a free report. You need to define your conversion funnel (most businesses have multiple conversions over the lifetime of a customer).

LMT is about converting visitors and getting them to call or use traditional business channels. That may be getting them to drive to your store. That may be getting them to call and set an appointment. Or it may just be calling or emailing for a report, or some other trust-building step.

Understand that, just as a traditional sale takes on average 7 touches to close (after building up enough trust), online channels can take multiple steps to build the trust required for your final conversion.  Make sure you design your website presence to convey that.

What about you? Take our long poll – 1 question –

March 15, 2012 Posted by | Definitions, How To, local marketing, SEO tools

I Got a Free Blog – That's Good Link Juice, Right? (part 1)

There are many clichés warning us that there is no free lunch. But that does not mean we understand where the cost of those lunches really lies. If we are used to paying the waiter, but the real cost is elsewhere, it can be confusing.

The challenge is similar in technology, marketing, and other areas that are changing rapidly. In the world of free blogs, the value and the costs are sometimes hard to find. DRKent.Com had someone sell him a free blog that looked like a good value. It was bundled with other services, so it did not look like a scam. The creators of this blog generator may not even have known they were selling year-old bananas, but they were.  Here are my comments to Dr. Kent about what is happening with his free blog.

I would take down your blog ASAP, or at least all its links to DRKENT or any other site you want to rank well. This is a clone site – something you mentioned that I did not know until your email –

Google penalizes links from this site: with thousands of copies of the site out there, it is the equivalent of spamming. When I copy your 1st sentences into Google search, it returns:

Results 1–100 of about 5,320 for "I can't believe how much my low back hurts! I don't know if I can go to work with it like this!" Does this sound familiar? (0.32 seconds)

Over time, Google learned that real content seldom shares whole sentences with other sites. Even when two sites quote the president, there is enough other analysis or commentary to make each page unique. But if the words are exactly the same, Google treats all the sources as junk. And if those sites are junk, then all the sites they recommend are junk too. Basically, if you hang around criminals, statistically you are a criminal. (Of course, that leaves an interesting situation for the police and the FBI, but there are a few dozen TV shows and hundreds of movies and books exploring that correlation.) So if you have 'cheap, duplicate content' pointing at your site, then you probably don't have quality content on your site. While Google cannot really read (understand what you are intending to communicate), it can statistically analyze what you have shared. It knows that if your friends are bad, there is a greater chance you are bad – just like a background check by the FBI.
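One common way to model that statistical analysis is "shingling": break each page into overlapping runs of words and compare the sets. This is only an illustrative sketch, not Google's actual algorithm, and the tokenization is deliberately crude:

```python
# Duplicate-content detection sketch: two pages that share many 5-word
# "shingles" are probably copies of each other.
def shingles(text, size=5):
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    # Jaccard similarity: shared shingles over all shingles
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

clone = ("I can't believe how much my low back hurts! "
         "I don't know if I can go to work with it like this!")
original = "Our clinic offers personalized treatment plans for every patient we see."

print(similarity(clone, clone))     # identical pages score 1.0
print(similarity(clone, original))  # unrelated pages score near 0
```

Thousands of pages scoring 1.0 against each other is exactly the pattern that gets a clone network flagged.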

From Google's own blog:

There are some penalties that are related to the idea of having the same content as another site—for example, if you’re scraping content (automatically copying the ‘good’ parts) from other sites and republishing it, or if you republish content without adding any additional value. These tactics are clearly outlined (and discouraged) in our Webmaster Guidelines:

  • Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
  • Avoid… “cookie cutter” approaches such as affiliate programs with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

As a way to 'prove' that the process works, let's test and see the results. It is not truly the scientific method, but as an experiment it has greater value than the thought experiments so often passed off as truth on the Internet:

  • When I search for ‘back pain’, none of the 5,320 cloned blogs show up in the top 200 listings.
  • When I search for 'lost work time', none of the 5,320 blogs show up in the top 200 listings, except for the one site whose title tag and page name they changed explicitly to include 'lost work time'.  It appears he was the 1st one they did with the new format on August 5, out of about 70, so he got a higher listing. The other 5,320 – including you – lose.  Google will sometimes rank the 1st site it finds and count the rest as duplicates. So yes, you can game the system, but it may be as much effort as just following the rules properly.

When I search for 'Neck and Arm Pain – The Herniated Disk?' – the title and heading of another page – one site gets listed in the top 200, at position 12, i.e. on the second page for most searchers (the click-through rate on page 2 is about 1% of that for the top 3 positions, so unless it is a high-volume search term, useless). Ben's page may have been a test or a mistake, since it has extra characters in the page name, and it does not have an address or phone number on it – probably making it unique.

So the moral of the story is: check whether your free website is worth taking. It may even be costing you (extra calories or wasted time) in ways you do not know. Typically, to be given away free, something needs to be lower quality. On the Internet, quality is easier than ever to determine – with just a few clicks by a human, or even less effort by a computer.


February 15, 2012 Posted by | Blogging, copywriting, SEO tools, tools, Uncategorized

KISS – Keep It Simple and Short

One of the best practices for website conversion also lines up (often) with better rankings on the SERP (Search Engine Results Page): simple language and good readability scores.

The issue is that once you get someone to your site, it needs to be easy to read. There are exceptions, but how often are the buyers (of your product or ideas) really looking to work hard to understand what you have written? Of course, Google is no genius either (although many who work there are). It is designed to look at your site the way people without sophisticated degrees and high-end language skills would. Of course, I am like many who like to slip into the shorthand of our vernacular and the jargon of each industry. But I do so at my own peril – and increasingly, out of my own laziness. MS Word has analyzed the reading level of your documents for at least a decade, and there are online tools as well – often free, quick, and easy to use.

Understand that the Wall Street Journal writes its content at an 11th-grade level. Most novels are written at the 8th- to 10th-grade level (remember, we had to read them in high school). Yet look at most websites: they are written at a graduate level. How many websites have you seen with humongous long words, utilizing complex sentence structures reminiscent of academic papers that put everyone to sleep (bad example intended)?
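Those grade levels come from formulas like Flesch–Kincaid, which you can approximate yourself. The syllable counter below is a crude vowel-group heuristic, so treat its scores as rough estimates rather than official readings:

```python
# Rough Flesch-Kincaid grade-level sketch. Grade rises with longer
# sentences and longer (more-syllable) words.
import re

def syllables(word):
    # Crude heuristic: count runs of vowels as syllables, minimum 1.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syl = sum(syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syl / n_words) - 15.59

simple = "The cat sat. The dog ran."
academic = ("Notwithstanding considerable methodological heterogeneity, "
            "comprehensive organizational readability initiatives "
            "demonstrably facilitate communication.")
print(fk_grade(simple), fk_grade(academic))
```

Run it on a paragraph of your own site and a paragraph of a newspaper story, and compare.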

Here are a few based on a quick search (until I get back to finding the best in class):

In fact, the readability of a website is important enough that Google will allow you to control what sites you see in the results based on reading level in their advanced search options:

Improved readability will also improve your site traffic in other ways: happier readers will recommend your site more often. Easy-to-read writing allows your content to be shared with a wider audience. Those who can understand more complex writing can still understand your simpler text, and those who don't have 10 years of reading Shakespeare will also be included. It also makes your content easier for everyone to read and digest quickly. Just because a lawyer can read complex statutes does not mean he or she prefers tangled prose when learning about your ideas.

So simplify your writing and widen your audience.

If you don’t believe that readability can help your site, do a little test and then let me know the results in the comments section.

December 26, 2010 Posted by | copywriting, How To, tools

What is Google Analytics?

Once you have these (the various other suggestions and strategies for website optimization) in place, make sure you have Google Analytics, or some other way to measure your traffic, to determine whether the problem is not enough people getting to your site, or not enough people taking action on your site.

Google Analytics is a free resource from the wonder-people at Google.  They realize that the more you improve your website, the more people will search, because they get better answers. Plus, if you make money or have success with your website (readers, subscribers, callers – however YOU clearly define a successful website), you will invest more into the web, including marketing. Hey – Google sells some marketing with Google AdWords (and makes a TON!). So Google offers a bunch of free tools and information to improve your traffic, so it can make more money.

Every time you (or anyone) go to a webpage, you send a request to the webhost's server for something – the text, an image, some Flash, a sound file – whatever.  To keep it all straight, the request also records who asked for the info (so the webhost knows whom to send the answer to), where you were last (to help maintain continuity), the time, and what you asked for. Most webhosts can keep a log of all those requests and may have programs that turn those computer-geek files into pretty charts, graphs, and reports. But more and more, the simpler solution that most small, medium, and large (if not gigantic) sites use is Google Analytics.

A big reason medium and large sites use Google Analytics is that those log files can get really big (more than a DVD's worth of data), so handling the whole file can get cumbersome even for a fast computer. Imagine a website where the average page has 9 photos, a typical visitor views 5 pages, and there are 1,000 visitors a month (about 33 visitors per day) – that is some 50,000 log entries for a relatively low-volume site. Imagine a spreadsheet with that many rows. Google has lots of computers and hard drive space to handle that, but many office computers start to bog down. Oh, did I mention another big advantage? Google Analytics is FREE.
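The back-of-the-envelope math in that example is simple enough to write down:

```python
# The log-volume estimate from the paragraph above, spelled out.
requests_per_page = 1 + 9        # the HTML page itself, plus 9 photos
pages_per_visit = 5              # a typical visitor views 5 pages
visitors_per_month = 1000        # about 33 visitors per day

log_entries = requests_per_page * pages_per_visit * visitors_per_month
print(log_entries)  # 50000 entries a month for a low-volume site
```

Scale any of those numbers up by 10x and the spreadsheet quickly stops being something an office computer wants to open.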

You can use Google Analytics by following Google's instructions. Of course, you have to put the code on each page to get full value (just like you have to put cameras at all the doors of your store to get full value from any one camera – you want to know who is visiting through every entrance).

Once you start getting your reports you can start to analyze what your visitors are doing when they get to your site:

  • Are they looking at one page and leaving?
  • Are they starting a shopping cart, getting frustrated and leaving?
  • Are they looking at your service descriptions and then looking at the prices (and leaving)?
  • Are they going straight to terms and conditions page or to the price page?
  • Are they looking at the comparison page comparing the free version and the paid version without downloading either?

These are all tools for understanding what you need to do to improve your site to meet your customers' needs and improve sales. After playing detective about why your visitors come and go without leaving a name or credit card number, comes the creative part: improving the website to meet the needs of your customers. If this is your business, then this part should be your passion – how to serve customers, turn them into Raving Fans (Ken Blanchard), and improve your Net Promoter Score (NPS). If this is not your passion, then ask your best customers for their honest reactions to your site while watching them navigate through it the way they want to.

November 14, 2009 Posted by | Definitions, hosting, Uncategorized

Google continuing to grow

I just heard that Google has reclaimed the market share it lost to Bing while Microsoft was spending tons to launch its newly revamped search engine.

It seems that many people checked out Bing and decided to stick with Google. Keep this in mind as you dedicate your resources to optimizing your sites for the search engines.

October 7, 2009 Posted by | Uncategorized

Is Google The Only Universe or Just the Center?

I was talking with an SEO expert recently who was commenting on how the whole Internet marketing industry is so Google centered, and the extreme power they have on the industry.

While I agree that the economic engine Google creates for anyone marketing on the web is huge, I had to kindly disagree that it is all Google, or that their relative power is growing.  I pointed out that numbers in DM News show the amount of internal traffic (self-generated rather than search-engine-generated) on Facebook and LinkedIn has become so substantial that it is shifting PPC (Pay Per Click) pricing models. This creation of 'internal' traffic is part of a continuing movement toward user-generated content (UGC). UGC is a major component of Web 2.0 or Web 3.0 (depending on whose definitions you use). And looking at the statistics on how many HOURS people spend on Facebook per day and week clearly shows that the power of people writing about what interests them has a major impact on the overall web.  It is no longer just what the professionals write and what Google feels we want.

The retailer Amazon recognized this shift of power from the corporation as well.  Look at their site these days and you will quickly see 3 main sources of content:

  • Publisher-provided – title, price, ISBN, and the editor's review.  The facts are seldom disputed, but the ratings on the editor's review show that everyone understands the publisher's editor always loves its own book.
  • Mined content – pulled from the content of the book: top phrases, keywords, number of pages.  This content reflects Amazon's ability to use computers to infer real information just by counting and running programs against the data of the book.  The actual information and wisdom come from a visitor taking these snippets and seeing answers that are useful.
  • User-generated content – even the rating of the editor's review is user-generated. But the other reviews, and the ratings of those reviews, are where Amazon's gold mine of content and traffic trumps most other retailers.  More and more, Amazon becomes the Wikipedia of the card catalog – UGC. It provides more information than the professional abstracts and paid summaries found in the old dusty paper card catalogs or their digital equivalents these days. The reviews can be biased, but their ratings and openness allow their value to be taken in context.

That last source – UGC – is such a gold mine that Amazon went and bought a collection of it for future use: all the UGC about movies and TV shows at IMDb (the Internet Movie Database), which was predominantly UGC (and what was not came close, as the staff generating the content were, according to rumor, mostly paid low wages for a labor of love).

This is all a long way of saying that UGC is one of the forces with the potential to knock the powerhouse of Google off its throne and leave space for all of us to consider different sources of 'truth' and the wisdom of crowds. This will affect how we optimize our sites.  More and more, it will be the UGC that is key.  UGC is a force beyond our control, but well within our influence.  It is one of the many areas where I have seen the models change – managing volunteers compared to professional staff.  There are commonalities, but there are differences.

How have you experimented with User Generated Content and what were the results?

July 22, 2009 Posted by | Community, SEM Industry, SEO tools

Google Spiders-I’m arachnophobic, and I don’t think I want spiders in my website

A reader asks:

Ewww! I’m arachnophobic, and I don’t think I want spiders in my website. What are they anyway, what do they do, and how do they work?

But you do want spiders all over your website – search engine spiders crawling all over it. While real-life spiders eat bugs, Internet search engine spiders bring visitors to your website. You do want visitors, don't you?  Otherwise, why post on the web? (Well, actually there are some good reasons, but that is a later post.) Back to search engine spiders.

Search engine spiders are computer programs that look at web pages, lots and lots of web pages. And they create the building blocks for the results we all see when we enter a search phrase at Google, Yahoo, Bing, GoodSearch or other search engines.

So how do search engine spiders work?  Well, a lot of the process is a black box – something goes in, magic happens, something different comes out.  The process is often referred to as 'crawling' a site, as the spider seemingly wanders the web trying to understand what each web page is about.  But I will try to shed some light on it.

  • The search engines have a 'sign-up page' where you can register one page of your website.  Google's is at Google, and Yahoo's free submission is at Yahoo (there is paid submission, but that is another post).
  • The search engine then makes a list of all the registrations.
  • It then gives the list to a spider.  Again remember the spider is just a computer program, so this list is in essence a batch file of  ‘your work for today is to look at these websites’.
  • Starting at the top of the list the spider ‘goes’ to the 1st page in the list.  Just like you can surf all day and never leave your chair, but still travel the world, a spider never leaves Googleplex or YahooVille or the Bada-Bing.
  • It loads the page from the list, just like your browser does.  Only rather than looking at how it displays, it looks at the code that creates the page. You can see the code for this page (or most pages) by choosing View Source in most browsers.
  • The spider then collects all the words, meta tags, ALT tags, and TITLE tags.
  • It then runs another quick program (remember, the search engine is trying to look at the ENTIRE web as often as possible).  This program boils the page down to the keyword phrases it is about, and assigns a strength or rating to each phrase. So a website for a business in Wisconsin gets some rating for 'Wisconsin' because the office address is there (2349 E. Ohio Ave, Milwaukee, WI 53207). But a site about tourism and the history of Wisconsin gets a much higher rating for 'Wisconsin'.
  • It then files all these keyword phrases and ratings about them for later.
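A toy version of that "boil the page down" step just counts words, skipping common stop words. Real engines weight titles, headings, and link text far more heavily; this is purely illustrative:

```python
# Crude keyword-rating sketch: how often a word appears on the page
# serves as its "rating" for that page.
from collections import Counter
import re

def keyword_ratings(page_text):
    words = re.findall(r"[a-z']+", page_text.lower())
    stop = {"the", "a", "an", "and", "of", "is", "in", "to", "for", "visit"}
    return Counter(w for w in words if w not in stop)

page = "Wisconsin tourism and the history of Wisconsin. Visit Wisconsin."
print(keyword_ratings(page).most_common(1))
```

A page that mentions Wisconsin three times out-rates a page that only carries it in a mailing address – the same intuition as the example above.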

After it makes a list of all the keyword phrases and their ratings, the spider then:

  • Looks for all the links to other pages.
  • It adds these links to its list of to-dos ('your work for today is to look at these websites'), along with additional pieces of information:
  1. What the page containing the link was about, in keyword phrases.
  2. What the information around the link says about the new page:
  • Is there text that is linked or is it just the URL?
  • Is there a linked image?
  • What is the ALT text about that image?
  • What is the image name?
  • What is the text around the link?

These are the clues that we as humans and the Search engine spiders use to determine what this linked page is about.  It collects all that information and uses it to ‘prejudge’ what this new page is about.

The search engine spider has now ‘crawled’ one page.

After building a list of all the linked pages on this page, it starts to look at all of these new pages, one by one. If you have 5 pages it may look at all 5 pages; if you have 5,000 pages it may look at them all. (Of course, it may get tired or 'bored' – again, another post.)  If you think of a line being drawn to each new page, including some drawn 'backwards' to previous pages, you can start to envision a web of lines connecting all the different pages.  This web is where the search engine spider gets its name.
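The crawl loop described above can be sketched as a simple program working through a to-do list. The 'web' here is a made-up in-memory dict mapping each page to the pages it links to:

```python
# Toy model of a spider's crawl: start from one registered page, follow
# every link, and skip pages already visited. The four-page "web" is
# hypothetical.
from collections import deque

web = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "widget"],
    "widget": ["products"],
}

def crawl(start):
    todo, seen = deque([start]), set()
    while todo:                       # "your work for today"
        page = todo.popleft()
        if page in seen:
            continue
        seen.add(page)                # here a real spider would index the page
        for link in web.get(page, []):
            todo.append(link)         # add discovered links to the to-do list
    return seen

print(crawl("home"))
```

Note that only "home" was registered, yet the spider reaches every page – as long as some chain of links points to it. A page nothing links to would never appear in `seen`.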

You can see that if other websites point to your site, the spider should eventually find you. But if you are an island, and no one links to your site, the search engine may never find you unless you register with it. The spider is not like an airplane circling the ocean looking for islands.  It needs to be pointed to an island at least once – by someone registering the site, or by another site (that the spider visits regularly) pointing to you.

Of course, at some point the spider runs out of time for the day and needs to return its results to the nest, to be merged with those of the many spiders looking at other websites.  There, the ratings from the different spiders' web pages are all merged together and the rankings are updated.  This merging also takes into consideration other websites linking to your website – if a 3rd party felt your site was important enough to link to, it is usually more important than a page no one has linked to.

Way back in the early days of the WWW (1996–2000), spiders would actually go out primarily at night (by California, USA standards). When I analyzed the logs of different clients, I could see the spiders coming in the 'wee hours' of the morning.

Log files are the records kept by the hosting computer where your website lives, listing every single visitor to your site. They record when each visitor came, what page they looked at, and where they came from. There are programs that take these logs and make them easier to understand, including Google Analytics and WebTrends.

Now, of course, the spiders are out searching around the clock, trying to keep up with the vast, changing content of most of the web – especially the 'good' stuff.  So the search engines have a bias toward new content over old in our ever-changing world.  That is why your site's rankings can change minute by minute, as different spiders come back to the nest and report how a site has changed its content, links out, or links in.  Other sites may have gotten better or worse ratings for a keyword phrase; if yours has not improved, your ranking will be affected.

At some point, after the spiders conquer the new-website list, they will go back to websites on their existing list and revisit them, looking for pages that have changed. The changes could add or delete links to other pages, or add or delete information on the page or how it describes other pages.  The spider updates the information in its master list and lets the 'nest' re-rank all the websites.

Hopefully that helps clarify how spiders work and why you need to be descriptive in your words to get good rankings of your website.

July 3, 2009 Posted by | How To, Simple | 2 Comments

Email is a Traffic Generator?

Your personal email is a great way to improve your site traffic, for at least two major reasons:

  • You are sending a personal message to someone. You should have a great deal of trust in what you say and recommend. Your signature link to a site leverages that trust.
  • It helps create a mindset of tailoring your site – optimizing your links and ALT tags to communicate effectively to Google what your pages are about.

If you can't put a good reason in your email for the people you are emailing to visit your online presence, then you are doing something wrong. Sorry to be so direct, but is it that hard to have something worthwhile to share with those you are emailing?

If you are trying to promote or market yourself and you cannot offer a soft-sell reason for someone to visit your online presence (website, Facebook page, LinkedIn page, any of your blogs, or places you have posted on someone else's blog), then you are either too selfish – not helping others out by sharing your information – or way too shy about your knowledge and wisdom.

To paraphrase podcaster Douglas E. Welch: ‘if you have one more piece of knowledge than someone else in some area,’ then you are an expert.

Your expertise is what you want to be sharing through your online presence.  That presence should be linking back to your website. As you share your expertise, you should ‘naturally’ be creating more content that the search engines can use to understand why you should rank high in their listings.

So the key is to be altruistic and give your expertise and wisdom away.  Think about how you would ‘gently’ let others know about your knowledge and put links in your email signature.  Then look at how to describe that wisdom in short snippets –

  • 5 ways to work more efficiently
  • 3 ways to lose weight
  • My favorite wines in the last year
  • How I improved my business 15%
  • How I improved the health of over 1000 patients
  • How I helped 200 people save money with xx product
  • My favorite flowers for clay soils in the midwest

These quick descriptions belong in your email signature. They also belong in your blog posts, comments where appropriate on other sites, and in your links to specific pages on your site.  These descriptions also tell Google what your pages of content are about.

These links should not just point to your home page, but to specific information on your online presence. I would recommend creating a catalog of signatures and rotating them on a regular basis, so that people know to pay attention to the extra ‘nugget of knowledge’ you give in each of your emails – that little bit extra you give in each interaction with your visitors/clients/friends/community.
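As a rough sketch, one entry from such a catalog of signatures might look like this (the name, tagline, and URL are hypothetical):

```html
<!-- One rotating email signature; swap the 'nugget' link on a regular basis -->
<p>
  Jane Smith | Master Gardener<br>
  This week's tip:
  <a href="http://example.com/clay-soil-flowers">My favorite flowers for clay soils in the midwest</a>
</p>
```

Note the link text itself is one of the short, descriptive snippets from the list above, and it points to a specific page rather than the home page.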

Start thinking about helping others with your email signature, and you can help yourself.

Thanks to Dr. Kent Christianson for the inspiration for this blog post.

June 22, 2009 Posted by | How To, Simple, Uncategorized | , , , , , , , , | Leave a comment

Want Traffic? – Don’t Try to Boil The Ocean With Your Website

The web and HTML are built around the concept of lots of small chunks.

If you look at what is listed as the ‘correct length’ of a blog post, it is often 400-800 words.  This is typically 5-8 paragraphs covering a single idea in a bite-sized chunk. It is a single idea, and blogs are set up so that each post is its own page.

So when you are designing your site, map it out by focusing on what you are trying to accomplish.  Then outline your site with a separate page for each idea.  Each page should have a clear purpose. This makes it easy for your visitors to understand what you are trying to communicate to them, easier for you to accomplish your purpose, and easier for your visitors to be in sync with it.

Of course, your ‘number 1’ visitor is the Google spider – so these ‘rules’ for your human visitors also apply to Google’s spider. If you design for good human readability, then more often than not you will have good Google readability, and Google will reward you with high rankings accordingly. If your page is focused on a single idea, Google will more likely see your keyword phrases, understand that your page is about that topic, and rank it higher than a page that covers 5-6 ideas and is crammed with various keyword phrases.  Google will ‘read’ that crammed page and rank it lower for each of its multiple keyword phrases. There are exceptions, but trying to ‘trick’ Google these days is a hard way to build traffic, and you run the risk of Google shifting its formulas and your rankings bouncing way down.

So don’t try to boil the ocean with a single page; create separate pages that each have a single purpose.  It is easier for your readers, it is easier for Google, and it will get you more traffic.

Break up your Page

James Michener wrote novels that were great for those wanting a single summer read. They were long and full of detail.  They carried a lot of ideas interwoven together.  They had great plots that kept you following along for hours and hours, page after page. You got great value from all the details painting the complex pictures of his topics and themes.

But the web is not designed for reading long sections of text sequentially.  It is designed for chunking – lots of breaks. Those breaks are headlines and pages.  In fact, that is how Google determines what is important: if you label something as an H1 heading (that is your headline), Google assumes those words are more important than the little footnote at the bottom of the page.  Use a specific H1 heading; do not just use a relatively larger font, for a few reasons:

  • It is sloppy coding that will often come back to haunt you.
  • Google relies on the H1 heading to clearly identify the main headline of the page.
  • It displays more consistently across the various browsers including mobile browsers.
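In markup terms, the difference looks like this (a minimal sketch):

```html
<!-- Sloppy: looks like a headline to humans, but not to Google -->
<span style="font-size: 24px; font-weight: bold;">My Favorite Flowers for Clay Soils</span>

<!-- Semantic: an actual H1 tells Google this is the page's headline -->
<h1>My Favorite Flowers for Clay Soils</h1>
```

Both may render similarly in a browser, but only the second tells the spider which words carry the page’s main topic.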

Putting a single idea on a page makes it easier for the reader to plan their reading – they can see how big the page is, see the topic, and judge how in-depth you will be going. Google also has an easier time ranking your site’s page for that specific topic.

Thanks to for inspiring this idea.

June 18, 2009 Posted by | How To, HTML Issue, Internal Organization, Uncategorized | , , , , | Leave a comment