SEOdamian's Blog

Helping others understand & share about technology, search

What 1st Steps as New IT Manager Would You Take?

From a post I commented on in LinkedIn

You are assigned the position of IT Manager in a new company. You don’t know the business or the developers you’ll be managing. What are the first steps you would take?

A few additions on my part:
1) Look at the ‘written’/published documentation (preferably first) of stated goals, visions, etc. They may not be accurate, but someone wanted them at some point. Ideally you have time (ha) to incorporate this perspective as you ask others how they view the issues.

2) Being ‘new to the business’ may imply new to the industry. Find the top 3-6 periodicals for the industry. Read as much of the last 6 months as you can to learn the jargon and the issues coming down the road in the next 2-3 years. If you are not at the leading company in the industry, these will tend to indicate what needs to be done to stay competitive in the near future.

3) Take a look at previous reviews for your developers. These indicate what previous perspectives were. Even if they are totally off base, the people reviewed may still be battling the criticisms in those reviews. This also gives you a point of reference for issues other staff may raise.

4) I would look for what may be written about your customers – whether that is blogs, press releases or trade journals for external customers, or wikis, newsletters, blogs and annual reports for internal customers. What is written can be a starting point for conversations about how others perceive and understand the situation.

5) I suggest the SPIN approach to conversations:
– What is the Situation?
– What is the Problem(s) created by that situation?
– What is the Implication of the situation – the ‘So What’ of the situation?
– What is the Need of the situation? This lets you set your priorities, as described in previous posts.

6) Keep a sense of humor – seldom is it life or death, and your attitude can build far more bridges than ‘perfect’ knowledge or a perfect plan written yesterday.


July 28, 2009 | Uncategorized

Large SKU Sites-Understanding the True Need with a SPIN Through the Warehouse and Website (part 2)

The next few blog postings are some of my thoughts on the process I would take to begin the project.

Understanding the True Need with a SPIN Through the Warehouse and the Website

Like any project I take on, I often use the SPIN model to verify that I am solving the correct issues. It is far easier to jump into solutions for problems that do not exist. Or they exist, but no one wants them resolved. Or they want them resolved, but there are far bigger issues that will kill a business long before the smaller issues become core to it. Those ‘other issues’ exist due to specific agreements, internal or external, that may have been around far longer than the current players. That does not mean the issues should not be reviewed, only that there may be valid reasons not to focus on them at this time. When the flood waters are rising, it is not the time to talk to the architect about the new sunroom addition. Although it might not be a bad time to make some mental notes on how to plan and prevent or mitigate flood issues for next time.

When taking on a large SKU website project, it is core to prioritize based on the true need of the business.

SPIN – Situation, Problem, Implication, and Need.

This model is based on the series of books by Neil Rackham. It is the easiest model I have found for teaching the concepts of truly understanding the needs of a project/customer/client/company/patient/department.

Situation –

is about understanding where you are, and how you got to this point. Of course the ideal way to learn the situation is a nice written history with all the significant details, and all the insignificant fluff edited out. No bias enters into the conversation, and all the skeletons in the closet are clearly laid out and labeled as such. The reasons for past compromises are identified and resolved for a clear path to future success with a current team that has no affinity for past missteps. This history is agreed on by all stakeholders, with no animosity or grudges. Management and the line teams all agree on all the issues and the relative importance of each. Oops, I was drifting off to the land of ‘yeah right’ – and there are no budget issues either. But this is the information we are searching for.

Situations are messy. On every project I have ever participated in, everyone did the best they could at the time, based on the resources they had. Those resources include:

  • time,
  • best practices of the time,
  • money (always in limited supply),
  • information,
  • know-how,
  • sense of vision and purpose.

And of course hindsight is 20/20, so we should be able to see where we could have improved from this vantage point. This step is not meant to be a witch hunt, but a truthful assessment of where we are today and the major issues that contributed to our being here. This step will look at the SWOT (Strengths, Weaknesses, Opportunities, Threats) of where we are relative to where we are planning on heading. Sorry about all the acronyms, but they are great tools for rolling up a lot of concepts into an easier-to-manage format.

S W O T :

  • S – Strengths – What is currently working well in the current situation? Where do we excel over our competition?
  • W – Weaknesses – Why is our competition currently getting any customers that we would want? What are we lacking from a CUSTOMER perspective? What are we lacking from an internal perspective that is making us work harder and not smarter?
  • O – Opportunities – The world is a changing place. And changing fast. What can we do to improve our abilities, meet client needs, make our job easier and reduce cost?
  • T – Threats – Everyone is looking to capitalize on our success. Ideas are a dime a dozen. So what do we need to recognize that others will be doing better than us, in rapid succession or even longer term? What do ‘they’ have that seems to give an inherent advantage over us? What can they do to ‘buy’ an advantage over us, and what can we do to counter that? Can we mitigate the threat by working in partnership with them?

If a manufacturer is looking to sell direct to our customer base, can we partner with them to become their exclusive fulfillment house? That comes from a SWOT analysis of understanding that our:

  • Strength is in fulfillment and customer service.
  • Weakness is in high-volume single orders – our system is not optimized for sending out 10,000 single-SKU orders of the green bag.
  • Opportunity is to partner with the manufacturer’s knowledge to get product quicker, more directly and cheaper.
  • Threat is that if they can set up their own system, they have the increased cost savings of manufacturing the product themselves to ‘fund’ a direct distribution process. Additionally, many of their orders are already small individual orders requiring a lot of fulfillment strength.

Next I will explore the Problem part of the SPIN approach to need identification.

July 24, 2009 | How To, Large SKU site, SPIN, Uncategorized

Optimizing Large SKU Count Retailer vs. Boutique Retailer On The Internet (part 1)

Large SKU Count Retailer vs. Boutique Retailer On The Internet

Recently a broker asked me how I would handle setting up a new Search Engine Optimization and Search Engine Marketing position for a large SKU (Stock Keeping Unit – an individual part number) count retailer/direct marketer with an existing successful book of business.

This is a different kind of project than someone starting up a website on Yahoo.com or GoDaddy.com. A typical Yahoo shopping cart has 20-200 individual product numbers. There are many other vendors of shopping carts, but the cart that Yahoo bought years ago was the strongest for many years. A ‘Yahoo’ type cart suits someone who is selling 5 or 20 product lines with a half dozen colors or sizes of each. That kind of site allows for a mentality of treating each SKU or product number as a gem. With a small SKU count you are trying to create a relationship with each individual audience. There is a reason why Suzy likes the large blue one and Ryan likes the small burlap one. The optimization of effort and concentration on solutions is different than for a large SKU count retailer.

The large SKU count retailer sees the world in terms of the solutions – not one solution but many – that he provides his clients through a large selection of products. His value is not his ability to choose the ‘one’ product a client needs, but in anticipating the multiple needs of his clients with the proper products at the moment his clients need them. His job is to find, source, stock and share information on the products his clients need before his clients know they needed them. While this retailer is:

  • being a mind reader,
  • attending tradeshows,
  • meeting with vendors and
  • separating the flash-in-the-pan products that will haunt him for years with reputation-ruining returns (or worse) from the next wonder products that will later prove to be obvious winners,

the retailer has to stock the standards – those products his clients count on him to have, the nuts and bolts of the industry. All of this requires a mindset of looking at the bigger picture, but also a process designed to store the data so it can be managed as a whole as well as individually by item. It needs to be collected so that new items can be distinguished from current items. A bolt is not just a bolt to someone concerned about corrosion on a salt truck holding a snow plow blade, a boom on a sailboat crossing the ocean, or a motorcycle on a cross-country tour.

The large SKU count retailer needs to first understand how he will manage his inventory and back office system before even considering how he will sell out the front end, whether through a retail store, phone and mail order catalogue, or Internet online sales. The back-end management wags the dog, compared to a boutique retailer where the front-end store is the dog.

While much can be written about tying the back end to the front end and working in partnership, I will focus more on a role directed at improving front-end results through marketing. By marketing I am referring to all activities with new and existing clients that create:

  • greater lifetime value,
  • higher referral rates,
  • greater profit per sale,
  • reduced cost per acquired customer,
  • reduced cost per sale.

The next few blog posts are some of my thoughts on the process I would take to begin the project.

July 23, 2009 | How To, SEM Industry, SEO tools

Is Google The Only Universe or Just the Center?

I was talking with an SEO expert recently who was commenting on how the whole Internet marketing industry is so Google-centered, and the extreme power Google has over the industry.

While I agree that the economic engine Google provides for anyone marketing on the web is huge, I had to kindly disagree that it is all Google and that their relative power is growing. I pointed out that numbers in DM News show the amount of internal traffic (self-generated rather than search-engine generated) on Facebook and LinkedIn has become so substantial that it is shifting PPC (Pay Per Click) price models. This creation of ‘internal’ traffic is a continuing movement toward user generated content (UGC). UGC is a major component of web2.0 or web3.0 (depending on whose definitions you use). Looking at the statistics on how many HOURS people spend on Facebook per day and per week clearly shows that the power of people writing what interests them has a huge impact on the overall web. It is no longer just what the professionals write and what Google feels we want.

The retailer Amazon recognized this shift of power from the corporation as well.  Look at their Amazon.com site these days and you will quickly see 3 main sources of content:

  • Publisher provided – title, price, ISBN number and editor’s review. The facts are seldom disputed, but the ratings on the editor’s review show that everyone understands the publisher’s editor always loves its own book.
  • Mined content – pulled from the content of the book: top phrases in the book, keywords in the book, number of pages. This content reflects Amazon’s ability to use computers to infer real information just by counting and running programs against the data of the book. The actual information and wisdom comes when a visitor takes these snippets of information and sees answers that are useful.
  • User Generated Content – even the rating of the editor’s review is user generated. But the other reviews and the ratings of those reviews are where the gold mine of content and traffic lets Amazon trump most other retailers. More and more, Amazon becomes the Wikipedia of a card catalog – UGC. It provides more information than the professional abstracts and paid professional summaries found in the old dusty paper-based card catalogs or their digital equivalents these days. The reviews can be biased, but their ratings and openness allow their value to be taken in context.

That last source – UGC – is such a gold mine that Amazon went and bought a collection of it for future use: IMDB.com (Internet Movie Database), all the UGC about movies and TV shows. It was predominantly UGC, and what was not nearly was, as the staff generating the content were – according to rumors – mostly paid low wages for a labor of love.

This is all a long way of saying that UGC is one of the forces that has the potential to knock the powerhouse of Google off its throne and leave space for all of us to consider different sources of ‘truth’ and the wisdom of crowds. This will affect how we optimize our sites. More and more, it will be the UGC that is key. UGC is a force beyond our control, but well within our influence. One of the many areas where I have seen the models change is managing volunteers compared to professional staff. There are commonalities, but there are differences.

How have you experimented with User Generated Content and what were the results?

July 22, 2009 | Community, SEM Industry, SEO tools

Ideas for Using Linkedin For Business

Are you looking for a few ideas on how to use LinkedIn? Try looking at the article 33 Ways to Use LinkedIn for Business.

The key idea here is to start looking at tools beyond the initial headline, and to look at them with the entrepreneurial spirit of your business or website. LinkedIn is very much respected by Google. Many websites link to LinkedIn, so LinkedIn linking to your site is a good ranking assist – especially if your page on LinkedIn is related to your website. Do you start to see the synergy developing? The more you contribute, the more opportunities there are to link to your website.

Don’t forget to check out the comments. This is where you can expand your insight from the author’s point of view to many more, based on the comments posted. Also post your own comments and reactions – again, they allow you to create links back to your website, as well as demonstrating your expertise and, by extension, your website’s expertise.

So, remember to keep looking at the many varied tools on the Internet in how they can assist your goals today.

If you do not feel you are creative, then get into conversations with creative people on the web or off the web to see how different tools can be applied to your situation to promote your website.

July 17, 2009 | Community, How To, Uncategorized

How do I Manage Social Networking For My Organization

Here is a good article titled Drafting a social-networking policy: 7 key questions about setting policies for social networking at your organization.  Ignoring the problem will not make it go away.

July 16, 2009 | Community, How To, Internal Organization

What am I, a Mind Reader or Something?

I was having a conversation the other day, and the other person assumed something in the conversation, to which I said, ‘What, am I a mind reader or something?’

Google Cannot Read Minds

Which brings us to Google. Google is not a mind reader either. You can’t assume that Google knows what you mean. You have to be specific, and spell it out in the page’s text. Why do you care what Google thinks about your page? Because Google and the other search engines typically generate around 87% of a website’s traffic. And that traffic is generated based on how Google understands your site, or at least what it thinks you are trying to communicate about. You want Google to play well with others and rank your page for the keyword phrases you want. So don’t make it assume anything. Give it something to work with, and be clear with your communication.

The key here is to write out full paragraphs or bullets. You can use bullets to communicate your points; however, make sure you are not dropping the keywords from your bullets, assuming that because you put the keyword in the subheading it will be covered. And you can use pictures and graphs – but understand that the saying ‘a picture is worth a thousand words’ does not apply to Google or the visually impaired.

Use SubHeadings and Bullets

So considerations to remember for your content pages:

  • Use the terms and phrases that your visitors and customers use – not the industry jargon, unless your visitors speak that normally.
  • Be thorough in your writing. This is not the classified ads in the back of the newspaper where you are paying for each word.
  • Use headlines to highlight your more complete descriptions – they should be labeled as H2 or H3 (see other post) to let Google understand that the headline or sub-heads are more important than ‘normal’ text.
  • Where you are using words that are unfamiliar, link to outside definitions if you are not defining them on the page. Wikipedia is great, and so is Answers.com.
  • Have someone who is your ideal visitor proof read your copy or content. If they do not understand your terms, then Google will probably not understand the terms you need to be optimizing for.
  • Remember to look at your website from the perspective of a visitor not from the perspective of the writer, and you will be much more likely to get Google ranking your site well.
  • If you still have not fully explained your ideas, link to smaller sub-pages to more fully describe your topics or concepts or item descriptions. Just make sure you have a link back or bread crumb back to the previous page.

What have your experiences been in writing copy and having Google rank your site?

July 11, 2009 | copywriting, How To

How Do I Learn All of This SEO stuff

Frequently, I try to share how the search engines work, and how to work with them to get your site ranked well. Quickly, I get the response: ‘I am busy – how do I learn all of this?’ Well, you don’t need to understand it all, but you can start understanding a bit at a time. One of my tools for helping others is CommonCraft videos.

I am continually amazed at the work that Lee and the team at CommonCraft put out explaining complex ideas. Their short video tutorials are awesome for getting the ‘story’ on concepts in just a couple of minutes. Plus, their videos are very sharable – just email or post a link such as http://www.commoncraft.com/search . Here is one on Web Searching on search engines. These would even be great on 3rd screens (cell phones, iPhones), if you can get YouTube and other videos on your smart phone. That makes them great for sharing with others you may be trying to explain something to.

One warning though – while the videos are short (typically 2.5 minutes), it can be addictive to go wandering into a bunch of the other tutorials. Next thing you know it is lunch time and you are far brighter, but behind on work. Plus, you may start using your hands a lot more when describing concepts. With that warning – Have Fun.

What is your favorite tool for learning new concepts? Add a comment so that we all can learn how best to learn quickly in this fast-paced world – Did You Know 3.0.

July 9, 2009 | How To

Google Spiders-I’m arachnophobic, and I don’t think I want spiders in my website

A reader asks:

Ewww! I’m arachnophobic, and I don’t think I want spiders in my website. What are they anyway, what do they do, and how do they work?

But you do want spiders all over your website – search engine spiders crawling all over your website. While real-life spiders eat bugs, Internet or WWW search engine spiders bring visitors to your website. You do want visitors, don’t you? Otherwise, why post on the web? (Well, actually there are some good reasons, but that is a later post.) Back to search engine spiders.

Search engine spiders are computer programs that look at web pages, lots and lots of web pages. And they create the building blocks for the results we all see when we enter a search phrase at Google, Yahoo, Bing, GoodSearch or other search engines.

So how do search engine spiders work? Well, a lot of the process is somewhat black box – something goes in, magic happens, something different comes out. The process is often referred to as ‘crawling a site’, as the spider seemingly wanders the web trying to understand what each web page is about. But I will try to shed some light on it.

  • The search engines have a ‘sign up page’ where you can register one page of your website. Google’s is at Google, and Yahoo’s free submission is at Yahoo (there is paid submission, but that is another post).
  • The search engine then makes a list of all the registrations.
  • It then gives the list to a spider. Again, remember the spider is just a computer program, so this list is in essence a batch file of ‘your work for today is to look at these websites’.
  • Starting at the top of the list, the spider ‘goes’ to the 1st page in the list. Just like you can surf all day and travel the world without ever leaving your chair, a spider never leaves the Googleplex or YahooVille or the Bada-Bing.
  • It loads up the page from the list, just like your browser does. Only rather than looking at how the page displays, it looks at the code that creates the page. You can see the code for this page (or most pages) by going to View Source in most browsers.
  • The spider then collects all the words and META tags, ALT tags and TITLE tags.
  • It then runs another quick program (remember the search engine is trying to look at the ENTIRE web as often as possible). This program boils the page down to the keyword phrases it is about. It also assigns a strength or rating to each phrase. So a website for a business in Wisconsin gets some rating for ‘Wisconsin’ because the office address is there (2349 E. Ohio Ave, Milwaukee, WI 53207). But a site that is about tourism and the history of Wisconsin (http://www.wistravel.com/history_of_wisconsin/) will get a much higher rating for ‘Wisconsin’.
  • It then files all these keyword phrases and their ratings away for later.
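As a toy illustration of that ‘boil the page down’ step – this is purely a sketch with made-up scoring weights, not how Google actually rates pages – you can imagine something like:

```python
import re
from collections import Counter

def keyword_ratings(text, title=""):
    """Toy rating: count word frequency, and weight words in the TITLE tag
    higher. Real engines use far more signals; this just shows the idea."""
    words = re.findall(r"[a-z]+", text.lower())
    scores = Counter(words)
    for word in re.findall(r"[a-z]+", title.lower()):
        scores[word] += 5  # a word in the TITLE tag counts extra (made-up weight)
    return scores

page_text = "Wisconsin tourism and the history of Wisconsin"
ratings = keyword_ratings(page_text, title="History of Wisconsin")
print(ratings["wisconsin"])  # 2 occurrences + 5 title boost = 7
```

A page that mentions ‘Wisconsin’ in both its body and its TITLE tag scores higher for that phrase than a page that only mentions it in passing – the same effect described above.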

After it makes a list of all the keyword phrases and their ratings, the spider then:

  • Looks for all the links to other pages.
  • Adds these links to its list of To Do’s (‘your work for today is to look at these websites’), with additional pieces of information:
  1. What keyword phrases was the page that had the link on it about?
  2. What does the information around the link say about the new page?
  • Is there text that is linked, or is it just the URL?
  • Is there a linked image?
  • What is the ALT text of that image?
  • What is the image name?
  • What is the text around the link?

These are the clues that we as humans and the Search engine spiders use to determine what this linked page is about.  It collects all that information and uses it to ‘prejudge’ what this new page is about.

The search engine spider has now ‘crawled’ one page.

After building a list of all the linked pages on this page, it starts to look at each of these new pages, one by one. If you have 5 pages it may look at all 5 pages; if you have 5,000 pages it may look at them all (of course it may get tired or ‘bored’ – again, another post). If you think of a line being drawn to each new page, including some drawn ‘backwards’ to previous pages, you can start to envision a web of lines to all the different pages with all sorts of connections. This web is where the search engine spider name is drawn from.

You can see that if there are other websites pointing to your site, the spider should eventually find you. But if you are an island, and no one is linking to your site, the search engine may never find you unless you register with it. The spider is not like an airplane flying around the ocean looking for islands. It needs to be pointed to an island at least once, either by someone registering the site or by another site (that the spider visits regularly) pointing to you.
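The crawl process above can be sketched as a simple to-do list over a toy set of pages (the page names and links here are made up for illustration; a real spider fetches pages over the network):

```python
from collections import deque

# Toy web: each page lists the pages it links to.
links = {
    "home": ["products", "about"],
    "products": ["bolts", "home"],
    "about": ["home"],
    "bolts": [],
    "island": ["home"],   # nothing links TO 'island'
}

def crawl(seed):
    """Breadth-first crawl from one registered page, using a to-do list."""
    todo, seen = deque([seed]), set()
    while todo:
        page = todo.popleft()
        if page in seen:
            continue                   # already crawled this page
        seen.add(page)                 # 'crawl' the page
        todo.extend(links[page])       # add its links to the to-do list
    return seen

print(sorted(crawl("home")))  # ['about', 'bolts', 'home', 'products']
```

Notice that ‘island’ is never found, even though it links out to ‘home’ – exactly the island problem described above: linking out does not get you crawled, being linked to (or registering) does.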

Of course at some point the spider runs out of time for the day, and needs to return its results back to the nest to be merged with those of the many spiders looking at other websites. There the different spiders’ web page ratings are all merged together and rankings are updated. This merging also takes into consideration other websites linking to your website – if a 3rd party felt your site was important enough to link to, then it is usually more important than a page no one has linked to.
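A toy sketch of that merge-and-rank step back at the nest (the scores, page names and link counts are invented for illustration; real ranking is vastly more involved):

```python
def merge_and_rank(spider_reports, inbound_links):
    """Merge per-spider page ratings, then boost pages with more inbound links."""
    merged = {}
    for report in spider_reports:
        for page, score in report.items():
            merged[page] = merged.get(page, 0) + score   # combine spider reports
    for page in merged:
        merged[page] *= 1 + inbound_links.get(page, 0)   # 3rd-party links boost
    return sorted(merged, key=merged.get, reverse=True)

reports = [{"a.com": 3, "b.com": 2}, {"b.com": 4}]
inbound = {"a.com": 0, "b.com": 2}   # two other sites link to b.com
print(merge_and_rank(reports, inbound))  # ['b.com', 'a.com']
```

Even with similar raw content scores, the page that other sites link to comes out on top – the ‘important enough to link to’ effect in miniature.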

Way back in the early days of the WWW (1996-2000), spiders would actually go out primarily at night (by California, USA standards). When I analyzed the logs of different clients, I could see the spiders coming in the ‘wee hours’ of the morning.

Log files are records kept by the hosting computer where your website lives; they list every single visitor to your site – when they came, what page they looked at, and where they came from. There are programs that take these logs and make them easier to understand, including Google Analytics and WebTrends.
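For the curious, here is a small sketch of pulling the visit time, page and user agent out of one hypothetical Apache-style log line – spotting a spider like Googlebot is just a matter of checking the user agent:

```python
import re

# One invented line in the common Apache log format.
line = ('66.249.66.1 - - [03/Jul/2009:03:12:45 -0700] '
        '"GET /products/bolts.html HTTP/1.1" 200 5120 '
        '"-" "Mozilla/5.0 (compatible; Googlebot/2.1)"')

# Capture the timestamp, the requested page, and the trailing user agent.
pattern = re.compile(r'\[(?P<when>[^\]]+)\] "GET (?P<page>\S+).*"(?P<agent>[^"]*)"$')
m = pattern.search(line)
print(m.group("when"))   # 03/Jul/2009:03:12:45 -0700  (the 'wee hours')
print(m.group("page"))   # /products/bolts.html
print("spider!" if "Googlebot" in m.group("agent") else "human")  # spider!
```

Scanning a whole log this way is essentially what I was doing by eye back then: looking at which user agents visited, and when.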

Now, of course, the spiders are out searching around the clock to try to keep up with the vast changing content of most of the web – especially the ‘good’ stuff on the web. So the search engines have a bias toward new content over old content in our ever-changing world. That is why your site’s rankings can change minute by minute, as different spiders come back home to the nest and report how a site has changed its content, links out or links in. Other sites may have gotten better or worse in their rating for a keyword phrase. If yours has not improved, that will affect your ranking.

At some point, after the spiders conquer the new website list, they go back to the websites on their existing list and revisit them to see if any pages have changed. The changes could add or delete links to other pages, or add or delete information on a page or how it describes other pages. The spider updates the information in its master list and lets the ‘nest’ re-rank all the websites for the different rankings.

Hopefully that helps clarify how spiders work, and why you need to be descriptive in your words to get good rankings for your website.

July 3, 2009 | How To, Simple