Monday, March 30, 2009

Social Media

Okay, so I was eventually going to get round to this one at some time. Here's the real twist... I know nothing about social media. Okay, that's a bit of a lie/exaggeration; I know enough to know that I don't really have a clue. So I did what everyone else would do in my shoes... called a consultant.

I guess this post really starts some two years ago when I first joined Prop Data. Online marketing has come a long way from simply calling yourself an SEO and stuffing keywords; these days it's a full-time commitment to the betterment of the internet (yeah, I'm fighting the crusade for the good guys). Okay, so I admit to spamming on occasion, but who hasn't? When I joined the company I knew that many things were changing online. Web 2.0 wasn't just making websites easier to maintain and update, it was also making them a lot more interactive. But while visitors could interact with the site, it was still just code; visitors needed the site to interact back with them. Enter the age of blogs and social media in general.

This brings us back to last Friday. Having joined Twitter a good long while ago, I've been closely following other folk in the SEO, SEM and social media circles. While there are precious few in South Africa who claim to follow these trends, one chap, Mike Stopforth, has put himself out there. A few months back he sent out a Tweet offering a free consultation; I replied, and he agreed to join us and speak to us. This being Prop Data, the team were all overworked and understaffed - the usual. So we changed the format up a little and it became an open discussion between Mike, the sales guys and myself.

The discussion was great. Working through some very simple points, Mike did a great job of highlighting what to consider and which questions to ask before venturing forward on any social project. I guess many of these points we already knew; it was just a case of putting them into perspective. After all, there is rarely any point in doing something simply for the sake of doing it. Focus and results should always be the main concern. Sometimes having a blog isn't a good idea when a fan page would make a lot more sense. Not everybody likes a particular product, but it may have many fans. Simple point, but I'd never thought of it that way. What can I say, I don't know social media.

I guess, like so many other things IRL (that's "In Real Life" ;), it's not what you do, but how you do it. Social media is an animal. You have to feed it and nurture it; if you don't, it will turn and bite you. To those wishing to engage in social media I'd ask: "Are you ready for that kind of commitment?"

Friday, March 27, 2009

Arrrgh... It's a Treasure, uh, Sitemap

There are two types of sitemaps that you might employ on a website. The first is the HTML version, intended to offer the average visitor an overview of your website. The second is a machine-readable XML sitemap (commonly referred to as a Google Sitemap), intended to inform a search bot of the pages found on your website. I'd always recommend the use of both - in various forms.

HTML sitemaps are a wonderful resource. If a visitor to the website can't find what they are looking for, this is the easiest way to point it out. Many also fail to realise that this is a prime spot to put text links to pages with your keywords as the anchor text. While you're at it, why not add a little additional information next to each link? Suddenly the whole page becomes a fantastic resource for both human and bot.

XML sitemaps are great for dynamic websites. Many of the websites we have worked on list a large number of products or other items. Scripts can be run to dynamically regenerate this sitemap as listings are changed or updated. These sitemaps can be submitted directly to the search engines, informing them that this is a true reflection of your site.
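For reference, the sitemaps.org protocol keeps the format very simple. A minimal sitemap with a single entry looks something like this (the URL and values are just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.website.com/products.htm</loc>
    <lastmod>2009-03-27</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Only the loc element is required; lastmod, changefreq and priority are optional hints.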

I've tended to break my XML sitemaps down into smaller maps, often separating the static pages from the dynamic ones. On our really large sites I have even broken the dynamic sitemaps into categories. Mostly you don't want to offer the search engines a sitemap of thousands upon thousands of pages in one go; a thousand pages or so per file seems a sensible working limit.
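The protocol makes splitting things up easy with a sitemap index file, which simply lists the individual sitemaps (the file names here are just examples of how I might split static and dynamic maps):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.website.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.website.com/sitemap-listings.xml</loc>
  </sitemap>
</sitemapindex>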

Submit your XML sitemaps to the search engines and be sure to link to your HTML sitemap from your homepage. This will ensure that the sitemap is easily crawled by the search engines and will subsequently lead to the other pages being easily indexed.

Tuesday, March 24, 2009

Google Code

Telling Google what to index might not be a figment of webmasters' imagination for that much longer. I recently came across a few lines of code suggesting that you can tell Google not to index parts of your page. This could prove to be quite useful.

Don't index word:
fish <!--googleoff: index-->shark
<!--googleon: index-->mackerel

Don't use link text to describe target page:
<!--googleoff: anchor--><a href="sharks_rugby.html">
shark</a><!--googleon: anchor-->

Don't use for snippet:
<!--googleoff: snippet-->Come to the fair!
<!--googleon: snippet-->

Don't index any:
<!--googleoff: all-->Come to the fair!
<!--googleon: all-->

Now, I'm not sure if any of this works just yet. I'm still testing, but I imagine that Google will for the most part ignore these comments. We know that the Googlebot pretty much tries to read all the code on a page, including scripts. But if the snippet comment works, at least we might be able to serve up a snippet that is actually useful.

Watch this space...

Too Cuil for You

Well, it's been some time since the "Google killer" Cuil was launched and I've not heard much since. Launched to much fanfare and expectation, I think this has to be the largest flop seen in years. I wonder just how much was put into this development in time and money? And I wonder if any of the investors will ever get anything back?

As I've not used this search engine (mostly as I found it to be useless at launch) I can't comment too much on the accuracy of the search results, but I do know that the images displayed still don't quite match up. Nice try though. I think Google, Ask and even Live have better image results blended into their universal search.

Perhaps in time they will be able to make sense of "the largest directory of indexed pages". But for now the results seem to be outdated, irrelevant and at times just plain wrong. Is this what we expected of the ex-Googlers? Perhaps it's a prime example of why they are ex-Google folk?

Sunday, March 22, 2009

What's in a Name

The title tag has to be, in my humble opinion, the most important on-page factor when it comes to high rankings in the search engine results pages. Found in the head of a standard HTML page, the title is the first place you can start placing your keywords. Surprisingly, some pages don't define this tag well. Worse yet, some overlook it and omit it altogether. Here is a basic example of where the title tag fits into an average HTML page.

<html>
<head>
<title>The title goes here</title>
</head>
<body>
Web page content goes here.
</body>
</html>

Here are five points I always consider when constructing a title.

Limit the length of the title.
Google currently displays approximately 63 characters of a title; the total number of characters displayed varies from engine to engine. While it is not the end of the world to exceed this by a slight margin (I don't believe there are any penalties for having a long title), remember that the search engines will cut off anything beyond what they display, leaving you with a "…" instead of a complete title.

The title tag can be useful for branding your traffic.
By adding your website or company name to the title tag you can build brand awareness and increase returning direct traffic. While many suggest doing this, I would only recommend adding your company or website name to the end of your title tag. I don't think the order in which your keywords are placed in the title tag makes much difference, but I suggest keeping your keywords towards the beginning of the tag as it reads more naturally. Once again, don't forget that the title tag is the first thing from each site that the search engines display.

Divide your title tag.
When branding your site, break the title tag so that it becomes obvious which part is the page title and which is the site name. I find that by using the pipe "|" (that's the funny symbol above the "\" key) I am able to do this quite neatly. This is also a great way to keep your titles consistent. For example:
<title>Keyword Phrase Goes Here | Some Company</title>
Instead of:
<title>Keyword Phrase Goes Here and Blends into the Name of the Company</title>
As you can see, it makes it a lot clearer which part of the title labels the page and which part labels the website.

NEVER, I repeat never, repeat your title.
Each page should have a unique title. By giving each page a unique title you are telling the search engines that each page is indeed unique. It's the same reason you don't give every file the same name (well, apart from the most obvious reason, which is that you just can't!): it's easy to distinguish the contents of a file by simply scanning its name. The same principle applies to web pages. Unique titles also help the search engines weigh up the relative priority of each page. If every page had the same title, which page would be ranked as more relevant than the next?

Keywords in the title.
I have spent a lot of time optimising websites for real estate agents. While their stock-standard pages have targeted keywords in the title, headings and content, it becomes a little more challenging to do the same for each listing. This is usually where the developers come into play. With a little effort the title can be dynamically created. In my case, it drastically changed the titles I could offer, from something such as:
<title>Property Listed for sale or to let by Estate Agent</title>
To a far more specific title that really does describe the listing perfectly:
<title>House for Sale in Suburb, Area | Estate Agent</title>
Okay, so I usually go a little further than that, but as you can see the title not only makes perfect sense and describes the page, it is also keyword rich for the search phrase "house for sale in suburb", or even the area in this case. While this works well for this kind of website, the principles can be applied to any other dynamically created web page.

I have noticed time and time again that the search engines return results with the search phrase in the title. I think we can all agree that if a web page has been titled correctly then the page will be accurately described. However, search engines will discount a title that is no more than a list of spammed keywords. I think we've all heard the mantra, create pages for real people not robots, too many times. I would prefer to change that statement:
Make well structured, informative web pages that are relevant to what you are doing.

When you apply the above, search engines have little option but to regard each page highly and rank it accordingly. While there are so many other factors to consider when optimising a page I believe the title to be a crucial element.

Thursday, March 19, 2009

The Theory of Relative URLs

E=mc² or e=mc2?

Have you ever tried to type E=mc²? Notice how difficult it is to find that funny little ². In fact, how many people even know how to go about finding it? Similarly, while Pi can be rounded to 3.14, the number is infinitely long, and remembering it even to two decimal places is hard enough. However, this is not a science or maths lesson. The point I'm trying to make here is that "Pi" is easier to remember than 3.14…, just as e=mc2 typed plainly is easier than hunting down that superscript 2.

Similarly, we can compare the following URLs:

www.widgets.com/purple-widgets.htm
www.widgets.com/itempage.htm?id=123

At first glance we would assume that one is a static URL and the other a dynamic URL. Both of these URLs could point to the exact same item, but which one are you more likely to remember? Just by looking at them you can already guess which of those pages probably relates to purple widgets.

We know that the anchor text in a link carries much weight when it comes to gaining a top rank for a specific keyword. Indeed, anchor text alone can get a site ranked for a search term that is never mentioned on the page itself. This has been used and abused in the past. Link bombs, such as the "miserable failure" Google bomb, serve to prove just how valuable anchor text in a link can be. And since many links created on websites are displayed simply as the URL, such as "widgets.com", you can already see the benefit of having keywords in your URL.

The search engines continue to preach that you should be optimising your site for real people and not the bots that visit the website. With this in mind I wouldn't be surprised to find that www.widgets.com/purple-widgets.htm would be ranked higher simply because it is a simpler URL and surely a lot easier to remember than a messy dynamic one. This could just be wishful thinking on my part, or is it? When running a few searches I found that 7 out of the top 10 results had keywords in the URL.

Search engines prefer the use of hyphens in URLs because being able to divide and recognise the individual keywords lets them produce more accurate search results. After all, if it's easier for us to read purple-widgets than purplewidgets, why shouldn't it be the same for a bot?

Many would then assume that the underscore "_" is treated the same as a hyphen. This is not true. It would appear that because the underscore character is often used in programming languages it is treated as a character in its own right, effectively joining two words into one. A hyphen, on the other hand, is read as a simple join between two separate words, nothing more.

It is also worth mentioning that the URL is listed in the actual search results themselves. While it is just a small single line of text, the URL may give the searcher a little more faith that the page listed is actually what they are looking for. So with a neatly put together title, a gripping description and a URL that matches both, you might just find that the URL can even aid in generating traffic.

Useful Tips:

1. When picking a domain name that people will link to, use your targeted search phrase.
2. When creating directories and files use your targeted keywords.
3. Individual words in the URL should be separated, as the search engines might not recognise them when joined (although stemming seems to have seriously improved in the major search engines - smaller engines still look for exact matches), i.e. purplewidget.htm should be purple-widget.htm
4. When separating words in a file or directory, use a hyphen rather than an underscore (a hyphen is also easier to see, as an underscore disappears when the link is underlined).

As you can see, the search engines and visitors alike have very similar needs when it comes to making sense of your website. Google has been on a crusade for as long as I can remember, trying to get webmasters to design websites that are aimed at a human audience. Perhaps this is a prime example of good structure that works for both human and bot. Perhaps this is just a coincidence. But while we hope that the search engines return more accurate search results, this could indeed be a step in the right direction.

Which brings me back to the original question: E=mc² or e=mc2?
Always pick the one that will be easier for the end user to understand, be it human or robot. As it would appear, the two are a lot closer than many may think.

Tuesday, March 17, 2009

Is Your Website a Unicycle?

Is your website a unicycle, a vehicle that requires much training and skill before it can be used? While there are so many "beautiful" websites online, some simply don't make sense. Have you ever found yourself on a website that seems quite impossible to use? Even worse, have you landed on a website after doing a search only to wonder why you are there at all?

Site usability is possibly one of the more important factors in a top performing website. While so many will argue that a site is nothing without a genuine web presence, I will argue that some websites rely purely on offline marketing. At the end of the day, if your website is impossible to use, nobody will be able to (or even want to) use it. Points to ponder when designing your website:

1. Navigation
2. Login/Signup
3. Onsite search
4. Flash and other multimedia
5. Bookmarks/Favourites
6. Contact

1. Navigation
This may seem like an obvious point, but as most visitors are likely to land on your homepage first, are they able to navigate from there to the section of the website that best relates to their needs? Simple text navigation will also make it easier for the search engines to index the individual pages of your website (where have you heard "design the website for a human visitor" before?).

2. Logins and Signups
Does your website require visitors to log in? Do you want new visitors to sign up for your newsletter (or other services)? If so, is it possible to do so from the homepage? While you may not want to place a login form on the homepage itself, a link to a login page will suffice. Again, the key is to keep it simple and clear as to what you expect of the visitor.

3. Onsite Search
This is crucial for any website that offers a large quantity of information or products. Can you imagine trying to find an item among 2,000 by going through a product list 10 items at a time? I didn't think so. Offer your visitors what they are looking for by adding a simple search to your website; see the sketch below. This should help speed things along. Many websites have a quick search option towards the top right-hand corner of the homepage (sometimes this spans the entire website in all the headers). Keep it simple, visible and obvious. Make sure that the average visitor knows that this is a search function.
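A bare-bones search box needs nothing more than a small form. In this sketch the action URL and parameter name are placeholders for whatever your own search script expects:

<form action="/search" method="get">
  <input type="text" name="q" />
  <input type="submit" value="Search" />
</form>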

4. Flash and other Multimedia
Okay, so Flash is a pet hate of mine. But the same could be said of any multimedia that simply clutters a website. Remember that while multimedia and other interactive elements can at times seem really cool, or even a good idea, some visitors don't have up-to-date browsers or plugins. That said, sometimes these tools are the best way of doing something. Make sure they are placed on well marked pages with an explanation of what they are about. This way, if visitors are unable to view the content, they at least know what it is about and why they can't view it. Otherwise they will simply think that the website doesn't work and leave. After all, what use is a website that is broken?

5. Bookmarks and Favourites
If you want returning visitors (who doesn't?) then it is usually a very good idea to offer a "bookmark this page" or "add to favourites" button. I'm pretty sure we are all in agreement that traffic is valuable, so there is no excuse for letting it get away. The "favicon" is a useful way of separating your website from the others: once the page is made a favourite, this icon appears next to your website's name. This is an ideal spot to promote your logo and brand.
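Adding a favicon is a single line in the head of your pages, assuming the icon file sits in the root of the site:

<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />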

6. Contact
Even after making the site as foolproof as possible there will still be occasions where, even with all that planning, something will come along that you hadn't factored in. When this occurs, make it as easy as possible for the visitor to contact you, be it by making your contact details (phone, email and fax) available on each page, or by placing a quick contact form that is accessible from each page. Again, you've worked hard to drive the traffic to your website; don't let it simply get away.

Remember, simple is best; leave no room for confusion. Signups, logins and searches should be clearly marked so as not to confuse the visitor. Make it as easy as possible for your visitors to find what they are looking for. With a well structured website you will notice that the conversion from visitor to customer increases. At the very least, the number of questions about where to find something or how to use the website will decrease. Your website is, after all, supposed to make your life easier as well as save you time.

SEO Design

There are many aspects to consider when putting a design together, most of which are either second nature to the seasoned developer or overlooked completely by the novice. Although, as we all know, what looks good doesn't necessarily work well, and vice versa.

Neatness of Code:
Code should be neat. Simple. No, it doesn't have to conform to W3C standards (Google's own pages don't validate, and it is estimated that only around 3% of all sites actually do - as a side note, half the sites that claim to conform don't either). With so few sites conforming to these standards, how could Google (or any other engine for that matter) offer decent results if it ignored 97% of the internet? Keeping code clean includes keeping all generic information such as style sheets and scripts in separate files.

Robot Tags and Text File:
Often you may not wish for the search engine bots to index certain pages. You can easily add a robots meta tag with the noindex value to such a page. For whole directories you can simply add entries to your robots.txt file located in the root of your website (http://www.website.com/robots.txt). Why is this important for design? Because crawl rate and indexing are a concern for all departments. Remember I mentioned that all generic information should be kept in separate files? This way you can simply block those directories with the robots.txt file, and the search engines can spend their time indexing your actual content pages instead of reading your styling code.
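As a rough sketch, assuming the style sheets and scripts live in directories named /css/ and /scripts/, the robots.txt would read:

User-agent: *
Disallow: /css/
Disallow: /scripts/

And to keep a single page out of the index, the robots meta tag goes in that page's head:

<meta name="robots" content="noindex, follow" />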

Content:
Do not replicate content on multiple pages; it's a waste of time and effort and it dilutes keyword value. While the duplicate content "penalty" is a myth, duplication does confuse the search engine as to which page is more important, or even which page came first. Imagine someone giving you the key to a Ferrari and then telling you it was the key to the red Ferrari parked outside. Now imagine there are 10 red Ferraris parked outside! Which one does the key fit? If there is only one Ferrari the choice is easy. Usually the page which is indexed first is the one that is credited with being unique; the other pages are simply diluting their keywords and purpose. Personally, I've always tried to aim for one keyword per page, which also lends itself to long tail combinations working on a single page.

While it is commonly accepted that the major search engines ignore boilerplate content (such as standard navigation, headers and footers), it has since been suggested that you can point out which sections Google should ignore. This doesn't seem to be in mainstream use just yet and I am sure that this won't make much of a difference as it remains open to abuse - as with so many other on-page factors.

URLs:
URLs (or URIs) can make a difference when it comes to ranking. As mentioned before, people may link to a page (home or internal) using the actual URL as the link text, and since anchor text is vital for ranking a page it makes sense to include keywords in your URL. Long gone are the days when URLs were dynamic strings half full of strange characters and session IDs (a massive source of duplicated pages).

www.website.com/page?ID=123
www.website.com/location/
www.website.com/Town-Name/

In addition to duplicating pages, session IDs and multiple variables can also cause a search engine spider to become trapped in deep pages. Once trapped, a spider will leave the website, and this may result in your more important pages not being indexed. We can now specify the preferred URL of a page through the use of a specific tag in the page header; the search engines that support it (Google among them) will ignore session variables (or other parameters you may have generated) and only index the page as you specify.
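This appears to refer to the canonical link element, placed in the head of the duplicate or parameter-laden versions of a page and pointing at the preferred URL. Borrowing the example URL above:

<link rel="canonical" href="http://www.website.com/location/" />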

Links:
The easiest way for human and bot to get from page to page is through links, but not all links are created equal. Links hidden in Flash, images or scripts may look good to the human but be impossible for the search engine bot to read. Content remains king, and while community (social media) has recently been crowned queen, it is the humble text link that remains everyone's servant. On your own website you can use your desired anchor text to describe each page you link to.

From another website, if a link to your site is a vote, then the anchor text tells you what the vote is for. Because so many webmasters, bloggers and publishers link to pages using the URL as the link text, it becomes quite clear just how valuable it can be to include your desired keywords in your URL. However, no matter how hard you try, you will always have broken links pointing at your site. These could be due to a typo, or because you've moved a page (or restructured the website), in which case a custom 404 page is crucial. When rebuilding a website and changing your URL structure, it is advisable to 301 (permanently redirect) each old URL to its corresponding new one.
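On an Apache host, to take one common setup as an assumption, both of these can be handled with a couple of lines in the .htaccess file (the file names are only examples):

ErrorDocument 404 /404.html
Redirect 301 /old-page.htm http://www.website.com/new-page.htm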

Forms and Restricted Pages:
Don't hide your great content behind forms and other kinds of logins. Robots can't fill these in and won't be able to reach those pages; simply put, they won't know the content exists. There are ways around this, but why make it difficult for the robots, or even for humans, who are becoming more and more reluctant to part with personal info on the web?

Sitemaps:
An XML sitemap is for robots (often simply referred to as a Google Sitemap). If you have many pages, consider breaking these down into themes. At present I prefer to set up a static XML sitemap for the pages that won't change and a dynamic XML sitemap for listings, products and anything else that changes on a regular basis.
An HTML or plain text sitemap for humans can be a perfect place to get all those keywords in, either in the link itself or next to it. It is also an easy way for a visitor to find something listed on the website. Make sure that this page is easily accessible from the homepage.

Summary
It is reported that Google uses over 200 criteria when ranking a website. Many of those aren't part of the design, but a few that are include:
  • Keep code to the minimum required

  • Minimise the use of code that search engines can’t read (hide it when possible)

  • Unique content - keep navigation consistent

  • Use descriptive URLs

  • Keep URLs unique

  • Descriptive internal linking

  • Use text links to reach all of your pages

  • Custom 404 page

  • Don’t hide great content behind forms and login pages

  • Use XML Sitemaps for the search engines

  • Use a descriptive HTML sitemap

Sunday, March 15, 2009

A Design for AI

While creating a website can be either a simple or a complex procedure, it is always advisable to start simple and add on from there. Once you have a basic design it is a lot easier to add in advanced functionality.

Create a standard design that runs through the website; this is usually done by using base templates or include files. The search engines will read each shared file once per visit, which means that once the bot has cached the file it won't need to reload it each time it views a page. More importantly, this also prevents those lines of code being replicated on every page and taking up a good percentage of each page's unique content. That said, it is now suggested that the major search engines can recognise boilerplate content and filter it out for the most part.

CSS is valuable to human visitors because it styles a page with quicker load speeds, and this advantage also carries over to the bots. It has long been speculated that the quicker a page loads, the more likely the bot is to continue indexing your website. It would almost seem that a time limit is imposed on each visit; the more pages the bot can index in that time frame, the better for the site.

CSS has also widely taken over from the old frameset style of design used in what seems like a totally different age. Frames are a bad way to design, as the frameset page only holds details of where and how the frames are applied – there is no useful content. CSS enables you to place what you want exactly where you want it, no frames required.
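As a small illustration, a shared external stylesheet is pulled into every page with one line in the head (the path is just a placeholder):

<link rel="stylesheet" type="text/css" href="/css/style.css" />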

Navigation is crucial for deep indexing. The search engines love text; they can follow a hyperlink to any page, and the anchor text in those links gives a very quick title for the page being linked to. Use the anchor text wisely: if you are linking to a page about koalas, let the anchor text say "koala". By following this format you will help identify the page as being about koalas.
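As a quick sketch (the path here is made up), that link might look like:

<a href="/animals/marsupials/koala.htm">Koala</a>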

Many people like to use Flash for navigation. While this often looks pretty, the search engines are unable to follow links embedded in Flash files. More often than not you could create a very similar effect by using CSS.

Links, links, links… This seems to be one of those things that everyone has on their mind constantly. From a design point it is important to remember a few simple things:

  • Don't put too many links on a single page, as this weakens each link's strength.

  • Make sure that the links are text based and clearly labelled; this tells the search engine what the page being linked to is about.

  • Try to link to all important pages from your homepage as this will help the bot create a hierarchy. If at all possible try to make each page accessible in just 2 clicks from the homepage.

  • Link to a static text sitemap from the homepage – this helps keep every page within 2 clicks of the homepage. Also remember that when the bot finds this page it finds a link to every page on your website.

There is a lot more to linking but this is keeping it simple.

Breadcrumbs (no, I'm still not talking about the type that Hansel and Gretel used – but close enough) are useful, once again, for creating links to pages with related text as the anchor. In the case of our koala, the trail that most likely led the visitor to where they are would be something like: animals – marsupials – koala. While we find ourselves on the koala page there is no need to link back to it, but you can link "animals" to the animals page and "marsupials" to the marsupials page. Again you get to make the best use of anchor text.
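A minimal breadcrumb for that koala page, with made-up paths, might be marked up as:

<a href="/animals/">Animals</a> &gt; <a href="/animals/marsupials/">Marsupials</a> &gt; Koala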

Keep your pages uncluttered. This refers not only to the content but also to the code of a page. By keeping the number of lines of code to a minimum you will increase the download speed of the page, and you will also prevent your content being diluted by what the bot may regard as little more than garble.

The content on your pages should be unique and specific. Going back to our friend the koala, many will know that koalas eat eucalyptus. But it would be more beneficial to make short reference to this and then create a separate page for each related topic. While it won't do any harm to mention eucalyptus on the page, try to keep the info on that page predominantly about the koala. If the visitor is looking for koalas then give them koalas, but by all means link to the eucalyptus page. Again you can make use of anchor text when linking to eucalyptus.

As you can see, by keeping it simple you can present the website in a manner that the bot will really eat up and one that is still focused on an exceptional visitor experience. The search engine engineers have tried their best to create a bot that will spot the website or page that is most relevant for a particular search based on what would be best for a human visitor. In many cases, doing this benefits both parties.

Tuesday, March 10, 2009

A Design for Life

Creating a website is a fairly simple or very complex exercise depending on your needs. The best way to start is always to go simple. Create a standard design that runs throughout the website. The reason for creating a stock-standard look and feel is so that the visitor very quickly gets used to the navigation on your website and is able to find exactly what it is they are looking for. With a standard design it is also a lot easier to help drive home the branding of your service or product.

CSS, that's Cascading Style Sheets, is a great way to keep the look and feel of a website consistent. Even better than adding style to each page, you can create individual style files and simply include them in your pages. While the benefits are too many to list here, I see the greatest benefit of using CSS being that you can make a single change to a single file and the change will be global. You update one file and all your pages are updated in a single move.

Include files are wonderful. Depending on the language you are using to create your website, much of the boilerplate design can be set in a single file and simply included, as in the sketch below. Much like CSS, the biggest benefit of doing this is being able to make a single change and have it update every page on your website. These files could hold the search function or a sign-up script; they needn't be full pages of information but rather standard snippets of code. Imagine you changed the subscription email address and had to update a script on 1,000 pages, when you could simply update one include file and have every page instantly updated.
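As one example, on a host with server-side includes (SSI) enabled, a shared header could be pulled into each page like this (the file name is just an assumption):

<!--#include virtual="/includes/header.html" -->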

Navigation is vital. While it may seem sensible to categorise things the way that you have, remember that not everybody knows the finer details of your business. Imagine you had a website about mammals, separated into distinct sections (such as felines, canines, bears and so on). Not everyone would know that a koala is in fact not a bear, despite often being referred to as a koala bear. For the record, it's a marsupial.

If you have many products or articles that people may need to look through then a search function is crucial. Even if I couldn't find the koala under "bears", the search would eventually bring up a link to the koala page.

Returning to the concept of keeping things simple it is notable that all of this could be incorporated into a single page.

Links, links, links… This seems to be one of those things that everyone has on their mind constantly. From a design point it is important to remember a few simple things:
  • Don’t put too many links on a single page.

  • Make sure that the links are clear and indicate where they lead to.

  • Try to link to all important pages from your homepage.

  • Link to a sitemap from the homepage – one that visitors can use to easily get to where they need to be.

There is a lot more to linking but this is keeping it simple.

Breadcrumbs (I'm not talking about the type that Hansel and Gretel used – but close enough) are very useful in letting visitors know where they are and how they got there. In the case of our koala, the trail that most likely led them to where they are would be something like: animals – marsupials – koala

It is useful to link each of those back to the page it represents (the koala need not be linked, as the visitor is already there), so link marsupials to the marsupials page and animals to the animals page.

Keep your pages uncluttered. This refers not only to the content but also to the code of a page. By keeping the number of lines of code to a minimum you will increase the download speed of the page. This includes optimising the images on your web pages for good display but also for the smallest possible file size. Again, this will help speed up download times.

The content on your pages should make sense. Going back to our friend the koala, many will know that koalas eat eucalyptus. But wouldn't it make more sense to create a separate page for each of these topics? While it won't do any harm to mention eucalyptus on the page, try to keep the info on that page predominantly about the koala. If the visitor is looking for koalas then give them koalas, but by all means link to the eucalyptus page.

As you can tell, by keeping it simple you can give the average visitor a pleasing browsing experience. By building a good solid page the visitor will know where they are as well as why they are there.

Monday, March 9, 2009

Sick Websites

It seems that no matter the time of year, someone is just getting over whatever bug was doing its rounds (nothing worse than a summer cold). It got me thinking that it is a good idea to have an annual check-up for your website too. How often does your website get sick? When was the last time you checked for broken links? While the list could be endless, here are a few things to check up on:

The Home Page
While this may seem like a no-brainer, it should always be your first stop. Does your homepage correctly describe your business, services or products? Often, as time passes, so too does the nature of your business change, and the homepage needs to reflect your business accurately. Just recently we had a client inform us that the products on his website were a little outdated. As it turns out, most of those products are no longer on offer; more than that, it would seem they have changed the entire direction of their business – the website (and importantly the homepage) needs to reflect this.

The main title of your website should always describe your services and products or at least grab the attention of your targeted audience. Many visitors sum up their interest in your website in the first few seconds so be sure to reassure them that they are at the right spot.

Navigation
As with the home page, many things change over time. Is navigation effective, quick, simple and easy to follow? Many websites are continually being updated; can these updates be reached from the homepage? Are internal pages correctly linked and grouped together? Have any of your pages been moved or removed? If so, has this resulted in any broken links?

While quick navigation is important to your visitors it is also vital for those search engine bots. The easier it is for the visitor to find your pages the easier it will be for the search engines to reach your pages. I always suggest a plain text sitemap for websites as this is ideal fodder for the search engines and requires no more than a simple text link from the home page.

Outdated and Up to Date
Is any part of your website outdated? While it is easy enough to explain face to face what you may offer in terms of products and services (okay, so not always that easy – I used to sell rocks!), it's often not that easy to convey this online. When was the last time you checked your website for industry or product updates? Is your latest news still current? While we all know that bad dress sense makes a comeback every 20 years or so, do you really want your website to have to wait that long before it is seen as current again?

When adding pages did you make use of a template page? If so, have you modified the title and description tags?

User Friendliness
While navigation is possibly the first place to start on this one, a few other points to consider would be:

Think about email subscriptions, useful links and any other end-user products you may offer. Do all those documents you uploaded still exist? Are they compatible with the majority of browsers/readers that your end users may be using? Do you offer links to useful resources such as a document reader? If so, it may be helpful to offer installation tips or a how-to guide.

Does your website encourage visitors to sign up to your mailing list? Equally important, do you also have an unsubscribe option? And, often overlooked, does it still work? So many websites have an automated mailing service, and often the unsubscribe option is automated too – does it still do its job?

I have had dealings with web hosts in the past that changed operating systems or scripting resources and left some of my code useless. I only found out once the complaints started rolling in. Keep checking that your scripts work. While a broken script is an annoyance to you, it could be the reason you lose a potential client.

Do you link to other resources? If so, you may wish to check that they are still there, and that they are still the kind of resource you would like to link to. Sometimes websites, like people, can be struck down by disease. Be sure to remove links to any infected websites, as they will reflect upon you and your own website.

While design and overall content are vital factors in the functioning of a website, they are usually too large to be overlooked once they become a problem for one reason or another. Put together your own check-up list and run through it every week, month or year depending on your needs. The real point here is that you should never assume that once your website is healthy, up and running, it will remain so without a little boost now and then – we all get sick from time to time.

If you have a few checks that you run regularly let me know.

Sunday, March 8, 2009

Search, the Final Frontier


Second star to the right and straight on till morning.
– James T. Kirk

I am still amazed on a daily basis at just how many people don't know, don't realise or simply don't care just how powerful the average internet browser can be. It almost seems as though the average user either has their browser homepage set to their own website's homepage (usually by the technician who set the network up) or to a search engine or directory of preference.

I have seen some people enter email addresses into the web address bar, or even full URLs into the search bar. Some would even argue that entering the URL into the search bar and hitting the infamous I'm Feeling Lucky button is quicker than having to move the mouse to the address bar and typing it in. While this may be true, I'm still not buying it. But after looking into our larger clients' websites, it would seem that the company name is more often than not one of the top 10 keywords. I would imagine that many visitors fail to bookmark the website or add it to their favourites.

While it is very easy to gain a top ranking for your company name, especially if your domain name matches, how many realise how much potential traffic (good, qualified traffic) is lost by not ranking first for your own company's name? Some marketers even bid on their own name in paid ads simply so that nobody else can rank higher for those searches. Sometimes your domain name won't match your company name; perhaps you've opted for a slogan or keyword phrase as a domain name. Will your website still be returned at number 1 for your company name?

Consider the following: you have spent thousands (insert any currency here) of your hard earned pennies on offline marketing. Slowly, over the years, you have created a good, well established brand. A neighbourhood brand, if you wish. Then, as time passed, you decided to move with the times and seek the additional benefits of online marketing. After having a website created, you just sit back and wait for the traffic. After all, having a website guarantees you success online, doesn't it? Surely you will be successful online, especially if you already have an established market?

Sometimes, with a bit of luck, your website can indeed rank well for good converting keywords, although this usually requires a lot more work. But surely your website will rank top for your company's name? This may come as a shock, but, "No!" Quite often you will find an article or a page of information elsewhere that actually ranks higher for your company's name than your very own website. This won't come as a surprise to any seasoned SEO consultant.

"Okay, so why does this matter?" some may ask. Well, while the search may lead to another website that endorses yours, many times it does not. Can you imagine the damage that could occur if someone searched your company name, only to find that some blogger had taken your entire customer support department to task? Even worse would be if you didn't even know about it. Imagine losing all those years of building a good, solid, respected brand name, only to have it ruined in a moment because of a careless search.

Going back to my original point, you can see how valuable it is to have a top ranking for your company name. This is just one example of how the variations in the way people search, and more importantly what they can find, should be considered when marketing your website.

Some web browsers have a built-in default search function that will return a search results page should a URL not be found. Even if the searcher types in a correct URL, it may occur at times that either the site or the network is simply too busy to return the correct page. If those search results are returned instead, will your website have the number one ranking?

As any marketer would suggest, study your market. Learn the search techniques of those that you expect to be searching for you online. By doing this you will pick up those points that are often overlooked. While the goal remains a number 1 ranking for your key search phrases, sometimes you can harvest a lot more traffic by just getting the smaller details right.

Thursday, March 5, 2009

Paid/Sold Links

Links… aren't we all a little tired of hearing about links? Don't link to bad neighbourhoods, don't link to link farms… don't get links from bad neighbourhoods; don't get links from link farms. Don't buy links. Don't sell links. Don't have too many links on one page. Don't let all the links pointing to your website have the same anchor text.

Actually, that's just the tip of the iceberg. But I think we should go back to the beginning of all the fuss. PageRank (my pet hate) seems to have started this. A long time ago Google (I'll assume the other search engines figured this out too) learned that many webmasters were sneakily modifying their websites to gain high rankings. So they turned to the voice of the people. By counting links, and more importantly the text within those links, they felt they could better rank a web page. The problem is that for some reason they decided to release the Google Toolbar with the PageRank value in it. This surely changed SEO forever. Well done Google, take a bow. *note the sarcasm* While they won't let us in on any of their algorithm, they will happily dole out a very random value for the uneducated to obsess over.

So, armed with a few green pixels, usually a few more than another website, webmasters suddenly had a tangible commodity to offer other webmasters. After all, everybody wants to rank highly on Google, don't they? So if I have a website that is ranked 7/10 by Google, surely that is worth selling a piece of? That's exactly what webmasters did. It's a standard marketing tactic: if I have a publication that is seen to have some worth, be it number of subscribers or area of distribution, then the space in my publication is seen to be valuable. Selling a piece of Google worth became a thriving business. It was more than just SEO, which is something you sell someone based on an idea and projected results. This was something you could actually see.

This was suddenly a major headache for Google, who realised that yet again their system was being gamed by sharp webmasters, brokers and other advertising agents. So then we had the announcement that paid links were bad. But how would Google know that links were paid? Well, they seemed to think that the algorithm could recognise that external links near the word sponsor, paid or even ads were paid links, and discount those links. Again, a great announcement to webmasters everywhere, who promptly removed those words from their websites.

I have to point out that at this stage I don't see what the problem is. If someone has a highly valued website then surely selling advertising on its pages should be standard practice. More often than not these links were marked as being sponsors, which added prestige to them as well as clearly informing visitors that they were sponsored links. The fact that the search engines didn't like this practice… oh well, so be it.

So, right about now, Google came out with their own standard, breaking every W3C rule out there by adding the rel="nofollow" value to the link tag. As the search engine could no longer tell the difference between paid and legitimate links, Google insisted that webmasters flag these links themselves. Great going Google! Was this a public admission that the Googlebot just wasn't able to tell the difference? So now webmasters had to either mark their links so that the algorithm would recognise them as links to discount (because they weren't valid – despite being genuine sponsors) or they would have to hide the links in a script of some kind (usually JavaScript).
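For reference, a flagged sponsored link simply carries the nofollow value in its rel attribute, something like this (the URL and anchor text are made up):

<a href="http://www.sponsor-example.com/" rel="nofollow">Our Sponsor</a>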

But as not all webmasters are honest, this was never going to be a foolproof plan. So Google did the next thing they could think of: ask webmasters to report paid links. This turned every webmaster into a possible snitch. Great for the very ethical Google, right? After all, why would you let someone get away with cheating you? Then again, if it's not hurting your traffic or rankings, why not just copy what the offending webmaster is doing?

Google started visibly lowering the Toolbar PageRank shown to the masses on websites that were identified as link sellers, threatening many others with a similar penalty. However, it is worth noting that rankings haven't been affected by these toolbar changes. I guess the ranking algorithm doesn't work off the same data as the toolbar PageRank does. Who really knows? Matt Cutts? Right now, I wouldn't bet on that.

So it comes back down to a simple decision: do you buy or sell links or not? If you buy links for the sole purpose of increasing PageRank, expect a penalty. If you sell links, watch out for a penalty.

However, this is surely very difficult to truly monitor. Some would argue that relevant links carry more weight and that unrelated links won't be counted, as they would be seen as possible paid links. But then wouldn't it be unfair for someone who is usually an SEO blogger to be unable to pass on any link strength to a website that sells a gadget the blogger finds useful – even if it's not SEO related?

What this all boils down to is one simple thing: Google created a PageRank monster which fed off all of these links. Yet again we see a perfectly normal tactic used in the past become branded black hat simply because the big boy(s) don't like it. While we watch the continued success of paid links and other shady SEO techniques, you can't help but wonder what Google (and let's not forget the other search engines) will try next to keep the internet a free and fair space.

I would say, keep watching this space. I think things are due for a big shake-up in the near future. When that happens, the paid links hysteria may seem like a storm in a teacup.

Wednesday, March 4, 2009

PageRank the next Y2K?

PageRank… or is it really just a few green pixels? Every so often we hear of a new PR update and the craze that follows it. Has this addiction finally reached its pinnacle, or is the worst yet to come? I shudder to think.

Funnily enough, the name PageRank has little to do with actual web pages. It is derived from Google founder and developer Larry Page, hence the name Page-Rank.

PageRank is defined by Google as follows: "PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves 'important' weigh more heavily and help to make other pages 'important'."

This translates into a democratic internet where sites with more links are deemed more important, and where links from those important sites carry more weight in turn.
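For those curious about the maths, the original Page and Brin paper describes the calculation roughly as follows, where d is a damping factor (commonly quoted as 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Whatever Google uses today is certainly far more involved, but that voting idea is the core of it.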

I am sure that anyone who has an email address listed somewhere in cyberspace has had a link request that includes: …please put a link to our site on a PR2+ page. Possibly most concerning is that nobody really knows what the PageRank of any page truly is, so how can anyone verify this? And with PageRank updates coming out so very slowly, about four times a year, how would a young website ever be able to compete with an established one?

Most web users with the Google Toolbar installed assume that PageRank goes from one to ten. This is little more than a very (and I do mean VERY) rough guide to the popularity of the page. In reality the 1 represents something silly like 0.000001 and the 10 something virtually immeasurable. It all remains relative to the number of pages indexed, the number of links on those pages and which of them Google gives weight to or ranks highly. As you can tell quite quickly, that value of 1 to 10 really could represent almost anything.

Now I'll say it: I'm so very much over this PageRank craze!

Okay, now that that has been said: links are important. Links are possibly the most important factor when trying to rank a webpage for a particular search term. You need only look back at all the link bombs of the past to see that by simply pointing links at a site you can rank it for virtually any search phrase. The terms find chuck norris and miserable failure come to mind. It must be said, however, that while many of these tactics do work, they often don't last very long.

PageRank is indeed a very valuable consideration when trying to rank for competitive phrases. And as the PageRank description points out, the more links the better, and the more popular the website sending the link, better still. But I think we can all appreciate that the true nature of this recipe is as closely guarded as the Coca-Cola or KFC recipe – perhaps even more so.

But perhaps it is the next level of PageRank that really makes the most difference – TrustRank. As mentioned before, PageRank is calculated from the number of incoming links and the sites those links come from. A newly created website carries little weight when it comes to casting a "vote" for another website; a site that has been online and active for the past 5-10 years passes a lot more weight along. Usually you will see that these older sites do indeed have a good few green pixels in that toolbar. But possibly the most important factor here is the age of the site and the traffic Google itself has passed on over time. This goes some way to establishing trust. If Google has never blacklisted a URL and it has remained active and current throughout all those years, then perhaps it is very much trusted. This TrustRank could prove to be the real PageRank over time.

While it may be difficult to evaluate the true trust value of a website, especially a new website, there are a few tell-tale signs to look out for:
  • Look for a security certificate; this is usually a good sign that someone else has already done the fine-tooth comb job for you.

  • Check the website's back links. Remember, trust is often created by links themselves; these links will indirectly link you to the rest of the internet.

  • Check for indexed pages; this is where the green pixels do come in handy: if a page has a visible PageRank then chances are Google has indexed it. It never hurts to use the site:www.domain-here.com command.

These are normally the easiest ways to establish the authenticity of a website.

Links are very important in speeding up the indexing of any new website, as the more incoming links there are, the more likely Google is to notice your website. But the weight of those links is more important than their bulk. After all, a site with high TrustRank will pass on more weight to another page than 1,000 links from spam websites will.

As for right now, I believe that little green bar to be nothing but a big hoax: it is outdated, rarely updated and so far pretty meaningless. I wouldn't give it any more credit than the Y2K bug. I wouldn't be surprised if Google were to announce that its PageRank toolbar is little more than smoke and mirrors. If so, Larry Page is surely a better illusionist than David Copperfield.

Tuesday, March 3, 2009

Men (and Women) with Hats


SEO is so easily divided into two categories: the good and the bad, the yin and the yang… the light side and the dark side. Okay, so it's not all a battle between good and evil, but the Star Wars analogy is close to the mark. SEO is often broken into two camps: those that practice safe optimisation (white hat) and those that prefer to break the rules for immediate results (black hat).

Much like Darth Vader, a black hat SEO will use every weapon at their disposal, often sacrificing a ship (site) or two on the way to gaining victory (a top ranking). This dark side of SEO breaks the terms of service set out by the search engines by any means deemed necessary. The only thought is immediate results. Sometimes these sites may show lasting results, but this is rarely the case.

On the other hand, the white hat SEO is more like Obi-Wan: controlled, directed, methodical and with much better staying power. Okay, so I know that Obi-Wan Kenobi was struck down by Darth Vader in Episode IV, but he made an awesome guide as an apparition. There are guidelines put down by the search engines, and the white hat SEO does their best to stick to the rules, knowing that deviating from them could lead to penalties which would quickly undo all the good work done up to that point.

Most SEO consultants know how difficult it can be to explain to a prospective or new client that results may take time. Very much like a Jedi tempted by the dark side, this can sometimes sway an SEO to offer a quicker, less than ideal solution. Once you start down this path it is often difficult to turn back; sometimes it is impossible. The rewards can be great, but if caught out the penalty could be the end: a very dead website.

Okay, so that’s very much a black and white, light side versus dark side take on it. Let’s face it, few things are that clear cut and simple. White hat is only white hat for as long as the powers that be say it is – or until they catch on and ban a technique. I guess I would say White Hat SEO is an oxymoron to some degree: you aren’t supposed to game the search engines, but surely optimisation is, to some degree, gaming the system?

Going back to the Star Wars Universe I would say that most SEO folk fall into a third category: Han Solo. This is more a greyish kind of hat. This is where you go with your “gut feeling” on what is possibly right. If optimising a site for a client you certainly would keep well clear of anything that would get their website banned. But you would also be looking to rank as highly as possible so would be willing to “bend” a few of the rules.

Going back through most of those points, I think most people would realise that the simple act of optimising a website IS gaming the system. Okay, so many do try their best to keep within the webmaster guidelines, but an attempt to gain a favourable ranking is in some way an attempt to skew the results in your favour. While sticking to the webmaster guidelines one could call this technique white hat, although I would say a true white hat SEO is one that does nothing: they simply build a perfect website, never considering a search engine for a single moment. Black hat SEO would be the exact opposite, building a website for a search engine and never considering the user for a moment.

Personally I think we are all pretty much like Han Solo: we know where the boundaries lie and we stick to our side of them… well, as much as possible. However, as the boundary keeps changing (what is right today is outlawed tomorrow), it does become difficult to maintain a perfect score. Fortunately the powers that be are quite forgiving; as long as it’s not blatant, over-the-top Black Hat you’ll usually not be in too much hot water.

Black hat? White hat? Grey hat? Which one do you wear? Personally I don’t like wearing hats, mostly as I’ve got quite a bit of hair and usually end up with terrible hat-hair. But I do think it’s time we got over the idea of SEO being a shade of black or white; it’s so much more than that.

Monday, March 2, 2009

Local Search and Optimisation

You are Here - upside down

Recently I was reading yet another article about the trouble in Zimbabwe and how it affected South Africa. I was surprised to find that many people didn’t realise that South Africa is a country in its own right; most thought that Zimbabwe was a country within South Africa, or that the two were one and the same. While I find this amusing (as both South Africa and Zimbabwe are very much individual, independent countries), I did realise that not everybody is 100% clued up on their geography.

Okay, so what does this have to do with search? As many small businesses are unable to supply the entire world with their product or service, many try to localise their client base. This carries over into their website. It’s not easy for a plumber in London to be making call-outs to China. While this is an extreme example, I think you get the point.

When it comes to optimising websites for local searches it is important to always include the area name in your key phrases. So instead of optimising your website for, say, experienced plumber after hours, you might optimise it for experienced plumber after hours in London. As you may have guessed, the longer the tail grows, the fewer visitors you can expect; but don’t forget that long tail keywords do convert better. At this point you may also have noticed that you could start to shorten your key phrase. Now it would be easier to obtain top rankings for the phrase experienced plumber in London or, at a push, simply plumber in London.
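If you would rather not type out every variation by hand, a throwaway sketch like the one below will build the list for you. The services and locations here are made up; the only point is that every phrase carries a location.

```python
# Quick sketch: combine service phrases with locations to build a
# localised keyword list. All names below are made-up examples.
services = ["experienced plumber", "after hours plumber", "emergency plumber"]
locations = ["London", "Camden, London", "Islington, London"]

keyphrases = [f"{service} in {location}"
              for service in services
              for location in locations]

for phrase in keyphrases:
    print(phrase)
```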

However, remember that while local search is intended for localised visitors, not all qualified visitors will be locals. Okay, so that sounds a little confusing, but consider the fact that not everyone searching for your services will be in the same town while doing the searching. Sometimes people need to search for a location and the services they may need while away on holiday or business, or even on someone else’s behalf. Imagine a chronic diabetes sufferer needing to know of the nearest doctor while on vacation, or someone needing to find the nearest laundromat in a small seaside town. Both of these searches may be done with only a little information on the actual location.

I have often gone on vacation to small towns where many of the nearest services were in the next town. Often people won’t even know which suburb of a town your business may be located in. I think of Johannesburg and while I know that there are literally thousands of tiny suburbs, many people (and yes, even I am guilty of this) simply lump them all together as Johannesburg.

So what does this all mean for localised search?

Never assume that visitors know exactly where you are. If your town only has a population of 75 and has never been a hot spot of any kind, then chances are nobody will know your town offhand. In these instances go bigger: optimise for the larger geographical area or town as well as the suburb. I know this has certainly worked for smaller estate agent websites we have worked on that are based in a small suburb of a major South African city. While they rank highly for variations of their keywords for their specific town and suburb, the bulk of all traffic comes from the very broad searches, as these make a great starting point. While this traffic doesn’t convert as well, it certainly means that you won’t be missing a potential lead.

Always include area info. Once again, as many visitors may not know the area very well, it may prove very useful to supply a map of the greater area with a town-by-town breakdown. As mentioned earlier, with so many small towns dotting the map, many service providers serve more than a single town due to supply and demand. This should help with two things. Firstly, it will help put the distance and location of your business into some perspective in relation to where your visitors will be. Secondly, it certainly won’t hurt in the search engines. Always remember relevant information is always welcome; just don’t cram the page full of useless information.

Going back to the opening paragraph we must remember that not everyone knows their geography that well, and fewer still are masters of localised geography. Keep this in mind when optimising your website. Always try to think like an out-of-towner. Keep it specific, going broad when you must but always make sure that you have your bases covered for your local searches (although if you’ve done your offline marketing well this shouldn’t be a problem).

SEO the Game

Pac-Man

A while back the question came up as to what it was that I wanted to be when I grew up. I just laughed and replied that I’m most likely never going to grow up, so there is no need to think about it! Going back to my first childhood though, I can recall wanting to play computer games for a living – didn’t we all? This got me thinking which, as most know, is quite a rare occurrence.

Remember those classic old arcade games? I’m talking long before anyone even knew what a CD was, let alone a DVD, so PlayStation is out of the equation. I’m talking about the likes of Space Invaders and Pac-Man. These were games that never ended; you kept playing until you finally lost. The ultimate goal was to aim for a score you knew nobody could ever beat, or at least one that wouldn’t be beaten anytime soon.

I always wanted to be the guy that set those scores. I always wanted to see my name at the top so that when you started the game for the first time it would be my name that was automatically on top.

After some thought, I came to the realisation that I am now doing exactly that. I spend my days playing on the biggest game system known to man: the Internet. My job is just like those games from my youth where I tried to get the highest score I possibly could. We all want that number one ranking. Top 10 positions are notable, and places 2 and 3 out of 30,098,293 are still a good score, but that number 1 is the most coveted of prizes.

While the older titles of Pac-Man and Space Invaders have faded away, we have now replaced them with titles such as Ask, Yahoo!, MSN and the most popular one of this generation, Google. Each title has its own set of rules but the gameplay is very similar: build a well structured website, build good solid content, create something unique and useful and then tell the world about it. If done right you could be climbing up that leader board. If not, nobody will even notice that you bothered to play the game.

But by now you are asking yourself, How does this help me with SEO? SEO is the game. Never forget that while you may be top today there could be someone else who’s just managed to work the system, learned the sequence just that bit quicker or better than you have. Never become complacent with a top spot, keep playing. While we will never know just how close the next player’s score is to ours, the prize of Number 1 Player is never to be taken lightly.

SEO is a game; it requires know-how, skill and a bit of luck now and then to master this art. The best tip I can give anyone, however, is not about linking, optimising content or working on title tags. I won’t tell you about analytics or how to set up paid ads. I won’t try to add how valuable social media can be to your business. The most valuable piece of advice I can offer is simply this: enjoy the game! It’s that simple.

I’ve always been best at the games I’ve enjoyed the most, hasn’t everyone?

Longtail Keywords Convert Better

Miles -Tails- Prower

You’ve built a state-of-the-art website expecting it to be your little nest egg, but the thought that runs through your mind now is how you are going to get people to see it. The first thought that comes to mind is to gain top rankings with the major search engines, and anyone else for that matter. This leads you to the next question: what keywords are you going to target?

This can be a tricky obstacle at the best of times. You expend countless hours researching the keywords and phrases with the highest level of popularity. Now, with the hard work done and the sleepless nights and agonising wait over, you finally see your website rank for these keywords and phrases. While this does start to generate quite a bit of traffic, you notice that it hasn’t made much impact on the bottom line. Why aren’t they buying, signing up or downloading your product or service?

Even after doing all your homework and identifying what looked like money-maker keywords, you simply aren’t making any money. Could it be that you aren’t bringing in the best-qualified visitors?

As a web design, hosting and marketing company based in South Africa, we at Prop Data undertake the promotion and maintenance of all of our clients’ websites. The better a website performs, the happier our clients are and, in turn, the happier our clients, the more likely they are to refer us to other interested parties.

Recently we made changes to one of our largest clients’ websites, and it was decided that rather than try to hit generic search terms we would target longer-tail keywords. The reasoning was that when searching for real estate online, the average search includes both location and dwelling type (apartment, house, etc). The results were astounding! When comparing the statistics against a previous record month (the month in the past six months that had recorded the highest number of visits) we noticed a few very interesting points:
  • Traffic had increased considerably, while the number of pages per visitor decreased.

  • Enquiries on listed property had increased.

  • Mailing list subscriptions had doubled!

It would seem that while going broad and targeting generic search terms we had missed the most important factor: the average searcher is looking for something specific.

While in this particular case there was a noticeable increase in visitors, the number of page views didn’t grow accordingly. While many might be concerned that we had lost traffic, and targeted traffic at that, the increase in mailing subscriptions as well as enquiries suggests something better. We no longer simply had random visitors having a look about; the average visitor was now someone actually looking for a specific listing, and they were going straight to the listings that met their needs. But most important of all, this visitor was now converting. A rough illustration of the arithmetic follows below.
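To be clear about what “converting better” means here, the figures below are invented purely to show the comparison we were making; they are not the client’s actual statistics. Pages per visit can fall while the share of visits that end in an enquiry rises, and it is that second number that pays the bills.

```python
# Invented figures, purely to illustrate the comparison; these are NOT the
# client's actual statistics.
record_month   = {"visits": 10_000, "pageviews": 60_000, "enquiries": 120}
longtail_month = {"visits": 13_000, "pageviews": 65_000, "enquiries": 210}

for label, month in (("previous record month", record_month),
                     ("long-tail month", longtail_month)):
    pages_per_visit = month["pageviews"] / month["visits"]
    enquiry_rate = month["enquiries"] / month["visits"] * 100
    print(f"{label}: {pages_per_visit:.1f} pages per visit, "
          f"{enquiry_rate:.2f}% of visits ended in an enquiry")
```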

The client was very happy to see such positive results. With a constantly growing mailing list of prospective clients, their latest listings are being viewed by an ever growing audience. This in turn can only result in further business. While in real estate it is difficult to make a sale through a website, we consider a direct enquiry on a listing a success.

The same principle and model applies to all websites. If you sell mobile phones from a single manufacturer then don’t try to target mobile phones as a key phrase. Rather, if you are selling Nokia phones, target Nokia mobile phones or, better yet, target each listing by model too. At the end of the day, someone searching for mobile phones is simply going to look through your website, possibly enquire on a few listings, but will have very little commitment. However, someone who is searching for a Nokia Communicator 9300i mobile phone is in a much better position to make an impulse purchase when they find your website. At this point they are specifically seeking that which you have to offer.

I would like to add that while your meta description tag won’t aid you in rankings, it can be crucial when it comes to making that conversion. A well thought out and informative description tag can often in itself convert a good search engine ranking into a click-through. Once again it comes down to being precise rather than just having a standard, generic site description; a rough sketch of the idea follows below.
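Since listings on these sites are generated from a database anyway, the description tag might as well be assembled from the listing’s own details rather than one site-wide sentence. The field names and values in this sketch are made up.

```python
# Rough sketch: build a listing-specific meta description instead of one
# generic site-wide sentence. The listing fields and values are made up.
listing = {
    "type": "3 bedroom apartment",
    "suburb": "Umhlanga",
    "city": "Durban",
    "price": "R1,250,000",
}

description = (f"{listing['type']} for sale in {listing['suburb']}, "
               f"{listing['city']} at {listing['price']}. "
               "View photos, full details and enquire online.")

# Keep it short enough that the search engine won't truncate the snippet.
print(f'<meta name="description" content="{description[:155]}">')
```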

While many webmasters and website owners congratulate themselves on gaining a high ranking or high volumes of traffic, it is always important to remember the purpose of the website. For most of us a website is a means to make sales or gain further interest in our services. Either way this translates into us wanting more business. If your traffic isn’t converting then it is really the same as owning a shop in the busiest mall with nobody ever stepping through your door.