Friday, February 27, 2009

Linking Practices Part 4 of 3: The Unconfirmed

Here in part four of “Linking Practices” we will cover some of the unconfirmed ideas surrounding the links we discussed previously (check out part 1: “Links Defined”, part 2: “Links Applied” and part 3: “Link Tricks”).

Okay, so you’ve already noticed that this is part 4 of 3 (you did notice, didn’t you?). After all, it is commonly known that the best trilogies come in fours. These are unconfirmed ideas. I don’t think anything can really be classed as myth or truth in SEO, simply because what holds true today may not hold true tomorrow, or vice versa. That’s why I refer to these as ideas and not myths. I guess that’s what makes it all so very exciting.

Link Juice:
Ah, link juice. What exactly is it? How much does a page have? How is it lost? It is suggested that the total link juice a page has is a result of the number of links pointing into that page. Of course, if you wanted to measure that accurately you would need a base page to work from, so for now we will simply say that a page with many inbound links has a lot more link juice than a page with few (assuming all links are equal). I don’t believe that link juice can be lost; it simply becomes more diluted as more links are added. Sometimes, though, this juice can be given up by having too many links. Imagine a page with 1000 links; do you think that Google will actually follow each of those? I doubt it. I would venture to guess that when encountering a page with an excessive number of links, our friend the Google Bot would simply ignore the whole page.

PageRank Shaping:
“Nofollowing your internals can affect your ranking in Google” – Matt Cutts. Okay, so purposefully nofollowing some of your own pages is a confirmed way of promoting certain pages of your website a little more than others. After all, who would want to rank for the phrase “contact me” when you are trying to sell kites? This brings us back to the first topic, link juice. Each page has X amount of juice, or link love, to spend. This is relative to many different factors, but for now we’ll say it’s down to inbound links alone. Each link from that page takes a share of that X: if a page carries only 2 links, each link is worth ½ of X; if it carries 4 links, each is worth ¼ of X; and so on. Internal PageRank can be shaped by nofollowing links to the pages you don’t desire or need to rank – the privacy policy or terms and conditions come to mind here. By nofollowing links to these pages you don’t have to share the juice with them, so the pages that do get the juice will be getting more.
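As a minimal sketch (the file names and anchor text are just placeholders, not taken from any real site), the nofollowed housekeeping link and a normal, juice-carrying link would look like this:

<a href="/privacy-policy.html" rel="nofollow">Privacy Policy</a>
<a href="/kites-for-sale.html">Kites For Sale</a>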

Think of it as follows: you and 5 friends (that’s 6 of you) share a six pack and you each get a beer. But if three of your friends are teetotallers, then those of you that do indulge in alcohol each get 2 beers, getting you just that little bit drunker in the process. By nofollowing you can do exactly the same thing to each of your pages. Although I would suggest that you don’t drink while writing your copy.

Buying/Selling Links:
Google has come right out and said “do not buy text links” (for PageRank purposes, that is). However, it can be very difficult at times to spot a paid link. While many publishers mark their “sponsored” links accordingly, there are no doubt a few that won’t. And while the PageRank craze does seem to be dying down a little, many still believe that any page’s worth is proportional to its toolbar PageRank – something that we know is inaccurate at best. I would bet that if you had a site or page with a high PageRank, you wouldn’t have to look too long or hard to find a willing buyer for a link or two. Will Google ever find out? Who knows how good they are at detecting bought links? And regardless of how good they are, there will always be a few that manage to get away with it. So far it would seem that the only real penalty for buying or selling links is a decrease in toolbar PageRank – is that really so bad?

While there are many ways to try and cheat the system by weighting some links, negating others and even blatantly cheating (by buying links), sooner or later you will get caught – or worse, you may end up nofollowing your more important pages by accident. It is vital to remember that the darker the hat you are wearing, the shorter your periods of success tend to be.

After all the algorithm changes, the war on paid links… the list really does go on. There will never be a substitute for a well designed site with unique content, as this will gain links naturally.

Wednesday, February 25, 2009

Linking Practices Part 3 of 3: Link Tricks

Here in part three of “Linking Practices” we will cover how to maximise the benefits of those links we discussed previously (check out part 1: “Links Defined” and part 2: “Links Applied”).

This section could also be classified as: Tips, Hints and Cheats. While I recommend and try at all times to adhere to ethical and honest practices, there are many webmasters out there that do not. While PageRank remains a prize in the eyes of many webmasters, there will always be a handful that will do anything to try and game the system. These are some of the usual ways in which they do it:

Robots.txt
The robots file is a little plain text file that resides in the root directory of a website. The purpose of this file is to notify the search engine spiders which pages should be indexed and which should not. After all, you wouldn’t want the backend of your website indexed in the search results. This file stands alone, and you cannot know its contents by looking at a standard webpage on that site. Many webmasters have now started using the robots.txt file to exclude their link pages from the search engine spiders. By doing this they are effectively preventing the search engines from seeing their link to your site, which in turn makes your link back to them look like a valuable one way link.

An easy way to search for this would be to type the link to the robots file directly into the address bar: “http://www.dodgywebmaster.com/robots.txt”

This will return the contents of the robots file. Check that the directory or page holding your link isn’t listed in this file. If it is, then you may indeed be dealing with a dodgy webmaster.
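As a quick illustration (the file name here is purely hypothetical), a robots.txt hiding a links page needs only two lines:

User-agent: *
Disallow: /links.html

Any spider that obeys the file will never see the links listed on /links.html, no matter how prominent the page looks to a human visitor.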

Meta Tags
By simply adding the robots meta tag to the page that holds your link, webmasters can still have the page indexed but none of the links followed.

<meta name="robots" content="index,nofollow" />

While the page itself will still be indexed, the search engines will ignore your link. This once again makes it appear that their link on your website is one of those much coveted one way links.

An easy way to determine whether the webmaster in question is using this tactic is to visit the page that holds your link and view the source code. This is easily done by going to “View” on the menu bar and selecting “View Source” (Firefox) or “Source” (Internet Explorer) – this may vary slightly from browser to browser. You will find the robots tag somewhere between the <head> and </head> tags.

Again, if you find that the robots tag contains the nofollow value, you know that the webmaster is not letting the search engines find their way to your website.

NoFollow
The nofollow attribute is one that Google pushed for, and it would seem that the other major search engines are now starting to accept it. This attribute is added to the anchor tag to prevent the search engine from following that link.

<a href="http://www.yourwebsite.com" rel="nofollow">

This was an answer to paid text links. Google insisted that paid links should not pass on PageRank and that webmasters selling links should mark them appropriately. The attribute has also been applied to most blogs in an attempt to combat spam. After all, spammers leave pointless comments on your website in the hope of gaining a link back to their site. The Google bot no longer follows these links, rendering them useless when it comes to influencing the search engine results.

However, once again this can be used for the wrong reasons. Some webmasters nofollow all links from their site so as to hold onto as much PageRank as possible. There is even a school of thought that believes that using this attribute actually gives the other site a “bad vote”; however, I have yet to find any evidence to support this.

In order to find out if a webmaster is using this technique, check the source code of the page holding your link. The easiest way is to run a search on the source code for your link text or website URL; the anchor tag will be right beside them.

JavaScript
Search engines still have trouble reading JavaScript, if they can read it at all. By embedding your link in a short script, a webmaster can ensure the search engines will not be able to follow your link, or even recognise it as a link for that matter.

Normally the easiest way to spot this is by running your mouse over the link. If the destination URL in the status bar doesn’t match your URL, but clicking the link still takes you to your website, there may be a script at work. The only real way to know is by checking the source code, and in this case you need to know what you are looking for.
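As a purely hypothetical sketch of what you might find, a scripted “link” could look something like this – note that there is no ordinary href pointing at your site for the spider to follow:

<a href="#" onclick="window.location='http://www.yourwebsite.com'; return false;">Your Website</a>

A visitor who clicks still lands on your site, but as far as the search engines are concerned the anchor points nowhere.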

By checking for these simple tricks you can ensure that your link is indeed pointing to your website and that the search engines will be able to find your site. Paid links will often carry a script or a nofollow attribute; this is actually good practice on the webmaster’s side, as the search engines could penalise them for not marking paid links. Always remember that each circumstance has its own set of rules.

Also, I would urge you not to use these methods to game other webmasters and cheat them out of potential link-love. If you are linking to and from good, reliable, related websites, then you should want the search engines to pick up and follow each and every link.

Tuesday, February 24, 2009

Linking Practices Part 2 of 3: Links Applied

Here in part two of “Linking Practices” we will cover the application of those links we discussed previously (check out part 1: “Links Defined”).

As with all marketing practices, the best place to start is with a plan. In this case we need to establish the purpose of building these links. With links being a major factor when ranking a website, many webmasters use links to try and gain higher positions in the search engine results. Sometimes links are valuable simply because they drive traffic to your website. As we all know, traffic is the very blood that powers your online business. Consider links the veins.

Reciprocal Links:
These links often offer very little aid when it comes to ranking highly in the search engines, as they usually reside on a page that is no more than a long list of URLs; occasionally, though, they can be of great benefit. A reciprocal link simply means a link exchanged. Usually you can request the anchor text and description you wish the link to display, as well as the URL it points to. This makes sense, as it is supposed to be a beneficial exchange for both parties.

While finding your link on a page with a thousand other links won’t be worth much at all, exchanging exclusive links can make a world of difference. For instance, if I were a small independent car dealership, it would make sense to team up with a local insurance provider. In this case I could offer a direct link exchange straight from my homepage to theirs and vice versa. And if either of us were running a special, why not point that link directly to the special?

With this kind of link you could drive direct, targeted traffic to your website from a related site.

One Way Links:
These are usually generated by previous visitors to your website with little coercion from your side. The best way to gain these links is by offering good, solid content on your website. This is always a lot easier said than done. But by offering content that people want to share with others, or to easily revisit themselves, you encourage visitors to add links elsewhere on the net, usually with good keyword rich phrases – it makes sense to title a link to another webpage with a relevant phrase. This is why your content is so important.

The second benefit of this type of link is that it usually points to an individual page. Couple this with keyword rich anchor text and suddenly your website has a lot more credibility.

These are very difficult to come by. So many webmasters seem to be afraid of spreading a little link-love for fear of losing some precious PageRank.

Three Way Links:
This is still practised by many in exactly the same fashion as reciprocal links, more often than not with poor or no results at all. Again, the reason is that many webmasters, while trying to dupe the search engines, simply add the link to a link heavy page. As we all know by now, the more links on a page, the less valuable each one becomes.

This can, however, be a good way to have a new website indexed. By offering another website a link from an already established website, on condition that they link out to your new website, you can help speed up the indexing and ranking of the new site.

Site A –> Site B –> Site C

In this case the goal was to have Site C indexed without using Site A to do so (whatever the reason may have been), so there may be no need for Site C to link back to Site A.

This will only work if you can get a valuable link from that website (Site B in this case); being added to that dreaded “links” page just won’t cut it.

Deep Links:
These are the best (in my opinion) as they are usually far more specific and aid with the deep indexing of your website. They can be the result of any of the link building techniques. As mentioned above, these links point to an individual page.

By gaining a link directly to an internal page of the website, you have the potential of driving traffic to a specific page, usually with a specific reason in mind. If you are selling goods, many people might link to your specials page; if you are a blogger, this may be a link to a popular post; and if you offer a service of some kind, this could be a link to a bio page.

Again, if I owned a small car dealership I would want to offer my visitors an insurance option. But if the site I was linking to offered various forms of insurance I would want to link directly to their motor insurance page.

Apart from driving traffic to a specific point on your website, these links will also aid rankings. It makes sense that a link from a car dealership to a page on an insurance website would in some way be related to motor insurance – especially if the anchor text said “motor insurance”. I am sure you can see the real benefit here.

Paid Links:
Paid links have a very bad name at the moment, having been used mostly to game the search engines of late. But it looks as though the major search engines may be getting on top of this one. While it is seriously frowned upon to buy text links for search engine purposes, there are still valid forms of paid links.

Visitors to your website are valuable, but targeted visitors are even more valuable, as they are your business’s lifeblood. It is often feasible to buy links (or traffic) to your website. Possibly the most popular way of doing so is the Google AdWords program, where a link to your website is displayed for specified search terms and you only pay once someone clicks on it.

But sometimes you pay a fixed fee for a permanent link from another website to yours. This could be anything from a simple text link to a flashy Flash banner. The aim here is to drive targeted traffic to your website. If my business offered motor insurance, it would be viable to buy a link on a car dealer’s website – provided they had the traffic to justify it. A paid link becomes more valuable as the traffic to a website increases. After all, there is no point in running a billboard ad in Antarctica where only a handful may see it; you would want to run it on the main highway (where all the traffic is).

This can be very valuable, but always make sure that you are able to keep records of how many times your ad has been seen, the number of visitors that have clicked on it, and so on. After all, you want your ad to be profitable; if the ad costs you $20 a month, you want to at least cover your costs.

Internal Links:
This is the simplest form of linking: simply link from one page of your website to another. That said, there are a few simple rules you should adhere to, and these apply to the search engines and human visitors alike.

Make sure that all your important pages can be found within two clicks of your home page. This makes it easier for visitors to find what they are looking for. We can all agree that nothing is more frustrating than finding a website and going round in circles trying to find something. The same holds true for the search engines; unless your website is an authority website, there is little incentive for the search engines to crawl deep into your website.

Text links! The search engines love text. They follow text links easily, with the added benefit of a title to that link (the anchor text). And while most human visitors have the ability to view graphics, JavaScript and Flash, some may have these functions disabled. For these visitors simple text navigation is a must.
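As a minimal sketch (the page names are just placeholders), a plain text navigation block that spiders and visitors alike can follow needs nothing more than this:

<a href="/index.html">Home</a> | <a href="/specials.html">Specials</a> | <a href="/about-us.html">About Us</a> | <a href="/contact-us.html">Contact Us</a>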

This is just a brief overview of how these links can and should be applied. Always remember that a link to a page is a vote; try to make the link as relevant as possible, and this includes the source, destination and content of the link.

I will offer a few tips on how to maximise the benefits of these links in part 3.

Monday, February 23, 2009

Linking Practices Part 1 of 3: Links Defined

This is a trilogy of links... told in four parts.

Here in part one of “Linking Practices” I will run through the main types of links and explain each in a little detail. While each method of linking has its benefits, it often has disadvantages too. More often than not the advantages outweigh the disadvantages – if done correctly.

Reciprocal Links:
While many believe reciprocal links to be dead, these links have the potential to be either very useful or utterly useless; it all depends on the source. A reciprocal link is a link swapped with another website. While the search engines don’t give these much weight (unless they come from an authority site), they can be useful in driving traffic to your website from related websites. With enough of these links each driving small amounts of traffic, the overall increase can be quite noticeable.

A reciprocal link from an industry leader or a related industry leader can often lead to good volumes of traffic to your website. The greatest benefit here is that these are visitors that are already interested in your product or service. The disadvantage of running reciprocal links is that you get the world and his brother wanting to swap links with you, usually with absolutely no relevancy at all.

One Way Links:
These are the most valuable links in the eyes of the search engines. A direct or one way link is a link to your website that is not reciprocated. The reason this is so much more valuable is that in this case a website has linked to yours without any solicitation (or at least that is how the search engines see it). The theory behind this, and the PageRank algorithm, is that every link to a website counts as a vote. Obviously the more votes you have, the more important your site is seen to be. However, it is also believed that every vote you send out weakens your importance. It’s a bit like a group of 10 people each giving you $10: you would be up $100, but if you also had to give $10 back to each of them, you would simply break even. In the same way, the more votes you are able to “bank” (not hand back), the better off you are.

The problem with this form of linking is that it can be very difficult to obtain. With everyone so afraid of losing out on link juice, many have become paranoid about linking out without some form of reciprocation.

Three Way Links:
Three way linking was originally created to try and game the search engines. Basically it was a way of creating perceived one way links to websites while still giving out links in return. Sound confusing? Many people were confused, and many to this day fear it will harm their search engine rankings. I don’t think this has ever been the case.

Described simply, this is a set of 3 sites that each link to the next until the last links back to the first: Site A links to Site B, which links to Site C, which in turn links back to Site A.

Site A –> Site B –> Site C –> Site A

I think most of the search engines have caught on to this practice and now treat these small clusters of links as no more than reciprocal links.

Deep Links:
Deep linking is the practice of building links to the individual pages of your website. Again, this can be crucial for gaining favour with the search engines. While your website may offer a multitude of products, it will be very difficult to rank the homepage for each and every product you offer, simply because it isn’t the most relevant page for any one product. Sense would say that one of your website’s deeper pages is more relevant – and therefore a competitor’s optimised deep page will be a lot more relevant than your homepage.

This kind of link usually comes from a previous visitor who found that particular page useful, for whatever reason. It is also often a one way link to your website with related anchor text. Because these individual pages are found through other websites, most search engine bots place more value on them. The downside of this kind of linking is that it can be very difficult to obtain (without simply paying for it, that is).

Paid Links:
Just don’t do it! Okay, so you can do it, just don’t do it to try and increase PageRank or influence the search engine results. Or, even more specifically, don’t buy text links where you can specify the anchor text that links back to your website.

Even if you stick to the rules and don’t buy text links to manipulate PageRank or the search engine results, you can still gain great benefit from paid links. Very much like paying for an advertisement in a high profile magazine, on a top radio station or in prime time TV, you pay for the link because the site is popular (in this case, it has large traffic volumes). This in turn should help drive large volumes of traffic to your website.

Internal Links:
While all of the other methods of linking mentioned here come from external sources this one is equally important and will be mentioned in greater detail in part 2.

But unless the internal linking of the website is easily crawled by the search engines, many of your pages won’t be indexed, and this will seriously harm your chances of performing well in the search results.

This is a brief summary of the types of links to a website. I will explain how to put these into practice in part 2.

Sunday, February 22, 2009

Doorway and Landing Pages

When somebody mentions a landing page, or even a doorway page, you might immediately dismiss it as an archaic practice. But today, with the search engines indexing almost every page of a website, every indexed page is a potential landing page.

In years past it was sometimes a tactic to create a single page for each search engine in the hope of winning top spot for a particular search phrase. This now heavily penalised practice of creating gateway pages was one of the earlier ways webmasters gamed the system.

But by landing page I’m not referring to a page that has been optimised for one specific keyword only to lead you on to another webpage or even another website, but rather a page that could generate traffic to your website, or even convert that traffic. With so many pages, including dynamically generated ones, now ranking for search terms, each page is a possible entry point from a search.

An important point to always remember is that each page is indexed by the search engines (or at least you hope it is, if you’ve done your homework correctly). This means that each product in your catalogue has been indexed. While some websites simply put up lists of products, many have pages dedicated to individual products. Each of these individual product pages could be a landing page for a search for that particular item.

While someone searching for “t-shirts” may find the home pages of a multitude of suppliers, how many of those same sites would rank for the search “britney spears t-shirt”? This is where the optimisation of each page comes in handy. After all, if your pages were optimised for their specific product they would be easier to find (and longer tail keywords do convert better). By the time visitors reach your website they are not only keenly looking for your product but are able to find it instantly on the very page they land on.

While this may seem logical to many webmasters, you may be amazed at how few put it into practice. I have actually searched for a “britney spears t-shirt” and was unable to find anything that led me to the actual product. In most cases I found a website that promised t-shirts, but where Britney Spears was related to their other merchandise. It would appear that many webmasters have overlooked the value of turning a simple search into an instant sale.

In this case a simple page optimised for exactly what it sells (a particular t-shirt) would draw a visitor, confirm that it is exactly what the visitor is looking for and hopefully make the sale – bypassing the homepage, as well as the other pages usually credited with generating quality traffic, in the process.
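To make that concrete, here is a purely hypothetical sketch of the head section such a product page might carry (the shop name, price and wording are placeholders, not a real listing):

<title>Britney Spears T-Shirt | Band Tees from YourShop</title>
<meta name="description" content="Britney Spears T-Shirt in stock | USD 19.99 | Ships within 24 hours from YourShop" />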

Unlike the landing pages of the past, which were rarely of any value to the visitor, these pages need to give visitors a reason to be there while at the same time reassuring them that not only is the page legit, but the whole website is reputable.

Make sure that these pages:
  • State clearly what the page is about or what it is offering.

  • Link to your privacy and security policies.

  • Assure the visitor that payment is secure, if the purpose of the site is to make sales.

  • Link to the about us/contact us pages, so the visitor can either contact you directly with any questions or get a better feel for your company.

Possibly the most powerful use of a landing page is for competitions or other special offers. This is a single page, often not even linked into the rest of the website, that serves a single purpose: to capture information or make an instant sale. It is usually linked to from a specific source, such as an email or a preferred partner. The reasoning is that you already have an interested party, and you are making them a unique offer to which they will either say yes (and sign up or buy), or no (and simply not visit the page).

These are a salesperson’s dream. With a specific audience already targeted, it is often easy to style and word a page for optimum conversions. Needless to say, the conversion rate from visitor to sale should be relatively high; if it isn’t, the campaign simply wasn’t put together correctly.

While gateway pages are a thing of the past, the value and benefits of a well structured landing page are immense. Whether you are making special deals available to only a select few or trying to generate added sales through better optimisation of individual pages, remember that every page should benefit the visitor. If your landing page serves only to rank highly in the search engine results, the visitors may not see the benefit and your number one ranking will be for nothing.

Oh, and for those who are interested, I did eventually find a Britney Spears t-shirt after a lengthy search.

Thursday, February 12, 2009

Flashing your Wares

Mostly because Corrine wanted to know…

When I think of flash I immediately think of Flash Gordon or simply “The Flash” himself. While growing up, these kinds of demigods were heroes of mine. Even now as an adult (and I use that term very loosely, as all it really means is that I can buy beer and am allowed to vote), I still think back on how cool they were, particularly Flash Gordon*, who went on to save the world.

These days when people speak of Flash they are mostly referring to the Macromedia product and the impressive interactive designs that adorn many of the top websites. Initially, Flash was a pet hate of mine; I just couldn’t see it working for an e-commerce website. The splash pages that so many websites had were for the most part poorly designed, well oversized and devoid of any accompanying text. They succeeded only in slowing entry into the website and leaving the page impossible for the spiders to index. It would seem that these days many of the splash pages have been replaced (thank goodness), although recently there has been an increase in the number of sites with a Flash intro.

As with so many good things, Flash has been brutally abused. For such a long time Flash was seen as a massive cool factor. The biggest disadvantage for me has always been the size of the files. While navigation files may be quite small, the splash pages, and often the headers, were just way too big. While many people have dedicated digital lines at work, they often only have dial up connectivity at home, and everybody knows how frustrating it can be trying to browse a website full of large images and other multimedia files over a mere dial up connection. Beyond this frustration there are a number of reasons why Flash is just a bad idea, these include:
  • Flash breaks the back button. If you navigate within a Flash object and hit the “Back” button, it takes you back to a previously viewed page rather than back a step within the Flash object itself.

  • The standard link colours do not apply. This can lead to confusion as to which pages have been viewed and which ones have not.

  • Flash integrates badly with search functions. More on this later.

  • The design is fixed. Text cannot be enlarged for people with limited vision, and the view can’t be changed to suit the end user’s needs.

  • Flash in general is difficult to access by visitors with disabilities.

Recently Google has announced that it can indeed read .swf files looking for text. According to Dave Taylor of askdavetaylor.com, “Google can parse through the text contained within a .swf file and present that information in a Google search. But due to the fact that an entire website can be contained in a single .swf file, whereas a traditional HTML site may consist of hundreds of individual pages, the weightings and rankings given to certain pages may not be accurately portrayed in Google's results.”

This means that Flash still comes up short when it comes to ranking favourably in the search engine results. While I don’t see this hurting major household name brands (such as Coca-Cola or Pepsi), it certainly means that a smaller emerging business with an all-Flash website is going to struggle to rank well.

As mentioned earlier, Flash integrates badly with search functions. This is mostly because the Flash file itself can’t be indexed in the same manner as plain text. While Google has managed to index the plain text within an .swf file, it has become plain that, from a search point of view, the Flash file is treated in much the same way as an image. The ability to read the text in these files would seem to offer little more benefit than using an “Alt” or “Title” tag. However, this may change in the near future.
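One practical workaround, sketched here with purely hypothetical file names and text, is to leave real HTML inside the object tag as fallback content; visitors with the plugin see the movie, while spiders and everyone else at least get indexable text:

<object type="application/x-shockwave-flash" data="header.swf" width="600" height="120">
Handmade stunt kites and kite accessories for sale.
</object>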

Perhaps in time the search engines will be able to index Flash files correctly, perhaps not. Personally, I would prefer it if they don’t; some things really should be discouraged on corporate websites. While Flash is a great way to build a site with all the bells and whistles, it also removes some functionality. I imagine that a website for Ozzy Osbourne could easily include Flash, as fans are more than happy to sit and wait for the objects to download. But what works for Ozzy might not work for your business. As with all website additions, the main thought should always be: “will it improve the average visitor’s experience?” If not, it shouldn’t be used.

For now, I’ll just keep thinking of Flash (Gordon) as a hero. I know my brother does and, believe it or not, even plans to name his child “Flash”. I think that’s how things should stay.

*Flash Gordon - Flash Gordon the Film

Facts are Meaningless

“Facts are meaningless. You could use facts to prove anything that’s even remotely true!” I think Homer Simpson really hit the nail on the head with this one. We are told, time and time again, that analytics are the way to go. But what exactly does that mean? What exactly do you measure? What is a good measurement? And what are you really measuring to begin with?

Right now it would seem that the most popular analytics package is Urchin, now owned by Google. “Urchin 5 analyzes traffic for one or more websites and provides accurate and easy-to-understand reports on your visitors - where they come from, how they use your site, what converts them into customers, and much more.” But while it gives you a multitude of numbers, what does this really tell us?

So we are still at the very beginning. Before even thinking about running any kind of analytics you first need to define your objective. While this can be a lot easier if you are selling a tangible product, it is not always a clear cut decision. And while a quick sale is almost always a good thing, many would agree that the effort made to land a single sale can far outweigh the profit made on it. More often than not when dealing with sales you are looking for returning customers. So in this case it’s not a sale you are after but a returning customer.

If this is the scenario you find yourself in, then you would definitely prefer a lower ratio of unique visitors to overall visits. If however you were providing a newsletter subscription service, then you would want as many unique visitors as possible. With your product being a one-time sign-up, the more new readers who see your site, the more subscriptions you are likely to gain.

The point here is that the same metric could mean two different things to two different businesses. Here are a few points to ponder:
  • A high percentage of your traffic is generated by search engines

  • A high percentage of your traffic is generated through direct traffic

  • A high percentage of your traffic is generated by referring sites

Each of these could be either a good or a bad sign. If the search engines drive the majority of your traffic, this could indicate that many visitors don’t return. On the one hand, it could mean that visitors do not bookmark your site, or even that the targeted keywords or phrases are driving the wrong kind of traffic to your website. On the other hand, it could be an indication that traffic is on the rise or that newly targeted keywords are now generating more traffic than before.

If most of your traffic is direct, this could indicate that many visitors have bookmarked your website and return often, that newsletters are driving visitors to your website, or that off-line marketing is working for you. It could also be a sign that you are slipping in the search engine rankings, or that visitors simply aren’t searching for your services/products.

While gaining traffic of any kind is rarely a bad thing, too much traffic generated from other websites could be an indication that, while many people need your service or product, they simply aren’t finding your site first. This means you are losing out on a lot of first time traffic and are probably just getting the scraps. However, this could also be a positive sign: if you are selling a product or service in a highly competitive market, it indicates that you have managed to form some very valuable alliances.

There are always two sides to this coin (if not more). Rather than read too much into a single data point, you really need to sit down and go through everything with a fine tooth comb. You might be surprised at what you find.

A little while back we ran a Google AdWords campaign for a client. We found that during this period the bulk of the traffic was direct. While this seemed to prove that the AdWords campaign was failing horribly, it turned out that, far from failing, it was actually driving much better targeted visitors to the website; during this period the number of visits per visitor suddenly climbed. Then another interesting point came up: while the page views per visit had initially increased, they were now on the decline. As our client’s website contained many, and frequently updated, listings, once a visitor had seen a listing they rarely viewed it again.

In conclusion, the AdWords campaign had done its job: targeted traffic had been directed to the website, and these visitors continued to return (often signing up for the newsletter), but often browsed only the newer listings. This made them appear uninterested, while in actual fact they were dedicated visitors.

While it looked as though the AdWords were having little to no effect, the campaign actually turned out to be a very profitable exercise. In this case returning visitors are highly sought after: if they don’t find what they are looking for the first time, they might find it on a return visit.

Then again, if Homer is correct, then facts really could prove anything.

Thursday, February 5, 2009

Faceless Communication

“If you can't stand in front of your target, reading what you've written aloud, you have no right to it.” – Frank Watson

After reading countless articles on social media, and more recently the sad tale of Megan Meier, it would seem that the worst elements of society thrive on the safety of being anonymous. So often I’ve looked at blog postings, or more specifically the comments left by unnamed visitors, and wondered whether those people would be able to say the same things to someone face to face.

While some people are quite blunt when conversing face to face, most are not. So how is it that they can go about posting comments like these? Simple: nobody knows who they are, and they are not likely to be held accountable. We all know how difficult it can be to tell a mother-in-law that the dinner she just spent 2 hours preparing is terrible; it is so much easier to simply grin and keep chewing. On the other hand, it takes 5 seconds to blog about it later, especially if you know for a fact that the said mother-in-law is never going to read your blog.

These things must always be considered when dealing with people in any form online. In a world that has quickly moved from posted letters that could take weeks to arrive, to airmail, to email and now text messages, so much can be lost in translation, or even in the lack of it. Twitter is another step in that direction. While it is quite possible to get a solid point or idea across in just 140 characters, many people can’t. Let’s face it, there really is an art to getting a wide audience to understand your idea, or driving an idea home, with little more than a few written words. I guess that’s why writers (good writers, mind you) have always been in great demand.

Points to remember when dealing with people online, netiquette if you like:
  1. Sometimes visitors to your website or blog are really just looking for a spot to vent. Moderation can be a friend here, but why not contact them and politely ask why they wanted to leave such harsh comments?

  2. With email and other simple text message systems out there, if someone seems upset with you it could be that they misinterpreted the tone of your message.

  3. Sometimes emails or other messages genuinely do get lost somewhere along the line in cyberspace. This can and does happen quite regularly. Don’t assume that someone is fully clued up on all correspondence unless they actually confirm that they are – or at least think they are.

  4. Don’t take everything at face value; just because it is online doesn’t make it true. Think of all those hoax virus emails we get regularly. Take it with a pinch of salt and move on.

  5. Don’t take it personally. Okay, so if someone has had a real go at your name or business reputation it is kind of personal, but don’t resort to similar retorts. Rather turn it around and give a civil, helpful reply. You might just be able to convert them into a glowing reference.

While it’s not always easy to deal with people face to face, it’s sometimes a lot more difficult to deal with anonymous entities. Try to get to know who it is that regularly posts comments on your blog or replies to your emails with scathing remarks. You might just find that it’s little more than miscommunication.

Sometimes, however, you do have those who wish to incite or mislead others, for whatever their reasons may be. The best answer at these times is no answer at all. Some things simply do not require an answer, but if you feel you simply must reply, remember: “If you can't stand in front of your target, reading what you've written aloud, you have no right to it.” Always be accountable for what you post; this is the easiest and quickest way to gain respect online. Once you’ve got that, others will simply ignore unsubstantiated comments left by anonymous posters.

Wednesday, February 4, 2009

Meta Tags Reach Undead Status

I’ve heard time and time again that “Meta Tags are Dead!” While this may ring true when it comes to gaining a good rank in the SERPs, there are always two sides to every story. And while it is accepted that this tag certainly isn’t going to aid your rankings, it could make a massive difference when it comes to searchers clicking through to your website.

<meta name="description" content="your descriptive description here" />

The description meta tag is possibly the most overlooked marketing tool available to webmasters and online marketers. After the title tag, which is displayed as your listing title in the search engine results page, the description is the first real taste of your website. Does this snippet best describe your flavour?

While many search engines can and do pull snippets from a variety of sources to describe your website/webpage, these often fail to present a compelling reason to visit the page. The snippets often come from directories your website may be listed in, such as the Open Directory Project (DMOZ.org) or Yahoo!

In order to get the search engines to ignore the snippets from those directories, you will need to add the following line of code to your webpage, between the <head> and </head> tags:

<meta name="”robots”" content="”noodp,noydir”">

The noodp value stops the engines from returning the snippet from the Open Directory Project, and the noydir value stops them from returning the Yahoo! snippet.

Now the search engines should ignore those snippets and return the title and description exactly as defined on your webpage. Many overlook the true value of this. As many of our websites are real estate based, I’ll show you how this tag has been used to our benefit:

Previously:
<meta name="description" content="Exceptional Experienced Realtors with the Best Listings in the County">

Updated:
<meta name="description" content="Townhouse For Sale in Suburb, City | USD 11 000 000 | as listed by Our Realtors">

As you can see, this will make a considerable difference to someone searching for “townhouse for sale in suburb” or even “house for sale in city.” It assures the searcher that this page is indeed what they are looking for. In this case we’ve included the price, but other information can be added, such as the number of rooms or bathrooms. I think you get the idea.

Things to remember when writing a description:
1. Keep the tag short; the engines will cut it off if it’s too long (around 150 to 160 characters is a common rule of thumb).
2. Make sure the description actually matches the page content.
3. Make sure that each page has a unique description.
4. Try to include your keywords in the snippet. (This will increase the likelihood of Google returning your description as the snippet.)

A useful tip to remember is that the description doesn’t have to be a perfectly formatted sentence. As you can see from the example given, I used the pipe character to break the description into different segments of information. This is where savvy online marketers can really go to town. Include information relating to a special deal or any other compelling call to action. Often you can instil trust, and even conclude a sale, purely on the strength of that snippet. This is especially the case when the snippet belongs to a product page. By simply scanning the snippet, the visitor knows exactly what is listed, the price, and any shipping conditions that may apply.

There are times when using a description is not advised, such as on large blogs or any other pages that contain long lists of unrelated topics. In these cases it is better to leave the description tag out. The reasoning is that, when returning the search results, the search engine will pull a snippet from wherever on the page best relates to the search query. The drawback of this kind of search result, however, is that while the snippet may match the query, it may be difficult for the visitor to locate that text on the actual page. This may lead the visitor to conclude that the page is not relevant to their search.

Looking back at the title, you may be surprised or even confused by my claim that meta tags have reached undead status. While the description tag (or any other meta tag for that matter) might be dead when it comes to search engine rankings, it can still be of great value when converting those rankings into a visit or a sale. This may very well prove to be the next stage in the life cycle of meta tags – zombie.

Tuesday, February 3, 2009

Duplicate Content

Is the duplicate content penalty a myth or a reality? Well, I would say that it’s a bit of both. Many folk have been terrorised into paranoia over duplicate content, but here are the facts.

“Duplicate content is bad.”
Okay, so I said it. But please note that I didn’t say that it’s evil or that your web pages are certainly heading to supplemental hell. But duplicate content can be bad for the following reasons:

1. Multiple copies of the same content are not useful to the internet as a whole.
2. Multiple copies of the same content on your website could dilute your “link juice”.
3. Multiple copies of the same content can hamper effective indexing of your website.
4. Multiple copies of the same content will confuse the search engines as to which copy is the original.

Multiple copies are not useful to the internet. Okay, so we often turn on the TV only to find that while there are over 100 channels on offer there’s still nothing on. But imagine if, out of those 100 channels, there were really only 8 to choose from, with the rest simply showing the same thing as the others. Duplicate content works in the same way: it serves up the same content on every channel. The search engines quickly realised that good content should rank, but that the same content shouldn’t fill every slot in the results. This is the real penalty of duplicate content: the same content shouldn’t rank over and over, because if it doesn’t meet the searcher’s needs then they would have no other option.

From the average webmaster’s point of view I would say that point 2 is the most important. Can you imagine having 3 or 4 pages, all with great content (but the same content), being indexed and linked to from related websites? Sounds great, but if it is the same content then you have effectively spread the full linking power of that content over multiple pages. So instead of having one page showing up on the first page of the search results, you now have 2 on the second. I know which scenario I would prefer. The other side of the coin is that with dynamic URLs you may find that several different query strings all lead to the same product page. Again, this can be seen as duplicate content by the search engines, and with it comes the uncertainty of which URL to rank; quite often the result is the relegation of both URLs to a lower rank. Although I have a sneaking suspicion that the search engines are catching on to the dynamic URL problem and will now rank one page from a website and simply ignore the others.
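As a purely hypothetical illustration, all three of the following URLs could serve exactly the same product page, and the search engines may treat them as three duplicate documents:

http://www.example.com/product.php?id=42
http://www.example.com/product.php?id=42&sort=price
http://www.example.com/product.php?ref=home&id=42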

Should you have multiple versions of the same content on your website, as can often be the case with dynamic URLs, this may prevent the search engines from indexing your whole website. Imagine once again you are flicking through the TV channels and the first 10 are all the same; you may continue searching and reach channel 20, but if by that time you’re still searching, you must be really bored. The bots don’t have time to get bored: if one reaches page 10 and all the content seems the same, it will assume that the rest aren’t worth indexing. This could be a problem. Added to this, if it deems none of your pages worth indexing, you could face a massive uphill battle to have your pages indexed at all, never mind ranking highly.

Originality is a difficult one because, with no way of certifying who posted the content first, the search engines rely on the speed with which it was indexed. Indexing takes place at different rates; blogs are generally crawled and indexed a lot more frequently than a website that updates once a month on average. So if you added great quality content to your website only to have a blogger scrape it, you might lose the full benefit of that content, as the blog may be indexed first and be recognised as having published it first. While this is an extreme example, it is a good reason to update your website regularly and keep the search bots coming back.

As you can see from the points above, there is no definite penalty for duplicate content. But for reasons that make good logical sense, these pages often simply don’t cut it at the highest level. The questions you should always be asking are “will this benefit my visitors?” and “will this impact negatively on my visitors?” The answers should quickly send you in the right direction and keep you from possible pitfalls.

Ironically this article is a duplicate of one posted on SiteProNews that I submitted some time ago.

Monday, February 2, 2009

Bad SEO Practices

Once upon a time, many webmasters abused the system. The system now abuses the webmasters (okay, perhaps not, but these tricks no longer work).

While many articles have been written on good SEO techniques, I’ve decided to turn things around and look at a few of the “sure-win” techniques of the past. The point is to make sure that you are no longer making use of any of these techniques.

Hidden Text:
This is one of my favourites. A long time ago I even tried this one, and it worked! You picked a background colour and matched the text colour to it. Very quickly the search engines caught onto this one and simply marked the sites as spam. That wasn’t the end, however. By creating a background image of a particular colour and then making the text the same colour, you could once again benefit from hidden text. Again, it wasn’t long before the search engines caught on and penalised the offenders. While webmasters continue to find ways to add hidden content to web pages, the search engines will continue to find ways of exposing it, and you will be penalised.
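Purely as an illustration of what the engines now detect (the colours and keywords here are invented, and this is exactly what not to do), the original trick was no more sophisticated than this:

<body bgcolor="#FFFFFF">
<font color="#FFFFFF">cheap kites stunt kites kite sale kites kites kites</font>
</body>

White text on a white background: invisible to the visitor, plainly visible to the spider.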

Keywords:
I think this was the first bit of code to be totally discarded by the search engines. Gone are the days of simply putting any old keyword into the meta keywords tag and, hey presto, there you are. Keyword stuffing was the norm and has thankfully become a thing of the past. However, many designers still seem to think that you need to fill this tag with as much as possible. Google has stated time and time again that it simply does not use this tag for anything.

Description Tag:
The description tag was abused in much the same way as the keywords tag. Webmasters everywhere stuffed as many keywords into the description as possible. Because of this, the search engines no longer use it as a ranking tool. Although a well written description may not gain you rankings anymore, it can still be useful in promoting a click through by working as a sales pitch.

Gateway Pages (Landing Pages):
When mentioning gateway pages I’m not referring to pages that you have optimised for a particular search, but rather those rubbish pages that have no value, no real content, and that redirect a visitor before they know what is happening. While many SEOs still believe this is the way to go, the search engines frown on these pages as they offer the visitor little to nothing of any value.

Links:
Links are good. Links from just anybody are bad. Gone is the day when a link from a Free For All (FFA) site was worth anything (then again, was it ever really worth anything?). All those links from reciprocal link pages are now worth very little, if anything at all.

Cloaking:
As much as I like breaking the rules, I’ve never actually done this one personally. Cloaking is, as the name suggests, hiding one version of a webpage from either the visitor or the robot. This was usually done by returning a specific, super optimised page to the robot and a pretty design to the visitor. While there may be occasions where many feel this is a valid technique (returning text to the robot, Flash to the visitor), it is still a massive “No-No”.

While each of these techniques has been used, abused and banned over time, it is useful to remember the following:

While hidden text is very much frowned upon, it is good practice to keep active code off the webpage itself. By moving all of this code into a separate file and including it from there, you leave the search engines less code per page to spider, which should translate into a higher text/code ratio. This should aid with text/page density, adding value to the text content.
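As a minimal sketch (the file names are just placeholders), moving scripts and styles off the page is as simple as referencing external files from between the <head> and </head> tags:

<script type="text/javascript" src="scripts.js"></script>
<link rel="stylesheet" type="text/css" href="styles.css" />

The page itself is then left almost entirely to the actual text content.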

Keywords are no longer used by the majority of search engines. Many SEO experts now even suggest leaving out the keywords tag so that you don’t alert your competitors to the keywords you are targeting. That said, as a few search engines still use the tag (although with minimal weight), it can’t hurt to add keywords – sparingly.

The description tag is one of my favourites. As mentioned above, it won’t aid your search engine rankings, but it can add value to a page. If you have optimised your page to rank highly for “search term”, then make sure that your description compels the searcher to click on your link. It is a short description of what the page is all about, and if used as it was originally intended it can be very valuable.

Gateway pages spammed to death in the hope of tricking a visitor into visiting an unrelated website are a thing of the past; you can bet on being permanently banned for this one. However, a properly optimised landing page for a particular special offer or product is still very useful. Remember that this page should offer something of value to the visitor, or it is still just spam.

Links are one of the most important factors when trying to rank highly for a search phrase. While the old adage “content is king” has been worn out, the content of an anchor link is now king. So much so that a website can rank for a phrase that does not even exist on its pages, purely from the sheer weight of links (this can be done for malicious reasons too, often referred to as a Google Bomb).

Cloaking is still a bad idea in any form, and instant redirects are pretty much instantly banned by Google, if not heavily penalised. But should you have moved a page, be sure to put a 301 redirect in place. This will pass any link strength on to the new page.
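As a minimal sketch, and assuming an Apache server where you can edit the .htaccess file (the page names are only placeholders), a single moved page can be redirected like this:

Redirect 301 /old-page.html http://www.yourwebsite.com/new-page.html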

While the use of these techniques has been discouraged and outlawed over time their use can usually still be applied in some form or other. As a friend of mine always says, “It’s not what you do, but how you do it!” How very true.

-Previously Published on SiteProNews.