Showing posts with label bad seo.

Tuesday, March 3, 2009

Men (and Women) with Hats

Anakin from Clone Wars

SEO is easily divided into two categories: the good and the bad, the yin and the yang… the light side and the dark side. Okay, so it’s not all a battle between good and evil, but the Star Wars analogy is close to the mark. SEO is often broken into two camps: those that practise safe optimisation (White Hat) and those that prefer to break the rules for immediate results (Black Hat).

Much like Darth Vader, a black hat SEO will use all possible weapons at their disposal, often sacrificing a ship (site) or two on the way to gaining victory (top ranking). This dark side of SEO breaks the terms of service set out by the search engines by any means they deem necessary. The only thought is immediate results. Sometimes these sites may show lasting results, but this is rarely the case.

On the other hand, the white hat SEO is more like Obi-Wan: controlled, directed, methodical and with much better staying power. Okay, so I know that Obi-Wan Kenobi was struck down by Darth Vader in Episode IV, but he made an awesome guide as an apparition. There are guidelines put down by the search engines, and the white hat SEO does their best to stick to the rules, knowing that deviating from them could lead to penalties which would quickly undo all the good work done up to that point.

Most SEO consultants know how difficult it can be to explain to a prospective or new client that results may take time. Very much like a Jedi tempted by the dark side, this pressure can sometimes sway an SEO to offer a quicker, less than ideal solution. Once you start down this path it is often difficult to turn back. Sometimes it is impossible. The rewards can be great, but if you are caught out, the penalty could spell the end: a very dead website.

Okay, so that’s very much a black and white, or light side versus dark side, take on it. Let’s face it, few things are that clear cut and simple. White hat is only white hat for as long as the powers that be say it is – or until they catch on and ban a technique. I would say White Hat SEO is an oxymoron to some degree. You aren’t supposed to game the search engines, but surely optimisation, to some degree, is gaming the system?

Going back to the Star Wars universe, I would say that most SEO folk fall into a third category: Han Solo. This is more a greyish kind of hat, where you go with your “gut feeling” on what is probably right. If optimising a site for a client, you would certainly keep well clear of anything that would get their website banned, but you would also be looking to rank as highly as possible, so you would be willing to “bend” a few of the rules.

Going through most of those points, I think most people would realise that the simple act of optimising a website IS gaming the system. Okay, so many do try their best to keep within the webmaster guidelines, but any attempt to gain a favourable ranking is in some way an attempt to skew the results in your favour. As long as you stick to the webmaster guidelines, one could call the technique white hat, although I would say a true white hat SEO is one that does nothing: they simply build a perfect website, never considering a search engine for a single moment. Black hat SEO would be the exact opposite: building a website for a search engine, never considering the user for a moment.

Personally I think we are all pretty much like Han Solo: we know where the boundaries lie and we stick to our side of them… well, as much as possible. However, as the boundary keeps changing (what is right today is outlawed tomorrow), it does become difficult to maintain a perfect score. Fortunately the powers that be are quite forgiving; as long as it’s not blatant, over-the-top Black Hat, you’ll usually not be in too much hot water.

Black hat? White hat? Grey hat? Which one do you wear? Personally I don’t like wearing hats, mostly as I’ve got quite a bit of hair and usually end up with terrible hat-hair. But I do think it’s time we got over the idea of SEO being a shade of black or white; it’s so much more than that.

Thursday, February 12, 2009

Flashing your Wares

Mostly because Corrine wanted to know…

When I think of flash I immediately think of Flash Gordon or simply “The Flash” himself. While growing up, these demigods were heroes of mine. Even now as an adult (and I use that term very loosely, as all it really means is that I can buy beer and am allowed to vote), I still think back on how cool they were, particularly Flash Gordon*, who went on to save the world.

These days when people speak of Flash they are mostly referring to the Adobe (formerly Macromedia) platform and the impressive interactive designs that adorn many of the top websites today. Initially, Flash was a pet hate of mine; I just couldn’t see it working for an e-commerce website. The splash pages that so many websites had were for the most part poorly designed, the files were badly oversized and they were devoid of any accompanying text. They succeeded in slowing down entry into the website and leaving the page impossible for the spiders to index. It would seem that these days many of the splash pages have been replaced (thank goodness), although recently there has been an increase in the number of sites that have a Flash intro.

As with so many good things, Flash has been brutally abused. For such a long time Flash was seen as a massive cool factor. The biggest disadvantage for me has always been file size. While navigation files may be quite small, the splash pages and, often, headers were just way too big. Many people have dedicated digital lines at work but only dial-up connectivity at home, and everybody knows how frustrating it can be trying to browse a website full of large images and other multimedia files over a mere dial-up connection. Beyond this frustration there are a number of reasons why Flash is just a bad idea, including:
  • Flash breaks the back button. If you navigate within a flash object and you hit the “Back” button it takes you back to a previously viewed page and not back within the flash object itself.

  • The standard link colours do not apply. This can lead to confusion as to which pages have been viewed and which ones have not.

  • Flash integrates badly with search functions. More on this below.

  • The design is set. Text cannot be enlarged for people with limited vision, and the view can’t be changed to suit the end user’s needs.

  • Flash in general is difficult to access by visitors with disabilities.

Recently Google has announced that it can indeed read .swf files looking for text. According to Dave Taylor of askdavetaylor.com, “Google can parse through the text contained within a .swf file and present that information in a Google search. But due to the fact that an entire website can be contained in a single .swf file, whereas a traditional HTML site may consist of hundreds of individual pages, the weightings and rankings given to certain pages may not be accurately portrayed in Google's results.”

This would mean that Flash still comes up short when it comes to ranking favourably in the search engine results. While I don’t see this hurting major household name brands (such as Coca-Cola or Pepsi), it certainly means that the smaller emerging business with a total Flash website is going to struggle to rank well.

As mentioned earlier, Flash integrates badly with search functions. This is mostly because the Flash file itself can’t be indexed in the same manner as plain text. While Google has managed to index the plain text within an .swf file, it has become plain that, from a search point of view, the Flash file is treated in much the same way as an image. The ability to read the text in these files would seem to offer little more benefit than an “alt” or “title” attribute. However, this may change in the near future.

Perhaps in time the search engines will be able to index Flash files correctly, perhaps not. Personally I would prefer it if they don’t. Some things really should be discouraged from corporate websites. While Flash is a great way to build a site with all the bells and whistles, it also removes some of the functionality. I imagine that a website for Ozzy Osbourne could easily include Flash as fans are more than happy to sit and wait for the objects to download. But what works for Ozzy might not work for your business. As with all website additions, the main thought should always be, “will it improve the average visitor’s experience?” If not then it shouldn’t be used.

For now, I’ll just keep thinking of Flash (Gordon) as a hero. I know my brother does and, believe it or not, even plans to name his child “Flash”. I think that’s how things should stay.

*Flash Gordon - Flash Gordon the Film

Tuesday, February 3, 2009

Duplicate Content

Is the duplicate content penalty a myth or a reality? I would say it’s a bit of both. Many folk have been terrorised into paranoia over duplicate content, but here are the facts.

“Duplicate content is bad.”
Okay, so I said it. But please note that I didn’t say that it’s evil or that your web pages are certainly heading to supplemental hell. But duplicate content can be bad for the following reasons.

1. Multiple copies of the same content are not useful to the internet as a whole.
2. Multiple copies of the same content on your website could dilute your “link juice”.
3. Multiple copies of the same content can hamper effective indexing of your website.
4. Multiple copies of the same content will confuse the search engines as to which copy is the original.
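All four problems above start with the engines spotting that two pages carry the same content. As a toy illustration only (not how any real engine works), here is a sketch that assumes crude tag stripping plus case and whitespace normalisation before hashing:

```python
import hashlib
import re

def content_fingerprint(html: str) -> str:
    """Fingerprint a page's text so near-identical copies collide.

    Crude by design: strips tags, lowercases and collapses whitespace,
    so trivial formatting differences don't hide the duplication.
    """
    text = re.sub(r"<[^>]+>", " ", html)               # drop markup
    text = re.sub(r"\s+", " ", text).strip().lower()   # normalise
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Two differently marked-up pages with the same words are duplicates:
page_a = "<p>Great   Widgets for sale!</p>"
page_b = "<div>great widgets\nfor sale!</div>"
```

A real engine uses shingling and near-duplicate detection rather than exact hashes, but the principle of normalise-then-compare is the same.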

Multiple copies are not useful to the internet. Okay, we often turn on the TV only to find that while there are over 100 channels on offer, there’s still nothing on. But imagine if out of the 100 channels there were only really 8 to choose from, with the others simply showing the same thing. Duplicate content works in the same way: it serves us the same content on all the channels. The search engines quickly realised that good content should rank, but that the same content shouldn’t fill every slot in the results. This is the real penalty of duplicate content: the same content shouldn’t rank over and over, because if it didn’t meet the searcher’s needs, they would have no other option.

From the average webmaster’s point of view I would say that point 2 is the most important. Can you imagine having 3 or 4 pages, all with great content, but the same content, being indexed and linked to from related websites? Sounds great, but if it is the same content then you have effectively spread the full linking power of that content over multiple pages. So in this case, instead of having one page showing up on the first page of the search results, you now have two on the second. I know which scenario I would prefer. The other side of the coin is that with dynamic URLs you may find that several different URL formulas take you to the same product page. Again, this could be seen as duplicate content by the search engines, and with this comes uncertainty as to which one to rank. Quite often this will result in both of those URLs being relegated to a lower rank, although I have a sneaking suspicion that the search engines are catching onto the dynamic URL problem and will now rank one page from a website and simply ignore the others.

Should you have multiple versions of the same content on your website, as can often be the case with dynamic URLs, this may prevent the search engines from indexing your whole website. Imagine once again you are searching through the TV channels and the first 10 are all the same; you may continue searching and reach channel 20, but if by that time you’re still searching then you must be really bored. The bots don’t have time to get bored: if one reaches page 10 and all the content seems the same, it will assume that the rest aren’t worth indexing. This could be a problem. Added to this, if it deems none of your pages worth indexing, you could face a massive uphill battle just to have your pages indexed, never mind ranking highly.

Originality is a difficult one because, with no way to certify that you were the first to post the content, the search engines rely on the speed with which it was indexed. Indexing of websites takes place at different rates; blogs are generally crawled and indexed a lot more frequently than a website that updates once a month on average. In this case, if you added great quality content to your website only to have a blogger scrape it, you might lose the full benefit of this content, as the blog may be indexed first and recognised as having first published it. While this is an extreme example, it is a good reason to regularly update your website and keep the search bots coming back regularly.

As you can see from the points above, there is no definite penalty for duplicate content. But for reasons that make good logical sense, these pages often simply don’t cut it at the highest level. The question you should always be asking is, “Will this benefit my visitors?” or even, “Will this impact negatively on my visitors?” The answer to that question should quickly send you in the right direction and keep you from possible pitfalls.

Ironically this article is a duplicate of one posted on SiteProNews that I submitted some time ago.

Monday, February 2, 2009

Bad SEO Practices

Once upon a time, many webmasters abused the system. The system now abuses the webmasters (okay, perhaps not, but these tricks no longer work).

While many articles are written on good SEO techniques, I’ve decided to turn things around and look at a few of the sure-win techniques of the past. The point is to make sure that you are no longer making use of any of these techniques.

Hidden Text:
This is one of my favourites. A long time ago I even tried this one, and it worked! You picked a background colour and matched the text colour to it. Very quickly the search engines caught onto this one and simply flagged the sites as spam. But that wasn’t the end. By creating a background image of a particular colour and then making the text the same colour, you could once again benefit from hidden text. Again, it wasn’t long before the search engines caught onto this one and penalised the offenders. While webmasters continue to find ways to add hidden content to web pages, the search engines will continue to find ways of exposing it, and you will be penalised.
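The colour-matching trick is easy to sketch a check for. The function below is a toy: it only looks at inline styles and ignores stylesheets and background images, but it shows the kind of comparison an engine can run:

```python
import re

# Matches a hex colour or a named colour after a property name.
_COLOUR = r"(#[0-9a-fA-F]{3,6}|[a-zA-Z]+)"

def text_matches_background(style: str) -> bool:
    """True if an inline style sets the text colour equal to the
    background colour -- the classic hidden-text trick."""
    fg = re.search(r"(?<!-)\bcolor\s*:\s*" + _COLOUR, style)
    bg = re.search(r"background(?:-color)?\s*:\s*" + _COLOUR, style)
    return bool(fg and bg) and fg.group(1).lower() == bg.group(1).lower()
```

For example, `text_matches_background("color:#fff;background-color:#fff")` flags the white-on-white case, while black text on a white background passes.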

Keywords:
I think this was the first bit of code to be totally discarded by the search engines. Gone are the days of simply putting any old keyword into the meta keywords tag and, hey presto, there you are. Keyword stuffing was the norm and has thankfully become a thing of the past. However, many designers still seem to think that you need to fill this tag with as much as possible. Google have stated time and time again that they simply do not use this tag for anything.

Description Tag:
The description tag was abused in much the same way as the keywords tag. Webmasters everywhere stuffed as many keywords into the description tag as possible. Because of this, the search engines no longer use it as a ranking tool. Although a well-written description may not gain you rankings anymore, it can still be useful in promoting a click-through by working as a sales pitch.

Gateway Pages (Landing Pages):
When mentioning gateway pages I’m not referring to those pages that you have optimised for a particular search, but rather those rubbish pages that have no value and no real content, and that redirect a visitor before they know what is happening. While many SEOs still believe this is the way to go, the search engines frown on these pages as they offer the visitor little to nothing of value.

Links:
Links are good. Links from just anybody are bad. Gone are the days when a link from a Free For All (FFA) site was worth anything (then again, was it ever really worth anything?). All those links from reciprocal link pages are now worth very little, if anything at all.

Cloaking:
As much as I like breaking the rules, I’ve never actually done this one personally. Cloaking is, as the name suggests, hiding one version of a webpage from either the visitor or the robot. This was usually done by returning a specific, super-optimised page to the robot and a pretty design to the visitor. While there may be occasions where many feel this is a valid technique (returning text to the robot, Flash to a visitor), it is still a massive “No-No”.

While each of these techniques has been used, abused and banned over time it is useful to remember the following:

While hidden text is very much frowned upon, it is good practice to keep active code off of the webpage. By moving all of this code into a separate file and including it from there, you give the search engines less code per page to spider, which should translate into a higher text/code ratio. This should aid the page’s text density, adding value to the text content.
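The text/code ratio mentioned above can be approximated in a few lines. This is a rough proxy only, and it assumes that stripping tags, scripts and styles is a fair stand-in for "visible text":

```python
import re

def text_to_code_ratio(html: str) -> float:
    """Fraction (0.0-1.0) of the raw HTML that is visible text."""
    visible = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html,
                     flags=re.S | re.I)            # drop script/style bodies
    visible = re.sub(r"<[^>]+>", " ", visible)     # drop remaining tags
    visible = re.sub(r"\s+", " ", visible).strip()
    return len(visible) / max(len(html), 1)

inline = ("<html><script>function f(){return 1+2;}</script>"
          "<p>Hello world</p></html>")
external = '<html><script src="app.js"></script><p>Hello world</p></html>'
# Moving the script body into app.js raises the page's text/code ratio.
```

The file name `app.js` is made up for the example; the point is simply that the external-file version scores higher than the inline one.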

Keywords are no longer used by the majority of search engines. Many SEO experts now even suggest leaving out the keywords tag so that you don’t alert your competitors to which keywords you are targeting. That said, as a few search engines still use keywords (although with minimal weight), it can’t hurt to add them, sparingly.

The description tag is one of my favourites. As mentioned above, it won’t aid your search engine rankings, but it can add value to a page. If you have optimised your page to rank highly for “search term”, then make sure that your description compels the searcher to click on your link. This is a short description of what the page is all about, and if the tag is used as it was originally intended, it can be very valuable.
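If you want to sanity-check descriptions across a site, a simple lint is enough. The 155-character ceiling below is an assumption about roughly where result snippets were truncated at the time, not an official limit:

```python
def lint_description(desc: str, limit: int = 155) -> list:
    """Return a list of problems with a meta description (empty = fine)."""
    problems = []
    if not desc.strip():
        problems.append("empty description")
    elif len(desc) > limit:
        problems.append(
            "too long (%d chars); likely truncated in results" % len(desc))
    return problems
```

Running it over every page's description quickly surfaces the empty and over-stuffed ones.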

Gateway pages spammed to death in the hope of tricking a visitor into visiting an unrelated website are a thing of the past; you can bet on being permanently banned for this one. However, a properly optimised landing page for a particular special offer or product is still very useful. Remember that this page should offer something of value to a visitor, or it is still just spam.

Links are one of the most important factors when trying to rank highly for a search phrase. While the old adage “content is king” has been worn out, the content of an anchor link is now king. So much so that a website can rank for a phrase that does not exist on its site, purely from the sheer weight of links (this can be done for malicious reasons too, often referred to as a Google Bomb).

Cloaking is still a bad idea in any form. Instant redirects are pretty much instantly banned by Google, if not heavily penalised. But should you have moved a page, be sure to put a 301 redirect in place. This will pass on any link strength to the new page.
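How the 301 is issued depends on your server. As an illustration only (the paths and destination URL here are made up), a minimal WSGI sketch that answers requests for a moved page with a permanent redirect might look like this:

```python
def moved_permanently(old_to_new):
    """Wrap a mapping of old paths -> new URLs in a tiny WSGI app
    that answers matching requests with a 301 redirect."""
    def app(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        if path in old_to_new:
            start_response("301 Moved Permanently",
                           [("Location", old_to_new[path])])
            return [b""]
        start_response("404 Not Found",
                       [("Content-Type", "text/plain")])
        return [b"not found"]
    return app

# Hypothetical mapping: the old page now lives at a new URL.
app = moved_permanently({"/old-page.html": "https://example.com/new-page"})
```

In practice you would configure the same thing in your web server rather than in application code; the point is the permanent status plus a `Location` header.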

While the use of these techniques has been discouraged and outlawed over time, they can usually still be applied in some form or other. As a friend of mine always says, “It’s not what you do, but how you do it!” How very true.

-Previously Published on SiteProNews.