18 Common SEO Mistakes
21 March 2010, Jonathan Saipe
Given the complexities of search engine algorithms, coupled with an ever-changing technology landscape, running SEO campaigns is by no means easy. Our mission is to demystify the complexities of search engine optimisation in our SEO training courses.
Below is a guide to 18 common SEO mistakes or omissions made by website owners or web designers.
1. Incorrect Keyphrase Deployment
You don’t have to be a rocket scientist to figure out that deploying commonly-searched-for keyphrases within your on-page content and metadata will yield better search results. Avoid deploying content with keyphrases that have poor search demand or are less likely to convert humans.
2. Poor Website Architecture
Good website architecture will promote better website crawling, distribute PageRank more effectively throughout a website and will also land humans at the right point in the searching or buying cycle.
Don’t create your website information architecture in isolation from search engine optimisation.
3. Ineffective Anchor Text
Our SEO audits continually cite poor use of anchor text such as “click here” or “more info”. These anchor text links add little SEO value and are not helpful for accessibility. You will improve your overall link equity by using searchable terms in your hyperlink (anchor) text.
Hyperlinks with good anchor text inserted into body copy are effective for SEO, as they are often contextually relevant and often have good keyphrase prominence.
4. Missing or Duplicate Metadata
We commonly find that websites have either poor or missing website metadata. Or the other extreme is when websites duplicate their metadata across all pages.
Follow best practices when writing your website metadata and keep your title tags and meta descriptions unique per page. If your website is huge, consider how you can dynamically generate meaningful metadata.
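As a sketch, a well-optimised page head might look like the following (the page name and wording are purely illustrative):

```html
<head>
  <!-- Unique, keyphrase-relevant title tag - written per page, not duplicated site-wide -->
  <title>SEO Training Courses in London | Emarketeers</title>
  <!-- Unique meta description per page: a compelling summary shown in search results -->
  <meta name="description" content="One-day SEO training courses covering keyphrase research, site architecture and link building.">
</head>
```

The title tag and meta description are the two pieces of metadata most worth getting right, since both commonly appear in the search result snippet itself.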
5. Hosting Location and TLD
UK websites that use a non-UK top level domain name (TLD) such as .com or .org – and that are hosted outside the UK – often find themselves excluded from the UK indices of Google, Bing and Yahoo! The same applies to any other localised websites using generic TLDs.
It’s an easy mistake to make. But is it easy to fix? There are a variety of options available, but the quickest solution is to use Google’s Webmaster Tools to localise your site. However this doesn’t work for Bing and Yahoo! And furthermore, this doesn’t always resolve the problem if the website is targeting an international audience. In an ideal world, site owners should have local country TLDs with content unique to that country. Easier said than done!
6. No Linkbaiting Strategy
Too many websites still adopt a “brochureware” strategy when it comes to content, where there’s little valuable, up-to-the-minute useful content that would naturally attract links. Brainstorm what kind of linkbait should be available on your website and make sure it’s kept fresh.
7. Zero or Low Content Velocity
Following on from the previous point, ensure that your content is kept up-to-date and fresh. New or recently updated content is likely to be crawled and indexed more readily. Brainstorm your content velocity strategy and ensure that you have the resource to maintain your audience’s expected level of updates.
8. Missing XML Sitemaps
Sitemap submission is often ignored, as the work tends to fall between your web design agency and your SEO agency. In most cases, submitting XML sitemaps is simple and cost effective. Ensure sitemaps are submitted to the major search engines every time your website content is updated.
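A minimal XML sitemap follows the sitemaps.org protocol; the URL below reuses the illustrative domain from the canonicalisation example later in this article:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/products/</loc>
    <!-- Optional hints: last modification date, expected change frequency, relative priority -->
    <lastmod>2010-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save this as sitemap.xml in your site root and submit it via each engine’s webmaster tools, or reference it from robots.txt with a `Sitemap:` line.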
9. Poor Keyphrase Relevance
Keyphrase relevance refers to matching popular on-page keyphrases (that have good co-occurrence) with the content found in your metadata, link anchor text, URL structure, header tags and ALT attributes for images. Tackling SEO holistically – either across a single page or across a website – will yield improvements in search rank, so maintain a good level of keyphrase relevance.
10. Complex Data-driven URLs
Nowadays, search engines are much better at crawling websites with query strings and session variables attached to URLs. But having infinite or very long URLs can slow down crawl rates, which in turn can cause index exclusion.
Furthermore, humans will react better to “human-friendly” URLs and are more likely to click on those search results in SERPS. Encourage search-engine friendly URLs that follow your keyphrase research strategy.
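As a sketch for Apache servers (assuming mod_rewrite is enabled; the script name and slug pattern are illustrative), a rewrite rule can present a human-friendly URL while the underlying page remains data-driven:

```apacheconf
# Map the friendly URL /products/leather-satchels onto an
# illustrative query-string URL (product.php is hypothetical)
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

Visitors and search engines then see /products/leather-satchels rather than product.php?slug=leather-satchels, keeping your keyphrases visible in the URL.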
11. Duplicate Content
Google is well known for excluding content that looks appreciably similar to other content within your domain or across domains. Avoid duplicate content such as repeated boilerplate text, repeated product descriptions, syndicated content or stripped-down printer-only or mobile-only content.
Check for duplicate content using Copyscape and if you are syndicating content such as online press, create a backlink to your original content to indicate the original source of the content.
Also, ensure your website architecture is improved, so that duplicate content is avoided i.e. link to one version of a page rather than repeating the same content across multiple pages.
12. Domain Canonicalisation
Domain canonicalisation is harder to pronounce than to understand! In short, ensure that only one version of your page URLs exist. For example, if your home page is http://www.domain.com – avoid publishing alternative versions such as http://domain.com or http://www.domain.com/index.html etc.
Set up appropriate 301 (permanent) redirects to canonical versions of your URLs. If necessary use Google’s rel="canonical" link element within your domain or across multiple domains. Look at configuring your server to add www if the URL is referenced without it (or vice versa).
By choosing the right canonical URL, you will avoid possible duplicate content exclusion, and you will also direct link equity more efficiently throughout your website.
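As a sketch for Apache servers (assuming an .htaccess file with mod_rewrite enabled), a 301 redirect from the non-www to the www version of the domain might look like this:

```apacheconf
# Permanently redirect domain.com/... to www.domain.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

Where a server-side redirect isn’t possible, adding `<link rel="canonical" href="http://www.domain.com/" />` to the page head signals your preferred URL to the search engines instead.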
13. Accidentally Blocking Bots
If you happen to be deploying a robots.txt file or on-page metadata controlling bot activity, ensure you aren’t accidentally blocking bots. If you’re unsure how to use these techniques, it’s probably better not to use any at all.
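For example, this single Disallow line in robots.txt blocks all well-behaved crawlers from your entire site – a surprisingly common accident:

```text
# Blocks ALL compliant crawlers from the whole site - usually a mistake
User-agent: *
Disallow: /

# To allow full crawling, leave the Disallow value empty instead:
# User-agent: *
# Disallow:
```

The same applies on-page: a stray `<meta name="robots" content="noindex, nofollow">` left over from a staging site will quietly remove pages from the index.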
14. Excessive Link Velocity
Link building is a huge part of search engine optimisation. But over-zealous link building can be picked up by search engines as black hat SEO, normally seen among link exchange schemes. Encourage link building – yes. But don’t use schemes that build thousands of links in a very short space of time – especially if they all have the same anchor text. Chances are you will suffer for it!
Use Majestic-SEO’s backlink history tool to measure backlink discovery over time.
15. Non-Indexable Dynamic Content
Content generated by Flash, AJAX or other client-side scripting is often invisible to search engine crawlers. If your core content or navigation relies on these technologies, provide crawlable HTML alternatives so that search engines (and users without those technologies) can still reach it.
16. No Header Tags
When adding page headlines and sub-headers, use semantic header tags styled with CSS rather than styled paragraph text. Whilst H1, H2, H3 tags etc won’t instantly propel you to position one in SERPS, they are nonetheless considered best practice for both SEO and accessibility.
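As a sketch (the heading copy is illustrative), semantic headings keep keyphrases in the markup where crawlers expect them, while CSS handles the visual styling:

```html
<!-- Keyphrase-relevant, hierarchical headings styled via CSS, not styled <p> or <span> text -->
<h1>SEO Training Courses in London</h1>
<h2>What Our One-Day Course Covers</h2>
<p>Our course covers keyphrase research, site architecture and link building...</p>
```

Aim for one H1 per page describing its core topic, with H2s and H3s nesting logically beneath it.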
17. Missing or Ineffective Image ALT attributes
If you are adding images to your website, give each meaningful image an alternative description to help with SEO and accessibility. Here’s your chance to squeeze in a couple of useful keyphrases whilst adhering to accessibility guidelines. Best practice says to use an empty ALT attribute (alt="") on incidental or decorative images that have little or no meaning, so that screen readers skip them.
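For example (file names and descriptions are illustrative):

```html
<!-- Meaningful image: descriptive, keyphrase-relevant ALT text -->
<img src="leather-satchel.jpg" alt="Handmade brown leather satchel">

<!-- Decorative image: empty ALT attribute so screen readers skip it -->
<img src="divider.gif" alt="">
```

Keep ALT text short and descriptive of the image itself – stuffing it with unrelated keyphrases helps neither users nor rankings.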
18. Poor Link Building Techniques
I’ve left one of the most common SEO mistakes ‘til last. Don’t bother trying to attract links from websites that have little contextual relevance with your website. Also avoid bad neighbourhoods such as link directories, link farms, banned sites and websites with hundreds of outbound links.
Encourage links from websites that have a good PageRank and search rank for keyphrases in your sector. Try to encourage the backlinks to contain searchable anchor text. Look to see what other links are contained on the donor page; ensure they have good co-citation i.e. the other outbound links backlink to websites within your sector.
Whilst the above is by no means an exhaustive list of common SEO mistakes, these should keep you busy for now!