SEO gone overboard

The hunt for a good search engine ranking affects more and more companies. A higher ranking means more visitors, which (most likely) means more customers. More customers, in turn, mean more money, which is what business is about.

But sometimes, the hunt hurts the quality of a web site.

Recently, I was told by an SEO company (through my client’s marketing department) that any anchor links in the code were bad for SEO. While I naturally understand that empty links (e.g. <a href="#">Link text</a>) aren’t a good thing, I seriously can’t fathom why a valid anchor link, <a href="help/#contact">Contact us</a>, would be bad.
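
For clarity, here is the difference in markup (the id value and the page name are just made-up examples):

    <!-- An empty link: it goes nowhere, and helps no one -->
    <a href="#">Link text</a>

    <!-- A valid anchor link: it points to a named section on the help page -->
    <a href="help/#contact">Contact us</a>

    <!-- On the help page, the target of the fragment -->
    <h2 id="contact">Contact us</h2>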

I strongly feel that this is just BS, but I have no hard proof that anchor links are harmless. To me, it’s just common sense that no serious search engine would punish web sites that strive for better accessibility and for giving the end user the best experience possible.

But let’s, for the sake of argument, say that this is true. Imagine that you would indeed move two notches up the search engine ranking ladder if you got rid of all your anchor links. Your competitor, who doesn’t blindly listen to (bad) SEO companies, would be just below you in the list, with their anchor links intact. The difference is that your end users might visit your web site first, get annoyed by poor usability that doesn’t live up to its search engine ranking, and instead go to your competitor’s web site, which cares about its users, and do business with them.

With this, I appeal to you not to stare blindly at search engine rankings alone. They are important, no doubt about that, but in my opinion, you should never forsake the usability of your web site for them.


PS. If anyone has any hard facts showing that anchor links aren’t bad in any way, please let me know! DS.

27 Comments

  • icaaq says:

    Hi, I have been thinking about this for quite a while, actually since I read a report from funka.nu.

    An example of what they said in that report:

    If you have different function areas on the page, then every function area shall be indicated by a new level-one headline. By function areas they meant, for example, every box on stockholmshem.se, like "nyheter", "Ombildning till bostadsrätt" and so on. That would lead to 8 level-one headlines, including "Välkommen. Hos oss bor 60 000 stockholmare." They are referring to checkpoint 3.5, "Use header elements to convey document structure and use them according to specification". I can’t find anything about having to use multiple level-one headlines in that section, but apparently funka.nu have; whether they are right or not is another debate.

    The problem is that if your client wants to follow funka.nu’s guidelines when creating an HTML document, then the SEO people will have a big argument for using multiple level-one headlines, and thereby lose the importance of them all.

    The conclusion, hmmm: you can get a good, SEO-friendly site if you go by standards, but if you wanna go the extra mile, you have to cut corners on accessibility.

    I can also say that I would have used level-two headlines for the section areas in the example above.

    Cheers.

  • Teddy Zetterlund says:

    Accessibility improves SEO. If/when SEO hurts accessibility, it's probably being done the wrong way (often in ways that'll get you banned from search engines as well). I'd like to see concrete examples of SEO techniques that actually help a web site get a higher rank while at the same time lowering its accessibility.

    And about using h1's all over the document, I'd like to refer them to this W3C tip: Use <h1> for top-level heading

  • Ash Searle says:

    Taking the W3C's techniques for section headings into account, I'd suggest h1 is used to head the main content of the page, with h2 used to head any other functional areas.

    (I think) this balances three objectives:

    "Use h1 for top-level heading",

    "Content developers should not 'skip' levels",

    and "some users skim through a document by navigating its headings".

    Note: this doesn't say they navigate exclusively by h1 headings.
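
    A minimal sketch of the structure Ash describes (the headline texts are made up):

      <!-- The single h1 heads the main content of the page -->
      <h1>Page topic</h1>
      <p>Main content…</p>

      <!-- Other functional areas get h2, so no levels are skipped -->
      <h2>News</h2>
      <h2>Contact us</h2>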

  • Jason says:

    In certain industries, I often see high-ranking sites that overload their homepage with keywords to the point that the content is nearly nonsensical. The potential client won't even bother reading the content or going through the site, because either the product/service is not too complicated, or the client mistakenly assumes that he cannot be educated by a web site about this product/service; sometimes, if he sees a price that is cheap enough or a wonderful promise, he will just pick up the phone and call. My point is that there is good business to be had through SEO hackery by catering to the immediate needs of unsophisticated consumers. Like fast food, web sites that are highly visible to tasteless consumers (or consumers uneducated about a certain market) may not need to be designed to such high standards.

  • Ash Searle says:

    Robert, you said: Your competitor, who doesn’t blindly listen to (bad) SEO companies, would be just below you in the list with their anchor links intact.

    Which search engines keep anchor links intact? I tend to stick to Google, and I can't remember ever jumping directly to the relevant part of a page (i.e. AFAICT, Google doesn't include your #anchor in search-result URLs; a quick check on Yahoo and live.com also indicates they ignore anchors).

    It seems that the best you can hope for is that the search-engine ignores the anchor. Accessibility (when navigating from a search engine) seems to be a moot point.

  • Tanny O'Haley says:

    Mike Davidson did an SEO test regarding web standards and came to the following conclusion:

    Although good semantics are somewhat valuable in optimization, simple things like proper titles, descriptive filenames, and incoming links are dramatically more important.

    And as for web standards:

    The findings do support my initial suspicions about web standards as they relate to SEO though: that they matter about as much as a cheap umbrella in a hailstorm. That is to say: “kind of”.

    Developers should write clean, semantic code as a matter of professionalism rather than search engine optimization. For good SEO, making your site sticky enough to attract quality incoming links is by far and away the thing to concentrate on.

    It looks like well-written content is the most important thing for SEO ranking.

  • Sean Fraser says:

    Robert,

    There is no hard evidence that #anchor links are bad. Or good. Ash is correct: search engines ignore them. From your example, "help" would show in search results; the "#contact" section would not. Presently, search engines are concerned with pages, not individual on-page sections. (There have been past rumours that they have experimented with citation links, but nothing has materialised from those experiments.)

    Most SEO companies refer to the text in links as "anchor text", but even that doesn't explain your client's marketing company's SEO statement. I don't know what they mean. Very few SEO companies understand web semantics.

    And even if your client used #anchors: so what? Every page has other elements that can be optimized.

    As regards the use of multiple <h1> elements in a page, Google prefers a single inclusion; spammers used multiple <h1>s and were penalized. Google's search results algorithms seem – all SEO studies are empirical – to have included most of the W3C's "best practices" standards.

    I – still – cannot comprehend that SEO company's "anchors are bad" statement.

  • Deborah says:

    I found the comment from the SEO company that "anchor links in the code are bad" rather odd, and in contrast to what I've been reading on search engine optimization websites. One site I visit frequently is SEOmoz.org, which provides lots of great info about SEO.

    Their search engine ranking article highlights what several SEO experts consider the most important factors in ranking. The article isn't dated, but the information in it is in line with the advice I've read on other SEO websites.

    Within the five categories of factors identified in the article, the experts came up with the following top 10 factors:

    1) Title Tag – 4.57
    2) Anchor Text of Links – 4.46
    3) Keyword Use in Document Text – 4.38
    4) Accessibility of Document – 4.3
    5) Links to Document from Site-Internal Pages – 4.15
    6) Primary Subject Matter of Site – 4.00
    7) External Links to Linking Pages – 3.92
    8) Link Popularity of Site in Topical Community – 3.77
    9) Global Link Popularity of Site – 3.69
    10) Keyword Spamming – 3.69

    Anchor text is ranked #2 out of 10.

  • Robert Nyman says:

    Thanks for your comments!

    Jason,

    Yes, to some degree that's correct; but hopefully, most people will appreciate a quality web site that they like, and not choose it solely for low prices.

    Ash,

    To clarify: I don't think search engines index actual anchor links; rather, they just ignore them. What I mean with that statement is that once the visitor is at your web site, accessibility and usability will be better, and that's why they'll prefer the web site with the anchor links.

    As long as they ignore them, that's just fine. But I need proof for that. 🙂

    Deborah,

    Absolutely, but I think there's a distinction between the anchor text and the actual internal anchor linking.

  • icaaq says:

    Hi again,

    About the internal links issue: maybe the SEO company got off on the wrong foot and thought that an internal link is indexed in the same way as a dynamic link, which it is not. Search engines don't care about the hash mark whatsoever.

    If they did index them that way, the search engines would index the same content under different pages, which gives a bad ranking, at least on Google.

    Cheers.

  • Johan says:

    Find-ability should come first! Information architecture, that is.

    Accessibility has nothing to do with SEO or SEM, it is a usability feature.

  • Stefan Van Reeth says:

    @Robert

    Nope, search engines don't index anchors. I can't prove it, but by some deduction we can get there. Let's design a generic spider ourselves:

    First of all, indexing pages comes down to checking for the things search engines value, not validating every element on a page. So a spider would home in on the page name, titles, keywords and the full text; in short, things that define the content. Then it compares all of those with each other, to calculate the relevancy of each and to look for keyword spamming and other bad stuff.

    Of course, link relevancy is also checked: by following the links and doing the same thing there. Following anchors would mean performing the same checks again on the same content, so that doesn't seem to be the way to go.

    Spiders are written for speed: they have to crawl the net, and last time I checked there were a few billion pages out there.

    Ever used the W3C validator (that's a rhetorical question, of course :))? It takes a few seconds for even a small page to be validated. This is because it checks each and every element, and that costs time. Imagine Google's spiders doing the same thing: their index would take ages to build and would be hopelessly out of date…

    @icaaq

    As I said: validating anchors would amount to validating a page twice or more. Surely this would get us a duplicate content penalty. That alone is reason enough to believe they're simply ignored. Doing the things spiders need to do is hard enough to fit into a small and fast application. Adding logic to compare an anchor with the content it points to isn't all that easy when you think about it. It gets rather bloated very fast, and that is contrary to the design of spiders.

    What IS the content belonging to an anchor, in fact? The paragraph, div or other element the anchor is nested in? The following tags with text in them as well? Where does that content end, then? What about anchors pointing to pictures: just evaluate any title attribute (if there is one)? Imagine writing RegExps for that ;)!!!
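
    To illustrate why ignoring them is cheap: a spider only has to drop the fragment when it normalizes URLs, and help/#contact and help/ collapse into one document. A minimal JavaScript sketch (the function name and URLs are made up):

      // Made-up sketch of URL normalization in a spider's queue.
      // Dropping the fragment means "help/#contact" and "help/"
      // become one and the same document: no duplicate indexing.
      function normalize(href, base) {
        var url = new URL(href, base);
        url.hash = '';
        return url.toString();
      }

      normalize('help/#contact', 'http://example.com/');
      // -> "http://example.com/help/" – the fragment never reaches the index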

    @Tanny O'Haley

    Right on, man. Good semantics give good content a better relevancy rating, but bad content will never rank high these days, except when it's visited by thousands of people each hour. Site popularity is by far a better way to rank high than writing semantically perfect pages.

    @all

    Seems to me that the BS-factor is quite high with the statement of the SEO company guy. I think we can all safely assume that our trusted and well-known techniques for SEO are still valid as they are. So let's just file this one under a large stack of other obsolete stuff and forget about it.

    @me

    Stop writing novels on other people's pages!!!

  • Robert Nyman says:

    Tanny,

    Thanks, interesting link (I always like Mike's writings). I think it's fine that good content and linkage gets the highest ranking. I think good semantics pays off to a certain degree, at least, but I know what I don't want: shady, inaccessible methods paying off.

    icaaq,

    Yeah, maybe. As long as they're ignored, that's fine with me.

    Johan,

    Accessibility has nothing to do with SEO or SEM, it is a usability feature.

    Well, yes and no. Good semantics are the key to good accessibility, and good semantics also help search engines to properly and correctly weigh the content of your web site.

    Stefan,

    Don't worry, I'm just happy if I can write a post that inspires someone to express their feelings about it! 🙂

    Just as I said to icaaq: as long as anchor links are ignored, that's fine with me. But I can never see why usage of them would in any way be frowned upon by search engines.

  • Johan says:

    Well, yes and no. Good semantics are the key to good accessibility, and good semantics also help search engines to properly and correctly weigh the content of your web site.

    Let me rephrase my remark: accessibility contributes to making websites user-friendly and more search-engine-friendly. But for the end user! It's a user-centric approach which has nothing to do with the website author (i.e. companies). Companies need to understand that accessibility features are user-centric and should be constructed as a layer that does not interfere with the level of usability. The website owner should understand that readable and concise premium content embedded in the document structure should come first.

    Why not add rel="nofollow" to the anchors?

  • Johan says:

    The website owner should understand that readable and concise premium content embedded in the document structure should be first.

    By readable content I mean both machine-readability for search engines, which means a semantic document structure (weighted content for SEO purposes), and easy-to-understand language (concise content for SEM, and a usability feature, i.e. human-readability). But in the end, both are intertwined…

  • Robert Nyman says:

    Johan,

    Content is king.

    Why not add rel=”nofollow” to the anchors?

    But is it necessary? If, as people say, search engines already dismiss anchor links, there's no need for it, right? Besides, nofollow is just something made up for search engines and doesn't, in its true meaning, convey an actual relation.

    So, it has a value in certain contexts (like trying to avoid spam in your blog comments…), but I don't think it should be overused.
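
    For reference, Johan's suggestion would look something like this (hypothetical markup; note that icaaq's objection below is that "nofollow" isn't among the link types HTML defines for rel):

      <!-- Hypothetical: an in-page anchor link marked with rel="nofollow" -->
      <a href="help/#contact" rel="nofollow">Contact us</a>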

  • Tanny O'Haley says:

    Maybe we should not just think about SEO, but also about how to get the customer to stay. Does anyone know of a study that has been done on SEO vs. stickiness? It's all nice and good to have a fantastic SEO rating, but if the customer doesn't stick around to view the site, then what good is that rating?

    Once a potential customer gets to your site, how long do they stay, how many pages did they view?

  • Robert Nyman says:

    Tanny,

    Yes, that's exactly what I'm going for here.

  • icaaq says:

    Why not add rel="nofollow" to the anchors?

    Why should we use a nonvalid value for the rel attribute? 😉

    List of valid link types

    Cheers.

  • Tinus says:

    If you offer beautiful, well-structured, high-quality content, and you follow Google's own guidelines (very important), then there's no need for SEO. You shouldn't be the one spending time optimizing your site for Google; Google should be optimizing ITSELF to let people find your site. The best SEO tip is: PATIENCE!

    If your site is rubbish, Google won't bother. And that's a good thing! Don't spend your time optimizing your site with SEO tricks, but spend your time writing quality content that attracts visitors in a natural, 'organic' way, Google will follow and you WILL get good rankings.

  • Robert Nyman says:

    Tinus,

    I couldn't agree more. Well put!

  • Martin says:

    Haha, I think I know which SEO company; I have heard the exact same thing. I gave them a link to the W3C and told them that anchor links are defined in the standard and that all real search engines can handle and understand them… I also gave them a long list of corrections they should make in their analysis.

  • Robert Nyman says:

    Martin,

    Probably not the same SEO company, since this is a UK-specific one, but there's plenty of them like that out there. 🙂

  • Harvey says:

    My entire site is made up of hash links (the site my comment link points to).

    The purpose of this is to make the entire site out of AJAX requests, while still retaining bookmarkable URLs, back button functionality, and search engine spiderability.

    While using the hash for this is a bit of a bastardization of its initial purpose, this is really just a proof of concept – AJAX can be implemented so it maintains user experience and degrades well if you put the effort in. It's a whole pile better than Microsoft's attempt at AJAX + usability on their recent revamp.

    Anyway, does it affect my SEO?

    A little bit. When someone links to http://www.ragepank.com/#contact/, Google sees this as a homepage link, not a link to the contact page. If a JavaScript-enabled browser goes to that address, the visitor gets to see the contact page.

    From my point of view, I'm getting more homepage links and less deep links than I normally would. This isn't great, but I can live with it.

    I have never seen any search engine index the hashed version of my URLs in the 6 months I have been testing this system. I am also yet to see a hashed URL be given a pagerank, so I'm reasonably convinced that Google ignores the hash.
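
    A minimal sketch of the technique Harvey describes, with graceful degradation (the helper, the content element and the sections/… URL scheme are all made up, not Harvey's actual setup):

      <!-- With JavaScript off, these are ordinary in-page links,
           so the site still degrades well -->
      <a href="#contact">Contact</a>
      <a href="#articles">Articles</a>

      <div id="content">Default content…</div>

      <script>
      // Made-up helper: fetch a section over AJAX and inject it
      function loadSection(name) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'sections/' + name + '.html');
        xhr.onload = function () {
          document.getElementById('content').innerHTML = xhr.responseText;
        };
        xhr.send();
      }

      // React to hash changes from link clicks, the back button
      // and bookmarked URLs alike
      window.onhashchange = function () {
        loadSection(window.location.hash.replace('#', '') || 'home');
      };
      </script>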

  • Robert Nyman says:

    Harvey,

    Interesting, thanks for sharing!

  • Alan Carr says:

    I'm still a little confused.

    If I put anchor links (the hash kind, not external links from elsewhere) at the top of my page, linking to content further down, does that damage the rank or not?

    For example, my site does bodybuilding software. If I put the phrase "bodybuilding software" in a link to the software section of that page, would engines ignore that text completely, or still realise the site is about the topic of such software?

    Alan

  • Robert Nyman says:

    Alan,

    As far as I know, anchor links should never hurt your SEO. Please consult with Google, though, and see what they say.
