
20+ SEO Myths You Need to Let Go

Content is what gets traffic to your website, and no matter the business model you subscribe to, traffic means revenue, whether through ad views, sales, or other modes of monetization. And while there are many ways people can find their way to your website, ranking high on SERPs (search engine results pages) is certainly one of the most common.

There is no way we can overstate the importance of SEO. Without implementing at least the basics of SEO, you might as well not be online at all. And SEO isn’t always easy: at this point, the massive array of techniques and practices that is SEO is cloaked in its own lingo. Unfortunately, webmasters and even SEO experts often operate on false premises, and a surprising number of SEO myths persist. To do better (or even well), you need to avoid these pitfalls. You need to, in the immortal words of Elsa of Arendelle, let them go.

Here’s our pick of SEO myths:

Quantity Is Better Than Quality

We will open with one that is partly true: you should keep your text posts relevant and informative. Google (and possibly other search engines) penalizes very short blog posts, as it may see them as a black hat (disallowed) optimization practice: a ploy to make your website appear to have a large amount of content when in fact there are only short posts stuffed with keywords. Longer posts are preferred, but quality – content optimized for humans, rather than machines – is key.

This myth has to do with the recent shift toward optimizing for user intent, rather than for a search algorithm. People tend to see long posts as more in-depth, and therefore more useful, but long posts will rank better only if they are more useful to search engine users than shorter ones. A large number of short posts, especially if they contain links, may be seen by a search engine as a link scheme.

Duplicate Content Gets Penalized

Again, this is not exactly untrue, but content duplication is not penalized as such. Think of online stores, for instance: how many ways of describing a t-shirt can you think of? If duplicate content were penalized, you wouldn’t be able to find an online store on the internet, and we know that isn’t the case.

What is true is that duplicate content can be penalized, but only if it is perceived as SEO spam: if you are loading your website with copies or near-copies of keyword-rich articles, for instance. Duplicate content can also have an adverse effect on your SERP rankings short of an outright penalty – it can dilute your impact.

Canonicalization Always Works

Related to duplicate content, you may think you have fixed the issue by canonicalization – assigning the rel="canonical" attribute to a page to avoid diluting your results. It may happen that Google disagrees with your assessment of what should be the canonical page (which duplicate should rank better). This can happen because the non-canonical page is linked to from your website navigation, because it is included in your sitemap, or because of external links pointing to it. In the case of Google, you can fix this by explicitly marking the duplicate page as a duplicate.
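
For illustration, a canonical declaration is a single link element in the duplicate page’s head, pointing at the version you want to rank (example.com and the path are placeholders):

    <head>
      <!-- tells search engines which URL is the preferred version of this content -->
      <link rel="canonical" href="https://example.com/preferred-page/" />
    </head>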

Keywords Are Irrelevant

The truth is, unsurprisingly, quite the opposite. Recently, Google has been focusing on adapting its algorithm to reward content which caters to search intent, rather than content loaded with particular keywords. You still have to do your keyword research: what has changed is that keyword-loading is far less likely to improve your SERP ranking, at least where Google is concerned. Instead, you now have to consider the searcher’s real intent and needs.

With the focus on user intent and content optimized for humans rather than machines, Google, being the market leader, continues to penalize keyword stuffing, so you’d do well to avoid it. Just what constitutes keyword stuffing was and remains subject to change and to each search engine operator’s policy, so you should keep abreast of it.

There is an additional upside to keyword focus: Google and other search engines now employ algorithms which allow for more leeway in terms of word order and common misspellings of keywords.

There Is a Fixed Maximum of Keywords

If you know search engines will penalize overoptimization, you will likely be tempted into thinking that there is a non-penalizable maximum number of keywords per article (or some other metric). This is not the case. So long as your keywords occur naturally as part of your textual content, the number of keywords is irrelevant – you can have as many instances of a keyword as you like. To avoid your content appearing keyword-loaded, try employing related words and synonyms.

Anchor Text Is Irrelevant

Similarly to the above, this is simply not true. For one, keyword-loading your anchor text may be penalized. However, you can and should still use keywords as part of your anchor text, just so long as the anchor text appears naturally in the main body of text. No specific kind of anchor content is penalized by default. As a search engine typically uses anchor content to inform its indexing process, using keywords in your anchor text should be your practice in any case – so long as you don’t overoptimize.

As a rule of thumb, you should be safe with brands or naked URLs, especially used in conjunction with phrases such as “according to X”, or “read more on X.com”. When it comes to keyword-rich anchor text, you should make sure that it doesn’t stick out unnaturally from the main body of your content. You can even resort to what’s known as “noise anchors” (generic text such as “Click here”), but don’t overdo it. Diversifying between the three is generally considered best practice.
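
To illustrate the three, the snippets below use example.com and made-up wording purely as placeholders:

    <!-- branded / naked URL anchor -->
    <p>According to <a href="https://example.com">Example.com</a>, ...</p>
    <!-- keyword-rich anchor, written as a natural part of the sentence -->
    <p>Here is our guide to <a href="https://example.com/running-shoes/">choosing running shoes</a>.</p>
    <!-- generic "noise" anchor, used sparingly -->
    <p><a href="https://example.com/guide/">Click here</a> to read the full guide.</p>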

Meta Tag Keywords Are Irrelevant

Once again, this is plainly a myth. While it is true that meta tag keywords do not rank by themselves, they still inform search engines on what your content is about. This means that keywords in meta tags (title and meta description) will make it more likely for your content to appear as a related result in a search. What’s more, a good meta description may make your content more attractive to search engine users, so long as you do your homework and make them engaging, relevant, and to the point.
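
As a sketch (the store and wording below are made up), keyword-aware title and description tags live in the page head:

    <head>
      <title>Handmade Leather Wallets | Example Store</title>
      <meta name="description" content="Browse handmade leather wallets crafted in small batches, with free worldwide shipping." />
    </head>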

Headings Affect Rankings

Headings have several important functions, both from an aesthetic and from an accessibility point of view. In terms of aesthetics, they break up long pieces of text into smaller, more manageable and less daunting chunks. They are also a great tool for creating scannable content. In terms of accessibility, heading structures assist screen readers. Headings do not, however, directly affect your search rankings. They do affect user experience, and can make your content and your website more popular through increased traffic and low bounce rates.
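
For instance, a scannable heading structure might look like the sketch below (the topic is made up, and the indentation is only there to show the hierarchy):

    <h1>A Beginner's Guide to Espresso</h1>
      <h2>Choosing a Grinder</h2>
      <h2>Dialing In a Shot</h2>
        <h3>Grind Size</h3>
        <h3>Brew Ratio</h3>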

Image Optimization Is Irrelevant

Each image hosted on your website has a URL which is commonly related to the file name and used by a search engine to understand what the image represents. Optimizing your images for search engines is, therefore, a must, and that includes clean, descriptive image file names. You should also optimize your image alt text, not just for the sake of overall website accessibility, but also for the benefit of your SEO. For best results, stick to image formats supported by Google. Fortunately for you, these include the most common ones: .jpeg, .gif, .svg, .png, .bmp, and .webp.
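
For example, a clean file name plus descriptive alt text might look like this (both are placeholders):

    <!-- a descriptive file name and alt text, rather than IMG_0423.jpg with an empty alt -->
    <img src="/images/brown-leather-wallet-front.jpg" alt="Front view of a brown leather bifold wallet" />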

Nofollow Links Are Irrelevant

Again, the exact opposite is true. We won’t get into nofollow links in detail here, but the gist of it is that each link lends credibility – the more links to your website, especially from high-quality websites, the better your SEO ranking. However, you may be linking to websites you don’t fully trust – perhaps you are commenting on something you are linking to, or a user dropped a link into your comments section. You want to avoid being perceived by Google as linking to unreliable content. To do that, use nofollow links on your website.
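
Marking a link as nofollow takes a single rel attribute on the anchor; assuming a user-submitted link in your comments section (the URL is a placeholder), it might look like this:

    <!-- asks search engines not to pass credibility on to the linked page -->
    <a href="https://example.com/unvetted-page/" rel="nofollow">a link left by a commenter</a>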

Bounce Rates Affect SERP Rankings

Bounce rate is the ratio of visits to a website that involve no further interaction with the website to overall website visits. A high bounce rate is usually seen as an indication of low quality, but that may not be the case. It could just as well mean that the landing page contained all the information a user needed. So, a high bounce rate might not be a ranking factor for Google, but that still doesn’t mean you shouldn’t improve your content to better fit your users’ needs if you want them to interact with your website further.
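
To put a number on it: if 60 out of 200 visits end on the landing page with no further interaction, the bounce rate is 60 / 200 = 30%.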

Guest Blogging Should Be Avoided

There is literally no difference between guest content and your own content. The myth comes from several Google updates which were meant to stem black hat SEO practices devised to create artificial brand authority through overlinking. You can still benefit from guest posting, but make sure that any guest posts provide quality, relevant content and avoid link schemes. There is a grain of truth to this myth, too: you should avoid calling guest posts “guest posts” – after a long period of abuse, Google really doesn’t like them.

Social Network Engagement Is Irrelevant

This one is actually true in a sense: liking or commenting on your content on social media does not directly influence SEO. However, search engines use data from social media websites when determining the authority of your brand. Besides, sharing on social media makes your content more likely to attract traffic, and more traffic (especially if it comes from people who actually want to access your content, keeping your bounce rate low) means more credibility, which means better ranking.

Domain Names Are Instrumental

Domain names might or might not matter, depending on the circumstances, but they are hardly ever instrumental. One notable exception is exact match domain names – domain names with the exact keywords you are trying to rank for – which can get penalized. Another issue you need to consider is top-level domains. Most users recognize .com or .net as the “default” website extensions. They are meant to be used for websites with global audiences, and usually dominate SERPs. However, in geographically targeted searches, you might do better with a country code top-level domain.

Domain Age Is Important

There is a strong correlation between older websites and good SERP rankings. The reason for it is simple: older websites have had the time to build authority, accumulate backlinks, and earn a reputation as a trusted source. The domain age itself is not a factor. Quality and consistency are.

Well-Known Brands Rank Better

This is actually true: Google’s focus on quality and authority (as understood by Google) means that brand authority carries a great deal of weight. Some of it may also be down to bigger brands having more resources to invest in SEO. Smaller players can still outrank big brands, though, especially if they operate in a well-chosen niche, or if they have a strong local presence.

You Need To Make Google Crawl Your Website

A robots.txt file is a text file which governs the behavior of bots – the pieces of software search engines use to crawl pages and build their index. It stands to reason, then, that you should use it to tell Google which pages of your website to crawl, right? Wrong. There is a lot to be said on how to use a robots.txt file, but bots will crawl your website anyway. A better use for it is asking bots to stay away from parts of your website you’d rather keep out of the crawl.
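
As a minimal sketch, assuming a /private/ directory you’d rather keep out of the crawl, a robots.txt could read:

    # applies to all bots
    User-agent: *
    # asks bots not to crawl this directory
    Disallow: /private/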

Sitemaps Affect Rankings

While adding an XML sitemap to your website has its merits, sitemaps do not actually affect rankings directly. What they do is improve the visibility of your pages to search engines, making them more easily accessible to bots. Quicker indexing makes an XML sitemap especially useful to websites that are just starting out, as they tend not to have large numbers of backlinks.

There is something else to be said about XML sitemaps: while you don’t need to make Google (or other search engines) crawl your website, you can use an XML sitemap plugin to generate a sitemap and submit it to Google and other search engines automatically. This means your pages can be discovered and indexed without waiting for crawlers to find them on their own, which can take some time (especially for new websites).
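
For reference, an XML sitemap is just a list of your URLs; a minimal one (with example.com as a placeholder) looks something like this, and sitemap plugins generate and update it for you:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
      </url>
      <url>
        <loc>https://example.com/blog/first-post/</loc>
      </url>
    </urlset>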

Paid Search Results Improve Rankings

With paid ads, what you get is a way to reach more search engine users, but a paid result doesn’t matter at all in terms of improving your organic rankings. The two are completely separate. A good use for paid search results is in keyword research, though: you can easily pay for results on multiple sets of keywords, see which set performs best, and inform your SEO strategy that way.

Google Suppresses New Websites

This is a tricky one: many claim that Google takes some time before ranking new websites at all. This phenomenon is known as the Google Sandbox: nominally, you can rank, but in reality, it doesn’t happen. The part that makes some sense is that Google takes time to figure out a new website’s usefulness, and that may be the source of the myth.

You Need “Near Me” To Rank Locally

This is a persistent myth which simply fails to take into account the fact that Google knows where you are. Google assumes a local context for searches with relevance to a physical location. How likely is it, after all, that a person searching for, say, tacos, is willing to travel very far for a snack? If a user does use the words “near me” in a search, that doesn’t mean that the website needs to be optimized for the keyword.

You Need Frequent Updates

Search engines do tend to rank new content better, but that is only true for some queries. Which ones? That is a little harder to define. Usually, this concerns trending topics, but the search engine’s algorithm has the final say. For our purposes, frequent updates alone do not make for better ranking, but, if you are running any kind of a news website (or similar), you will be posting on currently trending topics by default. Content updates may have an indirect effect on the number of visits, as people generally want to know the latest, but that’s not always the case. After all, if this myth were true, would there even be such a thing as evergreen content? We don’t think so. Still, even evergreen content ought to be kept fresh with updates in order to maintain relevance.

In Conclusion

As you can see, myths abound, and some of them are persistent and widespread even among IT professionals and SEO specialists. Hardly surprising, as some stem from information which may have been true at some point, or otherwise have a grain of truth to them. We hope we have gone some way towards debunking them. The main takeaway? Everything is relevant, and just about anything is optimizable. Do your best to provide as good a user experience as possible – it’ll likely pay off one way or another.
