5 SEO Mistakes Even The Pros Make



You could be forgiven for thinking that SEO has taken a back seat to comprehensive digital marketing. Why put all that time and effort into keyword research, link building, writing tags, etc., when you could just create a social media presence, buy ads and write content? Maybe SEO doesn’t get as much attention as it used to, but it’s just as vital as ever to search engine rankings – and in some ways, even more so.

Using on- and off-page optimization can get your site to where you want it to be, but as always, proper SEO is a lot easier to talk about than it is to actually do. This is a fast-moving industry, and Google algorithm updates keep coming thick and fast. It’s hard to keep on top of each update as it comes down the pike and keep your site optimized to match.

It’s no surprise that SEO professionals are always reading up on the latest developments to try to stay current. Yet even with all that effort to stay informed, they can still make some very basic mistakes. Keep reading for 5 obvious but very dangerous SEO mistakes that can torpedo an entire campaign.

#1 Block Your Site from Indexing in .htaccess

Any SEO professional knows about .htaccess. This Apache configuration file contains directives that allow or block access to a site’s document directories, and it should go without saying that it’s important. If you know how to manage it, you can create cleaner URLs, build better site maps and improve your site’s load time through more efficient caching.

.htaccess is critical to streamlining your site’s indexing, which can lead to better search engine rankings. It takes some expertise to set up an .htaccess file properly, and a single mistake can cause serious problems for your site. For example, consider a snippet like this:

# These rules return 403 Forbidden to Google and Bing crawlers for anything under /dir/
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Google.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bing.*
RewriteRule ^/dir/.*$ - [F]

This can prevent search engine bots from indexing your site. If you see this code in your file, delete it or ask your web developer to do it for you. Every time you start a new SEO project, check the .htaccess file. There are SEO experts who work on a site for months, but neglect this and end up wasting countless hours of work.
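One quick sanity check, using example.com as a placeholder for your own domain and /dir/ for the path in the snippet above, is to request a page while identifying as a search crawler:

# Send a HEAD request while identifying as Googlebot; a 403 Forbidden response
# here suggests the rewrite rules above are blocking search engine crawlers.
curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" https://example.com/dir/index.html

A normal 200 response to this request, and to the same request without the -A flag, is what you want to see.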

#2 Discourage Search Engines from Indexing in CMS

CMS platforms like WordPress and Drupal have built-in settings that can hamper your SEO efforts, because they let users ask search engine bots not to crawl your site. In WordPress, the option sits under Settings > Reading > “Discourage search engines from indexing this site”. Just one click can make your site invisible to the search bots!

Review this setting at least weekly, since anyone with access to the CMS could tick it by accident. Keep in mind that some search engines may continue to index your site even when it is ticked, so if you have a genuine reason to keep your site out of the index, use the .htaccess or robots.txt files to do this instead.
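In WordPress, for example, ticking that box typically causes a robots meta tag along these lines to be added to every page’s head section (the exact markup varies by version); this is what tells well-behaved crawlers to stay away:

<!-- Emitted site-wide while "Discourage search engines" is ticked -->
<meta name='robots' content='noindex, nofollow' />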

#3 Leave Your robots.txt File Entirely Open for Crawling

This one is really a privacy problem. Never leave your robots.txt file entirely open for crawling, since that invites search engines to index pages that should stay private and can make things far worse if there’s ever a data breach. Take some time to learn how to properly set up and manage your robots.txt file, and check it regularly. If you see something like this in your file:

User-Agent: *
Allow: /

You need to take action immediately, because those two lines tell search bots they can index every page on the site, including login, admin, cart and dynamic pages (like filters and search results). Customer pages should be closed off and protected, and your dynamic pages should be excluded from indexing as well, since these look spammy to search engines. The fix is simple, but it takes time to learn how to set it up properly in the real world.
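As a rough illustration only, and not a file to copy verbatim, a WordPress store might start from something like this, where the paths and domain are placeholders to swap for your own:

User-Agent: *
Disallow: /wp-admin/      # admin area
Disallow: /wp-login.php   # login page
Disallow: /cart/          # customer cart pages
Disallow: /?s=            # internal search results
Sitemap: https://example.com/sitemap.xml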

#4 Forget to Add the “nofollow” Attribute to Outbound Links

Links are important to your search engine ranking; the mistake people make is to focus on backlinks at the expense of the outbound links from their own sites. Keep working on building quality backlinks, but do the following:

Scan your site using a tool like Xenu

Sort your links to identify outbound links and create an Excel or HTML report

Check every link and add “nofollow” where needed. Don’t overdo it, though: nofollowing every outbound link could backfire by provoking other SEO professionals to do the same to your backlinks. This is a powerful attribute, so don’t abuse it.
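For reference, the attribute is just part of the anchor tag itself; the URL below is a placeholder:

<!-- An outbound link that asks search engines not to pass link equity to the target -->
<a href="https://example.com/some-resource/" rel="nofollow">Example resource</a>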

#5 Fail to Check Your Code in a Validator

The cleaner your site’s code, the more easily the search engines can crawl and rank it, and that works in your favor. When you take on a new project, check the code before, during and after optimization. You don’t have to be a developer to do this: just copy and paste your URL into the W3C Markup Validation Service and ask a developer to fix any problems it finds. A typical validator report simply lists each error and warning alongside the line of markup where it appears.

Google doesn’t necessarily penalize you for a little invalid CSS or HTML, but your site is still better off for the cleanup. Valid markup improves the user experience and makes your pages easier for search engine crawlers to parse.
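If you’d rather script the check than paste URLs in by hand, the W3C checker can also be queried directly. This is only a sketch: it assumes the current Nu checker endpoint, uses example.com and the client name as placeholders, and W3C asks automated clients to identify themselves and keep request volumes low.

# Validate a live page and return the checker's findings as JSON
curl -s -A "my-validation-script" "https://validator.w3.org/nu/?doc=https%3A%2F%2Fexample.com%2F&out=json"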


