In terms of making your website more search engine friendly, it is important to get back to the fundamentals of search engine optimization. Ever since updates like Hummingbird pushed content to the fore, a lot of people have not been giving enough importance to the other aspects of sound SEO.
The fact of the matter is that you can have the best content in the world, but it won’t mean diddly-squat if your website hasn’t been optimized; without optimization, the rankings simply won’t come. So, let’s take some time out to recap the basics.
Spider friendly
When search engine spiders are not welcome on your website, they will struggle to access its various pages, making it difficult to index them. The foundation of good SEO is making sure that the spiders can crawl wherever they need to go with ease. To give the spiders the access they need, you need to make proper use of the "robots.txt" file.
Of course, not all of your website's pages need to be spider friendly. There are some pages you won’t want indexed, such as log-in pages or directories containing private data. The way to block access to these pages is to make them off-limits by disallowing them as follows:
- User-agent: *
- Disallow: /cgi-bin/
- Disallow: /folder
- Disallow: /private.html
Another way of blocking access to certain pages is as follows:
- User-agent: BadBot
- Disallow: /
To use the example shown above, simply replace “BadBot” with the name of the bot you wish to block.
You need to be a little careful when stopping spiders from accessing your entire website. My recommendation is: don’t do it unless you are aware of a specific bot that causes trouble. That way you won’t accidentally deny spiders access to the pages you do want them to index.
With regard to WordPress, there are a variety of plug-ins that can help you disallow access. If you don’t use WordPress, it’s quite simple to set up a "robots.txt" file on your server yourself.
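Once your "robots.txt" is in place, it’s worth confirming it actually does what you think. Here’s a minimal sketch using Python’s built-in urllib.robotparser; example.com and the test paths are placeholders to swap for your own:

```python
from urllib.robotparser import RobotFileParser

# example.com is a placeholder - point this at your own robots.txt.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Ask whether a generic, well-behaved crawler may fetch each path.
for path in ["/", "/cgi-bin/test.cgi", "/folder/page.html", "/private.html"]:
    allowed = rp.can_fetch("*", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```

If a page you want indexed comes back as "blocked," your disallow rules are broader than you intended.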
The importance of a good site-map
Once you have designed your "robots.txt," the next thing to do is ensure that the Google spiders can crawl your website. The first step is to draw up a site map, either manually or by using a dedicated tool. If your site is built in WordPress, you will find there are a variety of plug-ins available that can generate site maps for you.
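If you’d rather roll your own, here’s a minimal sketch of how a basic XML site map can be generated with Python’s standard library; the page list and the sitemap.xml filename are placeholders:

```python
import datetime
import xml.etree.ElementTree as ET

# Hypothetical page list - swap in your own URLs.
pages = [
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/blog/seo-basics/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
today = datetime.date.today().isoformat()

for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = today  # optional, but helps crawlers

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```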
Having completed your site map, the next thing to do is log in to Google Search Console. Once there, look in the navigation panel on the left-hand side of the page and click “Crawl” followed by “Sitemaps.” Next, click “Add/Test Sitemap” in the top right-hand corner. Having done that, you can then test your site map before submitting it to Google for indexing. You just need to be aware that it will take Google a while to crawl and index your website.
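Before submitting, a quick local sanity check can catch obvious mistakes. A minimal sketch, assuming your site map lives at sitemap.xml as in the example above:

```python
import xml.etree.ElementTree as ET

# Count and preview the URLs in the site map before submitting it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")
locs = [el.text for el in tree.findall(".//sm:loc", ns)]

print(len(locs), "URLs found")
for loc in locs[:5]:  # show the first few as a spot check
    print(" ", loc)
```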
If a site map has already been submitted and you now simply want to test or submit a specific web page from your website, utilize the “Fetch as Google” function. You’ll also find this under “Crawl” in the navigation panel on the left-hand side. In summary:
- Having logged in, click “Crawl”
- Go for the “Fetch as Google” option
- Enter the URL of the webpage and click “Fetch” (if you’re testing your homepage leave the URL blank)
- You should now see a green check saying “Complete”
- Select “Request indexing” if this option is there
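One extra check worth doing before you request indexing: make sure the page itself isn’t carrying a “noindex” directive. A rough sketch using Python’s standard library; the URL is a placeholder, and this is a crude string check rather than a full HTML parse:

```python
from urllib.request import Request, urlopen

# Placeholder URL - swap in the page you plan to submit.
url = "https://example.com/blog/seo-basics/"
req = Request(url, headers={"User-Agent": "Mozilla/5.0"})

with urlopen(req) as resp:
    # An X-Robots-Tag response header can block indexing just like a meta tag.
    header = (resp.headers.get("X-Robots-Tag") or "").lower()
    head = resp.read().decode("utf-8", errors="replace").lower().split("</head>")[0]

if "noindex" in header or "noindex" in head:
    print("Possible noindex directive - investigate before submitting.")
else:
    print("No obvious noindex directive found.")
```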
Completing this brief exercise is important in terms of getting your website indexed. Remember, if it isn’t indexed, it simply won’t get found.
The structure of your website
With so many people now browsing on their mobiles, it is easy to overlook what is both simple and practical. The fact of the matter is that while the mobile-first fraternity is trending, it is still important to think about search engines. A robust website structure enhances the user experience and enables the website to rank more effectively.
Although having a robust structure to your website may seem simple, it does in actual fact need time and careful planning. It’s not only that it affects navigation and internal links; it also enables the spiders to evaluate your content and its make-up more easily.
Having a robust website structure means assembling your content in a logical manner. You don’t want your visitors or search engine spiders to have a hard time locating what they came to find on your website.
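To picture it, a shallow, topic-grouped hierarchy might look something like this (the domain and section names are just placeholders):
- example.com/
- example.com/services/
- example.com/services/consulting/
- example.com/blog/
- example.com/blog/seo-basics/
Every page sits within a couple of clicks of the home page, and each URL tells both visitors and spiders exactly where they are.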
Utilizing titles and meta descriptions to best effect
In terms of SEO fundamentals, titles and meta descriptions are important foundations. Although it is true that titles feed into rankings whereas descriptions do not, both are nonetheless vital. Okay, Google doesn’t use descriptions for ranking, but that is no excuse to ignore them.
When it comes to surfers and search engine results pages, titles and descriptions are usually the first things they notice. That being the case, here are a few top tips for creating better ones.
Top tips for designing better titles
- Focus your title tag on the core topic of the web page
- Don’t go in for keyword stuffing
- Keep the length to between 50 and 60 characters (the script after the description tips below lets you check this)
- Make sure the title is relevant to readers
- Avoid duplications
Top tips for creating descriptions
- Descriptions are better when they focus on action
- Ensure that your primary keyword is included
- Make sure that the content is easy to read and understand
- Limit the number of characters to between 135 and 160
- Avoid duplications
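If you want to check an existing page against these limits, here is a minimal sketch using nothing but Python’s standard library; the URL is a placeholder, and the 50–60 and 135–160 ranges are the ones recommended above:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaCheck(HTMLParser):
    """Collects the <title> text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Placeholder URL - point this at your own page.
with urlopen("https://example.com/") as resp:
    parser = MetaCheck()
    parser.feed(resp.read().decode("utf-8", errors="replace"))

for label, text, low, high in [
    ("Title", parser.title.strip(), 50, 60),
    ("Description", parser.description.strip(), 135, 160),
]:
    n = len(text)
    verdict = "OK" if low <= n <= high else f"aim for {low}-{high} characters"
    print(f"{label}: {n} characters ({verdict})")
```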
The better your titles and descriptions, the better your click-through rate and your website’s visibility will be. Just bear in mind that should Google determine your meta data doesn’t meet user requirements, they will change it.
Remember, before following the latest SEO fashion, get the basics right first. You’ll be surprised what a few simple adjustments can mean for your website and its overall Internet presence. Follow the tips outlined above and you will be building a firmer foundation for ongoing success.