Technical SEO Tips to Improve Search Ranking
The key to attracting more traffic to your site is to maintain a good ranking in search engines through Search Engine Optimization (SEO). SEO is a broad topic that touches many aspects of your website; however, this article will focus on technical SEO — the optimization found not in your actual content, but in the development of your site. So that your business has the best possible ranking in search results, our digital marketing team has put together a list of technical SEO areas that are fundamentally important to the performance of your site.
Robots.txt file — The robots.txt file is a plain-text file placed at the root of your site that tells search engine crawlers which parts of your site they may crawl. By disallowing particular pages, you save bandwidth and reduce the load on your servers. This improved performance has a direct impact on your search ranking — you are penalized by sluggish speeds and other performance issues. Note that disallowing a page does not guarantee it stays out of search results: a page blocked in robots.txt can still be indexed if other sites link to it.
A note on privacy and robots.txt: Although all major search engines will follow the instructions written in your robots.txt files, they are not an effective way to protect sensitive information from being accessed — to truly protect private or sensitive information, do not store it unsecured on a public web server.
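As a minimal sketch, a robots.txt file might look like the following (the paths and sitemap URL are illustrative, not a recommendation for any particular site):

```text
# Apply these rules to all crawlers
User-agent: *

# Keep the staging area and internal search results out of crawls
Disallow: /staging/
Disallow: /search/

# Point crawlers at the sitemap (replace with your own URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group can target a specific crawler by name, but `*` covers them all and is the sensible default for most sites.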
Sitemaps — A sitemap is an XML document you create and submit to search engines that describes the pages of your site and how they are organized. If a search engine crawls your site and finds differences between the sitemap and what it actually discovered, it can report those differences to you, helping you find and fix errors that may be preventing pages from being indexed properly.
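A basic sitemap, following the sitemaps.org protocol, might look like this (the URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to know about -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/contact-us/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Most content management systems and several free online tools can generate this file for you, so it rarely needs to be written by hand.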
Interlinking — Improve your site’s crawlability by finding natural ways to link between the pages of your site, ensuring all content is connected within a good site structure. For example, a link to your site’s FAQ page within the content of your Contact Us page benefits both your customers and your SEO.
Page Load Time — Monitor how fast your site is loading, not only for user experience, but for search engine ranking as well. If your site loads slowly, its search ranking may be penalized. Free tools such as Google PageSpeed Insights can test your site’s speed. If you find that your site is loading slowly, work with your web developer to take action.
Optimize your code — One factor in page load time is the amount of code on each page. Keeping the size of each page’s HTML, CSS and JavaScript as small as possible reduces the amount of information that needs to be transmitted, improving this speed. Additionally, each individual file referenced in your site’s code becomes another request that has to be made to the web server, so you want to keep the number of files involved as low as possible. There are tools you can use to shrink your site’s code: minification removes whitespace, comments and other characters that browsers don’t need, while compression tools such as gzip shrink files further before they are sent over the network. Code optimization is a large subject: further topics include inlining vs. external CSS, image and font optimization, asynchronously loading JavaScript, reducing the number of server calls, and minification of code.
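In practice, gzip compression is usually enabled on the web server itself (for example, in an Apache or nginx configuration) rather than run by hand. Still, as a rough sketch of the benefit, you can measure the size difference on the command line — the CSS content below is filler generated just for this demonstration:

```shell
# Create a padded CSS file, then compare its size before and after gzip.
for i in $(seq 1 200); do
  echo 'body { margin: 0; }  /* a comment that adds weight to the file */'
done > styles.css

gzip -kf styles.css   # -k keeps the original file, -f overwrites any old .gz
wc -c styles.css styles.css.gz
```

Because CSS, HTML and JavaScript are highly repetitive text, gzip typically reduces them to a small fraction of their original size.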
SEO Friendly URLs — Short URLs make it easier for search engine spiders to crawl your site. Stick to short, keyword-focused, readable URLs to improve your site ranking.
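To illustrate the difference (both addresses are made up for this example):

```text
Hard to read:  https://www.example.com/index.php?id=427&cat=9&sess=8f3a
Improved:      https://www.example.com/services/technical-seo/
```

The second form tells both visitors and search engines what the page is about before it even loads.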
Search Engine Optimization is not just about keyword research and page content. There are many technical factors unrelated to content that can help or harm your site’s ranking in search results. Luckily, if your site is suffering because of one of the above technical SEO topics, there are solutions. If you need help, give us a call — our expert marketing team would be happy to assist you.