5 Technical SEO Tactics You Need to Know in 2016


Your website might be filled with high-quality search engine optimized content, but your rankings could still suffer if you’re not also ensuring everything is good on the back end.

Stop me if you’ve heard this one before:  “Content is king.” Chances are you’re familiar with this oft-overused phrase if you’ve been anywhere near the field of digital marketing.

Of course, there is an element of truth behind this principle. Google’s core algorithm continues to champion user experience — websites designed for people versus search bots — when determining rank. The quality of one’s content has since become a guiding principle within search engine optimization, a change for the better.

But what good is quality content if search engines can’t find it?

Popular opinion regarding what constitutes an effective website has changed considerably, but one thing remains the same: technical SEO is essential to having your website and its content discovered by human audiences. So, here are five technical SEO tactics you should incorporate into your 2016 SEO strategy or ensure your SEO company implements for you.

1. Make Your Website Mobile-Ready

Now more than ever, mobile is crucial for online visibility. Since 2014, mobile users have outnumbered desktop users, according to comScore.

Ensuring your website is mobile-ready requires more than just using responsive design. One way to learn more about your mobile readiness is with Google’s PageSpeed Insights. This free tool breaks down exactly which areas of your website need improvement to provide a better user experience for your customers across all devices.


Some issues, like enabling compression and leveraging browser caching, are simple enough to implement yourself; they require only minor additions to your .htaccess file if your server runs Apache.
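A minimal sketch of both fixes, assuming an Apache server with the mod_deflate and mod_expires modules enabled (the file types and cache lifetimes below are examples to adjust for your site):

    # Enable gzip compression for text-based assets (requires mod_deflate)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
    </IfModule>

    # Leverage browser caching by setting far-future expiry headers (requires mod_expires)
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 year"
      ExpiresByType image/png "access plus 1 year"
      ExpiresByType text/css "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
    </IfModule>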


Why Is Improving Your Score Important?

Users expect Web pages to load as fast on their mobile devices as on desktops, if not faster. Apart from providing the best user experience possible, page speed is a ranking factor and should not be ignored. Your mobile site could face penalization for loading too slowly, and visitors are quick to bounce away if a page takes longer than a couple of seconds to load.

2. Switch to HTTPS to Improve Security

Even if your site doesn’t handle sensitive data that demands encryption, using HTTPS can provide your website with some ranking benefits; Google confirmed HTTPS as a lightweight ranking signal back in 2014. Look inside Google Analytics and filter by browser: How much of your traffic comes from Chrome? If it’s the majority, you should consider switching to HTTPS, since Google’s Chrome browser started flagging non-HTTPS pages in the address bar in 2015.


What are the advantages of switching to HTTPS?

  • Identity Verification
  • Data Integrity
  • Security and Privacy
  • Trust

If you use WordPress, follow these simple steps for a smooth migration from HTTP to HTTPS:

  1. Download, install, and activate the Easy HTTPS Redirection plug-in
  2. Go to Settings -> HTTPS Redirection
  3. Verify that all URLs are resolving to HTTPS. Check internal links to ensure they’re showing as HTTPS.
  4. Add the HTTPS version of your site in Google Search Console and verify it

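If your site doesn’t run on WordPress, the same site-wide redirect can be handled at the server level. A minimal sketch for Apache, assuming the mod_rewrite module is enabled:

    # 301-redirect every HTTP request to its HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]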

3. Avoid Duplicate Content With Canonical Tags and 301 Redirects

Search engine obstacles are often created unintentionally, but they can hurt your rankings all the same. Make sure you have canonical tags and 301 redirects in place to prevent the creation of duplicate content.

URL Parameters

Some content management systems (CMS) can create multiple versions of the same page, especially if you’re running an e-commerce website. When a user selects a parameter (price, size, etc.), the CMS creates a new dynamic URL. Google sees this as a separate URL and indexes it. To prevent duplicate content, place a canonical tag on the root or category page.
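For example (a sketch using the placeholder domain example.com), a filtered product listing can point back to its category page with a canonical tag in the <head> of the parameterized page:

    <!-- On https://www.example.com/shoes/?size=10&sort=price -->
    <link rel="canonical" href="https://www.example.com/shoes/" />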


Www vs non-www

This is probably one of the most common duplicate content issues that exists today. Having canonicals and redirects in place is critical because Google considers the www version and the non-www version of a page to be entirely different pages. To prevent this, place a canonical tag on every page and a 301 redirect going from the www version to the non-www version, or vice versa, depending on which is your primary URL.

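For example, a sketch using the placeholder domain example.com, with the non-www version as the primary URL. Each page declares the non-www URL as canonical:

    <link rel="canonical" href="https://example.com/your-page/" />

And an Apache .htaccess rule (assuming mod_rewrite) 301-redirects every www request to the non-www host:

    # 301-redirect www to non-www
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]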

4. Configure Your Robots.txt File for Search Engine Bots

A robots.txt file is a simple text file used to specify which Web pages search engine bots should not crawl. Typically, a well-structured robots.txt file lists folder paths you don’t want Googlebot to waste resources on and that shouldn’t appear in the search engine results pages (SERPs), such as private folders and less-noteworthy content.

The simplest version of a robots.txt file contains just two elements: User-agent and Disallow. The line “User-agent: *” indicates that the rules that follow apply to all bots. Leaving “Disallow:” empty means that nothing is restricted, which allows all robots to see and crawl everything on the website.

Common examples of robots.txt files:

To allow all bots to access the whole site:

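    User-agent: *
    Disallow: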

To block some parts of the site and avoid duplicate content, disallow the relevant folder paths (the paths below are placeholders):

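    User-agent: *
    # Hypothetical paths -- substitute the folders you want kept out of the index
    Disallow: /private/
    Disallow: /print/
    Disallow: /cgi-bin/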

Don’t forget you can also specify the Sitemap location in your robots.txt file, using the sitemap’s absolute URL (example.com below is a placeholder):

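    User-agent: *
    Disallow:

    # Placeholder URL -- point this at your site's actual sitemap
    Sitemap: https://www.example.com/sitemap.xml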

5. Create XML and HTML Sitemaps and Submit to Search Engines

There are two types of sitemaps: HTML and XML. HTML sitemaps are designed for humans and are typically linked in the footer of your website.

XML sitemaps, by contrast, help search engines understand the full scope of content contained within your site. Without a complete, submitted XML sitemap, search engines may miss content and fail to index all of your Web pages, which can result in a lack of organic rankings and organic traffic.
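A minimal sitemap.xml looks like this (the domain, date, and priority values are placeholders to adapt):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-01-15</lastmod>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <priority>0.8</priority>
      </url>
    </urlset>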

Always create a sitemap.xml file and submit it to Google using Google Search Console. You can also specify which pages take priority over others, helping Googlebot understand the intended structure and importance of your website. To submit your sitemap:

  • Log in to your Google Search Console account
  • Click Add/Test Sitemap


  • Submit the sitemap and allow Google to crawl it. Check back in roughly 24 hours to see the number of URLs that have been indexed.


In Summary

Content is only useful for your site if it can be found by a human audience. To ensure that both your website and its content can be found on search engines:

  1. Make sure your website is mobile-friendly.
  2. Switch to HTTPS even if you’re not an e-commerce business.
  3. Make sure 301 redirects and canonical tags are in place to avoid duplicate content.
  4. Confirm your robots.txt file is in place and only blocking search engine bots from crawling pages and files you don’t want indexed.
  5. Create an HTML sitemap to improve user experience and create an XML sitemap to submit to search engines.

Though there are other technical SEO factors that may cross your radar in the future, our list of five technical SEO tactics should lay the foundation of any SEO strategy you put in place in 2016.

