5 Common SEO Mistakes

SEO can be very straightforward, and it can also be extremely complex. Too often, we are so focused on our rankings, metrics, and content that we overlook simple mistakes. Below are 5 of the most common SEO mistakes in 2016:

1) Broken Links

Broken links are links on your website that don’t work properly. There are different types of broken links, the most common being a 404 error: a link that leads to a page on your website that no longer exists. Another example is a mistyped URL, meaning the link doesn’t go to the right location. Broken links are frustrating for users, and as a result, Google penalizes websites that have a lot of them.

No website is immune to broken links. Here is a 404 error on the White House website.

As time passes and a site grows, it is inevitable that some links will break. But if your site has dozens or even hundreds of pages and links, how are you supposed to stay on top of all of them? The best way is to regularly perform web crawls of your website. Software like Screaming Frog (LINK) uses a robot, or “spider,” to go through your website page by page and uncover errors such as broken links. Performing monthly or even weekly crawls of your site is an effective way to catch broken links early and improve or maintain your rankings.
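At its core, what a crawler like Screaming Frog does is collect every link on a page, request each one, and flag anything that errors out. Here is a minimal sketch of those two steps using only Python’s standard library; the sample page and URLs are invented for illustration:

```python
# Sketch of a link check: (1) collect every <a href> on a page,
# (2) classify response status codes as broken or not.
# The sample HTML below is made up for illustration.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def is_broken(status_code):
    """Treat 4xx (e.g. 404) and 5xx responses as broken links."""
    return status_code >= 400

page = '<a href="/about">About</a> <a href="/old-page">Old</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)                  # ['/about', '/old-page']
print(is_broken(404), is_broken(200))   # True False
```

A real crawl would also fetch each collected URL (for example with `urllib.request`) and feed the newly discovered pages back into the queue, which is exactly the loop dedicated crawling software automates for you.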

2) Duplicate Content

Duplicate content is exactly what its name implies: content that is repeated on more than one page of your site. Search engines want to give the user the best, most relevant content. When that content is repeated on multiple pages, Google doesn’t know which page to pick and send users to. As a result, websites with duplicate content can be penalized in their search rankings.

But what if your website needs duplicate content? For example, many e-commerce websites list the same products across different web pages. There is a solution: establish a 301 redirect hierarchy, pointing each duplicate URL to the version you want search engines to index. That way, search engines know your preferred page, they aren’t confused, and you receive no penalties.
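On an Apache server, for instance, a 301 redirect can be declared in your site’s .htaccess file. The paths below are hypothetical placeholders, not real URLs:

```apacheconf
# Hypothetical example: permanently (301) redirect a duplicate
# product URL to the preferred version of the page.
Redirect 301 /products/blue-widget-copy /products/blue-widget
```

The `301` status tells search engines the move is permanent, so they consolidate ranking signals onto the preferred URL instead of splitting them across duplicates.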

3) Non-Unique Meta Description/Title Tags

Meta descriptions are the short blurbs on search engine results pages that summarize a web page. The easiest way to explain them is with an example. When I search “news” in Google, CNN is the top result, and the snippet beneath it reads: “View the latest news and breaking news today for U.S., world, weather, entertainment, politics and health at CNN.com.” That snippet is CNN’s meta description.

The highlighted region is the meta description.

Title tags are even shorter and, arguably, more important. They are a quick preview of what the page is about.

They appear as the blue text on search engine results pages, and they are also shown at the top of tabs in web browsers. For CNN, the title tag is the blue text “CNN – Breaking News, Latest News and Videos.” When I click through to the CNN website, that same title appears at the top of my browser tab.

The CNN Title Tag.

A great way to ensure you don’t have duplicate meta descriptions or title tags is to use the Screaming Frog web crawler. Not only can Screaming Frog find broken links, as mentioned above, but it can also find duplicate title tags and meta descriptions. And best of all, the software is free!
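The duplicate check itself is simple to picture: read the `<title>` of each page and group URLs that share one. A minimal sketch, using invented pages for illustration:

```python
# Sketch of a duplicate-title check: parse the <title> of each page
# and report any title shared by more than one URL.
# The pages dict below is invented for illustration.
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Extract the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

pages = {
    "/":     "<title>Acme Widgets</title>",
    "/shop": "<title>Acme Widgets</title>",   # duplicate title
    "/blog": "<title>Acme Blog</title>",
}

titles = defaultdict(list)
for url, html in pages.items():
    parser = TitleParser()
    parser.feed(html)
    titles[parser.title].append(url)

duplicates = {t: urls for t, urls in titles.items() if len(urls) > 1}
print(duplicates)  # {'Acme Widgets': ['/', '/shop']}
```

The same grouping idea applies to meta descriptions; a crawler just extracts `<meta name="description">` content instead of the title text.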

4) Not Allowing Search Engines to Crawl Your Site

How do search engines find and list your website? They use robots called spiders to sift through, or “crawl,” your website’s content. After they crawl your content, they get a better idea of what you are writing about, how to categorize your site, and ultimately how to rank it.

However, you as a website owner have to make sure Google and Bing are allowed to crawl your website. If your site blocks them, it won’t ever be listed, as the search engines won’t even be aware of what it contains. Google and Bing both offer “Webmaster Tools” for site owners to manage how their websites are doing on the search engines, whether they can be crawled, and more. This link is a great beginner’s guide on how to use Google and Bing Webmaster Tools.
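The quickest thing to check is your site’s robots.txt file, which is where crawler permissions live. A sketch, using the placeholder domain example.com:

```
# robots.txt, served at https://example.com/robots.txt
# This allows all crawlers to reach everything:
User-agent: *
Disallow:

# By contrast, this single extra character would block
# your entire site from being crawled:
# User-agent: *
# Disallow: /
```

An empty `Disallow:` line permits everything, while `Disallow: /` shuts crawlers out completely, so a one-character slip here can be the difference between being indexed and being invisible.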

5) Having a Slow Loading Site

People are impatient. We don’t like waiting, and we will actively avoid places that make us wait. This is as true for your coffee as it is for your website. Waiting for a web page to load is extremely frustrating, especially for people accessing your site on mobile devices. When it happens across multiple pages on a site, the user may just give up and leave. Not only does this cost you traffic, but search engines take note and will penalize you accordingly.

Fortunately, there are a couple of ways to improve page speed on your site. The first is to upgrade your hosting. On shared hosting, the servers your site runs on are under considerably more strain and have less bandwidth to dedicate to your site, which results in longer load times. Although dedicated hosting may be more expensive, it can be a worthwhile investment: your site loads faster, and faster pages can improve your search rankings.

Another way to improve site speed is to use a plugin like Smush to compress images, which reduces your bandwidth usage and puts less strain on your servers. A third option is to enable browser caching. When a user visits your website, content such as text and images is temporarily stored on their hard drive. If they visit the site again in the near future, their browser can use this stored copy to load the site quickly.
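On an Apache server, browser caching is typically enabled by sending expiry headers. A minimal sketch for an .htaccess file, assuming the mod_expires module is available; the one-month window is an illustrative choice, not a required value:

```apacheconf
# Hypothetical .htaccess snippet: ask returning visitors' browsers
# to reuse cached copies of images for a month (needs mod_expires).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
</IfModule>
```

Longer expiry windows mean fewer repeat downloads, but they also mean returning visitors may see stale files, so assets that change often deserve shorter lifetimes.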