Modern and ethical SEOs, like me, love to shout about the importance of content from the rooftops: "Content is king!", we all blog from behind smudged computer screens. And creating quality and valuable content should be an important part of your SEO strategy. But how important are technical SEO factors? 🤔
Let's put it this way: if you create a web page with mediocre content, you probably won't rank great, right? But if your website has technical issues, that might mean you won't rank at all.
A technically sound website environment should be a top priority for a successful SEO strategy. If your site isn't crawlable or indexable, none of your other SEO efforts matter. Typical websites are managed by a CMS hosted on a server, and both your server and CMS can affect your website's technical SEO performance.
Let's look at 3 critical technical factors you should familiarize yourself with.
1. URL Structures
The structure of your website's URLs matters to search engines and to the humans searching the web. A URL is human-readable text used in place of an IP address, the series of numbers computers use to communicate with servers.
It is one of the first signals to a search engine about your web page. One way you can help your webpage rank better is to create SEO-friendly, pretty URLs. What makes a "pretty" or "ugly" URL? Let's look at an example of a pretty URL:
```
<!-- Example of a pretty URL structure and a pretty good beer 😉 -->
https://www.newbelgium.com/beer/voodoo-ranger/
```
This is a pretty URL from New Belgium Brewing (and my beer choice while writing this blog). While you may not know where that page will lead, a person reading it can clearly get an idea. So can search engines.
Now, in contrast, let's look at an ugly URL:
```
<!-- Example of an ugly URL structure -->
https://cdn.07edfg.domainname.com/934rf/s?uqTqcfxYashaD64S67Cm
```
Now what kind of page would you land on if you clicked on that (it's a fake URL by the way)? Hard to know because, besides the domain name, it's a mess of random letters and numbers. People and search engines don't like ugly URLs.
Creating pretty, SEO-friendly URLs
When creating URLs for your website, you want to do so taking your webpage keywords into account. We'll talk about finding keywords another day, but each URL on your site should target the primary keyword of the webpage it represents.
```
<!-- Example of a pretty SEO-friendly syntax -->
https://example.com/category-keyword/sub-category-keyword/primary-web-keyword.html

<!-- E-commerce example for mens running shoes -->
https://shop.example.com/mens/running-shoes/nike-air-zoom-pegasus-36-trail
```
In general, the shorter the URL, and the fewer folders, the better. The folders on your website should represent the website's architecture.
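To make the keyword-in-URL idea concrete, here's a minimal sketch of how a page title might be turned into an SEO-friendly URL slug. The `slugify` helper is hypothetical, not from any particular library; most CMSs do this for you automatically:

```python
import re

def slugify(title):
    """Convert a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace every run of non-alphanumeric characters with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Trim any leading/trailing hyphens left over from punctuation
    return slug.strip("-")

print(slugify("2020 Guide to Technical SEO!"))  # 2020-guide-to-technical-seo
```

The result is exactly the kind of short, readable, keyword-bearing path segment that both people and search engines prefer.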
2. Broken Pages and Redirects
Have you ever been navigating the internet, clicked on a link, and ended up on a 404 page? Maybe it didn't say 404 Error. It might have said something like, "Oh, this is embarrassing. This page doesn't seem to exist any longer!" It might even have had a graphic to ease your confusion.
Broken pages are created when content is removed or deleted but is still being linked to from live pages. This creates an issue for search engine crawlers and people alike. A request for a broken page triggers a 404 error from the server. It's the server's way of telling Google, "Ummm, they say something is here, but nothing is here." That's a poor signal.
People all over the internet can find and link to your content. When you delete or move that content, it is important to redirect links pointing to non-existent resources. Technical SEOs can implement temporary or permanent redirects to handle this.
Temporary and Permanent Redirects
Let's imagine a scenario: you have a "2019 Guide to..." post on your website, but you wrote a new "2020 Guide to..." post on the same topic. You now want to delete the 2019 post but want that web traffic to be directed to the new page. How?
You must implement a redirect. There are 2 main redirects: 301 Permanent Redirects and 302 Temporary Redirects. Feel free to look up the others, but these are the most common you will run into. Most of the time when you are moving content, it is a permanent move.
There might be other webpages linking to your "2019 Guide to..." post, and you don't want to lose those backlinks or that web traffic. Instead, you implement a redirect to, well, redirect web users and search engines to the new page. Depending on your website and/or CMS, you'll want to look up how to implement a redirect or talk to your development team.
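At the HTTP level, a permanent redirect is just a 301 status code plus a Location header pointing at the new URL. Here's a sketch using Python's standard library; the paths are hypothetical stand-ins for the 2019 and 2020 guide URLs, and in practice you'd configure this in your web server or CMS rather than writing a server by hand:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    # Hypothetical mapping of retired URLs to their replacements
    REDIRECTS = {"/2019-guide": "/2020-guide"}

    def do_GET(self):
        if self.path in self.REDIRECTS:
            # 301 tells browsers and search engines the move is permanent,
            # so backlinks and traffic flow through to the new URL
            self.send_response(301)
            self.send_header("Location", self.REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Current guide content")
```

The only difference for a temporary move would be sending a 302 instead of a 301; everything else stays the same.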
3. XML Sitemaps and robots.txt files
XML sitemaps and robots.txt files allow you to communicate with search engines about how to crawl your site. With your robots.txt file, you can dictate which web crawlers are allowed to access your site and which are forbidden. Both of these files are generally placed in the root folder of your website.
Please note, for the sake of brevity, my explanations and examples of each type of file are very basic; please research robots.txt files and XML sitemaps further to learn all your options for optimizing your website.
Example of basic robots.txt file
Let's look at a basic robots.txt file that allows all search engines to crawl every area of a website. Note that your robots.txt file can and should declare your sitemap as well.
```
# Allows all crawlers to crawl all areas of the site
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
```
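You can sanity-check how a crawler would interpret rules like these with Python's built-in `urllib.robotparser`. This sketch parses the example rules above (the `site_maps()` method requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# Parse the example robots.txt rules shown above
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /",
    "Sitemap: http://www.example.com/sitemap.xml",
])

# With "User-agent: *" and "Allow: /", every crawler may fetch every page
print(rp.can_fetch("Googlebot", "http://www.example.com/any-page"))  # True

# The declared sitemap is exposed as well
print(rp.site_maps())  # ['http://www.example.com/sitemap.xml']
```

This is handy for verifying you haven't accidentally blocked a crawler before you push a robots.txt change live.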
Example of basic sitemap.xml file
Your sitemap should be submitted in Google Search Console so that it can be validated and your site can be properly indexed by Google. Let's look at a basic example of a sitemap that shows 2 URLs:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Basic example of a sitemap.xml file -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2020-04-22</lastmod>
  </url>
</urlset>
```
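Sitemaps like this are usually generated automatically by your CMS or an SEO plugin, but as a sketch, here's how the same two-URL sitemap could be built with Python's standard `xml.etree` module (the URLs and dates are the example values above; `xml_declaration` requires Python 3.8+):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://www.example.com/", "2020-01-01"),
    ("https://www.example.com/about", "2020-04-22"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc          # the page's canonical URL
    ET.SubElement(url, "lastmod").text = lastmod  # last modification date

# Serialize with the XML declaration search engines expect
sitemap_xml = ET.tostring(urlset, encoding="unicode", xml_declaration=True)
print(sitemap_xml)
```

Generating the file from a list of pages like this keeps it in sync with your actual content, which is the whole point of submitting it to Google Search Console.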