The Complete Idiot’s Guide To SEO: Everything You Need To Know
Welcome to The Complete Idiot’s Guide To SEO.
Let’s start by answering your first question: yes, we were originally going to call this guide SEO for Dummies.
Wouldn’t you know it though, that name’s already been taken by Peter Kent. If you don’t feel like reading 13 pages of documentation, you can just take his Udemy course instead. No hard feelings if you do. We know British accents are hard to resist.
Still here? Good. Let’s begin.
Introduction
Search Engine Optimization, also known as SEO, is the process of optimizing websites to improve their performance on popular search engines such as Google. SEO is a subfield of the wider set of career skills known as digital marketing.
SEO deals with the “organic traffic” that comes from “organic search results”. Organic means any traffic that arrives naturally, as a result of a user typing a query into a search engine and clicking one of the unpaid listings that appear, rather than clicking on a paid ad.
To clarify: when a user types in a given keyword, say “best 4K TVs”, the first several results they see are typically paid advertisements marked as Ads. Organic results are anything they see below the ads. Organic traffic refers to any inbound traffic to a website resulting from users who click on those organic results.
The early history of SEO and how search engines work
Website owners first began the practice of intentionally making websites rank higher on search engines in the mid-1990s, but the term SEO itself wasn’t coined until 1997.
A website owner, or webmaster, submits their website’s URL to a search engine. The search engine then sends an automated program called a web crawler or a bot to visit the website and learn about what’s on it, or crawl it.
The bot extracts information about the links leading into and out of the website, then sends it back to the search engine, where a program called an indexer stores it on a server.
The bot then moves from link to link and from website to website, cataloging the information it finds as it goes, and puts it together to make what’s essentially a giant map of everything on the internet.
The indexer then uses the words on each page to understand what a website is about. It looks at what words are on a webpage, where they appear on the webpage, and how frequently and prominently they appear on each page.
The words that determine this are known as keywords.
These keywords are matched against a user’s search query. When a user types a query into a search engine, the websites that correspond to those keywords are listed in order of relevance.
This, in its simplest form, is how a search engine works.
That’s how things started, but it didn’t take long before people with more brains than scruples started figuring out how to use this process to turn the Internet into a giant money-printing machine.
We’ll get more into that later.
Google, PageRank, and the importance of links
Google is far and away the most popular search engine in the world, with 82% of market share in the search engine industry, followed by its competitor Bing at 6% and then by Yahoo, Baidu, Yandex and DuckDuckGo.
It’s a well-known and popular fact that Google are all-seeing, all-knowing, omniscient, benevolent cybergods who make the best digital products in the world and are better than Jesus if he were taller and a snappier dresser.
If you disagree with the above statement, then congratulations. The NSA are already on their way to your house.
Larry Page and Sergey Brin, the co-founders of Google, created a search engine which made use of a system, or algorithm, to rank web pages in order of importance given a user’s search query.
The algorithm evaluated web pages based on the number and strength of the hyperlinks, or links for short, pointing to a given webpage. It’s known as PageRank.
A link is a word, phrase, or image that leads from one webpage to another webpage, or from one website to another. In HTML it takes the format <a href="{{address of the webpage}}">YOUR ANCHOR TEXT HERE</a>.
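For instance, a link with visible anchor text pointing at a (made-up) page would look like this:

<a href="https://www.example.com/menu">View our menu</a>

“View our menu” is the anchor text: the clickable words a visitor actually sees on the page.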
In the PageRank algorithm, a link acts as an endorsement, or vote, for one website by another website. PageRank looks at the links pointing to a website from external domains, or external links, to assess a website’s relative importance.
It looks at whether external links come from websites it deems authoritative and trustworthy, and discounts any links coming from websites it deems spammy or otherwise illegitimate.
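For the mathematically curious, Brin and Page’s original paper sketches the score roughly like this, where T1 through Tn are the pages linking to page A, C(T) is the number of links going out of page T, and d is a damping factor usually set around 0.85:

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

In plain English: a page inherits importance from the pages linking to it, and each of those votes gets diluted by how many other pages the voter links out to.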
Here’s an example. Let’s say there are two businesses in your hometown of Beach City: Fish Stew Pizza and Beach Citywalk Fries. They each have their own website.
Fish Stew Pizza might have a web page listing the local businesses it partners with, and on that webpage there’s a link leading to Beach Citywalk Fries’ website.
This is essentially like the owner of Fish Stew Pizza saying “oh yeah, Beach Citywalk Fries is great. Their fry bits are legit, we’re both members of the Beach City Chamber of Commerce, and I send the owner Mr. Fryman a Christmas card every year. Just so long as he doesn’t start making pizza and cutting into my business, I endorse him, his website and his business. You should tots check him out.”
Get it? Tots, not totes? Nevermind.
Takeaway: Links are really important. Getting more of them and making sure the links come from authoritative websites in your niche is the single best way you can get your website to perform better on Google.
You stand no chance of ranking for anything without them.
On and off-page SEO
Google uses over 200 ranking factors, and it updates its core algorithm somewhere between 500 and 600 times per year.
Google keeps most of the ranking factors its algorithm uses a well-guarded secret.
They do choose to tell us certain things via the Google Webmaster Guidelines, the Google Search Liaison account on Twitter, and various marketing, tech and web development conferences. Apart from that, most of what we know about how search results get sorted and prioritized is the result of guesswork informed by decades of rigorous testing and experimentation.
Broadly speaking, SEO can be placed in two different categories: on-page and off-page SEO.
On-page SEO refers to any ranking factors a website owner has direct control over. These factors include keyword optimization (what keywords it has and how frequently they appear), technical SEO (indexability, canonicalization, URL structure, metadata, image optimization, website speed etc.), and site structure (the way a website is organized).
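To make that concrete, here’s a minimal sketch of what keyword-aware metadata might look like in a page’s <head> (the page, keyword and URL are made up for illustration):

<head>
  <!-- The title tag and meta description are what usually show up as your search listing -->
  <title>Best 4K TVs: The Complete Buyer's Guide</title>
  <meta name="description" content="Our hands-on guide to the best 4K TVs, tested and ranked.">
  <!-- The canonical tag tells search engines which URL is the authoritative version of this page -->
  <link rel="canonical" href="https://www.example.com/best-4k-tvs">
</head>

Everything in that block is under your direct control, which is exactly what makes it on-page SEO.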
Off-page SEO, by contrast, refers to any ranking factor for a website that the owner does not have direct control over. These include the number and authoritativeness of a website’s backlinks, and user metrics such as click-through rate and bounce rate.
Anchor text is a factor here as well. When building links, relevant anchor text works best, but it’s important not to over-optimize. Generic anchor text is a safer option, and you can still signal relevance by including your keywords in the same sentence. This tactic is known as co-citation, and it’s based on a Google patent.
As a side note, if you want to get an understanding of what Google is doing, patents are extremely useful.
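To illustrate, a co-citation-friendly link might look like this (the URL and copy are made up):

<!-- Generic anchor text, with the target keyword sitting right beside it in the same sentence -->
<p>Looking for the best 4K TVs? <a href="https://www.example.com/tv-guide">Click here</a> for our full buyer's guide.</p>

The anchor text itself stays generic, so it doesn’t trip Google’s over-optimization alarms, while the surrounding sentence carries the relevance.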
Keyword optimization (keyword density)
In the early days of SEO, the game was all about keywords: anticipating what keywords your ideal user (that is to say, your customer) types into Google, and using those keywords effectively in your website’s copy.
Perplexingly, some people in the SEO world still insist on the concept of keyword density as a relevant concept for optimizing websites. These people are wrong and when you find them, the correct response is to point your finger at them, contort your face in anger, and loudly denounce them as the filthy, horrible liars they are.
The truth is that although on-page keyword optimization is still important, it’s a far better use of your time to just interlace the keywords in your copy in a way that flows naturally, reads logically and makes sense to the reader, as if you were going to include them anyway.
In other words, write your website’s copy for humans, not search engines. A good rule of thumb is to use your focus keyword about three times for every 500 words or so. But like we said, don’t get too hung up on it.
The point is that many SEOs still make the mistake of designing and optimizing websites primarily to make the Google Gods happy. Your best bet is to focus instead on making a better experience for your ideal user.
Takeaway: This is always the first and foremost rule of SEO as well as all forms of digital marketing: Always. Remember. The User.
Indexability
When a website or web page is indexed, it means that it’s discoverable by search engines and accessible by users via that search engine. Making a website indexable, therefore, means improving the ease with which search engine web crawlers can catalog it.
There are also a number of reasons why you might want a search engine to avoid indexing certain pages and file types on your website to prevent them from popping up in search results.
This could be anything from locking away eCommerce-centered webpages like carts and checkout pages to provide a more logical user journey, to your corporation’s attempt to disguise the dirty money it’s hoarding in sketchy offshore bank accounts in the Seychelles to avoid taxation.
At any rate, making your website indexable means giving Google two things: your website’s XML sitemap and a robots.txt file.
An XML sitemap is essentially a map of your website. It tells Google and other search engines where everything on your website is located, and what their URLs are. This is what makes your website more easily accessible via Google.
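Here’s a bare-bones sketch of what one looks like (the URLs are made up for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>

Each <url> entry points search engines at one page; real sitemaps can also carry optional fields like <lastmod> for when a page last changed.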
Creating and submitting an XML sitemap is a quick and simple process that takes 5 minutes with Google Search Console and a tool called Screaming Frog. You can learn how to do it yourself in this video.
A robots.txt file, meanwhile, is a text document found in the root directory of your website. It tells Google what web pages and file types on your website are off-limits for crawling and which ones are A-Ok. If your XML sitemap is like a roadmap, then a robots.txt file is more like a blueprint.
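A minimal robots.txt might look something like this (the blocked folders are made up for illustration):

User-agent: *
Disallow: /cart/
Disallow: /checkout/
Sitemap: https://www.example.com/sitemap.xml

This tells every crawler (that’s the User-agent: * part) to stay out of the cart and checkout folders, and points them at your sitemap while they’re at it.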
It’s important to note that Google treats the parameters outlined in your robots.txt file as a guideline, not a hard and fast rule. It could choose to just ignore it.
The most sure-fire way to prevent a webpage from being indexed is by including the meta tag <meta name="robots" content="noindex"> in the page’s source code.
Submitting your XML sitemap and robots.txt file to Google is essentially like giving it directions to your new house.
“Hey Google, welcome to our new house! Here’s how to get to our house. Here’s what you’ll find once you’re there. Oh, but please make sure you don’t go here, here, or here. That’s where we’re hiding the bitcoin mining servers, dead bodies and exotic animal-skin coats.”
SEO as a marketing strategy
SEO is the art and science of improving websites so they perform better on search engines, which would seem to mark it firmly as a technical skill.
However, SEO is also a field that requires studying the way people use digital media and anticipating the way they use websites and search engines, then using that information to run a website as if it were an online business.
For this reason, SEO-related tasks are typically relegated to the marketing department, whatever those smartass web developers over in engineering might say to the contrary.
Unlike many internet marketing tactics such as PPC Search Ads or Facebook advertising where success is measured in terms of days or weeks, SEO success is measured in terms of months or even years. SEO is playing the long game.
Takeaway: Marketing is never free. It will always cost you either time or money, and you need to pick one. PPC search and Facebook advertising cost money. SEO takes time.
Google algorithm updates, and White Hat vs. Black Hat SEO
It’s also a well-known and popular fact that marketers are one of the most universally disliked groups of people on the planet Earth.
This may be explained by the fact that so many of them have scammy side hustles, big egos, stupid haircuts, douchey personalities, and the combined moral fiber of two starving wolverines trapped in a burlap sack.
It follows therefore that many of them have tried all kinds of sneaky and nefarious ways to outsmart Google and shoot to the top of their search results the easy way. These are known as Black Hat SEO tactics.
Doing SEO this way does not work, not in the long run. People have proven this. Yes, we’re sure, so don’t bother trying.
Why? Mostly because Google usually responds to this skullduggery by outsmarting them right back.
They accomplish this by punishing websites they suspect of using Black Hat SEO tactics and giving preferential treatment to website owners that use ethical website maintenance and business practices. The latter are known as White Hat SEO tactics.
When the field of SEO first rose to prominence, it was all about the keywords your website had. Website owners and marketers would often try to stuff as many keywords in their website’s copy and metadata as possible in order to get the most organic search traffic.
This is a practice known as keyword stuffing. Today, keyword stuffing is generally frowned upon in the SEO world. It’s regarded as providing a poor user experience at best and putting you at risk of Google penalizing your search results at worst.
Takeaway: Again, optimize and write for people, not search engines.
When keyword stuffing stopped working, the next thing they tried was duplicating or outright plagiarizing content from high-ranking websites to boost their own search positions.
In response, Google released the Panda update to its core algorithm in 2011. Panda searches for something called duplicate content: any copy (read: words) on a website that also appears in one or more places elsewhere on the Internet.
Takeaway: Original, unique content is the best content. Content is King.
Then, they tried to game the system by artificially manipulating the links pointing to their websites so they could inflate their own search results. These are known as “link schemes”, which usually refers to the practice of buying or exchanging links for money.
So, in response, Google came out with Penguin in 2012. Penguin looks for websites that it determines are participating in link schemes and penalizes them. It judges this by assessing the quality of the sites a group of backlinks is coming from.
Takeaway: Paying for links doesn’t work. You will most likely get garbage and you will risk losing your search results. It’s just not worth the potential risks. Don’t do it.
Then, Google tried something particularly interesting. In 2013 they released the Hummingbird update.
What Hummingbird does is evaluate search queries contextually rather than just semantically. In other words, it tries to figure out not just what users are searching for, but why they’re searching for it.
Takeaway: Google’s first and foremost objective is to provide the end-user with the best and most relevant search results for a given query. You should form your SEO strategy based on that thinking.
We said it before. We’ll say it again: Remember. What. Is Best. For the user.
Other prominent Google algorithm developments include RankBrain (released in 2015), which looks at user metrics such as click-through rate and bounce rate to determine the typical user experience of a website, and E-A-T, a concept from Google’s Search Quality Rater Guidelines that rose to prominence around 2018 and 2019, which examines whether a website demonstrates expertise, authoritativeness and trustworthiness based on a number of factors.
International SEO
This is where things get a little more complicated. International SEO is a very dense topic and it often leaves even experienced SEO professionals scratching their heads in frustrated confusion.
We’ll try to keep it to the salient points and just tell you what you need to know.
Let’s say that your business is expanding into the Canadian market and you want to make your website accessible to both English and French-speaking Canadian users.
You basically have four options.
Your first option is Country Code Top Level Domains (ccTLDs).
The URL structure for your new website will look like this:
URL structure:
www.prerender.ca/en
www.prerender.ca/fr
This involves registering a new domain with a new country code, in this case .ca, along with any corresponding language codes, en and fr.
This is the most effective signal to Google that you’re making a website for Canadian users. It’s equivalent to a full embrace of the Canadian market.
However, it’s also by far the most difficult and demanding international SEO solution to maintain as well as the most resource-intensive.
Doing it this way essentially means setting a precedent: a whole new website domain for each market you expand into, each with its own SEO strategy (going through the process of ranking for keywords and establishing a backlink profile from scratch), not to mention a separate team of web developers to maintain each one.
This is just not an option for most businesses. They simply don’t have the resources to do it sustainably unless they’re globally recognized brands like Nike or McDonald’s.
Then there are subdomains.
URL structure: ca.prerender.com.
This is the next strongest way to establish geo-targeting for a website and is easier to maintain. Going the subdomain route means you get to use Google Search Console to geo-target each subdomain on your website by region and language.
This is a perfectly viable option. You may do things this way if you so choose.
That being said, SEO best practice usually advises that you consolidate your subdomains and only work with as many as strictly necessary. It just gets messy otherwise.
Next, we have the subdirectory method: adding a separate site path folder (subdirectory) for each locality and language.
URL structure:
www.prerender.io/ca-en
www.prerender.io/ca-fr
This is essentially keeping each language and region in its own folder. It also allows for individual geo-targeting with Google Search Console.
This is the method that is generally seen as having the best mix of scalability and ease of implementation. As such, it’s usually the recommended method of choice for international SEO initiatives. Do things this way, if you can.
Lastly, there are generic top level domains (gTLDs) used in tandem with hreflang markup.
URL structure:
www.prerender.com?lang=fr-ca
www.prerender.com?lang=en-ca
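With this setup, hreflang markup in each page’s <head> tells Google which language and region each URL targets. A sketch, using the parameter URLs above:

<link rel="alternate" hreflang="fr-ca" href="https://www.prerender.com?lang=fr-ca">
<link rel="alternate" hreflang="en-ca" href="https://www.prerender.com?lang=en-ca">
<!-- x-default is the fallback for users who match none of the listed languages or regions -->
<link rel="alternate" hreflang="x-default" href="https://www.prerender.com">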
Doing your international SEO strategy like this is messy, gross and overly complicated. Google doesn’t like it. People don’t like it. Don’t do it.
Resources
And that about covers the basics! Here’s where you can study up if you want to learn more:
- Moz – Beginner’s Guide to SEO
- Ahrefs – SEO Basics: Beginner’s Guide to SEO Success
- Google – SEO Starter Guide
- CanIRank – SEO for Startups: 5 Ways to Use Brains Over Budget
- Blogging Wizard – What Is SEO? A Beginner’s Guide To Search Engine Optimization
And while you’re at it, here are some YouTube channels to subscribe to:
First up is Moz’s channel, fronted for years by Rand Fishkin, the co-founder and former CEO of Moz, an enterprise-grade SEO platform trusted and respected throughout the industry.
Please ignore for a moment that his corny hipsterstache makes you want to smack him upside the head every time you look at his dumb, stupid face, and focus on the actual content in his videos. Fishkin is in fact a leading expert on all things SEO-related and he knows what he’s talking about.
His Whiteboard Friday videos are some of the best resources on offer when it comes to SEO. He breaks down complicated SEO topics in ways that anyone can understand.
Ahrefs is simply the best content marketing, keyword research and backlink analysis tool there is. Their YouTube videos on SEO are led by Sam Oh, a digital marketing and SEO consultant at Ahrefs.
He covers incredibly useful strategies on everything from how to do a content audit to resource page link building, as well as corresponding email templates and overviews of basic SEO concepts. We usually start our work mornings by watching one of his videos over a slice of grapefruit.
Mr. Oh has the additional credentials of being a squishy, doe-eyed, huggable, teddy bear of a man whose rosy cheeks make a sound like a squeaky toy whenever you poke them.