Put simply, SEO – or search engine optimization – is the process of getting a website to rank higher on the list of websites that search engines, like Google, return to you when you type search terms into their search boxes. Ideally, you want your site to rank on the first page of results, as users are far more likely to visit these sites than they are sites on pages two and beyond.
Improving your website ranking position is no easy task. The rank of your site depends on a set of complex algorithms determined by search engine providers. The goal of SEO is to manipulate these algorithms to your benefit, helping your site rank more highly in search results but without falling foul of any of the terms of service that providers, like Google and Microsoft, insist you obey.
Given that page space on search results is finite, there’s a lot of competition to get onto the first page of results. Big companies with even bigger budgets spend substantial marketing dollars every year ensuring that they retain their position at the top of search results for specific keywords.
SEO, in general, is focused on achieving what people in the industry call “organic” traffic for websites. Unlike paid advertising, in which companies pay to rank more highly in search results, SEO focuses on generating traffic for sites without paying a search engine directly. Of course, doing good SEO often requires paying a third party marketing firm that specializes in the process, but the idea of SEO is to ultimately make website traffic self-sustaining without the need to continually pay out advertising money every month. Think about when you last typed a search query into Google’s search bar for a product: you may have clicked on a paid link, but more likely, you just clicked the first result that came up. By clicking on an unpaid link, you became organic traffic for that website.
Google and other search engines want to provide excellent service to their users because they want to attract advertising dollars. Without a large, active user base, advertising on their platforms becomes less appealing. Search engines, therefore, have a vested interest in making sure that the results they serve are relevant to users. If users go elsewhere, the search engines lose advertising revenue.
But how do they try to stop users from switching to other platforms?
One of the most basic ways they do this is through the use of a program called a spider. Spiders crawl websites, have a look around, and index the keywords they find; when a user types a search term, the engine matches it against that index to decide which pages are relevant. Effective SEO involves including relevant keywords in website copy so that the engine's algorithms judge your pages to be related to what your audience actually searches for.
For instance, suppose that your business website sells cutting machinery to other companies. A customer might type “laser cutters” or “plasma cutters” into a search engine. To rank well, your website should contain the same terms. If it does, spiders are more likely to conclude that the pages of your site are relevant for that particular user.
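To make the idea concrete, here is a minimal Python sketch of keyword matching. The page text, URLs, and scoring function are all hypothetical, and real search engines use vastly more sophisticated ranking models, but the basic principle is the same: pages containing the query's terms look more relevant.

```python
import re
from collections import Counter

def keyword_score(page_text: str, query: str) -> int:
    """Count how many of the query's terms appear in the page text.

    A toy stand-in for the relevance signal a spider's index provides;
    real engines weigh hundreds of signals, not a simple term count.
    """
    words = Counter(re.findall(r"[a-z]+", page_text.lower()))
    return sum(1 for term in query.lower().split() if words[term] > 0)

# Hypothetical pages: only one of them mentions the query's terms.
pages = {
    "cutters.example.com": "We sell laser cutters and plasma cutters for industry.",
    "bakery.example.com": "Fresh bread and pastries baked daily.",
}
best = max(pages, key=lambda url: keyword_score(pages[url], "plasma cutters"))
print(best)  # the page mentioning the query terms ranks first
```

Even this crude version captures why copy matters: the bakery page simply cannot surface for "plasma cutters" because the terms never appear in it.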
It’s usually a good idea to think carefully about the type of words that you use. Although people in your company familiar with industry jargon might use one set of expressions to describe your product, customers uninitiated in the technical aspects of what you sell might use another. You might, for instance, sell “low fatigue girding,” but your customers might call it “reinforcement poles” instead. Although the former may be the technically correct expression, spiders are more likely to reward websites containing the latter.
Of course, the terms that users type into search boxes can change over time. As products and industries evolve, it’s crucial to update website copy so that it remains on-trend and can capture traffic using the latest language. It’s also a good idea to keep an eye out for so-called “long-tail keywords.” These are phrases or sentences that users frequently type into the search box in the hope of getting an answer or a product recommendation. Businesses that can cater to these keywords will find themselves in an excellent position, receiving lots of free traffic while signaling to Google and others that their website is providing value to users.
Although you might not think it, it matters a lot where you locate your keywords. Yes, search engines tend to prioritize keywords in titles and page headers, but they also take into consideration keywords in body text and in the so-called “metadata” of your website – the data that users don’t see but search engines do. Ideally, you want your metadata to provide information to search engines about your page suggesting that it is highly relevant to your target audience.
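As an illustration, the snippet below uses Python's standard html.parser to pull out a page's title and meta description, the kind of "invisible" metadata a crawler reads. The example HTML is made up; it simply shows which fields a search engine can see even though visitors never do.

```python
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    """Collect the <title> and meta description from a page.

    A simplified sketch of what a crawler extracts; real crawlers
    parse far more, but these two fields matter heavily for SEO.
    """
    def __init__(self):
        super().__init__()
        self.metadata = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.metadata["description"] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.metadata["title"] = data

# A made-up page for a hypothetical cutting-machinery business.
page = """<html><head>
<title>Industrial Laser Cutters | Example Co</title>
<meta name="description" content="Laser and plasma cutters for manufacturers.">
</head><body>Welcome!</body></html>"""

reader = MetaReader()
reader.feed(page)
print(reader.metadata)
```

A visitor to this page sees only "Welcome!", yet the crawler comes away with a title and description full of keywords, which is exactly why metadata deserves the same care as visible copy.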
When Larry Page and Sergey Brin, the founders of Google, were putting the first iteration of Google together, they needed a way of sorting pages in order of importance. It wasn't obvious how best to do this until they hit on the idea of using the links already on the internet as a ranking mechanism. The more links pointing to a page, the reasoning went, the more useful and relevant that page was likely to be.
Links soon became the bedrock of the modern page-ranking system, and even today, for all the additional complications of the algorithms, they remain at the core of SEO.
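That original insight can be sketched in a few lines of Python. This is a toy version of the link-counting idea (a simplified PageRank) run over a made-up link graph; production ranking systems combine link analysis with many other signals.

```python
def simple_pagerank(links, damping=0.85, iterations=50):
    """A toy version of the original PageRank idea.

    Each page's score is spread evenly across the pages it links to,
    and the process is iterated until the scores settle. `links` maps
    each page to the list of pages it links out to.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # dangling pages: their rank is dropped in this toy version
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph: three pages link to "hub", which links back to "a".
links = {"a": ["hub"], "b": ["hub"], "c": ["hub"], "hub": ["a"]}
rank = simple_pagerank(links)
print(max(rank, key=rank.get))  # "hub", the most linked-to page
```

Notice that "a" also ends up scoring well, not because many sites link to it, but because the one site that does is itself important. That is exactly the "authority" effect discussed below.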
Links are simply hyperlinks from other websites. Links create a lattice of relationships between sites across the web, with the most popular sites having the most links, and the least popular, the fewest. Links are both a great way to get traffic from users of other sites, and excellent for signaling to search engines that your site is providing useful information to users elsewhere on the internet.
Getting links, however, can be a challenge. One method is to use paid links – paying a third party agency to create content with links to your website on other, related websites. Google and other search engines will then trawl those websites, follow the links, and discover that they converge on your site, adding to the evidence (at least in the mind of the search engine), that your site is providing users with something valuable.
Another method for getting links is to provide incredibly useful content. If you can generate new and original information, other domain owners will link to your site because your site is the only place to get such information. Being the first to create something new can boost your site up the rankings and get you noticed.
It’s worth pointing out that not all links are created equal. Sure, a link from a random, low-traffic website will have some effect on your page ranking, but its impact will be dwarfed by that of a link from an “authority” site with millions of hits each day. Search engines know which sites are authority sites and which aren’t, and will weight links accordingly. A link from a well-respected site like Forbes, Inc, Buzzfeed, TIME, or Hubspot can carry many times the weight of a link from a site that was set up yesterday.
Getting links from these kinds of sites is one of the holy grails of SEO. Links from respected sites imbue pages with legitimacy and generate a fair amount of traffic in their own right. The quest for links is one of the reasons why so many entrepreneurs contribute their content for free on high-profile sites. They can get a link to their personal websites in return, potentially cementing their position high up in the search rankings for their chosen keywords.
Staying On Top Of SEO Updates
Microsoft, Google, DuckDuckGo, and others are continually looking for ways to optimize their services and stay one step ahead in the cat-and-mouse game of SEO. The big companies know that with each passing year, SEO marketing becomes more sophisticated and businesses learn how to push their content ahead of better, potentially more relevant, material from elsewhere.
Search engines want to ensure that they provide the most relevant content possible, and so they constantly tweak their algorithms to make sure that artificial SEO tactics have less of an effect. In the past, for example, search engines used to rank a page depending on the number of times it contained the relevant keywords. The result was “keyword stuffing” – the practice of cramming as many keywords into web copy as possible to trick the search engine into thinking a particular page was the most relevant. Keyword stuffing worked for a while, but it was bad for users, who didn’t want to read poorly written articles that repeated the same expressions over and over again. Search engines quickly stamped out the problem by discounting repeated keywords and penalizing pages that overused them.
The way paid links work has changed too. In the past, it was sufficient to artificially create a bunch of websites and then link these to a particular page to boost its ranking. Businesses paid third parties to build hundreds of links from obscure sites to signal that the page was popular and relevant. The result, of course, was a bunch of high-ranking pages providing low-quality content which frustrated and bored users. The high rank was an artificial outcome of SEO tactics, rather than the natural result of being something genuinely useful.
Today, link building has changed dramatically. Companies now know that to be sustainable, links need to come from relevant authority sites, and website content must be useful. If it isn’t, then firms will lose customers and waste their SEO marketing budgets.
Up-To-Date Pages
Have you ever noticed that when you type a search into Google’s search bar, you tend to get pages that were created relatively recently? Rarely do you end up on a webpage created ten years ago. That doesn’t happen by chance: it’s a product of the way the search algorithms work. Search engine providers know that their users usually want the most recent, up-to-date information, and so they rank fresher pages higher in search results. It’s not a perfect system, but it does mean that businesses that want to rank well need to update their web pages on a regular schedule, quarterly or yearly.
Although SEO might seem like a lot of work, it is easy to outsource. Companies don’t need to learn the ins-and-outs of SEO from scratch and then implement it themselves: they can just farm out activities to SEO agencies that will take care of the whole process for them.
There are also a host of SEO tools on the market that help businesses automate what could otherwise be time-consuming SEO tasks. Some tools, for instance, let you quickly and easily see not only which sites link to your own, but also which sites link to your competitors. That knowledge is quite useful, allowing you to build a shortlist of target sites that you’d like to link to your own.
SEO tools might also include the following:
Keyword generators. These tools will generate a set of related keywords linked to a particular keyword you type.
Backlink checkers. Backlink checkers show you the links on a site from other domains.
Trend checkers. Trend checkers show the growth in popularity of particular keywords.
Page tools. Page tools allow you to collect data on how many people visited a particular page, how long they stayed, and so on.
Page analyzers. Page analyzers help you spot common SEO problems with specific pages, such as broken links, which can impact ranking.
In conclusion, SEO is simple in concept but complicated in the details. A good SEO strategy helps to boost your site’s ranking in search results while simultaneously netting you more customers. Sensible companies work with experts who understand the SEO process in all its detail, and can provide long-term support. Companies that succeed in SEO usually succeed financially.