Search Engine Optimisation Defined
“SEO stands for ‘search engine optimization.’ In simple terms, it means the process of improving your site to increase its visibility when people search for products or services related to your business in Google, Bing, and other search engines. The better visibility your pages have in search results, the more likely you are to garner attention and attract prospective and existing customers to your business.” (Search Engine Land, 2021)
How Google and Most Search Engines Work
This post provides a full breakdown of how Google works. The aim is to arm you with the relevant knowledge about what you, as a business, should be asking of your developers, and to outline what you should expect from marketers in relation to your site’s content.
Google uses a patented algorithm known as PageRank. PageRank is just one small part of how Google ranks your site; it works in conjunction with other signals that together determine your website’s overall ranking.
Ian Rogers of Princeton University has explained how PageRank works very simply: “In short PageRank is a “vote”, by all the other pages on the Web, about how important a page is. A link to a page counts as a vote of support. If there’s no link there’s no support (but it’s an abstention from voting rather than a vote against the page)” (Rogers, 2014).
The key thing to remember about the PageRank algorithm is that it ranks pages by importance, measured by the number of internal and external links pointing to a page. In essence, the algorithm is the quantitative measuring element of the search engine.
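The voting idea behind PageRank can be illustrated with a toy computation. This is a minimal sketch, not Google’s implementation: the page names, the damping factor of 0.85, and the fixed iteration count are all illustrative assumptions.

```python
# Toy PageRank over a tiny link graph. A link from one page to another
# acts as a "vote"; pages with more votes accumulate more rank.

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively estimate PageRank for a dict mapping page -> outbound links."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new_ranks = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if outbound:
                # Each page splits its rank evenly among the pages it links to.
                share = ranks[page] / len(outbound)
                for target in outbound:
                    new_ranks[target] += damping * share
            else:
                # A page with no links "abstains": its rank is spread evenly.
                for target in pages:
                    new_ranks[target] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

# Hypothetical three-page site: every page links back to the homepage.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
```

Because both other pages vote for the homepage, it ends up with the highest rank, which matches the intuition that well-linked pages are treated as more important.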
Googlebot works in conjunction with the PageRank algorithm to give Google an accurate picture of where your site should be placed in its results. The crawler’s role is to trawl through billions of web pages by working through a list of URLs, generated from previous crawl sessions and submitted sitemaps. The key things Googlebot looks for are as follows:
- Detecting links on web pages and adding them to their existing list of URLs to crawl.
- New websites to add to their existing list of URLs to crawl.
- Changes to existing sites.
- Dead links.
Googlebot is also equipped to detect malpractice in relation to SEO. In short, the crawler’s job is to collect website information for Google to index.
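The steps above can be sketched as a simple frontier loop. This is a toy model under stated assumptions: the `site` dict stands in for the live web (a real crawler would fetch each URL over HTTP), and the paths and HTML snippets are made up for illustration.

```python
import re
from collections import deque

# Hypothetical in-memory "web": URL -> page HTML.
site = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/missing">Old post</a>',
}

def crawl(start):
    frontier = deque([start])   # the list of URLs still to crawl
    seen, dead = set(), set()
    while frontier:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = site.get(url)
        if html is None:
            dead.add(url)       # a dead link: nothing here to index
            continue
        # Detect links on the page and add new ones to the crawl list.
        for link in re.findall(r'href="([^"]+)"', html):
            if link not in seen:
                frontier.append(link)
    return seen, dead

crawled, dead_links = crawl("/")
```

Even in this tiny sketch, the crawler discovers new pages through links, records a dead link, and builds up the set of pages available for indexing.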
Another part of Googlebot indexes and categorises the information contained within web pages. It does this by assessing the position of words on each page and determining their relevance to the content; these words are known as keywords. It also assesses key content tags such as titles, alt attributes and image titles/tags. Google indexes sites based on relevance and on whether the website in question covers a broad range of good-quality content.
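The core idea of keyword indexing can be shown with a minimal inverted index: each word maps to the set of pages that contain it, so a query can be answered by intersecting those sets. This is a sketch only; the page URLs and text are invented, and real indexing weighs position, tags and many other signals.

```python
from collections import defaultdict

# Hypothetical pages and their visible text.
pages = {
    "/coffee": "fresh roasted coffee beans delivered weekly",
    "/tea": "loose leaf tea and fresh herbal blends",
}

# Inverted index: keyword -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages that contain every keyword in the query."""
    results = [index[w] for w in query.split()]
    return set.intersection(*results) if results else set()
```

A one-word query like `search("fresh")` matches both pages, while `search("fresh coffee")` narrows the results to the page containing both keywords.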
Key things to remember
Link your pages
Make sure your pages link in a way that is above board and legitimate, and only allow good-quality sites to link to your URL. This allows PageRank to determine that your site provides a good-quality browsing experience with content that is relevant to the visitor.
Make it easy for the bot to crawl through your site
This is done by ensuring your links work, your keywords are in the right places, and your alt tags are filled in. Make sure your URLs are not too long and lead logically from one to another. If your website is updated and/or new pages are added, submit the updated sitemap to Google Search Console. If your website is updated regularly, it may be worth submitting your sitemap to Search Console every two weeks.
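A sitemap is just an XML file listing your URLs and when they last changed. The sketch below builds a minimal one with Python’s standard library; the `example.com` URLs and dates are placeholders for your own pages, and in practice you would write the result to a file and submit it via Search Console.

```python
import xml.etree.ElementTree as ET

# Minimal sitemap following the sitemaps.org 0.9 schema.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Placeholder pages and last-modified dates.
for loc, lastmod in [
    ("https://www.example.com/", "2021-06-01"),
    ("https://www.example.com/blog", "2021-06-15"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ET.tostring(urlset, encoding="unicode")
```

Keeping the `lastmod` dates accurate helps the crawler prioritise pages that have actually changed since its last visit.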
Stay relevant to aid indexing
Make sure your content is relevant, succinct and concise; word count alone will not do you any favours. More information on how to be easily detected and indexed can be found in our blog section.