Search Engine Optimisation
Search Engine Optimisation (SEO) is the process of influencing the structure, content, and authority of a website to increase the number and improve the quality of visits from search engines via “organic” search results. Search engines are continuously evolving, so your strategies and tactics should evolve with them. SEO is an ongoing process, not a one-time activity.
Ideally you want your website to appear somewhere on the first page of results for your important keyword phrases, as 80% of users don’t look past the first page. Studies suggest the top five ranked listings receive 50% of the clicks for a given keyword phrase.
What are the benefits of Search Engine Optimisation?
The majority of search visits are driven by the three major search engines: Google, Bing, and Yahoo!. If your web pages are not discoverable, and their content is not readable by search engines, you miss out on the substantial opportunities that search provides to websites.
Proactively engaging in SEO may require significant human and financial investment, but it can deliver an exceptional rate of return. There are several benefits to making your website SEO friendly, including:
- One of the most cost-efficient online strategies
- Results can last for months and even years beyond the initial investment
- Clicks on organic links account for approximately 70% of overall search traffic
- Sites with established brand equity typically have an advantage
- Highly accountable, can directly measure impact
- When applied well, can enhance user experience and improve website accessibility
- High rankings for generic terms contribute to brand awareness
- Highly targeted search visitors are more likely to convert
How does Organic Search Work?
Search engines consider many factors to determine the web pages to include in their indexes and how the pages should rank for keyword searches performed on their engines.
Content appears in the Search Engine Results Page through a three-step process: crawling, indexing, and ranking.
Crawling
- A software agent called a crawler (also called a robot, bot, or spider) discovers web pages by following links.
- Crawlers fetch only a limited number of pages at a time (in parallel threads); the URLs of pages waiting to be fetched are stored in a queue.
- The crawler adheres to exclusions listed in a robots.txt file and/or specific meta tags in the page code.
- The source code of each successfully fetched page is saved as a cached version.
Indexing
- Content is collected and filtered from each saved page.
- Duplicates and low-quality pages are filtered out.
- Data is organised for speed and relevance of returned results.
Ranking
- Software processes the search keywords and ranks web pages based on the engine’s unique algorithms.
- Algorithms consider on-the-page and off-the-page factors, including web page copy, meta data, and inbound and internal links.
- Each search engine has a unique formula for determining relevance.
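The crawling step above can be sketched in a few lines of Python: a queue of URLs, a robots.txt check before each fetch, and link-following to discover new pages. The robots.txt content and the link graph here are hypothetical stand-ins for a real site; a production crawler would fetch pages over HTTP and cache their source.

```python
from collections import deque
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site (assumption, not a real file).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

# Hypothetical in-memory link graph standing in for fetched pages.
LINKS = {
    "https://example.com/": ["https://example.com/about",
                             "https://example.com/private/admin"],
    "https://example.com/about": [],
}

robots = RobotFileParser()
robots.parse(ROBOTS_TXT.splitlines())

def crawl(seed):
    """Breadth-first crawl: URLs queue up, robots.txt exclusions are honoured."""
    queue, seen, fetched = deque([seed]), {seed}, []
    while queue:
        url = queue.popleft()
        if not robots.can_fetch("*", url):   # skip pages excluded by robots.txt
            continue
        fetched.append(url)                  # a real crawler would cache the source here
        for link in LINKS.get(url, []):      # discover new pages by following links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return fetched

# crawl("https://example.com/") visits "/" and "/about" but skips "/private/admin"
```

The same exclusion rules can also be expressed per page with a meta tag such as `<meta name="robots" content="noindex">`, which the second bullet under Crawling refers to.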
The image below shows the steps by which a web page is indexed and placed into a search engine’s database.
A search engine’s operations can be quite complex, at times handling millions of calculations each second and delivering the resulting information to individual users around the world.
What Steps does a Typical SEO Project Include?
Most SEO projects include several, if not all, of these phases:
Getting Started with SEO
Is SEO a good fit for you? It may be if…
- Your site’s lifespan will be at least 6 months
- You are able to make changes to site content and code
- Your site consists of at least 20 pages of unique content
If you answered ‘Yes’ to these questions, then you are ready to engage in an SEO project. Getting started with a best-practice SEO program is easy! Get in contact today.