Search Engine Optimisation Explained

As web designers we are getting more and more requests to take Search Engine Optimisation seriously, and it is fast becoming a standard skill for anyone who wants to create websites. It is not enough for a website to be usable, good looking and interactive; it must also be found on search engines such as Google. This process can be a bit of a mystery or an enigma for some clients, so we will try to explain some of the factors we have learned in our research into Search Engine Optimisation.

This article is an extension of one we wrote some time ago, and goes into a little more detail to help explain.

See Search Engine Optimisation & Search Engine Tips for some background info.

To start off, let’s look at a great and informative video by Google itself explaining how its search engine works and how it gathers your search results…


OK, so basically Google has written programs called spiders which crawl the web collecting data from web pages. These spiders bring the data back to their nest (a huge database), where it is arranged in order of priority for each unique search term or phrase, so that when a user searches a term or phrase the most relevant pages are presented.

This means that if you want to be found on Google, and better still ranked higher than your competition, your website should be friendly and helpful to Google’s spiders.

So here is a list of things we have learned that you should implement to ensure your current website is spider friendly, or Search Engine Optimised.

Be Credible

The Google spiders are very intelligent and are not easily fooled, so whatever your product or service, make sure you write about it in a legitimate manner. A lot of people try to game Google by stuffing pages with keywords, hiding keywords on a page, or using spammy link-building services. Google detects this straight away, as it tracks trends across websites, so you could be doing your site a disservice if it is not credible. Remember, Google’s focus is on giving the user the best experience possible, so it will not present your information if it is not credible.

Credible Backlinks

The more credible you are, the more likely you are to have credible backlinks pointing to your website.
If the information you supply is true and genuinely helpful, other websites will want to link to you to support their own content. The more legitimate backlinks you have, the more Google’s spiders will see your site as credible and hence rank you higher. So having a blog or a latest-news area that discusses your service or product in more detail than expected can go a long way to getting your page ranked higher.

Other ways to get credible backlinks are through:


Define and create a keyword strategy and then implement that strategy in your website content. This is probably the most expensive process as it involves expertise and time, but generally it involves:

  • Initial Market Keyword Research
  • Strategy and design
  • Implementation
  • Monitoring
  • Analysis
  • Continuous Improvement

There are many tools that can assist in your research, such as;

This can be a drawn-out process, as it involves implementation, testing and re-implementation. It is also more beneficial to focus each page on only a few keywords at any one time, so as not to overcomplicate things and confuse Google’s spiders. If you have many keywords that you want to target, it is probably better to create additional pages and content for them.
Keywords can be short keywords or longtail keywords that read more like a phrase — for example, “web design” versus “responsive web design Newcastle”. Longtail keywords can be better qualified, as their search results are more specific and targeted to a particular phrase, increasing the chance of a conversion.


Utilise landing pages for specific keyword campaigns

Landing pages are small, targeted pages that usually sit on a subdomain of your website. They are designed to speak to users who have landed on your page from a particular keyword search. As a landing page can be the last stage of the user’s decision-making process, it is important that the page is specific to the user’s needs, offering targeted images, information, relevant testimonials and some kind of information capture or point of sale.


Use clean URLs

This is where the actual URL (web page address) of every page on your website is correctly defined, matching the page’s title and content.

A lot of websites still have URLs that are non-explanatory, made up of random strings of letters and numbers. Modern URLs can instead spell out the categories and keywords on your page.
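To illustrate, here is a minimal sketch in Python of how a page title can be turned into a clean, keyword-bearing URL segment (the function name and example domain are just for illustration):

```python
import re

def slugify(title):
    """Turn a page title into a clean, lowercase URL slug."""
    slug = title.lower()
    # Replace any run of non-alphanumeric characters with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Search Engine Optimisation Explained!"))
# → search-engine-optimisation-explained
# A clean URL for this article might then be:
# http://www.example.com/blog/search-engine-optimisation-explained
```

A URL built this way tells both users and spiders what the page is about before it is even opened.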

For more information visit

Meta Data

Use correctly defined Meta Keywords and Descriptions, unique and relevant to every page. This is hidden code, defined in the <HEAD> of each page, that describes your page in a way spiders understand.
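As an example, the hidden code for a page like this one might look something like the following (the content values are illustrative only):

```html
<head>
  <title>Search Engine Optimisation Explained</title>
  <meta name="description" content="A plain-English look at how Google's spiders crawl, index and rank your website.">
  <meta name="keywords" content="search engine optimisation, SEO, Google spiders">
</head>
```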

For more information visit

Semantic Programming

It is best practice for your website to be programmed semantically. This means the structure of the code is clean and used as intended by the original creators of the language. There are rules to the way you lay out a page and the way you use particular programming tags to structure it (known as semantics).
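For instance, a semantically programmed page uses tags that describe the role of each section, rather than generic containers — a simplified sketch:

```html
<body>
  <header>
    <h1>Search Engine Optimisation Explained</h1>
  </header>
  <main>
    <article>
      <h2>Be Credible</h2>
      <p>Whatever your product or service, write about it in a legitimate manner.</p>
    </article>
  </main>
  <footer>© psyborg®</footer>
</body>
```

Spiders can read this structure and understand which part of the page is the heading, which is the main content, and which is supporting material.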

For more information visit

Social Media

Personal and credible backlinks to your website can be generated via social media platforms including Twitter, Facebook and Google+. These are helpful as backlinks to news articles and information pages and for building a social network around your brand. They are not only generated by your own social pages, but are even more beneficial when your website is linked from your clients’ or fans’ social media pages.

Check out which is a tool that measures your social engagement.

Upload an XML sitemap of your website

An XML sitemap is a file that defines the page structure of your website. Although the job of the spider is to crawl your website and create its own snapshot of your website’s page structure, providing Google with an XML sitemap makes the spider’s job a little easier and more efficient. I guess the spiders should like that.
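A minimal sitemap file looks something like this (the domain, dates and priorities are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry tells the spider where a page lives, when it last changed and how often to come back and check it.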

There are tools that can help you generate an XML sitemap, such as

Once a sitemap is created, you need to upload it to Google Webmaster Tools and associate it with your domain.

The speed of your website

The faster and more efficient your website is, the more likely the spiders are to like it. If your page load lags after a Google result is clicked, the user experience is compromised and Google will not rank you as highly. The speed of your website depends on a few factors, such as where it is hosted in relation to your target audience, how the programming and images are optimised, and how the website is designed.

You can test the speed of your website with Google’s own speed service here. This not only gives your website a speed ranking but offers ways to improve it.

Set up Google Analytics and understand your results

Google Analytics is a fantastic tool to help you understand your current website traffic and therefore test and make improvements to your website’s Search Engine Optimisation. Some important aspects of your Analytics report are:

  • Bounce rate – the rate at which users leave the page they landed on without continuing to other pages on your site, meaning their interest was not captured. A bounce rate of 25–40% is a good benchmark to aim for
  • New visitors – if this number is growing, your website is attracting new traffic
  • Traffic sources – where the users are coming from
  • Keywords – what words they have used to discover your website
  • Mobile – shows how many people access your site on mobile devices, which may encourage you to create a mobile-friendly version of your website
  • Visitor flow – measures what pages people have visited and, more importantly, where they have dropped off
  • Right now – the ability to see who is on your site live as you view your analytics

Use a 301 redirect

Sometimes it may be necessary to move your website from one domain to another. If so, you risk losing the credibility already gained with Google’s spiders, as the pages have moved to a new location in cyberspace. Thankfully there is a tool to resolve this, known as a 301 redirect. More info about this can be found here.
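On an Apache server, for example, the redirect can be set up with a few lines in the old site’s .htaccess file (the domain names are placeholders, and this sketch assumes mod_rewrite is enabled):

```apache
# Permanently (301) redirect every page of the old domain to the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

The 301 status code tells the spiders the move is permanent, so the credibility of the old pages is passed on to the new ones.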

To Sum Up

So that’s a comprehensive list of some of the things you can look at when trying to improve the Search Engine Optimisation of your website. It is not a complete list, as there are many more things that can be done, but it is a good introduction. Probably one of the biggest points is that the web is continuously evolving and changing, so it is just as important to keep up to date with web trends, as these also affect Search Engine Optimisation.

We invite you to share your comments below on your experience with SEO or any questions relating to this content.

Daniel Borg


Creative Director

psyborg® was founded by Daniel Borg, an Honours Graduate in Design from the University of Newcastle, NSW, Australia. Daniel also has an Associate Diploma in Industrial Engineering and has experience from within the Engineering & Advertising Industries.

Daniel has completed over 2800 design projects consisting of branding, content marketing, digital marketing, illustration, web design, and printed projects since psyborg® was first founded. psyborg® is located in Lake Macquarie, Newcastle, but services businesses nationwide.

I really do enjoy getting feedback so please let me know your thoughts on this or any of my articles in the comments field or on social media below.

Cheers Daniel