How We Ensure Our Web Designs Are Search Engine Friendly
Building your website to be search engine friendly from day one means that any marketing or search engine optimisation you invest in later flows seamlessly through to your rankings. Here’s how we do it.
Step 1. Site Planning
During site planning, consideration should be given to search engine friendliness. The way a search engine robot crawls your website is heavily influenced by your site plan. If you want your website to be highly relevant for a range of products and services, make sure that not only are there dedicated pages for each of them but also that those pages are easily accessible to the robots. That means linking to them from the front page, or at the very least from a higher-level page that is itself reachable from the front page. The deeper a page sits within the site, the less residual ranking power it receives from your top-level pages.
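As a sketch, a flat navigation that keeps every key service page one click from the home page might look like this (the page names and URLs are hypothetical):

```html
<!-- Home page navigation: each dedicated service page is linked directly,
     so crawlers reach them without descending several levels -->
<nav>
  <ul>
    <li><a href="/web-design">Web Design</a></li>
    <li><a href="/web-development">Web Development</a></li>
    <li><a href="/search-engine-optimisation">Search Engine Optimisation</a></li>
  </ul>
</nav>
```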
Step 2. Copywriting
Gone are the days when SEO meant filling your copy with keywords to the point of it being senseless to the reader. Good SEO copy = good copy. The focus should not be on keywords but on expressing clearly what your page is about. Is this a page about web design? Then your page should mention web design and variations on the phrase, such as website design, web development and so on. Don’t force them in; if you write a concise page about what you want to rank for, chances are you’ll get it right anyway.
Step 3. Tools & Analytics
Installing analytics software lets you gather insights from user behaviour: you can see what your traffic does once it reaches your site. Information such as your best-performing pages, conversion and bounce rates and visitor demographics can be gathered using Google Analytics or Adobe Omniture. Advanced features such as attribution modelling help you assign the correct value to each inbound channel. For instance, if a visitor first arrives via social media and later converts through organic search, then without attribution modelling you will credit the whole conversion to Google instead of the important first touch provided by social media.
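For reference, installing Google Analytics means pasting its standard analytics.js snippet into the head of every page; this is the snippet Google supplies at the time of writing, with UA-XXXXX-Y as a placeholder for your own property ID:

```html
<script>
  /* Standard Google Analytics (analytics.js) tracking snippet.
     Replace UA-XXXXX-Y with your own property ID. */
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');
</script>
```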
Unfortunately Google has effectively rendered Google Analytics useless for keyword research by withholding the search terms that bring visitors to your site, but you can still get some query data from Google Webmaster Tools. Other helpful analytics programs include click-map and scroll-tracking software such as Crazy Egg, which gives you a visual representation of the data that Google Analytics supplies in raw form.
Step 4. On-Site Optimisation
Perhaps the most important step is making the range of fixes that come under the banner of on-site optimisation. The technical SEO aspects that give you an advantage over other websites in your space include duplicate content elimination, structured data markup, metadata optimisation, internal linking and page speed optimisation.
Elimination of Duplicate Content
Duplicate content is an issue that comes up frequently on websites that haven’t had the magical SEO touch. There will be situations where you want to reuse the same content across a range of pages, but Google sees this as an attempt to game the system and introduced its Panda algorithm update to counter exactly that. Fortunately there is a way to keep your content on multiple pages: the rel=canonical tag, as seen below.
In the head section of the non-canonical page (the page you don’t want to rank organically), link to the page you do want to rank, as follows.
<link href="http://www.canonical.com/canonical-page" rel="canonical" />
Where possible it is better practice to use a 301 redirect, as the canonical link is only a hint that robots can choose to ignore; but if you absolutely need the same content live on multiple pages, the canonical tag is the way to do it.
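As an example, on an Apache server a 301 redirect can be set in the site’s .htaccess file using the mod_alias Redirect directive; the paths and domain here are hypothetical:

```
# Permanently redirect the duplicate URL to the page you want ranked
Redirect 301 /duplicate-page http://www.example.com/canonical-page
```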
Structured Data Markup
Structured data is HTML markup applied to your content that lets search engines extract relevance they couldn’t get from the content as intended for human eyes. The most widely used vocabulary is schema.org, an industry-standard set of structured data properties that most search engines read and act upon. This is pivotal for e-commerce websites and for stores with franchise locations across the country. Everything you need to know about schema is on the schema.org website.
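As a hedged sketch, schema.org markup for a product page can be added with microdata attributes like these (the product, price and currency are made up for illustration):

```html
<!-- schema.org Product marked up with microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="AUD" />
    <span itemprop="price" content="19.99">$19.99</span>
    <link itemprop="availability" href="http://schema.org/InStock" /> In stock
  </div>
</div>
```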
In Google Webmaster Tools there is a relatively new feature called “Data Highlighter”, which acts as a light version of schema.org markup for webmasters who aren’t quite so savvy.
Metadata Optimisation
This one is quite straightforward. The metadata that counts is your meta page title and, to a lesser extent, your meta description. Meta keywords count for absolutely nothing. Most CMSs have SEO extensions or modules available that let you optimise this information from the page editor. Failing that, you can hardcode it in the head section of the page.
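Hardcoded, the title and meta description sit in the head section like this (the agency name and wording are illustrative only):

```html
<head>
  <!-- Target keyword first, then brand; kept to roughly 70 characters -->
  <title>Web Design - Example Agency</title>
  <!-- Written for click-through rate; kept to roughly 150 characters -->
  <meta name="description" content="Custom web design that is fast, mobile friendly and built to rank. Get a free quote from Example Agency today." />
</head>
```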
When writing your page title, ensure that it’s not too long; around 70 characters is the most I would include. Put the main service (the main target keyword of the page) at the beginning, followed by a – or a | and then your brand name. Google does tend to swap your brand name and page title around when displaying search results, which seems to be a rather new trend. Meta descriptions, while having precious little to do with ranking, do a LOT for your click-through rate from the search engine, so write yours with this in mind. Keep it to around 150 characters, depending on the search engine, or it’ll snip the end off.
Internal Linking
Appropriate linking from page to page inside your website allows pages that might otherwise be crawled rarely to be crawled, opening your website up to the robots. Only link internally where it is relevant, and avoid linking to landing pages from footers with keyword-rich anchor text: when Google sees that you’ve tossed the same keyword link onto every single page of your website, it reads as over-optimisation. If you simply must link from the footer, do as we did on our site and make it so that only the homepage does so.
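A relevant internal link is nothing more than a contextual anchor within the body copy pointing at the target page, for example (URL and wording hypothetical):

```html
<!-- A contextual in-copy link, rather than a sitewide footer link -->
<p>We build every site with <a href="/search-engine-optimisation">search engine optimisation</a> in mind.</p>
```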
Page Speed Optimisation
A range of things can cause a slow website, but the most common issue is large images. Ensure your images are compressed down to a size that doesn’t drag out the page load time; this is one of the easiest fixes to what could be a huge problem. Otherwise, slowness can be caused by bandwidth limitations with your hosting, so make sure your hosting plan matches the amount of traffic you expect to get.
There is a heap of other things you can do to help page speed, and fortunately Google has built an awesome tool, PageSpeed Insights, that tells you everything it thinks you should do.
If you build sites with all this in mind you will rarely ever go wrong.
Joe McCord on Google+
Joe McCord on Twitter