Search engine optimisation is not black magic. Nor is it rocket science. SEO is a puzzle comprising thousands of little pieces that fit together to give search engines a clear picture of what your website has to offer.

You need someone with intense attention to detail to structure your website, the content displayed on your website, and the links that point to your website in order to arrange the pieces of the puzzle in the most efficient way possible.

The Periodic Table of SEO Success Factors


On-page SEO Factors

On-site SEO ranking factors boil down to quality, accuracy and integrity.

Search engine optimisation is no longer about technical tricks designed to outwit Google. It is about building an audience, earning trust, and publishing information that customers want.



Marketing is telling a story that connects with customers. If your customers like what you have to say, so will the search engines. Focus on writing epic content for customers first, then tweak a few keywords afterwards. Don’t write for the sake of writing: back it up with valuable insight, thoughtful opinion or researched facts and make sure your content is well-written with perfect grammar and spelling.


Keyword research is the foundation of on-site SEO. The keywords and phrases recorded by search engines offer a window into the thought process of customers, which translates into a checklist for you to cover all bases. Ensure every page stands on its own as a topic. Don’t publish multiple pages about the same thing. Instead, stick to single, well-written, citation-worthy topic pages and optimise them for a common theme.


Include the full range of phrases relating to a keyword on the page. Search engines look at the relationship between the primary keyword and the secondary keywords often used in the same context. The more your information reflects these patterns of search behaviour, the more relevant the page appears.


Engagement refers to the amount of time visitors spend on your website and how many pages they visit. Once again, meaningful content is key. It’s amazing how it all comes back to quality. Engagement isn’t just limited to text content, either; web design is equally important. Words don’t just have to read well to be engaging — they have to look good. Readability includes everything from page layout to font selection to letter and line spacing. Additionally, pay attention to navigation and the presentation of links to other content, as these elements can impact bounce rates and other visitor engagement metrics such as time on page / time on site.


Websites with blogs generate roughly 50% more traffic than websites without blogs. This is because the search engines detect that you are publishing new content on a regular basis. Not only is new content important to attract readership, it also improves crawl frequency and depth. Fresh content also builds awareness and trust. You must go beyond writing about just your company and its products or services. Go broader and become a resource: a real, viable, valuable resource for your customers. Taking this broad approach will give you more to write about, allowing you to focus on topics that interest your customers.


Avoid writing short “thin” articles: pages with fewer than 300 words rarely rank well, as they do not provide enough information on the topic and there are almost certainly other pages out there with more to offer. At the opposite end is “deep” content: pages with 2,000 words or more. Many websites have seen success with this approach, so businesses are producing longer, more meaningful pages. The risk is becoming long-winded: length should come from substance, not padding.


Another part of layout is the placement of ads. Search engines will not penalise you for having advertisements, however they will penalise your website if you have too many ads or inappropriate ad placements. If you use interstitial or pop-up advertising, make sure it doesn’t interfere with the ability of search engines to crawl your pages.



The rules for writing optimised title tags and headers have not changed: each page must have a unique title of between 35 and 70 characters. Pay attention to the order in which keywords are positioned within the title – the closer to the beginning, the better.
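As a minimal sketch, a title tag meeting those limits might look like this (the business name and keyword are hypothetical placeholders):

```html
<head>
  <!-- Unique per page, 35-70 characters, primary keyword near the front -->
  <title>Emergency Plumber Brisbane | Acme Plumbing</title>
</head>
```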


Meta description tags have also not changed: write unique descriptions of between 120 and 156 characters for every page. Besides having a minor impact on your ranking, well-written descriptions can increase click-through rate, as they are often the first impression you give your customers. For pages that appear in sitelinks, ensure the truncated portion of the description shown beneath each link reads coherently.
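A hedged example of a description within that range (the business and copy are placeholders):

```html
<head>
  <!-- Unique per page, 120-156 characters, written as a complete pitch -->
  <meta name="description" content="Acme Plumbing offers 24/7 emergency plumbing across Brisbane. Licensed, insured and on-site within the hour. Call now for a free quote.">
</head>
```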


Each page must have exactly one unique H1 heading, with the primary keyword inside it – ideally at the beginning. Following this logic, each page should also contain multiple H2 and/or H3 subheadings that include the primary keyword.
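A sketch of that heading structure, using a hypothetical keyword:

```html
<body>
  <!-- One H1 per page, primary keyword at the front -->
  <h1>Emergency Plumbing in Brisbane</h1>

  <!-- H2/H3 subheadings reinforce the same theme -->
  <h2>What Counts as an Emergency Plumbing Job?</h2>
  <h2>Emergency Plumbing Call-Out Fees</h2>
</body>
```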


Structured data markup makes life easier for search engines as it itemises the information contained on the page. Formats such as RDFa, along with related markup like the author and publisher tags, are not controversial and have entered the realm of best practice. Use them.
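As an illustration only, authorship can be marked up with RDFa using the schema.org vocabulary (the headline, name and date here are placeholders):

```html
<article vocab="https://schema.org/" typeof="Article">
  <h1 property="headline">Emergency Plumbing in Brisbane</h1>
  <p>By <span property="author" typeof="Person">
    <span property="name">Jane Smith</span>
  </span></p>
  <meta property="datePublished" content="2014-02-10">
</article>
```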


Spam signals like keyword stuffing and hidden text are a particular risk for e-commerce websites. It can be tricky to avoid repeating the same word or phrase when it doubles as a category name or product description. Different shopping carts offer different levels of control – ensure your e-commerce platform is keeping up with best practices.


Hidden content is often an unintentional consequence of getting around a quirk of the content management system. The rule is simple: if it appears on the page, it should be in the HTML. If it does not appear on the page, it should not appear in the HTML.



You need to ensure search engines can crawl your website and all of its pages. If you do not want to disrupt the flow of PageRank through your site, exclude pages with a meta robots noindex, follow tag rather than robots.txt. Search engines budget crawl frequency and depth for good reasons: manage your crawl budget and use it well; don’t just leave everything up to chance.
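A sketch of the difference: the meta robots tag below keeps a page out of the index while still letting spiders follow its links, whereas a robots.txt Disallow stops crawlers reading the page at all.

```html
<head>
  <!-- Exclude this page from the index, but let spiders follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```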


Your website is unlikely to rank for content the search engines have already indexed elsewhere. Each page should be unique and offer something of value. The bottom line: prevent or eliminate as much duplicate content as possible, and for the rest, use canonical tags to tell Google which page you want indexed over the others.
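For example, a duplicate URL (say, one generated with tracking parameters) can point search engines at the preferred version; the href below is a placeholder:

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/">
```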


Most websites are not going to see an SEO benefit from increasing the speed of their website. Google has always said only a small fraction of sites are affected by this part of the ranking algorithm. Honestly, the best test of speed is to take your laptop to the local café and surf around your website. If you are not waiting for pages to load, you are probably okay. The exceptions (sites that should be concerned about speed) are large enterprise and e-commerce websites. If you run one of these, shaving a few milliseconds from load time may lower bounce rates and increase conversions or sales.


Best practices for URLs remain the same: make your URLs simple and easily readable. With today’s multi-tabbed browsers, people are more likely to see your URLs than they are your title tags. I will also add that, when seen in the search engine results pages, readable URLs are more likely to get clicked on than nonsensical ones.


Google and Bing agree that the ideal configuration is for websites to have a single set of URLs for all devices and to use responsive web design to present them accordingly. In reality, not all content management systems can handle this, and web designers have presented case studies of situations where the search engine standard is neither practical nor desirable. If you can execute what Google and Bing recommend, do so. However, if you cannot or have a good reason not to, be sure to use canonical tags that point to the most complete version of each page, probably your desktop version, and employ redirects based on browser platform or screen size. You will not risk a penalty from the search engines as long as your website treats all visitors equally and doesn’t make exceptions for search engine spiders. Basically, this is similar to automatically redirecting visitors based on their geographic location or language preference.
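On separate mobile URLs, the commonly documented pattern is a pair of annotations linking the two versions (the example.com URLs are placeholders):

```html
<!-- On the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the corresponding mobile page -->
<link rel="canonical" href="https://www.example.com/page">
```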


Similar to hidden content, cloaking is often an unintentional consequence of good intentions. The rule is simple: if a search engine can crawl the page, users should be able to access it. If it is coded to appear only to search engine bots, or users cannot navigate to the page, it should be fixed.

Keep Calm and Optimise

Off-page SEO Factors

Off-site SEO ranking factors boil down to quality, authority and trust.

Search engine algorithms have shifted away from easy-to-measure but less useful signals (like domain age) toward harder-to-measure signals that matter more (like visitor location).



Google is going to continue its trend of getting more discerning and aggressive with penalties for low-quality links: its link analysis keeps getting better, and the search and spam teams keep growing more confident that they will not unjustly target innocent websites. There is a lot you can do to encourage quality links without resorting to artificial means. Link-building tactics are merging with influencer marketing programs and becoming more networking oriented.

Diversity of links from a variety of sources is also important. If all of your links come from your own network or the same websites over and over, you could be in trouble. Actively promote your content, enough to grow a real audience; when you do, link diversity tends to take care of itself.

For websites that already carry a lot of low-quality links, engage in a link-cleaning program, because a future algorithm update may strike your site. Use Google’s disavow links tool for off-site link rehabilitation, and copiously log your link clean-up efforts. Should you be hit with a manual penalty in the future, this log can demonstrate that you have already made efforts toward rehabilitation and may speed up the reconsideration process.


The anchor text that links to your website is a signal to search engines that the linked page has information specific to the keywords in the anchor text. A concern arises when too many off-site links use the same anchor text. This can occur quite naturally when other sites link to your pages using the article title or title tag, and that is generally fine. The real concern is unnatural repetition of individual keywords or key phrases.


When it comes to links or domain authority or page authority, the old adage has always been “quantity and quality” – this will not change. If you are not earning new and better links at a faster rate than your keyword competitors, you’ll probably lose ranking battles.


Don’t buy links in hopes of better rankings. Just don’t do it.


Do not repeatedly link from the same group of websites. If it looks like spam, smells like spam and tastes like spam, Google will regard it as spam.



Trust has really started to evolve as a search engine ranking factor, or set of factors. Old signals like domain age are less important, partly because they were never that meaningful to begin with and partly because search engines can now put more faith in newer, better algorithmic signals. In addition to links from high-trust sites, trust is now more about things like brand recognition and author recognition. You can be certain that Google and Bing maintain a database of brands and an automated way to add new ones to the list. Brands are important and get a boost in the rankings – not because you know them, but because people write about them and link to them. One way to build author trust is to have a central author as the voice of your company blog: one person devoting their time to developing great content and promoting it on social media will go a lot further than round-robin contributions from everybody on your staff. Author trust is something to be developed, and as it grows, so does the trust given to all of that author’s past articles.


Domain age is an older signal: search engines look at how long ago a domain was first registered.


Can the owner of the website be verified by Google?


Keep your content management system up to date at all times. Most CMS updates include security patches to prevent takeovers and piracy; do not fall behind. If your server or website does get hacked or infected with malware, take it offline immediately and put up a 503 page. This lets the search engines know your site is temporarily offline and will return shortly. If the search engines have blocked your site to protect their users, do not go back online until you have solved the problem and filed a reconsideration request.



The reality of social media as a search engine ranking factor has not met the hype created by the search engines and optimisation professionals. To be sure, social media is a ranking factor and one that will continue to become more important. That said, social will not replace link authority any time soon, and it appears to be progressing slower than anticipated.


Social media metrics such as Facebook likes and shares or Twitter mentions and retweets have a high correlation with high rankings. However, as the search engine representatives like to remind us, correlation does not equal causation. Right now, this really is a case where popular websites and influencers are as likely to get links as they are social votes.



There are plenty of things to optimise for international results, such as proper use of subdomains or country-code top-level domains, tagging pages with language codes and registering geographic targets in Google Webmaster Tools. Do not ignore these. Factors like IP address and server location will continue to become less influential as search engines get better at measuring user-centric signals. This is one area in which social media will eventually play a major role. For example, if many people in Portugal have a company in their Google+ circles, that company may be more likely to appear in search engine results inside Portugal. That type of signal is a lot more meaningful than whether the server resides within Portugal or the website is written in Portuguese (a language used in several countries).
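Tagging pages with language codes is typically done with hreflang annotations; a minimal sketch for a site with Portuguese and English versions (the URLs are placeholders):

```html
<!-- Each version lists every alternate, including itself -->
<link rel="alternate" hreflang="pt-PT" href="https://www.example.com/pt/">
<link rel="alternate" hreflang="en" href="https://www.example.com/en/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```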


Start by registering your business in Google+ and Bing Places for Business. Then seek out reputable directories and local websites in the immediate community to add your citations. Once again it comes back to links: if you are getting links from sites related to the geographic locations you’re targeting, your website is more likely to break into local rankings for those places.


Like social media, personal history is a ranking factor that is slowly coming into its own. Right now, if you are logged into Google or use Chrome and visit a web document, that page or site is more likely to show up in future search results. If social media friends visit a page, that document or site is more likely to show up in your future results. Based on personal experience, this is pretty fluid and seems to be one of those things the search engines keep evolving. Going forward, it makes a lot of sense for search engines to give trust to webpages that lots of people visit, something they can evaluate by using the collected search history data stored in their databases.


It is important to understand the relationships that search engines have with social media sites and be active on those sites. For example Google owns Google+ while Bing has relationships with Twitter and Facebook. And of course, personalised results will continue to be influenced by social media connection. If a user has a connection to a person or brand, search engines will use social media connections to display relevant content.

01Alchemy is the unstoppable force of online evangelism and marketing – if you need a team who will work tirelessly (even manically?) toward a requirement – look no further. I can safely say they are in a league of their own.

Lucas Gunn

Alastair is a true pioneer and professional when it comes to online media and marketing. His depth of high level global experience, tenacity and sense of humour prove an excellent combination to achieve great results on any given project.

Yvette Adams

01Alchemy are industry experts, they keep up to date with all the latest developments in a fast moving environment. The team provides well structured advice and can organise and deliver successful campaigns on time and on budget.

Andrew Robinson

Alastair is an enigma of the digital world! As an absolute master of online marketing, he is a forward-thinking individual and a true inspiration. He taught me half of what I know without even realising it and I am extremely grateful!

James Congdon