Every year, the field of search engine optimization evolves. What began in the 1990s as a way of spamming meta tags has turned into a sophisticated science of ranking websites. One core aspect of modern SEO that hasn't changed in the last eight years, and that probably never will, is that references from other websites (backlinks) are the primary means by which a website is given its value.
Whilst referencing will always be the most important part of SEO, other factors heavily influence a website's ranking, such as the way the website is set up. Good use of title tags, heading tags, and a well-planned internal linking structure are widely known to be crucial ranking factors. All of this, however, is bread-and-butter knowledge that every SEO is aware of. But what about some of the more recent factors that influence a website's rankings?
In this article, we're going to cover some of the newest factors Google uses when determining a website's ranking, along with the SEO techniques for addressing them.
The speed with which a page loads is one of the newest factors used in determining whether a website should be displayed higher in the SERPs. The main motivation for this algorithmic change, it seems (according to Matt Cutts), is to create a better user experience: users shouldn't have to spend a long time waiting for a page to load before they can view it.
One common complaint within the webmaster community about this new ranking factor is that sites containing a lot of information will naturally take longer to load than sites with only a short snippet of information, so it doesn't seem fair to penalize websites for offering more. This issue isn't as serious as it may sound, however. Google states that less than 1% of search queries are influenced by page speed, and it's common knowledge within the SEO industry that a page with more information will typically hold more authority than a page with less. The ranking benefits gained from providing more information to users should therefore offset any small penalty that may arise from the page taking longer to load.
It nevertheless seems that the way forward lies in providing users with quick answers to their questions, and one signal of whether a page can answer a question quickly is how fast it loads. Reducing page load times will no doubt become an increasingly important SEO technique among webmasters. Google Webmaster Tools contains some useful statistics on the loading speed of your website; a good stand-alone open-source tool is Page Speed, which monitors the speed of individual pages so you can see how Google perceives their load times.
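If you just want a rough baseline before reaching for those tools, a minimal sketch like the following times how long a page's raw HTML takes to download. The URL is a placeholder, and note that this measures only the HTML response, not images, scripts, or rendering, which tools like Page Speed also account for:

```python
import time
import urllib.request

def measure_load_time(url: str) -> float:
    """Return the seconds taken to download a page's raw HTML.

    A crude baseline only: real page-speed tools also measure
    images, scripts, stylesheets, and render time.
    """
    start = time.time()
    with urllib.request.urlopen(url) as response:
        response.read()
    return time.time() - start

if __name__ == "__main__":
    # example.com is a placeholder; substitute your own pages.
    print(f"Loaded in {measure_load_time('https://example.com'):.2f}s")
```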
It may seem strange to suggest that a social media website that places nofollow tags on external links could indirectly enhance your rankings. However, as the Internet expands and more users run websites and blogs, the importance of having exposure on social media networks is increasing dramatically. We all know that a front-page story on Digg won't do much for SEO by itself, but as more webmasters use services such as Twitter, LinkedIn, and social bookmarking sites, gaining front-page exposure can create a large influx of natural inbound links from webmasters linking to your content.
For this reason, setting up a social media presence is more effective than ever before. Sure, social media traffic doesn't convert like search engine traffic, but social media shouldn't be used for conversions in the first place; it should be used to expose your content to other webmasters.
Setting up a generic, spammy social media account to obtain backlinks from the social media sites themselves is very much an old 2007 SEO technique; we all know those links aren't worth much as references. Modern SEO relies on setting up legitimate accounts on all of the major social networks, establishing a trusted community presence, and leveraging that presence to earn backlinks from other webmasters. For SEO purposes, it's crucial to understand that social media sites are a way to reach the webmasters within a community, not an end in themselves.
Manual reviews are becoming more common for websites competing in competitive niches. As Google continually increases the number of employees who manually review websites, the odds are that once you start receiving any meaningful quantity of monetizable traffic, you will face a manual review.
Some SEOs think that manual reviews are becoming rarer because the number of websites on the Internet is growing faster than the number of Google reviewers. Don't let this reasoning fool you. If Google were to monitor only the top 10 results for a set range of keywords, the number of websites in its index would in no way hamper its ability to maintain that monitoring: the moment you broke into the top 10, you could be flagged for review, regardless of how many other websites were in the niche. This isn't necessarily what happens; it's just an example.
Therefore, the more employees Google has conducting manual reviews, the more keyword terms and niches they can cover. Incoming links from external websites are one key factor that may be examined in a review. Whilst external links themselves can't penalize your website, if you have a wide range of high-PR incoming links that look illegitimate (i.e., spammed or paid links), you risk having the high algorithmic value those links pass devalued.
All forms of external link building are worthwhile strategies; what matters is that you conduct your link building in a manner that is less likely to raise red flags during a manual review.
Google states that it uses over 200 factors when ranking websites, and there's no reason not to believe it: any factor you can think of that might influence your rankings probably does in some way. It's been found, for example, that the age of a page relative to the recency of the links that appear on it is taken into account when determining how much authority to pass to external links on that page; this could serve as a means of filtering the value of blog comments and forum replies (a toy sketch of how such a discount might work follows below). The thing to understand, however, is that the core concepts of SEO haven't changed since Google's inception: references from authority websites still pass the greatest linking value. What you can do is take some of the latest SEO techniques covered in this article and add them to your overall link building strategy.
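To make that page-age-versus-link-recency idea concrete, here is a purely hypothetical sketch. The exponential form and the 180-day half-life are invented for illustration; Google's actual formula, if it even takes this shape, is unknown:

```python
from datetime import date

def link_weight(page_published: date, link_added: date) -> float:
    """Hypothetical discount on the authority a link passes, based on
    how long after the page's publication the link appeared.

    Purely illustrative: the half-life of 180 days and the exponential
    decay are invented for this sketch, not taken from Google.
    """
    days_late = max((link_added - page_published).days, 0)
    return 0.5 ** (days_late / 180)  # halve the weight every 180 days

# A link present from day one keeps full weight; a comment link
# added a year later would pass roughly a quarter of that value.
print(link_weight(date(2009, 1, 1), date(2009, 1, 1)))  # 1.0
print(link_weight(date(2009, 1, 1), date(2010, 1, 1)))  # ~0.24
```

Under a rule like this, a link placed when a post goes live would keep its full weight, whilst a comment or forum-signature link dropped onto an old page long after publication would pass far less, which is exactly the filtering effect described above.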