Performing online SEO intelligence can offer tremendous returns on investment. When you analyze what your competitors are doing, you are in a much better position than if you simply guessed at what you needed to do to rank well. Obviously we have to start with some goal or objective. For most businesses, the end result is increased conversions, which lead to sales and an improved bottom line.
Greater visibility is obviously a way to achieve this goal. So how do you increase your visibility in the search engines? In many cases other sites offer the same or better services than you do, which means you may need to take a hard look at your value proposition.
What sets you apart from the rest? When doing search engine optimization for businesses, I am often asked why a particular website ranks or performs better even when it is believed to have inferior services.
Well, the truth is that search engines evaluate websites in much the same way that humans evaluate individuals of high importance. The fact is that popularity plays a very important role.
However, because search engines rely on automated processes to determine which sites are more important for specific key phrases than others, there is always going to be a margin of error in those processes.
A machine cannot think for itself. It can only deduce and calculate based on what its programming allows it to do.
All of that to say that sometimes what is needed to improve your ranking online is competitive intelligence. It is, in effect, a war to beat out your competition. It is imperative that you understand WHO you are up against before you will be able to gain an advantage.
The steps to perform competitive intelligence online
Even though you can perform these tasks manually and with little up-front expense, it takes an inordinate amount of time to do well. The smart way is through the use of very specific tools. The problem is that the very best tools and the most detailed reports won't help your efforts if you don't know how to interpret the data and build a strategy around it.
Step 1: Establish your high value keywords – perform keyword research
Every good SEO campaign is guided by keyword research. Do the research, then determine which phrases are most important to your business. This involves running a baseline for specific groups or sets of keywords that have real conversion value to your business.
If you haven’t been mining for keywords, now is a good time to start.
However, if you already know which keywords convert well, then you're ahead of the game. Essentially, you begin by taking those keywords and searching for them using a keyword research tool like Wordtracker or the Google Keyword Tool (which is free). If you use Google's keyword tool, note that the total searches for the US are termed 'Local Monthly Searches'.
This is important to know because there can be quite a large difference between worldwide/global search volume and local. Your first group of seed keywords will appear at the top of the results, but take note of the suggestions further down as well. These often hide a treasure trove of low-competition keywords just waiting to be mined. Take a snapshot of each of the results that make it to the top 10.
Go through each keyword and rank each site in the order it appears in the SERPs. If you want to know how sites perform in both Google and Bing, repeat the process for each search engine.
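To make this baseline easy to compare against later, store it in a structured format. Here is a minimal sketch in Python; the keywords, domains, and positions are hypothetical placeholders for whatever you gather by hand or export from a rank-tracking tool.

```python
# Minimal sketch: record a ranking baseline as CSV. The rankings below
# are hypothetical; in practice they come from checking the SERPs
# manually or exporting from a rank tracker.
import csv

# keyword -> ordered list of domains as they appear in the top results
rankings = {
    "seo audit": ["competitor-a.com", "competitor-b.com", "example.com"],
    "backlink analysis": ["competitor-b.com", "example.com", "competitor-a.com"],
}

with open("baseline.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "position", "domain"])
    for keyword, domains in rankings.items():
        for position, domain in enumerate(domains, start=1):
            writer.writerow([keyword, position, domain])
```

Re-running this weekly gives you snapshots you can diff to watch movement over time.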
Step 2: Introspection – Examine your own weaknesses
Running an audit is imperative and should be the very next thing you do before attempting to rank well. Performing a website audit is at the heart of a well-rounded, holistic approach to a winning SEO strategy. You need to know which on-page factors are problematic first, as these affect the quality of your page, among other things.
The top four most important factors to examine when performing a website audit:
Crawlable Pages
Ensure that your pages are crawlable and indexed in Google. You should upload a sitemap through Google Webmaster Tools to aid this. Make sure there are no meta tags that can hinder the bots from finding your pages, such as noindex or nofollow tags, and do not use frames. If you are wondering why a page is not getting indexed in Google, first confirm that you have not been penalized. If that is not the case, then remedy it by creating a few dofollow backlinks to those pages from external sites, and also link to them with relevant anchor text from other internal pages with relatively good page rank. Ensure that these non-indexed pages pass the following three points as well.
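To spot-check a handful of pages for blocking meta tags, a short script saves time. This is a minimal sketch assuming Python with the requests and beautifulsoup4 packages installed; the URL list is a placeholder for your own pages.

```python
# Minimal sketch: flag pages whose robots meta tag contains "noindex".
# The URL list is a placeholder for your own pages.
import requests
from bs4 import BeautifulSoup

pages = [
    "http://www.example.com/",
    "http://www.example.com/services.html",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        print(f"{url} is blocked from indexing: {robots['content']}")
    else:
        print(f"{url} has no blocking robots meta tag")
```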
Valid code and fast loading pages
You should aim to maintain validly structured code at all times. Run each of your pages through the validator at http://validator.w3.org/ or, if you use Firefox, you can get the add-on and find the errors and validate your pages very easily.
The key is to make sure that if you are using a specific DOCTYPE, the code on your page adheres to it (this is probably getting into the weeds, but your web developer should be able to help).
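If you would rather validate pages in bulk than one at a time in the browser, the W3C checker also exposes a JSON interface. The endpoint and response fields below reflect my understanding of the Nu HTML Checker's API, so treat this as a sketch and verify against the current documentation.

```python
# Minimal sketch: ask the W3C Nu HTML Checker to validate a page and
# print any errors. Endpoint and field names are assumptions based on
# the checker's documented JSON output; verify before relying on them.
import requests

url_to_check = "http://www.example.com/"  # placeholder

resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": url_to_check, "out": "json"},
    headers={"User-Agent": "site-audit-sketch"},  # the service rejects blank agents
    timeout=30,
)
for message in resp.json().get("messages", []):
    if message.get("type") == "error":
        print(message.get("lastLine"), "-", message.get("message"))
```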
Google considers fast-loading pages an important metric for measuring a quality user experience. You should use Google's speed test tool to run a page speed test. It gives a lot of good information on how to improve the speed of your pages, which is also a great way to improve user experience.
Another free tool that is great for this is Pingdom’s website speed test. http://tools.pingdom.com/fpt/
It shows you the specifics of what is actually slowing down your page's load times. Often the causes are images that have not been optimized, compression not being used on servers that support it, CSS background images not being combined into sprites, and JavaScript libraries that have not been minified to use less bandwidth, among other things.
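For a quick first pass before reaching for those tools, you can measure a couple of these signals yourself. A minimal sketch assuming the requests package; the URL is a placeholder.

```python
# Minimal sketch: check response time, page weight, and whether the
# server compressed the response. The URL is a placeholder.
import requests

url = "http://www.example.com/"  # placeholder
resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"}, timeout=30)

print(f"Status: {resp.status_code}")
# elapsed covers request-to-first-byte, not the full body download
print(f"Time to first byte: {resp.elapsed.total_seconds():.2f}s")
print(f"Body size: {len(resp.content) / 1024:.1f} KB")
print(f"Content-Encoding: {resp.headers.get('Content-Encoding', 'none (uncompressed)')}")
```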
Internal page structure
Here we can evaluate things like duplicate title tags and titles that don't accurately reflect the contents of the page, which misleads both users and search engine spiders. Meta description tags should be carefully considered and also be very relevant to the page.
You could paraphrase some language from the page and make sure that it is compelling, because the description and the title are the text the user will see in the search engine results. Resist the use of excessively long titles and the use of duplicate titles or description tags. Avoid frames altogether. Keep your page size under 100 KB to improve load times. Link sensibly to other parts of your site using keyword-rich and compelling link text.
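Duplicate titles in particular are easy to catch programmatically. A minimal sketch; the page list is a placeholder, and on a real site you would feed in every URL from your sitemap.

```python
# Minimal sketch: collect <title> tags across pages and flag duplicates
# or missing titles. The URL list is a placeholder.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/services.html",
]

titles = defaultdict(list)
for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = soup.title.string.strip() if soup.title and soup.title.string else "(missing)"
    titles[text].append(url)

for title, urls in titles.items():
    if len(urls) > 1 or title == "(missing)":
        print(f"Problem title {title!r} on: {', '.join(urls)}")
```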
Link stats and broken links
Broken links are terrible. Fix 'em. They create a bad user experience, and you may be losing a potential page rank boost if pages with higher page rank are not transferring link juice to other internal pages. Pay attention to your use of dofollow and nofollow links. If you must link to external sites, keep the number low. Excessive linking to external websites can get your page flagged by the search engines and will sabotage your path to a higher page rank.
Be very selective about the pages you link to and who you allow to link to you – it's guilt by association. You may not always be able to control who links to you, but you can choose who you link to. If you must link externally several times on a page, try adding rel="nofollow" to your anchors to signal the search engine bots that they should not follow the link. Users will still be able to click it, so you're not deactivating the link.
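Broken-link checks are tedious by hand but trivial to script. Here is a minimal single-page sketch; a real crawler would walk the whole site, and the start URL is a placeholder.

```python
# Minimal sketch: test every link on one page for broken targets
# (HTTP 4xx/5xx or unreachable hosts). The start URL is a placeholder.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

start = "http://www.example.com/"  # placeholder
soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    target = urljoin(start, a["href"])
    if not target.startswith("http"):
        continue  # skip mailto:, javascript:, anchors, etc.
    try:
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == "unreachable" or status >= 400:
        print(f"Broken link: {target} -> {status}")
```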
Step 3: Baseline your competitors' current performance and evaluate results
Examining several competitors and where they rank for a particular keyword gives insight into what they value, and may reveal particular strengths or weaknesses of their sites. The reason some sites rank well depends on a number of factors, and knowing where your competition ranks and the phrases they rank for will make it easier when you're attempting to reverse engineer their strategy.
A competitive keyword ranking analysis captures where each competitor ranks for each phrase at a given moment in time. It is important to note that ranking is not what it used to be in years past: Google now ranks web pages by geographic relevance as well. If your business only deals with customers in a specific geographic location, then all your reporting needs to be focused on that region, otherwise your ranking results will be skewed.
The baseline is just the initial snapshot at a moment in time. It is from this point forward that you can ascertain with greater certainty which keywords your competitors place higher emphasis on.
Over time you will know which keywords are most valuable to them, based on performance in the SERPs for those phrases. Additionally, there are other factors to take into account, such as social media buzz and site improvements, as well as the data from your backlink analysis.
Step 4: Backlink Analysis – Intel and insight into the direction of your competition
This is arguably my favorite step because it reveals so much about my competitor that it really is integral to doing a proper analysis. It is also the most time consuming portion of intelligence gathering.
When gathering intel, you should really be using professional tools to speed up the process.
I have had really good results using Link-Assistant's SEO PowerSuite of tools. They have a free version of the software, and it is excellent.
A comprehensive list of the tools I use can be found in the checklist at the bottom of this article. There are some free web-based tools that give you cursory information, and that's fine; however, the more information you have on each link, the less you will have to rely on assumption.
I’m in the process of creating a video that will walk you through how I use this tool and the massive amounts of data one can mine from it. There is just so much information right at your fingertips.
The following factors should be considered during your analysis:
1. The backlink page being analyzed
The page that the link appears on or is sourced from.
2. Evaluate the dofollow and nofollow links
Links can be divided into two main categories: those that pass link juice, and those that don't. Dofollow links allow Google's bots to attribute value to the receiving site. Nofollow links are typically found on blogs or YouTube, for example, where spammers have abused links to their own sites to influence search engine rankings. Nofollow links, as the name suggests, are not followed by search engine spiders.
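Classifying a page's links this way is straightforward to script. A minimal sketch assuming requests and beautifulsoup4; the URL is a placeholder.

```python
# Minimal sketch: split a page's links into dofollow and nofollow
# buckets using the rel attribute. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/blog-post.html"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

dofollow, nofollow = [], []
for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []  # BeautifulSoup returns rel as a list of tokens
    (nofollow if "nofollow" in rel else dofollow).append(a["href"])

print(f"{len(dofollow)} dofollow links, {len(nofollow)} nofollow links")
```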
3. Examine the links
What type of link is it? Image links are more valuable when they have valid alt attributes that use meaningful text to describe the image link. Textual links should include the target keyword, or a partial phrase of it, as the link text. Excessive exact-match link text can be harmful; healthy, natural variation is better.
4. Page title of linking page
This is a huge factor that benefits the page being linked to, provided the page the link originates on is well optimized. For example: does the content in the body of the page deliver what the title promises? Is there a relevant connection between the anchor text and the title?
5. Anchor URL of the link – is there more deep linking or more home page linking?
It is better to steer clear of having too many inbound links coming to the home page of a site. Deep links to internal pages throughout the site are very important for increasing the overall value of the site. Since search engines evaluate sites on a page-by-page basis, internal pages with healthy inbound links will not suffer from obscurity, and this may reduce your site's bounce rate, because each internal page can focus on a specific subject and be optimized for a small number of keywords.
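Once you export a competitor's backlink targets from your tool of choice, the home-versus-deep split takes only a few lines to compute. A minimal sketch with a made-up sample list.

```python
# Minimal sketch: measure what share of backlinks point at deep pages
# rather than the home page. The target list is made up for illustration.
from urllib.parse import urlparse

backlink_targets = [
    "http://www.example.com/",
    "http://www.example.com/blog/seo-tips.html",
    "http://www.example.com/services.html",
    "http://www.example.com/",
]

deep = [u for u in backlink_targets if urlparse(u).path not in ("", "/")]
home = len(backlink_targets) - len(deep)
share = 100 * len(deep) / len(backlink_targets)
print(f"Home page links: {home}, deep links: {len(deep)} ({share:.0f}% deep)")
```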
6. Page rank of the linking page and domain authority
Page rank is harder to come by once you reach PR4. Links from higher page rank sites come with exponential increases in value. For example, a link from a PR3 site is several times more valuable than one from a PR1 or PR2 site.
7. Number of external links on linking page
The number of external links coming from a web page is important because the value that page can pass is divided among its outgoing links, so each additional link has diminishing returns. The fewer external links, the greater the value contribution to the receiving page. Search engines consider pages with many external links to be less discriminating in their linking practices, and as a result the value is diluted among the links.
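To gauge that dilution for a prospective linking page, you can simply count its external links. A minimal sketch; the URL is a placeholder.

```python
# Minimal sketch: count links on a page that point to other domains,
# since any value passed is split among them. The URL is a placeholder.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

url = "http://www.some-linking-site.com/resources.html"  # placeholder
host = urlparse(url).netloc
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

external = [
    a["href"] for a in soup.find_all("a", href=True)
    if a["href"].startswith("http") and urlparse(a["href"]).netloc != host
]
print(f"{len(external)} external links on this page")
```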
8. Indexing and link popularity
How many pages does the receiving site have indexed in Google, Yahoo, and other search engines? Shoot for having all your pages indexed. Google Webmaster Tools is handy for finding out how many pages Google is actually aware of.
9. Social mentions
While most SEOs will agree that social media links are great for building traffic to a site and generating buzz, many feel that because most of the links coming from those sites are nofollow, the SEO value is not worth it.
I would venture to say that a healthy SEO campaign should include its share of social media links, whether they are dofollow or not. The purpose behind a link campaign should not always be to acquire link juice. There is a growing belief that search engines are beginning to take social mentions more seriously and include them as a factor that may influence popularity more than ever before.
10. Domain age
An established domain has often gained higher page rank over time than a very young website. Link value from older, more established sites tends to be better and sometimes of higher quality.
11. Compete rank and traffic
Compete gives a nice estimate of how many visitors a site receives, but don't place too much stock in this data; it can be very skewed.
12. Alexa rank
The lower the number the better. This is not something that I would consider highly important, although some may disagree. It has more focus on traffic and visibility than anything authoritative.
13. DMOZ and Yahoo Dir listing
It is still valuable to be listed in directories, but don't overdo it. Inclusion in directories like DMOZ and the Yahoo Directory is great. DMOZ entries are evaluated by human moderators, which means their links carry higher credibility in Google and other search engines and thus will be of greater value. Go after a handful (maybe 4-6) of high-quality, established directory submissions rather than submitting to hundreds of directories.
A Word of caution when using automated tools
Some tools are inferior to others, and as the old adage goes, you get what you pay for. You must have them set up properly to avoid muddying your data. When a human searches for something online, they often pause and scan the page before moving on to the next one. Many tools are inadequate for massive query runs because they don't emulate human behavior when querying Google for results, and Google can tell that these queries are not coming from a human.
The result is that Google will block you even when performing normal searches; what you may notice is a CAPTCHA request. This can slow things down and be quite annoying.
The solution is actually quite simple. Superior-quality tools, like Rank Tracker by Link-Assistant, allow you to randomize the search frequency and emulate human behavior. Basically, you can set the interval in seconds between each time the software queries Google for results. This works well, but it can take a very long time to run your queries.
If you have a small keyword list, you should be fine. It's when you get into dozens or hundreds of keywords that things start to take excessively long. This can be remedied by using proxy servers. There are several very good services out there, and one that I have used before with good results is trustedproxies.com.
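The randomized-interval idea itself is simple to picture in code. A minimal sketch; the query function is a stub, and the keywords and delay range are placeholders.

```python
# Minimal sketch of randomized query pacing: pause a random number of
# seconds between checks so the pattern looks less machine-like.
import random
import time

keywords = ["seo audit", "backlink analysis", "keyword research"]  # placeholders

def check_ranking(keyword):
    # Stub: a real implementation would query the search engine here,
    # ideally through an API or a proxy rather than raw scraping.
    print(f"checking ranking for: {keyword}")

for kw in keywords:
    check_ranking(kw)
    time.sleep(random.uniform(20, 60))  # wait 20-60 seconds between queries
```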
What to do with all the information you've gathered
Now that you have all the information at your fingertips, you need to consider a plan of action. I am working on a video demo to clarify this process with a real-world example. Here is the list of tools I use when performing these audits and gathering competitive intelligence. These are free-to-use tools; however, as with most software, paying for extra features comes with obvious benefits.
1. Link Assistant’s SEO Powersuite of professional seo tools. You can get these individually or as a bundle
2. Rank Tracker – to track ranking in search engines
3. SEO SpyGlass – to perform backlink analysis
4. Website Auditor – performs in-depth audits on a site-wide or page-by-page basis.
5. LinkAssistant – helps you acquire links and search for sites to link from, or create relationships that will benefit your site. (Related: BuzzBundle – a new product that helps manage your social media posting automatically.)
6. Pingdom – Great tool to help diagnose loading time issues.
7. Google speed test – Run a speed test on your web page and get improvement suggestions.
8. Smallseotools – has a great quick free backlink checker and several other handy tools.
9. Google Keyword Tool – Indispensable. No one should be without this.
10. Ubersuggest – An awesome tool that offers keyword suggestions based on your seed words/phrases.
11. Website code validator – the W3C validator is indispensable for web developers.
I hope this article was helpful to you. Please like and comment!