How to Do a Site Analysis
As an SEO, it's important to see whether a site is SEO friendly or not, so we conduct a site audit (also called an initial site analysis) to learn the current status of the site.
Follow this procedure to conduct a site analysis:
Step #1: Install Screaming Frog on Your Computer
Screaming Frog is a free application that crawls your website. It goes through every single one of your pages and extracts data such as:
- Errors – client and server issues, such as 404 pages
- Redirects – any permanent or temporary redirects (301, 302)
- External links – all of the sites you link out to
- URL issues – dynamic URLs, uppercase characters, URLs that are too long, and underscores
- Duplicate pages – anything with duplicate content
- Page title tags – any missing, duplicate, long or even short titles
- Meta description tags – similar to title tags, it looks for anything missing, duplicate, long or short
- Meta keywords tag – the same checks as title and meta description tags… I personally don't look at this field, as most search engines ignore it.
- Headings – the types of headings you use (h1, h2, h3) as well as keyword usage, duplicates, and any missing ones
- Meta robots – what you are allowing or disallowing to be indexed, and whether you use the tag at all
- Rel canonical – in case you are pointing search engines to a different URL
- File size – the smaller your file sizes, the faster your load time
- Page depth level – how many levels deep search engines have to crawl to find all of your content
- Internal links – what pages you are linking to within your own website
- Anchor text – the link text you are using for both images and web pages
- Follow & nofollow – which of your links are being followed or not
- Images – the sizes of your images, the alt text length, or any missing alt texts
- Bot crawling – you can crawl your website as the Google, Bing or Yahoo bot… this helps you see what they see
Once you crawl your whole website with Screaming Frog, which shouldn't take more than a few minutes, you can export all of that data into an Excel spreadsheet to help you better analyze it.
Step #2: Check Indexed Pages
Do a site: search, i.e. type "site:www.abc.com" into Google.
The number of pages returned in the results is roughly the number of pages Google has indexed.
Is the homepage showing up as the first result?
If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site.
Step #3: URLs
When you look at your Screaming Frog report, you should see a list of all of your URLs. The way I analyze URLs is:
- Static – your URLs should be static. Dynamic URLs usually contain special characters like $, =, + and &, while static URLs typically contain only numbers, letters and dashes (see the example after this list).
- Length – although it isn't always possible, I try to keep URLs under 100 characters.
- User friendly – ideally your URLs should be easy to remember. Cut out dashes and slashes when you don't need them.
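For example (www.abc.com is just a placeholder domain), here is a dynamic URL followed by a static, user-friendly rewrite of it:
http://www.abc.com/products.php?id=123&cat=4
http://www.abc.com/products/blue-widgets/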
If you have URLs that don't fit the above criteria, you can create new URLs. When creating new ones, make sure you 301 redirect your old URLs to the new ones so that you don't lose the links that may be pointing to the old URLs.
Step #4: Title Tags
The big misconception about title tags is that Google measures them in character limits. Google actually measures title tags by pixels.
So, what I do is export my title tag data from Screaming Frog into Excel. I then change the font to Arial at size 12… as that's what Google uses.
Next, I set the Excel column width to 520 pixels, as that's Google's cutoff limit. Anything longer than 520 pixels is too long, and anything under 350 pixels is too short.
Here are the rough guidelines you should use for your title tags:
- Keep it roughly 60 to 70 characters in length.
- Be unique to that page (don’t use the same title tag on multiple pages).
- Use the keyword of that page twice if space permits (once at the start, followed by separator such as a colon, hyphen, or pipe, and then once again in a call to action).
- If relevant, include a geo-qualifier (such as Bangalore, Sarjapur).
Check: http://www.seomofo.com/snippet-optimizer.html
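Putting those guidelines together, a title tag could look something like this (the keyword "SEO audit" and the Bangalore location are placeholders used purely for illustration):
<title>SEO Audit Services in Bangalore | Get Your Free SEO Audit Today</title>
The keyword appears once at the start and once in the call to action, separated by a pipe, with the geo-qualifier in between.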
Step #5: Meta Descriptions
The biggest mistake we see companies making with their meta description tags is stuffing keywords into them. Although you should have keywords in your description, it should also read well. The more compelling your description is, the more likely people are to click on your search result.
Assuming you aren’t missing meta description tags and they aren’t duplicate, here are some guidelines for creating them:
- Make sure they are unique and relevant to that page.
- They should be written as descriptive ad text, with a call to action.
- No more than 300 characters in length including spaces and punctuation. (Recently Updated)
- It should contain 2-3 complete sentences with correct punctuation, and no more than 5 commas.
- Use the keyword once per sentence, as close to the start of each sentence as possible.
- Include a geo-qualifier such as “Bangalore, Sarjapur” if relevant.
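Following those guidelines, a meta description might look something like this (the wording, keyword and location are placeholders for illustration):
<meta name="description" content="Get an expert SEO audit in Bangalore for your website. Our SEO audit covers titles, content and links. Request your free SEO audit report today." />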
Step #6: Meta Keywords Tags (Now Ignored by Google)
If the tag is used, place the page's targeted keywords in it:
- Maximum: use 2 keywords and 2 key phrases.
- Check whether key phrases are being used at all.
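For reference, the tag itself looks like this (the keywords shown are placeholders, with 2 keywords and 2 key phrases):
<meta name="keywords" content="seo audit, site analysis, seo, audit" />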
Step #7: Headings
Similar to title tags, there is a big misconception about heading tags. Search engines typically treat the keywords in the largest font on a page as the most important ones. So stuffing keywords into heading tags and then making the font size really small may not have the benefit you expect.
Under typical HTML standards, h1 tags are the largest text on the page. For this reason, it is important to use headings with large fonts within each page.
Here are some guidelines to run your headings by:
- Every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page and should appear prominently.
- H1 tags should never contain images or logos, only text. The keyword of a page needs to be used in the H1 tag and in at least half of the total heading tags on a page, if more than one heading tag is present.
- From a usability perspective, paragraphs should never be longer than 5 lines of text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in the form of an H tag (H2 or H3) or an image. Testing has shown that when users are faced with a large block of unbroken text, most either skim over the text or skip it altogether, so content needs to be divided into usable chunks.
Make sure you go through all of the headings on your website so that they match the above requirements.
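A heading structure that follows these rules might be laid out like this (the topic and wording are placeholders):
<h1>SEO Audit Checklist</h1>
<p>Opening paragraph…</p>
<h2>Why an SEO Audit Matters</h2>
<p>Two to three short paragraphs…</p>
<h2>Running Your First SEO Audit</h2>
<p>Two to three short paragraphs…</p>
The keyword sits in the H1 and in both sub-headings, and the sub-headings break the text into usable chunks.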
Step #8: Content
You've heard the saying "Content is King". So you want to make sure there is enough content on each page. Screaming Frog doesn't do a great job of analyzing the content on each page of your website, but you can figure out where you have a good amount of content and where it is lacking just by browsing your website.
For the pages that are lacking, make sure you continually add more unique content. And don't just add content for the sake of it; the content actually has to add value.
If you are looking for a rule of thumb, Google tends to rank web pages with over 2,400 words of content the highest. You’ll probably have a tough time adding that much content to each of your pages, but it shows that Google loves ranking good content.
Worst case scenario, each of your web pages should have at least 300 words of content. Ideally each page should have at least 400 to 600 words of unique content.
Tool to check for copied content: www.copyscape.com
Step #9: Keyword Density (Less Important Now That LSI Has Come into the Picture)
Keyword density, in the most layman terms possible, is the percentage of times you use your keyword within a post. The percentage matters because it is part of how Google and other search engines read your content: Google scans the content on your website and, based on the most-used keywords and phrases, categorizes your website and then ranks it.
- One-word keyword: not more than 15%
- Two-word keyword: not more than 6%
- Three-word keyword: not more than 2%
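A quick worked example: if a 500-word post uses the two-word phrase "blue widgets" 10 times, its keyword density is (10 ÷ 500) × 100 = 2%, comfortably under the 6% guideline above.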
Step #10: Internal Linking
Both Webmaster Tools and Screaming Frog will give you data on internal links. The more you link within your own site, when relevant, the easier it will be for search engines to crawl your whole site.
Search engines typically don't want you to have more than 100 links on a page, so you ideally want to stay under that number. Sometimes you won't be able to, which is fine, but try to stay under the limit where you can.
As for anchor text, you should also look at all of the anchor text you use when linking internally. Avoid using keyword-rich anchor text all the time; naturally, you should also be using link text like "click here" or "learn more". Having a lot of keyword-rich anchor text may bring your rankings down a few slots.
Every page of your website should have at least two to three internal links, and you should only use keyword-rich anchor text 10-30% of the time for internal links.
Step #11: Image File Names and Alt Texts
For the benefit of search engines, code compliance, and visually impaired users, every image MUST have an ALT tag. The ALT tag should accurately describe the image and contain a keyword relevant to your website (but only if the keyword is relevant to the image as well).
Image file names should be descriptive words, not numbers or query strings. They should accurately describe the image, and, if relevant, should also use the keyword. If an image is used as a link, then the ALT tag functions in place of anchor text.
A linked image should follow this structure:
<a href="http://www.abc.com/"><img src="http://www.abc.com/images/keyword-rich-image-name.jpg" alt="Describe the image and use a keyword if relevant" /></a>
By ensuring that all images are properly named and tagged, you will not only increase the SEO value of those images, but you will increase the likelihood of receiving referral traffic from image search results.
Also, for code compliance reasons, all images should also specify a height and width in the image tag.
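For example, the dimensions go directly in the image tag (the 600×400 values are placeholders for your image's actual size):
<img src="http://www.abc.com/images/keyword-rich-image-name.jpg" alt="Describe the image and use a keyword if relevant" width="600" height="400" />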
Step #12: Robots.txt File
An easy way to keep pages out of the index is to use the robots.txt file, as well as adding noindex tags to the pages you don't want indexed.
The reason you want to remove pages from the index is because search engines don’t like it when you have thousands of mediocre pages with little to no content or duplicate content. The best way to solve this is to not have them on your website or stop those pages from being crawled.
With Google’s latest updates, sites with thousands of mediocre pages are easily being penalized. So, make sure you stop mediocre pages from being crawled.
Step #13: URL Redirects
Unless a redirect is truly temporary (such as for a time sensitive promotion), 302 redirects should never be used. 302 redirects don’t pass any link value and are essentially a dead end for SEO. In almost every scenario where a redirect is needed, a 301 redirect should be used.
Any page that changes URLs or is deleted needs a 301 permanent redirect to tell search engines and users that the page has moved or is gone. There should never be more than one URL path to a page.
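On an Apache server, for example, a 301 redirect can be added to the .htaccess file like this (a minimal sketch; the old and new paths are placeholders):
Redirect 301 /old-page.html http://www.abc.com/new-page/
Other platforms (Nginx, IIS, WordPress plugins, etc.) have their own equivalents for the same thing.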
Step #14: Duplicate Pages
Search engines really don’t like duplicate content, as it leads to a poor user experience and other content quality issues. If you have duplicate content, you need to do everything you can to eliminate it.
There are 4 main options for addressing duplicate content:
- Fix the URL structure of the site to eliminate accidental duplicate content coming from URL issues, per recommendations in the URL Redirects section and this section.
- Rewrite all duplicate text content to make it unique.
- 301 redirect the duplicate content to one canonical page/site, if it is in your control.
- Implement the rel="canonical" tag to identify the original source/root page to search engines.
Specify the canonical version of the URL using a tag in the head section of the page as follows:
<link rel="canonical" href="http://www.abc.com/" />
You can use the tag on pages within a single site (subdomains and subfolders are fine) or across domains (to say that content on your site is identical to content on another site). You can use relative or absolute links, but the search engines recommend absolute links.
Step #15: Broken Links
Because Google and other search engines crawl the web link-to-link, broken links can cause SEO problems for a website. When Google is crawling a site and hits a broken link, the crawler immediately leaves the site. If Google encounters too many broken links on a site, it may deem that the site offers a poor user experience, which can cause a reduced crawl rate/depth and both indexing and ranking problems.
Unfortunately, broken links can also happen due to someone outside of your site linking in incorrectly. While these types of broken links can’t be avoided, they can be easily fixed with a 301 redirect.
To avoid both user and search engine problems, you should routinely check Google and Bing Webmaster Tools for crawl errors and run a tool like XENU Link Sleuth on your site to make sure there are no crawlable broken links.
If broken links are found, you need to implement a 301 redirect per the guidelines in the URL Redirect section. You can also use your Google Webmaster Tools account to check for broken links that Google has found on your site.
Tool: XENU Link Sleuth
Step #16: Code Validation
Because there are so many programming languages, and so many ways to accomplish any one thing in each language, search engines rely on certain rules when they read the content of a website.
Having code that follows these rules helps minimize errors when search engines parse the code and separate it from the content of a page.
Search engines such as Google have openly stated that W3C standards are what they suggest you use to make the code easy to understand for them. I typically only test the homepage of the website because many issues can be easily fixed across the entire website using just its page templates.
Double-check your website with the W3C validator (http://validator.w3.org/) to see how you stack up against the competition.
Step #17: Page Load Speed
Google’s recommended page load speed is 1.4 seconds or less. So, if your website loads faster, you are usually fine. If it is slow, your rankings won’t be as high as they could be.
If you are wondering what your load speed is, use Pingdom’s free speed test tool.
Once you learn your load time, you can typically make it faster through browser caching, CSS sprites for images where possible, and reducing file sizes as much as possible for images that can't be sprited.
You can also reduce the total number of CSS and JavaScript files by combining them into fewer files and minimizing file sizes by using compression and minification where feasible.
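As one example, browser caching can be enabled on an Apache server through the .htaccess file, assuming the mod_expires module is available (the cache lifetimes below are just illustrative values):
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
</IfModule>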
You might also see benefits by using a content delivery network (CDN) for your images.
W3 Total Cache is an excellent WordPress plug-in that can help with page load speed issues, and a simple CDN can be set up via Amazon AWS for very little money.
Tools: Pingdom.com or https://developers.google.com/speed/pagespeed/insights/
Step #18: Check the Age of the Domain
Here are the basics of what we know:
- Domain age is a factor in determining Google rankings, and a part of our SEO.
- Sites are significantly devalued for the first few months after Google first discovers them. It is extremely challenging to rank well for competitive terms in those first few months. In fact, some SEO professionals simply won’t work with brand new domains.
- According to Google’s Matt Cutts, the difference between a domain that’s 6 months old and 12 months old is very small.
Watch this video : http://www.youtube.com/watch?v=-pnpg00FWJY
Tool to find domain age: http://who.is/
Step #19: IP Location of the Server
Google and other major search engines use the physical location of the server as a signal. This signal becomes even more important when a domain has a generic TLD (such as .com, .in, .net, .gov, etc.) and when its webmasters have not enabled the Geographic Targeting feature in the Google Webmaster Tools console.
At Web SEO Analytics we know the importance of server location first hand, since our datacenter and servers are located in India. Even though our company has absolutely no other connection to India (other than the hosting), our websites achieve extremely high rankings for highly competitive keywords on Google India. Moreover, when a user from India searches on Google for terms like "google", "youtube" or "facebook", he/she will see at least one relevant article from our blog on the first page! So trust us when we say that the location of the server can definitely be important.
Step #20: Favicon
- A favicon is not really a direct SEO factor as such; however, it creates a better user experience.
- A favicon can benefit your site in regards to your branding and identity.
- Where a favicon is most useful, imho, is when a page on your site has been bookmarked/added to favourites.
It helps end users identify your site in their list of bookmarked sites when they are trawling through or browsing their bookmarks, as you will stand out from sites that have no favicon.
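Adding one takes a single line in the head section of your pages (this assumes the icon file lives at the root of your site):
<link rel="shortcut icon" href="/favicon.ico" />
Most browsers will also pick up /favicon.ico automatically if it exists at the domain root.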
Step #21: Breadcrumbs
Breadcrumbs present an important opportunity to make your site more search engine optimized. This is because breadcrumbs:
- Enable an excellent and natural way to get keywords onto a page, which then helps with keyword density.
- Inform and define to search engines what the pages on the site are about. They achieve this through the internal links within the breadcrumb trail.
- Are given a lot of importance by Google; a few years ago, Google started incorporating breadcrumb trails into search results.
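A breadcrumb trail is just a row of internal links near the top of a page. Marking it up with schema.org microdata, as sketched below, is one way to help Google show the trail in search results (the URLs and page names are placeholders; Google's structured data documentation covers the exact format):
<ol itemscope itemtype="http://schema.org/BreadcrumbList">
<li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
<a itemprop="item" href="http://www.abc.com/"><span itemprop="name">Home</span></a>
<meta itemprop="position" content="1" />
</li>
<li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
<a itemprop="item" href="http://www.abc.com/services/"><span itemprop="name">Services</span></a>
<meta itemprop="position" content="2" />
</li>
</ol>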
Step #22: Sitemap
Through a sitemap you can help encourage the indexation of your website. You can create an HTML sitemap as well as an XML sitemap. Once you have your XML sitemap, you can submit it to Webmaster Tools, which will tell you how many of your submitted URLs are indexed.
It’s rare that 100% of the URLs you submit will be indexed… especially if you have a large website. But through internal linking, you can increase the likelihood of having all of your pages indexed.
To check for an XML sitemap: type www.domainname.com/sitemap.xml
To check for an HTML sitemap: look for a link in the footer
Tool to create a sitemap: http://www.xml-sitemaps.com
Step #23: Inbound Links
The more links that come to your website, the higher you will typically rank. But it's not just about the pure number; it's also about how relevant the incoming links are, what the anchor text of those links is, and how many unique root domains are linking to you.
Most importantly, you have to look at where those links are pointing. If they all go to your homepage, it won’t help you as much as it could if all of those links were spread to all of your internal pages.
Through Open Site Explorer you can get a great overview of your inbound links.
If you are trying to grow your inbound link profile, keep in mind that the very best links come from trusted domains (sites like the New York Times, Wall Street Journal, Wired, Inc., TechCrunch, Huffington Post, Wikipedia, etc.). The more links you can get from authoritative websites, the better. Guest blog posts and press mentions are a great way to get those links.
One of the things that Google looks at and factors into the algorithm is domain diversity. Essentially, the concept is that ten links from ten different domains would be more valuable as a ranking factor than ten links from one domain.
From an SEO perspective, you usually want to see a domain diversity of no less than 10% (i.e. 100 links from 10 domains), though higher is usually better. All other factors being equal, the site with the larger number of linking root domains would almost always rank higher. That said, in the case of extremely high quality sites, acceptable domain diversity could be as little as 2%.
Tools: http://www.opensiteexplorer.org/
Step #24: Authority and Trust
Similar to Google PageRank, there is a metric called Domain Authority. It ranks websites from 0 to 100. Anywhere from 40 to 70 is good, and anything above 70 is great.
Domain Authority: Typically, the higher your domain authority, the higher you will rank on search engines. The best way to increase it is to build links from as many unique domains as possible… ideally from ones that themselves have high domain authority. Check it at https://moz.com/researchtools/ose/
Just like in Step #23 above, you can build links through content marketing.
Step #25: Google PageRank
PageRank is a link analysis algorithm used by Google to help determine the relative importance of a website.
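For reference, the original formula published by Google's founders is PR(A) = (1 - d) + d × (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where T1…Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor usually set around 0.85. In plain terms, a page's PageRank is built from the PageRank of the pages linking to it, diluted by how many other links those pages carry.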
Every website is given a Google PageRank score between 0 and 10 on an exponential scale. The handful of PageRank 10 domains, including USA.gov, Twitter.com and Adobe Reader Download, have the highest volume of inbound links of any sites on the web. The top sites set the bar, so to speak, and the 10-point scale plummets exponentially down from there. Google.com and Facebook.com are PR 9. PageRank 5 websites have a good number of inbound links, PR 3 and 4 sites have a fair amount, and brand new websites without any inbound links pointing to them start at PageRank 0.
Since Google wants to return page one results that are high quality, relevant, and trustworthy, it may return web pages with better PageRank scores higher up in the SERPs, although PageRank is only one of many ranking factors taken into consideration. Because it is just one factor in the Google ranking algorithm, it's important to remember that a high PageRank does not guarantee high rankings – but it can significantly help.
Tool to check PageRank: http://www.prchecker.info/check_page_rank.php
Step #26: Competitor Analysis
Your competitors can be your greatest allies if you let them. Your goal is to get insight into the strategies that are working for others in your market so you can adopt them, improve on them, and gain an edge.
Why You Need To Do Competitive Analysis
Tell me, do you ever think about any of the following?
- Growing your traffic exponentially using solely the resources at your disposal
- Finding keywords with high search volume that haven’t been targeted by other SEOs in your space
- Finding the keywords that give your competitors the highest ROI
- Going after high-competition keywords and winning
Helpful tools for doing a site audit:
Xenu, Screaming Frog, SimilarWeb, SEOquake, SEMrush, MozBar
Conclusion
Now that you’ve done a complete analysis of your site, you can take the SEO audit template and plug in the information.
This will allow you to have a comprehensive report.