
Monday, 12 March 2012

Site Analysis Report Contents




Table of Contents
1. SEO Audit
1.1. Domain Information
1.2. Hosting Information
1.3. SEO Crawl
1.4. Canonical URL Check
1.5. Check for use of Flash/Frames/Ajax
1.6. Google Banned URL Check
1.7. Sitemap (XML & HTML)
1.8. Robots.txt
2. Website Analysis
2.1. Website Status on Search Engines
2.2. On Page Analysis
2.3. Ranking Analysis
3. Online Social Media Presence
4. Search Engine Optimization Recommendation
4.1. On page Recommendation
4.2. Off page Recommendation

1. SEO Audit

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a website from search engines via "natural" ("organic" or "algorithmic") search results.
SEO improves organic traffic to the website by raising its position in the search engine results pages (SERPs), and it also helps the crawler index all the pages of the website. SEO can be used to generate traffic through organic listings over the long term.
We have conducted an SEO audit for your website; the observations are given below. We have analyzed your home page, www.example.com.
1.1. Domain Information

In most cases the domain registration details and the hosting details differ, and both are needed to keep full control of the website. The domain record also shows the creation date, expiry date and age of the domain.
Sr.No | Details            | Domain Information
1     | URL                |
2     | Website Registrant |
3     | Created Date       |
4     | Expiry Date        |
5     | Domain Age         |
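Most of the fields above can be pulled from a WHOIS record. The sketch below is a rough, best-effort illustration: it shells out to the whois command-line tool (assumed to be installed) and picks out a few common labels; example.com is only a placeholder for the audited domain, and field names vary between registries.

import subprocess

def domain_info(domain: str) -> dict:
    """Run the system `whois` tool and pick out a few common fields.
    Field labels vary between registries, so this is best-effort only."""
    raw = subprocess.run(["whois", domain], capture_output=True, text=True).stdout
    wanted = ("registrar", "creation date", "registry expiry date",
              "registrant organization")
    info = {}
    for line in raw.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        if key.strip().lower() in wanted and value.strip():
            info.setdefault(key.strip(), value.strip())
    return info

if __name__ == "__main__":
    for field, value in domain_info("example.com").items():   # placeholder domain
        print(f"{field}: {value}")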

1.2. Hosting Information

This covers server information such as the IP address with and without www, the country/territory associated with the website, the server type and the registered address. Sometimes the IP address differs between the www and non-www versions of the website, which can lead to the site, and its content, being treated as duplicated.
Sr.No | Details              | Hosting Information
1     | IP Address           |
2     | Hosting Company Name |
3     | Hosting Company URL  |
4     | Server Location      |
5     | Server Type          |
6     | C Class IP Hosting   |
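A quick way to verify whether the www and non-www hostnames point at the same server is a DNS lookup on both. This is a minimal sketch using only Python's standard library, with example.com standing in for the audited site.

import socket

def resolve(host: str) -> str:
    """Return the IPv4 address a hostname resolves to."""
    return socket.gethostbyname(host)

domain = "example.com"               # placeholder for the audited site
www_ip = resolve("www." + domain)
bare_ip = resolve(domain)

print("www :", www_ip)
print("bare:", bare_ip)
print("Same server" if www_ip == bare_ip
      else "Different servers - check for duplicate hosting")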
 

1.3. SEO Crawl

The site is checked with a crawler simulator to verify that it can be crawled by search engine bots. If the web pages cannot be crawled, they will not be added to the search engine's index and will not appear in the results of search engines such as Google and Yahoo.
Result:
Recommendations: 
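A crude approximation of what a crawler sees is to fetch the raw HTML and list the links it could follow. The sketch below uses only the standard library (urllib and html.parser); the start URL is a placeholder, and a real audit would follow links across the whole site.

from html.parser import HTMLParser
from urllib.request import urlopen, Request

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags, roughly what a spider would follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

url = "http://www.example.com/"      # placeholder start page
req = Request(url, headers={"User-Agent": "seo-audit-sketch"})
html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")

parser = LinkCollector()
parser.feed(html)
print(f"{len(parser.links)} crawlable links found on {url}")
for link in parser.links[:20]:
    print(" ", link)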
1.4. Canonical URL Check

A canonical problem exists where a site can be reached both at the www version (http://www.yoursite.com) and the non-www version (http://yoursite.com). This creates a problem because the two can be seen as duplicate content by the search engines and one version can be removed from their indexes; the problem arises when the wrong version is deleted (this is usually the non-www version).
Result:
Recommendations: 
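One quick way to spot the problem described above is to request both versions and compare where they end up; ideally one should answer with a 301 redirect to the other. A minimal standard-library sketch, again with a placeholder domain:

from urllib.request import urlopen, Request

def final_destination(url: str):
    """Follow redirects and return (final URL, HTTP status)."""
    req = Request(url, headers={"User-Agent": "seo-audit-sketch"})
    with urlopen(req, timeout=10) as resp:
        return resp.geturl(), resp.status

for candidate in ("http://www.example.com/", "http://example.com/"):
    final_url, status = final_destination(candidate)
    print(f"{candidate} -> {final_url} ({status})")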

1.5. Check for use of Flash/Frames/Ajax

Search engines cannot crawl and index Flash, frames and Ajax. Excessive use of these techniques hampers the visibility and performance of the website in search engine results.
Result:
Recommendations: 
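A rough indication of how heavily a page leans on these technologies can be had by counting the tags that usually embed them (frame, iframe, and object/embed for Flash); Ajax-loaded content still needs manual inspection. A small sketch with a placeholder URL:

from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen, Request

SUSPECT_TAGS = {"frame", "frameset", "iframe", "object", "embed"}

class TagCounter(HTMLParser):
    """Count tags commonly used for frames and Flash embeds."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in SUSPECT_TAGS:
            self.counts[tag] += 1

url = "http://www.example.com/"      # placeholder page to inspect
req = Request(url, headers={"User-Agent": "seo-audit-sketch"})
html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")

counter = TagCounter()
counter.feed(html)
print(counter.counts or "No frame/Flash embed tags found")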

1.6. Google Banned URL check

This check looks to see whether a URL has been banned from Google. This can happen for a variety of reasons, and action needs to be taken if the site has been removed from the index.
Result:
Recommendations: 

1.7. Sitemap (XML & HTML)

A sitemap is essential for every website in order to keep the site well linked; each page should at least be linked from the sitemap. A good site infrastructure allows the search engine spiders to crawl the entire site and therefore ensures all pages of the site are present in the indexes of the major search engines.
Result:
Recommendations: 
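The XML sitemap can also be checked programmatically: fetch it and count the URLs it declares, then compare that figure with the number of pages actually indexed. A minimal sketch, assuming the sitemap lives at the conventional /sitemap.xml location on a placeholder domain:

import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap_url = "http://www.example.com/sitemap.xml"   # placeholder location
xml_data = urlopen(sitemap_url, timeout=10).read()

root = ET.fromstring(xml_data)
urls = [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

print(f"{len(urls)} URLs listed in {sitemap_url}")
for u in urls[:10]:
    print(" ", u)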
Robots.txt" is a regular text file that through its name, has special meaning to the majority of "honorable" robots on the web. By defining a few rules in this text file, you can instruct robots to not crawl and index certain files, directories within your site, or at all.             
Result:
Recommendations: 
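Python ships a robots.txt parser, so the rules can be verified to block or allow exactly what was intended. A small sketch using a placeholder site and placeholder paths:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.example.com/robots.txt")    # placeholder site
rp.read()

for path in ("/", "/private/", "/images/logo.png"):          # placeholder paths
    allowed = rp.can_fetch("Googlebot", "http://www.example.com" + path)
    print(f"{path}: {'allowed' if allowed else 'blocked'} for Googlebot")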
2. Website Analysis

Here we analyze the website's standing on the search engines: its presence, the number of pages indexed, the number of backlinks pointing to it, and the most recent spider crawl date.

2.1. Website Status on Search Engines

Sr.No | Details          | Google | Yahoo | Bing
1     | Website Presence |        |       |
2     | Indexed Pages    |        |       |
3     | Back Links       |        |       |
Alexa Rank & Page Rank             


Alexa Rank Page Rank




 
2.2. On Page Analysis

These are the most important factors to check before optimizing a website. Implementing them properly helps your website's keywords rank in the search engines.

Sr.No | Details        | On Page Analysis Information
1     | Title Tag      |
2     | Meta Tag       |
3     | Header Tags    |
4     | Bold Tags      |
5     | Alt Attributes |

Recommendations: 
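The on-page elements listed above can be extracted from a page's HTML for review. The sketch below pulls the title, meta description, heading tags and image alt attributes using only the standard library; the URL is a placeholder.

from html.parser import HTMLParser
from urllib.request import urlopen, Request

class OnPageAudit(HTMLParser):
    """Collect the title, meta description, h1-h6 text and img alt values."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []
        self.alts = []
        self._current = None         # tag whose text is being collected

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title" or tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self._current = tag
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""
        elif tag == "img":
            self.alts.append(attrs.get("alt") or "")

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current and data.strip():
            self.headings.append((self._current, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

url = "http://www.example.com/"      # placeholder page
req = Request(url, headers={"User-Agent": "seo-audit-sketch"})
audit = OnPageAudit()
audit.feed(urlopen(req, timeout=10).read().decode("utf-8", errors="replace"))

print("Title:", audit.title.strip())
print("Meta description:", audit.meta_description)
print("Headings:", audit.headings[:10])
print("Images missing alt text:", sum(1 for a in audit.alts if not a.strip()))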
2.3. Ranking Analysis

Keyword | Google.com Competition | Current rankings of www.example.com
        |                        |
        |                        |
        |                        |
Recommendations: 

3. Online Social Media Presence

Social media is taking precedence in the online world. It is favored by both people and crawlers: people spend a lot of time on sites such as Facebook, LinkedIn, Digg and Twitter, and crawlers/search engines like to follow people.
A website's presence on social media indicates that the business keeps pace with technology and gives the business more visibility. It helps people find the business on the internet and helps crawlers visit your site easily and regularly.
Result:
 Recommendations: 
4. Search Engine Optimization Recommendation

4.1. On page Recommendation

 Result: 
Recommendations: 
4.2. Off page Recommendation

Result: 
Recommendations: 
 
