Thursday, 26 December 2013

What is an SEO-Friendly URL structure?

First of all, let me start by saying that it is always better to call in an SEO manager early in the development stage, so that hard-to-implement tweaks don't have to be made afterwards.

Some content management systems bake poor URL structures right into their websites. Lax rules can be a culprit, for example, not encoding spaces or special characters.
From an SEO point of view, a site’s URL structure should be:

Straightforward: URLs with duplicate content should have canonical URLs specified for them; there should be no confusing redirects on the site, etc.

Meaningful: URL names should have keywords in them, not gibberish strings of numbers and punctuation marks.

With emphasis on the right URLs: SEO-wise, not all URLs on a site are of equal importance as a rule. Some even should be concealed from the search engines. At the same time, it is important to check that the pages that ought to be accessible to the search engines are actually open for crawling and indexing.

So, here is what one can do to achieve an SEO-friendly site URL structure:

1- Consolidate your www and the non-www domain versions

As a rule, there are two major versions of your domain indexed in the search engines: the www and the non-www version. These can be consolidated in more than one way, but I'll mention the most widely accepted practices.

Canonical issues, parameters that do not change page content, loose adherence to coding standards, or any number of other reasons can create duplicate content.
Options for dealing with duplicate content include:
  • Reconfigure the content management platform to generate one consistent URL for each page of content.
  • 301 redirect duplicate URLs to the correct version.
  • Add canonical tags to webpages that direct search engines to group duplicate content and combine their ranking signals.
  • Set your preferred domain in Google Webmaster Tools (Configuration >> Settings >> Preferred Domain), and configure URL parameters there to tell search engines to ignore any parameters that cause duplicate content.
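For Apache servers, the 301-redirect option above is often implemented with an .htaccess rewrite rule. This is only a sketch under the assumption of an Apache setup with mod_rewrite enabled; example.com is a placeholder for your own domain:

```apache
# 301 (permanent) redirect from the non-www domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```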

2- Avoid dynamic and relative URLs

Depending on your content management system, the URLs it generates may be “pretty”, like example.com/seo-friendly-urls, or “ugly”, like example.com/index.php?page=328&sid=2.

Search engines can handle either variant, but for several reasons it’s better to use static (prettier) URLs rather than dynamic (uglier) ones. Static URLs contain your keywords and are more user-friendly, since one can figure out what the page is about just by looking at the static URL’s name.
Besides, Google recommends using hyphens (-) instead of underscores (_) in URL names, since a phrase in which the words are connected using underscores is treated by Google as one single word, e.g. one_single_word is onesingleword to Google.
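These naming rules can be sketched as a small Python helper (a hypothetical function, not part of any particular CMS) that turns a page title into a lowercase, hyphen-separated slug:

```python
import re

def slugify(title):
    """Turn a page title into an SEO-friendly URL slug:
    lowercase, with words joined by hyphens (not underscores)."""
    slug = title.lower()
    # Replace every run of spaces, underscores, or punctuation with one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("One Single Word"))               # -> one-single-word
print(slugify("What is an SEO-Friendly URL?"))  # -> what-is-an-seo-friendly-url
```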

It’s also worth checking which other elements of your page should carry the same keywords as your URLs.
Besides, some web devs make use of relative URLs. The problem with relative URLs is that they are dependent on the context in which they occur. Once the context changes, the URL may not work. SEO-wise, it is better to use absolute URLs instead of relative ones, since the former are what search engines prefer.

Now, sometimes different parameters can be added to the URL for analytics tracking or other reasons (such as sid, utm, etc.) To make sure that these parameters don’t make the number of URLs with duplicate content grow over the top, you can do either of the following:
  • Ask Google to disregard certain URL parameters in Google Webmaster Tools in Configuration > URL Parameters.
  • See if your content management system allows you to consolidate URLs that carry additional parameters with their shorter counterparts.

3- Avoid Mixed Case
URLs are, in general, case-sensitive (with the exception of the machine name). Mixed-case URLs can therefore be a source of duplicate content: for example, /About-Us and /about-us are not the same URL.
The easiest way to deal with mixed case URLs is to have your website automatically rewrite all URLs to lower case. With this one change, you never have to worry if the search engines are dealing with it automatically or not.
Another great reason to rewrite all URLs to lower case is it will simplify any case sensitive SEO and analytics reports. That alone is pure gold.
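On Apache, one way to do this rewrite is with a lowercasing RewriteMap. A sketch under the assumption that you can edit the server or virtual-host configuration (the RewriteMap directive is not allowed in .htaccess):

```apache
# Define a lowercasing map (server/vhost config only)
RewriteMap lc int:tolower
RewriteEngine On
# If the requested URL contains any uppercase letter,
# 301-redirect to the all-lowercase form.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```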

4- Create an XML Sitemap

An XML Sitemap is not to be confused with the HTML sitemap. The former is for the search engines, while the latter is mostly designed for human users.
What is an XML Sitemap? In plain words, it’s a list of your site’s URLs that you submit to the search engines. This serves two purposes:
  1. This helps search engines find your site’s pages more easily;
  2. Search engines can use the Sitemap as a reference when choosing canonical URLs on your site.
The word “canonical” simply means “preferred” in this case. Picking a preferred (canonical) URL becomes necessary when search engines see duplicate pages on your site.
So, as they don’t want any duplicates in the search results, search engines use a special algorithm to identify duplicate pages and pick just one URL to represent the group in the search results. Other webpages just get filtered out.
Now, back to sitemaps … One of the criteria search engines may use to pick a canonical URL for the group of webpages is whether this URL is mentioned in the website’s Sitemap.
So, which webpages should be included in your sitemap: all of your site’s pages or not? In fact, for SEO reasons, it’s recommended to include only the webpages you’d like to show up in search.
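For reference, an XML Sitemap is just a small XML file. A minimal sketch (the URL and dates here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo-friendly-urls</loc>
    <lastmod>2013-12-26</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <!-- one <url> entry per page you want to show up in search -->
</urlset>
```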

5- Close off irrelevant pages with robots.txt

There may be pages on your site that should be concealed from the search engines. These could be your “Terms and conditions” page, pages with sensitive information, etc. It’s better not to let these get indexed, since they usually don’t contain your target keywords and only dilute the semantic whole of your site.

The robots.txt file contains instructions for the search engines as to which pages of your site should be ignored during the crawl. Note that robots.txt only blocks crawling; to reliably keep a page out of the search results, give it a noindex meta tag as well.

Sometimes, however, unsavvy webmasters use noindex on pages where it should not be used. Hence, whenever you start doing SEO for a site, it is important to make sure that no pages that should be ranking in search carry the noindex attribute.
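A minimal robots.txt sketch (the paths here are hypothetical examples; substitute your own):

```
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```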

Conclusion: Having an SEO-friendly URL structure on a site means having the URL structure that helps the site rank higher in the search results. While, from the point of view of web development, a particular site’s architecture may seem crystal-clear and error-free, for an SEO manager it could mean missing out on certain ranking opportunities.


Tuesday, 17 December 2013

How to Promote an Indian Domain in the USA?

Dear Friends,

One of my friends, Mr. Vishesh, asked me a very interesting question: "I have a new Indian domain with the extension '.in' and I need to promote it in the USA. Can I do this?"

My answer to my friend: "YESSS!! The domain is not a problem. Make sure you post quality content on your website; that's the best way to bring visitors from all over the world."

Now let me explain in detail; if you agree or disagree with me, please let me know via the comment box.
If you want a high SERP position in any country, like the USA, you can submit your site to local directories, local search engines, and local classified ad sites. Joining some local forums is also helpful for improving your SERP position.

For example, suppose your target country is the USA.

Do the following steps:

1) Do local publishing with valid local information, and include a local phone number too. If you don't have one, you can purchase a location-based phone number.

2) Do good link building, much of it international. Google USA is globally famous, but if your target is not the USA but another country, like France or Germany, then you have to publish location- and language-specific articles on location-based websites.

3) Focus most of your local work on the target city and country.

Then you can get a good response.

Here is a quick example:

- If you have a German-language website, then you need to do everything in German: articles, website publishing, local publishing, social bookmarking, etc. In other words, you need to promote it in German too, and your site must also have a German version.

Thursday, 5 December 2013

Difference Between Bad Directory and Good Directory Submission

Directories are not all created equal. Some are good and some are bad. I will discuss each kind here, so you can make better-informed decisions when preparing for a directory submission campaign. Let me start off with the bad and finish with the good.

The Bad Directory:
I would say most directories fall into this category. They’re not completely bad, but they have some characteristics that make them questionable or borderline. So, what makes a directory bad?

  • Database dumped directories. Some directory owners simply import a database of categories. This may save time, but it doesn’t solve the problem of duplicate content and it certainly doesn’t provide a unique experience for users. I would say this is a gray area because sometimes the categories might not be unique, but they may have unique listings.
  • Little to no editorial discretion. If a directory is approving every paid submission, then it is pretty obvious that the main objective of the directory is to make money. While no one thinks it’s a bad idea to make money, running a directory for the sole purpose of making money is an unsustainable business model.
  • Low or no PageRank directories. This isn’t always a bad thing because it takes time for directories to gain PR. However, if the directory is aged, but still has a low PR, it could be a sign that Google may have issues with it.
  • Non-unique descriptions for listings. If the site has a bunch of copied descriptions for many of its listings, it could create duplicate content issues. This is bad SEO not only for the directory, but also for all listings within it.

The Good Directory:
From my experience, I would say that only a handful of directories out there are worth submitting to. Probably 1 or 2%. So, what makes a directory good?

  • Nice and unique design. Although a unique design doesn’t make or break a directory, it does add a unique experience for users. Plus, having a unique design tells me that the directory owner is in it for the long-run and willing to invest money into their business.
  • Editorial discretion / human-edited reviews. If a directory rejects a lot of submissions, there’s a good chance that it has a pretty solid guideline for approvals. Human-edited listings allow the directory owner to maintain a level of quality for their directory. And in the end, the quality is what it’s about.
  • Clear site structure and layout. A directory that is SEO-Optimized with a clear navigational structure is good for both users and search engines.
  • High PageRank. New sites and subpages may take time to gain PR, so disregard if it is new. For aged directories, PageRank does mean something. It says that at least one or a few good sites are linking to it, and that Google is acknowledging it as being a page of importance.
  • Other features. Is it just a directory, or is the directory providing other features like tools, resources, and blog posts? Directories that offer more features are providing greater value for their users. This is a win-win situation for all.
When it comes to web directories, you need to look a little deeper than the surface. Sometimes there is more than meets the eye. Make sure to do some research on several directories before you begin submitting.

Sunday, 1 December 2013

Effects of Recent Google Update, Hummingbird

The first thing one should know is that Hummingbird is neither a replacement for earlier updates this year, like Penguin 2.0, Panda, or Encryption, nor an extension of them. Rather, all these updates are parts of a bigger idea from Google to make its search engine more useful, sophisticated, and user-friendly. This update has been long overdue since the last overhauling update from Google, which was Google Caffeine in 2010. The Hummingbird update will help Google understand the user's intent behind searches, so it gives results for what users are trying to find rather than displaying results that merely match the keywords.

Now Google search will be more like a conversation between two people. It will keep track of what the user searched for previously, try to link those searches, and give very specific results based on them. For example, if someone searches for the keyword "tea" and then for where it is grown, the search engine will link the second query with the earlier search for tea and give results accordingly. In addition, this update will also keep track of the user's location and other important details.
So, in what way will this update affect SEO?

Those who were following Google's guidelines for useful, informative, and unique web content will see an increase in their website traffic. On the other hand, those who were spamming the internet with all kinds of black hat SEO techniques banned by Google, in order to get higher rankings for their websites in search results, will see serious actions from Google. Their websites will be penalized or may be removed under Manual Spam Action.

Concluding the discussion: as long as people follow all the guidelines from Google for publishing unique and informative content on their websites, they need not worry about the ranking and the fate of their websites. In combination with earlier updates like Penguin 2.0, Panda, and Encryption, Hummingbird will only help to improve the strength of legitimate websites in search results.


Tuesday, 15 October 2013

Important Google Guidelines for a New Website

Following these guidelines will help Google find, index, and rank your site.
  • Design and content guidelines
  • Technical guidelines
  • Quality guidelines
When your site is ready:
  • Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase its coverage of your webpages.
  • Make sure all the sites that should know about your pages are aware your site is online.

Design and content guidelines:

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

  • Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

  • Keep the links on a given page to a reasonable number.

  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

  • Make sure that your <title> elements and ALT attributes are descriptive and accurate.

  • Check for broken links and correct HTML.

  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

  • Review our recommended best practices for images, video and rich snippets.
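The points above about descriptive <title> elements and ALT attributes can be sketched in markup like this (the file name and wording are made up for the example):

```html
<head>
  <!-- A descriptive, accurate title for the page -->
  <title>Handmade Leather Wallets - Example Store</title>
</head>
<body>
  <!-- Text the crawler can't read from the image goes in the ALT attribute -->
  <img src="brown-leather-wallet.jpg"
       alt="Hand-stitched brown leather wallet, front view" />
</body>
```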

Resource: Google Blog


Saturday, 7 September 2013

New Directory Submission List 2013 with PR

Here is the latest list of totally free directory submission sites.
If you have your own directory, please drop your link(s) in the comment box and we will add them to our list.


Wednesday, 4 September 2013

Google Keyword Tool Is Officially Dead, Replaced by Keyword Planner Tool

I’ve used the external Google Adwords Keyword Tool (GAKT) for many years to research keyword metrics before buying keyword domain names. Not only did the GAKT let me know the approximate number of monthly searches for specific keywords and terms, but it also suggested similar search terms. I know many domain investors used GAKT for research, and Google has killed it off.

When I visited GAKT this morning, I was sent to a page with the following message at the top “Keyword Planner has replaced Keyword Tool.” The page explained how the Keyword Planner tool is taking over for the Google Adwords Keyword Tool.

I learned about this change a few months ago when Google announced that it would be happening. The company didn’t set a date for GAKT’s demise, but apparently this happened very recently.

Prior to GAKT, I used the Overture Keyword Tool, and that tool was taken down by Yahoo several years ago. With that in mind, I am sure I will figure out how the Keyword Planner tool will be helpful to my research, and hopefully it will be an improvement over GAKT. Time will tell though.

Good bye GAKT, it was nice knowing you...!! :(

Friday, 26 July 2013

How Google+ Can Help with Rankings


Speaking of creating social profiles for links, if your goal is to dominate Google, then you should make sure you join Google+. Google’s own social network can help you rank better in search results for people you are connected with. For example, when I’m logged in to Google+ and I search for SEO, personalized results appear in my top five search results; they are based on who I am friends with and are marked by a little person icon. When I’m not logged into Google+, those two personalized results are not in the top five. One of them isn’t even on the first page.

It goes to show that personalized search results trump even local search results. Hence, if you want to get into the personalized search game, your goal is to do the following.
  • Create a Google+ personal profile and business page.
  • Recommend the website you want to rank well in search results by giving it a +1, sharing the website on your profile and/or page as a status update, and linking to it in your profile information under recommended websites.
  • Fill out your profile information completely to make others want to connect with you.
  • Check your settings and make sure that your profile information is public along with your status updates.
  • Share other interesting updates on your profile so it doesn’t look too self-promotional.
  • Start connecting with people who you want to see your website in search results. Use the search box on Google+ to find people to connect with and add them to your circles. Hopefully, most will add you back.
Essentially, the more popular you are on Google+, the more likely you are to influence personalized search results with those who are following you. Hence, take advantage of this social network to its fullest extent to reap the benefits.

Friday, 19 July 2013

How to recover from Google Panda and Penguin Penalties

Both the Panda and Penguin updates aim to reduce the impact of traditional SEO practices and instead reward quality content and its social impact, which Google measures through the social signals a website earns on various social networking sites. Recovering from Panda and Penguin penalties is not that hard, but it will take time. As these updates are rolled out periodically, you will have to wait for the next update to recover your website’s rankings.

To recover from a Google Panda penalty, all you need to do is remove duplicate content from your website. Check the keyword density in your posts and keep it below 2%. Get your over-optimized articles rewritten, and this time write them for your readers, not for search engines.
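As a rough sketch of how one might check keyword density, here is a hypothetical Python helper (it counts single-word keywords only, not phrases):

```python
def keyword_density(text, keyword):
    """Return the keyword's share of all words in the text, as a percentage."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

sample = "seo tips for seo beginners learning seo basics"
print(keyword_density(sample, "seo"))  # -> 37.5 (far above the 2% target)
```

By this measure, a 500-word post should use a given keyword at most about 10 times to stay under the 2% mark.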

To recover from a Google Penguin penalty, you need to get rid of all the spammy, poor-quality backlinks pointing to your website. It may not be possible to remove all the low-quality links, but you can still build some authority backlinks from similar-niche websites to give your site a chance to recover its previous search engine rankings. You can make use of guest posting on authority blogs to build up a good link profile. It has also been stated that if an inner page of your site has been hit by the Penguin penalty, you can 301-redirect that page and create a similar inner page.

You might also Read: Google Panda and Google Penguin Update Overview

Thursday, 4 July 2013

How to Increase Your Website Traffic

There are many creative ways to increase traffic to your website. Some will cost you money, and some won't. Below you'll find many legitimate ways (ranging from free to costly) to boost the number of visitors to your website.

Search engine optimization is about ways to get your website to the top of search engines:
This cannot be stressed enough. You can do this by generating useful content like articles and blogs, by social networking, and by developing multimedia. This sounds easy, but it’s a lot of hard work. Still, you can also strategically spread your content across different websites while keeping your target niche in mind. Be very careful not to over-link pages within your site; this attracts a new penalty from Google's most recent update.

Offer free, original, and quality content on your site:

This is the most effective means for increasing traffic to a website; offering people something that they cannot obtain elsewhere, or at least, not to the level of quality that you are offering it. Ways in which to ensure that your content is of higher quality than competitors or is unique include:
  • Creating content that is helpful and useful. Simply cobbling together information from another website will not generate traffic.
  • Keep it fresh. For repeat visits, it is crucial to provide regular updates to the website, especially in frequently viewed zones. Add fresh content every few days if possible; at a minimum, weekly.
  • Make sure your tags are in place and your links are not broken. You also need to submit a sitemap.xml file to Google to have your website show up in Google's search results. There are tools that inspect websites for missed tags and broken links and create a free sitemap.xml file, giving you results in minutes.
  • Outsource article writing. If you hate the thought of generating content yourself, or your team is not writing-savvy, consider outsourcing this end of the task.
  • Add video to your landing pages (VLP - Video Landing Page) that is informative and relevant to your site. Respected studies show that good video can improve conversions and page ranking more than most any other item you can add to your pages.
  • Use landing pages for fast fulfilment of your PPC advertisement.
  • Launch some contests. They'll draw you instant traffic. You can Google search for free sponsor programs and run such contests frequently.
Try to get more backlinks to your website:
Never copy and paste from another website - Google, Yahoo, MSN and other search engines are too smart for this nowadays and will detect copied and unoriginal content, sending you to the bottom of the pile.

Improve your search engine ranking: 

by focusing your content on keywords related to your topic. This is called search engine optimization and will help people find your website when they're searching the Web.

Exchange Links:
Trading links with other websites that are closely related to the subject of your website can bring you more website traffic.

Use Social Media: 

Post compelling content and you’ll soon build a loyal following. Follow and share with other users, who may reciprocate and follow you. Keeping up to date with social media and finding time to post can be tricky, but it’s well worth it. Let people talk and become part of a community and they’ll promote your content for you. This will save you a great deal of time, and they will share much further than via conventional strategies.

Thursday, 13 June 2013

Top Article Directories By Traffic, Pagerank

Article submission is an important part of the SEO framework. Every article submitted to the directories should carry an incoming link to the target website, so that in the end the purpose of search engine optimization is fulfilled. For an article to get positive feedback, it must contain proper keywords so that it appears easily in searches on engines like Google. Accordingly, links can be established to the parent website, which will then get the maximum number of traffic hits at any given point in time. Today, article and content writing has emerged as a successful and profitable business, and many companies are joining the queue. This SEO work is growing so rapidly that once it starts getting traffic, it needs to be worked on constantly.

“Quality content” is what is needed in your articles to attract more users to your website. But the big question is: how exactly does it work? In article submission, content related to your product is placed on various article directories, where users can read it and follow the link to your site. Once the articles are published and hyperlinks generated, search engines use their algorithms to work out the exact position of the website in the search results. You can increase the chances of your article getting syndicated by going through the article submission process over and over again.

Top 50 Article Directories by Traffic and PageRank:


Tuesday, 21 May 2013

How to see Who Views Your Facebook Profile the Most

Today I am going to teach you how to find out who views your Facebook profile, without using any 3rd-party websites or apps.

Let's get started.

Take note that this tutorial works with Google Chrome, so if you don't have it installed on your PC, download it first.

Here are the few steps that you need to follow -
Follow these steps and you can find out which people see your profile the most in numerical order.

Step 1- Open up Google Chrome and log in to Facebook. Right click anywhere on your profile and click "View page source" or simply press "ctrl + U" through keyboard.

Step 2- Press Ctrl+F and type "InitialChatFriendsList".
You will now see multiple profile ID numbers in quotation marks. The first one is the user who views your profile the most, the second ID number views your profile the second most, and so on.

See the example in the figure below:

Step 3- Type in the address bar ""
(in the above link, change PROFILE_ID to the first ID listed when you searched for "InitialChatFriendsList" in step 2)

You will then be redirected to the profile of the person who views your profile the most.

Friday, 3 May 2013

Seo Interview Questions & Answers

This post compiles SEO interview questions and answers. These are very helpful for an SEO job interview and will help you toward a successful and bright career.

Question: What are search engines?
Ans: Search engines are critical tools for finding specific and relevant information across the huge extent of the World Wide Web. Some major, commonly used search engines are Google, Bing, and Yahoo.


Question: Tell me something about Google?
Ans: Google is the world’s largest and most renowned search engine, with about 66.8% market share. It was introduced in 1998 by Stanford University students Sergey Brin and Larry Page. Its unique algorithmic ranking system is considered the key to its success. Apart from Google Mail, various worthy and useful tools are offered absolutely free, including Blogger, FeedBurner, YouTube, Google Plus, AdSense, Webmaster Tools, AdWords, Analytics, and many more.

Question: Define SEO?

Ans: SEO is the abbreviated form of “Search Engine Optimization”. It is the set of processes by which a website or web page is constructed or optimized to enhance its appearance and visibility at the top of SERPs (Search Engine Result Pages).

Question: Explain distinct types of SEO practice?
Ans: Primarily, two types of SEO are in practice: Off-Page SEO and On-Page SEO.

Off-Page SEO is the method of earning backlinks from other websites in order to enhance the ranking of the site. It includes various techniques such as blog posting, forum posting, article submission, press release submission, classified ads, and miscellaneous others.

On-Page SEO is the process of optimizing a website through on-site work such as writing content, titles, descriptions, Alt tags, and Meta tags, as well as ensuring that the web page’s code and design can be crawled and indexed by search engines properly.

Question: What are the different techniques used in Off-Page SEO?
Ans: There are lots of techniques used in Off-Page SEO work. The major techniques are:

    Directory Submission
    Social Bookmarking
    Blog Post
    Article Post
    Press Release Submission
    Forum Posting
    Yahoo Answer
    Blog Comment
    Deep link Directory Submission
    Regional Directory Submission, and so on.

Question: Define blog, article & press release?
Ans: A blog is information or discussion published on a website, consisting of distinct entries called posts. A blog is more individual than an article or press release: it is personal in both style and the ideas and information it contains, and it can be written the way you might talk to your readers. It is also called a web diary or online diary.

Articles are concerned with a specific topic or event and are oriented more towards presenting opinions, views, and ideas than plain information. Generally, an article is written by a third party or an expert in a specific field.

A press release is related to a specific action or event and can be republished by distinct mass-media outlets, including other websites. It should be simple, short, and professional, and convey a clear message or piece of information.

Question: What are Meta Tags?
Ans: HTML meta tags are tags of page data that sit between the opening and closing head tags of a document’s HTML code. In effect, they are hidden keywords sitting in the code: invisible to visitors, but visible to and readable by search engines.


<title>The title is not considered a meta tag, but it is required anyway</title>
<meta name="description" content="Write your description here" />
<meta name="keywords" content="Write your keyword here" />

Question: Difference between keyword & keyword phrase?
Ans: A keyword is basically a one-word term, whereas a keyword phrase is a combination of two or more words. It is very hard to rank highly for a one-word keyword unless it has little online competition, so targeting single keywords is not encouraged. To drive more traffic and achieve top rankings in the SERPs, it is recommended to target keyword phrases.

Question: What do you know about Black Hat SEO?
Ans: In order to attain high rankings in search engine result pages, websites use various methods and techniques, which fall into two categories.

Methods that are implemented in accordance with search engine guidelines are “White Hat SEO”; methods that are less acceptable, or that the guidelines instruct you to avoid, are “Black Hat SEO”.

Question: Can you tell me some Black Hat SEO techniques?
Ans: Some Black Hat SEO techniques are:

    Keyword Stuffing
    Doorway Pages or Gateway Pages
    Link Farming
    Hidden Text, etc.

Question: What is a spider?
Ans: A spider, also called a bot, crawler or robot, is a computer program that browses the World Wide Web in a methodical, orderly fashion. It automatically scans web pages and websites for updated content and downloads a copy to the search engine's data center for indexing.

Question: Name the bots (spiders) of the major search engines.
Ans: Google's bot/spider is called Googlebot, Yahoo's is Yahoo! Slurp, and Bing's is Bingbot.

Question: Can you differentiate 'nofollow' and 'dofollow' links?
Ans: A nofollow link is the opposite of a dofollow link. It carries a rel="nofollow" attribute that asks search engine bots not to follow the link or pass ranking credit through it, so it contributes no link equity to the target page. It is used when we wish to prevent a link from passing value.

A dofollow link is an ordinary hyperlink, with no nofollow attribute, that search engine crawlers follow and pass ranking signals through, which affects page rank. When we use or earn a dofollow link, it is counted by search engines such as Google, Bing and Yahoo! as a backlink for your website and can improve your site's ranking.
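For illustration, the two link types differ only in the rel attribute (the URL below is a placeholder):

```html
<!-- dofollow (the default): crawlers follow the link and pass ranking signals -->
<a href="http://www.example.com/">Example</a>

<!-- nofollow: crawlers are asked not to pass ranking credit through this link -->
<a href="http://www.example.com/" rel="nofollow">Example</a>
```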

Question: Define PageRank.
Ans: PageRank is a link-analysis algorithm, named after Larry Page and used by Google, that assigns a numerical value to each element of a hyperlinked set of documents such as the World Wide Web. The publicly visible toolbar value is a whole number from 0 to 10; decimals are not shown. A page's PageRank is calculated from its inbound links.

Question: Establish the difference between PR and SERP.
Ans: PR is PageRank, which is determined by quality inbound links from other websites or web pages to a page or site, and which indicates the importance of that site.

SERP stands for Search Engine Result Page: the page of ranked websites or web pages a search engine returns in response to a search query.

Question: What is Cache?
Ans: Caching is a process search engine crawlers perform at regular intervals: they scan each page on the World Wide Web, take a snapshot of it, and store it as a backup copy. Almost every search engine result includes a cached link for the site; clicking it shows the last cached version of that page rather than the current one. You can also view a page's cached version directly by prefixing the URL with "cache:".
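For instance, to view Google's cached copy of a page directly, you can search for (the domain is a placeholder):

```
cache:www.example.com
```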

Question: Define the Alt tag.
Ans: The alt attribute, commonly called the alt tag, is used in HTML and XHTML documents to define alternative text to be rendered when the element it applies to cannot be rendered. One great feature of alt text is that it can be read by a screen reader, software that lets a blind person hear it. It also provides alternative information for an image when, for some reason, a user cannot view it, such as on a slow connection or when there is an error in the src attribute.
For example, the HTML for an image might look something like this:

<img src="http://www.example/wp-content/uploads/2013/07/Alt%20tag.jpg" alt="You can define the alt text just below the image-title input box while uploading or editing an image." />

Question: What do you know about AdSense?
Ans: AdSense is a web program run by Google that enables publishers of content websites to serve text, image, video and rich-media advertisements automatically, targeted to the site's content and audience. These advertisements are supplied, maintained and sorted by Google itself, and they earn money on either a per-click or a per-impression basis.

Question: Can you define AdWords?
Ans: AdWords is Google's main advertising product; it places your ads on Google Search and on Google's partner websites. Its primary model is PPC (pay-per-click) advertising. Under the CPC (cost-per-click) sub-model, you bid the rate you are willing to pay, and you are charged only when a user clicks your advertisement. Another sub-model is CPM (cost per thousand impressions), where the advertiser pays the publisher a flat rate per thousand impressions. AdWords also supports site-targeted advertising with banner, text and rich-media ads. Moreover, your ads appear especially to people who are already looking for the kind of product you offer, and you can choose particular sites and geographical areas in which to show them.

Question: What is PPC?
Ans: PPC stands for Pay Per Click and is an advertising model used in campaigns run through Google. It is the primary model, with two pricing sub-models: CPC (cost-per-click), set through bidding, and CPM (cost per thousand impressions), set at a flat rate. Under CPC, the advertiser is charged only when a user clicks their advert.

Question: What are the aspects of SEO?
Ans: The main aspects of SEO fall into two classes: on-page SEO and off-page SEO.

On-page SEO includes meta tags, descriptions, keyword optimization, site structure and analysis, etc.

Off-page SEO covers keyword research, unique quality content, and link building through blog comments, blog posting, article submission, press releases, classified posting and forum posting.

Question: What do you know about RSS?
Ans: RSS stands for Really Simple Syndication and is used to publish frequently updated works such as news headlines and blog entries. An RSS document, also known as a web feed, feed or channel, contains summarized text along with metadata such as authorship and publishing dates.

RSS feeds give publishers flexibility by syndicating content automatically. The feed uses a standardized XML file format, so information published once can be read by many different programs. It also makes it easier for readers to get timely updates by letting them subscribe to feeds from their favorite sites.

Question: How would you define Alexa?
Ans: Alexa is a California-based subsidiary of Amazon.com, widely known for its website and browser toolbar. The Alexa toolbar collects browsing-behavior data and sends it to the Alexa website, where the data is analyzed and stored to create reports on a company's web traffic. Alexa also provides traffic data, global rankings and other information for websites.

Question: How can you achieve Google PageRank?
Ans: Generally, Google PageRank is based on inbound links, so the more backlinks you gather, the higher your PageRank will be. It is also influenced by the rank of the pages that link to you. Another thing to consider is that the older your website is, the more favorable and trusted it will be to Google. Google rewards websites that have lots of pages, plenty of incoming links and a healthy number of internal links to other pages within the site. For SEO projects, PageRank itself is relatively insignificant, but it gives a picture of the work required to earn inbound links.

Question: Why is the Title Tag of a website valuable?
Ans: Title tags are very important to our SEO efforts. It is highly recommended to give each page a unique title that says exactly what its content is about. The title tag is valuable because it is what appears in the search engine result listing and tells both the user and the search engine what the page is about.

Question: What is a sitemap, and how do an HTML sitemap and an XML sitemap differ?
Ans: A sitemap is a list of the web pages on a site, accessible to users or to crawlers. It may take any form used as a planning tool for a web page or web design, and it is typically arranged hierarchically. A sitemap helps search engine bots and users find the pages on a website; it makes the site more search-engine-friendly and increases the likelihood of frequent indexing.

An HTML sitemap can be placed directly in a web page, designed for users' convenience. An XML sitemap, by contrast, is useful only to search engine crawlers or spiders and is not visible to users; it sits in the root of the website.
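A minimal XML sitemap, following the sitemaps.org protocol, might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-12-26</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```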

For example:

Question: Suppose my home page has 10 images, and I want the Google spider to crawl only 5 of them while the remaining 5 stay hidden from the spider. What would I do?
Answer: I would make a separate folder in the root directory, put the 5 images I don't want the spider to crawl into it, and then disallow that folder in robots.txt.
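A sketch of the resulting robots.txt, assuming the folder is named /hidden-images/ (the folder name is a placeholder):

```
User-agent: *
Disallow: /hidden-images/
```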

Question: What is the difference between Indexing and Crawling?
Ans: 1) When search engines come to your website and read or discover its data, the process is called crawling.

2) For every page they crawl, search engines take a copy of the page and save it on their servers. This process is called indexing.

Question: Which four search engines account for over 90% of all general web search traffic?
Answer: Google, Yahoo!, MSN/Live and Ask

Question: Name the three most important elements in the head section of an HTML document that are used by search engines.

Answer: Title, meta description and meta robots are the big three. Meta keywords is another common answer, but it would rank a distant fourth.

Question: Briefly explain the PageRank algorithm.
Answer: In simple terms, Google uses the gross number of inbound links to a page to help determine how important the page is. This "PageRank" has little direct effect on actual search results but can make a difference to user behavior.

Question: What are XML sitemaps?
Answer: They are an additional tool to help search engines when they crawl a site. No sitemap is strictly required, and your pages will get indexed without one if you pay close attention to navigation within your site.

Question: Explain the various steps you would take to optimize a website.
1) Interview the website owner or webmaster to get a good grasp of the site's purpose and goals.

2) Perform a keyword analysis to find the best-performing keywords to use for the site and for its individual pages.

3) Analyze site content to determine the usage of relevant keywords and phrases.
This includes visible text as well as titles, META tags, and "alt" attributes.

4) Examine site navigation.
5) Determine whether robots.txt and a sitemap exist, and examine them for effectiveness.

6) Make recommendations for the changes needed for the site and for each individual page.

Question: If the company whose site you've been working on has decided to move all of its content to a new domain, what steps would you take?
Answer: I would set up a permanent (301) redirect from every page of the old site to its corresponding page on the new domain. Then I would request removal of the old content from the major search engines to avoid duplicate-content issues.
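On an Apache server, such a sitewide permanent redirect can be sketched in .htaccess using mod_rewrite (the domain names are placeholders):

```apache
# Permanently redirect every path on the old domain to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```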

Art of Search Engine. Copyright 2012 All Rights Reserved True SEO Services by SEO Technology