Dear Readers, welcome to SEO Interview Questions. These questions have been designed to acquaint you with the kind of questions you may encounter during a job interview on the subject of SEO. They are useful for campus placement tests and job interviews alike. In my experience, good interviewers rarely plan to ask any one particular question; however, questions like these frequently appear in the online technical tests and interviews of many IT companies.
SEO is an abbreviation for search engine optimisation: the process of designing your website so that it appears higher in search engine rankings. Beyond design, SEO involves fixing onsite issues, building links, and popularizing the website over social media networks. Once these processes are carried out, the targeted keywords earn higher positions in the search engines, which in turn brings traffic and sales.
The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. But the use of SEO to mean “search engine optimizer” is quite limited, so in that sense there is no real difference between SEO and search engine optimization.
Search engine optimization is the practice of maximizing the volume and quality of traffic to a website from search engines via ''natural'' or unpaid search results. The word natural is a qualifier here: SEO professionals follow Google's Webmaster Guidelines to optimize a website and build up its traffic naturally.
Organic search engine optimisation, also known as ''natural SEO'', aims to increase traffic from the search engines using techniques that may seem complex but do not rely on any deceptive methods to inflate rankings.
Black hat SEO refers to aggressive SEO strategies, techniques and tactics that focus only on the search engines rather than a human audience, and that usually violate Google's Webmaster Guidelines. Keyword stuffing, invisible text, doorway pages and sneaky redirects are all examples of black hat techniques. By contrast, an SEO campaign run according to Google's approved guidelines is called white hat SEO. While black hat methods can show big results in the short run, they attract penalties that can go as far as blacklisting, dropping in the SERPs, and removal from Google's index.
Ethical SEO refers to SEO strategies, techniques and tactics that focus on the human audience as opposed to the search engines, and that fully follow search engine rules and policies. The test question is: “Are you able to tell what you have done to achieve the rankings?” If the answer is yes, you are following white hat SEO. So we can say that when a webmaster follows white hat techniques, he is practising ethical SEO.
Spam is any practice used to gain an unfair advantage over the end user. Whenever you try to influence the end user's decision by disingenuous means, spam comes into play. Examples include content spam, email spam, link spam and comment spam. Spam creates more problems than success, which is why we avoid any kind of spamming.
The search engine crawlers, or bots, crawl the website, extract the required information and update the search engine's database. When a user searches, the engine looks up relevant documents, and if our site matches the user's requirements, it appears in the search engine listings.
A search engine is an information retrieval system designed to help find information stored on a computer system. The search results are usually presented in a list and are commonly known as hits.
Google, Yahoo, YouTube, Bing, Baidu, AOL, Ask.com etc. In terms of market share and partner network, Google, Yahoo and MSN/Bing are the three largest search engines.
A spider, also known as a robot or a crawler, is a program that follows, or "crawls", links throughout the Internet, grabbing content from sites and adding it to search engine indexes.
There are basically three ways to get your site listed in a search engine:
- Submit your site directly to the search engine using a free submit form.
- Let the search engine find your site through links to your site from other sites, such as directories.
- Pay the search engine to index your sites.
There are tools such as IBP that can submit a site to hundreds of search engines in a matter of hours. But note that once a website is crawled by Google, Yahoo and MSN, it will indirectly appear in the search results of other search engines as well. The purpose of submitting a website to tier-two or tier-three search engines is to get crawled from multiple sources.
As explained in the last point, a website benefits from being crawled from different sources. If your website is frequently crawled and indexed by different search engines, this helps build its SERP positions and indirectly attracts traffic to your website.
One way is to use "site:domain-name" to search for your site. This works with Google, Yahoo and Microsoft Live.
Search engines take as many as 80 factors into account when evaluating page strength, but a few play a pivotal role while the rest add up incrementally. For example, average meta strength, domain power, quantity and quality of backlinks, brand value across social media circles, PageRank, QDF (query deserves freshness), page-type tendency (e.g. informational), page loading time and sitemaps are among the most important factors for achieving top-10 listings.
The difference is that free search engine listings are the organic listings earned through the SEO process, while paid search engine listings are those obtained through the PPC process.
We target programs such as Google AdWords, Yahoo Search Marketing and MSN adCenter. Beyond those, Miva is also used.
Google tries to return results based on your past search behaviour, the average time you spend on websites, your search preferences and so on. If this is done well, searching becomes more interesting and effective. It is a fact that if personalized search is fully implemented, it will directly affect the SEO process: the results different people see may vary extensively, which makes tracking SEO success more difficult and challenging.
It refers to optimisation of a website belonging or catering to a particular geographical area. If your business targets a particular state, county or city, local search engine optimization has a major role to play. In local SEO we create listings with local search engines and yellow pages, and try to exchange traffic with local sites.
Google indexing is an automatic process. The checklist for Google indexing is as follows:
If it is a new website, submit it via the Google Add URL page.
Create listings on hub pages that internet communities consider authoritative.
Check robots.txt and the meta robots tag to see whether a noindex directive has been added.
Check the server response codes and rectify any issue immediately.
Though indexing is automatic, we need to follow the above steps. Once done, your site can take from a few hours to a few days to be indexed.
By putting <changefreq>your change frequency</changefreq> in the XML sitemap. You can also set the crawl rate in your Google Webmaster Tools account. But the fact is that fixing crawl rates alone won't get your website crawled frequently; you need to offer the spider fresh content to be indexed frequently.
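For illustration, a minimal sitemap entry using <changefreq> might look like this (the URL and values shown are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```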
a) It has not been indexed by the search engines.
b) There may be canonical issues.
c) The site may have been banned by the search engines.
The following situations indicate a ban:
a) Your domain does not appear even in a search for the domain name.
b) After a long time, your domain is still not appearing in the search engines.
c) Your server logs register no visits from the search engines.
d) You lose your keyword positions and visits dramatically within a few days.
In order to lift a ban you must know the reason for the ban. Once the reason is identified, do the following:
Rectify whatever caused the ban.
Make content changes across the website.
If possible, change the look and feel.
Build up links from highly authoritative sites.
SERP (search engine results page) is the listing of web pages returned by a search engine in response to a keyword query. The results normally include a list of web pages with titles, a link to each page, and a short description showing where the keywords matched content within the page.
Search engines are independent entities, and no one has a special connection with them.
There is no such thing as a “preferential program”.
Search engine programs were not originally designed to read dynamic content. However, they are constantly improving, and they can now read some categories of dynamic URLs, though not all.
No, search engines cannot read the content of images directly. That is why we do image optimization.
Yes, Google can now follow Flash. After Adobe made some inroads, search engines can read the links and applications inside Flash. Note that a dynamic application file converts the Flash into a form readable by search engines, so the engines are effectively reading the Flash site indirectly.
Evidence suggests that search engines are still working out how to handle JavaScript, session IDs, cookies, etc.
A word or phrase used when searching for information is known as a keyword. These are the words or phrases webmasters use when describing or explaining the content of their website.
Our process starts with collecting seed phrases related to the client's business. We then identify the various combinations possible from those seed phrases, with the help of the keyword aggregation tool in Google AdWords. Once we have a list of keywords, we cross-check their traffic-building capacity and competition. After these factors are determined, we remove all irrelevant keywords in an elimination round. This is the standard process of compiling a keyword list.
The entire SEO process starts with selecting the best set of keywords. Since keyword phrases are the primary connector between a business and its customers, we have to select sales-driven keywords with moderate competition.
When targeting keyword phrases, it is advisable to consider plural phrases, since a plural often contains the singular; for example, “businesses” contains the word “business”. A search engine which treats the plural keyword as different from the singular may not do so in the future, and vice versa. In any case, you are always better off optimizing your site for both the plural and singular versions of your keywords, assuming both versions are popular.
Monthly search volume is the number of times users search for a particular keyword during a month; dividing the global monthly search volume by the number of days gives the average daily volume. This figure is given by the Google AdWords external keyword tool.
Broad match helps you attract more traffic to your website and is user friendly, but exact (narrow) match is usually more helpful to SEO practitioners.
The keyword tools we use are the Google AdWords external tool, Wordtracker and Keyword Discovery.
The keyword research file has two tabs. The first tab covers the overall research data; the second covers all the keywords we propose to target. You need to go through the final keyword list and select “x” keywords for SEO. For a better understanding of the research, I suggest you go through tab 1 before finalizing tab 2.
For the final list of keywords, check that the global search volume is moderate to high and that the SEO competition is low. If the monthly search volume is less than 500, the keyword should generally not be taken into consideration.
General competition represents the total generic competition existing for a keyword phrase. Consider the keyword “internet marketing”: a broad search returns every document featuring either “internet” or “marketing”, whereas an exact search for “internet marketing” returns only documents concerning internet marketing. As a performance metric we compare these numbers so that keyword decision-making becomes easier.
Clicks indicate the number of times users click on an ad, while visits indicate the number of unique sessions those clicks create.
As I said, keyword research is the most important part of SEO. By looking at your PPC keywords we get an idea of which keywords already generate traffic, which makes our decision-making more effective.
No problem. You can best determine the keywords, since you know your business better than we do. The keywords we provided are suggestions that we feel would bring traffic under the plan you have chosen. If you have concerns, please amend the list and we will start working on the amended version; unless you are happy with the suggestions, we won't begin work. I will wait for your next update on this.
Ideally the minimum monthly search volume should be 500 or more, though this may vary for geographically targeted or long-tail keywords.
By selecting key phrases with the expressed intent of targeting a specific consumer base, the business increases the probability that visitors to the website are actual interested buyers rather than web window-shoppers. Moreover, selecting keywords closest in relevance to the content of the website better funnels visitors into those who yield profits. This selection also pleases the search engines, because they are happy when their visitors are happy.
Yes, we have a solid process for researching your competitors. Using this process and some recognized tools, we can easily determine the keywords your competitors are targeting.
Visitors rarely type in one word and hope for the best. Most searches are refined using a number of words, identifying the actual content required more precisely.
People rarely search on one word alone. The public are much more used to search engines now, and the days when they entered one generic keyword and hoped for the best are gone. Single-word searches tend to be informational in nature rather than conversion oriented. For branding purposes and maximum visits, however, you can target such keywords as part of a long-term plan.
Long-tail keywords are phrases that have lower search volume but a higher conversion rate. Yes, they help in SEO: in both the short and the long run we should maintain an optimized blend of short-tail and long-tail keywords to get the best returns.
Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page. It is calculated as:
(occurrences of the targeted keyword / total number of words) * 100
Note that some practitioners include the page code as well as the visible content when doing the calculation.
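The formula above can be sketched in Python. This is a naive illustration (the sample text and keyword are made up, and real tools tokenize pages more carefully):

```python
import re

def keyword_density(page_text, keyword):
    """Percentage of the page's words accounted for by the keyword.

    Splits on non-word characters and counts non-overlapping
    occurrences of the (possibly multi-word) keyword.
    """
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    hits = 0
    i = 0
    while i <= len(words) - len(kw_words):
        if words[i:i + len(kw_words)] == kw_words:
            hits += 1
            i += len(kw_words)  # non-overlapping count
        else:
            i += 1
    return 100.0 * hits * len(kw_words) / len(words)

text = "SEO tips: good SEO starts with content, and SEO ends with links."
print(round(keyword_density(text, "seo"), 1))  # -> 25.0 (3 of 12 words)
```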
There is no single correct answer to this question. The SEO professional must use judgment when tuning keyword density. Though it is suggested to use the keyword in full, we must not ignore partial keywords and other semantically related keywords while making the calculation.
No, you should not jumble the keyword. Reordering creates a new keyword that may not receive the searches we expect, so this practice should be avoided at any cost.
Should the keyword density check cover the code area? Yes: when counting keyword repetitions, the code area should be taken into consideration.
LSI means Latent Semantic Indexing. This technique is owned and used by Google. The search engine tries to associate terms with concepts while indexing web pages. For example, “Paris” and “Hilton” together are associated with a person rather than a city and a hotel, and “Tiger” and “Woods” with golf.
Google has been using this concept to determine suitable ads for its AdSense service for some time now, and it seems Google is also using it to improve the quality of its search results. If Google uses this concept in its ranking algorithm (which is very likely, though conclusive evidence has not been found), then it is advisable not to focus on a single keyword but on a set of related keywords in your search engine optimization activities.
Keyword Stemming is a useful tool for web pages and search engines. The process of keyword stemming involves taking a basic but popular keyword pertaining to a particular website and adding a prefix, suffix, or pluralization to make the keyword into a new word. This particular process allows a website to expand upon the number of variable options, which can help a website get more traffic. Words that are a product of keyword stemming can expand in either direction, or even add words to the phrase, making the possibilities limitless.
Keyword density: the number of times a keyword or keyword phrase appears on a web page, in proportion to the other words on the page.
Keyword proximity: the distance between keywords and the phrases containing them, measured without repetition and with contextual relevance. In short, it measures how close two keywords appear to each other.
Keyword relevancy: simply put, your keywords must be relevant to the content, or "theme", of your web page. Choosing keywords that match the content of the page is important to success with the search engines. Be specific when selecting keywords, as that gives a better chance of attracting targeted traffic.
Search engines are a critical element for finding specific and relevant information across the vast extent of the World Wide Web. Some major commonly used search engines are Google, Yahoo and Bing.
Google is the world’s largest and most renowned search engine, with about 66.8% market share. It was introduced in 1998 by Stanford University students Sergey Brin and Larry Page. Its unique algorithmic ranking system is considered the key to its success. Apart from Gmail, Google offers various worthwhile tools absolutely free, including Blogger, FeedBurner, YouTube, Google Plus, AdSense, Webmaster Tools, AdWords, Analytics and many more.
If we want to block some content from the search engines, we use the robots.txt file. It must be uploaded to the site root, and it is the first file a search engine crawler requests.
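A small example robots.txt is sketched below (the paths and sitemap URL are placeholders for illustration):

```text
# robots.txt, served from http://www.example.com/robots.txt
User-agent: *
Disallow: /private/
Disallow: /tmp/
Sitemap: http://www.example.com/sitemap.xml
```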
Primarily two types of SEO are in practice: off-page SEO and on-page SEO.
Off-page SEO is the method of earning backlinks from other websites in order to enhance the ranking of a site. It includes activities such as blog posting, forum participation, article submission, press release submission, classified listings and more.
On-page SEO is the process of optimizing the website itself. It includes on-site work such as writing content, titles, descriptions, alt tags and meta tags, as well as ensuring that the page's code and design can be crawled and indexed by search engines properly.
There are many techniques used in off-page SEO work. Major ones include:
Press release submission
Deep-link directory submission
Regional directory submission, and so on.
A blog is information or discussion published on a website, consisting of discrete entries called posts. A blog is more individual than an article or press release: it is personal in both style and the ideas and information it contains, and can be written the way you would talk to your readers. It is also called a web diary or online diary.
Articles are concerned with a specific topic or event and are oriented more towards opinion than raw information. An article is supposed to present opinions, views and ideas, and is generally written by a third party or an expert in a specific field.
A press release relates to a specific action or event and can be republished by various mass-media outlets, including other websites. It should be simple, short and professional, and convey a clear message or piece of information.
HTML meta tags are tags of page data that sit between the opening and closing head tags of a document's HTML code. They are essentially keywords and descriptions hidden in the code: invisible to visitors but visible to and readable by the search engines.
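A minimal head section with common meta tags might look like this (the site name and wording are placeholders):

```html
<head>
  <title>Example Widgets - Buy Widgets Online</title>
  <meta name="description" content="Short summary that search engines may show in results.">
  <meta name="keywords" content="widgets, buy widgets">
</head>
```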
A keyword term is basically a one-word term, whereas a keyword phrase is a combination of two or more words. It is very hard to rank highly for a one-word term unless it has little online competition, so that practice is discouraged. To drive more traffic and achieve top rankings in the SERPs, it is recommended to target keyword phrases.
To attain high rankings in the search engine results pages, websites use various methods and techniques, which fall into two categories. Methods implemented in accordance with search engine guidelines are white hat SEO; methods that the guidelines discourage or forbid are black hat SEO.
Some black hat SEO techniques are: keyword stuffing, cloaking, doorway (gateway) pages, link farming, hidden text, etc.
I would troubleshoot the issue. First, I would determine whether it is a new project. Then I would analyze the relevant keywords and phrases for the site I am optimizing, and carry out a competitive analysis. If the website's pages have been indexed and appear within the first ten pages of the search results but not in the top three, I would make changes to the on-page text, page titles and page descriptions. If the website has not been indexed at all, or has been dropped from the index, it may have bigger issues, and complete re-submission and re-work might be required.
I hope this post will be highly beneficial for those who are starting their careers as SEO professionals, especially freshers. I have also listed some useful SEO interview questions for experienced professionals that will help them grab their dream job and add sparkle to their careers.
Competitive analysis, as the name suggests, is a comparison between the website I am optimizing and the websites that rank highly in the search results. It is a great source of ideas.
First I would search all the major search engines using the relevant keywords and key phrases I am optimizing for. Analyzing the results shows whether the optimization methods have gained or lost ground. I would review the reports regularly, since search engines keep updating their indexes, and I would also examine the website statistics to see where the traffic originates.
Frames in HTML are used to divide page content into distinct sections. Search engines treat these frames as entirely different pages, so frames have a negative impact on SEO. We should therefore avoid frames and use basic HTML instead.
LSI is the abbreviated form of Latent Semantic Indexing, a technique of retrieving data by establishing relationships among words and using synonyms when fetching results from the index.
It is the practice of finding the root word in a search query. For instance, a keyword like “playful” is reduced to the root “play” by a stemming algorithm, so the search results can also contain documents that include the word “play”.
Including new, original, unique and quality content on our website more frequently enables search engines to crawl more frequently.
The main mistakes to avoid are keyword stuffing in the web page, using identical anchor text for all link building, and acquiring low-quality backlinks.
I would discard links from websites that act as link farms, as well as poor-quality sites with low PageRank. I would ensure our site contains unique, quality content without keyword stuffing. I would also avoid any 'spam' practices, including certain affiliate advertising websites and unsolicited e-mail campaigns.
Social networking websites are considered social media and are very effective and robust for viral marketing. Viral marketing is a powerful resource when our content is unique, attractive and appealing.
Google recently rolled out the Penguin update, a measure to control web spam. Penguin penalized websites with spammy backlink profiles and returned more semantic results, based on the relationships between the words in the search query. Penguin trusted sites with original, fresh content, a good social media presence and quality organic links.
Verticals refer to the themes of websites you have worked on so far; these may include education, real estate, IT, travel, shopping, jobs etc.
The Panda update, also known as the Farmer update, was made to improve the quality of Google's search results by eliminating content farms that provided a poor user experience. It used machine-learning scalability as one of the important metrics for judging the relevance of a web page. The focus shifted to the user: quality content, proper design, page speed, proper use of images and videos, and the content-to-ad ratio all mattered more after Panda. You need to optimize your site for a better click-through rate and a lower bounce rate.
The Google Sandbox is a notional area where new and less authoritative sites are held for a period of time before they are allowed to rank well in the search results. A site can end up there, for example, by building too many links within a short period of time.
On-page SEO means optimizing your website itself: making changes to titles, meta tags, site structure and site content, solving canonicalization problems, managing robots.txt, etc.
Off-page optimization means optimizing your web presence off the site, chiefly backlink building and social media promotion.
The .htaccess file is used to solve the canonicalization issue of a website. The home page may be reachable at several URLs, such as http://www.example.com, http://www.example.com/index.html or http://example.com. Search engines might treat these URLs as different pages and divide the link juice gained from backlinks equally among them. An .htaccess rule redirects all variants to a single URL so that the link juice is consolidated onto it.
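A typical canonicalization rule in .htaccess might look like this (a sketch assuming Apache with mod_rewrite; example.com is a placeholder):

```apache
# Redirect the bare domain to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```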
Keyword stemming is the process of finding the root word in a search query. A query containing the word “playful” would be broken down to the root “play” by a stemming algorithm, and the search results returned would contain the word “play”.
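As a rough illustration, here is a toy suffix-stripping function in Python. This is a naive sketch, not the Porter-style stemmer real search engines approximate:

```python
def naive_stem(word):
    """Strip a few common English suffixes, e.g. 'playful' -> 'play'.

    The suffix list and the minimum-stem-length rule are illustrative
    choices, not part of any real stemming algorithm.
    """
    for suffix in ("fulness", "ful", "ing", "ed", "es", "s"):
        # keep at least a 3-letter stem so words like 'is' survive
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(naive_stem("playful"))  # -> play
print(naive_stem("played"))   # -> play
print(naive_stem("plays"))    # -> play
```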
LSI stands for Latent Semantic Indexing. It is a data retrieval technique which finds connection between words and the use of synonyms while fetching data from the index.
The Florida update happened on November 16th, 2003. It applied stemming, maintained local rank, and penalized over-optimized sites by applying filters. In short, it filtered the search results before presenting them to the user.
A title tag can be between 66 and 70 characters, and a meta description tag between 160 and 170 characters.
No, Google does not make use of the meta keywords tag.
A 301 redirect is a method of redirecting the user from an old page URL to a new page URL. It is a permanent redirect and helps pass the link juice from the old URL to the new one.
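On Apache, a single 301 redirect can be declared in .htaccess like this (the paths are placeholders for illustration):

```apache
# Permanently redirect an old page to its new URL
Redirect 301 /old-page.html http://www.example.com/new-page.html
```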
Provide a suitable explanation about your job profile which would include your job responsibilities and the amount of work you have handled so far.
Cloaking is a deceptive technique that presents the user with a different version of the webpage than the one presented to the search engines.
I use Google Webmaster Tools, Google Analytics, Open Site Explorer, Alexa, Website Grader etc.
Answer here with your good qualities, e.g. "I am hardworking, sincere, punctual, and love to accept challenges."
SEO stands for search engine optimization, while SEM stands for search engine marketing. SEO brings organic traffic to a website via the search engines, while SEM involves Google AdWords and other paid advertising channels.
PPC stands for pay per click, an advertising model in which the advertiser pays for every click on their ads. Google uses this model in its AdWords advertising channel.
The meta robots tag is much better, as it instructs search engine crawlers not to index and display particular pages on your server.
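For example, a page can be kept out of the index with one line in its head section:

```html
<!-- inside the <head> of the page you want excluded from the index -->
<meta name="robots" content="noindex, nofollow">
```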
Yes, I use separate strategies for Google, Yahoo and other search engines. Google demands more backlinks: it pays more attention to backlinks and site authority, while Yahoo and Bing pay more attention to titles and meta tags. Hence a site takes longer to rank on Google than on Yahoo and Bing.
By building more backlinks from authority sites and high page rank webpages.
With the help of the link: operator on Google, and by using external tools like Alexa, Backlink Watch, Open Site Explorer, backlink finders etc.
Both are important. Great content is necessary as the first step towards ranking, while backlinks build authority for your website and are an important ranking metric. Hence both should proceed in parallel, and both are equally important.
Matt Cutts is the head of Google's web spam team.
It means 40% of web pages from that particular website are indexed by the search engine.
The clickable text of a hyperlink is known as anchor text. It is of great value to the search engines and is used to evaluate the relevance of web pages with respect to search queries.
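In HTML, the anchor text is simply the text between the opening and closing anchor tags (the URL and wording here are placeholders):

```html
<!-- "blue widgets" is the anchor text search engines associate with the target page -->
<a href="http://www.example.com/widgets.html">blue widgets</a>
```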
Google loves web standards, so I apply the standards provided by the W3C while optimizing a website.
I would use external style sheets, use fewer images (unless necessary), optimize images and reduce their file sizes without reducing quality, and use CSS sprites to reduce HTTP requests.
I use the Google Keyword Tool, Wordtracker, WordStream, the SEO Book keyword tool etc.
*Remember certain tools are paid but you may use the trial version for some days.
I follow Matt Cutts' blog, the SEOmoz blog, the SEO Chat forums, Search Engine Land, SEO Book, Seosandwitch etc.
These are pages specially created to rank high on search engines using deceptive techniques. Doorway pages do not provide useful content; instead they redirect users to the main page.
I have used blogs like Blogger, WordPress and TypePad; social bookmarking sites like Digg, Jumptags and Delicious; social networking sites like Facebook and LinkedIn; and video sharing sites like YouTube and Vimeo.