Questions and answers taken from real job interviews.
AdWords is Google's main advertising product, used to display your ads on Google Search and on its partner websites. It offers PPC (Pay Per Click) advertising as its primary model, with a CPC (Cost Per Click) option where you bid a rate and are charged only when a user clicks your advertisement. Another option is CPM (Cost Per Thousand Impressions) advertising, where the advertiser pays the publisher a flat rate per thousand impressions. AdWords also supports site-targeted advertising with banner, text and rich-media ads. Your ad is shown mainly to people who are already searching for the type of product you offer, and you can choose the particular sites and geographical areas in which your ads appear.
A nofollow link is the exact opposite of a dofollow link. Search engine bots do not pass through nofollow links, so they transfer no link value; we use them when we want a link not to count toward rankings. A dofollow link is a hyperlink that tells search engine crawlers to follow it, which affects PageRank. When we obtain a dofollow link, search engines such as Google, Bing and Yahoo count it as a backlink for our website, which improves the site's ranking.
SEO (Search Engine Optimization) is a set of processes used to make our website or pages appear in the organic search engine results. SEM (Search Engine Marketing), on the other hand, is the practice of purchasing advertising space on search engine result pages.
Frames in HTML are used to split a page's content into distinct sections. Search engines treat each frame as a completely separate page, so frames have a negative impact on SEO. We should therefore avoid frames and use plain HTML layouts instead.
Google Webmaster Tools is a free service from Google that gives us a complete report on indexed data, crawl errors, backlink information, search queries, malware warnings and CTR, and lets us submit an XML sitemap. It essentially acts as a mediator between the website and Google, giving a complete overview of data, issues and other queries. Google Analytics is a free web analytics tool, first rolled out in late 2005 and made generally available in August 2006. It sits between the website and its visitors and gives a complete overview of visitor statistics: page views, site visits, bounce rate, average time spent on the site or its pages, traffic sources, visitor location and so on. It can also track AdWords campaigns.
A blog is information or discussion published on the web as a series of distinct entries called posts. It is more individual than an article or a press release: it is personal in both style and content, often written the way you would talk to your readers, and it is also called a web diary or online diary. An article deals with a specific topic or event and leans toward opinion rather than plain information; it presents the author's views and ideas and is usually written by a third party or an expert in the field. A press release relates to a specific action or event and can be republished by various mass-media outlets, including other websites. It should be short, simple and professional, and convey a clear message.
PageRank is a link-analysis algorithm, named after Larry Page and used by the Google search engine, that assigns a numerical value to each document in a hyperlinked set such as the World Wide Web. The publicly visible toolbar value is a whole number from 0 to 10; decimals are not shown. A page's rank is calculated from its inbound links.
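The idea that rank flows in through inbound links can be sketched in a few lines of Python. This is a toy illustration on a hypothetical three-page graph, not Google's implementation; the damping factor 0.85 is the value described in the original PageRank paper.

```python
# Minimal PageRank iteration over a tiny, hypothetical link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # rank is split among outlinks
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# A links to B and C; B links to C; C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
print(ranks)  # C ends up with the most rank: it has the most inbound links
```

Notice that C outranks B even though both sites "exist equally": rank depends on who links to you, not on the page itself.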
A keyword term is basically a one-word term, whereas a keyword phrase is a combination of two or more words. It is very hard to rank highly for a one-word keyword term unless that word has little online competition, so the practice is not encouraged. To drive more traffic and reach top positions in the SERPs, it is recommended to target keyword phrases.
PR is PageRank, which is determined by quality inbound links from other websites or web pages and indicates the importance of a site. SERP stands for Search Engine Result Page: the page of results a search engine returns in response to a query, and the placement of a website or web page within it.
Primarily two types of SEO are used in practice: off-page SEO and on-page SEO. Off-page SEO is the method of earning backlinks from other websites in order to improve a site's ranking; it includes blog posting, forum posting, article submission, press release submission, classified listings and similar techniques. On-page SEO is the process of optimizing the website itself: writing content, titles, descriptions, alt attributes and meta tags, and making sure the page's code and design can be crawled and indexed properly by search engines.
Google PageRank is based mainly on inbound links, so the more backlinks you gather, the higher your PageRank will be. It is also influenced by the rank of the pages that link to you. Another factor is age: the older your website, the more favorably and trustingly Google treats it. Google rewards sites that have many pages, plenty of incoming links and a healthy number of internal links between pages within the site. For SEO projects PageRank itself is not especially significant, but it gives a picture of the work needed to earn inbound links.
First of all I would search on all the major search engines using the keywords and key phrases I am optimizing for. Analyzing the results shows whether the optimization methods have gained or lost ground. I would review these reports regularly, since search engines keep updating their indexes. I would also look at the website statistics to see where the traffic is coming from.
The best approach is to choose keywords that are popular, relevant to our content, effective and have high search volume. Keyword stuffing and over-use must be avoided; for best results, a page's keyword density should not exceed 3-4%. Including keywords in the title and description is highly recommended.
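Keyword density is just the keyword's share of the page's total word count. A quick sketch in Python for checking a draft before publishing (the sample text is made up, and the 3-4% ceiling is the rule of thumb above, not an official Google limit):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that are exactly `keyword` (single word)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

page = "SEO tips and SEO tools help you learn SEO step by step"
density = keyword_density(page, "seo")
print(f"{density:.1f}%")  # 3 of 12 words -> 25.0%, far above the 3-4% guideline
```

A multi-word keyword phrase would need a sliding-window count instead of a simple word match, but the percentage idea is the same.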
Alexa is a California-based subsidiary of Amazon.com, widely known for its website and toolbar. The Alexa toolbar collects browsing-behavior data and sends it to the Alexa website, where the data is analyzed and stored to produce reports on a company's web traffic. Alexa also provides traffic data, global rankings and other additional information for websites.
Generally, I would use robots.txt to prevent search engines from indexing a directory on a website. This is often a directory for admin functions, or one containing only scripts or an image gallery. Robots.txt is used to keep search engine bots from crawling a directory together with its sub-folders and files, while the meta robots tag is used for a specific web page.
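A minimal robots.txt covering the two cases above might look like this (the directory names are hypothetical examples; the file sits at the root of the site):

```
User-agent: *
Disallow: /admin/
Disallow: /image-gallery/
```

`User-agent: *` applies the rules to all cooperating bots, and each `Disallow` line blocks a directory and everything under it.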
HTML meta tags are tags of page data that sit between the opening and closing head tags of a document's HTML code. They are essentially hidden keywords in the code: invisible to visitors but visible to and readable by search engines. Example:
<head>
<title>Not a meta tag, but required anyway</title>
<meta name="description" content="Write your description here" />
<meta name="keywords" content="Write your keywords here" />
</head>
Many techniques are used in off-page SEO work. The major ones are: · Directory submission · Social bookmarking · Blog posting · Article posting · Press release submission · Forum posting · Yahoo Answers · Blog commenting · Deep-link directory submission · Regional directory submission, and so on.
Description meta tags are important because Google might use them as snippets for your pages. We say "might" because Google may instead choose a relevant section of your page's visible text if it does a good job of matching up with a user's query. Alternatively, Google might use your site's description from the Open Directory Project if your site is listed there (you can prevent search engines from displaying ODP data). Adding a description meta tag to each of your pages is always good practice in case Google cannot find a good selection of text to use in the snippet; the Webmaster Central Blog has an informative post on improving snippets with better description meta tags. Words in the snippet are bolded when they appear in the user's query, which gives the user clues about whether the content on the page matches what he or she is looking for. Ideally, each deeper page has its own unique description meta tag as well.
AdSense is a web program run by Google that lets publishers of content websites serve text, image, rich-media and video advertisements automatically, targeted to the website's content and audience. These advertisements are placed, maintained and sorted by Google itself, and they earn money on either a per-click or a per-impression basis.
To attain high rankings in search engine result pages, websites use various methods and techniques, which fall into two categories. Methods that are implemented within search engine guidelines are white hat SEO; methods that the guidelines instruct us to avoid are black hat SEO.
Keyword stemming is the practice of finding the root word in a search query. For instance, a stemming algorithm reduces a keyword like "playful" to the root "play", so the search results that appear on screen can contain the word "play" and its other variants.
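The mechanics can be sketched as simple suffix stripping. This is a toy illustration only; production search engines use much more careful algorithms, such as the Porter stemmer, which handle far more cases.

```python
# Toy suffix-stripping stemmer: strip a known suffix if enough of a stem remains.
SUFFIXES = ("ful", "ing", "ed", "s")

def stem(word):
    for suffix in SUFFIXES:
        # Keep at least 3 characters of stem so "sing" doesn't become "s".
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(stem("playful"))  # play
print(stem("playing"))  # play
print(stem("play"))     # play (nothing to strip)
```

Because "playful", "playing" and "play" all stem to the same root, a query for any one of them can match documents containing the others.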
LSI is the abbreviation of Latent Semantic Indexing. It is a technique for retrieving data by establishing relationships among words, so that synonyms can be matched while retrieving data from the index.
Social networking websites are social media that are very effective and robust for viral marketing. Viral marketing has proved to be a very powerful resource, provided our content is unique, attractive and appealing. Some social media sites: · Facebook · Twitter · LinkedIn · Myspace · Digg · YouTube, etc.
Bookmarking sites help you get instant traffic to your site through their strong social media reach. You bookmark your pages on these sites, and other users who find the bookmark interesting can click through to your site.
Caching is a process performed by search engine crawlers at regular intervals: they scan and take a snapshot of each page on the World Wide Web and store it as a backup copy. Almost every search engine result page includes a cached link for each site. Clicking the cached link shows you the last version of that page cached by Google rather than the current version. You can also prefix a URL with "cache:", as in "cache:http://www.speechus.com", to view its cached version directly.
Cloaking is a black hat SEO technique in which two distinct pages are created: the content presented to the search engine spider is different from the content presented to the user's browser. This technique violates search engine guidelines.
PPC is the abbreviation of Pay Per Click, the primary advertising model of Google's advertising platform. It has two sub-models: CPC (cost per click), based on bidding, and CPM (cost per thousand impressions), based on a flat rate. Under CPC, the advertiser is charged only when a user clicks their advert.
Search engines are the key tools for finding specific, relevant information in the vast expanse of the World Wide Web. Some major commonly used search engines: · Google · Yahoo · Bing
A sitemap is a list of the web pages on a site, accessible to users or to crawlers. It may be a document in any form used as a planning tool for a web page or web design, and it typically lists pages in a hierarchical style; it helps search engine bots and users find the pages on a website. A sitemap makes our website more search-engine friendly and increases the probability of frequent indexing. An HTML sitemap can be built directly into a web page for users' convenience and implemented through the page design. An XML sitemap, on the other hand, is useful only to search engine crawlers or spiders and is not visible to users; it sits in the root of the website, for example: http://www.speechus.com/sitemap.xml
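A minimal XML sitemap for the file above might look like this, following the sitemaps.org protocol (the date and change frequency are example values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.speechus.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Each page gets its own `<url>` entry; `<loc>` is required, while `<lastmod>` and `<changefreq>` are optional hints for crawlers.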
A spider, also called a bot, crawler or robot, is a computer program that browses the World Wide Web in a methodical, orderly fashion, automatically scanning pages and websites for updated content and downloading a copy to its data center for indexing.
The following steps are followed while optimizing a website: · First, interview the webmaster or website owner to gather relevant information about the site's goals and purpose. · Perform keyword analysis and find the best search-volume keywords to incorporate into the website and its individual pages. · Analyze the website's content to ensure the use of relevant keywords and phrases, including in titles, alt attributes and meta tags (meta title, meta description and meta keywords). · Target and implement keywords in H1, H2 and subsequent headings relevant to the site and its content. · Analyze the website's navigation. · Ensure the robots.txt file and sitemap exist, and check that they work correctly. · If required, make recommendations for modifications to the website and each of its pages.
Basically, various factors are involved in ranking a website organically, and they fall into three distinct categories: · Website content: it must be unique, quality content, well optimized and well structured. · Website structure: this includes tags, clear navigation, usability and validation of HTML errors, among other things. · Backlinks: you can build a link almost anywhere, but first make sure the linking site is relevant and the link is healthy.
The robots.txt file is a convention used to stop cooperating web robots and crawlers from accessing all or part of a website: content that is publicly viewable but that we do not want crawled and indexed. Search engines also consult it when archiving and categorizing websites, and it lets us declare particular areas of our site off-limits to crawling.
Title tags are very important in our SEO efforts. It is highly recommended to write a unique title that says exactly what content the page holds. It is valuable because the title appears in the search engine results and tells both the user and the search engine what the page is about.
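For example, a unique, descriptive title sits in the head of the page like this (the wording is a hypothetical illustration):

```html
<head>
  <title>SEO Interview Questions and Answers | Speechus</title>
</head>
```

Each page on the site should carry its own title describing that page, rather than one title repeated site-wide.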