Search engine optimization (SEO) is the process of improving the ranking of an entire website, or of individual pages on that site, in a search engine's natural (organic) search results. SEO may target different kinds of search, including image search, local search, video search, and vertical search.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website may involve editing its content, HTML, and associated coding to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
Search engine optimization may be offered as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to a website's HTML source code, SEO tactics are best incorporated into a site's design and development from the beginning. The term "search engine friendly" may be used to describe site designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized to make the website easy for search engines to crawl and index.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing, and content duplication that degrade both the relevance of search results and the user experience of search engines. Search engines remove sites that use these techniques from their indexes.
History
Webmasters and designers began optimizing sites for search engines in the mid-1990s, when the first search engines were cataloguing the early Web. Initially, a webmaster submitted the address of a page, or URL, to the various engines, which would send a "spider" to crawl that page, extract links to other pages from it, and return the information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server; a second program, known as an indexer, then extracts information about the page, such as the words it contains, where they are located, and the weight given to specific words, as well as all the links the page contains. The page is then scheduled to be crawled again at a later date to pick up new data.
Site owners began to recognize the value of having their sites ranked highly in search engine results, creating an opportunity for both "white hat" and "black hat" SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" came into use in 1997.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like Aliweb. Meta tags provide a guide to each page's content. Using meta data to index pages gradually proved unreliable, however, because webmasters stuffed the keyword meta tag with many keywords, some of which did not accurately represent the page's actual content. Inaccurate, incomplete, and inconsistent data in meta tags caused pages to rank highly for searches to which their content was irrelevant. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
Because early search engines relied so heavily on factors such as keyword density, which were exclusively within a webmaster's control, they suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure that their results pages showed the most relevant pages, rather than unrelated pages stuffed with keywords by webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any search, poor or irrelevant results drive users to other search sources. Search engines responded by developing more complex ranking algorithms that take into account additional factors which are more difficult for webmasters to manipulate.
While they were students at Stanford University, Larry Page and Sergey Brin developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, because a page with a higher PageRank is more likely to be reached by the random surfer.
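In its commonly published form (a sketch of the standard formula, not a quotation from the work described above), the PageRank of a page A can be written as:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1 through Tn are the pages that link to A, C(Ti) is the number of outbound links on page Ti, and d is a damping factor, commonly set to around 0.85, which models the chance that the random surfer abandons the current chain of links.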
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered alongside on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), which enabled Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, known as link farms, involved the creation of thousands of sites for the sole purpose of link spamming.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it uses more than 200 different signals to determine the ranking of a page. The leading search engines, Google and Yahoo!, do not disclose the algorithms they use to rank pages. SEO service providers such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen have studied different approaches to search engine optimization and have published their opinions in online forums and blogs. SEO practitioners may also take part in courses organized by the search engines to gain insight into the algorithms.
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for users signed in to their Google accounts. In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it would become meaningless to discuss how a website ranks, because its rank would likely differ for each user and each search.
In 2007, Google announced a campaign against paid links that pass PageRank. On June 15, 2009, Google announced that it had taken measures to mitigate the effects of PageRank sculpting through the use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way. In response, SEO service providers devised techniques that replace nofollowed tags with obfuscated JavaScript so that PageRank can still be sculpted; additional solutions have also been suggested, including the use of iframes, Flash, and JavaScript.
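For illustration (a generic example, not drawn from any of the announcements above; example.com is a placeholder), the nofollow attribute is simply added to an ordinary HTML link:

    <a href="http://www.example.com/some-page" rel="nofollow">Example link</a>

A link marked this way tells supporting crawlers not to pass ranking credit through it.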
In December 2009, Google announced it would be using the web search history of all its users to shape search results. Real-time search was introduced in late 2009 in an effort to make search results more timely and relevant. With the growth of social networks and blogs, the leading search engines changed their algorithms to allow fresh content to rank quickly within the search results. This approach places a premium on creating new and unique content.
Relationship with search engines
By 1997, search engines recognized that webmasters were making efforts to rank well in their results, and that some webmasters were manipulating their rankings by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.
Because of the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to discuss and minimize the damaging effects of aggressive web content providers.
SEO companies that employ overly aggressive techniques can get their clients' websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Matt Cutts of Google later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid advertising, some search engines now have a vested interest in the health of the SEO community. Major search engines provide information and guidelines to help with site optimization. Google Sitemaps helps webmasters learn whether Google is having any problems indexing their website and also provides data on Google traffic to the site. Google's webmaster guidelines are a list of suggested basic SEO practices. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index, and view link information.
Indexing
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other pages already in a search engine's index are found automatically. Some search engines, notably Yahoo!, operate paid submission programs that guarantee crawling alongside natural search. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled.
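As an illustration (a generic sketch based on the public Sitemap protocol, not output from any particular tool mentioned above; example.com is a placeholder domain), a minimal XML Sitemap listing two pages looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/products.html</loc>
      </url>
    </urlset>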
Preventing crawling
To avoid undesirable content appearing in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file placed in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages should not be crawled. Because a search engine crawler may keep a cached copy of this file, it may occasionally crawl pages a webmaster does not wish crawled. Pages typically blocked from crawling include login pages, shopping carts, and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
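For illustration (a generic sketch rather than a recommendation for any specific site; the paths are placeholders), a robots.txt file that keeps crawlers out of a shopping cart and internal search results, together with a per-page robots meta tag, might look like this:

    # http://www.example.com/robots.txt
    User-agent: *
    Disallow: /cart/
    Disallow: /search/

    <!-- Placed in the head of an individual page that should stay out of the index -->
    <meta name="robots" content="noindex, follow">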
Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to its most important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, tends to increase traffic. Adding relevant keywords to a web page's meta data, including the title tag and meta description, tends to improve the relevance of a site's search listings, thus increasing traffic. Using the canonical meta element or 301 redirects can help ensure that links to different URL versions of a page all count towards the most popular URL.
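As a sketch of these on-page elements (the domain, wording, and the Apache redirect rule are assumptions for illustration, not taken from the text above), the head of an optimized page and a matching 301 redirect might look like this:

    <head>
      <!-- Descriptive, keyword-relevant title shown in search listings -->
      <title>Handmade Leather Wallets | Example Shop</title>
      <!-- Meta description often used as the snippet in search results -->
      <meta name="description" content="Handmade leather wallets in classic and slim styles, shipped worldwide.">
      <!-- Canonical URL so duplicate versions consolidate their link popularity -->
      <link rel="canonical" href="http://www.example.com/wallets/">
    </head>

    # Apache .htaccess (assumed server): permanently redirect an old URL to the canonical one
    Redirect 301 /old-wallets.html http://www.example.com/wallets/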
White hat and black hat
SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and techniques of which search engines do not approve and whose impact they try to minimize, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned once the search engines discover what they are doing.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see when visiting the website.
White hat advice generally amounts to creating content for users, not for search engines, and then making that content easily accessible to web spiders, rather than attempting to trick the algorithms away from their intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.
White hat SEO is effective marketing: delivering quality content to an audience that has requested that content.
Black hat SEO attempts to improve rankings in ways that are not approved by the search engines, or that involve deception. One black hat technique uses hidden text, either colored similarly to the background, placed in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or by a search engine, a technique known as cloaking.
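For illustration only (a generic sketch of markup that search engines penalize, not an endorsement), hidden text might be produced like this:

    <!-- Text colored to match the background, and text removed from view with CSS -->
    <p style="color:#ffffff; background-color:#ffffff;">stuffed keywords ...</p>
    <div style="display:none;">more stuffed keywords ...</div>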
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or by removing them from their databases altogether. Such penalties can be applied automatically by the search engines' algorithms or through a manual site review. A famous example is Google's February 2006 removal of both BMW Germany and Ricoh Germany for the use of deceptive practices. Both companies quickly apologized, fixed the offending pages, and were restored to Google's listings.
As a marketing strategy
SEO is not necessarily an appropriate strategy for every website; other Internet marketing strategies can be much more effective, depending on the site owner's goals. A successful Internet marketing campaign may drive organic traffic through optimization techniques rather than paid advertising, but it can also include the use of paid advertising on search engines and other sites. Building high-quality sites encourages users to interact with them and can keep search engines indexing those sites. Installing analytics software helps site owners measure their success and improve the site's conversion rate.
SEO may generate a return on investment. However, search engines continually change their algorithms, and there is no guarantee that such changes will not hurt a site's rankings. (Some commerce sites such as eBay may be a special case, since eBay can be notified of a ranking algorithm change a few months before it takes effect.) Because of this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Companies that depend heavily on their websites therefore need to reduce their dependence on search engines. Seomoz.org, a leading SEO company, has suggested that search marketers themselves receive only a very small share of their traffic from search engines; instead, their main source of traffic is links from other websites.
International markets
The market shares of search engines vary from market to market, as does the competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007. As of 2006, Google held an 85-90% market share in Germany, and while there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, Google's market share in the UK was close to 90% according to Hitwise. That level of market share is achieved in a number of countries.
As of 2011, there were only a few large markets in which Google was not the leading search engine. In most cases, where Google is not the leader in a given market, it lags behind a local player. The most notable examples are China, Japan, South Korea, Russia, and the Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex, and Seznam, respectively, are the market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.