Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines. SEO targets unpaid search traffic (usually referred to as "organic" results) rather than direct traffic, referral traffic, social media traffic, or paid traffic.
Unpaid search engine traffic may originate from a variety of kinds of searches, including
image search,
video search,
academic search,
news search, and industry-specific
vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine results, what people search for, the actual search queries or keywords typed into search engines, and which search engines are preferred by a target audience. SEO is performed because a website will receive more visitors from a search engine when it ranks higher on the search engine results page (SERP), with the aim of either converting the visitors or building brand awareness.
History
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters submitted the address of a page, or URL, to the various search engines, which would send a web crawler to ''crawl'' that page, extract links to other pages from it, and return information found on the page to be indexed.
According to a 2004 article by former industry analyst and current Google employee Danny Sullivan, the phrase "search engine optimization" came into use in 1997. Sullivan credits SEO practitioner Bruce Clay as one of the first people to popularize the term.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.
Web content providers also manipulated attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank in search engines and that some webmasters were manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
By relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.
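Keyword density, the metric being abused, is simply the fraction of a page's words that match the target term; as an illustrative formulation (not any engine's official metric):

    \text{density}(t) = \frac{f_t}{n} \times 100\%

where f_t is the number of occurrences of term t on the page and n is the page's total word count.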
Search engines responded by developing more complex
ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.
Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website. Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages' index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.
Relationship with Google
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.
PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
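In its commonly published form, the PageRank PR of a page p is defined by the recurrence below, where d is the damping factor (typically 0.85, the probability the random surfer follows a link rather than jumping to an arbitrary page), N is the total number of pages, B_p is the set of pages linking to p, and L(q) is the number of outbound links on page q:

    PR(p) = \frac{1 - d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}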
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.
Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes involved the creation of thousands of sites for the sole purpose of link spamming.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.
The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions. Patents related to search engines can provide information to better understand search engines. In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.
In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links.
Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the PageRank allotted to nofollowed links was no longer redistributed to a page's other links but simply evaporated. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.
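For illustration, a nofollowed link simply carries the rel attribute shown below (the URL is a placeholder); the sculpting workarounds described above replace such markup with script-generated links:

    <a href="https://example.com/untrusted-page" rel="nofollow">Example link</a>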
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, making new content show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words. With regard to the changes made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
In October 2019, Google announced they would start applying BERT models for English-language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve their natural language processing, this time to better understand the search queries of their users. In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and increase the quality of traffic coming to websites ranking in the search engine results page.
Methods
Getting indexed

The leading search engines, such as Google, Bing,
Brave Search and Yahoo!, use
crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The
Yahoo! Directory and
DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links, in addition to their URL submission console. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
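A minimal XML Sitemap following the sitemaps.org protocol might look like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-01</lastmod>
      </url>
    </urlset>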
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.
Mobile devices are used for the majority of Google searches. In November 2016, Google announced a major change to the way they crawl websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index. In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version. In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.
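As an illustration, Google documents its evergreen smartphone crawler User-Agent in roughly the following form, where "W.X.Y.Z" stands in for whatever Chrome version the rendering service currently uses:

    Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)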
Preventing crawling
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to have crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
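A minimal robots.txt along these lines might read as follows (the paths are placeholders for a site's internal-search and shopping-cart directories):

    User-agent: *
    Disallow: /search/
    Disallow: /cart/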
In 2020, Google sunsetted the standard (and open-sourced their code) and now treats the robots.txt file as a hint rather than a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.
Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results.
Cross-linking between pages of the same website to provide more links to important pages may improve visibility. Page design makes users trust a site and want to stay once they find it. When people bounce off a site, it counts against the site and affects its credibility.
Writing content that includes frequently searched keyword phrases so as to be relevant to a wide variety of search queries will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the
title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic.
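As a sketch, such metadata lives in the HTML head; the text values below are placeholders:

    <head>
      <title>Handmade Oak Furniture | Example Store</title>
      <meta name="description" content="Solid oak tables and chairs, made to order.">
    </head>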
URL canonicalization of web pages accessible via multiple URLs, using the canonical link element or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. These incoming links, which point to the URL, count towards the page's link popularity score and impact the credibility of the website.
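For illustration, a duplicate URL can declare its preferred version with a canonical link element in its HTML head (example.com is a placeholder); alternatively, the server can answer with an HTTP 301 redirect to the preferred URL:

    <link rel="canonical" href="https://www.example.com/product">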
White hat versus black hat techniques
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines
are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen (sketched below). Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not produce the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
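A sketch of the hidden-text technique mentioned above, shown for illustration only (the keywords and offset are placeholders):

    <!-- keyword-stuffed text positioned off-screen so human visitors never see it -->
    <div style="position: absolute; left: -9999px;">
      cheap widgets best widgets buy widgets online
    </div>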
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices. Both companies subsequently apologized, fixed the offending pages, and were restored to Google's search engine results page.
Companies that employ black hat techniques or other spammy tactics can get their client websites banned from the search results. In 2005, ''The Wall Street Journal'' reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. ''Wired'' magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google had banned Traffic Power and some of its clients.
As marketing strategy
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click
(PPC) campaigns, depending on the site operator's goals.
Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should give SEM serious consideration for visibility, as most users navigate to the primary listings of their search results. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public, which revealed a shift in their focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device. Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which allow companies to measure their website against the search engine results and determine how user-friendly their websites are. The closer together related keywords appear, the more rankings improve for those key terms.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day. It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic. In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
International markets and SEO
Optimization techniques are highly tuned to the dominant search engines in the target market.
The search engines' market shares vary from market to market, as does competition.
In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and data showed Google was the dominant search engine worldwide as of 2007. As of 2006, Google had an 85–90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of March 2024, Google still had a significant market share of 89.85% in Germany. As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise. As of March 2024, Google's market share in the UK was 93.61%.
Successful search engine optimization (SEO) for international markets requires more than just translating web pages. It may also involve registering a domain name with a country-code top-level domain (ccTLD) or a relevant top-level domain (TLD) for the target market, choosing web hosting with a local IP address or server, and using a content delivery network (CDN) to improve website speed and performance globally. It is also important to understand the local culture so that the content feels relevant to the audience. This includes conducting keyword research for each market, using hreflang tags to target the right languages, and building local backlinks. However, the core SEO principles, such as creating high-quality content, improving user experience, and building links, remain the same regardless of language or region.
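As a brief sketch, hreflang annotations in a page's HTML head tell search engines which language or regional variant of a page to serve (example.com and the locale codes are placeholders):

    <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/">
    <link rel="alternate" hreflang="de-de" href="https://example.com/de-de/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/">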
Regional search engines have a strong presence in specific markets:
* China: Baidu leads the market, controlling about 70 to 80% market share.
* South Korea: Since the end of 2021, Naver, a domestic web portal, has gained prominence in the country.
* Russia: Yandex is the leading search engine in Russia. As of December 2023, it accounted for at least 63.8% of the market share.
The evolution of international SEO
By the early 2000s, businesses recognized that the web and search engines could help them reach global audiences. As a result, the need for multilingual SEO emerged. In the early years of international SEO development, simple translation was seen as sufficient. However, over time, it became clear that localization and transcreation—adapting content to local language, culture, and emotional resonance—were far more effective than basic translation.
Legal precedents
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the
United States District Court for the Northern District of California (
San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for
Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
See also
* Competitor backlinking
* List of search engines
* Search engine marketing
* Search neutrality, the opposite of search manipulation
* User intent
* Website promotion
* Search engine results page
* Search engine scraping
References
External links
Webmaster Guidelines from Google
Google Search Quality Evaluators Guidelines (PDF)
Webmaster Guidelines from Microsoft Bing
"The Dirty Little Secrets of Search" in ''The New York Times'' (February 12, 2011)