In 2014, BrightEdge published research showing that 51% of website traffic came from organic search, making it far and away the largest channel. In the intervening years, significant search algorithm changes dramatically altered the layout and ranking of search engine results pages, increasing the space taken up by Local, Video, Image, Ad, and Quick Answer results.
Search engine optimization (SEO) campaigns generate organic traffic. The goal of SEO is to improve a website's ranking for relevant keywords. You want your website to show up on the first page of organic search results – ‘organic’ refers to the middle section of results you see in search engines – for what you’re selling, and you would be hard pressed to achieve that without SEO.
Google claims its users click (organic) search results more often than ads, essentially rebutting the research cited above. A 2012 Google study found that 81% of ad impressions and 66% of ad clicks happen when there is no associated organic search result on the first page. Research has also shown that searchers may have a bias against ads, unless the ads are relevant to the searcher's need or intent.
Local search lets users find what services are nearby, or where to buy a particular product. Local searches also provide instant, specific information that customers need, such as a company's telephone number, address, or opening hours. And do not forget the smartphone as a tool for finding information anywhere: 77% of smartphone users use their devices to find information about their immediate surroundings.
Cost and sustainability. While it’s not totally free or easy, SEO can be more cost-effective in the long run because it delivers the targeted, relevant results you are aiming for. As for sustainability, even on the smallest budget you can still develop a plan that delivers quality results and keeps the traffic your business is already getting.
It’s unreasonable to assume that you will pull top rank in Google for every keyword relating to your industry. Your goal should be to pull top rank on the most desired keywords. This is an exercise that will take the effort of both marketing and management. Think about how people would search for your products and services, make a list of these keywords, and check the traffic for each term with a tool like Google’s Keyword Planner. Naturally you will want to rank for the keywords with the most traffic, so whittle your list down to the highest-trafficked, most relevant terms.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."
Traffic data is a great way to take the temperature of your website and marketing initiatives. When you are writing and promoting blog content on a regular basis, you can use traffic data to track results and correlate these efforts to actual ROI. Be sure to look at traffic numbers over long-term intervals to see trends and report on improvement over time.
One important thing to note is a website’s domain authority (DA) and page authority (PA). These are scores from 1 to 100 that indicate the strength of a website’s domain or of a specific page. DA and PA are two of several factors that go into how a website will be ranked on a SERP. The higher the DA and PA, the better the chances are of that webpage ranking on the front page of a SERP (everyone’s dream!). These scores are determined by a few things, such as the age of the website and the number of links leading to it (backlinks).
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag. When a search engine visits a site, the robots.txt file located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
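The crawl-exclusion behavior described above can be sketched with Python's standard-library robots.txt parser. This is a minimal illustration, not a real site's file: the disallowed paths (`/cart/`, `/search/`) stand in for the shopping-cart and internal-search pages mentioned in the paragraph.

```python
import urllib.robotparser

# A hypothetical robots.txt blocking crawlers from the shopping cart
# and from internal search results, as discussed above.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler checks each URL against the parsed rules.
print(rp.can_fetch("*", "https://example.com/cart/checkout"))    # False: excluded
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True: crawlable
```

Note that robots.txt only discourages crawling; to keep an already-known page out of the index, the paragraph's other mechanism applies: a `<meta name="robots" content="noindex">` tag in the page itself.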
I think it has become harder and harder for smaller brands to really stand out in any kind of search. This is especially true for small brands that face lots of competition from other small brands in large cities. How does one build name recognition in NYC as an acupuncturist when any given building may house 3 or 4 practitioners with the same address? These small businesses are also up against the Google Possum filter. And in some cases, brands without websites are showing up in the three-pack over highly optimized websites.
Because so few ordinary users (38%, according to Pew Research Center) realized that many of the highest-placed "results" on search engine results pages (SERPs) were ads, the search engine optimization industry began to distinguish between ads and natural results. The perception among general users was that all results were, in fact, "results." So the qualifier "organic" was invented to distinguish non-ad search results from ads.