So just how much of the traffic labeled as direct is actually organic? According to Search Engine Land, Groupon conducted an experiment to find out. The company de-indexed its site for the better part of a day and tracked direct and organic traffic, by hour and by browser, to pages with long URLs. Pages with shorter URLs were set aside because they genuinely do attract direct traffic: they are quick and easy to type into a browser. Direct traffic to the long-URL pages dropped by 50%, suggesting that a large share of the visits analytics tools report as "direct" are in fact organic search visits.
The term was first used by Internet theorist John Kilroy in a 2004 article on paid search marketing. Because the distinction is important (and because the word "organic" has many metaphorical uses), the term is now in widespread use within the search engine optimization and web marketing industry. By July 2009, "organic search" had become common currency outside the specialist web marketing industry, used frequently even by Google (throughout the Google Analytics site, for instance).
Organic content marketing, on the other hand, finds ways to make customers look for you naturally. In effect, it means using any marketing method that doesn't require a direct payment. There are still costs involved, however: paying for content creation, and the time spent monitoring the campaign and responding to customers. This type of inbound marketing involves providing valuable content that customers need, then supporting it with a constant online presence (often through social media).
There are also a few more similarities. All of these marketing methods are measurable to an extent never seen in any other medium. Every click can be measured, including where and when it came from, and followed through to the conversion, the sale, and the lifetime customer value. This feedback loop creates optimization opportunities that can yield large incremental improvements in your SEM campaigns.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated other attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
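To see why keyword-meta indexing was so easy to game, consider a minimal sketch of how such an indexer might have worked. This is an illustrative reconstruction using Python's standard-library `html.parser`, not the actual code of any early engine: it trusts whatever keywords the webmaster declares, regardless of the page's visible content.

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collects the contents of <meta name="keywords"> tags.

    Hypothetical example: early keyword-meta indexers worked on a
    similar principle, but their real implementations are not public.
    """
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list and normalize it.
            self.keywords += [k.strip().lower()
                              for k in a.get("content", "").split(",")
                              if k.strip()]

page = ('<html><head>'
        '<meta name="keywords" content="Cheap Flights, travel, FLIGHTS">'
        '</head><body>Content about something else entirely.</body></html>')
parser = MetaKeywordParser()
parser.feed(page)
print(parser.keywords)  # ['cheap flights', 'travel', 'flights']
```

Because the declared keywords need not match the body text at all, a webmaster could rank a page for any query simply by stuffing the tag, which is exactly the manipulation that pushed engines toward analyzing page content and links instead.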