While there are several HTML tagging techniques that improve a page’s Google SEO results, creating relevant page content is still the best way to rank high. A big part of content creation is your use of targeted keywords. You should include important keywords in your first 50 words, since early placement can be a signal of relevance. And while you should never repeat keywords too often at the expense of good writing, you should repeat keywords in your content two or three times for short pages and four to six times for longer pages. Also, you may wish to use some keyword variation in your content – such as splitting keywords up – as this could potentially improve your ranking.
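The rules of thumb above (keyword in the first 50 words, two to three uses on short pages, four to six on longer ones) can be turned into a quick self-check. This is a minimal sketch, not a definitive tool: the 300-word cutoff for a "short" page and the single-word keyword matching are my assumptions for illustration.

```python
import re

def keyword_report(text, keyword, early_window=50):
    """Rough check of keyword placement against the guidelines above.
    Assumes a single-word keyword; the 300-word "short page" cutoff
    is a hypothetical threshold, not an official figure."""
    words = re.findall(r"[\w'-]+", text.lower())
    kw = keyword.lower()
    count = sum(1 for w in words if w == kw)
    short_page = len(words) < 300
    target = (2, 3) if short_page else (4, 6)
    return {
        "count": count,
        "in_first_50_words": kw in words[:early_window],
        "within_target": target[0] <= count <= target[1],
    }
```

A check like this only flags obvious misses; it can't judge whether the repetition reads naturally, which the advice above treats as the priority.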
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.
Web search is one of the most powerful tools we have today. Just like you, people from all walks of life use Google to find solutions, learn new things, and understand the world around them. One of those new things may be determining whether SEO or SEM is best for your business. Whether you run an online business or a local one, chances are that people are actively looking for you.
In zero-result sets, Google was still willing to show AdWords ads, which means that if we have customer targets, we can use Remarketing Lists for Search Ads (RLSA), or we can run paid ads and still optimize for those. We could also try to claim some of the data that might show up in zero-result SERPs. We don't yet know what that will be after Google rolls the feature back out, but we'll find out in the future.

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[69][70]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
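The crawl-and-index pipeline described above (spider fetches a page, an indexer records words and their positions plus outbound links, and links go to a scheduler for later crawling) can be sketched in a few lines. This is a toy illustration operating on an in-memory HTML string, not how any real search engine is implemented; the regex-based tag stripping and link extraction are simplifications.

```python
import re
from collections import deque

def index_page(url, html, index, scheduler):
    """Toy indexer: record word positions from a fetched page and queue
    outbound links for later crawling (the 'scheduler' in the text)."""
    text = re.sub(r"<[^>]+>", " ", html)            # crudely strip tags
    for pos, word in enumerate(re.findall(r"[a-z0-9']+", text.lower())):
        index.setdefault(word, []).append((url, pos))
    for link in re.findall(r'href="([^"]+)"', html):
        scheduler.append(link)                       # crawl later

index, scheduler = {}, deque()
index_page("http://example.com",
           '<h1>Hello SEO</h1> <a href="http://example.com/about">About</a>',
           index, scheduler)
```

A real indexer would also weight terms by where they appear (title, headings, anchor text), which is the "weight for specific words" the paragraph mentions.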
Website ranking doesn't just come from what's on your website. Google, the number one search engine used today, uses a variety of other factors to rank websites. Things like your social media activity, appearances on other sites through interviews or guest blogging, and being listed as a resource on another site all increase your standing in Google's eyes.
That’s not to say that there isn’t the same kind of competitive intelligence where SEO is concerned. In fact, I’m a big fan of analyzing your competitors’ SEO landscape. You can check which keywords they’re targeting and their sources of referral traffic to strengthen your own strategy. But the bottom line is this: your competitors can’t steal your content. The experience you provide to users is unique to you. The more organic traffic you have, the more currency you have in the online space.
In addition to helping you find keywords you should be bidding on, thorough keyword research can also help you identify negative keywords – search terms that you should exclude from your campaigns. Negative keywords aren’t terms with negative connotations, but rather irrelevant terms that are highly unlikely to result in conversions. For example, if you sell ice cream, you might want to exclude the keyword “ice cream recipes”, as users searching for ice cream recipes are unlikely to be in the market for your product.
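The exclusion logic above is easy to prototype against a list of actual search terms. A minimal sketch, assuming simple substring (phrase-match-style) exclusion, with the ice-cream example from the text:

```python
def filter_search_terms(terms, negatives):
    """Drop any search term containing a negative keyword.
    Substring matching here loosely mimics phrase-match negatives;
    exact- and broad-match behavior in real ad platforms differs."""
    negatives = [n.lower() for n in negatives]
    return [t for t in terms if not any(n in t.lower() for n in negatives)]

queries = ["buy ice cream near me", "ice cream recipes", "ice cream delivery"]
filter_search_terms(queries, ["recipes"])  # keeps the two commercial queries
```

Running a filter like this over a search-terms report is a quick way to see how much irrelevant traffic a proposed negative keyword would remove before adding it to a live campaign.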
Thanks for the comment, Slava. Good to see your team is on top of things, and happy you liked the post. The website in the case listed was a client who had taken on an agency doing lower-quality SEO work, which was affecting the site: a huge link network and a strategy that revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to yield great results to begin with. The client's site has hundreds of high-quality articles, which we were able to re-optimize and update as noted. Further to this, they had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy that could run to hundreds of points. I just wanted to include some of the biggest wins and easiest-to-implement points.
Now, it is not that these sites are not interested in Google users. In fact, they have hired us to help them increase their share. However, they are getting so much traffic from sites like Facebook that it seems there is less urgency about attracting this traffic and less willingness to change the site to meet organic standards. Not long ago, sites would urgently and unquestioningly abide by Google’s standards to court that traffic.