I feel we can also focus a lot on the kind of keywords we target. I had a client in a very competitive marketplace; we optimized their site for some really targeted, long-tail keywords that didn't have very high search volume. The traffic didn't go up drastically, but the conversions and the CTRs the site received were incredible.
What you are in fact talking about are Google's death stars, like the Shopping box, Knowledge Graph, etc. It's fully understandable why many SEOs can't stand them, because whole categories of websites (price comparison platforms, for instance) have already fallen victim to such death stars, and numerous other portals will certainly lose almost all of their traffic in the near future. Despite your (quite good) suggestions on how to work around the issue, the situation for an endangered portal can be hopeless when it's its whole business model that a new Google feature makes obsolete. See geizhals.at for a very famous example.
Remember that users enter search terms into search engines based on their interests. The keywords and phrases you choose for your campaigns help determine whether your ad will display when users search online. But be aware that other advertisers may be targeting the same keywords, which creates competition for your ad campaign. What can you do to win? One solution is keyword bidding.
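To make the bidding idea concrete, here is a deliberately simplified sketch of a keyword auction as a second-price rule: advertisers are ranked by bid, and the winner pays just above the runner-up's bid. Real ad platforms also weigh quality scores and other signals; the advertiser names and bid amounts below are made up for illustration.

```python
# Simplified model of keyword bidding: a second-price auction on one keyword.
# Real platforms combine bids with quality scores; this sketch uses bids only.

def run_keyword_auction(bids):
    """Rank advertisers by bid; the winner pays one cent more than the
    runner-up's bid (a simplified second-price rule)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    if len(ranked) > 1:
        price = ranked[1][1] + 0.01  # just outbid the next-highest bidder
    else:
        price = ranked[0][1]         # no competition: pay your own bid
    return winner, round(price, 2)

# Three hypothetical advertisers competing on the same keyword.
bids = {"ad_a": 1.50, "ad_b": 2.25, "ad_c": 0.90}
winner, price = run_keyword_auction(bids)
print(winner, price)  # ad_b wins and pays 1.51
```

The takeaway is that winning doesn't require paying your full maximum bid; the competition on a keyword sets the effective price.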
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results, because those pages are considered search spam.
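As a sketch of how a well-behaved crawler applies these rules, the example below uses Python's standard `urllib.robotparser` module against a hypothetical robots.txt that blocks a shopping cart and internal search results, as described above. The domain and paths are assumptions for illustration, not taken from any real site (and note that a `<meta name="robots" content="noindex">` tag would be the per-page alternative).

```python
# Illustrative sketch: checking crawl permissions the way a polite crawler
# would, using the standard-library urllib.robotparser.
from urllib import robotparser

# A hypothetical robots.txt blocking a shopping cart and internal search
# results for all user agents.
sample_robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = robotparser.RobotFileParser()
# parse() takes the file's lines; a live crawler would instead call
# set_url("https://example.com/robots.txt") followed by read().
parser.parse(sample_robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/products/widget"))   # True
print(parser.can_fetch("*", "https://example.com/cart/checkout"))     # False
print(parser.can_fetch("*", "https://example.com/search?q=widgets"))  # False
```

This also illustrates the caveat in the text: the rules only take effect when a crawler fetches and honors the file, so a cached or ignored robots.txt offers no hard guarantee.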
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website had been removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
And for executives in particular, whether you're getting help from an external coach like myself or have an internal expert who can help them with their profiles, this is really important, because your executives are the key to really building thought leadership. I know even at Act-On, Bill Pierznik has been making some really amazing posts on LinkedIn around his thoughts on the business world, and they've garnered a lot of engagement. So the more you can have your executives be the voice for your company and really get their profiles up and running, the better.
Pay-per-click (PPC) campaigns generate paid traffic. The goal of running a PPC ad campaign is to increase visibility of, and traffic to, your website by showing up in the paid search results. You can identify paid results easily, as they are brief advertisements displayed separately from the organic results. These ads are triggered when a user types a relevant keyword into the search engine.