What you are in fact talking about are Google's "death stars": the Shopping box, the Knowledge Graph and the like. It's fully understandable why many SEOs can't stand them, because whole categories of websites (price comparison platforms, for instance) have already fallen victim to such death stars, and numerous other portals will certainly lose almost all of their traffic in the near future. Despite your (quite good) suggestions on how to work around such an issue, the situation for an endangered portal can be hopeless when its whole business model is what a new Google feature makes obsolete. See geizhals.at for a very famous example.
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag. When a search engine visits a site, the robots.txt in the root directory is the first file crawled; it is parsed and tells the robot which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as the results of internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
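To make that concrete, Python's standard library ships a parser that applies robots.txt rules the same way a compliant crawler would. The rules, user agent and paths below are hypothetical examples, not taken from any real site:

```python
# Illustrative sketch: how a well-behaved crawler honours robots.txt.
# The sample rules and URL paths are hypothetical.
import urllib.robotparser

sample_robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search-results/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(sample_robots_txt.splitlines())

for path in ("/products/blue-widget", "/cart/checkout", "/search-results/?q=widgets"):
    verdict = "crawl" if parser.can_fetch("*", path) else "skip"
    print(f"{verdict}: {path}")
```

For per-page exclusion, the robots meta tag (for example, <meta name="robots" content="noindex">) tells compliant engines to keep an individual page out of their index even when it is crawlable.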
Your job is to make your fanbase feel part of a community. It’s equally important that you respond to all queries and complaints in a timely and polite manner. This will show your customers that you genuinely care. Address any concerns that your fans have about your product and/or service and do everything you can to resolve any issues they might have. Use a social media monitoring platform like Hootsuite to keep track of all of your social media interactions.
First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this fairly general so everyone can adapt this case study to their own situation.

As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay-per-click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, Yahoo! and Microsoft announced in 2009 their intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]


Social media is adopting its own form of SEO, in a way that promotes a positive user experience. The way the algorithm works is by first showing your post to a pool as small as one percent of your followers. If those people engage with the content, it gets introduced into a larger pool. Slowly but surely, more and more people see it, but only if it’s engaging.
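No platform publishes the exact mechanics, but the staged rollout described above can be sketched as a simple feedback loop. Every number in this toy model (the one percent seed pool, the engagement threshold, the expansion factor) is an invented assumption for illustration:

```python
# Toy model of the staged rollout described above. All parameters
# (seed share, threshold, expansion factor) are made-up assumptions.
def simulate_rollout(followers: int, engagement_rate: float,
                     seed_share: float = 0.01, threshold: float = 0.10,
                     expansion: float = 5.0) -> int:
    """Return roughly how many followers a post ends up reaching."""
    pool = max(1, int(followers * seed_share))  # start with ~1% of followers
    while pool < followers and engagement_rate >= threshold:
        pool = min(followers, int(pool * expansion))  # promote to a larger pool
    return pool

print(simulate_rollout(100_000, engagement_rate=0.15))  # engaging: reaches everyone
print(simulate_rollout(100_000, engagement_rate=0.02))  # not engaging: stays at ~1%
```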
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[40] in addition to its URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
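For pages that crawlers cannot discover by following links, a sitemap is straightforward to generate programmatically before submitting it through Google Search Console. A minimal sketch, assuming a hypothetical list of URLs:

```python
# Minimal sketch: build a sitemap.xml for submission via Google Search Console.
# The URLs are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in ("https://example.com/", "https://example.com/hard-to-find-page"):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```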
Then there are local SERPs, which remove almost all need for a website. Local results have been getting more and more aggressively tuned so that you never need to click through to the site, and, in fact, Google has made it harder and harder to find the website in both mobile and desktop versions of local searches. So if you search for "Thai restaurant" and try to find the website of the Thai restaurant you're interested in, as opposed to just information about it in Google's local pack, that's frustratingly difficult. Google is making those local results more and more aggressive and putting them further forward in the results.
The ad auction process takes place every single time someone enters a search query into Google. To be entered into the ad auction, advertisers identify keywords they want to bid on, and state how much they are willing to spend (per click) to have their ads appear alongside results relating to those keywords. If Google determines that the keywords you have bid on are contained within a user’s search query, your ads are entered into the ad auction.
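Google's real auction weighs many signals, but the commonly cited simplification is that ads are ranked by bid multiplied by Quality Score, with each winner paying just enough to beat the ad ranked below. A sketch of that simplified model, with invented bids and scores:

```python
# Simplified model of the ad auction: rank by bid x Quality Score, and each
# advertiser pays just enough to outrank the next ad. Bids/scores are invented.
advertisers = [
    {"name": "A", "bid": 4.00, "quality": 4},
    {"name": "B", "bid": 3.00, "quality": 7},
    {"name": "C", "bid": 6.00, "quality": 2},
]
for ad in advertisers:
    ad["rank"] = ad["bid"] * ad["quality"]

ranked = sorted(advertisers, key=lambda ad: ad["rank"], reverse=True)
for pos, ad in enumerate(ranked, start=1):
    below = ranked[pos]["rank"] if pos < len(ranked) else 0.0
    cpc = below / ad["quality"] + 0.01  # pay just enough to beat the next ad
    print(f"#{pos} {ad['name']}: ad rank {ad['rank']:.0f}, actual CPC ${cpc:.2f}")
```

Note how, in this toy example, advertiser B takes the top spot with the lowest bid, purely on quality.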
So, you have downloaded your link profile as a CSV and now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell which links are bad from the TLD and URL alone. If you are less experienced, you can use tools such as Link Detox (http://www.linkdetox.com/) to complete an analysis of your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to misclassify good and bad links.
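If you want to script that first pass yourself, a rough screen of the exported CSV can flag domains on suspicious TLDs for manual review. The file name, the "domain" column and the TLD watchlist below are assumptions for illustration, and, as warned above, a crude filter like this will misclassify links, so treat its output as a starting point only:

```python
# Rough first-pass screen of an exported link-profile CSV. The file name,
# "domain" column and TLD watchlist are assumptions for illustration only;
# flagged links still need manual review by an experienced SEO.
import csv

SUSPECT_TLDS = {".xyz", ".top", ".info", ".biz"}  # assumed watchlist

with open("link_profile.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = row["domain"].strip().lower()
        if any(domain.endswith(tld) for tld in SUSPECT_TLDS):
            print(f"review: {domain}")
```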
This is one of the more apparent benefits of organic search traffic. If your website ranks for a relevant keyword, you’ll enjoy regular targeted traffic. This consistent flow of traffic will open the floodgates for new leads and customers. That alone is a sweet deal. But here’s the thing about an organic audience: You have no choice but to create a rich content experience.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
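For context, keyword density is simply the share of a page's words taken up by a given term, which is exactly why it was so easy for webmasters to inflate. A throwaway calculation on made-up text:

```python
# Keyword density = occurrences of the term / total words on the page.
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

sample = "cheap widgets cheap widgets buy cheap widgets today"
print(f"{keyword_density(sample, 'widgets'):.0%}")  # 38%
```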
Updates are rolled out in the SEO world from time to time to ensure that users see only the best search engine results for their queries. Because of such frequent changes, however, your website’s position in the organic search results can be affected, and you may sometimes lose rankings that you built up over a long period of time.
James is an e-commerce consultant and owner of Digital Juggler, an e-commerce and digital marketing consultancy helping retailers develop, execute and evolve e-commerce strategies and optimise their digital channel. With a background as a Head of E-commerce, and agency-side as a Head of Client Development, he has experienced life on both sides of the fence. He has helped companies like A&N Media, Sweaty Betty and Smythson to manage RFP/ITT proposals, and has been lead consultant on high-profile projects for Econsultancy, Salmon and Greenwich Consulting. He is a guest blogger for Econsultancy, for whom he also writes best practice guides, regularly contributes to industry events and co-hosts #ecomchat, a weekly Twitter chat for e-commerce knowledge sharing. For e-commerce advice and support, connect with James on LinkedIn and Twitter.
RankBrain can have an impact on your keyword campaigns. When you are defining your keywords, you are looking for relevant terms that respond to customers’ queries. This is how a successful SEO strategy works. And logically, if you are picking keywords your audience is not searching for, your campaign will have no chance of succeeding. This is where RankBrain can play a role.

If you’ve seen ads appear at the top or right of search engines, then you already know what SEM is. When your ads display in these optimal, highly visible positions, your business can have an edge over your competition. Plus, most searchers only click on the first few results, which gives these premium ad spots a visual advantage to catch the attention of your future customers.


An effective way to improve your SEO analysis is to measure the sources of your visitors and leads. By doing this, you will understand how impactful your tactics and strategy truly are. Using tools such as Google Analytics and Adobe Omniture, you can learn about your traffic sources, which is helpful in judging whether your SEO optimizations are effective. For example, are your users entering your website through your social media posts, or are they finding your content through organic search? Or maybe you’ll find that your paid marketing tactics are more effective than you’d imagined. The bottom line is that tracking your visitors and leads can provide many insights for both your paid and organic marketing strategies.
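As a concrete illustration, here is a rough share-of-traffic breakdown computed from a CSV export of analytics data. The file name and the "channel" and "sessions" columns are assumptions about your export, not a fixed Google Analytics format:

```python
# Rough share-of-traffic breakdown from an analytics CSV export.
# "analytics_export.csv" and its "channel"/"sessions" columns are assumed.
import csv
from collections import Counter

sessions_by_channel = Counter()
with open("analytics_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        sessions_by_channel[row["channel"]] += int(row["sessions"])

total = sum(sessions_by_channel.values())
for channel, sessions in sessions_by_channel.most_common():
    print(f"{channel}: {sessions} sessions ({sessions / total:.1%})")
```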

Organic is different. Matching keywords to user intent means you may be present in many searches. The user may find you consistently, and once they get to your site, they are more likely to stay. Organic users are still your best long-term customers. In my experience, they have lower bounce rates and more pages visited, and they are more likely to return.