Nice post. I was wondering if all this content of your strategy was written in the blog of the site, or if you added content in some other specific parts of the site. I don't believe 100% in the strategy of removing links. If Google just penalized you based on your inbound links, it would be so easy to attack your competitors just by buying dirty link packages targeting their sites.
Good question. Most of the directories I use ask for a mobile number so they can send a verification message. For the ones which phone you for verification, inform the company beforehand so their customer service people are ready. I know the bigger the company, the trickier these things get; you just have to find out what works best for answering the calls, even if they give you a direct number to use.
Together is better! For many businesses, the best approach is a mix of both organic and paid search results. The advantage of this approach is that organic rankings give a business credibility and evergreen search results, while paid search (PPC) provides immediate top-of-the-page listings and greater click-through rates, and, in turn, sales, when consumers are ready to purchase.
Well, yes and no. Sure, you can get hit with an algorithm change or penalty that destroys all your traffic. However, if you have good people who know what they are doing, this is not likely to happen, and if it does, it is easy (in most cases) to get your visits back. Panda and Penguin are another story, but if you get hit by those it is typically not accidental.
Organic search is extremely important for online retailers, as many studies suggest it drives around 50% of website traffic. When it comes to search engine optimization for eCommerce, marketers get obsessive about testing all methods available to them to try and achieve higher rankings. The logic is very simple: higher positions on SERPs automatically result in higher impressions and a significantly better click-through rate. More people visiting your store should ultimately translate into better conversions and higher revenues.  
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
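The silo linking rules above can be sketched in a few lines. This is a minimal illustration, not a tool the post describes; the URLs and the `silos` structure are hypothetical, chosen just to show the three link directions (landing page down to subpages, subpages back up, and subpages across to their siblings):

```python
# Hypothetical silo: a landing page mapped to its subpages.
silos = {
    "/services/": [
        "/services/seo-audit/",
        "/services/link-building/",
        "/services/content-strategy/",
    ],
}

def internal_links(silos):
    """Return (from_page, to_page) pairs implementing the silo rules."""
    links = []
    for landing, subpages in silos.items():
        for page in subpages:
            links.append((landing, page))          # landing links down to subpage
            links.append((page, landing))          # subpage links back up to landing
            for sibling in subpages:
                if sibling != page:
                    links.append((page, sibling))  # cross-link between siblings
    return links

links = internal_links(silos)
```

With 3 subpages this yields 12 internal links: 3 down, 3 up, and 6 sibling-to-sibling, which matches the "at least 3-5 quality subpages, linked among themselves and back up" guideline.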
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Here users can find what services are around, or where to buy a particular product. Meanwhile, local searches provide instant information and specific data on customers' needs, such as a telephone number, the address of a company, or its public opening hours. Also, do not forget the smartphone as a tool to find information anywhere: 77% of smartphone users use their device to find information about their immediate environment.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognised term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[38] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the usage of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[31]
Thanks for the comment, Slava. Good to see your team is on top of things, and happy you liked the post. The website in the case listed was a client who had taken on an agency doing lower-quality SEO work, which was hurting the site: a huge link network and a strategy that revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to yield great results from the start. The client's site has hundreds of high-quality articles, which we were able to re-optimize and update as noted. Further, they had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy that could run to hundreds of points. I just wanted to include some of the biggest wins and the easiest points to implement.
Cost and Sustainability. While it's not totally free and easy, SEO can be more cost-effective in the long run, as it gives you the targeted, relevant results you are aiming for in your business. As for sustainability, even with the smallest budget you can still develop a plan that delivers quality results and keeps the traffic your business is getting.
Essentially, what distinguishes direct from organic traffic today is tracking. According to Business2Community, direct traffic is composed of website visits which have “no referring source or tracking information.” A referring source can be a search engine, or it can be a link from another website. Direct traffic can include visits that result from typing the URL directly into a browser, as the simple definition suggests.
Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "search engine marketing" was popularized by Danny Sullivan in 2001[12] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.
When the topic of SEO vs SEM arises, some experts may argue that SEO is the best way to go as it offers higher quality leads at a cheaper cost when compared to SEM. However, it isn’t so simple. Every business is different and has unique needs. For example, your small business may not have a big ad budget and it may also lack the resources needed for doing effective SEO.
While you are updating your website's copy, you'll also want to implement changes in your HTML – specifically, in your H1 and H2 heading tags, as well as your title tags, meta descriptions and URLs. You'll want to put your most relevant keywords in these sections of your website HTML. Search engines take into account the words in these sections of your website's HTML when listing relevant webpages in a search result.
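One way to act on this is to audit which text actually sits in those HTML sections before editing them. Below is a minimal sketch using Python's standard-library `html.parser`; the sample page and keyword are hypothetical, and a real audit would fetch and parse full pages:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collects H1/H2 heading text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.headings = []          # text found inside <h1>/<h2>
        self.meta_description = ""  # content of <meta name="description">
        self._in_heading = False

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self._in_heading = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name") == "description":
                self.meta_description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("h1", "h2"):
            self._in_heading = False

    def handle_data(self, data):
        if self._in_heading:
            self.headings.append(data.strip())

# Hypothetical page fragment for illustration.
page = ('<h1>Organic Coffee Beans</h1>'
        '<meta name="description" content="Buy organic coffee beans online.">')
audit = HeadingAudit()
audit.feed(page)
```

Checking whether a target keyword (e.g. "organic coffee") appears in `audit.headings` and `audit.meta_description` tells you which pages still need their copy updated.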
Keyword difficulty is a number that lets you know how difficult it will be to rank for a certain keyword. The higher the number, the more difficult it will be to rank on that keyword. There are a few sites online that will tell you keyword difficulty of a word or phrase. Record these numbers in your Excel document or Google Sheet that you made earlier.
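A spreadsheet like the one described can also be assembled programmatically. This is a small sketch with made-up difficulty scores (a real workflow would pull the numbers from a keyword tool); it sorts easiest-to-rank keywords first and writes the standard keyword/difficulty columns as CSV, which Excel and Google Sheets both import directly:

```python
import csv
import io

# Hypothetical difficulty scores (0-100) from a keyword research tool.
keywords = [
    ("buy running shoes", 72),
    ("best trail running shoes for beginners", 34),
    ("running shoes", 89),
]

# Lower difficulty first: those are the easier targets to rank for.
by_difficulty = sorted(keywords, key=lambda kw: kw[1])

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["keyword", "difficulty"])
writer.writerows(by_difficulty)
sheet = buf.getvalue()  # paste or save this into your sheet
```

Note how the long-tail phrase sorts to the top: longer, more specific queries typically carry lower difficulty scores than broad head terms.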
This is both a challenge and a great opportunity. It is a challenge because the organic spots aren't what they used to be: there used to be ten organic spots on the first page to compete for, and only five above the fold (which refers to the vertical limitations of a user's screen and the amount of content one can view without scrolling). Now there might be local search results, news releases, images and video included in the results, many of those above the fold. What's the new number one spot? Is it the first local result, news release, or organic listing?
Search Engine Marketing or SEM encompasses the steps taken to increase relevant traffic to your website, through higher rankings on search engines. Traditional SEM is made up of two processes: “organic” search engine optimization (SEO) and pay-per-click advertising (PPC) (or cost-per-click (CPC)). However, the field of SEM is a changing and expanding field thanks to constant new developments, such as:
Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) primarily through paid advertising.[1] SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages to enhance pay per click (PPC) listings.[2]
Click-through rates: For searches using terms that denote high purchase intent, such as product- or brand-specific keywords, paid ads will get more clicks than organic results. The advantage of paid search can clearly be seen in the MarketLive Performance Index data for Internet retailers. For the year 2013 as a whole, PPC accounted for 36.5% of search traffic but an outsized 47.9% of revenue from search.
Though a long break is never suggested, there are times that money can be shifted and put towards other resources for a short time. A good example would be an online retailer. In the couple of weeks leading up to the Christmas holidays, you are unlikely to get more organic placement than you already have. Besides, the window of opportunity for shipping gifts to arrive before Christmas is ending, and you are heading into a slow season.