A drop in traffic can also be caused by your site losing links. You may see a direct loss of referral traffic, but there can be indirect effects too. When your site loses inbound links, it signals to Google that your site isn't as authoritative anymore, which leads to lower search rankings and, in turn, to further traffic drops (fewer people find your site when it no longer ranks as highly).

Go to local events or Meetup events and connect with bloggers in your industry. An example of an event I run to connect with bloggers and people in the online marketing world is: http://www.meetup.com/Online-Marketing-Sydney/. Make friends first and then try to gain guest posts later. I am not really a fan of websites that are flooded with guest posts one after another; it is the type of thing Google is just waiting to target.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Keyword research is one of the most important things you need to do before producing any content. Understanding what your target audience is searching for, and what content to produce for them, is vital to getting your content found, read, and shared. In this session, Sameen shares how to do keyword research and her strategies for finding the best keywords for your brand.
It is important to target keywords that your target consumer is likely to search for while looking for the product or service that you offer. For example, if you are an accounting firm located in Miami, you would not want to target a general keyword such as “accounting firm.” Not only is it a very competitive keyword, but it will also attract visitors from all over the globe. Instead, you would want to target a more precise keyword such as “accounting firm in Miami” or “Miami accounting firm.”
This is a crucial area. If you do not have schema markup and rel="author", you are costing your business money. It is as simple as that. As an example, say I want to make spaghetti (pasta) for dinner. I search for “Spaghetti Recipe” and instantly see some great markup in play, while one competitor has no markup and no rel="author"; they are losing business in my eyes. Wouldn't you agree?
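For illustration, here is a minimal sketch of the kind of markup involved; the author link, recipe name, and rating values below are hypothetical placeholders, not taken from any real page:

```html
<!-- Hypothetical authorship link: rel="author" pointing to an author profile page -->
<link rel="author" href="https://www.example.com/about-jane" />

<!-- Minimal Recipe structured data in JSON-LD (all values are placeholders) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Spaghetti",
  "author": { "@type": "Person", "name": "Jane Example" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "213"
  },
  "cookTime": "PT25M"
}
</script>
```

Markup along these lines is what feeds the rich results (star ratings, cooking times) that make one recipe listing stand out over another.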

Your site is GOOD! Well-written, informative, authentic. You live it; you write it. You are spot-on for what your "typical reader" needs/wants. That authenticity makes a difference. Yes, I've surfed around other low-carb diet sites, but I feel like they are just "spouting" at me. Your posts are written as if we're sisters or best friends and you're talking WITH me.
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[47] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[47] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element[48] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
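As a simple illustration of that last point (the URL is a placeholder), a canonical link element in the page's head declares which version of the page should receive the link credit:

```html
<!-- Hypothetical example: points duplicate URLs at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```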
I feel we can also focus a lot on the kind of keywords we target. I had a client in a very competitive marketplace; we optimized their site for some really targeted, long-tail keywords that didn’t have very high search volume. Traffic didn’t go up drastically, but the conversions and CTRs the site received were incredible.
It means that every piece of content that leads searchers to you is extending your brand equity. Not only that, you’re creating multiple touch points, so potential customers have every opportunity to discover your business. It takes an average of 6-8 touch points with a brand before someone becomes “sales-ready.” Too many? Well, for some industries, it’s way more. One woman’s car-buying journey took 900 digital touch points spanning three months.
Paid search is a great option for anyone targeting transactional queries. The people behind these types of queries have already researched and decided what they want, and are often one click away from getting their credit cards out. In fact, these “high commercial intent” searches for product or brand-specific keyphrases receive more clicks via paid ads than organic results, by a margin of nearly 2:1. It is worth noting, however, that 94 per cent of web users prefer organic results to paid results, so it is best not to put all of your eggs in one basket.
Another one of the benefits of SEM is that people who see your PPC ads are those most likely to want to buy your product or service. PPC ads require you to choose a geographic location and specific search queries to target. As a result, you can be sure that anyone who clicks on your ad is not arbitrarily surfing the web but is actively looking for your product or service and in a position to buy.
Then there are local SERPs, which remove almost all need for a website. They have been getting more and more aggressively tuned so that you never need to click through to the website, and, in fact, Google has made it harder and harder to find the website in both mobile and desktop versions of local searches. So if you search for "Thai restaurant" and you try to find the website of the Thai restaurant you're interested in, as opposed to just information about it in Google's local pack, that's frustratingly difficult. Google is making those local packs more and more aggressive and putting them further forward in the results.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
As pointed out, they are certainly not the same, but it might not be a bad idea to track and report on direct traffic. If outreach has been done and the company is mentioned in print with a URL, direct traffic (along with some search traffic on the URL or business name itself) is likely to go up. If your email newsletters are not tagged, they're likely to show up as direct traffic. Depending on your role, some of what you do under the greater SEO/inbound marketing umbrella can show up as direct traffic.
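For instance, tagging newsletter links with the standard UTM query parameters (the values below are hypothetical) lets your analytics attribute those visits to email instead of lumping them into direct traffic:

```html
<!-- Hypothetical newsletter link tagged with standard UTM parameters -->
<a href="https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=march-digest">
  Read the full story
</a>
```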
The hreflang tag works in a similar way to a canonical tag, which shows when a duplicate version of a page exists. But in this case, it helps Google index local content more easily within local search engines. It also helps pass trust across sites and improves the way Google crawls these pages. To learn more about the errors you should avoid when it comes to hreflang tags, you can check our previous article.
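As a sketch (the domains and language-region codes are placeholders), hreflang annotations in the head of each regional page reference every version, including themselves:

```html
<!-- Hypothetical hreflang set: each regional page lists all versions -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/au/" />
<!-- Fallback for users whose language/region doesn't match any version -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```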
Increase your real estate. When you show up in the first page of the organic AND paid search results, you’re doing something right. That’s not just something the marketing nerds like us will notice either – people realize (consciously or not) that it takes a successful, legitimate, intelligent company to show up twice in one search. A company that doesn’t know what they’re doing couldn’t do that… right?
The thing about SEO in 2018 is that Google changes its algorithms more than once a day! Reports say that the company changes its algorithms up to 600 times a year. While the majority of those updates consist of smaller changes, among them is the occasional, major update like Hummingbird or Panda that can really wreak havoc with your traffic and search rankings.
Look at your short- and long-term goals to choose whether to focus on organic or paid search (or both). It takes time to improve your organic search rankings, but you can launch a paid search campaign tomorrow. However, there are other considerations: the amount of traffic you need, your budget, and your marketing objectives. Once you’ve reviewed the pros and cons, you can select the search strategy that’s right for you.
As you can see by the example above, organic marketing tactics are natural, authentic, and value-based. The benefits that businesses reap from these tactics are cultivated over longer periods of time, ultimately helping you build a brand that customers want to visit regularly. And although you’re doing less “pushing” with organic marketing, when done right you can still generate a steady stream of loyal traffic. Remember, this is about building high-value brand equity and industry authority with your customers. You’re not just attempting to drive sales here; you’re working hard to engage and educate your target audiences.  
Your keyword research will determine whether your search optimization effort succeeds or fails. Many businesses make the mistake of basing their keywords on search volume alone. This often leads to attempting to rank for keywords that are very difficult and costly to move up, or keywords that aren’t “buyer” keywords and just send useless traffic to the website.

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
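To illustrate, here is a minimal sketch of both mechanisms; the paths are hypothetical placeholders. A robots.txt in the site root can keep compliant crawlers out of internal search results and cart pages:

```
# Hypothetical robots.txt: the rules below apply to all crawlers
User-agent: *
# Keep crawlers out of internal search results and cart pages
Disallow: /search/
Disallow: /cart/
```

And a page-level robots meta tag can exclude an individual page from the index:

```html
<!-- Ask search engines not to index this specific page -->
<meta name="robots" content="noindex">
```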
Organic is different. Matching keywords to user intent means you may be present in many searches. The user may find you consistently, and once they get to your site, they are more likely to stay. Organic users are still your best long-term customers. In my experience, they have lower bounce rates and more pages visited, and they are more likely to return.