First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this quite general so everyone can adapt this case study to their own situation.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt file located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
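To make this concrete, here is a minimal sketch of both mechanisms; the directory names below are hypothetical examples, not taken from any particular site. The robots.txt directives tell crawlers which paths not to crawl, while the robots meta tag keeps an individual page out of the index:

    User-agent: *
    Disallow: /cart/
    Disallow: /internal-search/

    <!-- placed in the <head> of a page that should not be indexed -->
    <meta name="robots" content="noindex">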

In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article. Well, all you need to do is send out a request to a press service and wait for someone to get back to you.
The advantage of paid search is that you can have your website listed in a prominent spot on the first page of Google and other search engines. However, showing up is only part of the process. You need to create an ad that not only leads to clicks, but to sales or whatever result you're looking for. If you don't know what you're doing, it's possible to write an ad that people are drawn to and click on but that never turns into sales. Since you pay per click, and clicks can add up quickly, you can lose money.
Those who communicate value attract others who seek that value in that topic. Those who are in the know have an edge over those who don’t, and that’s valuable. It helps people get more of what they want and like, and reduces their risk and negative experiences. In business, it’s profitable to have more of the right information than your competitors ahead of time (as long as the information is obtained and used legally, to avoid the fines and reprimands that come with practices like insider trading). Having valuable information first means you can make moves and investments before competitors, to buy cheaper and/or sell higher.
James, you give a great template for how a business needs to move forward in their chosen niche online.  Quite informative and the meeting of minds has been something a number of us have done online and in person to gain better insight into our small similar businesses.  Thank you for sharing your detailed approach to increasing organic traffic...content still is king.
Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more for keywords beginning with "how to".
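As a quick illustration (the step wording below is hypothetical, not from any specific post), the headings of a how-to article might be marked up like this so Google can pull them into a list-style Featured Snippet:

    <h2>Step 1: Research your keywords</h2>
    <h2>Step 2: Outline the article</h2>
    <h2>Step 3: Write and optimize the content</h2>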
A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.
(function(){"use strict";function s(e){return"function"==typeof e||"object"==typeof e&&null!==e}function a(e){return"function"==typeof e}function u(e){X=e}function l(e){G=e}function c(){return function(){r.nextTick(p)}}function f(){var e=0,n=new ne(p),t=document.createTextNode("");return n.observe(t,{characterData:!0}),function(){t.data=e=++e%2}}function d(){var e=new MessageChannel;return e.port1.onmessage=p,function(){e.port2.postMessage(0)}}function h(){return function(){setTimeout(p,1)}}function p(){for(var e=0;et.length)&&(n=t.length),n-=e.length;var r=t.indexOf(e,n);return-1!==r&&r===n}),String.prototype.startsWith||(String.prototype.startsWith=function(e,n){return n=n||0,this.substr(n,e.length)===e}),String.prototype.trim||(String.prototype.trim=function(){return this.replace(/^[\s\uFEFF\xA0]+|[\s\uFEFF\xA0]+$/g,"")}),String.prototype.includes||(String.prototype.includes=function(e,n){"use strict";return"number"!=typeof n&&(n=0),!(n+e.length>this.length)&&-1!==this.indexOf(e,n)})},"./shared/require-global.js":function(e,n,t){e.exports=t("./shared/require-shim.js")},"./shared/require-shim.js":function(e,n,t){var r=t("./shared/errors.js"),i=(this.window,!1),o=null,s=null,a=new Promise(function(e,n){o=e,s=n}),u=function(e){if(!u.hasModule(e)){var n=new Error('Cannot find module "'+e+'"');throw n.code="MODULE_NOT_FOUND",n}return t("./"+e+".js")};u.loadChunk=function(e){return a.then(function(){return"main"==e?t.e("main").then(function(e){t("./main.js")}.bind(null,t))["catch"](t.oe):"dev"==e?Promise.all([t.e("main"),t.e("dev")]).then(function(e){t("./shared/dev.js")}.bind(null,t))["catch"](t.oe):"internal"==e?Promise.all([t.e("main"),t.e("internal"),t.e("qtext2"),t.e("dev")]).then(function(e){t("./internal.js")}.bind(null,t))["catch"](t.oe):"ads_manager"==e?Promise.all([t.e("main"),t.e("ads_manager")]).then(function(e){undefined,undefined,undefined,undefined,undefined,undefined,undefined,undefined,undefined,undefined}.bind(null,t))["catch"](t.oe):"publisher_dashboard"==e?t.e("publisher_dashboard").then(function(e){undefined,undefined}.bind(null,t))["catch"](t.oe):"content_widgets"==e?Promise.all([t.e("main"),t.e("content_widgets")]).then(function(e){t("./content_widgets.iframe.js")}.bind(null,t))["catch"](t.oe):void 0})},u.whenReady=function(e,n){Promise.all(window.webpackChunks.map(function(e){return u.loadChunk(e)})).then(function(){n()})},u.installPageProperties=function(e,n){window.Q.settings=e,window.Q.gating=n,i=!0,o()},u.assertPagePropertiesInstalled=function(){i||(s(),r.logJsError("installPageProperties","The install page properties promise was rejected in require-shim."))},u.prefetchAll=function(){t("./settings.js");Promise.all([t.e("main"),t.e("qtext2")]).then(function(){}.bind(null,t))["catch"](t.oe)},u.hasModule=function(e){return!!window.NODE_JS||t.m.hasOwnProperty("./"+e+".js")},u.execAll=function(){var e=Object.keys(t.m);try{for(var n=0;n=c?n():document.fonts.load(l(o,'"'+o.family+'"'),a).then(function(n){1<=n.length?e():setTimeout(t,25)},function(){n()})}t()});var w=new Promise(function(e,n){u=setTimeout(n,c)});Promise.race([w,m]).then(function(){clearTimeout(u),e(o)},function(){n(o)})}else t(function(){function t(){var 
n;(n=-1!=y&&-1!=v||-1!=y&&-1!=g||-1!=v&&-1!=g)&&((n=y!=v&&y!=g&&v!=g)||(null===f&&(n=/AppleWebKit\/([0-9]+)(?:\.([0-9]+))/.exec(window.navigator.userAgent),f=!!n&&(536>parseInt(n[1],10)||536===parseInt(n[1],10)&&11>=parseInt(n[2],10))),n=f&&(y==b&&v==b&&g==b||y==x&&v==x&&g==x||y==j&&v==j&&g==j)),n=!n),n&&(null!==_.parentNode&&_.parentNode.removeChild(_),clearTimeout(u),e(o))}function d(){if((new Date).getTime()-h>=c)null!==_.parentNode&&_.parentNode.removeChild(_),n(o);else{var e=document.hidden;!0!==e&&void 0!==e||(y=p.a.offsetWidth,v=m.a.offsetWidth,g=w.a.offsetWidth,t()),u=setTimeout(d,50)}}var p=new r(a),m=new r(a),w=new r(a),y=-1,v=-1,g=-1,b=-1,x=-1,j=-1,_=document.createElement("div");_.dir="ltr",i(p,l(o,"sans-serif")),i(m,l(o,"serif")),i(w,l(o,"monospace")),_.appendChild(p.a),_.appendChild(m.a),_.appendChild(w.a),document.body.appendChild(_),b=p.a.offsetWidth,x=m.a.offsetWidth,j=w.a.offsetWidth,d(),s(p,function(e){y=e,t()}),i(p,l(o,'"'+o.family+'",sans-serif')),s(m,function(e){v=e,t()}),i(m,l(o,'"'+o.family+'",serif')),s(w,function(e){g=e,t()}),i(w,l(o,'"'+o.family+'",monospace'))})})},void 0!==e?e.exports=a:(window.FontFaceObserver=a,window.FontFaceObserver.prototype.load=a.prototype.load)}()},"./third_party/tracekit.js":function(e,n){/**

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files, and on to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log file analyzing tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator because each one tests, highlights, and reports on slightly different aspects of your website.
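As a rough sketch of what a page tag looks like (the analytics domain and parameter below are made-up placeholders, not part of any of the tools named above), the tag is either a small script that loads the vendor's tracking code or a 1x1 image whose request tells the analytics server that the page was viewed:

    <!-- script-based tag: loads the vendor's tracking code, which can also record clicks and conversions -->
    <script src="https://analytics.example.com/tag.js" async></script>

    <!-- image-based tag: the request for this 1x1 pixel records the page view -->
    <img src="https://analytics.example.com/pixel.gif?page=pricing" width="1" height="1" alt="">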
Companies with stronger SEO efforts aimed at increasing performance emphasised that developing and updating content on the website is a relatively easy task to perform. However, they start to get lost when it comes to managing keywords and building external links, as those tactics are more difficult to perform. Blogging and the integration of social media were mentioned as simple enough tasks to perform in-house.
We are an experienced and talented team of passionate consultants who live and breathe search engine marketing. We have developed search strategies for clients ranging from leading brands to small and medium-sized businesses across many industries in the UK and worldwide. We believe in building long-term relationships with our clients, based upon shared ideals and success. Our search engine marketing agency provides the following and more:
Great analysis and tips, Rand. Your first tip was to focus on branded search... but when it comes to small-to-medium businesses, there's a limit on how much branded search demand one can produce. Are you saying that Google search will eventually become an engine made only for entities/businesses we already know of? Where does that leave the whole discovery process?
In 2014, Cisco stated that video made up 64% of all internet traffic. In 2015, Searchmetrics released a white paper reporting that 55% of all keyword searches in the U.S. return at least one video blended into Google’s web search results, and that 8 out of 10 of those videos belonged to YouTube. And in 2016, Cisco projected that online video would account for more than 80% of all consumer internet traffic by 2020.
I'm having a problem that I suspect many marketers share. Quite simply … SEO or just buy the traffic? I noticed that you switched to SEO because you like the passive income component. But when I consider ALL the work and ongoing moving parts to SEO .. visions of the hamster on a treadmill appear in place of couch potato cash. Have you noticed that there is always something new to do … now it's Google+ etc., and "more to do" is surely on its way. It's reached the point where it's mind numbing.
It means that every piece of content that leads searchers to you is extending your brand equity. Not only that, you’re creating multiple touch points, so potential customers have every opportunity to discover your business. It takes an average of 6-8 touch points with a brand before someone becomes “sales-ready.” Too many? Well, for some industries, it’s way more. One woman’s car-buying journey took 900 digital touch points spanning three months.
Delivering business transformation is an incredibly complex task for IT. Keeping the lights on while supporting digital transformation initiatives requires a new era of hybrid IT so IT teams can address the needs of not just IT operations staff, but also application developers and LOB executives. Through in-depth interviews of IT operations and LOB staff, IDC shares how …

When a search engine user in the targeted area searches for the keywords or keyphrases that you chose, your ad enters an immediate online auction. Your ad is displayed when it matches bid and relevancy criteria, so you want to make sure that you have an appropriate budget size, and that you are bidding on keyphrases relevant to your products/services (such as those indicated on your website or landing page). You are not charged when your ad is displayed, but rather when someone clicks on your ad to take further action.
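To make the cost side concrete with purely hypothetical numbers: if your average cost per click is $2 and your daily budget is $50, you can receive at most around 25 clicks per day; if roughly 4% of those clicks convert, that is about one conversion per day, so your bid and budget need to be sized against what a conversion is actually worth to you.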
Internet marketing isn’t like the confused-shopper experience of holding an organic and a non-organic apple in your hand, wondering which one is truly better. Combining organic search with paid search is a powerful one-two punch that increases traffic, generates leads, and converts window shoppers into loyal, repeat customers.

If we can do that effectively, it is in the best interest of the search engine to serve our content to our target audience for us, providing highly targeted visitors to our pages at no incremental cost. This is the secret to the organic channel, and why organic search listings are so valuable. Other channels often come with additional cost for each new visitor provided.
Organic-search traffic is the only search traffic you should be interested in. This is the traffic you get without any direct action on your part. It’s not something you can achieve with ease, though. The statistics say that 83 percent of organic clicks go to the first four results under most categories on search engines. So, to see your site attain that status, lay the foundations well in advance for the traffic you want.
The idea behind this is not only to achieve more traffic, but to obtain more qualified traffic to your website: traffic that arrives at your website with the purpose of purchasing a product or service. There are many agencies and budgets available, so when choosing SEO services, make sure they cater to your needs and strike the right balance between budget and objectives.
Organic social media is anything that happens on social media without paid promotion. When you post as your page but don’t put any money behind this post to “boost” it, you are creating an organic post. If you comment on a business’s post in your news feed, and the “Sponsored” tag does not appear on the post, that action qualifies as organic. In other words, organic actions are those that occur outside of ads.
Marketing strategies have evolved over the course of time. In the past, organic search, when done right, would give your business the exposure it needed. Apart from that, it used to give you a better deal at a lower cost compared to pay per click. However, with the digital age maturing and small businesses becoming more common online, they are now mostly leaning towards the fast-paced results of paid search.
In addition to helping you find keywords you should be bidding on, thorough keyword research can also help you identify negative keywords – search terms that you should exclude from your campaigns. Negative keywords aren’t terms with negative connotations, but rather irrelevant terms that are highly unlikely to result in conversions. For example, if you sell ice cream, you might want to exclude the keyword “ice cream recipes”, as users searching for ice cream recipes are unlikely to be in the market for your product.
High organic rankings for specific terms represent an opportunity to double down by taking up more real estate with paid results. When your brand is still establishing a presence for a specific keyword in organic results, you can use paid ads to get your brand some visibility on the SERP. Once you start ranking well organically for that keyword, you may choose to allocate your paid budget to another targeted opportunity and deploy a similar strategy.
The typical Web user might not realize they’re looking at apples and oranges when they get their search results. Knowing the difference enables a searcher to make a better-informed decision about the relevancy of a result. Additionally, because the paid results are advertising, they may actually be more useful to a shopping searcher than to a researcher (as organic results tend to favor research content).