Paid social can help amplify organic content by using social networks' advertising tools to target a defined audience. Using the rugby example, on Facebook you could target people who like other leading rugby fan pages. I recommend testing paid social campaigns to promote key content assets, such as reports, and to highlight important news and announcements. Even with a small budget you can quickly measure the amplification impact.
When you're truly faced with the dilemma of organic vs. paid search traffic, don't get too worked up. There's always a solution for your specific situation – trust us, we deal with it all the time! Every business is different and has different goals, so the right search marketing mix for your company does exist. First, think through each of these bullet points, and at the end ask yourself, "Am I willing to commit?"

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[49] White hat techniques tend to produce results that last a long time, whereas black hat practitioners anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[50]

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters, to tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. Three major tools are used by EBSCO: (a) a log-file analysis tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues, and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
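To make the page-tagging idea concrete, here is a minimal sketch of how a tag works under the hood. This is not the actual Hitbox or WebTrends tag; the `analytics.example.com/collect` endpoint and the function names are assumptions for illustration. The core idea is simply to serialize an event into a query string and request a 1x1 image carrying that data:

```javascript
// Build the beacon URL by serializing event data into a query string.
function buildBeaconUrl(endpoint, event) {
  const params = new URLSearchParams(event);
  return endpoint + "?" + params.toString();
}

// Fire the tag: in a browser, assigning Image.src issues the request,
// so the analytics server logs the event. Guarded so the sketch also
// runs outside a browser (where Image is undefined).
function sendPageTag(event) {
  const url = buildBeaconUrl("https://analytics.example.com/collect", event);
  if (typeof Image !== "undefined") {
    new Image().src = url;
  }
  return url;
}
```

For example, `sendPageTag({ page: "/rugby-report", event: "pageview" })` would request the pixel with those fields encoded in the URL, which is how tag-based tools capture actions without access to server log files.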

A meta description is a short blurb about a particular page of your website. It is an easy place to insert keywords, but you also want to include helpful information that draws potential visitors into clicking through to your website. This blurb appears in search engine results pages beneath your page's title and URL.
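As a sketch of where this lives in your markup (the title and description text here are placeholders, not recommendations):

```html
<head>
  <title>Rugby Fan Gear &amp; News | Example Store</title>
  <!-- The meta description: a short, keyword-aware blurb shown in search results -->
  <meta name="description"
        content="Shop rugby jerseys and gear, and catch up on the latest rugby news and match highlights.">
</head>
```

Keeping the description to roughly 150-160 characters helps avoid it being truncated in results pages.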
I feel we can also focus more on the kind of keywords we target. I had a client in a very competitive marketplace; we optimized their site for some highly targeted, long-tail keywords that didn't have very high search volume. The traffic didn't rise drastically, but the number of conversions and the CTRs the site received were incredible.

Now, it is not that these sites are not interested in Google users. In fact, they have hired us to help them increase their share. However, they are getting so much traffic from sites like Facebook that it seems there is less urgency about attracting this traffic and less willingness to change the site to meet organic standards. Not long ago, sites would urgently and unquestioningly abide by Google’s standards to court that traffic.