I am too late in commenting on this article. I wanted to read about how much organic traffic SEO can bring, found your article at the top, and it is really very interesting. James Norquay, you did good research. I think nowadays Google blocks most SEO activities. Is this worthwhile for the current marketing scenario? If you have any other posts on strategies for increasing organic traffic, please refer me to them.
Sometimes considered to be a part of SEM, social media sites like Twitter, YouTube, Facebook, and Delicious have search fields and also pass authority to sites through links. Making sure your content and links are placed (where necessary) on these social media sites can increase your influence in user search engine queries. SMM is a rapidly growing area of Internet marketing but to discuss it further is beyond the scope of this Guide.
Search engine marketing is also managed through contextual advertising. Here, marketers place ads on other sites or portals that carry information relevant to their products, so that the ads enter the field of vision of browsers who are seeking information from those sites. A successful SEM plan captures the relationships among information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but in recent years the use of search engines for accessing information has become vital to increasing business opportunities.[31] The use of strategic SEM tools can help businesses such as tourism attract potential consumers to view their products, but it can also pose various challenges.[32] These challenges include the competition companies face within their own industry and from other sources of information that could draw the attention of online consumers.[31] To combat these challenges, the main objective for businesses applying SEM is to achieve and maintain as high a ranking as possible on SERPs so that they gain visibility. Search engines, in turn, keep adjusting their algorithms and shifting the criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers.[31] Understanding these marketing strategies can strengthen the relationship among information searchers, businesses, and search engines.
It means that every piece of content that leads searchers to you is extending your brand equity. Not only that, you’re creating multiple touch points, so potential customers have every opportunity to discover your business. It takes an average of 6–8 touch points with a brand before someone becomes “sales-ready.” Too many? Well, for some industries, it’s way more. One woman’s car-buying journey took 900 digital touch points spanning three months.

Get a handle on your brand reputation. Your brand story is the one that you tell. Your reputation is the story that customers tell on your behalf. If someone consistently stumbles on your site when they type in niche search queries, they’ll be intrigued. The result? They’ll start conducting navigational searches for your brand. The intent behind that search? They want reviews and other customers’ experiences with your business. Ask your customers for reviews and reach out to third-party review sites in your niche. This way, these navigational searches don’t come up empty. I also recommend monitoring your brand mentions. The easy way is to set up Google Alerts. Type in your brand name and create your alert. Any mention online and you’ll be notified.


To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to be crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
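The two mechanisms described above can be sketched as follows; the directory paths are hypothetical examples, not requirements:

```
# robots.txt, placed at the root of the domain
# Ask all crawlers to skip the cart and internal-search pages
User-agent: *
Disallow: /cart/
Disallow: /search/
```

For per-page exclusion, the robots meta tag goes in the page's `<head>`, e.g. `<meta name="robots" content="noindex, nofollow">`. Note the difference: robots.txt asks crawlers not to fetch a page at all, while the `noindex` meta tag lets the page be fetched but asks that it be kept out of the index.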


Look to successful brands to see how they keep customers engaged. Do you need to constantly update content, as Home Depot does with its seasonal DIY tips? Or do you need to position yourself as a lifestyle brand, following in Red Bull’s footsteps by running a series of videos on YouTube? Whatever method you adopt, it’s important to have some quantifiable way to measure how successful you are, so keep your ultimate goal in mind as you evaluate your return on investment.
Keyword difficulty is a number that tells you how hard it will be to rank for a certain keyword. The higher the number, the more difficult it will be to rank for that keyword. There are a few sites online that will tell you the keyword difficulty of a word or phrase. Record these numbers in the Excel document or Google Sheet that you made earlier.
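If you prefer to work with your recorded scores in a script rather than a spreadsheet, a minimal sketch looks like this. The keywords and difficulty scores below are hypothetical, standing in for whatever your SEO tool reports:

```python
# Hypothetical keyword-difficulty scores (0-100) copied from an SEO tool.
keywords = {
    "best running shoes": 78,
    "trail running shoes for beginners": 34,
    "running shoe reviews": 51,
}

# Sort ascending so the easiest keywords to rank for come first.
for kw, difficulty in sorted(keywords.items(), key=lambda item: item[1]):
    print(f"{difficulty:3d}  {kw}")
```

Sorting by difficulty makes it easy to target the low-competition phrases first, which is usually where a newer site can actually win rankings.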
Check out the average CPC for some top industries. These are the costs of one click. Mind you, that’s a click that is not guaranteed to convert a customer. With paid campaigns, you have to keep optimizing and testing the ad creative to lower your CPC and increase your CTR. You can imagine that it takes a massive budget to even set a paid campaign in motion. And it requires just as much to keep maintaining it.
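To make the budget point concrete, here is the back-of-the-envelope math with hypothetical figures (your CPC, budget, and conversion rate will differ):

```python
# Illustrative paid-search budget math; all figures are hypothetical.
avg_cpc = 2.50            # average cost per click, in dollars
monthly_budget = 5000.0   # ad spend per month, in dollars
conversion_rate = 0.03    # share of clicks that become customers

clicks = monthly_budget / avg_cpc              # clicks the budget buys
customers = clicks * conversion_rate           # expected conversions
cost_per_acquisition = monthly_budget / customers

print(f"{clicks:.0f} clicks -> {customers:.0f} customers "
      f"at ${cost_per_acquisition:.2f} each")
```

Even with these modest assumptions, each customer costs far more than one click, because most clicks never convert. That gap is what the constant creative testing is meant to shrink.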

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files and to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log-file analysis tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
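The page-tagging approach mentioned above can be as simple as a one-pixel image request; this is a generic sketch with a hypothetical analytics endpoint, not the markup of any particular product:

```html
<!-- Hypothetical tag-based tracking: requesting the 1x1 image logs the page view
     (with the page path passed as a query parameter) on the analytics server. -->
<img src="https://analytics.example.com/pixel.gif?page=/pricing"
     width="1" height="1" alt="">
```

Real tag-based tools typically use a JavaScript snippet instead, so they can also capture screen size, referrer, and on-page actions, but the image beacon is the underlying idea.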
There’s also a lot of evidence around how much having even a few simple images in your LinkedIn Pulse or blog post can really increase engagement. We’re becoming a much more visual society as it is. If you look at every social network, there’s now the ability to do video, to add photos. And so the more that you can capitalize on that, the better.
Organic is what people are looking for; the rest of these simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day. So if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.