
Search engine marketing encompasses a range of activities all centred around making your website more visible when someone uses a search engine. If someone is looking for your business on the internet, it is vital your website appears prominently in the search engines’ results pages, or it will never deliver the value to your business that today’s economy demands.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
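For illustration, here is what that looks like in practice. A minimal robots.txt (the paths shown are hypothetical examples) tells crawlers to skip internal search results and cart pages:

```
# https://www.example.com/robots.txt  (example paths only)
User-agent: *
Disallow: /search    # internal search results
Disallow: /cart      # shopping cart / login-specific pages
```

And the robots meta tag mentioned above keeps an individual page out of the index even if a crawler reaches it:

```html
<!-- Placed in the <head> of a page that should not be indexed -->
<meta name="robots" content="noindex, nofollow">
```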
The social media landscape is constantly evolving. New networks rise to prominence (e.g. Snapchat), new technology increases user participation and real-time content (e.g. Periscope), and existing networks enhance their platform and product (e.g. Facebook, Twitter, Pinterest and Instagram launching ‘buy’ buttons). Organic reach is also shrinking as the leading networks ramp up their paid channels to monetise platform investment.
I am quite late in commenting on this article. I wanted to read about how much organic traffic you can get through SEO, found your article at the top of the results, and found it really interesting. James Norquay, you did good research. I think these days Google blocks most SEO activities. Is this still worthwhile in the current marketing scenario? If you have any other posts on strategies for increasing organic traffic, please refer me to them.

Google has the larger market share by some way (in the UK it holds 98 per cent of the mobile search market and 90 per cent across all platforms), so it’s fair to say there is potential for more eyes on the ad. Bing’s interface is also less swanky than Google’s but, as mentioned, it’s worth giving Bing a shot and putting yourself in two places instead of one.
And then on the flip side with partners, it is a little bit different. Again, you can do that education piece, but ultimately offering reciprocation is going to be your best bet. So find out who the social media manager is for your partner, talk with them, and figure out, ‘Hey, we’re going to retweet you this many times this month, or are there any events you have coming up that we can help promote?’ That reciprocation is very natural. It also builds credibility for both organizations. I think there’s a lot of opportunity to engage with your partners as well as amplify their content.

Within social media, there are a lot of different ways to optimize your targeting organically. While paying for social media marketing can be effective, adjusting the targeting on your posts can boost your organic reach. Within Facebook and Twitter, you can adjust your posts to target the following criteria: gender, relationship status, education level, age, location, language, interests, and post end date. With these targeting attributes available, you can better target your audience so the right people see your content.
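As a rough sketch of what this looks like programmatically, Facebook exposes a feed_targeting parameter when publishing a Page post through the Graph API. The page ID, token, API version and exact field names below are assumptions for illustration and should be checked against the current Graph API documentation; in practice many teams simply set these options in the publishing UI.

```python
import json
import requests

# Hypothetical credentials -- replace with your own Page ID and token.
PAGE_ID = "1234567890"
ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"

# Restrict who is eligible to see this organic post in their feed.
# Field names follow the Graph API's feed_targeting object (assumed here).
feed_targeting = {
    "age_min": 25,
    "age_max": 45,
    "genders": [1, 2],                              # 1 = male, 2 = female
    "geo_locations": {"countries": ["US", "GB"]},   # countries by ISO code
}

resp = requests.post(
    f"https://graph.facebook.com/v19.0/{PAGE_ID}/feed",   # assumed API version
    data={
        "message": "New blog post: how we grow organic traffic",
        "feed_targeting": json.dumps(feed_targeting),
        "access_token": ACCESS_TOKEN,
    },
)
print(resp.json())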


With more and more content being created on Facebook every day, organic reach is steadily declining. That’s why you might want to consider using Facebook’s paid advertising options to promote and increase the reach of your posts. While organic posts only get shown to your own Facebook fans, paid ads allow you to target people who have not liked your page but have similar interests and/or demographics.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
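As a highly simplified sketch of that spider-and-indexer split (a toy illustration, not any particular engine's implementation), the code below downloads a page, extracts its links and words, and records word positions in a small inverted index. It assumes the third-party requests and beautifulsoup4 packages are installed.

```python
from collections import defaultdict
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(url):
    """Spider step: download the page and pull out its links and text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
    words = soup.get_text(separator=" ").lower().split()
    return links, words

def index_page(url, words, inverted_index):
    """Indexer step: record which words appear on the page and where."""
    for position, word in enumerate(words):
        inverted_index[word].append((url, position))

inverted_index = defaultdict(list)
to_crawl = ["https://example.com/"]     # the "scheduler": pages queued for later crawling

url = to_crawl.pop(0)
links, words = crawl(url)
index_page(url, words, inverted_index)
to_crawl.extend(links)                  # newly discovered links go back into the queue
```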

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
At the end of the day, it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed above in the post at scale, and they can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we should do? I will let you know if it has already been done.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."

Hey, Matt! Thank you for sharing; I learned a lot from it, but I still have a question. We began doing SEO work for our site two years ago, and our organic traffic grew five times (from 8K to 40K per day). But two years later, it has become very difficult to grow it further, and it has even dropped to 3.2K per day. Can you give me any advice on how to make our site's traffic grow again? Thank you in advance!
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[25] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[26] Patents related to search engines can provide information to better understand search engines.[27] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[28]
However, while bidding $1,000 on every keyword and ranking #1 for every relevant search sounds nice in theory, most businesses have to play a balancing game between ranking higher and paying too much for clicks. After all, if it costs $17.56 to rank in position #1, but you can only afford to pay $5.00 per click, bidding $1,000 on a keyword to guarantee yourself the #1 position would be a great way to bid yourself out of business.
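To make that balancing act concrete, here is a small back-of-the-envelope calculation (the conversion rate and customer value are made-up numbers for illustration): the most you can afford to pay per click is roughly what an average visitor is worth to you.

```python
def max_affordable_cpc(conversion_rate, value_per_conversion):
    """Break-even cost per click: what an average visitor is worth."""
    return conversion_rate * value_per_conversion

# Illustrative numbers only: 2% of clicks convert, each conversion is worth $250.
breakeven = max_affordable_cpc(conversion_rate=0.02, value_per_conversion=250.0)
print(f"Break-even CPC: ${breakeven:.2f}")   # $5.00

position_one_cpc = 17.56
if position_one_cpc > breakeven:
    print("Bidding for position #1 loses money on every click at these numbers.")
```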
Pay-per-click (PPC) campaigns generate paid traffic. The goal of running a PPC ad campaign is to increase visibility of and traffic to your website by showing up in the paid search results – you can identify ‘paid’ results easily as they are brief advertisements separate from the organic results. These ads are prompted when a user types in a relevant keyword into the search engine.
For example, we have seen retailers who produce seasonal items, such as ornaments, based upon data that they are able to find through the Data Cube. This information tells them about the themes and ideas that are popular throughout the year and during their peak selling seasons, helping them to know which items will be the most successful and the optimal names to select.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying its HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3] In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]
Guest blogging purely for inbound links is a flawed strategy because the value of those links are going down. However, guest blogging for traffic is still an incredibly viable strategy. While that inbound link you get at the end of a guest post doesn’t have as much SEO value as it used to, it still has the value of exposing your content to a new audience.
One important thing to note is a website’s domain authority (DA) and page authority (PA). This is a number from 1 to 100 that indicates the strength of a website’s domain or a specific page. DA and PA are two of several factors that go into how a website will be ranked on a SERP. The higher the DA and PA, the better the chances are of that webpage ranking on the front page of a SERP (everyone’s dream!). This number is determined by a few things, such as the age of the website and number of links leading to it (backlinks).
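If you want to pull these numbers programmatically rather than through a browser tool, Moz (which publishes the DA and PA metrics) exposes them through its Links API. The endpoint shape, credentials and field names below are assumptions based on the v2 URL Metrics endpoint and should be verified against Moz's current documentation before use.

```python
import requests

# Hypothetical credentials -- generated from your Moz account.
ACCESS_ID = "mozscape-xxxxxxxxxx"
SECRET_KEY = "your-secret-key"

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",            # assumed v2 endpoint
    json={"targets": ["example.com", "example.com/blog/"]},
    auth=(ACCESS_ID, SECRET_KEY),
)

for result in resp.json().get("results", []):
    # Field names assumed from the v2 response format.
    print(result.get("page"),
          "DA:", result.get("domain_authority"),
          "PA:", result.get("page_authority"))
```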
The first step that I take is to do a quick Google search to find pages on my domain where I've mentioned the keyword in question so that I can add an internal link. To do this, I'll use the following search query, replacing DOMAIN with your domain name (e.g. matthewbarby.com) and KEYWORD with the keyword you're targeting (e.g. "social media strategy"):
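The query itself appears to have been lost from this version of the post, but based on the description it is the standard site-restricted search operator, along the lines of:

```
site:DOMAIN "KEYWORD"
```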

While there isn’t much you can do about how long your website has been around, it’s important to note this factor if you ever want a new website for your business. It’s more beneficial to keep an old domain and redesign the layout than it is to start from scratch. A page’s authority isn’t always the same as the whole domain’s authority, but they’re determined in the same way. New web pages on an old domain will still have the domain authority to back them up, but their page authority will start out at 1.
People are more receptive to hearing these marketing messages because they are more natural and organic. They’re not paid for and stuffed in their faces through interruption marketing. An indelible experience has intrinsic emotional and psychological value, as people enjoy it. They then feel emotionally compelled to share their experience with others that they have a close relationship with as part of relating, so that they can share in the emotional experience too. This is done through conversation, phone, text, email, Facebook shares, Tweets, and the like.
I agree with the point that a small or medium-sized business will have limits for budget and resources in generating brand search, but I wouldn't say that stops them from potentially producing huge growth there if they do something interesting/innovative etc. I can think of quite a few small to medium sized brands that have more name recognition than larger companies in their space. The challenge is that many of those brands are led by people who are very good at generating marketing and publicity - it's harder for small brands to be able to afford to pay for a top level consultant or agency.
So, you have downloaded your link profile as a CSV and you now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell from analysis which links are bad from a TLD and URL point of view. If you are less experienced, you can use a tool such as Link Detox (http://www.linkdetox.com/) to complete an analysis of your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to mistake good links for bad ones.
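As a quick first pass before (or alongside) a tool like Link Detox, you can sort the exported CSV by top-level domain and flag the ones you would normally treat as suspicious. The column name and the list of "risky" TLDs below are illustrative assumptions only, not a verdict on any real domain; adjust them to match your export.

```python
from urllib.parse import urlparse

import pandas as pd

# Assumes the export has a column of linking URLs named "source_url"
# (rename to whatever your link tool actually calls it).
links = pd.read_csv("link_profile_export.csv")

SUSPICIOUS_TLDS = {"xyz", "top", "click", "info"}   # illustrative list only

def tld_of(url):
    """Return the last label of the host, e.g. 'com' for https://example.com/page."""
    host = urlparse(url).netloc
    return host.rsplit(".", 1)[-1].lower() if "." in host else ""

links["tld"] = links["source_url"].apply(tld_of)
links["flag_for_review"] = links["tld"].isin(SUSPICIOUS_TLDS)

print(links["tld"].value_counts().head(10))           # most common linking TLDs
print(links.loc[links["flag_for_review"], "source_url"])  # candidates for manual review
```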
Quick question. Do you ever click on the paid results when you conduct a search? It turns out, most users don’t. People typically bypass paid results and click on the top organic results. I get it. They’re looking for the most relevant and trustworthy answers to their problems. A top result that appears to be bought doesn’t appeal to them as much as an organic result. That’s where the credibility factor comes into play. It’s why 75% of clicks are organic.

While you are updating your website’s copy, you’ll also want to implement changes in your HTML – specifically, in your H1 and H2 title tags, as well as your meta descriptions and URLs. You’ll want to put your most relevant keywords in these sections of your website HTML. Search engines take into account the words in these sections of your website’s HTML when listing out relevant webpages in a search result.
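To make that concrete, here is a skeletal example of where those keywords end up in the markup; the keyword "organic vegetable boxes" and the URL slug are invented purely for illustration.

```html
<!-- URL: https://www.example.com/organic-vegetable-boxes  (keyword in the slug) -->
<head>
  <title>Organic Vegetable Boxes Delivered Weekly | Example Farm</title>
  <meta name="description"
        content="Order organic vegetable boxes grown locally and delivered to your door every week.">
</head>
<body>
  <h1>Organic Vegetable Boxes</h1>
  <h2>Why choose a weekly organic vegetable box?</h2>
  ...
</body>
```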
Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device matter in web analytics’ ability to track organic traffic. While desktops using common browsers saw a smaller impact from the test (a 10-20 percent drop in direct traffic), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as mobile usage grows, an even larger share of what analytics reports as direct traffic is likely to actually be organic search traffic.