Social media changes so quickly that it can be difficult to keep up with terminology and best practices. One of the biggest points of confusion I see among business owners and marketers is the distinction, and the strategic balance, between organic and paid social media. Understanding this is essential to building a successful social media marketing strategy. Organic and paid social media have distinct advantages but work best when implemented together. Every popular social network has both paid and organic elements, but since Facebook is the largest social network (in terms of both users and advertisers), I’m going to focus on that platform in this article.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90%, according to Hitwise.[66] Google holds a similarly dominant market share in a number of other countries.
Melissa: I think with thought leadership there’s a variety of different ways you can go about this. But one of the best ways is really just utilizing that blog feature, LinkedIn Pulse, which is part of LinkedIn, because you are already connected with the best audience possible. This is your business network, right? And every time someone in your network likes or engages with your blog post, it amplifies it to their network. It’s like having a built-in audience for your blog without all of the groundwork of creating your own blog.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's index by using a robots-specific meta tag. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
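To make the two mechanisms concrete, here is a minimal sketch of each; the directory names are hypothetical and would need to match your own site's structure. First, a robots.txt placed at the root of the domain, asking all crawlers to skip shopping-cart pages and internal search results:

    # robots.txt, served from the root directory of the domain
    User-agent: *       # these rules apply to all crawlers
    Disallow: /cart/    # hypothetical shopping-cart pages
    Disallow: /search/  # hypothetical internal search result pages

Second, the per-page alternative, a robots meta tag placed in a page's <head>:

    <!-- exclude this single page from the search index -->
    <meta name="robots" content="noindex">

Note that Disallow only asks crawlers not to fetch a page, while the noindex meta tag is the more direct way to keep an individual page out of the index; the meta tag only works if the page remains crawlable, since the engine has to fetch the page to see it.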
Use long-tail keywords. Don’t just go with the most popular keywords in your market; use keywords that are more specific to your product or service. For example, a phrase like “handmade leather dog collars” faces far less competition than “dog collars” and attracts visitors who are closer to buying. In time, Google and other search engines will identify your website or blog as a destination for that particular subject, which will boost your content in search rankings and help your ideal customers find you.
Guest blogging purely for inbound links is a flawed strategy because the value of those links is declining. However, guest blogging for traffic is still an incredibly viable strategy. While the inbound link you get at the end of a guest post doesn’t carry as much SEO value as it used to, it still has the value of exposing your content to a new audience.
Well, yes and no. Sure, you can get hit with an algorithm change or penalty that destroys all your traffic. However, if you have good people who know what they are doing, this is unlikely to happen, and if it does, it is easy in most cases to get your visits back. Panda and Penguin are another story, but if you get hit by those, it is typically not accidental.