With stats like that, you’re probably wondering why you should even bother with organic posts. Although organic reach is low, it’s still important to have an active, consistent presence on social media. Your Facebook page, Instagram account, Twitter profile, etc. are often where people turn for updates from your company or to ask questions. Low organic reach doesn’t mean you should stop posting organically altogether—it means you should focus more of your efforts on a paid social media strategy while maintaining a solid organic strategy.
Melissa: I think with thought leadership there’s a variety of ways you can go about this. But one of the best ways is really just utilizing that blog feature, LinkedIn Pulse, because you are already connected with the best audience possible. This is your business network, right? And then every time someone in your network likes or engages with your blog post, it amplifies it to their network. It’s like having a built-in audience for your blog without all of the groundwork of creating your own blog.
While SEO doesn’t guarantee that you will keep receiving traffic on autopilot indefinitely, it does offer a peace of mind that PPC marketing doesn’t. Once you rank for a particular keyword, you can keep doing the necessary on-page SEO to maintain your rank. Because it is more sustainable, SEO can help you dominate a market segment without burning a hole in your wallet.
To do this, I often align the launch of my content with a couple of guest posts on relevant websites to drive plenty of targeted traffic to it, along with some relevant links. This has a knock-on effect on the organic amplification of the content and means that you at least have something to show for it (in terms of ROI) if it doesn't do as well organically as you expect.
The piece on generating demand for branded queries rather than just product-based ones is particularly interesting here. It sounds as though it'll be more important than ever to have a strong brand in order to succeed (rather than just a well-optimized site -- and ideally, to have the strategic, technical, and creative sides all working together cohesively). Perhaps brand exposure through things like answer boxes can still deliver some value too, even if it's difficult to measure and CTRs are diminished?
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
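The crawling rules described above can be checked programmatically. Here is a minimal sketch using Python's standard-library robots.txt parser, with a hypothetical rules file that blocks a shopping cart and internal search results (the paths and domain are illustrative only, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the cart and internal search results,
# mirroring the kinds of pages the passage says are typically excluded.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler consults the parsed rules before fetching each URL.
print(parser.can_fetch("*", "https://example.com/search?q=shoes"))   # blocked
print(parser.can_fetch("*", "https://example.com/products/shoes"))   # allowed
```

Note that, as the passage says, robots.txt is advisory: it keeps compliant crawlers out, but a page that must never appear in an index is better served by a `<meta name="robots" content="noindex">` tag on the page itself.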
Since there is an obvious barrier to entry for anyone trying to beat you once you’re established, you won’t have to worry about competitors “buying” their way to the top. Their only option is pay-per-click ads, but that isn’t the same as earning a higher position on the SERPs. Again, this assumes that you took the right steps and were patient enough to solidify your place in the top search results.
Before you dig into some of the more technical ways to improve SEO ranking, always remember that writing compelling, high-quality content that attracts interest and compels visitors to share it and link back to it is vital. Good content has the best chance of going viral, and Google rewards that kind of engagement heavily in its ranking algorithm.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. In addition to its URL submission console, Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
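For reference, an XML Sitemap feed like the one submitted through Google Search Console is a plain XML file listing the URLs a crawler should know about. A minimal sketch (the domain and dates are placeholders, not from the original text):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod is optional but helps crawlers. -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/deep/page-not-linked-anywhere</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```

This is exactly the mechanism that makes pages discoverable even when no other indexed page links to them.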
Well, as noted in the post, it was not just about the links—that was only one key part of a wider strategy. The website in question has deep levels of content. It is not just a blog section; they have numerous high-quality content sections that we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.
You can then use the Data Cube to uncover topics relevant to prospects later in the buyer’s journey, such as your brand versus a competitor or case studies illustrating the value you provide for customers. Since your organic search research will help you understand what people are interested in reading, your emails will be more targeted and helpful, boosting your success rates.
Organic is different. Matching keywords to user intent means you may be present in many searches. The user may find you consistently, and once they get to your site, they are more likely to stay. Organic users are still your best long-term customers. In my experience, they have lower bounce rates, visit more pages, and are more likely to return.