Optimise for your personas, not search engines. First and foremost, write your buyer personas so you know to whom you’re addressing your content. By creating quality educational content that resonates with your ideal buyers, you’ll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you’ll have is keyword-riddled nonsense.
It’s unreasonable to assume that you will pull top rank in Google for every keyword relating to your industry. Your goal should be to pull top rank on the most desired keywords. This is an exercise that will take the effort of both marketing and management. Think about how people would search for your products and services, make a list of these keywords, and check the traffic for each term with a tool like Google’s Keyword Planner. Naturally you will want to rank for the keywords with the most traffic, so whittle your list down to the highest-trafficked, most relevant terms.
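To make that whittling-down step concrete, here is a minimal sketch of shortlisting keywords once you have a list of terms and their traffic. The keywords, search volumes, and relevance judgments below are hypothetical placeholders, not real Keyword Planner output.

```python
# Illustrative only: whittling a keyword list down to the highest-trafficked,
# most relevant terms. The keywords and monthly search volumes are hypothetical
# stand-ins for data exported from a tool such as Google's Keyword Planner.

candidate_keywords = [
    # (keyword, avg. monthly searches, judged relevant by marketing/management?)
    ("wedding photography atlanta", 2400, True),
    ("cheap camera lenses",          8100, False),  # high traffic, wrong intent
    ("atlanta wedding photographer", 1900, True),
    ("engagement photo ideas",        590, True),
]

# Keep only the terms the team judged relevant, then rank by search volume.
shortlist = sorted(
    (kw for kw in candidate_keywords if kw[2]),
    key=lambda kw: kw[1],
    reverse=True,
)

for keyword, volume, _ in shortlist:
    print(f"{keyword}: ~{volume} searches/month")
```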

Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine's unpaid results—often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher ranked on the search results page), and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers.[1] SEO may target different kinds of search, including image search, video search, academic search,[2] news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business' online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former instead is more focused on national or international searches.


Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and to track the web pages' index status.
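For readers who haven't submitted a sitemap before, the sketch below generates the kind of XML sitemap file those tools accept, using Python's standard library. The URLs and dates are placeholders, not part of this article.

```python
# Illustrative only: generating a minimal XML sitemap of the sort that can be
# submitted through Google Search Console or Bing Webmaster Tools.
# The URLs and last-modified dates are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

pages = [
    ("https://www.example.com/", "2019-01-15"),
    ("https://www.example.com/services", "2019-01-10"),
    ("https://www.example.com/blog/seo-basics", "2019-01-05"),
]

urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

# Writes sitemap.xml with an XML declaration, ready to upload or reference
# from robots.txt.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```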
An important thing to note is the effect that localized searches have on search results. For example, say you’re in Atlanta and you’re looking for a photographer for your wedding. When you search “wedding photography”, it wouldn’t be helpful to see results for photographers in Los Angeles. This is why Google takes into account where you are when you search for certain terms, and shows you listings from businesses near you.
Here’s the thing. Your web visitors aren’t homogeneous. This means that everyone accesses your site by taking a different path. You may not even be able to track that first point of contact for every visitor. Maybe they first heard of you offline. But in most cases, you can track that first touch point. The benefit? You can meet your potential customers exactly where they are.
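One common way to record that first touch point is to capture campaign tags (such as UTM parameters) from the landing URL on a visitor's first session. The sketch below is purely illustrative; the URL and parameter names are assumptions, not something this article prescribes, and a real setup would persist the result against the visitor rather than print it.

```python
# Illustrative only: recording a visitor's first touch point by pulling
# campaign tags (UTM parameters) out of the landing-page URL.
# The URL and parameter names are hypothetical.
from urllib.parse import urlparse, parse_qs

def first_touch(landing_url: str) -> dict:
    """Return the source/medium/campaign of the first visit, if present."""
    params = parse_qs(urlparse(landing_url).query)
    return {
        "source": params.get("utm_source", ["direct/unknown"])[0],
        "medium": params.get("utm_medium", ["none"])[0],
        "campaign": params.get("utm_campaign", ["none"])[0],
    }

print(first_touch(
    "https://www.example.com/?utm_source=newsletter&utm_medium=email&utm_campaign=spring-promo"
))
# {'source': 'newsletter', 'medium': 'email', 'campaign': 'spring-promo'}
```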
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
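As a quick illustration of how those robots.txt rules behave, the sketch below uses Python's standard urllib.robotparser module to check which paths a crawler is allowed to fetch. The rules shown are hypothetical examples of the login, shopping-cart, and internal-search pages mentioned above; a page can also be kept out of the index with the robots meta tag (e.g. <meta name="robots" content="noindex">) referred to in the paragraph.

```python
# Illustrative only: checking which pages a robots.txt file keeps crawlers
# away from, using Python's standard library. The rules below are hypothetical
# examples of the login/cart/internal-search pages discussed above.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
Disallow: /account/login
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ("/blog/seo-basics", "/cart/checkout", "/search?q=shoes"):
    allowed = parser.can_fetch("*", path)
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```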
RankBrain can have an impact on your keyword campaigns. When you are defining your keywords, you are looking for relevant terms that respond to customers’ queries; that is how a successful SEO strategy works. Logically, if you pick keywords your audience is not searching for, your campaign will have no chance of succeeding. This is where RankBrain can play a role.
Publishing quality content on a regular basis can help you attract targeted organic search traffic. But creating great content that ranks well in the search engines isn’t easy. If your business doesn’t have the necessary resources, developing strong content assets can prove to be a challenge, which in turn affects your ability to execute a working content strategy.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO and site recovery planning, especially where Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing, but to the complete product lifecycle along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.