Nathan Gotch is the founder of Gotch SEO, a white label SEO services provider and SEO training company based in St. Louis. Gotch SEO is now one of the top SEO blogs in the world and over 300 entrepreneurs have joined his SEO training platform, Gotch SEO Academy. Nathan’s SEO strategies and advice have also been featured on Forbes, Entrepreneur, Business.com, and Search Engine Journal.
Content is one of the three main Google ranking factors. In fact, having a blog can help you get your content indexed faster. An active blog – built on insightful, high-quality content and backed by authoritative links – will improve your rankings and, in turn, your organic traffic. Long-form copy (above 1,200 words) also tends to rank higher and be crawled more often.
Essentially, what distinguishes direct from organic traffic today is tracking. According to Business2Community, direct traffic is composed of website visits which have “no referring source or tracking information.” A referring source can be a search engine, or it can be a link from another website. Direct traffic can include visits that result from typing the URL directly into a browser, as the simple definition suggests.
Keyword research is one of the most important things you need to do before producing any content. Understanding what your target audience is searching for, and what content to produce for them, is vital to getting your content found, read, and shared. In this session, Sameen shares how to do keyword research and her strategies for finding the best keywords for your brand.
The meta description HTML tag is meant to be a concise explanation of a web page’s content. Google displays your meta description beneath the page title in its organic results. While meta descriptions aren’t as important as page titles for your Google ranking, they do play a big role in getting clicks from users. People read descriptions as a preview of your page and use them to decide whether your content is worth visiting. You should keep your meta descriptions under 150 characters, since Google won’t display text beyond that. You should also include target keywords in your text, since any words matching a user’s search query will be displayed in bold.
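For reference, the tag sits in the page’s head; a minimal illustrative snippet (the title and description text here are made up):

```html
<head>
  <title>Example Page Title</title>
  <!-- Kept under 150 characters so Google is unlikely to truncate it -->
  <meta name="description" content="A concise summary of this page, including the target keyword, written to earn the click.">
</head>
```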
While farms and processing facilities for organic products are required to get organic certification, it’s optional for retailers. Certification involves numerous steps and processes to ensure organic integrity from the moment products arrive at a store until you put them in your basket. That integrity is important to us – and many of our shoppers – so we became the first national certified organic grocer.
Great analysis and tips, Rand. Your first tip was to focus on branded search... but when it comes to small and medium businesses, there's a limit on how much branded search demand one can generate. Are you saying that Google search will eventually become an engine made only for entities/businesses we already know of? Where does that leave the whole discovery process?
You control the cost of search engine marketing and pay nothing for your ad to simply appear on the search engine. You are charged only if someone clicks on your ad, and only up to the amount that you agreed to for that click. That’s why SEM is also known as pay per click (PPC), because you only get charged for each click that your ad generates. No click? No charge.
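The pricing model described above can be sketched in a few lines of Python (the click counts and bid amounts are hypothetical, purely for illustration):

```python
def ppc_cost(clicks: int, max_cpc: float, actual_cpc: float) -> float:
    """You pay per click, never more than your agreed maximum bid.
    Impressions (the ad simply appearing) cost nothing."""
    cost_per_click = min(actual_cpc, max_cpc)
    return clicks * cost_per_click

# Hypothetical campaign: the ad appeared 1,000 times but got 0 clicks -> no charge
print(ppc_cost(clicks=0, max_cpc=1.50, actual_cpc=1.20))   # 0.0

# 40 clicks at $1.20 each (below the agreed $1.50 maximum bid)
print(ppc_cost(clicks=40, max_cpc=1.50, actual_cpc=1.20))  # 48.0
```

Note how the spend is capped both by the number of clicks and by the maximum bid you agreed to per click.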
Here’s the thing. Your web visitors aren’t homogeneous. This means that everyone accesses your site by taking a different path. You may not even be able to track that first point of contact for every visitor. Maybe they first heard of you offline. But in most cases, you can track that first touch point. The benefit? You can meet your potential customers exactly where they are.
What you are in fact talking about are Google's death stars, like the Shopping box, Knowledge Graph, etc. It's fully understandable why many SEOs can't stand them, 'cause whole categories of websites (price comparison platforms, for instance) have already fallen victim to such death stars, and there will certainly be numerous other portals that lose almost all of their traffic in the near future. Despite your (quite good) suggestions on how to circumvent such an issue, the situation for an endangered portal can be hopeless when its whole business model is what a new Google feature makes obsolete. See geizhals.at for a very famous example.
While organic search may drive many times more traffic to your site than paid search, you can use this report to see the quality of traffic driven by each method. For example, if you look at Site Usage statistics, you may see that organic search delivers 20 to 30 times the number of visitors, but those visitors view only half as many pages and have twice the bounce rate. And when you look at Ecommerce statistics, you may see that visitors from paid search have a much higher rate of transactions, along with a higher average value per transaction, and a higher dollar value per visit. If you find that your visitors who arrive via paid search represent a significantly higher value customer, that may be an argument to invest more in paid search.
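The dollar-value-per-visit comparison can be made concrete with a small Python sketch (all visit and revenue figures below are hypothetical, loosely shaped like the scenario described):

```python
# Hypothetical channel figures: organic drives far more visits,
# but paid converts at a much higher dollar value per visit.
channels = {
    "organic": {"visits": 25000, "revenue": 12500.0},
    "paid":    {"visits": 1000,  "revenue": 3000.0},
}

for name, c in channels.items():
    value_per_visit = c["revenue"] / c["visits"]
    print(f"{name}: ${value_per_visit:.2f} per visit")
# organic: $0.50 per visit
# paid: $3.00 per visit
```

Raw traffic volume alone, in other words, can hide the fact that the smaller channel is delivering the more valuable customers.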
Additionally, there are many situations where PPC (a component of SEM) makes more sense than SEO. For example, if you are first launching a site and you want immediate visibility, it is a good idea to create a PPC campaign because it takes less time than SEO, but it would be unwise to strictly work with PPC and not even touch search engine optimization.
Now, in your reconsideration request, make sure you are honest and tell Google everything the prior agency was up to. Be sure to include your Excel sheet of removed links and say you are going to make an ongoing effort to remove everything negative. Google often won't accept your first reconsideration request, so it may take a few attempts.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
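The random-surfer model described above can be sketched as a simple power iteration in Python. The toy link graph and the damping factor of 0.85 here are illustrative assumptions, not data from the original work:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Toy PageRank: `links` maps each page to the pages it links to.
    Returns the probability that a random surfer lands on each page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page.
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:             # otherwise, rank flows along outbound links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Toy graph: both A and C link to B, so B ends up with the highest rank.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]})
print(max(ranks, key=ranks.get))  # B
```

The ranks form a probability distribution (they sum to 1), which matches the interpretation above: a page's PageRank is the likelihood that the random surfer is on that page, so a page with more and stronger inbound links passes along a larger share of "strength".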