Additionally, knowing how your SEO is performing can help you make the necessary changes over time to keep your rankings high. Find a keyword tracking tool to monitor your positions. It’s typical for your ranking to fluctuate from week to week and even day to day, so look for a general upward trend from month to month that shows your efforts are succeeding.
Step #3: Calculate your ROI based on the right performance indicators

The performance indicators will depend on the objective you selected in the first step. Want to generate leads? You could track your new subscribers. Want to increase engagement? You could track clicks, comments, shares, etc. Let’s go with the first example: your goal is customer acquisition. You’ve already set up tracking for sales conversions. It’s time to dissect your organic search traffic.
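To make the calculation concrete, here is a minimal sketch of ROI for the customer-acquisition example. All figures (customer count, order value, monthly cost) are hypothetical placeholders, not numbers from the text:

```python
# Rough ROI sketch for a customer-acquisition objective.
# All figures below are hypothetical placeholders.

def roi(revenue_from_conversions: float, campaign_cost: float) -> float:
    """Return ROI as a percentage: (gain - cost) / cost * 100."""
    return (revenue_from_conversions - campaign_cost) / campaign_cost * 100

# Example: 40 new customers from organic search at $50 average order value,
# against $800 spent on SEO content that month.
revenue = 40 * 50.0   # $2,000
cost = 800.0
print(f"ROI: {roi(revenue, cost):.1f}%")  # ROI: 150.0%
```

Swap in whichever indicator matches your objective: subscriber value for lead generation, engagement value for an engagement goal, and so on.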
Make sure you have some good analytics software installed on your website. You will want to know where your traffic is coming from. Is it coming from social media, referrals, search engine queries, or is it direct traffic? It will also help you determine where people are getting lost, or dropping out of the funnel. It can also help you determine what approaches, content, and CTAs are the most effective.
Using keywords in your URLs will also help with your rankings. Unfortunately, there isn’t much you can do to change your website’s home page URL without resetting your domain authority. However, each additional page you add is a place to insert a keyword, as long as it is relevant to the actual page content. We’ll cover blogging shortly, but blog post URLs are a great place to use your keywords.
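As a sketch of the idea, a keyword-bearing post title can be turned into a clean URL slug mechanically. The example title below is hypothetical:

```python
import re

def slugify(title: str) -> str:
    """Lowercase a post title, strip punctuation, and hyphenate it into a URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop punctuation
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # collapse whitespace to single hyphens
    return slug

print(slugify("10 Keyword Research Tips for 2024!"))
# 10-keyword-research-tips-for-2024
```

The resulting slug keeps the keyword phrase visible in the URL while staying readable for both users and crawlers.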
Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files, and on to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. EBSCO uses three major tools: (a) a log file analysis tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues, and ensure that websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
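To illustrate the page-tagging mechanism generically (this is a toy sketch, not how any of the tools named above work internally): the page embeds a tiny image whose URL carries tracking parameters, and a server logs each request before returning a 1x1 transparent GIF. The tracker URL and parameter name here are hypothetical:

```python
# Minimal sketch of the server side of "page tagging": the page embeds
# <img src="http://tracker.example/pixel.gif?page=/pricing"> (hypothetical URL),
# and this handler logs the hit and returns a 1x1 transparent GIF.
from http.server import BaseHTTPRequestHandler
from urllib.parse import urlparse, parse_qs

# Smallest valid transparent GIF (43 bytes).
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

def log_line(path: str) -> str:
    """Turn a pixel request path into a one-line log entry."""
    params = parse_qs(urlparse(path).query)
    page = params.get("page", ["unknown"])[0]
    return f"pageview page={page}"

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        print(log_line(self.path))  # record the action server-side
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.end_headers()
        self.wfile.write(PIXEL)
```

Because the browser fetches the image on every page view, the tracking server sees actions that never touch the site's own web server logs, which is what distinguishes tag-based tools from log file analyzers.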
At the end of the day, webmasters just need to know their sites: chances are your analytics tool is more like a person than a software package, and it will classify traffic in seemingly irrational ways. I’ve stumbled across website traffic originating from diverse and confusing sources being classed as direct, often requiring a considerable amount of thought and exploration to work out what is happening.
Make your content longer. Video marketing’s rise might seem to point to consumers’ desire for shorter content, but that implication couldn’t be further from the truth. Google favors content that’s longer and more detailed, and studies have shown that this is what ranks higher in the search engines. Don’t artificially inflate your word count, though. Instead, home in on topics that naturally require more words to cover.
Every new blog post that you publish gives you an opportunity to spread it through social media, which helps drive more traffic back to your site. Use your blog as a way to connect with your audience. Your blog isn’t a place for purely promotional posts; it’s an opportunity to address possible concerns or common questions related to your service or product. If you are worried about coming up with enough content ideas to publish blog posts on a regular basis, then check out these resources:
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm, essentially a flow chart, that examines the referring website or parameters set within the URL (such as UTM tags) to determine the source of traffic. Here is a breakdown of all sources:

Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of the site’s pages that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Good visibility requires pages to contain the keywords people are searching for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools for measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.

This is the number of views that you can test each month on your website. It’s up to you how you use them: you can allocate all the views to one test or to multiple tests, on one page or on multiple pages. If you have selected the 10,000-tested-views plan and you run an experiment on your category page, which is viewed 7,000 times per month, then at the end of the month 7,000 views will be counted against your tested-views quota.
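The quota accounting can be sketched in a few lines, using the plan size and page-view figures from the example above:

```python
# Tested-views quota accounting, using the figures from the example above.
plan_quota = 10_000          # tested views included in the plan per month
category_page_views = 7_000  # views the experiment's page received this month

used = min(category_page_views, plan_quota)  # a page can't consume more than the quota
remaining = plan_quota - used
print(f"Used {used} of {plan_quota} tested views; {remaining} left this month.")
# Used 7000 of 10000 tested views; 3000 left this month.
```

Running a second experiment that month would draw its page views from the same remaining balance.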
Melissa: I completely agree. And the other thing about them adding and the way they’re added in the LinkedIn video option is that there’s an auto play. So as folks are scrolling through their feed, they’re more likely to stop with this video that just kind of starts playing, as well. I think that’s a big opportunity to really get some more eyes on your content.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing but to the complete product lifecycle, along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.