In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system that punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[38] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve these issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content from writers it can treat as 'trusted' authors.
In addition to helping you find keywords you should be bidding on, thorough keyword research can also help you identify negative keywords – search terms that you should exclude from your campaigns. Negative keywords aren’t terms with negative connotations, but rather irrelevant terms that are highly unlikely to result in conversions. For example, if you sell ice cream, you might want to exclude the keyword “ice cream recipes”, as users searching for ice cream recipes are unlikely to be in the market for your product.
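The exclusion rule described above can be sketched in a few lines. This is a minimal illustration, not how any ad platform actually implements negative keywords; the function name and the broad-match rule (every word of the negative keyword must appear in the query) are assumptions for the example.

```python
def is_excluded(query: str, negative_keywords: list[str]) -> bool:
    """Return True if the query matches any negative keyword.

    Hypothetical broad-match rule: every word of the negative
    keyword must appear somewhere in the query.
    """
    query_words = set(query.lower().split())
    for negative in negative_keywords:
        if set(negative.lower().split()) <= query_words:
            return True
    return False

negatives = ["ice cream recipes"]
print(is_excluded("easy ice cream recipes at home", negatives))  # True
print(is_excluded("buy artisan ice cream online", negatives))    # False
```

With "ice cream recipes" as a negative keyword, the recipe searcher is filtered out while the shopper still sees your ad.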
One important thing to note is a website’s domain authority (DA) and page authority (PA). These are scores from 1 to 100 that indicate the strength of a website’s domain or of a specific page. DA and PA are two of several factors that go into how a webpage will be ranked on a SERP. The higher the DA and PA, the better the chances of that webpage ranking on the front page of a SERP (everyone’s dream!). These scores are determined by a few things, such as the age of the website and the number of links leading to it (backlinks).
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing but to the complete product lifecycle, along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.
RankBrain can have an impact on your keyword campaigns. When you are defining your keywords, you are looking for relevant terms that respond to customers’ queries. This is how a successful SEO strategy works. Logically, if you are picking keywords your audience is not searching for, your campaign will have no chance of succeeding. This is where RankBrain can play a role.
Hey Ashok! Good question. I work with clients in a lot of different industries, so the tactics I employ are often quite different depending on the client. In general, though, it comes down to creating killer resources around popular topics, or tools related to client services. This provides a ton of outreach opportunity. For example: We had a client build a tool that allowed webmasters to quickly run SSL scans on their sites and identify non-secure resources. We reached out to people writing about SSL, HTTPS migration, etc. and pitched it as a value-add. We built ~50 links to that tool in 45 days. Not a massive total, but they were pretty much all DR 40+.
Black hat SEO refers to the practice of trying to trick the search engines into giving you higher rankings by using unethical tactics, such as buying links. The risk is just too great. Even if you enjoy a temporary boost in rankings due to black hat tactics, it’s likely to be short lived. Google is getting better and better at spotting dirty tricks and sooner or later the progress you made will be wiped out by an algorithm update, or worse, your site will get removed from the index altogether.
This is one of the more apparent benefits of organic search traffic. If your website ranks for a relevant keyword, you’ll enjoy regular targeted traffic. This consistent flow of traffic will open the floodgates for new leads and customers. That alone is a sweet deal. But here’s the thing about an organic audience: You have no choice but to create a rich content experience.
With the advanced search feature, I always recommend that after you put in your demographic information, you check the option for second-degree connections. And the reason for that is it’s really powerful to be able to ask whoever the mutual connection is to make an introduction, or to use them as a reference in your introduction. I know even for myself that I’m much more likely to respond to someone who knows someone that’s already within my network. And I’ve seen this work really well for a lot of different sales folks, who tell me how even name-dropping a single person got them that initial phone call.
In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article? Well, all you need to do is send out a request to a press service and you can wait for someone to get back to you.
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm and flow chart based on the referring website, or on parameters set within the URL, to determine the source of traffic. Here is a breakdown of all sources:
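The referrer/URL-parameter flow chart described above can be sketched roughly as follows. This is a heavily simplified illustration, not Google Analytics' actual classification logic; the domain lists, channel names, and function name are assumptions for the example.

```python
from urllib.parse import urlparse, parse_qs

# Tiny illustrative domain lists; real platforms maintain far larger ones.
SEARCH_ENGINES = {"www.google.com", "www.bing.com", "duckduckgo.com"}
SOCIAL_SITES = {"www.facebook.com", "twitter.com", "www.linkedin.com"}

def classify_traffic(landing_url: str, referrer: str) -> str:
    """Very simplified referrer/URL-parameter flow chart."""
    params = parse_qs(urlparse(landing_url).query)
    # Campaign parameters in the URL take priority over the referrer.
    if "utm_medium" in params:
        medium = params["utm_medium"][0]
        return "paid" if medium in ("cpc", "ppc") else medium
    if not referrer:
        return "direct"
    host = urlparse(referrer).netloc
    if host in SEARCH_ENGINES:
        return "organic"
    if host in SOCIAL_SITES:
        return "social"
    return "referral"

print(classify_traffic("https://example.com/?utm_medium=cpc", ""))           # paid
print(classify_traffic("https://example.com/", "https://www.google.com/"))   # organic
print(classify_traffic("https://example.com/", ""))                          # direct
```

The key design point mirrored here is precedence: explicit campaign tagging beats referrer inference, and an empty referrer falls through to "direct".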
Here’s the thing. Your web visitors aren’t homogeneous. This means that everyone accesses your site by taking a different path. You may not even be able to track that first point of contact for every visitor. Maybe they first heard of you offline. But in most cases, you can track that first touch point. The benefit? You can meet your potential customers exactly where they are.
Step #3: Calculate your ROI based on the right performance indicators

The performance indicators will depend on the objective you selected in the first step. Want to generate leads? You could track your new subscribers. Want to increase engagement? You could track clicks, comments, shares, etc.

Let’s go with the first example: your goal is customer acquisition. You’ve already set up tracking for sales conversions. It’s time to dissect your organic search traffic.
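For the customer-acquisition example, the calculation itself is just the classic ROI formula. The figures below are hypothetical, purely to show the arithmetic.

```python
def organic_roi(revenue_from_organic: float, content_cost: float) -> float:
    """Classic ROI formula: (return - investment) / investment, as a percent."""
    return (revenue_from_organic - content_cost) / content_cost * 100

# Hypothetical month: $12,000 in sales attributed to organic search,
# $4,000 spent on content production and SEO tooling.
roi = organic_roi(12_000, 4_000)
print(f"{roi:.0f}%")  # 200%
```

A positive number means organic search is paying for itself; the hard part in practice is attributing revenue to the organic channel, not the division.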
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[54]

An ideal keyword has high monthly searches and a relatively low keyword difficulty. Once you find some keywords that make sense for your business, with relatively high search volume and relatively low keyword difficulty, it’s time to research the competition. However, you’ll first want to understand how local searches affect organic search engine optimization.
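The volume-versus-difficulty shortlisting described above can be expressed as a simple filter-and-sort. All keywords, numbers, and thresholds below are made up for illustration; real volume and difficulty figures come from your keyword research tool.

```python
# (keyword, monthly searches, difficulty 0-100) — illustrative numbers only.
candidates = [
    ("ice cream shop near me", 9900, 35),
    ("best ice cream flavors", 5400, 62),
    ("artisan ice cream delivery", 880, 18),
    ("ice cream", 450000, 95),
]

def shortlist(keywords, min_volume=500, max_difficulty=50):
    """Keep keywords with decent volume and low difficulty,
    then sort so the highest-volume survivors come first."""
    kept = [k for k in keywords if k[1] >= min_volume and k[2] <= max_difficulty]
    return sorted(kept, key=lambda k: k[1], reverse=True)

for kw, vol, kd in shortlist(candidates):
    print(f"{kw}: {vol}/mo, KD {kd}")
```

Note how the head term "ice cream" is dropped despite its huge volume: its difficulty of 95 makes it the wrong target for most sites, which is exactly the trade-off the paragraph above describes.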


Paid inclusion is a search engine marketing method in itself, but also a tool of search engine optimization, since experts and firms can test out different approaches to improving ranking and see the results often within a couple of days, instead of waiting weeks or months. Knowledge gained this way can be used to optimize other web pages, without paying the search engine company.
Improved ads and visual products. PPC ads are simply advertisements. With this format, advertisers have full control over what they would like their clients to see first-hand: contact number, location, services, and pricing are some of the details they can easily display. Apart from that, images are now readily available, so clients can get a preview of what they are browsing or buying (for selling sites).
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
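The silo rules above (subpages interlinked, each linking back up to its landing page) are easy to audit mechanically. The page paths and the adjacency-list representation below are hypothetical, just to show the check.

```python
# Hypothetical silo: each key is a page, each value the pages it links to.
internal_links = {
    "/ice-cream/":           ["/ice-cream/flavors/", "/ice-cream/delivery/", "/ice-cream/wholesale/"],
    "/ice-cream/flavors/":   ["/ice-cream/", "/ice-cream/delivery/"],
    "/ice-cream/delivery/":  ["/ice-cream/", "/ice-cream/wholesale/"],
    "/ice-cream/wholesale/": ["/ice-cream/", "/ice-cream/flavors/"],
}

def check_silo(links, landing_page):
    """Return subpages that fail to link back up to the silo landing page."""
    problems = []
    for page, targets in links.items():
        if page != landing_page and landing_page not in targets:
            problems.append(page)
    return problems

print(check_silo(internal_links, "/ice-cream/"))  # [] — every subpage links back up
```

An empty result means the silo is intact; any path returned is a subpage whose template or body copy is missing the link back to its core landing page.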
Though a long break is never recommended, there are times when money can be shifted toward other resources for a short period. A good example is an online retailer. In the couple of weeks leading up to the Christmas holidays, you are unlikely to gain more organic placement than you already have. Besides, the window for shipping gifts to arrive before Christmas is closing, and you are heading into a slow season.