In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by doing so; however, Google implemented a new system that punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[38] With regard to the changes this made for search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
Nathan: You’ve mentioned status updates. One of the things I’ve been doing with the podcast is creating a video introduction. It was last fall that LinkedIn started having native uploads of videos. And I’ve been noticing anywhere from 2,000 to 3,000 views per post that I upload, where nobody was checking out my videos or status updates when I was doing it in the past. That might be something people think about, too, is adding the video element into their thought leadership post or their status updates. What are your thoughts on that?

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files and more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. Three major tools are used by EBSCO: (a) a log-file analysis tool, WebTrends by NetiQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues, and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
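To make the log-file end of that spectrum concrete, here is a minimal sketch of how an analytics tool derives traffic counts from raw server logs. It assumes the common combined log format and a hypothetical file name (access.log); commercial tools such as WebTrends do far more than this, the point is only to show the basic idea of counting page views per URL from log lines.

```python
# Minimal sketch: count page views per URL from a web server access log.
# Assumes the common "combined" log format; the file name access.log is hypothetical.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"\s+(?P<status>\d{3})')

def page_views(log_path: str) -> Counter:
    """Return a Counter mapping each requested path to its number of successful hits."""
    views = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and match.group("status").startswith("2"):  # count 2xx responses only
                views[match.group("path")] += 1
    return views

if __name__ == "__main__":
    for path, count in page_views("access.log").most_common(10):
        print(f"{count:6d}  {path}")
```

A tag-based tool works the other way around: a small script or image embedded in the page reports each view to a collection server, which is why the two approaches can produce slightly different numbers for the same site.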


Each paid ad will likely point to a product page, a specific landing page, or something that has the potential to drive financial results. As the name suggests, paid marketing means you’re spending money on ads to drive specific actions, so you need to measure return on ad spend (ROAS) and cost per lead, not just vanity metrics like engagement or total leads. Say you drove five leads but spent $5,000 on your paid campaign: that works out to $1,000 per lead, which is a bit steep (depending on your industry). In this case, you’d want to adjust your strategy to avoid wasting money.
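To make that arithmetic concrete, the short sketch below separates the two numbers worth tracking: cost per lead (spend divided by leads) and ROAS (revenue divided by spend). The spend and lead counts are the hypothetical figures from the example above, and the revenue figure is an added assumption purely for illustration.

```python
# Hypothetical figures from the example: 5 leads on $5,000 of ad spend.
ad_spend = 5_000.00
leads = 5
revenue = 7_500.00  # assumed revenue attributed to the campaign, for illustration only

cost_per_lead = ad_spend / leads   # 5,000 / 5 = $1,000 per lead
roas = revenue / ad_spend          # revenue returned per dollar of ad spend

print(f"Cost per lead: ${cost_per_lead:,.2f}")
print(f"ROAS: {roas:.2f}x")
```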
Many people also search specifically for services within their area. Using keywords such as “wedding photography Atlanta” instead of just “wedding photography” can help you stand out against local competition. Setting up a Google My Business page also helps your business appear in localized searches; it’s free to set up and requires a physical address for your business.

Creating the link between SEO and PPC is an integral part of the SEM concept. Sometimes, especially when separate teams work on SEO and PPC and their efforts are not synced, the positive results of aligning their strategies can be lost. The aim of both SEO and PPC is to maximize visibility in search, so their actions should be centrally coordinated. Both teams can benefit from setting shared goals and combined metrics, evaluating data together to determine future strategy, and discussing which tools work better to get traffic for selected keywords in national and local search results. In this way, search visibility can be increased while optimizing both conversions and costs.[21]
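As one possible illustration of what “combined metrics” could look like in practice, the sketch below merges per-keyword organic and paid figures into a single report so both teams review the same table. The field names and sample numbers (keyword, organic_clicks, paid_clicks, paid_cost) are hypothetical placeholders; real teams would pull these from their own analytics and ad platforms.

```python
# Minimal sketch of a combined SEO/PPC keyword report.
# All field names and figures are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class KeywordMetrics:
    keyword: str
    organic_clicks: int = 0
    paid_clicks: int = 0
    paid_cost: float = 0.0

def combine(seo: dict[str, int], ppc: dict[str, tuple[int, float]]) -> list[KeywordMetrics]:
    """Merge organic clicks and paid clicks/cost into one row per keyword."""
    rows: dict[str, KeywordMetrics] = {}
    for kw, clicks in seo.items():
        rows.setdefault(kw, KeywordMetrics(kw)).organic_clicks = clicks
    for kw, (clicks, cost) in ppc.items():
        row = rows.setdefault(kw, KeywordMetrics(kw))
        row.paid_clicks, row.paid_cost = clicks, cost
    return sorted(rows.values(), key=lambda r: r.organic_clicks + r.paid_clicks, reverse=True)

if __name__ == "__main__":
    seo_data = {"wedding photography atlanta": 420, "wedding photography": 90}
    ppc_data = {"wedding photography atlanta": (180, 540.00)}
    for row in combine(seo_data, ppc_data):
        total = row.organic_clicks + row.paid_clicks
        print(f"{row.keyword:32} organic={row.organic_clicks:4} paid={row.paid_clicks:4} "
              f"cost=${row.paid_cost:8.2f} total={total:4}")
```

A shared view like this makes it easier to spot keywords where one channel already performs well and the other is wasting budget, which is the kind of decision the coordinated approach above is meant to support.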
The term is intuitive: organic marketing refers to getting your customers to come to you naturally over time, rather than ‘artificially’ via paid links or boosted posts. It includes any direct, instinctive, unpaid effort to reach your audience, with the exception of paid marketing tools. Paid tools, such as paid link ads, are considered inorganic marketing. If you’ve been putting your blood, sweat, and tears into revising and reinventing your user interface, maintaining Twitter and Facebook accounts, building your email lists, and improving your SEO, you’re doing it already. Now, let’s take a closer look at why it’s effective and how you can do it better.
This all sounds amazing, right? Unfortunately, organic marketing is also difficult to implement and resource-intensive. Ranking organically on Google and other search engines can be very hard, especially for competitive keywords such as “buy shoes.” Ultimately you want to rank on the first page, and ideally within the top three organic results, but this can be next to impossible when you are competing with large companies that have teams dedicated to ranking for their keywords.
Organic is what people are looking for; the other channels simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day, so if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.