40+ Tools to Advance Your International SEO Process


One of the most frequent questions I get is about the tools I use for international SEO. Although I included most of them in my international SEO presentation at MozCon, I didn't have the time to focus on them there, so I would like to share how I use them to support my international SEO activities.
There are tools to support every part of your journey, including identifying the potential, targeting an international audience, optimizing and promoting the websites, earning international popularity, and measuring and achieving benefit with the international SEO process. Let's get started!
Identify

Your initial international search status

Identify your initial international search visibility, from the volume and trends of queries to pages' impressions, clicks, and the CTR you get per country. Use the "Search Queries" report in Google Webmaster Tools and filter by location.
Google Webmaster Tools
In the Google Analytics "Demographics" report, check the volume and trends of your current visits, conversions, and conversion rates coming from different countries and languages, along with the traffic sources, keywords, and pages used.
Google Analytics

Your international search potential

Beyond researching the search volume for relevant keywords in the language and country that you want to target (using the keyword tool of the most popular search engine in the relevant country), you can also use tools like SEMrush and SearchMetrics — which support many countries — to identify your current market activity and competitors.
To find out which search engine is the most popular in your target country, you can use StatCounter or Alexa, and then use their keyword tools to verify the specific search volume. It would most likely be Google Keyword Planner for the western world that mostly uses Google, Yandex Keyword Statistics for Russia, and Baidu Index for China.
SearchMetrics and SEMrush

Your international keyword ideas

Identify additional keyword ideas with Ubersuggest (where you can choose between many different languages and countries) and the Suggestion Keyword Finder tool.
Ubersuggest and SEOchat Suggestion Keyword Finder

Why I don't recommend Google's Global Market Finder

I'm also frequently asked why I don't recommend (or recommend, but only very carefully) Google's Global Market Finder in my International SEO advice, and here's the reason: It's frequently inaccurate with the translations and term localization, and can easily lead to confusion and misunderstandings.
The tool has an "important note" below the results:
"...since the translations are created using Google Translate, they are not always perfect so be sure to confirm that the terms you're selecting are accurate..."
Even so, people usually assume that since it's a Google tool the results should be okay. In some cases, though, when you're not a native speaker of a language, it's very hard to know for sure when it's right or not.
Because of this, the tool is useless most of the time, since it only adds additional complexity to the process. In the end, you'll need native support anyway, as well as validation with other keyword tools for more accurate keyword ideas and their search volume.
For example, let's say I'm from an American company looking for the potential search volume in Mexico related to "apartments" and "rent apartments":
Google Global Market Finder
The tool suggests "pisos", "alquiler apartamentos", and "alquilar apartamentos". These results have the following issues:
  • The term "pisos" in Mexico is not used as a translation of "apartments," but instead is what the "floor" is called. It is in Spain where apartments are called "pisos."
  • "Alquiler apartamentos" is "apartment rentals," and "alquilar apartamentos" is "rent apartments," but while these terms are popular in Spain (and some other countries), they are not in Mexico. In Mexico, "Alquiler apartamentos" would be "Renta departamentos," and "Alquilar apartamentos" would instead be "Rentar departamentos."
You can see how if you search for these Global Market Finder-suggested terms in Google's own keyword research tool, their local search volume is very low compared to the ones I mention, which are the correct ones to use in this situation:
Rentar Departamentos / Alquilar Apartamentos Keyword research
Additionally, the term "Alquiler apartamentos" is not grammatically correct, since it needs a "de" preposition. It should be "Alquiler de apartamentos" (literally meaning "Rent of apartments" in Spanish). Although it's true this can also happen with any keyword research tool, in this case it adds even more confusion to the process. As I mentioned before, you will end up requiring native support to be accurate anyway.
Target

Your international audience profile

Understand your target international audience's demographic characteristics and online buying preferences not only by researching with studies like the Comscore Data Mine, but by browsing the TNS Digital Life and Google's Consumer Barometer sites. These sites let you select and interact with their data for almost every industry, country, and demographic characteristic.
TNS Research and Consumer Barometer

Your international industry's behavior and characteristics

Identify your competitors in the international market, including their characteristics and trends, by researching with Alexa, Rnkrnk, Google's Display Network Research, and SimilarWeb.
You should understand which are their most popular products and content, their unique selling proposition, their weaknesses and strengths, which marketing activities they're already developing, and a little about their online communities.
SimilarWeb Tool
Optimize

Your hreflang annotations

Make sure to include the correct hreflang annotations on the different versions of your international pages, indicating the language and country targeting of each page, following the ISO639-1 format for the language attribute and ISO 3166-1 Alpha 2 for the country attribute.
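To make the format concrete, here is a minimal Python sketch (not from the original post) that generates hreflang link elements for a hypothetical set of page versions; the URLs and language-country pairs are placeholders, with ISO 639-1 language codes and ISO 3166-1 Alpha 2 country codes.

# Minimal sketch: generate hreflang <link> annotations for a set of page versions.
# The URLs and language-country codes are hypothetical placeholders; language codes
# follow ISO 639-1 and country codes follow ISO 3166-1 Alpha 2.
versions = {
    "en-us": "http://www.example.com/us/",
    "en-gb": "http://www.example.com/uk/",
    "es-mx": "http://www.example.com/mx/",
}

def hreflang_tags(versions):
    """Build the <link> elements every version should carry in its <head>."""
    return ['<link rel="alternate" hreflang="{0}" href="{1}" />'.format(code, url)
            for code, url in versions.items()]

if __name__ == "__main__":
    for tag in hreflang_tags(versions):
        print(tag)

Each version of the page should list the annotations for all versions, including itself.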
You can use the DejanSEO hreflang validator to check the usage on a specific page, or Rob Hammond's SEO Crawler to quickly verify if all the pages are correctly featuring the notation. If you need to validate more than the 250 internal pages allowed, you can use the filters in Screaming Frog to specifically identify those pages which contain (or don't contain) the desired hreflang tags.
hreflang Tools

Your country-targeted website's geolocation

If you're country targeting and using a top-level domain, you can geolocate it using Google, Bing, and Yandex Webmaster Tools' geolocation features.
Nonetheless, the best way to geolocate a domain is by using the relevant ccTLD for each country. Take a look at IANA's database of country code registry operators, which usually allow domains to be purchased on their sites or list the approved domain registrars for each country.
Additionally, although it doesn't play as important a role as before, take a look at the example below. Minube, one of the most important travel communities in Spain, is hosted in Germany. If you can have a local IP for your website without much effort, that could be beneficial. You can check any website IP by using the FlagFox extension for Firefox or the Flag for Chrome extension.
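If you prefer the command line to a browser extension, a quick sketch along these lines resolves a site's IP with Python's standard library (the domain is a placeholder); mapping that IP to a country still requires a GeoIP database or lookup service.

import socket

def host_ip(hostname):
    """Resolve a hostname to the IP address of the server hosting it."""
    return socket.gethostbyname(hostname)

if __name__ == "__main__":
    # Placeholder domain; replace with the site you want to check.
    print(host_ip("www.example.com"))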
Identify IP Tool

Your international web content

It's important that you develop attractive and optimized content for your international target audience that not only includes the desired keywords, but is interesting, serves to connect with your visitors, and helps you achieve your international website goals.
For this, it's fundamental that you have native support. If it's difficult for you to find that, check out online translator communities such as ProZ.
In order to validate your content, you might want to use professional translation software (more reliable than Google Translate) that also integrates with Office for example, making it easier to use. PROMT is one good example.
If at some specific point in the process (hopefully not for long) you don't have direct access to a native language speaker, or you just want to double-check something specifically, you should take a look at the WordReference forum. There's an amazing number of threads around phrases and translations for many languages.
On a day-to-day basis, you should also keep updated with the international trends and hot topics in order to identify new content for the website. For this, you can use Google Trends (take a look at the Hot Searches per country); Twitterfall, which lets you easily follow a specific topic and has geotargeting features; and Talkwalker, a tool that supports many languages and easily generates alerts via email or RSS.
International Alerts and Trends
Promote

Your international popularity analysis

To research and understand your international competitors' link-building strategies, sources, and the popularity gap you have with them, you can use the same link- and social-analysis tools you likely already have, like Open Site Explorer, MajesticSEO, LinkRisk, and SocialCrawlytics.
Nonetheless, in this case, you should pay extra attention to the international audience's preferences, beyond link quality, volume, trends, sources, and types. Look at the social activity and profile, the most linked and shared content, the seasonality, the terms used and sites shared, the local industry influencers, and the favorite types of content, topics, and formats.
International Link Analysis

Your international link-building

Promote your international website assets by leveraging relevant local sites, understanding cultural factors, building relationships with local influencers and media, and identifying what works best in each country to scale and track the response to each international version.
For international prospecting you can use Link Prospector, FollowerWonk, and Topsy, and then follow up and manage your links with BuzzStream.
International Link Building
Measure

Your international search visibility

To easily verify how your international search audience sees your site ranking in their search results, you can use I Search From or Search Latte to quickly get the desired country and language's results.
Nonetheless, to make sure you're really seeing what your audience from other locations is seeing, it's best to do so with a local IP by using a proxy service. This will also let you verify your website from the desired international location and check whether any special settings, such as a redirect, are applied to visitors from that location.
For this, you can use a free proxy browser add-on, like the ones from FoxyProxy, along with any of the proxies on HMA's public proxy list. If you want a more reliable service, better speed, and a choice of many IPs, there are also paid options, such as Hide My Ass or Trusted Proxies.
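As a rough sketch of the same idea, you can also build the country- and language-specific Google results URL yourself (the hl parameter sets the interface language and gl the country) and, if you have one, send the request through a proxy; the query and proxy address below are placeholders.

import requests

def international_serp_url(query, language="es", country="mx"):
    """Build a Google results URL for a given interface language (hl) and country (gl)."""
    return "https://www.google.com/search?q={0}&hl={1}&gl={2}".format(
        requests.utils.quote(query), language, country)

if __name__ == "__main__":
    url = international_serp_url("rentar departamentos")
    # Placeholder proxy; substitute an address from your proxy provider or list.
    proxies = {"http": "http://203.0.113.10:8080", "https": "http://203.0.113.10:8080"}
    response = requests.get(url, proxies=proxies, timeout=10)
    print(response.status_code, url)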
Geolocation tools

Your international search results

Measure each of your International web versions independently, from the rankings for each relevant country and language to the visits and conversions. Remember to pay extra attention to the currency settings, cross-domain tracking, and the country and language traffic alignment.
For each of the international versions, segment and analyze the rankings, visits, conversions, average conversion value and rate, and the keywords, pages, and sources of traffic per language, location, and device.
For your search rankings, you can use web-based tools like Moz Rank Tracker, SEscout, and Authority Labs, which support international search engines, or use desktop applications such as Advanced Web Rankings, along with a proxy service to avoid being blocked. For quick revisions you can use free browser extensions such as Rank Checker for Firefox and SEO SERP for Chrome.
For the site's behavior with the search engines, it is important that you also follow up with Google Webmaster Tools (or the Webmaster Tools of the relevant international search engine) along with Google Analytics, from a traffic and conversion analysis perspective. That will let you continuously follow up on your international SEO results and make the appropriate decisions.
International Search Rankings
Benefit

Your international SEO ROI

Calculate what's required in order to achieve your conversion goals and a high ROI in your international SEO process while taking the SEO process costs into consideration. You can use the International SEO ROI calculator to facilitate this activity.
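If you'd rather run the numbers yourself, the underlying calculation is simple; here is a minimal sketch with made-up example figures.

def seo_roi(monthly_visits, conversion_rate, avg_conversion_value, monthly_seo_cost):
    """Return monthly revenue, profit, and ROI for an SEO campaign."""
    revenue = monthly_visits * conversion_rate * avg_conversion_value
    profit = revenue - monthly_seo_cost
    return revenue, profit, profit / monthly_seo_cost

if __name__ == "__main__":
    # Hypothetical figures: 20,000 visits, 2% conversion rate, $50 per conversion, $3,000 monthly cost.
    revenue, profit, roi = seo_roi(20000, 0.02, 50, 3000)
    print("Revenue: ${0:,.0f}  Profit: ${1:,.0f}  ROI: {2:.0%}".format(revenue, profit, roi))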
International SEO ROI Calculator
Always use your brain
Last but not least, let's not forget that despite all the help these tools might give you, the most important tool you have is your own brain.
Unfortunately, I've seen how we sometimes switch on "autopilot," missing great opportunities (or even making mistakes) as a consequence.
Tools are not meant to replace you, but to support you, so do your own analysis, test everything and validate frequently, using your brain.
Courtesy: moz.com

How to Perform a Great SEO Audit?


Now that tax season is over, it's once again safe to say my favorite A-word... audit! That's right. My name is Steve, and I'm an SEO audit junkie.
Like any good junkie, I've read every audit-related article; I've written thousands of lines of audit-related code, and I've performed audits for friends, clients, and pretty much everyone else I know with a website.
All of this research and experience has helped me create an insanely thorough SEO audit process. And today, I'm going to share that process with you.
This is designed to be a comprehensive guide for performing a technical SEO audit. Whether you're auditing your own site, investigating an issue for a client, or just looking for good bathroom reading material, I can assure you that this guide has a little something for everyone. So without further ado, let's begin.

SEO Audit Preparation

When performing an audit, most people want to dive right into the analysis. Although I agree it's a lot more fun to immediately start analyzing, you should resist the urge. A thorough audit requires at least a little planning to ensure nothing slips through the cracks.

Crawl Before You Walk

Before we can diagnose problems with the site, we have to know exactly what we're dealing with. Therefore, the first (and most important) preparation step is to crawl the entire website.

Crawling Tools

I've written custom crawling and analysis code for my audits, but if you want to avoid coding, I recommend using Screaming Frog's SEO Spider to perform the site crawl (it's free for the first 500 URIs and £99/year after that).
Alternatively, if you want a truly free tool, you can use Xenu's Link Sleuth; however, be forewarned that this tool was designed to crawl a site to find broken links. It displays a site's page titles and meta descriptions, but it was not created to perform the level of analysis we're going to discuss.
For more information about these crawling tools, read Dr. Pete's Crawler Face-off: Xenu vs. Screaming Frog.

Crawling Configuration

Once you've chosen (or developed) a crawling tool, you need to configure it to behave like your favorite search engine crawler (e.g., Googlebot, Bingbot, etc.). First, you should set the crawler's user agent to an appropriate string. Popular search engine user agents:
  • Googlebot - "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
  • Bingbot - "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
Next, you should decide how you want the crawler to handle various Web technologies.
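Going back to the user agent for a moment: if you script your own checks instead of using an off-the-shelf crawler, the user agent is just a request header. A minimal sketch, using the Googlebot string quoted above and a placeholder URL:

import requests

# Googlebot user agent string quoted above; swap in Bingbot's string to test Bing's view.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as_crawler(url, user_agent=GOOGLEBOT_UA):
    """Fetch a page while identifying as a search engine crawler."""
    return requests.get(url, headers={"User-Agent": user_agent}, timeout=10)

if __name__ == "__main__":
    response = fetch_as_crawler("http://www.example.com/")  # placeholder URL
    print(response.status_code, len(response.text))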
There is an ongoing debate about the intelligence of search engine crawlers. It's not entirely clear if they are full-blown headless browsers or simply glorified curl scripts (or something in between).
By default, I suggest disabling cookies, JavaScript, and CSS when crawling a site. If you can diagnose and correct the problems encountered by dumb crawlers, that work can also be applied to most (if not all) of the problems experienced by smarter crawlers.
Then, for situations where a dumb crawler just won't cut it (e.g., pages that are heavily reliant on AJAX), you can switch to a smarter crawler.

Ask the Oracles

The site crawl gives us a wealth of information, but to take this audit to the next level, we need to consult the search engines. Unfortunately, search engines don't like to give unrestricted access to their servers, so we'll just have to settle for the next best thing: webmaster tools.
Most of the major search engines offer a set of diagnostic tools for webmasters, but for our purposes, we'll focus on Google Webmaster Tools and Bing Webmaster Tools. If you still haven't registered your site with these services, now's as good a time as any. Helpful videos: How to Register Your Site with Google Webmaster Tools, and How to Register Your Site with Bing Webmaster Tools.
Now that we've consulted the search engines, we also need to get input from the site's visitors. The easiest way to get that input is through the site's analytics.
The Web is being monitored by an ever-expanding list of analytics packages, but for our purposes, it doesn't matter which package your site is using. As long as you can investigate your site's traffic patterns, you're good to go for our upcoming analysis.
At this point, we're not finished collecting data, but we have enough to begin the analysis, so let's get this party started!

SEO Audit Analysis

The actual analysis is broken down into five large sections:
  • Accessibility
  • Indexability
  • On-Page Ranking Factors
  • Off-Page Ranking Factors
  • Competitive Analysis
(1) Accessibility

If search engines and users can't access your site, it might as well not exist. With that in mind, let's make sure your site's pages are accessible.

Robots.txt

The robots.txt file is used to restrict search engine crawlers from accessing sections of your website. Although the file is very useful, it's also an easy way to inadvertently block crawlers. As an extreme example, the following robots.txt entry restricts all crawlers from accessing any part of your site:

User-agent: *
Disallow: /

Manually check the robots.txt file, and make sure it's not restricting access to important sections of your site. You can also use your Google Webmaster Tools account to identify URLs that are being blocked by the file.

Robots Meta Tags

The robots meta tag is used to tell search engine crawlers if they are allowed to index a specific page and follow its links.
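If you'd like to check robots.txt rules programmatically as well, here is a minimal sketch using Python's standard library to parse a live robots.txt and report whether a given URL is blocked for a given crawler; the URLs are placeholders.

from urllib.robotparser import RobotFileParser

def is_blocked(robots_url, page_url, user_agent="Googlebot"):
    """Return True if the robots.txt at robots_url disallows user_agent from fetching page_url."""
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()
    return not parser.can_fetch(user_agent, page_url)

if __name__ == "__main__":
    # Placeholder URLs; point these at your own robots.txt and an important page.
    print(is_blocked("http://www.example.com/robots.txt", "http://www.example.com/products/"))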
When analyzing your site's accessibility, you want to identify pages that are inadvertently blocking crawlers. Here is an example of a robots meta tag that prevents crawlers from indexing a page and following its links:

<meta name="robots" content="noindex, nofollow">

HTTP Status Codes

Search engines and users are unable to access your site's content if you have URLs that return errors (i.e., 4xx and 5xx HTTP status codes).
During your site crawl, you should identify and fix any URLs that return errors (this also includes soft 404 errors). If a broken URL's corresponding page is no longer available on your site, redirect the URL to a relevant replacement.
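A minimal sketch of that check: request each URL from your crawl and flag anything that returns a 4xx or 5xx status; the URL list is a placeholder for your crawl output.

import requests

def find_broken_urls(urls):
    """Return (url, status) pairs for URLs that respond with 4xx/5xx codes or fail entirely."""
    broken = []
    for url in urls:
        try:
            # Some servers mishandle HEAD requests; switch to requests.get if results look odd.
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # unreachable counts as broken too
        if status is None or status >= 400:
            broken.append((url, status))
    return broken

if __name__ == "__main__":
    # Placeholder list; feed in the URLs from your site crawl.
    print(find_broken_urls(["http://www.example.com/", "http://www.example.com/old-page/"]))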
Speaking of redirection, this is also a great opportunity to inventory your site's redirection techniques. Be sure the site is using 301 HTTP redirects (and not 302 HTTP redirects, meta refresh redirects, or JavaScript-based redirects) because they pass the most link juice to their destination pages.

XML Sitemap

Your site's XML Sitemap provides a roadmap for search engine crawlers to ensure they can easily find all of your site's pages. Here are a few important questions to answer about your Sitemap:
  • Is the Sitemap a well-formed XML document? Does it follow the Sitemap protocol? Search engines expect a specific format for Sitemaps; if yours doesn't conform to this format, it might not be processed correctly.
  • Has the Sitemap been submitted to your webmaster tools accounts? It's possible for search engines to find the Sitemap without your assistance, but you should explicitly notify them about its location.
  • Did you find pages in the site crawl that do not appear in the Sitemap? You want to make sure the Sitemap presents an up-to-date view of the website.
  • Are there pages listed in the Sitemap that do not appear in the site crawl? If these pages still exist on the site, they are currently orphaned. Find an appropriate location for them in the site architecture, and make sure they receive at least one internal backlink.
Helpful videos: How to Submit a Sitemap to Google, and How to Submit a Sitemap to Bing.

Site Architecture

Your site architecture defines the overall structure of your website, including its vertical depth (how many levels it has) as well as its horizontal breadth at each level.
When evaluating your site architecture, identify how many clicks it takes to get from the homepage to other important pages. Also, evaluate how well pages are linking to others in the site's hierarchy, and make sure the most important pages are prioritized in the architecture.
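One way to quantify click depth is to treat your crawl as a link graph and run a breadth-first search from the homepage; the toy graph below is a placeholder for real crawl data.

from collections import deque

def click_depths(link_graph, homepage):
    """Breadth-first search from the homepage; returns each reachable page's click depth."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for linked_page in link_graph.get(page, []):
            if linked_page not in depths:
                depths[linked_page] = depths[page] + 1
                queue.append(linked_page)
    return depths

if __name__ == "__main__":
    # Toy link graph (page -> pages it links to), standing in for real crawl output.
    graph = {
        "/": ["/products/", "/blog/"],
        "/products/": ["/products/widgets/"],
        "/blog/": ["/blog/post-1/"],
    }
    print(click_depths(graph, "/"))

Pages with a large depth (or missing from the result entirely) are the ones being buried by the architecture.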
Ideally, you want to strive for a flatter site architecture that takes advantage of both vertical and horizontal linking opportunities.

Flash and JavaScript Navigation

The best site architecture in the world can be undermined by navigational elements that are inaccessible to search engines. Although search engine crawlers have become much more intelligent over the years, it is still safer to avoid Flash and JavaScript navigation.
To evaluate your site's usage of JavaScript navigation, you can perform two separate site crawls: one with JavaScript disabled and another with it enabled. Then, you can compare the corresponding link graphs to identify sections of the site that are inaccessible without JavaScript.

Site Performance

Users have a very limited attention span, and if your site takes too long to load, they will leave. Similarly, search engine crawlers have a limited amount of time that they can allocate to each site on the Internet. Consequently, sites that load quickly are crawled more thoroughly and more consistently than slower ones.
You can evaluate your site's performance with a number of different tools. Google Page Speed and YSlow check a given page using various best practices and then provide helpful suggestions (e.g., enable compression, leverage a content distribution network for heavily used resources, etc.). Pingdom Full Page Test presents an itemized list of the objects loaded by a page, their sizes, and their load times (see Pingdom's results for SEOmoz for an example). These tools help you identify pages (and specific objects on those pages) that are serving as bottlenecks for your site. Then, you can itemize suggestions for optimizing those bottlenecks and improving your site's performance.

(2) Indexability

We've identified the pages that search engines are allowed to access. Next, we need to determine how many of those pages are actually being indexed by the search engines.

Site: Command

Most search engines offer a "site:" command that allows you to search for content on a specific website. You can use this command to get a very rough estimate for the number of pages that are being indexed by a given search engine. For example, if we search for "site:seomoz.org" on Google, we see that the search engine has indexed approximately 60,900 pages for SEOmoz.
Although this reported number of indexed pages is rarely accurate, a rough estimate can still be extremely valuable. You already know your site's total page count (based on the site crawl and the XML Sitemap), so the estimated index count can help identify one of three scenarios:
  • The index and actual counts are roughly equivalent - this is the ideal scenario; the search engines are successfully crawling and indexing your site's pages.
  • The index count is significantly smaller than the actual count - this scenario indicates that the search engines are not indexing many of your site's pages. Hopefully, you already identified the source of this problem while investigating the site's accessibility. If not, you might need to check if the site's being penalized by the search engines (more on this in a moment).
  • The index count is significantly larger than the actual count - this scenario usually suggests that your site is serving duplicate content (e.g., pages accessible through multiple entry points, "appreciably similar" content on distinct pages, etc.).
If you suspect a duplicate content issue, Google's "site:" command can also help confirm those suspicions. Simply append "&start=990" to the end of the URL in your browser, then look for Google's duplicate content warning at the bottom of the page. If you have a duplicate content issue, don't worry. We'll address duplicate content in an upcoming section of the audit.

Index Sanity Checks

The "site:" command allows us to look at indexability from a very high level. Now, we need to be a little more granular. Specifically, we need to make sure the search engines are indexing the site's most important pages.

Page Searches

Hopefully, you already found your site's high priority pages in the index while performing "site:" queries. If not, you can search for a specific page's URL to check if it has been indexed. If you don't find the page, double check its accessibility. If the page is accessible, you should check if the page has been penalized.
Rand describes an alternative approach to finding indexed pages in this article: Indexation for SEO: Real Numbers in 5 Easy Steps.

Brand Searches

After you check whether your important pages have been indexed, you should check if your website is ranking well for your company's name (or your brand's name).
Just search for your company or brand name. If your website appears at the top of the results, all is well with the universe. On the other hand, if you don't see your website listed, the site might be penalized, and it's time to investigate further.

Search Engine Penalties

Hopefully, you've made it this far in the audit without detecting even the slightest hint of a search engine penalty. But if you think your site has been penalized, here are four steps to help you fix the situation.

Step 1: Make Sure You've Actually Been Penalized

I can't tell you how many times I've researched someone's "search engine penalty" only to find an accidentally noindexed page or a small shuffle in the search engine rankings. So before you start raising the penalty alarm, be sure you've actually been penalized.
In many cases, a true penalty will be glaringly obvious. Your pages will be completely deindexed (even though they're openly accessible), or you will receive a penalty message in your webmaster tools account.
It's important to note that your site can also lose significant traffic due to a search engine algorithm update. Although this isn't a penalty per se, it should be handled with the same diligence as a true penalty.

Step 2: Identify the Reason(s) for the Penalty

Once you're sure the site has been penalized, you need to investigate the root cause for the penalty. If you receive a formal notification from a search engine, this step is already complete.
Unfortunately, if your site is the victim of an algorithmic update, you have more detective work to do. Begin searching SEO-related news sites and forums until you find answers. When search engines change their algorithms, many sites are affected, so it shouldn't take long to figure out what happened. For even more help, read Sujan Patel's article about identifying search engine penalties.

Step 3: Fix the Site's Penalized Behavior

After you've identified why your site was penalized, you have to methodically fix the offending behavior. This is easier said than done, but fortunately, the SEOmoz community is always happy to help.

Step 4: Request Reconsideration

Once you've fixed all of the problems, you need to request reconsideration from the search engines that penalized you. However, be forewarned that if your site wasn't explicitly penalized (i.e., it was the victim of an algorithm update), a reconsideration request will be ineffective, and you'll have to wait for the algorithm to refresh. For more information, read Google's guide for Reconsideration Requests and Bing's guide for Getting Out of the Penalty Box. With any luck, Matt Cutts will release you from search engine prison.

(3) On-Page Ranking Factors

Up to this point, we've analyzed the accessibility and indexability of your site. Now it's time to turn our attention to the characteristics of your site's pages that influence the site's search engine rankings.
For each of the on-page ranking factors, we'll investigate page level characteristics for the site's individual pages as well as domain level characteristics for the entire website.
In general, the page level analysis is useful for identifying specific examples of optimization opportunities, and the domain level analysis helps define the level of effort necessary to make site-wide corrections. URLs Since a URL is the entry point to a page's content, it's a logical place to begin our on-page analysis. When analyzing the URL for a given page, here are a few important questions to ask: Is the URL short and user-friendly? A common rule of thumb is to keep URLs less than 115 characters. Does the URL include relevant keywords? It's important to use a URL that effectively describes its corresponding content. Is the URL using subfolders instead of subdomains? Subdomains are mostly treated as unique domains when it comes to passing link juice. Subfolders don't have this problem, and as a result, they are typically preferred over subdomains. Does the URL avoid using excessive parameters? If possible, use static URLs. If you simply can't avoid using parameters, at least register them with your Google Webmaster Tools account. Is the URL using hyphens to separate words? Underscores have a very checkered past with certain search engines. To be on the safe side, just use hyphens. Additional URL Optimization Resources: 11 Best Practices for URLs SEO URL Optimization When analyzing the URLs for an entire domain, here are a few additional questions: Do most of the URLs follow the best practices established in the page level analysis, or are many of the URLs poorly optimized? If a number of URLs are suboptimal, do they at least break the rules in a consistent manner, or are they all over the map? Based on the site's keywords, is the domain appropriate? Does it contain keywords? Does it appear spammy? URL-based Duplicate Content In addition to analyzing the site's URL optimization, it's also important to investigate the existence of URL-based duplicate content on the site. URLs are often responsible for the majority of duplicate content on a website because every URL represents a unique entry point into the site. If two distinct URLs point to the same page (without the use of redirection), search engines believe two distinct pages exist. For an exhaustive list of ways URLs can create duplicate content, read Section V. of Dr. Pete's fantastic guide: Duplicate Content in a Post-Panda World (go ahead and read the entire guide - it's amazing). Ideally, your site crawl will discover most (if not all) sources of URL-based duplicate content on your website. But to be on the safe side, you should explicitly check your site for the most popular URL-based culprits (programmatically or manually). In the content analysis section, we'll discuss additional techniques for identifying duplicate content (including URL-based duplicate content). Content We all know content is king so now, let's give your site the royal treatment. To investigate a page's content, you have various tools at your disposal. The simplest approach is to view Google's cached copy of the page (the text-only version). Alternatively, you can use SEO Browser or Browseo. These tools display a text-based version of the page, and they also include helpful information about the page (e.g., page title, meta description, etc.). Regardless of the tools you use, the following questions can help guide your investigation: Does the page contain substantive content? There's no hard and fast rule for how much content a page should contain, but using at least 300 words is a good rule of thumb. Is the content valuable to its audience? 
This is obviously somewhat subjective, but you can approximate the answer with metrics such as bounce rate and time spent on the page. Does the content contain targeted keywords? Do they appear in the first few paragraphs? If you want to rank for a keyword, it really helps to use it in your content. Is the content spammy (e.g., keyword stuffing)? You want to include keywords in your content, but you don't want to go overboard. Does the content minimize spelling and grammatical errors? Your content loses professional credibility if it contains glaring mistakes. Spell check is your friend; I promise. Is the content easily readable? Various metrics exist for quantifying the readability of content (e.g., Flesch Reading Ease, Fog Index, etc.). Are search engines able to process the content? Don't trap your content inside Flash, overly complex JavaScript, or images. Additional Content Optimization Resources: SEO Copywriting Tips The Ultimate Blogger Writing Guide When analyzing the content across your entire site, you want to focus on 3 main areas: 1. Information Architecture Your site's information architecture defines how information is laid out on the site. It is the blueprint for how your site presents information (and how you expect visitors to consume that information). During the audit, you should ensure that each of your site's pages has a purpose. You should also verify that each of your targeted keywords is being represented by a page on your site. 2. Keyword Cannibalism Keyword cannibalism describes the situation where your site has multiple pages that target the same keyword. When multiple pages target a keyword, it creates confusion for the search engines, and more importantly, it creates confusion for visitors. To identify cannibalism, you can create a keyword index that maps keywords to pages on your site. Then, when you identify collisions (i.e., multiple pages associated with a particular keyword), you can merge the pages or repurpose the competing pages to target alternate (and unique) keywords. 3. Duplicate Content Your site has duplicate content if multiple pages contain the same (or nearly the same) content. Unfortunately, these pages can be both internal and external (i.e., hosted on a different domain). You can identify duplicate content on internal pages by building equivalence classes with the site crawl. These classes are essentially clusters of duplicate or near-duplicate content. Then, for each cluster, you can designate one of the pages as the original and the others as duplicates. To learn how to make these designations, read Section IV. of Dr. Pete's duplicate content guide: Duplicate Content in a Post-Panda World. To identify duplicate content on external pages, you can use Copyscape or blekko's duplicate content detection. Here's an excerpt from blekko's results for SEOmoz: blekko Duplicate Content Results for SEOmoz HTML Markup It's hard to overstate the value of your site's HTML because it contains a few of the most important on-page ranking factors. Before diving into specific HTML elements, we need to validate your site's HTML and evaluate its standards compliance. W3C offers a markup validator to help you find standards violations in your HTML markup. They also offer a CSS validator to help you check your site's CSS. Titles A page's title is its single most identifying characteristic. It's what appears first in the search engine results, and it's often the first thing people notice in social media. 
Thus, it's extremely important to evaluate the titles on your site. When evaluating an individual page's title, you should consider the following questions: Is the title succinct? A commonly used guideline is to make titles no more than 70 characters. Longer titles will get cut off in the search engine results, and they also make it difficult for people to add commentary on Twitter. Does the title effectively describe the page's content? Don't pull the bait and switch on your audience; use a compelling title that directly relates to your content's subject matter. Does the title contain a targeted keyword? Is the keyword at the front of the title? A page's title is one of the strongest on-page ranking factors so make sure it includes a targeted keyword. Is the title over-optimized? Rand covers this topic in a recent Over-Optimization Whiteboard Friday.
Additional Title Optimization Resources: Are Your Titles Irresistibly Click Worthy & Viral?! How to Write Magnetic Headlines
When analyzing the titles across an entire domain, make sure each page has a unique title. You can use your site crawl to perform this analysis. Alternatively, Google Webmaster Tools reports duplicate titles that Google finds on your site (look under "Optimization" > "HTML Improvements"). Meta Descriptions A page's meta description doesn't explicitly act as a ranking factor, but it does affect the page's click-through rate in the search engine results.
The meta description best practices are almost identical to those described for titles. In your page level analysis, you're looking for succinct (no more than 155 characters) and relevant meta descriptions that have not been over-optimized.
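As a small sketch of both checks, the snippet below flags titles and meta descriptions that exceed the 70- and 155-character guidelines and groups URLs that share a title; the page data is a placeholder for whatever your crawler exports.

from collections import defaultdict

TITLE_LIMIT = 70         # characters before titles are typically truncated in the results
DESCRIPTION_LIMIT = 155  # characters before meta descriptions are typically truncated

def audit_titles_and_descriptions(pages):
    """Flag over-length titles/descriptions and group URLs that share the same title."""
    issues, titles_seen = [], defaultdict(list)
    for url, title, description in pages:
        titles_seen[title].append(url)
        if len(title) > TITLE_LIMIT:
            issues.append((url, "title too long ({0} chars)".format(len(title))))
        if len(description) > DESCRIPTION_LIMIT:
            issues.append((url, "meta description too long ({0} chars)".format(len(description))))
    duplicates = {title: urls for title, urls in titles_seen.items() if len(urls) > 1}
    return issues, duplicates

if __name__ == "__main__":
    # Placeholder crawl export: (url, title, meta description) tuples.
    sample = [
        ("/a", "Blue Widgets | Example", "Buy blue widgets."),
        ("/b", "Blue Widgets | Example", "Our full range of blue widgets."),
    ]
    print(audit_titles_and_descriptions(sample))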
In your domain level analysis, you want to ensure that each page has a unique meta description. Your Google Webmaster Tools account will report duplicate meta descriptions that Google finds (look under "Optimization" > "HTML Improvements"). Other Tags We've covered the two most important HTML elements, but they're not the only ones you should investigate. Here are a few more questions to answer about the others:
Are any pages using meta keywords? Meta keywords have become almost universally associated with spam. To be on the safe side, just avoid them. Do any pages contain a rel="canonical" link? This link element is used to help avoid duplicate content issues. Make sure your site is using it correctly. Are any pages in a paginated series? Are they using rel="next" and rel="prev" link elements? These link elements help inform search engines how to handle pagination on your site. Additional Resources: Google Explains the rel="canonical" Link Google Explains the rel="next" and rel="prev" Links Images A picture might say a thousand words to users, but for search engines, pictures are mute. Therefore, your site needs to provide image metadata so that search engines can participate in the conversation.
When analyzing an image, the two most important attributes are the image's alt text and the image's filename. Both attributes should include relevant descriptions of the image, and ideally, they'll also contain targeted keywords.
For a comprehensive resource on optimizing images, read Rick DeJarnette's Ultimate Guide for Web Images and SEO. Outlinks When one page links to another, that link is an endorsement of the receiving page's quality. Thus, an important part of the audit is making sure your site links to other high quality sites.
To help evaluate the links on a given page, here are a few questions to keep in mind: Do the links point to trustworthy sites? Your site should avoid linking to spammy sites because it reflects poorly on the trustworthiness of your site. If a site links to spam, there's a good chance that it's also spam. Are the links relevant to the page's content? When you link to another page, its content should supplement yours. If your links are irrelevant, it leads to a poor user experience and reduced relevancy for your page. Do the links use relevant anchor text? Does the anchor text include targeted keywords? A link's anchor text should accurately describe the page it points to. This helps users decide if they want to follow the link, and it helps search engines identify the subject matter of the destination page. Are any of the links broken? Links that return a 4xx or 5xx status code are considered broken. You can identify them in your site crawl, or you can also use a Link Checker. Do the links use unnecessary redirection? If your internal links are generating redirects, you're unnecessarily diluting the link juice that flows through your site. Make sure your internal links point to the appropriate destination pages. Are any of the links nofollowed? Aside from situations where you can't control outlinks (e.g., user generated content), you should let your link juice flow freely. Additional Link Optimization Resources: The Importance of Internal Linking Internal Link - Best Practices for SEO When analyzing a site's outlinks, you should investigate the distribution of internal links that point to the various pages on your site. Make sure the most important pages receive the most internal backlinks. To be clear, this is not PageRank sculpting. You're simply ensuring that your most important pages are the easiest to find on your site. Other Tags Images and links are not the only important elements found in the HTML section. Here are a few questions to ask about the others: Does the page use an H1 tag? Does the tag include a targeted keyword? Heading tags aren't as powerful as titles, but they're still an important place to include keywords. Is the page avoiding frames and iframes? When you use a frame to embed content, search engines do not associate the content with your page (it is associated with the frame's source page). Does the page have an appropriate content-to-ads ratio? If your site uses ads as a revenue source, that's fine. Just make sure they don't overpower your site's content. We've now covered the most important on-page ranking factors for your website. For even more information about on-page optimization, read Rand's guide: Perfecting Keyword Targeting & On-Page Optimization. (4) Off-Page Ranking Factors The on-page ranking factors play an important role in your site's position in the search engine rankings, but they're only one piece of a much bigger puzzle. Next, we're going to focus on the ranking factors that are generated by external sources. Popularity The most popular sites aren't always the most useful, but their popularity allows them to influence more people and attract even more attention. Thus, even though your site's popularity isn't the most important metric to monitor, it is still a valuable predictor of ongoing success.
When evaluating your site's popularity, here are a few questions to answer:
Is your site gaining traffic? Your analytics package is your best source for traffic-based information (aside from processing your server logs). You want to make sure your site isn't losing traffic (and hence popularity) over time. How does your site's popularity compare against similar sites? Using third party services such as Compete, Alexa, and Quantcast, you can evaluate if your site's popularity is outpacing (or being outpaced by) competing sites. Is your site receiving backlinks from popular sites? Link-based popularity metrics such as mozRank are useful for monitoring your site's popularity as well as the popularity of the sites linking to yours. Trustworthiness The trustworthiness of a website is a very subjective metric because all individuals have their own unique interpretation of trust. To avoid these personal biases, it's easier to identify behavior that is commonly accepted as being untrustworthy.
Untrustworthy behavior falls into numerous categories, but for our purposes, we'll focus on malware and spam. To check your site for malware, you can rely on blacklists such as DNS-BH or Google's Safe Browsing API.
You can also use an analysis service like McAfee's SiteAdvisor. Here is an excerpt from SiteAdvisor's report for SEOmoz: SiteAdvisor Results for SEOmoz When investigating spammy behavior on your website, you should at least look for the following: Keyword Stuffing - creating content with an unnaturally high keyword density. Invisible or Hidden Text - exploiting the technology gap between Web browsers and search engine crawlers to present content to search engines that is hidden from users (e.g., "hiding" text by making it the same color as the background). Cloaking - returning different versions of a website based on the requesting user agent or IP address (i.e., showing the search engines one thing while showing users something else). Additional Web Spam Resources: Web Spam Taxonomy Cloaking and Redirection: A Preliminary Study Even if your site appears to be trustworthy, you still need to evaluate the trustworthiness of its neighboring sites (the sites it links to and the sites it receives links from).
If you've identified a collection of untrustworthy sites, you can use a slightly modified version of PageRank to propagate distrust from those bad sites to the rest of a link graph. For years, this approach has been referred to as BadRank, and it can be deployed on outgoing links or incoming links to identify neighborhoods of untrustworthy sites.
Alternatively, you can attack the problem by propagating trust from a seed set of trustworthy sites (e.g., cnn.com, mit.edu, etc.). This approach is called TrustRank, and it has been implemented by SEOmoz in the form of their mozTrust metric. Sites with a higher mozTrust value are located closer to trustworthy sites in the link graph and therefore considered more trusted. Additional Trust Propagation Resources: Combating Web Spam with TrustRank Propagating Trust and Distrust to Demote Web Spam Backlink Profile Your site's quality is largely determined by the quality of the sites linking to it. Thus, it is extremely important to analyze the backlink profile of your site and identify opportunities for improvement.
Fortunately, there is an ever-expanding list of tools available to find backlink data, including your webmaster tools accounts, blekko, Open Site Explorer, Majestic SEO, and Ahrefs.
Here are a few questions to ask about your site's backlinks: How many unique root domains are linking to the site? You can never have too many high quality backlinks, but a link from 100 different root domains is significantly more valuable than 100 links from a single root domain. What percentage of the backlinks are nofollowed? Ideally, the vast majority of your site's backlinks will be followed. However, a site without any nofollowed backlinks appears highly suspicious to search engines. Does the anchor text distribution appear natural? If too many of your site's backlinks use exact match anchor text, search engines will flag those links as being unnatural. Are the backlinks from sites that are topically relevant? Topically relevant backlinks help establish your site as an authoritative source of information in your industry. How popular/trustworthy/authoritative are the root domains that are linking to the site? If too many of your site's backlinks are from low quality sites, your site will also be considered low quality. Additional Backlink Analysis Resources: 71 Technical Factors for Backlink Analysis Anchor Text Distribution: Avoiding Over Optimization The Professional Guide to Link Building Authority A site's authority is determined by a combination of factors (e.g., the quality and quantity of its backlinks, its popularity, its trustworthiness, etc.). To help evaluate your site's authority, SEOmoz provides two important metrics: Page Authority and Domain Authority. Page Authority predicts how well a specific page will perform in the search engine rankings, and Domain Authority predicts the performance for an entire domain. Both metrics aggregate numerous link-based features (e.g., mozRank, mozTrust, etc.) to give you an easy way to compare the relative strengths of various pages and domains. For more information, watch the corresponding Whiteboard Friday video about these metrics: Domain Authority & Page Authority Metrics. Social Engagement As the Web becomes more and more social, the success of your website depends more and more on its ability to attract social mentions and create social conversations. Each social network provides its own form of social currency. Facebook has likes. Twitter has retweets. Google+ has +1s. The list goes on and on. Regardless of the specific network, the websites that possess the most currency are the most relevant socially. When analyzing your site's social engagement, you should quantify how well it's accumulating social currency in each of the most important social networks (i.e., how many likes/retweets/+1s/etc. are each of your site's pages receiving). You can query the networks for this information, or you can use a third party service such as Shared Count. Additionally, you should evaluate the authority of the individuals that are sharing your site's content. Just as you want backlinks from high quality sites, you want mentions from reputable and highly influential people. Additional Social Engagement Resources: Tracking the KPIs of Social Media How Authorship (and Google+) Will Change Linkbuilding AuthorRank Could be Bigger than all Panda Updates Combined (5) Competitive Analysis Just when you thought we were done, it's time to start the analysis all over for your site's competitors. I know it sounds painful, but the more you know about your competitors, the easier it is to identify (and exploit) their weaknesses. My process for analyzing a competitor's website is almost identical to what we've already discussed. 
For another person's perspective, I strongly recommend Selena Narayanasamy's Guide to Competitive Research.

SEO Audit Report

After you've analyzed your site and the sites of your competitors, you still need to distill all of your observations into an actionable SEO audit report. Since your eyes are probably bleeding by now, I'll save the world's greatest SEO audit report for another post. In the meantime, here are three important tips for presenting your findings in an effective manner:
  • Write for multiple audiences. The meat of your report will contain very technical observations and recommendations. However, it's important to realize that the report will not always be read by tech-savvy individuals. Thus, when writing the report, be sure to keep other audiences in mind and provide helpful summaries for managers, executives, and anyone else that might not have a working knowledge of SEO.
  • Prioritize, prioritize, and then prioritize some more. Regardless of who actually reads your report, try to respect their time. Put the most pressing issues at the beginning of the report so that everyone knows which items are critically important (and which ones can be put on the back burner, if necessary).
  • Provide actionable suggestions. Don't give generic recommendations like, "Write better titles." Provide specific examples that can be used immediately to make a positive impact on the site. Even if the recommendations are large in scope, attempt to offer concrete first steps to help get the ball rolling.

Additional Resources

Just in case 6,000+ words weren't enough to feed your SEO audit hunger, here are a few more SEO audit resources:
  • Technical Site Audit Checklist - Geoff Kenyon provides an excellent checklist of items to investigate during an SEO audit. If you check off each of these items, you're well on your way to completing an excellent audit.
  • The Ultimate SEO Audit - This is a slightly older post by The Daily Anchor, but it still contains a lot of useful information. It's organized as three individual audits: (1) technical audit, (2) content audit, and (3) link audit.
  • A Step by Step 15 Minute SEO Audit - Danny Dover offers a great guide for identifying large SEO problems in a very short period of time.
  • Find Your Site's Biggest Technical Flaws in 60 Minutes - Continuing with the time-sensitive theme, this post by Dave Sottimano shows you just how many SEO-related problems you can identify in an hour.
Courtesy: SEOmoz

Google Vs Facebook : Comparative Data


Google vs Facebook in figures

Google Inc is taking the threat posed by Facebook Inc's internet social network more seriously since co-founder Larry Page returned as CEO a year ago. Although Facebook is far smaller than Google, Page is worried the social network is gaining valuable insight into its users' lives that could lure away online advertisers.

CEO

  • Google: Co-founder Larry Page
  • Facebook: Co-founder Mark Zuckerberg

Annual Net Income

  • Google: $9.7 billion
  • Facebook: $668 million

Annual Revenue

  • Google: $38 billion
  • Facebook: $3.7 billion

Social Networking Users

  • Google: 100 million (Google+)
  • Facebook: 845 million

Advertising Revenue

  • Google: $36.5 billion
  • Facebook: $3.2 billion.

Employees

  • Google: 32,500
  • Facebook: 3,200

New SEO Strategy: Best for the Current Scenario


The responsibilities of SEO practitioners have changed to include far more of the digital ecosystem, yet for so many, much of the SEO process remains the same. Currently there are several segments of SEO strategy seen as optional that are actually absolutely imperative to the success of an SEO campaign, as well as to the synergy of other initiatives within the marketing mix. In other words, SEO must adopt and adapt in order to be taken seriously and command the type of influence required to drive change. As it stands, SEO looks to disrupt the symphony (or cacophony) that is a brand’s marketing mix. Let’s discuss a new process that allows SEO to improve the effectiveness of all digital marketing channels – not just inbound.
SEO = Kanye + Calculus
Disclaimer: Kanye West is awesome, but you understand how he is perfect to illustrate these points.

Problems with the Old Process

I’ve heard SEO called a lot of ugly things in the past few years. My favorite one lately was delivered to me by the wonderful Brittan Bright after someone passionately declared to her that SEO is the “Calculus of Marketing.” I love it simply because it fits. Just like Calculus, if you’re not looking at the aggregate value of what you’re working on you may do a lot of work for a result that doesn’t seem big in the grand scheme. Just like Calculus, SEO is quite specific and esoteric to those that haven’t studied it. Just like Calculus, you can be completely successful without it altogether. And finally SEO and Calculus both set a barrier of entry that excludes more than it includes.
With all that said, here is the typical SEO process as it has been defined over the years.
The Standard SEO process
Although we often treat it like one, SEO has never been an initiative that existed within a vacuum. It has always required that changes be made across a complete digital ecosystem in which there are numerous stakeholders. However, this existing process always asked for change without justification with regard to the purpose or goals of these touchpoints. For example, if my recommendation is to change a title tag, there has been no justification as to how that affects the CTR of a page shared on Facebook. Perhaps the social media team has discovered that the target audience clicks through less when a page title doesn't feature a brand name. That's a hypothetical situation, but let's go into a little more detail as to why SEO will not continue to work this way.

No Regard for Market Research

Just as the diagram above suggests, most SEOs jump right into keywords, analytics and competitive analysis of those keywords. Wrong move; search is about fulfilling needs. Before looking at a single keyword there needs to be a deep understanding of business objectives and the market. Standard kickoff questions often look like this:
  • What analytics package do you use?
  • Are there any other domains or sites that you own?
  • What SEO efforts have been done in the past?
  • List your top 3 competitors.
  • Do you have social media accounts?
  • What keywords are you looking to rank for?
Kanye Ain't Doin' No Market Research
The biggest problem with this is that we often take these inputs at face value. That is to say, very often the brands that the client believes they are competing with offline are not the sites they are competing with for keyword coverage in the SERPs. Also, the keywords a client may think they should rank for are not the keywords that are going to help them meet their actual goals.
To simplify it, many SEO teams send clients kickoff questions to get a sense of the keywords they should target and then hop right into the keyword tool. Pages are optimized. Keywords are allocated to pages. Links are built. Content is pushed into social. Performance is measured to identify subsequent opportunities. Obviously it oftentimes goes far more in-depth for many, but this is basically the widely accepted process.
One of my biggest issues, as a consumer of Search who understands SEO, is that if the results I click appear to be overly optimized, I become quite leery of the content. This is simply because, in my experience, many copywriters (SEO or otherwise) often don’t know what they are talking about. Recalling dusty memories from early in my own SEO career when I wrote copy, in most cases I was just a human article spinner. I definitely read a few wiki articles and the top results for a given keyword and just reworded what other people said. I shared all that to say: becoming an expert in the niche that you are optimizing for is an extremely underrated step in the SEO process. For this reason, if I were to hire an agency, I would prefer one with extensive prior experience or a specialty in my vertical. All my in-house SEOs – make some noise!

Little Regard for the Audience

Truthfully, the real differentiation between clients happens in a later set of questions. Unfortunately, the following questions don’t get asked enough in the standard SEO kick-off:
  • What is the purpose of your site?
  • What are you trying to get users to do once they arrive?
  • Who is your target audience?
These are typically questions that Conversion Rate Optimization teams focus on rather than SEO teams. For shame SEOs, for shame!
We all want traffic and we all want to rank #1 for juicy head terms, but these things are not goals. By themselves these are not KPIs that make clients successful. Simply put, if you rank highly for keywords but aren’t fulfilling the needs of people searching for them, you just put a ton of effort into exactly the wrong thing. It’s not about the keywords; it’s about the people searching for them.
Consider the offline example of Target using customer data to identify when shoppers have become pregnant, so it knows when to ramp up efforts to turn mothers-to-be into long-term big spenders at the retailer. You can do this far more effectively with Search if you’re mindful of your audience and their needs. This measurement of intent plus interests plus demographics plus network is the Holy Grail of Marketing. With that in mind, it becomes quite clear what Google’s ulterior motives are with Google+ and the consolidation of privacy policies.
Recently, I had a short conversation with AJ Kohn via Twitter about personas and how client research can prove useless. I agree somewhat because clients that have done audience research beforehand may have only looked at offline factors. To that point, it is important that we validate or disprove those insights with our own research rather than taking what the client says at face value. Our goal is to optimize, not paint by numbers.

SEO Disrupts Most Digital Strategies

As much as I hate to say it, the reality of SEO is that it disrupts much of digital planning even when it’s included from the outset.
Most other digital capabilities start from the target audience before they do anything. User Experience has user stories, personas and user flows. Strategy teams build personas and need states by examining demographics and psychographics in an effort to understand what does, and what will, influence and fulfill the target audience.
Kanye WILL Disrupt Your Campaign
Whichever of these teams develops the audience insights then feeds them to the other teams, so that efforts are glued together by the target consumer. Paid channels such as Facebook Ads, Display Advertising and Paid Search benefit from this significantly in their ability to target demographically. Media teams examine the available audience by vendor and allocate dollars based on where the delivery will be most effective.
Traditionally, Organic Search ignores this step entirely and declares “HEY! I’M HERE NOW WE’RE DOING THIS MY WAY!” This is partially why SEO gets shunned by brands when they are determining where to distribute their efforts within the marketing mix. SEO is certainly effective, but it has always been a maverick that didn’t want to play by the rules. There is little meritocracy here; if channels were chosen only by ROI, Display Advertising would have died 10 years ago. Evidently, they are not chosen this way, so for SEO to get buy-in it needs to be a team player.

Many Link Building Initiatives Exist in a Vacuum

Regardless of the hundreds of strategies, tactics and tools that are being born for link building daily, every successful link building campaign boils down to making news and/or making friends. As SEOs, we try to strong-arm how and where brands will do this. Making news and building relationships are functions of many different groups and initiatives within a business from top to bottom. How is it that we as SEOs believe our best initiatives can exist outside of the things the brand itself contributes to?
Other Vehicles Don't Matter to Kanye
Brands launch PR campaigns, social media efforts, events, and a variety of other social strategies to facilitate awareness of the news they create. How is link building any different? The fact of the matter is, it isn’t. Therefore it should be approached from, and included with, the same standpoint as the rest of a brand’s social strategies, for both scale and effectiveness. Simply put, link building is better when the entire muscle of a brand is leveraged.
The New SEO Process
To do effective SEO now, at the very least, you have to be a digital strategist, a social media marketer, a content strategist, a conversion rate optimizer, and a PR specialist. I’m skipping anything coding-related because, although I believe you should be able to build a website, you don’t necessarily have to. SEOs are already inherently each of these things; however, in most businesses these are all different capabilities that sit in different groups, offices, or cities. Who are we to upset an entire digital ecosystem and undermine so many people?
Well, I work with some awesome digital strategists, content strategists, creatives, etc., and while they tend to have an impressive grasp of web trends, audiences and their specific capabilities, they typically don’t know how to leverage cross-channel campaigns as specifically as SEOs or Inbound Marketers. It is now the role of Inbound Marketers to drive strategies that look far more like this (sorry guys, Kanye had to go – busy schedule):
The NEW SEO Process

I wish very much that I could be there for your “aha!” moment right now as no doubt you recognize many of these steps and can guess where other tasks will fall. Now let’s break it down completely – forgive me for anything that is obvious.
The New SEO Process Explained
  • Opportunity Discovery – Opportunity Discovery is a cyclical process of understanding brand opportunity with regard to business goals, target audience, industry specifications and past performance. It’s cyclical in that insights from one step often refine insights from another step in the process.
     
  • Business Objectives – Everything must be done within the context of the goals of the brand. This requires a deep understanding of where the brand has been and where it’s going. In many cases, businesses large and small may not understand how to translate their goals, and therefore it is the job of the Inbound Marketer to do so.
     
  • Market Research – The reason why SEO gets such a bad rap for polluting the web is that so many people simply do not build content that is worthwhile or has utility for the market. At this point, the entire team must take a deep dive into the industry and be able to have more than cursory conversations on the subject matter. For those who believe this to be a largely arduous task, I suggest specializing in verticals of interest.
     
  • Audience Research – The Facebook Ads tool is the AdWords Keyword Tool of personas. The DoubleClick Ad Planner is also good for understanding the demographics of existing sites. If available, Facebook Insights gives demographic data on the existing users visiting the site as well. The output of this is a set of user segments and stories – or personas.
     
  • Analytics Mining – As always, you should mine existing analytics data to understand who is visiting. Take deep dives into keyword performance, especially in concert with any internal Search data, to identify opportunities. All in all, this is no different than normal, unless the client has already been tracking their audience, at which point you can check whether the people they are trying to attract are actually showing up (see the first sketch after this list).
     
  • Social Listening – Using a core set of keywords, collect data on the conversation around those keywords. Keep track of patterns and identify user segments, demographics and need states of the people partaking in that social conversation. You’ll also want to keep track of how these users are using the keywords as this will allow you to eliminate ambiguity in keyword decisions and help to create messaging that resonates with the audience during the customer decision journey.
     
  • Quantitative Analysis – Services such as ComScore, Quantcast, Forrester Research, etc. track a multitude of data points on users in various verticals by demographic. Leveraging these reports gives you deeper insight into what types of users visit your competitors and exist within the market.
     
  • Keyword Research – Keyword research must be completed with regard to the audience: not just a determination of whether a keyword is viable from a search volume standpoint, but whether the keyword’s intent matches the business goals. Keywords should then be correlated with target personas and need states to help drive the build of content that is optimized for people first and search engines second.
     
  • Site Audit – Under the New SEO Process the Site Audit becomes decidedly more comprehensive, as it covers UX issues that would normally fall into a CRO Audit. Specifically, the audit addresses anything impeding conversions due to incongruence with the target audience, in addition to the standard technical SEO issues it already covers.
     
  • Asset Inventory – A standard practice SEOs are already doing wherein there is an understanding of what a brand controls and is willing to leverage to the benefit of the campaign.
     
  • Content Audit – What content inside or outside of the site can be leveraged?
     
  • Brand Relationships – What other companies, businesses, groups and events is the brand involved with?
     
  • Offline Assets – What tools, venues, prizes, etc. are at the brand’s disposal?
     
  • Competitive Analysis – As always, competitive analysis is a collection of high-level audits of competitors across the vertical. The difference is that since site audits are completed with regard to the audience, the competitive analysis must also include a determination of how other brands are capturing that audience.
     
  • Measurement Planning – A standard practice amongst analytics teams, the Measurement Plan is the statement of intent and the determination of Key Performance Indicators with regard to the business goals and audience. Avinash Kaushik covers measurement planning in his Digital Marketing and Measurement Model post. (Hat tip: @scotttdodge)
     
  • Content Strategy & Development – Content Strategy and Development are big-picture initiatives with a variety of stakeholders, so they often carry the most pushback. Creative teams just want to take big swings for big ideas and brand managers just want to advertise. To be effective, we have to show how our content ideas will connect with the brand’s target audience and make sure content is designed to our specification.
     
  • Content Ideation – With all the social data we have collected and correlated to keywords, we can now come up with content ideas that have portions of the target audience built in. Do so.
     
  • Wireframes – are an early deliverable in the design phase of a website wherein we can annotate considerations for SEO and CRO to ensure that Creative teams design with both in mind. Be very involved in this phase.
     
  • Content Build – Once all your points are baked in, it’s time to let the Creatives do what they do. If they come back with creative that is not congruent with what was agreed upon in an earlier phase, you now have data to back up your position with the client.
     
  • Technical Development – Technical SEO is the price of admission and cannot be ignored, so this is where we make sure that the structure of the house is sound.
     
  • Technical Build – At this point we’ve done all we can do; now we wait to see what the tech teams come back with. We’ve specified everything in wireframes and hopefully have had some say in the build of the CMS, but the tech team is going to do what they know, so unless they are open to our input during the actual build, we’ll just have to wait and see.
     
  • Implementation Audit – We’ll always have to double-check the work of a technical team, and this is the spreadsheet in which we do it. An implementation audit briefly recounts the issues outlined in the site audit and wireframes and says whether or not they were successfully implemented (see the second sketch after this list for an automated version of this check). This is the easiest way to show that the bottlenecks are not so much with the SEO team as with the tech team – as they oftentimes are.
     
  • Social Strategy – Typically, link building is an initiative that exists by itself; in the new SEO process, it must be completed as part of a broader scope. While it is clear that low-quality tactics like blog commenting continue to work, even those are far more effective when coupled with a social push across PR and social media. Leveraged strategically, you are launching a piece of content with a cross-channel marketing push, so the link velocity will appear more natural to search engines and the return on the social strategy is likely to be higher. While link building has always been about casting the widest net, social strategy is about casting the rightest net the widest. I just made up a word. Kanye approves.
     
  • Link Strategy – Link building for most businesses, particularly small businesses, is not an “if you build it, they will come” situation. Therefore it is not enough to just launch content and hope for the best; we must continue to supplement content launches with smaller complementary content launches, outreach and manual submission link building. This is where that strategy is defined, with its own measurement plan. Yes, I’m saying we should report both our prospects and the links we close. If you’re proud of your work, that shouldn’t be a problem. Link building is just like a PR campaign in that there is no guarantee of placements, and it should be explained as such.
     
  • PR – News is better than advertising, so a key part of social strategy is doing things that make news. Users spend a large part of their day reading, sharing and linking to news, so make sure your content is newsworthy and get it to the news outlets that your audience frequents.
     
  • Contests – Contests are an excellent way to get a one-to-many return on incentives. Rather than performing outreach and directly offering people a free sample or (gasp) money, request that they enter a contest wherein their entry is a blog post about the brand’s topic that contains a link. You can also add a layer of gameplay to the contest by determining the winner through the number of times their post is shared on social media. Unbounce ran a similar blogging contest in 2011, but link building wasn’t the goal of that campaign, so they hosted all the posts on their own site.
     
  • Events – Throwing a party, conference or trade show is another one-to-many return for link building. Simply host an event and invite influencers in the brand’s audience where the stipulation for attendance is that people must blog about it and link back to you.
     
  • Social Media – is a two-way street. Not only is it a place for discovery, but also a place for conversation. Use that conversation to find the influencers in the space with regard to the target audience and business goals. Build social media profiles to be authoritative and engaging to easily get your content shared, and to convert sharers into linkers. Regardless of where Google is headed, the social graph will never completely replace the link graph.
     
  • Social Implementation – is the phase when you let it all rip for the best synergy.
     
  • Measurement – is not just about whether or not we hit the goals. It’s the insight into why that makes measurement the most valuable step in online marketing. Measuring with regard to the audience helps with understanding the why even further than abstract metrics such as the bounce rate of a keyword. After all, the ability to measure tangibly is why digital marketing is far more effective than traditional marketing.
     
  • Reporting – is tailored specifically to the goals of the client. There’s no one-size-fits-all report. For example, a client’s business goal may be to get user segment A to watch a video; therefore, the primary metrics reported should be time on site and persona type rather than traffic and keyword. Rankings are only important with regard to how they’ve affected traffic. Everything should be focused on who (persona A) and why (because the message is unclear) rather than what (“blue widgets for sale ranked #5”).
     
  • Link Reporting – Under the umbrella of social strategy there is a lot to be said about what has been done to increase visibility. Aggregate rankings should be reported with regard to link building efforts to show the direct correlation between the two. Furthermore, link prospects and closes should also be reported with close rates to show clients what is being done on their behalf. This is obviously a subject of contention within the community, but if the links you build are so suspect that you are afraid to show them to the people you’re building them for – you need a different approach.
     
  • Optimization – I had an art teacher who always used to say, “No work of art is ever finished, we just give up.” The art and science of SEO is never complete, and there is always an opportunity to do more.
     
  • Conversion Rate Optimization – While CRO is far more baked into this strategy, it is still likely to take its own seat at the table. That is to say, while SEOs may also be CROs, they may be too close to the project to properly optimize. This is much the same way that the mixing engineer of a song is not supposed to also be the mastering engineer. At this point, a separate CRO team should run A/B tests, usability tests and so on, and report back.
     
  • Continued SEO – Do it all over again!
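As promised, here is the first sketch: a minimal, hypothetical illustration of the Analytics Mining step – checking whether the segments a client wants to attract are actually showing up and converting. It is written in Python and assumes a generic CSV export (analytics_export.csv) with segment, sessions and conversions columns; the file name and columns are made up for illustration and are not any particular analytics package's format.

# Minimal analytics-mining sketch. Illustrative assumptions: a CSV export
# named analytics_export.csv with "segment", "sessions" and "conversions"
# columns. Groups traffic by audience segment and flags segments that
# arrive but do not convert.
import pandas as pd

df = pd.read_csv("analytics_export.csv")

by_segment = (
    df.groupby("segment")[["sessions", "conversions"]]
      .sum()
      .assign(conversion_rate=lambda d: d["conversions"] / d["sessions"])
      .sort_values("sessions", ascending=False)
)

print(by_segment)  # who is actually visiting, and at what rate they convert
print(by_segment[by_segment["conversion_rate"] < 0.01])  # traffic that isn't converting

Segments with plenty of sessions but a near-zero conversion rate usually mean the pages rank for the wrong intent; segments that are missing entirely are the opportunities.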

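And the second sketch, for the Implementation Audit: a small, hypothetical Python script that re-checks whether recommended title tags actually made it into production. It assumes a spreadsheet exported as implementation_audit.csv with url and recommended_title columns; those names, like the script itself, are illustrative rather than a standard tool.

# Minimal implementation-audit sketch. Illustrative assumptions: a CSV named
# implementation_audit.csv with "url" and "recommended_title" columns.
# Fetches each page and reports whether the recommended title is live.
import csv

import requests
from bs4 import BeautifulSoup

def live_title(url):
    """Return the page's <title> text, or None if the fetch fails."""
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
    except requests.RequestException:
        return None
    soup = BeautifulSoup(response.text, "html.parser")
    return soup.title.get_text(strip=True) if soup.title else None

with open("implementation_audit.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        actual = live_title(row["url"])
        status = "implemented" if actual == row["recommended_title"] else "NOT implemented"
        print(f"{status}\t{row['url']}\t{actual!r}")

The same pattern extends to meta descriptions, canonical tags, or anything else specified in the wireframes.
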
5 Advantages to this New Process

A Better Web

Not to go all “land of milk and honey” on you guys, but the consumer is the biggest winner here. Naturally businesses benefit immensely as well, but the more we optimize with people in mind the more likely their needs will be fulfilled and consequently, the more likely we are to get those people to convert. Including people throughout the process and making the core goal to encourage them to do something ultimately makes the web a better place because everything we create will have a distinct purpose for the user and never solely for search engines. This is not to say we are circumventing the technical tenets of SEO as they are the price of admission.

Brand Buy-In

SEO has always been an industry that explains itself using empirical data. Starting from the audience, a place that businesses can understand, it is far easier to get buy-in for SEO initiatives. So when we make recommendations and explain the impact of our efforts on a target audience that has been determined as a focus of all initiatives, it’s easier to obtain brand buy-in than when we’re just talking about keywords and traffic.
Compare the following statement:
“We want to build links targeting websites with a PageRank of 3 or higher. We’ll reach out to a variety of prospects and target anchor text for keyword opportunities identified by our extensive keyword research in order to gain rankings for your brand.”
with:
“We’d like to launch a contest targeting Influential Moms with over 5000 followers on Twitter. To enter they’d write blog posts that link back to our properties in order to drive traffic for our target Listener Moms that are using Search to buy more healthy cereal.”
Both ideas would potentially accomplish the same goals; however, the former will require far more explanation for the client and ultimately more effort on the part of the SEO team. The latter, on the other hand, explains a link building campaign in terms of the brand’s target audience and business goals, then lays out a campaign wherein the brand commits cross-channel resources that the SEO team can leverage. Understanding the business objectives and the audience makes it easier to develop and deliver strategies that clients can easily get behind.

Scalability

Getting on the same page with the other capabilities allows SEO efforts to be scaled considerably for brands large and small. This is how we regularly achieve those otherwise rare instances of synergy between capabilities, where the PR team is facilitating link building, the Content Strategy and Creative teams are creating link bait, and SEO is both driving and supplementing those efforts. That is the perfect storm, where we spend far more time chiseling our perfect sculptures rather than polishing poop, and our efforts have far more impact with less work.

Cross-Channel Optimization

Learnings and wins in SEO can influence other channels. Imagine we discover through social listening, keyword research and/or measurement that a large number of the client’s target audience is looking for “red kanye west t-shirts,” but the client sells every color but red. We now have a tight business case for why that client should start manufacturing the t-shirt in red. Conversely, what if we find out that people love the shirt but bounce from the landing page because they hate the user experience of the site? There are any number of scenarios where, explained purely from the context of search, brands are far less likely to make a move. However, when you explain these insights through the context of personas and market research, you have a tighter case that can effect change across all channels and capabilities.

[not provided]…so what?

Google has positioned itself to take away all of our organic keyword referral data, and let’s be honest, it ultimately will take it all. Google+ and the consolidation of privacy policies to allow cross-product data access are Google’s way of positioning itself to attain the Holy Grail of Marketing. However, measuring through our audience essentially gives us a new way to determine the effectiveness of a campaign. We know the keywords we are targeting for a given page, and we can see the rankings and analytics of a given landing page by channel to determine whether or not Search is driving traffic. The true measure of success was never the rankings, nor the traffic, but how well a given page converted for our visitors. If we track conversions based on audience, that is the only metric that is truly worth optimizing against. The holistic performance of a channel is what brands are concerned with, not necessarily the performance of a given keyword.
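To make that concrete, here is a minimal, hypothetical sketch of measuring organic performance without keyword data: because each landing page maps to a known set of target keywords and personas, conversion rate by landing page and channel still tells you whether Search is doing its job. It assumes a generic analytics export (channel_landing_pages.csv) with landing_page, channel, sessions and conversions columns; the file name, column names, and the "organic" channel label are all illustrative.

# Minimal sketch of measuring Search without keyword referral data.
# Illustrative assumptions: a CSV export named channel_landing_pages.csv
# with "landing_page", "channel", "sessions" and "conversions" columns,
# and "organic" as the channel label.
import pandas as pd

df = pd.read_csv("channel_landing_pages.csv")

# Sessions and conversions per landing page, split by channel.
pivot = df.pivot_table(
    index="landing_page",
    columns="channel",
    values=["sessions", "conversions"],
    aggfunc="sum",
    fill_value=0,
)

# Conversion rate of the organic channel per landing page; since each page
# was built for known keywords and personas, performance is readable
# without keyword-level referral data.
sessions = pivot[("sessions", "organic")]
conversions = pivot[("conversions", "organic")]
organic_rate = conversions / sessions.where(sessions > 0)
print(organic_rate.sort_values(ascending=False))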
Opportunity Discovery Resources
The following is a list of posts, pages, tools and presentations to help you get a deeper understanding of personas and need states and how to apply them to various Inbound Marketing efforts.

Personas

Need States

Useful Social Tools

Quantitative Analysis Providers (PAID)

I'ma let you finish

During the #seochat I did on the SEO process, there were some questions about whether this applies to small businesses, citing that small businesses only care about the #1 spot and “just want rank.” Yes, understanding what makes an audience tick applies to all businesses. Again, the ability to quantify the interests and intent of your audience and track a brand’s ability to persuade is the advantage of digital marketing of any kind. As I said on Twitter, #1 is not a goal, but a means to an end. #1 gets users to the door; it doesn’t keep them in the house.

Finally, the new SEO process is a call for us to speak the language of other capabilities and deliver strategies that can plug and play with what brands truly understand. The new SEO process is not about chasing the algorithm; it’s about fulfilling the needs of the people the algorithm serves. It’s about creating and discovering the content that resonates with the people that a business is trying to reach and then also covering the technical bases required to get results. It’s about understanding the connections between keywords in the mind of your target audience in order to optimize for them effectively. And most importantly, it’s about having SEO become the driver of the marketing mix rather than the outcast. No doubt SEO will remain the esoteric “Calculus of Marketing,” but it’s time to prove that we can actually do the math, so to speak.
Read More