The Internet Marketing Driver: Glenn Gabe's goal is to help marketers build powerful and measurable web marketing strategies.

Wednesday, March 17, 2010

.htaccess for Windows Server: How To Use ISAPI Rewrite To Handle Canonicalization and Redirects For SEO


If you’ve read previous blog posts of mine, then you know how important I think having a clean and crawlable website structure is for SEO. When performing SEO audits, it’s usually not long before the important topic of canonicalization comes up. Canonicalization is the process of ensuring that you don’t provide the same content at more than one URL. It’s also one of the hardest words in SEO to pronounce. :) If you don’t address canonicalization, you can end up with identical content at multiple URLs, which can present duplicate content issues. And you don’t want duplicate content. For example, you don’t want your site to resolve at both non-www and www, at both http and https, with mixed case, or with folders resolving both with and without trailing slashes.

In addition to handling canonicalization, you also want to have a system in place for handling 301 redirects. A 301 redirect is a permanent redirect and will safely pass PageRank from one URL to another. This comes in handy in several situations: when you go through a website redesign and your URLs change, when you remove campaign landing pages, when you retire old pieces of content, and so on. If you don’t 301 redirect these pages, you could end up paying dearly in organic search. Imagine hundreds, thousands, or millions of URLs changing without 301 redirects in place. The impact could be catastrophic from an SEO standpoint.

Enter ISAPI Rewrite, .htaccess for Windows Server
So I’m sure you are wondering, what’s the best way to handle canonicalization and redirects for SEO? If you conduct some searches in Google, you’ll find many references to .htaccess and mod_rewrite. Using mod_rewrite is a great solution, but it’s only for Apache, which mainly runs on Linux servers. What about Windows hosting? Is there a solution for .NET-driven websites?

The good news is that there is a solid solution and it’s called ISAPI Rewrite. ISAPI Rewrite is an IIS filter that enables you to handle URL rewriting and redirects via regular expressions. It’s an outstanding tool to have in your SEO arsenal and I have used it for years now. There are two versions of ISAPI Rewrite (versions 2 and 3) and both enable you to handle most of what .htaccess can do. Actually, I think so much of ISAPI Rewrite that it’s the topic of my latest post on Search Engine Journal.
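Just to give you a quick taste of the syntax before you head over, here is a minimal sketch of two common rules written in the mod_rewrite-compatible syntax that ISAPI Rewrite version 3 supports (example.com and the page names are placeholders, so test any rules on a staging server before rolling them out):

# Canonicalize non-www to www with a 301 (permanent) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 301 redirect a retired page to its replacement
RewriteRule ^old-landing-page\.htm$ /new-landing-page.htm [R=301,L]

The first pair of lines handles the www canonicalization issue I mentioned above, and the last line shows how a single retired URL can pass its PageRank along to a new one.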

So, to learn more about ISAPI Rewrite, the two versions available, and how to use it (including examples), please hop over to Search Engine Journal to read my post.

ISAPI Rewrite: Addressing Canonicalization and Redirects on Windows Server

GG


Wednesday, March 10, 2010

SES NY 2010 Series: Getting Penalized and Banned in Search, An Interview With Michael Stebbins from Market Motive


It’s that time of year again. SES New York is only a few weeks away and I’ll be covering the conference again via blogging and Twitter. As part of my coverage, I’ll be writing blog posts previewing some of the sessions that I’m excited about attending. My first post is about a session titled “Post Mortem: Banned Site Forensics” and it will be co-presented by Michael Stebbins, the CEO of Market Motive, and Rand Fishkin, the CEO of SEOmoz, on Tuesday, March 23rd at 12:45. During the session, Michael and Rand will share some of the most egregious tactics that can get you in trouble, and also how to deal with getting penalized or banned. I had a chance to interview Michael last week about the session and you will find the interview below.

Getting Penalized or Banned in Search
If you work in SEO long enough, you’ll eventually hear the nightmare stories about sites getting penalized or banned by the search engines. I actually monitored a site a few months ago (a major brand) that was pulled from Google’s index for a five to six week period before being reincluded by the search giant. I can’t imagine how much money the company lost during that timeframe. It took me only ten minutes of digging to understand what they were doing wrong (and the tactic was blatantly against Google’s webmaster guidelines). That was a bad move and I’m sure it cost them dearly.

But not every company that gets penalized sets out to break the rules. I’ve seen many instances of companies implementing dark grey to black hat tactics simply based on a lack of experience. They might have read about how to quickly rank highly on some random blog and gone ahead and implemented those tactics. They weren’t necessarily trying to game the system, but ended up making changes that could get them in trouble. Sure, they might jump up the rankings for a few weeks or months, but they also might eventually get caught. That’s typically when the companies getting penalized or banned seek professional assistance.

Needless to say, this is an important topic in SEO, and it’s why I chose to write about the session here on my blog. Michael has a wealth of experience in helping companies that have been penalized or banned, and was able to take a few minutes last week to answer some of my questions.

So without further ado, here is my interview with Michael Stebbins:

Glenn: What are the top three or four things people will learn at your session?

Michael: We'll cover which sins are forgivable and which ones can result in indefinite exclusion from the search results. We’ll also cover how to know if your site is banned in the first place. We get calls for help where site owners are certain they've been banned and it turns out the site is still indexed, but is penalized. Being penalized and being banned are very different outcomes. I'll show attendees a way to know for sure. We’ll then cover the five most common reasons sites are taken out of the index and I'll show the do's and don'ts in the reinclusion process.

Glenn: In your opinion, what are the leading causes/tactics for sites getting banned (over the past 12 to 18 months)?

Michael: Nearly all the “unforgivable” sins center around trying to fool the search engines into believing your site is more popular than it really is. The bots are getting smarter, but they are still blind and deaf. Since they cannot emulate a human behind a browser, this leaves some opportunities for unscrupulous site owners to manipulate what the engines read versus what real people see and experience.

Glenn: As the engines have evolved, how have tactics for getting penalized evolved? i.e. How have older tactics like white on white text, keyword stuffing, cloaking, etc. evolved to more advanced forms of breaking the rules?

Michael: Google keeps this information close to the vest. But Bing recently posted what they are looking for to identify web spam. If you understand Google's motivation to show relevant sites, and combine this with some technical knowledge of how a bot finds and reads a web page, it's not too hard to figure out what the engines are looking for. Only certain false popularity techniques can be picked up with a bot at this time. The rest have to be reported and then checked via a manual review.

Glenn: Based on your experience, what are some of the top misconceptions about getting penalized by the engines?

Michael: It's funny, or actually it's not so funny, but nearly everyone who gets a site banned denies that they've done anything wrong. It's like a crime drama where the “victim” hides evidence out of embarrassment or denial. Eventually, we figure it out and are able to help. Another one that keeps coming up is denial of service after over-using Google resources. The denial of service relates to queries to Google's data -- not to inclusion in the index.

Glenn: Are there times where a smaller SEO violation can lead to a website completely getting pulled from the index?

Michael: Absolutely. We've found sites that trigger manual review for a forgivable sin, but once under review, an unforgivable sin is discovered and the site is beyond recovery at that point. Picture a driver getting pulled over for a tail light infraction only to get arrested for a bank robbery.

Glenn: Based on your experience helping sites that have been penalized or banned, how long does it take to bounce back from a penalty? (If a site owner goes through the process of fixing the issue and then filing a reinclusion request).

Michael: We've seen reinclusion in two weeks, but we've seen hundreds of sites that have little hope of ever being reincluded.

Glenn: Are there any case studies you are going to present during your session (along with statistics) about sites that were penalized?

Michael: I'll use some anonymized data to give examples of statistical data that can trigger a review. But for obvious reasons, we don't want to expose sites that were banned or are working on a reinclusion.

--------------------
Based on the importance of the subject matter, along with Michael and Rand’s experience, I believe this is a session that shouldn’t be missed. I think the information being presented can help clients, agencies, consultants, and in-house SEOs all better understand how to keep their sites in good standing. I’ll be attending the session on Tuesday and tweeting core points as they come up. Again, the session is scheduled for 12:45-1:45 on Tuesday, March 23rd.

So, be there or get banned by Google. Just kidding. :)

If you have any questions, post them below. Either Michael or I will respond.

GG


Wednesday, January 13, 2010

A CM-Mess for SEM - How Content Management Systems (CMS) Can Be a Thorn in Your SEM Side


It’s widely known in the search community that CMS packages can cause serious SEO problems. These problems can sometimes be caused by the actual CMS being used or by the implementation of that CMS. There’s definitely a distinction. And to be clear, not all CMS packages or implementations cause these problems. You just need to be careful when choosing and implementing one. When you start to look at the impact from content management systems, the list of potential SEO issues can get quite long. For example, you might run into canonical issues, duplicate content, lack of content optimization, issues with Flash SEO, unfriendly redirects, etc. The irony is that CMS packages are supposed to make your life easier (and some definitely do), but there are times they can cause serious headaches.

But I Mentioned SEM, and Not Just SEO
Even though most of the focus has been on content management systems impacting SEO, paid search can also be affected. I’ve run into several CMS-related problems that can end up inhibiting the success of your paid search campaigns. From developing custom landing pages to accurately tracking conversions to implementing multivariate testing, content management systems can sometimes present their own obstacles (or little gremlins, depending on the issue at hand). That brings me to the point of this post!

My latest blog post on Search Engine Journal (which went live today) addresses this topic and covers four categories of potential problems that content management systems can cause while running paid search. The post provides a description of each problem, recommendations for making changes, and a list of key takeaways. If you’re a search marketer that’s working with a CMS (or attempting to work with a CMS), then I recommend reading my post. :) And if you think I left out any problems, definitely feel free to post a comment below or on Search Engine Journal.

To learn more, check out my post now:
A CM-Mess for SEM – Does Your Content Management System Cause These Paid Search Problems?

GG


Tuesday, January 05, 2010

Exploring AdWords Geotargeting - 4 Points About Location Targeting in Google That Are Often Misunderstood


I receive a lot of questions from local businesses about how to best geotarget their paid search ads in Google. AdWords actually provides some robust ways to target your ads by country, region, state, or city, and there is also an option to target a custom location. For example, you could create a polygon on a map to choose a very specific area to target. But just because those options are available doesn’t mean that everyone using AdWords understands how location targeting actually works. I’m going to explain four points in this blog post that seem to confuse advertisers (plus one bonus topic). My goal is to arm you with the right information about geotargeting so you can understand the best ways to structure your campaigns and drive outstanding results.

Here are four points (plus a bonus topic at the end) about location targeting in AdWords that you might not realize are in effect while prospective customers are searching for your products or services.

Query Parsing
Some advertisers are confused when they geotarget a specific location and end up seeing visitors from outside that area. I hear questions about this often. Some advertisers believe that there must have been a glitch in AdWords that showed their ads to untargeted searchers. What’s actually happening is that Google uses query parsing to detect when a search is local in nature. So, if you are geotargeting New York City, but someone in Alabama searches for New York Hotels, Google might show them your ads targeted for New York. Again, that’s even if you are targeting people only in New York. You should keep this in mind if you plan to geotarget your campaigns, but also want to reach people outside that area for specific keywords.

Query Parsing in Action
Query parsing in AdWords.

IP Address
If Google can determine your location via IP address, then you might see ads based on that location. So if your IP address shows you are from Princeton, New Jersey, and you search for bakeries, then you might see ads for bakeries in the Princeton area. Notice that the query bakeries did not have a local qualifier (such as a city or town). Google has continually refined the way that it handles queries that it deems local in nature. You might have noticed a big change in March of 2009, when the 10 pack of local listings (now 7 pack) was triggered via non-geo keywords. Prior to that, queries with geographic qualifiers would trigger local listings (such as bakeries in Princeton, NJ). Now they can be triggered via broad terms (if Google believes it’s a local search). Keep this in mind when building your keyword lists for geotargeted campaigns.

Local Ads Based on IP
Local ads triggered via IP Address.

Google Country-Specific Search Engines
This point relates to the Google domain you are searching on (and Google has over 100 country-specific domains). When people are searching on Google, they will see ads based on the Google domain they are using, such as google.co.uk, google.ca, google.co.jp, etc. So, if you are located in Canada, but are using Google UK, ads will be UK-focused. If you are in Japan, but you are using Google.com (US), then your ads will be targeted for the US. This is important to understand if you will be targeting people in several countries. You would want to structure your campaigns so they are extremely targeted for the locations (and languages) you are focusing on.

Google Domain Driving Ads
Ads displayed based on Google domain.

Location Targeting on the Content Network
If you are running campaigns on the content network, then geotargeting still comes into play. This essentially means that your ads will show up on sites across the content network (or via specific placements) when visitors to those sites are within your targeted locations. So, if you are targeting Washington DC and your ads show up on about.com, then your ads should only show for visitors from the Washington DC area (or on pages that Google deems local in content). The latter point is similar to query parsing when keywords are involved. For example, if you are reading information about Princeton, NJ, but you are outside of the Princeton area, you might still see geotargeted ads for Princeton. Since your ads are contextually targeted on the content network, queries are not part of the targeting process (because there is no query to target). In other words, visitors aren’t searching to trigger your ads across the content network. Instead, Google is analyzing the page at hand and determining if your ad matches the content on that page. Note, there’s a difference between a query and a keyword. :)

Geotargeted Ads on the Content Network
Geotargeting on the Content Network.

Bonus: A Quick Note About Local Extensions (A Form of Ad Extensions)
Wouldn’t it be valuable to include your address in your text ad when it’s extremely relevant to the person searching? That’s a leading question, isn’t it? :) Location extensions enable you to do this and they are very easy to set up. If you are a business owner with a Google Local Business Center account, then you can attach your business address to your ads. Note that your listing must be validated in Google Local Business Center for your address to show up. When you use location extensions, your business address will show up below your traditional text ad, as seen in the screenshot below. If you don’t have a Local Business Center account, then you can manually enter up to nine addresses that can be used as location extensions. Check out the AdWords help center to learn more about location extensions.

Local Extensions in Action
Location extensions in AdWords.

Be Prepared to Target
I hope this post clarified some of the nuances of geotargeting in AdWords. As a paid search advertiser, it’s important to understand how Google uses location targeting so you can build your campaigns to maximize your results. From query parsing to Google domains to IP detection, there are several factors that can trigger your ads beyond the locations that you think you’re targeting. Now aim for the bullseye and target away. :)

GG


Tuesday, November 17, 2009

Invalid Clicks and Click Fraud in Local Search Marketing (SEM) - Giving a Whole New Meaning to the Term HyperLocal


I’m currently helping several businesses focused on local advertising with both SEO and SEM (Paid Search, PPC). Depending on the industry and market, Local PPC can be both extremely competitive and pricey. Of course, the upside is capturing those highly targeted clicks and turning them into paying customers, which could yield hundreds or thousands of dollars per conversion. When the difference between page one and page two could be significant amounts of revenue, the companies vying to gain those clicks can become hyper-competitive (and that’s an understatement). I’ve heard stories about some companies incorporating clicking through competitor ads as part of their morning routine… That’s not cool, but it’s very real in certain industries and markets.

Based on what I just explained above, the dark side of paid search ends up rearing its ugly head for some local businesses. In highly competitive industries, and in highly competitive markets, click fraud can run rampant. The thought process is simple (and unethical): eat up your competitor’s budget so you have more of an opportunity to catch highly targeted clicks. As mentioned above, those highly targeted clicks could yield thousands of dollars per day from new customers (depending on the industry).

I think a lot of people have heard about click fraud, but few have actually explored the problem and how it’s affecting their campaigns. For many local businesses attempting to land the ultra-targeted, “ready to buy” customer, click fraud can be a real click, I mean thorn, in their side. That's not good for anyone involved (including Google and the other search engines).

How Big of a Problem is Click Fraud?
So how much of a problem is click fraud for local businesses? It depends on the industry and market, but I’ve seen click fraud rates as high as 35%. Click Forensics publishes the Click Fraud Index and found that the industry average for Q3 2009 was 14.1%. That’s definitely high, but the abnormally high click fraud rates for local search give a whole new meaning to the term hyperlocal. :) Click fraud rates that high can make a serious dent in your budget, put a strain on ROI for your paid search campaigns, and can end up intensifying the overall click fraud problem (increasing the amount of click fraud as some business owners retaliate). So yes, click fraud is a big problem (and can be especially fierce in local advertising).

Defining Click Fraud:
For those of you not that familiar with click fraud, I’ll provide a quick rundown. There are actually several types of click fraud (and reasons for committing it), but I’ll focus on the act of attempting to deplete a competitor’s paid search budget by clicking on their paid search advertisements. And of course there is no intention of taking action on the competitor’s website once clicking through. In a nutshell, it’s Business A clicking on Business B’s ads in order to deplete Business B’s daily budget. Also note that it doesn’t have to be in the form of repetitive clicks from one location. Business A might hire other people or companies to help click on competitor ads (which can be accomplished via click farms, bots, etc.) You can read more about click fraud in Google’s Ad Traffic Quality Resource Center.

Google and Invalid Clicks
Many local businesses running paid search have no idea that Google actually provides statistics on the “invalid clicks” they catch. And by the way, “invalid” is a nice way of saying “click fraud.” :) Google provides some great reporting functionality as part of AdWords and I think too many companies (especially small businesses that are moving at light speed) never tap into the reporting to track campaign performance.

To access the reporting interface in AdWords, you can click the Reporting Tab, and then Reports. Then you can Create a New Report and choose to run a Campaign Performance Report. As part of setting up this report, you can click Add or Remove Columns to customize the report. Then you can click the checkboxes for Invalid Clicks and Invalid Click Rate to view the statistics at an account or campaign level. Depending on your line of business and where you are located, you might be surprised at how many invalid clicks were recorded for your campaigns…

Accessing AdWords Reporting Functionality:
Creating a new report in Google AdWords

Running An Invalid Clicks Report:
Running an invalid clicks report in Google AdWords


What is an Invalid Click?
Google’s system is continually analyzing clicks and looking for patterns that may be fraudulent. For example, clicks from the same IP address, duplicate clicks, clicks from “known sources of invalid activity”, etc. You can read more about how Google calculates invalid clicks in AdWords help. The system is essentially looking for any type of suspicious activity.

Local Showing a Higher Rate of Click Fraud:
Based on running invalid click reports for a wide range of clients, I typically see a much higher percentage of invalid clicks for companies focused on local search. How much higher? Well, comparing invalid click rates across industries, I’ve seen local-centric clients receive 4X to 5X the percentage of invalid clicks. That’s a lot of clicks, and more importantly, a lot of potential money at risk. Now you might be asking, “Glenn, if Google catches the invalid clicks, then the companies shouldn’t get charged, right?” True, but that’s only for what Google catches… Their system isn’t flawless (especially because well-crafted click fraud is nearly impossible to identify). That’s just an unfortunate reality. So, if you see a 20% invalid click rate, it just might be 30-35%.

The Impact on Budget
Let’s add a monetary value to the click fraud problem I mentioned above. In some industries, local businesses are paying $20-$30 per click (yes, you read that correctly). For argument’s sake, let’s say you receive 100 clicks per day at $20 per click. If Google picks up a 20% invalid click rate, and we estimate that the true rate is really 30% (just for this example), then 10% is still getting past Google’s filters. The 100 clicks coming through are “actual clicks” according to Google (since it won’t charge you for the invalid clicks it caught). Out of those 100 billable clicks, the 10% of invalid clicks that slip through equate to 10 clicks at $20 per click, or $200 per day. Over a month, that’s roughly $6,000 potentially wasted. For many small businesses, that may be too much to overcome. And that’s exactly what the people committing click fraud want to happen. They want to push competitors to the point of quitting AdWords (and paid search in general), which leaves the fraudsters in control of the paid listings. Needless to say, this isn’t good for the paid search industry, the local businesses getting hit by fraud, or Google (since Google makes a majority of its money from paid search).
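If you want to plug in your own numbers, here is a quick back-of-the-envelope sketch in Python using the same hypothetical figures from the example above:

clicks_per_day = 100          # clicks Google actually bills you for
cost_per_click = 20.00        # dollars per click
undetected_fraud_rate = 0.10  # fraud that slips past Google's filters

wasted_per_day = clicks_per_day * cost_per_click * undetected_fraud_rate
wasted_per_month = wasted_per_day * 30

print("Wasted per day: $%.2f" % wasted_per_day)      # $200.00
print("Wasted per month: $%.2f" % wasted_per_month)  # $6000.00

Swap in your own click volume, average CPC, and an estimate of the undetected fraud rate to see what click fraud might be costing you.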

What Can Local Businesses Do About It?
Although click fraud is a big problem, and one that’s hard to overcome, there are some things you can do to stay on top of the problem. I’ve provided a list of recommendations below to help you stay informed, track your clicks, and potentially fight click fraud. The more you understand what’s going on, the more you can develop a strategy for documenting and combating the problem.

Here’s what you can do:

1. Run invalid click reports on a regular basis. This will help you understand how many invalid clicks are occurring, if they spike during certain times, and which campaigns they are impacting. You can also speak with your Google rep (or any rep at AdWords) about the problem, based on the data you collect.

2. Break up your campaigns logically. You can run invalid click reports on an account or campaign level (but not ad group). If you lump all of your ad groups into one campaign, you won’t get as clear of a picture of the click fraud problem impacting your business.

3. Analyze your log files to determine problematic IPs. Unfortunately, Google isn’t going to provide details about the invalid clicks it finds. It will just show you a total number and not reveal who is committing the click fraud. I think that’s unfortunate, but it’s just the way it is right now. However, you can get in touch with your hosting provider (or your IT department) to analyze your server logs (see the sketch after this list). If your competitor is clicking from a specific IP (like their office down the block from you), you might be able to pick it up. Then work with Google and your lawyer on next steps.

4. There are third party solutions that can help you track and identify click fraud. If you believe that your business is the victim of severe click fraud, you might want to go down this path. For example, Click Forensics (which also publishes The Click Fraud Index mentioned earlier) provides services for ad networks, publishers, agencies, and advertisers. There are also several other solutions for tracking fraudulent clicks that are relatively easy to set up. Do some research and demo the various solutions. They could end up saving you a lot of money.

5. Get familiar with Google’s Ad Traffic Quality Resource Center. There’s some good information about click fraud in the center, including an overview of the problem, key definitions, a help center, ways to contact Google’s quality team, etc.

6. Don’t participate in click fraud. Although it should be obvious, contributing to the overall click fraud problem isn’t going to help anything. You should focus your time and attention on running ethical and ROI-driven paid search campaigns and then deal with click fraud legally. Work with Google, your IT staff, your hosting provider, third party solutions for tracking click fraud, and your lawyer in determining the best path to take.
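To make point three above a bit more concrete, here is a rough Python sketch of the kind of log analysis I’m describing. It assumes AdWords auto-tagging is enabled (so paid clicks carry a gclid parameter in the landing page URL) and a common/combined log format; the file path, field positions, and threshold are placeholders you would adjust for your own server:

from collections import Counter

def suspicious_ips(log_path, threshold=20):
    """Count paid-search clicks (requests containing gclid=) per IP address."""
    clicks_by_ip = Counter()
    with open(log_path) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 7:
                continue
            ip, request_path = fields[0], fields[6]  # positions in common/combined log format
            if "gclid=" in request_path:             # the request came from a paid ad click
                clicks_by_ip[ip] += 1
    # flag IPs clicking paid ads far more often than a normal visitor would
    return [(ip, count) for ip, count in clicks_by_ip.items() if count >= threshold]

for ip, count in suspicious_ips("access.log"):
    print(ip, count)

An IP address that clicks your ads dozens of times per day is worth documenting and discussing with your Google rep (and possibly your lawyer).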

Not All Clicks Are Created Equal
Is click fraud a problem for local businesses? You bet. But you don’t have to sit there in the dark as your competitors click your ads. You should educate yourself about click fraud, stay vigilant, remain white hat (ethical), and analyze the situation to the best of your ability.

As mentioned earlier, Click Forensics says the industry click fraud rate was 14.1% in Q3 of 2009. As a business owner focused on local advertising, you need to decide if you’re ok with that number... Is click fraud just part of doing business in Local PPC, or should you fight to save your budget (and the potential customers that would come from it)? Like I said earlier, local click fraud gives a whole new meaning to the term Hyperlocal.

GG


Monday, November 09, 2009

FaceYahoogle – The Impact of Facebook, Yahoo, and Google on Website Traffic


The Power of Google, Yahoo, and Facebook on Site Traffic
It’s hard to get through a conversation about online marketing right now without bringing up Google, Facebook, and Yahoo (among other popular companies). However, if you’re not heavily involved in online marketing, and you’re not close to the actual referring traffic numbers from Google, Yahoo, and Facebook, then their influence can easily become nebulous. It’s easy to say, “Google is a powerhouse” or “Facebook has 325 million members”, and “You need to be there”, but how powerful are they really?

From a traffic perspective, the three companies are so powerful that I’ve given them their own combined name: FaceYahoogle. The power of FaceYahoogle ends up becoming very real for my clients after I complete a competitive analysis (which includes identifying major sources of traffic for their company, as well as their competitors). The numbers I present across websites typically show extremely high referral data from FaceYahoogle, and by viewing the actual traffic numbers, you start to get a feel for how much impact the three entities have traffic-wise and potentially revenue-wise.

Digging Deeper into FaceYahoogle
If you’ve read previous posts of mine, then you already know that I’m a big believer in using data versus opinion to make decisions. The power of analytics in online marketing enables you to see granular performance data across a number of key metrics. And the more websites I analyze, the more I see a significant trend across industry categories. I see FaceYahoogle sending large amounts of traffic to a wide range of sites. The abnormally high percentage of traffic coming from Google, Yahoo, and Facebook is not only amazing to see, it’s actually scary. With thousands and thousands of potential referring sites on the web, to see FaceYahoogle send that high of a percentage of traffic is alarming. I think you begin to understand how Google built up a $22 billion war chest! :)

I think many people would expect Google to be high on the referring sites list, based on having ~70% market share in search and also having Gmail, Google Maps, Google Docs, etc. However, I’m not sure many know how much actual traffic is coming from Googleland. Also, we hear that Facebook has over 300 million members, which is powerful, but are those members visiting your site via the social network? I’ll answer that question below via screenshots. And then you have Yahoo, with turmoil somewhat cloaking the power of its sites. How much traffic actually comes from Yahoo Search, Yahoo Mail, Yahoo News, Finance, Answers, etc.?

So that’s my quick introduction to FaceYahoogle. Now let’s take a look at some numbers! I have provided Compete data (September 09) for a number of popular websites across a range of categories so you can view their referring sources. Note, I know Compete isn’t perfect, but it does provide a wealth of information to analyze for each website (especially for sites that receive large amounts of traffic).

Referring Sites for NYTimes.com
31% from FaceYahoogle (and 17% from Google alone…)

Referring Sources for The New York Times

Referring Sites for LinkedIn
36% from FaceYahoogle, and over 8% from Facebook.

Referring Sources for LinkedIn

Referring Sites for Weather.com
24% from FaceYahoogle

Referring Sources for Weather.com

Referring Sites for JCrew
31% from FaceYahoogle

Referring Sources for JCrew

Referring Sites for The Huffington Post
33% from FaceYahoogle (and almost 8% from Facebook)

Referring Sources for The Huffington Post

Referring Sites for Yelp
A whopping 55% from FaceYahoogle (and 43% of that from Google!)

Referring Sources for Yelp

Referring Sites for ESPN
25% from FaceYahoogle (and nearly 10% from Facebook)

Referring Sources for ESPN

Referring Sites for Amazon.com
25% from FaceYahoogle (cha-ching…)

Referring Sources for Amazon.com

Referring Sites for Apple.com
28% from FaceYahoogle

Referring Sources for Apple.com

Let’s throw in a military site to see how the three-headed monster works here:
Referring Sites for AirForce.com
Over 40% of referring traffic from FaceYahoogle

Referring Sources for The US Airforce

The screenshots above make it a little more tangible, right? FaceYahoogle is accounting for 40%+ of referring traffic for some websites. If you analyze website traffic often, then you know how insane those numbers are… But that’s not the whole story. The downstream data is important too. It ends up that a large percentage of traffic from these websites is going back to FaceYahoogle. Let’s take a look at just a few from above.

Downstream Data for Apple.com
26% of visitors leave Apple.com and go back to FaceYahoogle

Downstream Traffic from Apple.com

Downstream Data for AirForce.com
31% of visitors leave AirForce.com and go back to FaceYahoogle

Downstream Traffic from AirForce.com

I saw the same trend across the other sites.

So, FaceYahoogle is driving enormous amounts of traffic, but it’s also the top recipient of traffic from many sites. In particular, Facebook provides some unique opportunities with regard to downstream traffic. Give your visitors something to take back and you can possibly end up with even more traffic (word-of-mouth or possibly viral). And with some Google and Yahoo traffic going back to Gmail, Yahoo Mail, Yahoo Answers, etc., you also have opportunities for spreading the word about your products, company, brand, etc. Let’s quickly take a closer look at each part of FaceYahoogle below.

Google
As you can see, Google is an absolute powerhouse, accounting for 43% of Yelp's overall referring traffic. That’s outrageous! And it’s not just any traffic, right? Many of the visitors from Google just searched for specific products or services that each site provides (AKA, high quality visitors). Imagine the revenue impact of Google traffic for those sites. In case you are wondering, Google traffic numbers include Search, Maps, Mail, Docs, Video, etc.

Seeing the high percentages from Google across sites, you can start to understand why SEO and SEM have been incredibly hot in online marketing… Some companies survive based on Google traffic alone (via paid and organic search traffic). A slip in rankings can be catastrophic for some websites, with the potential of impacting millions of dollars of revenue. Think about it. If you have 40% of your traffic coming from Google and slip to page two, three, or beyond, you will lose many targeted visitors, and the money they would have spent on your site. So is Google powerful? You bet it is. The numbers combined with my experience tell me so. :)

Facebook
Facebook has grown by leaps and bounds over the past few years and is estimated to have 325 million members now. Clearly people are signing up in droves, using the platform at a staggering pace (104 billion pageviews based on Compete, September 09), and oh yeah, they are visiting websites from Facebook. As you can see in the screenshots above, Facebook ranks in the top five referring sites for many of the properties I checked. Actually, it was typically in the top three. And in case you’re wondering, Twitter is moving up the charts too. Depending on the focus of the site in question, I see Twitter sending large amounts of traffic (and that doesn't count the desktop clients that many Twitter members use). On that note, to read an example of how Twitter can impact exposure, traffic, and subsequent SEO power, check out my post about the Twitter effect on SEO. It’s a great example of how Search works with Social.

So, if your company is ignoring social media, then go back through the screenshots above and take note of the percentage of referring traffic from Facebook again. In meetings, I find myself saying more and more that if you ignore social media (and especially Facebook and Twitter), you do so at your own risk. Again, the numbers are pretty convincing.

Yahoo
Although Yahoo has taken a back seat recently, the numbers are still strong from a referring source perspective. Between Yahoo Search, Yahoo Mail, Yahoo Answers, Yahoo News, Finance, etc., there are still millions of people visiting each property per month. And yes, those sites end up as top referring sources (impacting traffic, sales, sign-ups, Twitter followers, Facebook fans, etc.). Yahoo consistently showed up in the top five referring sites, and often number one or two. Don’t count out Yahoo just yet. If you do, you’d be dismissing a huge traffic source (when you take all of its properties into account).

The Future of FaceYahoogle
I’m sure you are wondering which sites will be the major sources of traffic in 2010 and beyond. Will Twitter beat out Facebook? Will Bing surpass Google? Will Yahoo be non-existent? The beauty of the web (and technology) is that we never know. But the data does tell us something: don’t ignore Search and Social, and how they can work together.

People are searching and people are talking. And the people that are talking can impact how people that are searching find your website. And people searching can lead to sharing on social networks, based on what they find. Look at the numbers again. Don’t discount Facebook because you think people are tagging photos or playing games all day. You also shouldn’t disregard Google’s dominance. It is too powerful to ignore. And Yahoo isn’t dead yet. There are millions of people visiting Yahoo Sites on a regular basis.

Last, to emphasize that we never really know what will take off, I have provided Twitter’s traffic trend below (Compete data over the past 24 months). I bet many people don’t even know that it was around in 2006 and 2007… and that it crept along until 2008, when it gained serious traction. So, is the next Twitter out there right now, slowly growing and about to gain traction? Good question. :)

Twitter Trending 24 Months

GG


Tuesday, October 13, 2009

SEO and AJAX: Taking a Closer Look at Google’s Proposal to Crawl AJAX


Last week at SMX, Google announced a proposal to crawl AJAX. Although it was great to hear the official announcement, you had to know it was coming. Too many web applications are using AJAX for Google to ignore it! After the news was released, I received a lot of questions about what the proposal actually means, how it works, and what the impact could be. There seemed to be a lot of confusion, even among people in the Search industry. And I can understand why. If you don’t have a technical background, then Google’s blog post detailing the proposal to crawl AJAX can be a bit confusing. The mention of URL fragments, stateful pages, and headless browsers can throw a lot of people off, to say the least. And if you’ve never heard of a headless browser, fear not! Since it’s close to Halloween and I grew up near Sleepy Hollow, I’ll spend some time in this post talking about what a headless browser is.

So based on my observations over the past week or so, I decided to write this post to take a closer look at what Google is proposing. My hope is to clear up some of the confusion so you can be prepared to have your AJAX crawled. And to reference AJAX’s original slogan, let’s find out if this proposal is truly Stronger Than Dirt. :)

Some Background Information About SEO and AJAX:
So why all the fuss about AJAX and SEO? AJAX stands for Asynchronous JavaScript and XML, and when used properly, it can create extremely engaging web applications. In a nutshell, a webpage using AJAX can load additional data from the server on demand without the page needing to refresh. For example, if you were viewing product information for a line of new computers, you could dynamically load the information for each computer when someone wants to learn more. That might sound unimpressive, but instead of triggering a new page and having to wait as the page loads all of the necessary images, files, etc., the page uses AJAX to dynamically (and quickly) supply the information. As a user, you could quickly see everything you need without an additional page refresh. Ten or more pages of content can now be viewed on one… This is great for functionality, but not so great for SEO. More on that below.

Needless to say, this type of functionality has become very popular with developers wanting to streamline the user experience for visitors. Unfortunately, the search engines haven’t been so nice to AJAX-based sites. Until this proposal, most AJAX-based content was not crawlable. The original content that loaded on the page was crawlable, but you had to use a technique like HIJAX to make sure the bots could find all of your dynamically loaded content. Or, you had to create alternative pages that didn’t use AJAX (which added a lot of rework). Either way, it took careful planning and extra work by your team. On that note, I’ve yet to be part of a project where AJAX developers jump up and down with joy about having to do this extra work. Based on what I explained above, there just had to be a better solution, and Google’s proposal is an important step forward.

What is Google’s Proposal to Crawl AJAX?
When hearing about the proposal, I think experienced SEOs and developers knew there would be challenges ahead. It probably wasn’t going to be a simple solution. And for the most part, we were right. The proposal is definitely a step forward, but webmasters need to cooperate (and share the burden of making sure their AJAX can be crawled). In a nutshell, Google wants webmasters to process AJAX content on the server and provide the search engines with a snapshot of what the page would look like with the AJAX content loaded. Then Google can crawl and index that snapshot and provide it in the search results as a stateful URL (a URL that visitors can access directly to see the page with the AJAX-loaded content).

If the last line threw you off, don’t worry. We are going to take a closer look at the process that’s being proposed below.

Getting Your AJAX Crawled: Taking a closer look at the steps involved:

1. Adding a token to your URL:
Let’s say you are using AJAX on your site to provide additional information about a new line of products. A URL might look like:

example.com/productid.aspx#productname

Google is proposing that you use a token (in this case an exclamation point !) to make sure Google knows that it’s an AJAX page that should be crawled. So, your new URL would look like:

example.com/productid.aspx#!productname

When Google comes across this URL using the token, it would recognize that it’s an AJAX page and take further action.

2. The Headless Browser (Scary name, but important functionality.)
Now that Google recognizes you are using AJAX, we need to make sure it can access the AJAX page (and the dynamically loaded content). That’s where the headless browser comes in. Now if you just said, “What the heck is a headless browser?”, you’re not alone. That’s probably the top question I’ve received since Google announced its proposal. A headless browser is a GUI-less browser (a browser with no graphical user interface) that will run on your server. The headless browser will process the request for the dynamic version of the webpage in question. In the blog post announcing this proposal, Google referenced a headless browser called HtmlUnit, and you can read more about it on the HtmlUnit website.

Why would Google require this? Well, Google knows that it would take enormous amounts of power and resources to execute and crawl all of the JavaScript being used today on the web. So, if webmasters help out and process the AJAX for Google, then it will cut down on the amount of resources needed and provide a quick way to make sure the page gets properly crawled.

To continue our example from above, let’s say you already provided a token in your URL so Google will recognize that it’s an AJAX page. Google would then request the AJAX page from the headless browser on your server by escaping the state. Basically, URL fragments (the anchor with additional information after the # sign at the end of a URL) are not sent with requests to the server. Therefore, Google needs to change that URL to request the AJAX page from the headless browser (see below).

Google would end up requesting the page like this:
example.com/productid.aspx?_escaped_fragment_=productname
Note: It would make this request only after it finds a URL using the token explained above (the exclamation point !).

This would tell the server to use the headless browser to process the page and return HTML code to Google (or any search engine that chooses to participate). That’s why the token is important. If you don’t use the token, the page will be processed normally (AJAX-style). If that’s the case, then the headless browser will not be triggered and Google will not request additional information from the server.
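To make the server-side decision a little more concrete, here is a minimal, purely illustrative sketch in Python (the site in my example would actually be ASP.NET, and render_snapshot stands in for whatever headless browser setup, such as HtmlUnit, generates the HTML snapshot):

from urllib.parse import urlparse, parse_qs

def route(url, render_snapshot, serve_ajax_page):
    # Decide whether this is Google's "ugly" crawl URL or a normal visitor request.
    parsed = urlparse(url)
    fragment = parse_qs(parsed.query).get("_escaped_fragment_", [None])[0]
    if fragment is not None:
        # e.g. /productid.aspx?_escaped_fragment_=productname maps back to the
        # stateful URL /productid.aspx#!productname, so return the pre-rendered HTML
        return render_snapshot(parsed.path, fragment)
    # Normal visitors get the regular page and their browser loads content via AJAX
    return serve_ajax_page(parsed.path)

The key point is simply that requests containing _escaped_fragment_ get the pre-rendered snapshot, while everyone else gets the normal AJAX experience.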

3. Stateful AJAX Pages Displayed in the Search Results
Now that you have provided Google a way to crawl your AJAX content (using the process above), Google can provide that URL in the search results. The page that Google displays in the SERPs will enable visitors to see the same content as if they were traversing your AJAX content on your site. That is, they will access the AJAX version of the page versus the default content (which is what would normally be crawled). And since there is now a stateful URL that contains the AJAX content, Google can check to ensure that the indexable content matches what is returned to users.

Using our example from above, here is what the process would look like:
Your original URL:
example.com/productid.aspx#productname

You would change the URL to include a token:
example.com/productid.aspx#!productname

Google would recognize this as an AJAX page and request the following:
example.com/productid.aspx?_escaped_fragment_=productname

The headless browser (on your server) would process this request and return a snapshot of the AJAX page. The engines would then provide the content at the stateful URL in the search results:
example.com/productid.aspx#!productname

Barriers to Acceptance
This all sounds great, right? It is, but there are some potential obstacles. I’m glad Google has offered this proposal, but I’m worried about how widely it’s going to be adopted. Putting some of the workload on webmasters presents some serious challenges. When you ask webmasters to add something like a headless browser to their setup, you never know how many will actually agree to participate.

As an example, I’ve helped a lot of clients with Flash SEO, which typically involves using SWFObject 2.x to provide alternative and crawlable content for your Flash movies. This is a relatively straightforward process and doesn’t require any server-based changes. It’s all client side. However, it does require some additional work from developers and designers. Even though it’s relatively painless to implement, I still see a lot of unoptimized Flash content out there… And again, it doesn’t require setting up a headless browser on the server! There are some web architects I’ve worked with over the years who would have my head for requesting to add anything to their setup, no pun intended. :) To be honest, the fact that I even had to write this post is a bad sign… So again, I’m sure there are challenges ahead.

But, there is an upside for those webmasters that take the necessary steps to make sure their AJAX is crawlable. It’s called a competitive advantage! Take the time to provide Google what it wants, and you just might reap the benefits. That leads to my final point about what you should do now.

Wrapping Up: So What Should You Do?
Prepare. I would spend some time getting ready to test this out. Speak with your technical team, bring this up during meetings, and start thinking about ways to test it out without spending enormous amounts of time and energy. As an example, one of my clients agreed to wear a name tag that says, “Is Your AJAX Crawlable?” to gain attention as he walks the halls of his company. It sounds funny, but he said it has sparked a few conversations about the topic. My recommendation is to not blindside people at your company when you need this done. Lay the groundwork now, and it will be easier to implement when you need to.

Regarding actual implementation, I’m not sure when this will start happening. However, if you use AJAX on your website (or plan to), then this is an important advancement for you to consider. If nothing else, you now have a great idea for a Halloween costume, The Headless Browser. {And don’t blame me if nobody understands what you are supposed to be… Just make sure there are plenty of SEO’s at the Halloween party.} :)

GG

Related Posts:
The Critical Last Mile for SEO: Your Copywriters, Designers and Developers
Using SWFObject 2.0 to Embed Flash While Providing SEO Friendly Alternative Content
6 Questions You Should Ask During a Website Redesign That Can Save Your Search Engine Rankings
SEO, Forms, and Hidden Content - The Danger of Coding Yourself Into Search Obscurity


Tuesday, September 08, 2009

SEO, Forms, and Hidden Content - The Danger of Coding Yourself Into Search Obscurity


When I perform a competitive analysis for a client, I often uncover important pieces of information about the range of websites they are competing with online. Sometimes that information is about traffic, campaigns, keywords, content, inbound links, etc. There are also times I uncover specific practices that are either beneficial or problematic for the competitor. For example, they might be doing something functionality-wise that could be inhibiting the overall performance of the site. If I do uncover something like that, I usually dig much deeper to learn more about that problem to ensure my clients don’t make the same mistakes. So, I was analyzing a website last week and I uncovered an interesting situation. On the surface, the functionality the site was providing was robust and was a definite advantage for the company, but that same functionality was a big problem SEO-wise. Needless to say, I decided to dig deeper to learn more.

Slick Web Application Yielding Hidden Content
As part of the competitive analysis I was completing, I came across a powerful web application for finding a variety of services based on a number of criteria. The application heavily used forms to receive information from users. The application included pretty elaborate pathing and prompted me to clarify answers in order to provide the best recommendations possible. After gathering enough information, I was provided with dozens of targeted service listings with links to more information (to more webpages on the site). So you might be thinking, “That sounds like a good thing Glenn, what’s the problem?” The problem is that the web application, including the robust form functionality, essentially hid all of the content from the search engines. In this case, we are talking about more than 2000 pages of high quality, high demand content. I say “high demand” because I completed extensive keyword research for this category and know what people are searching for. Unfortunately for this company, the application yielded results that are simply not crawlable, which means the site has no chance to rank for competitive keywords related to the hidden pages. And by all means, the site should rank for those competitive keywords. For those of you asking, “But isn’t Google crawling forms?”, I’ll explain more about that below. For this application, none of the resulting content was indexed.

Losing Visitors From Natural Search and Missing Opportunities For Gaining Inbound Links
Let’s take a closer look at the problem from an SEO standpoint. Forms often provide a robust way to receive user input and then provide tailored information based on the data collected. However, forms can also hide that content from the search engine bots. Although Google has made some strides in executing forms to find more links and content, it’s still not a perfect situation. Google isn’t guaranteeing that your forms will be crawled, it limits what it will crawl to GET forms (versus POST), and some of the form input is generated from common keywords on the page (for text boxes). That’s not exactly a perfect formula.

Using forms, you might provide an incredible user experience, but you might also be limiting the exposure and subsequent traffic levels to your web application from natural search. I come across this often when conducting both SEO technical audits and competitive analyses for clients. In this case, over 2000 pages of content remain unindexed. And if the content is not indexed, then there is no way for the engines to rank it highly (or at all).

The Opportunity Cost
Based on the keyword research I performed, a traffic analysis of competing websites, and then comparing that data to the 2000 pages or so of hidden content, I estimate that the site in question is missing out on approximately 10-15K highly targeted visitors per day. That additional traffic could very easily yield 300-400 conversions per day, if not higher, based on the type of content the site provides.

In addition to losing targeted traffic, the site is missing a huge opportunity to gain powerful inbound links, which can boost its search power. The content provided (yet hidden) is so strong and in demand that I can’t help but think the 2000 pages would gain many valuable inbound links. This would obviously strengthen both the domain’s SEO power and the power of the specific pages (since the more powerful and relevant inbound links your site receives, the more powerful it is going to become SEO-wise).

Some Usability Also Hindered
Let’s say you found this form and took the time to answer all the questions. After you completed the final steps of the form, you are provided with a list of quality results based on your input. You find the best result, click through to more information, and then you want to bookmark it so you can return later. But unfortunately you can’t… This is due to the web application, which doesn’t provide permanent URLs for each result. Yes, the form is slick and its algorithm is great, but you don’t have a static page that you can bookmark, email to someone else, etc. How annoying is that? So if you want to return to the listing in question, you are forced to go back through the form again! It’s another example of how SEO and usability are sometimes closely related.

SEO and Forms, A Developer's Perspective
I started my career as a developer, so I fully understand why you would want to create a dynamic and powerful form-based application. This specific form was developed using ASP.NET, which utilizes postback (where the form actually posts information back to the same page). The URL doesn’t change, and the information submitted is posted back to the same page where the programmer can access all of the variables. Coding-wise, this is great. SEO-wise, this produces one URL that handles thousands of different pieces of content. Although you might have read that Google started crawling HTML forms in 2008, it’s a work in progress and you can’t guarantee that all of your forms will be crawled (to say the least…). On that note, you should really perform a thorough analysis of your own forms to see what Google is crawling and indexing. You might be surprised by what you find (good or bad). So, the application I analyzed (including the forms) isn’t being crawled, the URL never changes, the page optimization never changes, and the content behind the form is never found. This is not good.

If I were advising the company using this application, I would absolutely recommend providing another way to get the bots to all of this high quality content. They should definitely keep their robust web application, but they should also provide an alternative path for the bots. Then they should optimize all of those resulting webpages so they can rank for targeted queries. I would also disallow the application in robots.txt, blocking the bots from crawling any URL’s that would be generated via the form (just in case). With the right programmer, this wouldn’t take very long and could produce serious results from natural search…
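To give you a rough idea, the robots.txt entry might look something like the sketch below, assuming (purely hypothetically) that the form-based application lives under a /finder/ directory while the alternative, crawlable pages live elsewhere on the site:

# Hypothetical example: block only the form-driven application
User-agent: *
Disallow: /finder/

The bots would then stay out of the form-generated URL’s while remaining free to crawl the optimized, static pages you provide as the alternative path.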

The Most Basic SEO Requirement: Your Content Needs to be Found In Order to Rank
It sounds obvious, but I run into this problem often as I perform SEO technical audits. Your killer content will not rank just because it’s killer content. The content needs to be crawled and indexed in order to rank highly for target keywords. In this case, the site should definitely keep providing its outstanding functionality, but its owners should seriously think about the search implications (and provide an easy way for the bots to find optimized content.)

The bad news for my client's competitor is that I believe they aren’t aware of the severity of the problem and how badly it’s impacting their natural search traffic. However, the good news for my client is that they know about the problem now, and won’t make the same mistake as their competitor. That’s the power of a competitive analysis. :)

GG

Related Posts:
6 Questions You Should Ask During a Website Redesign To Save Your Search Engine Rankings
The Critical Last Mile for SEO, Your Copywriters, Designers, and Developers

Labels: , , ,

Wednesday, August 12, 2009

Your Google Local Business Center Dashboard, Analyzing and Refining Your Google Maps Listing Based on Analytics


Google Local Business Center Dashboard
More and more small businesses are realizing the importance of advertising online, including how to maximize their presence in Search. As local businesses get more involved in online marketing, they begin to understand how prospective customers research products and services. Needless to say, many are searching for information online. And, if you offer a product or service they are looking for, it’s obviously important for you to show up for targeted searches. If you don’t rank highly for target keywords, other businesses are...and they are the ones receiving calls (or visits in person).

In addition, there are searches that Google and the other engines deem “local” in nature. For example, bakery in Princeton, NJ or florist in Miami, FL. Google may provide a 10 pack of local results for searches like these, and it’s important to make sure you show up. Even further, Google recently changed the way it processes queries that it deems local. For example, you often don’t need to include a location to trigger the 10 pack. Google knows your location and provides tailored local results for you. How nice. :)

To learn more about local listings in Google, you can read a previous post of mine about how to set up a Google Maps listing in Google Local Business Center. In that post I walk you through what it is and how to set one up. By the way, once you take a hard look at Google’s 10 pack of local listings, it should be no surprise that it attracts a lot of attention. The 10 pack, which sometimes shows fewer than 10 listings, contains a map with markers showing the location of each business. It’s pretty hard to ignore this on the results page… The 10 pack also pushes down the organic results, which can potentially move your organic listing down the page.

Why Continual Analysis Can Provide Serious Benefits
I've found that many local businesses either don't have a listing or they set one up and check it off their list, never to return to analyze and refine the listing. But hold on a second… businesses should really be asking themselves, “How is that local listing working for me?” I recently had a client make some relatively minor changes based on reporting. These changes ended up having a significant impact on their local rankings and subsequent visits and calls from prospective customers. That’s pretty powerful considering the reporting they analyzed cost them nothing. Yes, $0. I helped my client use data provided to them in their Google Local Business Center Dashboard. You might have heard about this recently, as Google launched it in June of this year. That said, I’m sure some of you reading this post have no idea what it is. That’s ok, since this post is here to provide a thorough overview of your local dashboard, while also giving you some ideas for how to best use the data to attract prospective customers.

The Google Local Business Center Dashboard, Free Analytics for Local Businesses
OK, let’s assume you read my post about setting up your Google Maps listing and you are showing up for some targeted searches. That’s great, but do you really know how well that listing is working for your business? Until recently (June 2009), you really didn’t have a lot of insight into the performance of your local listing. Sure, you probably had Google Analytics or another analytics package set up, but that doesn’t specifically give you data about your local listing. Thankfully, Google understood this and did something about it. They rolled out a Local Business Center Dashboard that is basically a scaled-down Google Analytics report for your local listing. It provides some important data about how your listing is being triggered, viewed, and accessed. Let’s explore the features below.

The Features of Your Local Dashboard
First, log into Google Local Business Center. You will see your business information, status, and a label for “Statistics”. Under the statistics heading, you will see a quick view of impressions and actions. Impressions are the number of times your local listing was triggered and viewed as a result of a search on Google or Google Maps. Actions count the times someone viewing your listing actually interacted with it. More on this shortly. Click the “View Report” link to access your dashboard.

Accessing the dashboard from Google Local Business Center
Google Analytics-like Graphs for Impressions and Actions
The first thing you will see is a timeline at the top of the page showing activity for your listing. The chart breaks down impressions and actions visually by day, over the time period you selected. The default timeframe is the past 30 days, but you can easily change that by using the date range selector in the upper right corner and clicking apply. Right below the timeline, you will see the number of impressions, which again is the number of times your listing was viewed as a result of a search on Google or on Google Maps. Underneath impressions, you will see a breakdown of actions, which is the number of times a user took “action” on your listing. Possible actions include clicks for more information on Google Maps, clicks for driving directions, and clicks to your website. Actions are aggregated in the graph, but broken down individually underneath it. This reporting enables you to get a quick snapshot of the performance of your local listing.

Viewing impressions and actions in Your Google Local Business Center Dashboard
What to look for:
* Spikes in impressions and actions based on advertising campaigns you have launched.
* The most active days of the week or periods of time. For example, are many people searching for your services on weekends or during the week, right before holidays, or heavily during a specific season?
* The effectiveness of the details of your listing. Google provides the ability to edit your local listing, so my recommendation is to test various ways to communicate your business and then view the impact on impressions and actions. For example, you can refine your description, specialties, and categories served to determine the optimal combination of elements.

Don’t just throw up a local listing without revisiting its performance on a regular basis.

Top Search Queries
Below the breakdown of actions, you will find the top search queries that triggered your local listing, along with the number of impressions for each. Although this isn't a robust list of keywords like you would see in Google Analytics or another analytics package, it still provides important data for you to review. You probably have an idea about the types of keywords that trigger your listing; however, I’ll bet some of the keywords in the list will surprise you. It’s just like when I talk about performing keyword research: you should find out what people are actually searching for versus what you think they are searching for. Trust data, not necessarily opinion.

Click the image below to view a larger version:
Viewing top search queries in Your Google Local Business Center Dashboard

Are there keywords you never thought about targeting that people are actually searching for? Analyzing even this simple keyword report can help you target the right people locally, based on what they are really looking for. For example, let's say you are a florist focused on wedding arrangements and none of the keywords triggering your listing seem targeted for that niche. You find that most people are searching for gifts or flowers versus a specific type of arrangement. Or, you might find the opposite is true and that people are searching for very specific types of arrangements. Again, you never know until you look. Then you can determine the best path to take with regard to your local listing.

Based on what you find, you should start to think about why your listing is showing up for those searches. Is it because of the type of search being conducted or the information contained in your actual listing? It’s a good question and it is definitely worth analyzing... For example, did you let Google know that you provide organic food at your restaurant? Take the time to analyze the data and make changes to your listing. Don’t miss out on customers. In addition, the data can help you craft new marketing messages, and possibly even change how you explain your business in person or via other forms of advertising. Using the example above, are you using the word “organic” in your advertising, whether that’s on TV, in mailers, at shows or festivals, or when you speak with people in your community? If people are searching for it, you might want to start including it. :)

Know Where Your Customers Are Coming From (Literally)
Underneath top search queries, you will find a list of zip codes showing where driving direction requests are coming from. To clarify, this is when someone clicks “Directions” or “Get Directions” from your local listing. This report means the most to a business with a physical location serving local customers, and it can surface some interesting insights. For example, you can see the impact of offline marketing, identify which areas show high demand for your products or services, and use that information to craft future advertising campaigns.

To illustrate, I know some local businesses like to attend town festivals, which enable you to set up a booth. Let’s say you planned to attend four festivals in the fall (at $750 per booth). Your knee-jerk reaction might be to set up at festivals that are in close proximity to your business, maybe the four closest towns. However, you might change that strategy based on data you view in your dashboard. Maybe more requests are coming from locations 10-15 minutes away versus 5 minutes away. You actually might pass on the festivals right around your town and target ones that are two or three towns over. Again, you don’t know until you review the data. If you don’t, you could miss opportunities to get in front of more targeted groups of people. This is why I always recommend continual analysis and refinement based on data. It has become a motto here at G-Squared Interactive.

Click the image below to view a larger version:
Viewing where direction requests are coming from in Your Google Local Business Center Dashboard

Go Check Your Local Dashboard Now
So there you have it, an overview of your Local Business Center Dashboard, or what I like to call a scaled-down Google Analytics report for your local listing. I would love to see the ability to access more data, but this is still better than flying blind (which is what many businesses were doing beforehand).

Here are some key points to think about after reading this post:
* First, do you have a local listing and are you effectively managing that listing?
* Second, are you reviewing reporting for your listing and making changes based on the data?

Remember, you don’t want to miss an opportunity that’s right around the corner…literally. :)

GG

Related Posts:
How to Set Up Your Google Maps Listing
How to Perform Keyword Research for SEO
The Difference Between Sales and Marketing

Labels: , , ,

Thursday, June 11, 2009

How To Create A Google News Sitemap and Submit It Via Google Webmaster Tools


Creating and submitting a Google News sitemap.
As Twitter and Facebook boom, the need for real-time search grows more important. When people want information about breaking news, they Google it. It’s their initial reaction... And if you're not there, you might as well not exist (even if you have the greatest article on the web about the subject at hand.) So, when I’m analyzing websites that contain articles and posts that could be considered news, I'm obviously interested in seeing the amount of traffic coming from sites like Google News. After checking referring traffic levels, top content, and trending, I check to see if a Google News sitemap exists. I’ve always been a believer that if Google provides a way to send it structured data with additional information about your posts and articles, you should use it (period!) Unfortunately, many site owners don’t take the time to set up a Google News sitemap. I think it sounds harder to do than it really is, so they just brush it off. As you probably can guess, I think that’s a bad idea. :)

Google News Being More Than Google News…
When searching for a hot topic, some people head straight to Google News; however, many simply search on Google’s homepage or via the Google Toolbar. The way your listing shows up will vary depending on where the user searches. For example, thanks to Universal Search, news content is mixed into the organic listings for targeted queries, and you might see a thumbnail and headline in a Google News one box at the top of the search results. See the screenshots below for a few examples.

Example of Google News one box in search engine results.

How Google News content can show up blended into the organic search results.

I’ve found that news content ranking in the organic listings can be a powerful driver of highly targeted search traffic (for obvious reasons). By the way, having your listing show up in the SERPs (with an associated thumbnail) substantially increases your chances of click-through. Check the latest Google heatmap study to see the effect of Universal Search on user behavior if you don’t believe me. :) It also provides a great opportunity to gain valuable readers and subscribers, since visitors might view you as an authority site (you rank highly in Google News, after all.) Don't underestimate how powerful top rankings can be credibility-wise.

So, how do you make sure Google has the necessary information about your latest articles, posts, and content so you can have a chance of ranking in Google News (and as part of Universal Search)? One way is to provide a Google News sitemap. Let’s dig deeper.

What is a Google News Sitemap?
In a nutshell, a Google News sitemap is an XML feed that enables you to tell Google about your latest content, including information like publication date and news tags or keywords. In addition, as part of the keywords you provide, you can include Google News categories. You might already be familiar with XML sitemaps, the XML feeds you provide Google and the other search engines that contain all the URL's on your site. Google News sitemaps are similar, just tailored for news-related content. Note, Google requires that the information contained in the sitemap be less than three days old, so you wouldn't want to provide a running list of URL's in the feed. Instead, you would want to make sure your latest posts and stories are included. For example, if you provide the latest in electronics or search engine marketing or celebrity news, then a Google News sitemap containing your latest articles would be a smart feed to employ.

What Information Should You Provide In A Google News Sitemap?
You should create a Google News sitemap using the sitemap protocol (which is what you are probably using to create your standard XML sitemap). The core elements of a news sitemap include the namespace/urlset tag, your list of URL’s, the publication date of each URL in W3C format, and optional news tags (which can include Google News categories). There's no limit to the number of keywords you can provide, but Google recommends you keep the list to fewer than 12. Click here to see a full listing of all categories used by Google News.

A Quick Example of a Google News Sitemap:
Let’s say I ran a website covering the latest in baseball. To keep this example simple, here is what my Google News sitemap would look like if it contained two new articles: (Can you tell I'm optimistic about the Yankees this year?)

Click the image below to view a larger version:
A sample Google News sitemap.
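If you can’t view the image, here is a minimal sketch of what a two-article news sitemap along those lines might look like. The domain, URL's, dates, and keywords below are made up for illustration, and you should double check Google’s documentation for the exact namespace and tags it currently requires:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical two-article news sitemap: URLs, dates, and keywords are made up -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/yankees-sweep-red-sox.html</loc>
    <news:news>
      <news:publication_date>2009-06-11T14:30:00-04:00</news:publication_date>
      <news:keywords>baseball, yankees, red sox, Sports</news:keywords>
    </news:news>
  </url>
  <url>
    <loc>http://www.example.com/yankees-rotation-update.html</loc>
    <news:news>
      <news:publication_date>2009-06-10T09:00:00-04:00</news:publication_date>
      <news:keywords>baseball, yankees, pitching, Sports</news:keywords>
    </news:news>
  </url>
</urlset>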

Submitting Your Google News Sitemap
Once you create your Google News sitemap, you should submit it via Google Webmaster Tools. Note, Webmaster Tools was just updated (June 10, 2009), and now you can find the Sitemaps tab by clicking the plus sign next to Site Configuration (the first listing in the left navigation). First, upload your sitemap to the root directory of your website. Then submit it by clicking the Sitemaps tab and entering the sitemap’s location in the text box.

Submitting a Google News sitemap via Google Webmaster Tools.

Including a Reference to Your Sitemap or Sitemap Index File in Robots.txt
You will also want to include a reference to your sitemap in your robots.txt file. If you have more than one sitemap, use a sitemap index file, which can contain references to up to 1000 sitemaps (although you will probably never come close to that number). In addition, each news sitemap should not contain more than 1000 URL's; if you have more than 1000 URL’s, break them into separate sitemap files. Also keep in mind that if your sitemap contains URL's older than 3 days, those URL's will be rejected.

Here is what you would enter in your robots.txt file on a new line. Note, you would enter either the location of the sitemap file itself or the location of the sitemap index file, which would reference several sitemap files.

Sitemap: {sitemap_location}
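As a rough sketch, with a hypothetical domain and file names, the robots.txt line might point to a sitemap index file:

Sitemap: http://www.example.com/news_sitemap_index.xml

And the index file itself (news_sitemap_index.xml in this made-up example) would reference the individual news sitemaps:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index referencing two news sitemap files -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/news_sitemap_1.xml</loc>
    <lastmod>2009-06-11</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/news_sitemap_2.xml</loc>
    <lastmod>2009-06-11</lastmod>
  </sitemap>
</sitemapindex>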

Google Webmaster Tools and Error Messages
Be sure to monitor your news sitemap in Google Webmaster Tools to view any errors being encountered by Google. Google will notify you and provide the exact error message, which can be extremely helpful. There are a number of errors that can occur, such as date not found, date too old, empty article, etc. You can find a full list of Google News sitemap errors here.

Moving Forward With Your Google News Sitemap
Based on what I’ve explained above, my hope is that you are ready to create your own Google News sitemap. It’s relatively straightforward to create and submit, and it can help you notify Google of all the news-related content hitting your website(s). In addition, if you automate the creation of your Google News sitemap, it can work for you without requiring any additional resources… It’s one of the projects I often recommend knocking out before other, more time-consuming SEO projects. Good luck, and stop back to let me know how it worked out for you. I’d love to hear your thoughts.

GG

Labels: , , , , ,

Thursday, May 14, 2009

Killer Content, A Loyal Community, The Twitter Effect, and Its Impact on SEO


How the social web, great content, and SEO all work together.
How's that for a title? I witnessed a pretty amazing thing last week from an online marketing perspective. I love finding dynamic examples of how the social web works, especially when one unfolds right in front of your eyes over just a few hours. What I experienced last week was an outstanding example of how great content, a loyal following, respect in the industry, and SEO all tie together. It's kind of like the perfect storm, but in a good way.

Organic Linkbuilding
First, I'm a believer that your best linkbuilding comes naturally. If you create killer content that provides value to your readers and visitors, you often will end up generating high-quality links. In my experience, I've seen a direct relationship between the time and care you take to create content and the impact that content has from a linkbuilding standpoint. For example, I've developed content that took a relatively long time to create (days to write and sometimes weeks to research), but based on the popularity of that content, the buzz it generated, the targeted traffic, and the subsequent inbound links, it was well worth the time. Compare that to content developed or written quickly, with little or no thought put in, that provides little value and subsequently has no impact. It makes a lot of sense if you think about it. Are you going to link to a quick post that provides no value and no original content? Probably not, right? But you might link to a post that greatly helps your efforts (for whatever you are trying to achieve).

How it Unfolded
So let's get back to what happened last week. Here's the deal. I watched an editor break a story on a website (providing killer content), I saw that content go viral on Twitter (due to a loyal following), then it got picked up by a popular industry website (due to respect in the industry), and then I saw that content go on to generate over 22,000 inbound links in a matter of days. I saw how the content ranked within hours in Google (due to Query Deserves Freshness, or QDF), and then how it ended up ranking for dozens of competitive keywords in a short period of time. That's darn powerful.

Let's break down what happened and its impact:

1. Content
The content was great (a scoop), and it probably wasn't easy to come by. But providing valuable content (in this case, breaking news) is only part of the equation. That news could have easily led to little traffic, no links, and no rankings, right? Everyone has heard about sites getting their scoops ripped off. That's a good segue...

2. Loyal Community
Enter the next important part of the equation. If you're publishing to a black hole, who cares what you write? But if you've built up a serious following, earned respect, and engage your community, then amazing things can happen. In this case, community members started tweeting, then retweeting, then retweeting some more. You get the picture. I scrolled through pages and pages of tweets linking to the story. For people who think Twitter provides no value, please read this section again. :)

3. Respect in the Industry
Ah, the point at which things can take a different path. What happens if people try to steal your scoop? For example, they find out the breaking news from you and then post their own version of it, essentially watering down your impact. I don't care who you are, that's a horrible feeling, and it happens more often than you think. But if you've gained the respect of your peers (even beyond your community), you might see an interesting effect, like what I saw last week. A major industry website wrote an article about the breaking news and linked to the scoop I mentioned earlier. A “hat tip”, so to speak. That hat tip ended up being the top referring source for a few days. Again, powerful (and a great link for SEO too.)

4. SEO Power
The culmination of what I listed above was 22,588 inbound links, including links from some powerful websites in the industry. Inbound links are the lifeblood of SEO, so gaining thousands of them from relevant and powerful sites is a good thing. :) The article generated quality links, and a lot of them, which resulted in top rankings for competitive keywords around the subject matter. Right now, the site ranks for dozens of keywords related to the subject of the article. And that was after just a few days.

Also, I mentioned Query Deserves Freshness (QDF) earlier. That's the part of Google's algorithm that determines when a query is seeking information about breaking news and which listings referencing that news should be provided. Google determines this by monitoring the activity around a given subject. The content Google provides in the SERPs may be new blog posts or stories from trusted sites that don't have any inbound links yet (or are in the process of increasing them). The site I was monitoring is definitely a trusted site in the industry, and it benefited from QDF. In case you want to learn more, Rand Fishkin from SEOmoz provides a video explaining the ins and outs of QDF. As usual, Rand does a great job explaining how it works.

Let's summarize what happened:
So, after just a few days, the article ended up being one of the site's most popular pages traffic-wise, generated quality visitors, and earned incredible rankings in organic search. It's a great example of how the social web works and its connection to SEO. A quick side note: the page wasn't perfectly optimized for SEO, but it still ranks like mad. I think that shows which SEO factors are most important, right? (cough, quality inbound links) I can only imagine what the page would rank for if it were well optimized! :)

So, have you witnessed something like this? I'd love to hear your thoughts!

GG

Labels: , , , , ,

Thursday, April 23, 2009

What To Do When You've Been Labeled An Attack Site By Google, My Guest Post About Malware on Search Engine Journal


Steps to take when your site has been labeled an attack site that contains malware.
Imagine you wake up one morning and notice a significant drop in traffic to your website. You dig deeper in your analytics package and notice that search traffic from Google is down (as part of the larger overall drop). You start checking rankings for keywords that drive a lot of traffic to your site and notice that you are still ranking…but there’s a slight addition to your listing in the SERPs:

“This site may harm your computer.”

Yes, Google has labeled you as an attack site! It gets worse, though. When you are identified as an attack site that contains malware, Firefox 3.x users will be redirected to an interstitial page warning them about your site. Not good, right? Between the new line in your search listing, an interstitial page presented by Google, and another presented by Firefox, you can experience a serious negative impact on your traffic levels (and revenue levels.)

Needless to say, you would want to tackle the problem quickly and efficiently. But where do you start? Well, that’s the focus of my guest post on Search Engine Journal, which went live yesterday. To learn more about the attack site situation, including steps to resolve the problem, you’ll have to visit my post on SEJ! :)

My guest post:
Yes, You’re An Attack Site That Contains Malware, Now Here’s What To Do About It

If you have dealt with attack site or malware situations, please post a comment either here or on my post on Search Engine Journal. I’d love to hear how you handled the problem and how you cleared your website’s name!

GG

Labels: , ,