The Internet Marketing Driver: Glenn Gabe's goal is to help marketers build powerful and measurable web marketing strategies.

Sunday, April 25, 2010

A Baker’s Dozen: A Quick Update on Kati’s Kupcakes, The Winner of The Search a Small Business Holiday Giveaway [PODCAST]

If you’re a frequent reader of my blog, then you probably remember the Search a Small Business Holiday Giveaway I launched this past December. The purpose of the contest was to give an ultra-small business in New Jersey a free online marketing audit, which would produce a plan for enhancing the company’s digital strategies.

Continue reading this post>

Wednesday, March 17, 2010

.htaccess for Windows Server: How To Use ISAPI Rewrite To Handle Canonicalization and Redirects For SEO

If you’ve read previous blog posts of mine, then you know how important I think a clean and crawlable website structure is for SEO. When performing SEO audits, it’s usually not long before the important topic of canonicalization comes up. Canonicalization is the process of ensuring that you don’t provide the same content at more than one URL. It’s also one of the hardest words in SEO to pronounce. :) If you don’t address canonicalization, you can end up with identical content at multiple URLs, which can present duplicate content issues. And you don’t want duplicate content. For example, you don’t want your site to resolve at both non-www and www, at both http and https, with mixed case, or with folders resolving both with and without trailing slashes.

In addition to handling canonicalization, you also want a system in place for handling 301 redirects. A 301 redirect is a permanent redirect and will safely pass PageRank from one URL to another. This comes in handy in several situations: when you go through a website redesign and your URLs change, when you remove campaign landing pages, when you retire old pieces of content, etc. If you don’t 301 redirect these pages, you could end up paying dearly in organic search. Imagine hundreds, thousands, or millions of URLs changing without 301 redirects in place. The impact could be catastrophic from an SEO standpoint.

Enter ISAPI Rewrite, .htaccess for Windows Server
So I’m sure you are wondering, what’s the best way to handle canonicalization and redirects for SEO? If you conduct some searches in Google, you’ll find many references to .htaccess and mod_rewrite. Using mod_rewrite is a great solution, but it’s only for the Apache web server, which mainly runs on Linux. What about Windows hosting? Is there a solution for .NET-driven websites?

The good news is that there is a solid solution, and it’s called ISAPI Rewrite. ISAPI Rewrite is an IIS filter that enables you to handle URL rewriting and redirects via regular expressions. It’s an outstanding tool to have in your SEO arsenal, and I have used it for years now. There are two versions of ISAPI Rewrite (versions 2 and 3), and both enable you to handle most of what .htaccess can do. Actually, I think so much of ISAPI Rewrite that it’s the topic of my latest post on Search Engine Journal.
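To give you a feel for what this looks like in practice, here is a minimal sketch of rules for ISAPI Rewrite 3, which uses Apache mod_rewrite-compatible syntax in an .htaccess file (version 2 uses a slightly different httpd.ini format). The domain and file names below are placeholders, so treat this as an illustration rather than drop-in configuration:

```apache
RewriteEngine On

# Canonicalization: 301 redirect the bare domain to www so only
# one hostname gets indexed (example.com is a placeholder)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Redirect: permanently send a retired page to its replacement
RewriteRule ^old-landing-page\.html$ http://www.example.com/new-page.html [R=301,L]
```

Because both rules return a 301 status, the engines treat the moves as permanent and pass PageRank to the destination URLs.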

So, to learn more about ISAPI Rewrite, the two versions available, and how to use it (including examples), please hop over to Search Engine Journal to read my post.

ISAPI Rewrite: Addressing Canonicalization and Redirects on Windows Server



Wednesday, March 10, 2010

SES NY 2010 Series: Getting Penalized and Banned in Search, An Interview With Michael Stebbins from Market Motive

It’s that time of year again. SES New York is only a few weeks away, and I’ll be covering the conference again via blogging and Twitter. As part of my coverage, I’ll be writing blog posts previewing some of the sessions that I’m excited about attending. My first post is about a session titled “Post Mortem: Banned Site Forensics,” which will be co-presented by Michael Stebbins, the CEO of Market Motive, and Rand Fishkin, the CEO of SEOmoz, on Tuesday, March 23rd at 12:45. During the session, Michael and Rand will share some of the most egregious tactics that can get you in trouble, and also how to deal with getting penalized or banned. I had a chance to interview Michael last week about the session, and you will find the interview below.

Getting Penalized or Banned in Search
If you work in SEO long enough, you’ll eventually hear the nightmare stories about sites getting penalized or banned by the search engines. I actually monitored a site a few months ago (a major brand) that was pulled from Google’s index for a five to six week period before being reincluded by the search giant. I can’t imagine how much money the company lost during that timeframe. It took me only ten minutes of digging to understand what they were doing wrong (and the tactic was blatantly against Google’s webmaster guidelines). That was a bad move and I’m sure it cost them dearly.

But every company being penalized doesn’t set out to break the rules. I’ve seen many instances of companies implementing dark grey to black hat tactics simply based on a lack of experience. They might have read about how to quickly rank highly on some random blog and went ahead and implemented those tactics. They weren’t necessarily trying to game the system, but ended up making changes that could get them in trouble. Sure, they might jump up the rankings for a few weeks or months, but they also might eventually get caught. That’s typically when the companies getting penalized or banned seek professional assistance.

Needless to say, this is an important topic in SEO, and it’s why I chose to write about the session here on my blog. Michael has a wealth of experience helping companies that have been penalized or banned, and he was able to take a few minutes last week to answer some of my questions.

So without further ado, here is my interview with Michael Stebbins:

Glenn: What are the top three or four things people will learn at your session?

Michael: We'll cover which sins are forgivable and which ones can result in indefinite exclusion from the search results. We’ll also cover how to know if your site is banned in the first place. We get calls for help where site owners are certain they've been banned and it turns out the site is still indexed, but is penalized. Being penalized and being banned are very different outcomes. I'll show attendees a way to know for sure. We’ll then cover the five most common reasons sites are taken out of the index and I'll show the do's and don'ts in the reinclusion process.

Glenn: In your opinion, what are the leading causes/tactics for sites getting banned (over the past 12 to 18 months)?

Michael: Nearly all the “unforgivable” sins center around trying to fool the search engines into believing your site is more popular than it really is. The bots are getting smarter, but they are still blind and deaf. Since they cannot emulate a human behind a browser, this leaves some opportunities for unscrupulous site owners to manipulate what the engines read versus what real people see and experience.

Glenn: As the engines have evolved, how have tactics for getting penalized evolved? i.e. How have older tactics like white on white text, keyword stuffing, cloaking, etc. evolved to more advanced forms of breaking the rules?

Michael: Google keeps this information close to the vest. But Bing recently posted what they are looking for to identify web spam. If you understand Google's motivation to show relevant sites, and combine this with some technical knowledge of how a bot finds and reads a web page, it's not too hard to figure out what the engines are looking for. Only certain false popularity techniques can be picked up with a bot at this time. The rest have to be reported and then checked via a manual review.

Glenn: Based on your experience, what are some of the top misconceptions about getting penalized by the engines?

Michael: It's funny, or actually it's not so funny, but nearly everyone who gets a site banned denies that they've done anything wrong. It's like a crime drama where the “victim” hides evidence out of embarrassment or denial. Eventually, we figure it out and are able to help. Another one that keeps coming up is denial of service after over-using Google resources. The denial of service relates to queries to Google's data -- not to inclusion in the index.

Glenn: Are there times where a smaller SEO violation can lead to a website completely getting pulled from the index?

Michael: Absolutely. We've found sites that trigger manual review for a forgivable sin, but once under review, an unforgivable sin is discovered and the site is beyond recovery at that point. Picture a driver getting pulled over for a tail light infraction only to get arrested for a bank robbery.

Glenn: Based on your experience helping sites that have been penalized or banned, how long does it take to bounce back from a penalty? (If a site owner goes through the process of fixing the issue and then filing a reinclusion request).

Michael: We've seen reinclusion in two weeks, but we've seen hundreds of sites that have little hope of ever being reincluded.

Glenn: Are there any case studies you are going to present during your session (along with statistics) about sites that were penalized?

Michael: I'll use some anonymized data to give examples of statistical data that can trigger a review. But for obvious reasons, we don't want to expose sites that were banned or are working on a reinclusion.

Based on the importance of the subject matter, along with Michael and Rand’s experience, I believe this is a session that shouldn’t be missed. I think the information being presented can help clients, agencies, consultants, and in-house SEOs all better understand how to keep their sites in good standing. I’ll be attending the session on Tuesday and tweeting core points as they come up. Again, the session is scheduled for 12:45-1:45 on Tuesday, March 23rd.

So, be there or get banned by Google. Just kidding. :)

If you have any questions, post them below. Either Michael or I will respond.



Tuesday, February 09, 2010

Domain Strategy and SEO – Build Strength in Natural Search While Minimizing Security Risks

Do you know how many domains your company or clients are using? Are they building SEO power to one domain or splitting that power across ten? Do they use an excessive number of subdomains, or are they siloing content on their core website? From a security standpoint, is there sensitive content sitting on test servers, freely available to competitors? These are all important questions to explore, and how you address them can have a strong impact on your SEO efforts.

Don’t Overlook Domain Strategy
I’ve written extensively about SEO technical audits here on my blog, and how I think they provide the most SEO bang for your buck. There are a lot of important issues you can identify when performing an audit, including problems with indexation, canonicalization, navigation and internal linking, sitemaps, content optimization, etc. But there’s another important aspect to technical audits that is sometimes overlooked – Domain Strategy. Developing a solid domain strategy helps build the foundation for your overall SEO efforts. For example, would you rather have twelve domains with a few thousand inbound links per domain or one domain with 25K inbound links? Should your blog be hosted on your core domain or be on its own domain? Are you using 35 subdomains to organize content? Do you even need to use subdomains?

Don’t skip domain strategy. It’s too important to ignore. :)

And that’s why it’s the focus of my latest post on Search Engine Journal. I cover what domain strategy is and why it’s important, and I provide real-world situations I’ve come across during audits where developing a domain strategy was desperately needed. So head over to Search Engine Journal and read my post now! If you have comments or questions, feel free to post them either on Search Engine Journal or back here on my blog.

Domain Strategy – A Critical Component to SEO Technical Audits



Tuesday, February 02, 2010

Say Cheese Please - How The Right Marketing Campaign About Lactose Intolerance Could Add $1.8 Billion To The Cheese Industry Annually

Hi. My name is Glenn Gabe and I’m lactose intolerant. That’s right, me and about 40 million other Americans. Although it’s not the worst thing that can happen to you, it’s definitely a bit of a downer. I was 32 when I figured out that I was lactose intolerant, and that’s also when I learned how much of a nuisance it was to exclude certain foods from my diet. And those foods were some of my favorite things to eat, including milk, cheese, pizza, and ice cream, to name just a few. Cheese, in particular, is in so many foods and meals that you eat on a regular basis that it’s almost impossible to avoid. Now, that’s assuming that I really do have to avoid cheese. More on that shortly.

What is Lactose Intolerance?
For those of you not that familiar with lactose intolerance, here’s a quick rundown. Lactose is the sugar found in milk. Lactase is the enzyme that your body produces to break down lactose. Lactose intolerant people don’t produce enough lactase to break down the lactose they ingest. And if it’s not broken down, it causes problems (to varying degrees). For most people the symptoms aren’t horrible, but can be more of an annoyance. Since milk is a core ingredient of cheese, you would think that cheese would cause serious problems for lactose intolerant people. Not so fast...

Cabot is Sharp (And I Mean Smart)
I was making lunch about a month ago when it happened. I’m typically stuck using some flimsy science cheese for my sandwiches or choosing from the anemic selection of lactose free cheeses available. That day my wife ended up taking out her favorite cheese, which is Cabot Extra Sharp Cheddar. By the way, that’s like dangling a gourmet sandwich in front of a person that’s been stranded on an island for 5 years. :) After a quick glance at the cheese, I wiped the drool from my face and went back to my science project, I mean lunch. That was until my wife glanced at the side of the Cabot packaging. She noticed a small message on the side of the package that read “Lactose FREE”. Huh? I dropped my sandwich on the floor and ran over. Was this a mistake? Are they messing with me? I checked to make sure I wasn’t being punk’d and then I started doing some research.

Cabot's Packaging Promotes Lactose Free Cheese:

After doing some searches, I couldn’t believe what I was reading… It ends up that MOST aged cheeses are lactose free. From what I gather, the aging process yields cheese with either very low amounts of lactose or 0 grams of lactose. That includes cheddar, swiss, romano, provolone, etc. Needless to say, I was ridiculously excited. I’m not sure if all the cheeses listed have 0 grams of lactose, but most have such a low amount that they cause no problems for lactose intolerant people.

Where Were The Cheese Companies?
Then it hit me… why in the world aren’t cheese companies promoting this? Is there some reason they don’t want people to buy more of their cheese? Why didn’t I know about this? And why doesn’t the greater lactose intolerant community know more about this? I know quite a few people who are lactose intolerant, and I’m convinced that few of them actually know what they can and cannot eat! While doing my research, most of the search results led to forums and question-and-answer sites where people like me were asking about lactose free foods. Almost none of the major players in cheese ranked for the topic. Finlandia did have a page about how its cheeses are naturally lactose free, which is great, but I think more needs to be done…

The Revenue Implications of Smart Marketing
I couldn’t help but think of the massive revenue impact of effectively promoting this message to targeted people. How could cheese marketers get the word out via a number of channels?

A Target Market of 40 million lactose intolerant people…
I don’t know about you, but a target market of between 30 and 50 million lactose intolerant people provides a pretty darn good opportunity. And the fact that many of those people are dying to eat the foods they once loved (like cheese) makes it an even stronger opportunity. If cheese manufacturers, or the cheese industry as a whole, decided to launch a thorough marketing and education campaign, I can only think they would strike gold. Simply getting the word out that most cheeses are low in lactose, and many are lactose free, could be a windfall for the cheese industry. There’s actually nothing to sell… your target market wants to eat cheese. They just can’t eat it (or so they think). A well-crafted campaign combining TV, Viral Marketing, Social Media Marketing, Search Marketing, Blogger Outreach, etc. could be huge for the cheese industry. It could be a cheese extravaganza!

Here’s an example of how simple it could be, given how desperate most lactose intolerant people are to eat these foods again. Jim and Laura work together:

Jim: Hey Laura, you can’t eat cheese, right?
Laura: Yes, unfortunately I’m lactose intolerant… Are you rubbing it in?
Jim: No, I just saw a video on YouTube explaining that most cheeses are low in lactose and many have no lactose at all… You should check it out.
Laura: WHAT?? Get out of my way! {She tackles Jim to get at his computer, clicks play on YouTube and shoots out the door to the store to buy 16 blocks of aged cheese.}

Revenue Lift: Now That’s A Lot of Cheddar
Let’s do the math. If you reached even 25% of lactose intolerant people in the United States, and they ended up spending an additional $15 per month on cheese, then you are looking at a lift of $1.8 billion per year. That’s a lot of cheddar, pun intended. :)

40 million lactose intolerant people in the US
25% = 10 million people
10 million x $15 per month = $150 million per month
$150 million per month x 12 months = $1.8 billion per year in additional revenue
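For anyone who wants to play with the assumptions, the back-of-the-envelope math above is easy to sketch in a few lines of Python (the market size, reach, and spend figures are this post’s assumptions, not real market data):

```python
# Back-of-the-envelope revenue lift from the post's assumptions
lactose_intolerant_us = 40_000_000   # assumed US market size
reach = 0.25                         # assumed share of the market reached
extra_spend_per_month = 15           # assumed added cheese spend per person ($)

monthly_lift = lactose_intolerant_us * reach * extra_spend_per_month
annual_lift = monthly_lift * 12

print(f"${monthly_lift:,.0f} per month")   # $150,000,000 per month
print(f"${annual_lift:,.0f} per year")     # $1,800,000,000 per year
```

Change reach or extra_spend_per_month to see how sensitive the lift is to each assumption.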

Moving Forward
If I ran marketing for a cheese company and I was looking for ways to increase revenue, I would launch a killer campaign that engages the lactose intolerant market. Why try to get a 0.5% lift from the people who already buy and eat cheese when you can get a much greater lift from people who are dying to eat cheese, but just THINK they can’t?

Now that would be sharp. :)



Monday, December 21, 2009

The Black Hole of Blogging and Twitter, The Importance of Consistency and Persistence for Building Critical Mass in Social Media

It’s hard to have a conversation about online marketing without bringing up both blogging and Twitter. Both have become critical components of a well-balanced online marketing mix (and for good reason). Blogs can be the anchor of a social media marketing strategy, enabling a company to humanize itself, provide valuable content for targeted users, and also target the long tail of SEO (which is critically important for natural search). Then you have Twitter, which has become a powerful way to engage targeted users and get the word out about your valuable content. If you’re new to social media marketing, then blogging and Twitter should probably be the first two items on your checklist. They are too powerful and ubiquitous to ignore.

So based on what I just explained, it’s natural for companies to get excited about launching a blog and Twitter account. Setting them up is the easy part (as most people find out). The act of consistently and continually blogging and tweeting is the hard part (and where most people fail). Once the accounts are set up and ready to go, I typically hear a few important questions from new bloggers and Twitter users. For example: “What should I blog about?”, “Why do my tweets seem to go nowhere?”, and “What’s the ROI of this?” I’ve heard these types of questions so many times that I can almost answer them in my sleep. To help demonstrate the problem, I’ve displayed a bell curve below representing the stages in the process of starting a blog or Twitter account. The graph includes brainstorming, excitement and enthusiasm, the launch, publishing, the first encounter with the “black hole,” and then a quick fade to confusion, frustration, slowdown, and ultimately silence. The cause of the trend is what I like to call The Black Hole of Blogging and Twitter. It won’t be studied in astronomy classes across the country, but believe me, it’s there.

The Bell Curve of New Bloggers and Twitter Users.

Defining The Black Hole of Blogging and Twitter
There's a slide in my presentation about social media marketing that consists of a single large black circle with the caption, “This is what you’ll be blogging to once you launch.” Then the following slide contains another black circle with the caption, “And this is what you’ll be tweeting to...” Both circles represent the black hole that new bloggers and Twitter users face during the beginning of their social media initiatives.

But what exactly is the black hole of blogging and Twitter? It’s actually simple when you break it down (and makes a lot of sense). When you start a blog or Twitter account, nobody knows about you (usually) and the hard truth is that nobody cares. Your priority as a new blogger or Twitter user should be to build credibility and trust, and just like in the offline world, that takes time. So, you start writing killer blog posts and tweeting valuable content. You build some subscribers and followers, but nobody gets in touch with you.

There are no retweets.
There are no votes.
There are no stumbles.
There are no high search engine rankings.
There are no comments.
And there are no calls.

Yes, you just realized that you’re blogging and tweeting to a black hole. Cue Twilight Zone music. :)

Overcoming The Black Hole
I’m sure you’re wondering how you break out of the black hole. Good question. In order to break out and gain some traction, you need to build critical mass. And no, this isn’t easy and you cannot game critical mass. Building 2000 Twitter followers in a week via some automated service won’t build you true followers. It will build zombie followers. And although they’ll be there, they won’t know who you are, they won’t care about you or your tweets, and you’ll get no value from having them. You need to earn true followers.

The Key To Breaking Out of the Black Hole
The key to breaking out of the black hole is to build a strategy for blogging and tweeting and simply keep going… You need to keep blogging, promoting your posts, and connecting with other bloggers. You need to respond to comments on your blog and on Twitter (although there won’t be many in the beginning). You need to be consistent, persistent, and tough it out. On Twitter, you need to keep pumping out valuable content. It should be content that interests targeted users. You should track your tweets to find out what your followers are interested in and refine the content that you tweet. You need to filter what’s important and make sure you tweet multiple times per day, every day. Yes, that’s every day, including weekends. You need to engage other Twitter users, respond to direct messages, and help out your followers. No, it’s not easy, but the benefit will greatly outweigh the work involved. But, that benefit will only come if you work your way out of the black hole.

If you do end up gaining critical mass, then the black hole will start to shrink. You’ll see breaks of light in the darkness, and you might start connecting with people from all over the world. If you’re tracking your efforts, you’ll start to see more subscribers, retweets, inbound links to blog posts, and social media activity around your posts (like Stumbles, Diggs, Bookmarks, etc.). You might just start becoming a believer in Twitter, blogging, and social media. And always remember the bell curve I provided above, and try as hard as possible not to become part of that trend. Unfortunately, I see it way too often from companies launching new blogs and Twitter accounts.

Did I Mention SEO?
As more people enjoy your posts, share them with others, tweet them to their followers, vote for them, and bookmark them, the more valuable links your blog will build. The more valuable links you build, the more SEO power you gain. The more SEO power you gain, the more keywords you’ll rank for. And as more targeted users search for topics you write about, they might very well end up at your blog. And since you’ll promote your Twitter account right on your blog, you’ll also gain them as Twitter followers. And the more subscribers, followers, fans, and new customers you build, the more you’ll want to blog and tweet. The cycle will all make sense to you at this point, but you need to get there first. I’ve written about the Twitter Effect on SEO previously on my blog. Read the post and you can see how both blogging and Twitter can have a profound effect on natural search. You shouldn’t ignore that fact. Natural Search is too powerful to ignore.

Don’t Give Up
If you’re new to blogging and Twitter and you are currently dealing with the infamous black hole, don’t get frustrated. Stay the course and keep going. You need to keep building and sharing quality content, connecting with others, tweeting great articles, etc. And if you’re able to work your way through the black hole, you might eventually see the power waiting on the other side. But if you let the black hole get to you (like many people do), you’ll end up off the grid, and you’ll lose out. And if that happens, you’ll leave a void that your competitors could fill. And they sure will. The opportunity is there. Make sure you’re in the game.

Now go write a blog post. :)



Monday, December 07, 2009

Announcing The “Search a Small Business” Holiday Giveaway from G-Squared Interactive

The holidays are always a great time to reflect upon the past year in online marketing. Looking back at 2009, it’s interesting to analyze how various companies utilized new technologies and marketing channels to increase sales and engage prospective customers. I feel fortunate to be in a position where I get to speak with many marketers from a wide range of companies (both large and small) to learn which tactics they are using to grow their businesses. I think it’s been an amazing year, with Search, Social Media, and Mobile attracting a lot of attention from a wide range of companies and organizations.

However, looking back on my conversations and projects over the past year, it’s hard to ignore the lack of resources available to ultra small businesses. These small businesses unfortunately don’t have the time or budgets to tackle online marketing the way larger companies can. They also happen to be a critical component of our economy, so it just doesn’t seem right.

Ultra-Small Businesses & Online Marketing
To me, ultra small businesses are companies that are run by one or two individuals, employ fewer than ten people, generate under $500K per year in revenue, and move at light speed to keep their businesses going. Ultra small businesses are critically important for our economy, but tend to be overshadowed by news from larger brands and companies. As article after article is written about multi-billion dollar powerhouses, the small business owner remains somewhat anonymous. Yet those very business owners in aggregate employ millions of people and are an important part of the framework of commerce in the United States.

This got me thinking. What could I do this holiday season to help a small business start 2010 the right way? Let’s face it, many small businesses don't have the time to learn about the latest in online marketing and how to leverage those tactics to increase sales. But, just because they don’t have huge budgets and big brand names doesn’t mean they have to be left out in the cold!

The “Search a Small Business” Holiday Giveaway
So in the spirit of the holidays, I decided to launch The “Search a Small Business” Holiday Giveaway here at G-Squared Interactive. Over the next week, small businesses that meet the requirements listed below can simply send us an email to enter the contest. The winning business will receive a free online marketing audit, which includes an analysis of how their current website is performing. More importantly, the analysis will provide recommendations for improving the website and various online marketing efforts. Insights from the analysis could include recommendations for improving Search Engine Optimization (SEO), Paid Search (SEM), Social Media Marketing, Website Optimization, and Web Analytics. The goal is to help the winning company quickly understand changes that can impact its business. The analysis will be performed by me and Matt Leonard, an incredibly smart online marketer and good friend of mine who has agreed to help. For those of you on Twitter, Matt is @mjleonard and you should follow him now if you aren’t already. Together, we plan to arm a small business with key information for improving its online marketing efforts.

Please review the following requirements before entering the contest to make sure your business is eligible.

In order to be eligible, you must:
* Have fewer than 10 employees.
* Already have a website. Since this is an online marketing analysis, we need something to analyze. :)
* Be located less than 60 miles from Princeton, NJ and be a NJ business. This is because Matt and I will present the results to you in person at your office. I recommend you check Google Maps to see if you are eligible.
* Be willing to let us write follow-up blog posts about the giveaway and project. We would like to provide updates about how the winner is using the information provided in our analysis.
* Be willing to make changes! The analysis can only go so far. You will need to act on the recommendations in order to see an impact.

To read the official rules and regulations, please click here.

So let us help your business start off 2010 with the right online marketing strategies in place! Enter now by emailing us. Be sure to include all of your contact information so we know how to get in touch with you, including your full name, business name, business address, number of employees, phone number, and website URL. We will be accepting emails for the contest from Monday, December 7th, 2009 through Monday, December 14th, 2009. We will announce the winner on December 15th on this blog (and directly contact the winner via the information they provide when entering the contest).

Happy Holidays!

Glenn Gabe and Matt Leonard


Monday, November 09, 2009

FaceYahoogle – The Impact of Facebook, Yahoo, and Google on Website Traffic

The Power of Google, Yahoo, and Facebook on Site Traffic
It’s hard to get through a conversation about online marketing right now without bringing up Google, Facebook, and Yahoo (among other popular companies). However, if you’re not heavily involved in online marketing, and you’re not close to the actual referring traffic numbers from Google, Yahoo, and Facebook, then their influence can easily become nebulous. It’s easy to say “Google is a powerhouse,” “Facebook has 325 million members,” or “You need to be there,” but how powerful are they really?

From a traffic perspective, the three companies are so powerful that I’ve given them their own combined name, or FaceYahoogle. The power of FaceYahoogle ends up becoming very real for my clients after I complete a competitive analysis (which includes identifying major sources of traffic for their company, as well as their competitors). The numbers I present across websites typically show extremely high referral data from FaceYahoogle, and by viewing the actual traffic numbers, you start to get a feel for how much impact the three entities have traffic-wise and potentially revenue-wise.

Digging Deeper into FaceYahoogle
If you’ve read previous posts of mine, then you already know that I’m a big believer in using data versus opinion to make decisions. The power of analytics in online marketing enables you to see granular performance data across a number of key metrics. And the more websites I analyze, the more I see a significant trend across industry categories. I see FaceYahoogle sending large amounts of traffic to a wide range of sites. The abnormally high percentage of traffic coming from Google, Yahoo, and Facebook is not only amazing to see, it’s actually scary. With thousands and thousands of potential referring sites on the web, seeing FaceYahoogle send such a high percentage of traffic is alarming. I think you begin to understand how Google built up a $22 billion war chest! :)

I think many people would expect Google to be high on the referring sites list, based on its ~70% market share in search and properties like Gmail, Google Maps, and Google Docs. However, I’m not sure many know how much actual traffic is coming from Googleland. Also, we hear that Facebook has over 300 million members, which is powerful, but are those members visiting your site via the social network? I’ll answer that question below via screenshots. And then you have Yahoo, with turmoil somewhat cloaking the power of its sites. How much traffic actually comes from Yahoo Search, Yahoo Mail, Yahoo News, Finance, Answers, etc.?

So that’s my quick introduction to FaceYahoogle. Now let’s take a look at some numbers! I have provided Compete data (September 09) for a number of popular websites across a range of categories so you can view their referring sources. Note, I know Compete isn’t perfect, but it does provide a wealth of information to analyze for each website (especially for sites that receive large amounts of traffic).
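Before looking at the screenshots, here is a minimal sketch of the arithmetic behind the percentages below: rolling individual referring-site shares up into a combined "FaceYahoogle" number. The referrer names are real, but the traffic shares are hypothetical placeholders, not data from any site discussed in this post.

```python
# Hypothetical referring-site shares for an imaginary website
# (percent of total referral traffic).
referrers = {
    "google.com": 17.0,
    "facebook.com": 8.0,
    "yahoo.com": 6.0,
    "bing.com": 3.5,
    "twitter.com": 2.5,
}

FACEYAHOOGLE = {"google.com", "facebook.com", "yahoo.com"}

def faceyahoogle_share(referrers):
    """Sum the share of referral traffic coming from Google, Yahoo, and Facebook."""
    return sum(pct for site, pct in referrers.items() if site in FACEYAHOOGLE)

print(faceyahoogle_share(referrers))  # 31.0
```

The same roll-up works against any analytics export that lists referring domains and their share of visits.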

Referring Sites for The New York Times
31% from FaceYahoogle (and 17% from Google alone…)

Referring Sources for The New York Times

Referring Sites for LinkedIn
36% from FaceYahoogle, and over 8% from Facebook.

Referring Sources for LinkedIn

Referring Sites for
24% from FaceYahoogle

Referring Sources for

Referring Sites for JCrew
31% from FaceYahoogle

Referring Sources for JCrew

Referring Sites for The Huffington Post
33% from FaceYahoogle (and almost 8% from Facebook)

Referring Sources for The Huffington Post

Referring Sites for Yelp
A whopping 55% from FaceYahoogle (and 43% of that from Google!)

Referring Sources for Yelp

Referring Sites for ESPN
25% from FaceYahoogle (and nearly 10% from Facebook)

Referring Sources for ESPN

Referring Sites for
25% from FaceYahoogle (cha-ching…)

Referring Sources for

Referring Sites for
28% from FaceYahoogle

Referring Sources for

Let’s throw in a military site to see how the three-headed monster performs here:
Referring Sites for the US Air Force
Over 40% of referring traffic from FaceYahoogle

Referring Sources for the US Air Force

The screenshots above make it a little more tangible, right? FaceYahoogle is accounting for 40%+ of referring traffic for some websites. If you analyze website traffic often, then you know how insane those numbers are… But that’s not the whole story. The downstream data is important too. It turns out that a large percentage of traffic from these websites is going back to FaceYahoogle. Let’s take a look at just a few from above.

Downstream Data for
26% of visitors leave and go back to FaceYahoogle

Downstream Traffic from

Downstream Data for
31% of visitors leave and go back to FaceYahoogle

Downstream Traffic from

I saw the same trend across the other sites.

So, FaceYahoogle is driving enormous amounts of traffic, but it’s also the top recipient of traffic from many sites. In particular, Facebook provides some unique opportunities with regard to downstream traffic. Give your visitors something to take back and you can possibly end up with even more traffic (WOM-based or possibly viral-based). And with some Google and Yahoo traffic going back to Gmail, Yahoo Mail, Yahoo Answers, etc., you also have opportunities for spreading the word about your products, company, brand, etc. Let’s quickly take a closer look at each part of FaceYahoogle below.

As you can see, Google is an absolute powerhouse, accounting for 43% of Yelp's overall referring traffic. That’s outrageous! And it’s not just any traffic, right? Many of the visitors from Google just searched for specific products or services that each site provides (AKA, high quality visitors). Imagine the revenue impact of Google traffic for those sites. In case you are wondering, Google traffic numbers include Search, Maps, Mail, Docs, Video, etc.

Seeing the high percentages from Google across sites, you can start to understand why SEO and SEM have been incredibly hot in online marketing… Some companies survive based on Google traffic alone (via paid and organic search traffic). A slip in rankings can be catastrophic for some websites, with the potential of impacting millions of dollars of revenue. Think about it. If you have 40% of your traffic coming from Google and slip to page two, three, or beyond, you will lose many targeted visitors, and the money they would have spent on your site. So is Google powerful? You bet it is. The numbers combined with my experience tell me so. :)
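To make the "slip in rankings" point concrete, here is a back-of-the-envelope sketch of the revenue exposure. Every number here is hypothetical (traffic volume, conversion rate, order value, and the share of traffic lost in a rankings slip are all placeholders you would replace with your own figures).

```python
# Hypothetical inputs -- swap in your own analytics numbers.
monthly_visits = 500_000        # total monthly visits to the site
google_share = 0.40             # portion of traffic coming from Google
conversion_rate = 0.02          # visitors who convert to a sale
avg_order_value = 75.0          # dollars per order
rank_slip_loss = 0.80           # assumed traffic lost after slipping to page two or beyond

google_visits = monthly_visits * google_share
monthly_google_revenue = google_visits * conversion_rate * avg_order_value
revenue_at_risk = monthly_google_revenue * rank_slip_loss

print(round(monthly_google_revenue))  # 300000
print(round(revenue_at_risk))         # 240000
```

Even with modest placeholder numbers, a large Google share puts a six-figure monthly sum on the line, which is the point of the paragraph above.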

Facebook has grown by leaps and bounds over the past few years and is estimated to have 325 million members now. Clearly people are signing up in droves, using the platform at a staggering pace (104 billion pageviews based on Compete September 09), and oh yeah, they are visiting websites from Facebook. As you can see in the screenshots above, Facebook ranks in the top five referring sites for many of the properties I checked. Actually, it was typically in the top three. And in case you’re wondering, Twitter is moving up the charts too. Depending on the focus of the site in question, I see Twitter sending large amounts of traffic (and that doesn't count the desktop clients many Twitter members use). On that note, to read an example of how Twitter can impact exposure, traffic, and subsequent SEO power, check out my post about the Twitter effect on SEO. It’s a great example of how Search works with Social.

So, if your company is ignoring social media, then go back through the screenshots above and take note of the percentage of referring traffic from Facebook again. In meetings, I find myself saying more and more that if you ignore social media (and especially Facebook and Twitter), do so at your own risk. Again, the numbers are pretty convincing.

Although Yahoo has taken a back seat recently, the numbers are still strong from a referring source perspective. Between Yahoo Search, Yahoo Mail, Yahoo Answers, Yahoo News, Finance, etc., there are still millions of people visiting each property per month. And yes, those sites end up as top referring sources (impacting traffic, sales, sign-ups, Twitter followers, Facebook fans, etc.) Yahoo consistently showed up in the top five referring sites, and often number one or two. Don’t count out Yahoo just yet. If you do, you’d be dismissing a huge traffic source (when you take all of their properties into account).

The Future of FaceYahoogle
I’m sure you are wondering which sites will be the major sources of traffic in 2010 and beyond. Will Twitter beat out Facebook? Will Bing surpass Google? Will Yahoo be non-existent? The beauty of the web (and technology) is that we never know. But the data does tell us something: don’t ignore Search and Social, and how they can work together.

People are searching and people are talking. And the people that are talking can impact how people that are searching find your website. And people searching can lead to sharing on social networks, based on what they find. Look at the numbers again. Don’t discount Facebook because you think people are tagging photos or playing games all day. You also shouldn’t disregard Google’s dominance. It is too powerful to ignore. And Yahoo isn’t dead yet. There are millions of people visiting Yahoo Sites on a regular basis.

Last, to emphasize that we never really know what will take off, I have provided Twitter’s traffic trend below (Compete data over the past 24 months). I bet many people don’t even know that it was around in 2006 and 2007… and that it crept along until 2008, when it gained serious traction. So, is the next Twitter out there right now, slowly growing and about to gain traction? Good question. :)

Click the image to view a larger version:
Twitter Trending 24 Months



Tuesday, October 13, 2009

SEO and AJAX: Taking a Closer Look at Google’s Proposal to Crawl AJAX

Last week at SMX, Google announced a proposal to crawl AJAX. Although it was great to hear the official announcement, you had to know it was coming. Too many web applications are using AJAX for Google to ignore it! After the news was released, I received a lot of questions about what the proposal actually means, how it works, and what the impact could be. There seemed to be a lot of confusion, even among people in the Search industry. And I can understand why. If you don’t have a technical background, then Google’s blog post detailing the proposal can be a bit confusing. The mention of URL fragments, stateful pages, and headless browsers can throw a lot of people, to say the least. And if you’ve never heard of a headless browser, fear not! Since it’s close to Halloween and I grew up near Sleepy Hollow, I’ll spend some time in this post talking about what a headless browser is.

So based on my observations over the past week or so, I decided to write this post to take a closer look at what Google is proposing. My hope is to clear up some of the confusion so you can be prepared to have your AJAX crawled. And to reference AJAX’s original slogan, let’s find out if this proposal is truly Stronger Than Dirt. :)

Some Background Information About SEO and AJAX:
So why all the fuss about AJAX and SEO? AJAX stands for Asynchronous JavaScript and XML, and when used properly, it can create extremely engaging web applications. In a nutshell, a webpage using AJAX can load additional data from the server on demand without the page needing to refresh. For example, if you were viewing product information for a line of new computers, you could dynamically load the information for each computer when someone wants to learn more. That might sound unimpressive, but instead of triggering a new page and having to wait as it loads all of the necessary images, files, etc., the page uses AJAX to dynamically (and quickly) supply the information. As a user, you can quickly see everything you need without an additional page refresh. Ten or more pages of content can now be viewed on one… This is great for functionality, but not so great for SEO. More on that below.

Needless to say, this type of functionality has become very popular with developers wanting to streamline the user experience for visitors. Unfortunately, the search engines haven’t been so nice to AJAX-based sites. Until this proposal, most AJAX-based content was not crawlable. The original content that loaded on the page was crawlable, but you had to use a technique like HIJAX to make sure the bots could find all of your dynamically loaded content. Or, you had to create alternative pages that didn’t use AJAX (which added a lot of rework.) Either way, it took careful planning and extra work by your team. On that note, I’ve yet to be part of a project where AJAX developers jump up and down with joy about having to do this extra work. There just had to be a better solution, which is why Google’s proposal is an important step forward.

What is Google’s Proposal to Crawl AJAX?
When hearing about the proposal, I think experienced SEO’s and developers knew there would be challenges ahead. It probably wasn’t going to be a simple solution. And for the most part, we were right. The proposal is definitely a step forward, but webmasters need to cooperate (and share the burden of making sure their AJAX can be crawled). In a nutshell, Google wants webmasters to process AJAX content on the server and provide the search engines with a snapshot of what the page would look like with the AJAX content loaded. Then Google can crawl and index that snapshot and provide it in the search results as a stateful URL (a URL that visitors can access directly to see the page with the AJAX-loaded content).

If the last line threw you off, don’t worry. We are going to take a closer look at the process that’s being proposed below.

Getting Your AJAX Crawled: Taking a closer look at the steps involved:

1. Adding a token to your URL:
Let’s say you are using AJAX on your site to provide additional information about a new line of products. A URL might look like:

Google is proposing that you use a token (in this case an exclamation point !) to make sure Google knows that it’s an AJAX page that should be crawled. So, your new URL’s fragment would look like: #!productname

When Google comes across this URL using the token, it would recognize that it’s an AJAX page and take further action.

2. The Headless Browser (Scary name, but important functionality.)
Now that Google recognizes you are using AJAX, we need to make sure it can access the AJAX page (and the dynamically loaded content). That’s where the headless browser comes in. Now if you just said, “What the heck is a headless browser?”, you’re not alone. That’s probably the top question I’ve received since Google announced the proposal. A headless browser is a GUI-less browser (a browser with no graphical user interface) that runs on your server. The headless browser will process the request for the dynamic version of the webpage in question. In the blog post announcing this proposal, Google referenced a headless browser called HtmlUnit, and you can read more about it on the project’s website.

Why would Google require this? Well, Google knows that it would take enormous amounts of power and resources to execute and crawl all of the JavaScript being used today on the web. So, if webmasters help out and process the AJAX for Google, then it will cut down on the amount of resources needed and provide a quick way to make sure the page gets properly crawled.

To continue our example from above, let’s say you already provided a token in your URL so Google will recognize that it’s an AJAX page. Google would then request the AJAX page from the headless browser on your server by escaping the state. Basically, URL fragments (the portion of a URL after the # sign) are not sent with requests to the server. Therefore, Google needs to change that URL in order to request the AJAX page from the headless browser (see below).
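The proposal Google later published spells out what "escaping the state" means: the crawler rewrites the #! fragment into an _escaped_fragment_ query parameter before making the request, since fragments never reach the server. Here is a minimal sketch of that rewrite (the example URL and path are hypothetical):

```python
from urllib.parse import quote, urlsplit

def escaped_fragment_url(url):
    """Rewrite a #! (hashbang) URL into the query-string form a crawler
    would actually request. The _escaped_fragment_ parameter name comes
    from Google's published AJAX crawling scheme."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url  # no token: treat it as a normal, non-AJAX URL
    state = quote(parts.fragment[1:], safe="")
    base = parts.scheme + "://" + parts.netloc + parts.path
    if parts.query:
        base += "?" + parts.query
    sep = "&" if parts.query else "?"
    return base + sep + "_escaped_fragment_=" + state

# Hypothetical example URL:
print(escaped_fragment_url("http://www.example.com/products#!productname"))
# http://www.example.com/products?_escaped_fragment_=productname
```

The server sees the query-string form, renders the snapshot, and the engine maps it back to the stateful #! URL it shows in the results.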

Google would end up requesting the page like this:
Note: It would make this request only after it finds a URL using the token explained above (the exclamation point !)

This would tell the server to use the headless browser to process the page and return html code to Google (or any search engine that chooses to participate). That’s why the token is important. If you don’t use the token, the page will be processed normally (AJAX-style). If that’s the case, then the headless browser will not be triggered and Google will not request additional information from the server.
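On the server side, the dispatch described above might look roughly like the sketch below. It is framework-agnostic, and the two handler callables are placeholders for your own code (the snapshot handler is where something like HtmlUnit would be invoked); only the _escaped_fragment_ parameter name comes from Google's published scheme.

```python
from urllib.parse import parse_qs, urlsplit

def handle_request(url, render_snapshot, serve_ajax_page):
    """If the request carries the escaped AJAX state, return a pre-rendered
    HTML snapshot for the crawler; otherwise serve the normal AJAX page and
    let the visitor's browser load content dynamically."""
    query = parse_qs(urlsplit(url).query)
    if "_escaped_fragment_" in query:
        state = query["_escaped_fragment_"][0]
        return render_snapshot(state)   # snapshot path (crawler)
    return serve_ajax_page()            # normal path (visitors)

# Toy stand-ins for the two code paths:
snapshot = lambda state: "<html>snapshot of " + state + "</html>"
ajax_page = lambda: "<html>ajax shell</html>"

print(handle_request(
    "http://www.example.com/products?_escaped_fragment_=productname",
    snapshot, ajax_page))
# <html>snapshot of productname</html>
```

This is exactly why the token matters: without it, the crawler never sends the escaped request, so the snapshot branch is never triggered.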

3. Stateful AJAX Pages Displayed in the Search Results
Now that you’ve provided Google a way to crawl your AJAX content (using the process above), Google can provide that URL in the search results. The page that Google displays in the SERPs will enable visitors to see the same content as if they were traversing your AJAX content on your site. That is, they will access the AJAX version of the page versus the default content (which is what would normally be crawled). And since there is now a stateful URL that contains the AJAX content, Google can check to ensure that the indexable content matches what is returned to users.

Using our example from above, here is what the process would look like:
Your original URL:

You would change the URL to include a token:!productname

Google would recognize this as an AJAX page and request the following:

The headless browser (on your server) would process this request and return a snapshot of the AJAX page. The engines would then provide the content at the stateful URL in the search results: #!productname

Barriers to Acceptance
This all sounds great, right? It is, but there are some potential obstacles. I’m glad Google has offered this proposal, but I’m worried about how widespread its acceptance will be. Putting some of the workload on webmasters presents some serious challenges. When you ask webmasters to install something like a headless browser on their servers, you never know how many will actually agree to participate.

As an example, I’ve helped a lot of clients with Flash SEO, which typically involves using SWFObject 2.x to provide alternative and crawlable content for your flash movies. This is a relatively straightforward process and doesn’t require any server-based changes. It’s all client side. However, it does require some additional work from developers and designers. Even though it’s relatively painless to implement, I still see a lot of unoptimized flash content out there… And again, it doesn’t require setting up a headless browser on the server! There are some web architects I’ve worked with over the years that would have my head for requesting to add anything to their setup, no pun intended. :) To be honest, the fact that I even had to write this post is a bad sign… So again, I’m sure there are challenges ahead.

But, there is an upside for those webmasters that take the necessary steps to make sure their AJAX is crawlable. It’s called a competitive advantage! Take the time to provide Google what it wants, and you just might reap the benefits. That leads to my final point about what you should do now.

Wrapping Up: So What Should You Do?
Prepare. I would spend some time getting ready to test this out. Speak with your technical team, bring this up during meetings, and start thinking about ways to test it out without spending enormous amounts of time and energy. As an example, one of my clients agreed to wear a name tag that says, “Is Your AJAX Crawlable?” to gain attention as he walks the halls of his company. It sounds funny, but he said it has sparked a few conversations about the topic. My recommendation is to not blindside people at your company when you need this done. Lay the groundwork now, and it will be easier to implement when you need to.

Regarding actual implementation, I’m not sure when this will start happening. However, if you use AJAX on your website (or plan to), then this is an important advancement for you to consider. If nothing else, you now have a great idea for a Halloween costume, The Headless Browser. {And don’t blame me if nobody understands what you are supposed to be… Just make sure there are plenty of SEO’s at the Halloween party.} :)


Related Posts:
The Critical Last Mile for SEO: Your Copywriters, Designers and Developers
Using SWFObject 2.0 to Embed Flash While Providing SEO Friendly Alternative Content
6 Questions You Should Ask During a Website Redesign That Can Save Your Search Engine Rankings
SEO, Forms, and Hidden Content - The Danger of Coding Yourself Into Search Obscurity


Monday, September 28, 2009

SEO Technical Audits - A Logical First Step for Improving SEO Results

When I begin assisting new SEO clients, I typically start each engagement by completing a thorough SEO technical audit. Actually, I believe technical audits are so important that it's rare for me not to complete one. The reason is simple. An extensive audit identifies the strengths, weaknesses, and opportunities that a client has in natural search. It’s essentially a full analysis of a website, and it takes into account several key factors that impact organic search. Needless to say, it's an important part of my SEO services.

When speaking with new clients about natural search, I often refer to the four pillars of SEO: structure (a clean and crawlable structure), content (ensuring you have the right content and that it’s optimized), links (inbound links are the lifeblood of SEO), and analytics (ensuring you track and analyze your natural search efforts). Then I typically jump back to pillar one and explain that without a clean and crawlable structure, you’re dead in the water. You can essentially forget about the other three pillars if your content can’t be crawled and indexed... For example, I was helping a site that already had over 1.3 million inbound links, yet the site ranked for almost no target keywords. The site had a massive structural problem, which was wreaking havoc on a number of important factors for SEO. The site could have built another 1.3 million links and nothing would have changed. The structure and architecture needed to be addressed before any impact would be seen. That’s a good example of when a technical audit was desperately needed (and you better believe I started one quickly to identify all of the barriers present on the site.)

The Core Benefits of an SEO Technical Audit
SEO technical audits yield several key benefits for clients looking to improve their results in natural search. The first benefit is that the audit yields an actionable remediation plan, which is a deliverable that documents each of the findings from the audit (along with how to address each issue.) To me, it’s one of the most important deliverables in SEO (especially in the beginning phases of an SEO engagement.) The remediation plan enables clients to fully understand where their website (or network of websites) stands SEO-wise. They get a lay of the land, understand the core problems impacting their website, and identify key opportunities in natural search (some of which can be tackled immediately). For example, I once helped a website jump from 250K pages indexed to 1.1 million in less than a month based on relatively painless changes to the site’s structure. That opened up a massive amount of content that was essentially hidden from the search engines. Without the audit, they probably would have stayed at 250K pages indexed and missed a huge opportunity…

Another benefit is that the audit helps build an SEO roadmap, which is a critical plan for how a client is going to achieve its goals in natural search. You know where the site stands, what needs to be addressed, what the key opportunities are, and how long each step will take. Working directly with a client’s team (executives, marketers, programmers, designers, copywriters, etc.), you can map out the necessary steps to remediate the site and expand your efforts. Everyone should have a solid feel for what needs to be completed, and every person on the team is involved. In case you haven’t read my previous posts, I typically refer to a company’s team of developers, designers, and copywriters as The Critical Last Mile for SEO. Without their input and cooperation, you’re going to have a heck of a time getting things done and seeing success.

What Can You Learn From an SEO Technical Audit?
Extensive audits produce a wealth of knowledge about the website in question. Although there are some people that might want to charge the (SEO) hill without conducting a thorough audit, I think that's a dangerous proposition. Thorough research and analysis are critically important when trying to determine obstacles in natural search. Without fully understanding what you are facing, you risk wasting time, burning through budget, and demanding a massive amount of effort (from everyone involved), all while producing few results. Don’t charge the hill without a solid plan in place.

So, what can you find when performing a technical audit? To answer that question, let’s take a look at a hypothetical situation. Imagine you’re a VP or Director of Marketing that has a serious SEO problem. How important would finding the following things be for you?

Your SEO website audit revealed:

* Your company was using seven domains, and splitting your content across all of them. All seven have built up their own amount of SEO power (and none of them are very powerful).
* A website redesign was just completed, but without a proper migration strategy in place. This left thousands of pages, and possibly hundreds of thousands of inbound links, in limbo.
* Your website just added a killer web application, but that same application is hiding 90% of your content.
* Your website houses 750 videos across 30 categories, but none of them are indexed and ranking.
* Your navigation is half as robust as it needs to be, and uses several 302 redirects to link to each page.
* Every campaign landing page you launch disappears after the campaign ends (wasting thousands of powerful links.)
* Your new product pages are beautiful, but they contain a heavy amount of flash content and almost no text. And to add insult to injury, your flash content isn’t even optimized.
* 600 pages on your website are optimized the same exact way.
* Your site contains 200 pages, but over 2000 are indexed. Huh? What does that even mean?
* Your 404 page looks great, but it issues 200 codes (telling the engines the pages in question loaded successfully).
* At any given time, thousands of URL’s can change, wasting all of the SEO power they have built up over time.

I can keep going here... and you can probably start to see why I think SEO technical audits are so important. :) You never know what you’ll find, and many times these little gremlins are severely impacting your natural search efforts. Without conducting an extensive audit, you might only identify a small percentage of the problems impacting the website. That could leave the most important and deepest structural problems hidden and unaddressed. And those deeper structural problems might be causing 90% of your SEO issues. By tackling only 10% of your problems, you might not make a dent in your efforts and performance in natural search.
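One finding from the list above is easy to check yourself: the 404 page that issues 200 codes. Request a URL that cannot possibly exist and look at the status code the server returns. Here is a minimal sketch (the probe URL is hypothetical; point it at your own domain before running):

```python
import urllib.error
import urllib.request

def status_of(url):
    """Return the HTTP status code the server sends for this URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 404, 410, etc. arrive as HTTPError

def is_soft_404(status):
    """A missing page that answers 200 is a 'soft 404' the engines may index."""
    return status == 200

# Usage (uncomment and swap in a real domain):
# status = status_of("http://www.example.com/this-page-should-not-exist-12345")
# print("soft 404!" if is_soft_404(status) else "ok: returned %d" % status)
```

If the bogus URL comes back 200, the engines are being told that the page loaded successfully, which is exactly the problem described in the bullet above.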

SEO Audit Details: Deliverables, Cost, and Length of Time
In case you are wondering what a technical audit looks like, the deliverable is typically a PowerPoint presentation. Using PowerPoint enables you to provide visuals, screenshots, callouts, etc. It also works well when you need to present to larger groups of people. There are times a Word document will suffice, but unless your audience is extremely familiar with the technical aspects you will be referring to in the remediation plan, I recommend going with PowerPoint. The length of time for completing an audit (and the subsequent cost) completely depends on the size and complexity of the website. For example, larger, more complex sites might yield a 70 or 80 slide deck, while smaller websites might yield 25-30 slides. I’ve seen audits completed in less than a week and others that take 6-8 weeks to complete. It makes sense if you think about it. You might have one website that has fewer than 50 pages and another site that has millions of webpages… The two presentations might look very different.

A Critical Component: The Analyst Completing Your Audit
It’s important that you find a consultant or agency that matches well with your business, industry, and the type of content you provide. You definitely don’t want to spend time and money on an audit that produces few results. So it's important that you choose a consultant or agency that can produce a remediation plan that's technically sound, thorough, and actionable. Find out how many audits the agency or analyst has completed. Find out which verticals they have focused on, and then ask for results based on their audits. For example, if you're a small business, find out if the SEO focuses on SMB's and local search. If you have expanded internationally, then ask if the SEO understands international SEO. If you focus on video, make sure the SEO has in-depth experience with Video SEO. If you have 10 million webpages, then find out the largest website the consultant has worked on. You get the picture.

A quick example: All technical audits are not created equally:
I was asked to analyze a website last year and give the site a score for SEO (0-100, where 100 was the best possible SEO situation). Before presenting my findings, I was told that the site had previously been audited and was given a score of 75%. I was pretty shocked to hear that score. I had given the website a score of 35%. From my perspective, the site needed serious help… There's a big difference between the two scores, right? But there’s also a reason the company had chosen to have a second audit performed. They weren’t seeing results after the first was completed. A score of 35% was accurate, and we were quickly able to identify projects to tackle and develop a roadmap.

Unfortunately, technical audits that provide a shallow or incomplete view of your website can be dangerous. That type of audit could yield what I call “the snake oil effect”. That’s when internal employees become desensitized to SEO, don’t believe it can actually work, and focus their attention on less powerful initiatives. Think about it, if you’re an executive that allocated significant budget for several SEO efforts but never saw results, then your view of SEO will probably be skewed. Don’t let that happen! Natural search is too important.

The Most SEO Bang for Your Buck
If you are unhappy with your natural search results and you are determining where to begin, don’t overlook the power of an SEO technical audit. As I mentioned above, an audit can yield a detailed remediation plan in a relatively short amount of time. The remediation plan can yield a roadmap for your efforts, which can include projects that improve your overall SEO performance (including crawlability, indexation, content optimization, rankings, and targeted traffic.) That’s why I consider technical SEO audits a logical first step for most companies. It can provide serious SEO bang for your buck.


Related Posts:
6 Questions You Should Ask During a Website Redesign That Can Save Your Search Engine Rankings
The Critical Last Mile for SEO, Your Designers, Developers, and Copywriters
SEO, Forms, and Hidden Content - The Danger of Coding Yourself Into Search Obscurity


Tuesday, September 08, 2009

SEO, Forms, and Hidden Content - The Danger of Coding Yourself Into Search Obscurity

When I perform a competitive analysis for a client, I often uncover important pieces of information about the range of websites they are competing with online. Sometimes that information is about traffic, campaigns, keywords, content, inbound links, etc. There are also times I uncover specific practices that are either beneficial or problematic for the competitor. For example, they might be doing something functionality-wise that could be inhibiting the overall performance of the site. If I do uncover something like that, I usually dig much deeper to learn more about the problem to ensure my clients don’t make the same mistakes. So, I was analyzing a website last week and I uncovered an interesting situation. On the surface, the functionality the site was providing was robust and a definite advantage for the company, but that same functionality was a big problem SEO-wise. Needless to say, I decided to dig deeper to learn more.

Slick Web Application Yielding Hidden Content
As part of the competitive analysis I was completing, I came across a powerful web application for finding a variety of services based on a number of criteria. The application heavily used forms to receive information from users. The application included pretty elaborate pathing and prompted me to clarify answers in order to provide the best recommendations possible. After gathering enough information, I was provided with dozens of targeted service listings with links to more information (to more webpages on the site). So you might be thinking, “That sounds like a good thing, Glenn. What’s the problem?” The problem is that the web application, including the robust form functionality, essentially hid all of the content from the search engines. In this case, we are talking about more than 2000 pages of high quality, high demand content. I say “high demand” because I completed extensive keyword research for this category and know what people are searching for. Unfortunately for this company, the application yielded results that are simply not crawlable, which means the site has no chance to rank for competitive keywords related to the hidden pages. And by all means, the site should rank for those competitive keywords. For those of you asking, “But isn’t Google crawling forms?”, I’ll explain more about that below. For this application, none of the resulting content was indexed.

Losing Visitors From Natural Search and Missing Opportunities For Gaining Inbound Links
Let’s take a closer look at the problem from an SEO standpoint. Forms often provide a robust way to receive user input and then provide tailored information based on the data collected. However, forms can also hide that content from the search engine bots. Although Google has made some strides in executing forms to find more links and content, it’s still not a perfect situation. Google doesn’t guarantee that your forms will be crawled, it limits what it will crawl to GET forms (versus POST), and some of the form input is generated from common keywords on the page (for text boxes). That’s not exactly a perfect formula.
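To make the GET versus POST distinction concrete, here’s a minimal Python sketch (the domain and parameter names are hypothetical, not from the site I analyzed). A GET form submission is just a URL with a query string, which gives the bots a discoverable link; a POST submission carries the same data in the request body, leaving nothing unique in the URL to crawl or index.

```python
from urllib.parse import urlencode

# Data a user might submit through the form (hypothetical fields)
answers = {"service": "catering", "location": "princeton-nj"}

# A GET form produces a unique, crawlable, bookmarkable URL:
get_url = "https://example.com/results?" + urlencode(answers)
print(get_url)
# https://example.com/results?service=catering&location=princeton-nj

# A POST form sends the same data in the request body instead; the
# address bar stays at https://example.com/results, so every possible
# combination of answers shares one URL from the crawler's perspective.
```

That single shared URL is exactly the situation described above: thousands of pieces of content, one address, nothing for the engines to find.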

Using forms, you might provide an incredible user experience, but you might also be limiting the exposure and subsequent traffic levels to your web application from natural search. I come across this often when conducting both SEO technical audits and competitive analyses for clients. In this case, over 2000 pages of content remain unindexed. And if the content is not indexed, then there is no way for the engines to rank it highly (or at all).

The Opportunity Cost
Based on the keyword research I performed, a traffic analysis of competing websites, and then comparing that data to the 2000 pages or so of hidden content, I estimate that the site in question is missing out on approximately 10-15K highly targeted visitors per day. That additional traffic could very easily yield 300-400 conversions per day, if not higher, based on the type of content the site provides.

In addition to losing targeted traffic, the site is missing a huge opportunity to gain powerful inbound links, which can boost its search power. The content provided (yet hidden) is so strong and in demand that I can’t help but think the 2000 pages would gain many valuable inbound links. This would obviously strengthen both the domain’s SEO power and the power of the specific pages (since the more powerful and relevant inbound links your site receives, the more powerful it is going to become SEO-wise.)

Some Usability Also Hindered
Let’s say you find this form and take the time to answer all the questions. After you complete the final steps of the form, you are provided with a list of quality results based on your input. You find the best result, click through to more information, and then you want to bookmark it so you can return later. But unfortunately you can’t… This is due to the web application, which doesn’t provide permanent URL’s for each result. Yes, the form is slick and its algorithm is great, but you don’t have a static page that you can bookmark, email to someone else, etc. How annoying is that? So if you want to return to the listing in question, you are forced to go back through the form again! It’s another example of how SEO and usability are sometimes closely related.
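One common fix is to give every result a permanent URL derived from a stable slug. Here’s a rough Python sketch of the idea (the listing name and URL structure are invented for illustration):

```python
import re

def slugify(name):
    """Lowercase a listing name and collapse punctuation/spaces into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

# A hypothetical result returned by the form-based application
listing = "Princeton Wedding Florist & Gifts"
permanent_url = f"https://example.com/listings/{slugify(listing)}"
print(permanent_url)
# https://example.com/listings/princeton-wedding-florist-gifts
```

With a static address per listing, users can bookmark or email a result, and the bots have a clean URL to crawl and index. Usability and SEO improve together.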

SEO and Forms, A Developer's Perspective
I started my career as a developer, so I fully understand why you would want to create a dynamic and powerful form-based application. This specific form was developed using a framework that relies on postback (where the form actually posts back information to the same page). The URL doesn’t change, and the information submitted is posted back to the same page where the programmer can access all of the variables. Coding-wise, this is great. SEO-wise, this produces one URL that handles thousands of different pieces of content. Although you might have read that Google started crawling HTML forms in 2008, it’s a work in progress and you can’t guarantee that all of your forms will be crawled (to say the least…) On that note, you should really perform a thorough analysis of your own forms to see what Google is crawling and indexing. You might be surprised what you find (good or bad). So, the application I analyzed (including the forms) isn’t being crawled, the URL never changes, the page optimization never changes, and the content behind the form is never found. Needless to say, this is a serious problem.

If I were advising the company using this application, I would absolutely recommend providing another way to get the bots to all of this high quality content. They should definitely keep their robust web application, but they should also provide an alternative path for the bots. Then they should optimize all of those resulting webpages so they can rank for targeted queries. I would also disallow the application in robots.txt, blocking the bots from crawling any URL’s that would be generated via the form (just in case). With the right programmer, this wouldn’t take very long and could produce serious results from natural search…
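As a sketch, the robots.txt entry might look something like this (the path is hypothetical; it would need to match wherever the form application actually lives on the site):

```
User-agent: *
Disallow: /search-app/
```

The point is to pair the two moves: block the form-generated URL’s from being crawled, while giving the bots an alternative path of optimized, permanent pages to index instead.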

The Most Basic SEO Requirement: Your Content Needs to be Found In Order to Rank
It sounds obvious, but I run into this problem often as I perform SEO technical audits. Your killer content will not rank just because it’s killer content. The content needs to be crawled and indexed in order to rank highly for target keywords. In this case, the company should definitely keep providing its outstanding functionality, but it should seriously think about the search implications (and provide an easy way for the bots to find optimized content.)

The bad news for my client's competitor is that I believe they aren’t aware of the severity of the problem and how badly it’s impacting their natural search traffic. However, the good news for my client is that they know about the problem now, and won’t make the same mistake as their competitor. That’s the power of a competitive analysis. :)


Related Posts:
6 Questions You Should Ask During a Website Redesign To Save Your Search Engine Rankings
The Critical Last Mile for SEO, Your Copywriters, Designers, and Developers


Wednesday, August 12, 2009

Your Google Local Business Center Dashboard, Analyzing and Refining Your Google Maps Listing Based on Analytics

Google Local Business Center Dashboard

More and more small businesses are realizing the importance of advertising online, including how to maximize their presence in Search. As local businesses get more involved in online marketing, they begin to understand how prospective customers research products and services. Needless to say, many are searching for information online. And, if you offer a product or service they are looking for, it’s obviously important for you to show up for targeted searches. If you don’t rank highly for target keywords, other businesses are...and they are the ones receiving calls (or visits in person).

In addition, there are searches that Google and the other engines deem “local” in nature. For example, bakery in Princeton, NJ and florist in Miami, FL. Google may provide a 10 pack of local results for searches like these, and it’s important to make sure you show up. Even further, Google recently changed the way it processes requests that it deems local. For example, you often don’t need to include a location to trigger the 10 pack. Google knows your location and provides tailored local results for you. How nice. :)

To learn more about local listings in Google, you can read a previous post of mine about how to set up a Google Maps listing in Google Local Business Center. In the post I walk you through what it is and how to set one up. By the way, once you take a hard look at Google’s 10 pack of local listings, it should be no surprise that it attracts a lot of attention. The 10 pack, which sometimes shows fewer than 10 listings, contains a map with markers showing the location of each business. It’s pretty hard to ignore this on the results page… The 10 pack also pushes down the organic results, which can potentially move your organic listing down the page.

Why Continual Analysis Can Provide Serious Benefits
I've found that many local businesses either don't have a listing or they set one up and check it off their list, never to return to analyze and refine the listing. But hold on a second… businesses should really be asking themselves, “How is that local listing working for me?” I recently had a client make some relatively minor changes based on reporting. These changes ended up having a significant impact on their local rankings and subsequent visits and calls from prospective customers. That’s pretty powerful considering the reporting they analyzed cost them nothing. Yes, $0. I helped my client use data provided to them in their Google Local Business Center Dashboard. You might have heard about this recently, as Google launched it in June of this year. That said, I’m sure some of you reading this post have no idea what it is. That’s ok, since this post is here to provide a thorough overview of your local dashboard, while also giving you some ideas for how to best use the data to attract prospective customers.

The Google Local Business Center Dashboard, Free Analytics for Local Businesses
OK, let’s assume you read my post about setting up your Google maps listing and you are showing up for some targeted searches. That’s great, but do you really know how well that listing is working for your business? Until recently (June 2009), you really didn’t have a lot of insight into the performance of your local listing. Sure, you probably had Google Analytics or another analytics package set up, but that doesn’t specifically give you data about your local listing. Thankfully, Google understood this and did something about it. They rolled out a Local Business Center Dashboard that is basically a scaled down Google Analytics report for your local listing. It provides some important data about how your listing is being triggered, viewed, and accessed. Let’s explore the features below.

The Features of Your Local Dashboard
First, log into Google Local Business Center. You will see your business information, status, and a label for “Statistics”. Under the heading for statistics, you will see a quick view of impressions and actions. Impressions include the number of times your local listing was triggered and viewed as a result of a search on Google or Google Maps. Actions include when someone viewing your listing actually interacted with it. More on this shortly. Click the “View Report” link to access your dashboard.

Accessing the dashboard from Google Local Business Center
Google Analytics-like Graphs for Impressions and Actions
The first thing you will see is a timeline at the top of the page showing activity for your listing. The chart breaks down impressions and actions visually by day, over the time period you selected. The default timeframe is the past 30 days, but you can easily change that by using the date range selector in the upper right corner and then clicking apply. Right below the timeline, you will see the number of impressions, which again is the number of times your listing is viewed as a result of a search on Google or on Google Maps. Underneath impressions, you will see a breakdown of actions, which is the number of times a user took “action” with your listing. Possible actions include clicks for more information on Google Maps, clicks for driving directions, and clicks to your website. Actions are aggregated in the graph, but broken down individually underneath it. Providing this reporting enables you to get a quick snapshot of the performance of your local listing.

Viewing impressions and actions in Your Google Local Business Center Dashboard
What to look for:
You might notice spikes in impressions and actions based on advertising campaigns you have launched. You can identify the most active days of the week or periods of time based on activity. For example, are many people searching for your services on weekends or during the week, right before holidays, or heavily during a specific season? You can also test the effectiveness of the details of your listing. Google provides the ability to edit the details of your local listing, so my recommendation is to test various ways to communicate your business and then view the impact on impressions and actions. For example, you can refine your description, specialties, and categories served to determine the optimal combination of elements. Don’t just throw up a local listing without revisiting its performance on a regular basis.

Top Search Queries
Below the breakdown of actions, you will find top search queries that triggered your local listing, along with the number of impressions. Although this isn't a robust list of keywords like you would see in Google Analytics or another analytics package, it still provides important data for you to review. You probably have an idea about the types of keywords that trigger your listing, however, I’ll bet some of the keywords in the list surprise you. It’s just like when I talk about performing keyword research, you should find what people are actually searching for versus what you think they are searching for. Trust data, and not necessarily opinion.

Click the image below to view a larger version:
Viewing top search queries in Your Google Local Business Center Dashboard

Are there keywords you never thought about targeting that people are actually searching for? Analyzing even this simple keyword report can help you target the right people locally, based on what they are really looking for. For example, let's say you are a florist focused on wedding arrangements and none of the keywords triggering your listing seem targeted for that niche. You find that most people are searching for gifts or flowers versus a specific type of arrangement. Or, you might find the opposite is true and that people are searching for very specific types of arrangements. Again, you never know until you look. Then you can determine the best path to take with regard to your local listing.

Based on what you find, you should start to think about why your listing is showing up for those searches. Is that because of the type of search being conducted or the information contained in your actual listing? It’s a good question and it is definitely worth analyzing... For example, did you let Google know that you provide organic food at your restaurant? Take the time to analyze the data and make changes to your listing. Don’t miss out on customers. In addition, the data can help you craft new marketing messages, and even possibly how you explain your business in person or via other forms of advertising. Using the example above, are you using the word organic in your advertising, whether that’s on TV, in mailers, at shows or festivals, or when you speak with people in your community? If they are searching for it, you might want to start including it. :)

Know Where Your Customers Are Coming From (Literally)
Underneath top search queries, you will find a list of zip codes, based on where driving direction requests are coming from. To clarify, this is when someone clicks “Directions” or “Get Directions” from your local listing. This data means the most to a business with a physical location serving local customers, and it can reveal some interesting patterns. For example, you can see the impact of offline marketing, you can see which areas provide high demand for your products or services, and you can use the data to craft future advertising campaigns. For example, I know some local businesses like to attend town festivals, which enable them to set up a booth. Let’s say you planned to attend four festivals in the fall (at $750 per booth). Your knee jerk reaction might be to set up at festivals that are in close proximity to your business, maybe the four closest towns to your business. However, you might change that strategy based on data you view in your dashboard. Maybe more requests are coming from locations 10-15 minutes away versus 5 minutes away. You actually might pass on the festivals right around your town and target ones that are two or three towns over. Again, you don’t know until you review the data. If you don’t, you could miss opportunities to get in front of more targeted groups of people. This is why I always recommend continual analysis and refinement based on data. It has become a motto here at G-Squared Interactive.

Click the image below to view a larger version:
Viewing where direction requests are coming from in Your Google Local Business Center Dashboard
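As a quick illustration of how you might prioritize festivals with this data, here’s a Python sketch using made-up zip codes standing in for direction requests pulled from the dashboard:

```python
from collections import Counter

# Hypothetical zip codes, one per driving-direction request in the dashboard
direction_requests = ["08540", "08536", "08540", "08512", "08536", "08540"]

# Rank the zip codes by how many direction requests they generated
top_zips = Counter(direction_requests).most_common(3)
print(top_zips)
# [('08540', 3), ('08536', 2), ('08512', 1)]
```

If the top zip codes turn out to be a couple of towns away, that’s a signal to book booths there instead of defaulting to the festivals closest to your shop.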

Go Check Your Local Dashboard Now
So there you have it, an overview of your Local Business Center Dashboard, or what I like to call a scaled down Google Analytics report for your local listing. I would love to see the ability to access more data, but this is still better than flying blind (which is what many businesses were doing beforehand).

Here are some key points to think about after reading this post:
* First, do you have a local listing and are you effectively managing that listing?
* Second, are you reviewing reporting for your listing and making changes based on the data?

Remember, you don’t want to miss an opportunity that’s right around the corner…literally. :)


Related Posts:
How to Set Up Your Google Maps Listing
How to Perform Keyword Research for SEO
The Difference Between Sales and Marketing


Thursday, July 30, 2009

Creative Headlines Versus Descriptive Titles - Why Optimized Titles Tags are Still Important for SEO, My Latest Post on Search Engine Journal

Headlines can be powerful. Chances are you've come across a headline that was so enticing, you just had to learn more. It may have been funny, shocking, intriguing, etc. I think most marketers would agree that strong headlines can help drive a surge in short term traffic, while also being extremely memorable. However, I’m also sure that most SEOs (myself included) would agree that those very headlines could risk poor search engine rankings, which means a potential loss of long term, quality traffic from organic search. And when Search can account for a majority of a website’s traffic, it’s hard to ignore the power of high rankings.

The effect of creative and clever headlines on SEO.

SEO and Shoe-Throwing Incidents
There are times that I work with a client’s editorial staff to explain SEO, including keyword research, content optimization, the power of inbound links, etc. I find that many writers are interested in SEO, since they obviously want their articles and posts found via search engines. However, it's not uncommon to have a shoe fly by my head when I explain that clever and creative headlines are not optimal for SEO! As I explained in my guest post on Search Engine Journal, if I’m lucky, the shoe is thrown by someone with poor accuracy or small feet. :) Once the bombardment stops, I often start to conduct searches to show the impact of optimized headlines and titles (based on a client’s industry and focus). If you’ve read my previous blog posts, then you know I’m a big fan of backing your recommendations with data and not opinion. I find that data is hard to ignore.

My SEO Headline Test
Based on my work with copywriters and editors, I decided to run even more tests and write a post detailing my findings. So, I conducted searches on a number of topics and checked Google, Yahoo, and Bing to determine how many of the top listings included titles that would be considered creative or clever. Then I reversed it, and checked posts and articles that I knew used clever or creative headlines to see where they ranked in natural search.

To view the results of my test, you’ll have to read my post on Search Engine Journal titled Great Headline, Poor Rankings – Why Clever Headlines Don’t Beat Optimized Title Tags for SEO.

Feel free to post a comment on Search Engine Journal or here on my blog if you have any questions or thoughts about the topic.



Tuesday, July 21, 2009

New Features in Keyword Discovery - Also Searched, Successful Search, Core Search Engine Information, and Competitors Search

New Features in Keyword Discovery 2009.

If you’ve read some of my previous posts about SEO, then you know how important I think keyword research is. When you break it down, it’s risky to base decisions on what you think people are searching for versus analyzing the actual data. Once you perform keyword research, it can be used to optimize your current content, or more importantly, to help generate ideas for new content.

Needless to say, I’m neck deep in keyword research on a regular basis. Although I’ve used several tools to perform keyword research for my clients, I believe Keyword Discovery by Trellian is the industry leader. As new features are added to the product, I plan to cover them here on my blog in detail. In case you are interested, I’ve written several posts in the past about the importance of keyword research and some overlooked features in Keyword Discovery. After reading this post, you might want to also check them out.

New Features, Better Analysis
I’m going to cover four new features in this post that have been greatly helpful as I work on SEO projects. I’m a firm believer that you need to conduct a thorough analysis of your keywords versus just checking query volume. Trellian obviously understands this too, as they keep adding valuable features that make it a powerful analysis tool for search marketers. These new features help provide important pieces of information so you can make educated decisions about which keywords to target.

The four new features I will cover are:
* Also Searched Queries
* Successful Searches
* Analyze Information from Google, Yahoo, MSN/Bing, and Ask
* Competitors Search

Without further ado, let’s jump in.

1. Also Searched Queries
I love this feature. Have you ever wanted to know which other keywords people are searching for based on an initial keyword? This feature displays “also searched queries” as you search for keywords in the application (along with search volume.) So, if you enter “mens shoes” as the keyword, Keyword Discovery will show you other keywords that were searched for by the same users that searched for mens shoes. You actually know that the same users were searching for these additional keywords… In addition, the order of the results is based on user frequency (and not by pure number of searches in the database). This lets you see which keywords were most often searched by the same users versus just seeing volume numbers.

Click the image below to see a larger version:
The also searched feature in keyword discovery.

So, you can see that people searching for mens shoes are also searching for footwear, mens jeans, mens shirts, etc. You can also see specific retailers they are searching for. All of this data can help you make informed decisions about which keywords to target, as well as which additional keywords you might want to optimize for.

2. Successful Search Score
This is an important metric when analyzing keywords. Successful Search Score essentially tells you the percentage of people that clicked through to a search result after searching for a keyword. It gives you a good feel for the keywords that actually generate a click through.

Below you will see a list of 13 keywords based on a search for mens shoes. You can clearly see how certain keywords generate a much higher click through. This metric should be part of your own decision making process for which keywords to target. It’s obviously not the only metric to consider, but when combined with other metrics that KD offers, it can help you determine which keywords to focus on.

Successful search score in keyword discovery.
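Based on my understanding of the metric, you can think of it as simple arithmetic; here’s a sketch in Python (the search and click counts are invented for illustration):

```python
def successful_search_score(total_searches, clickthroughs):
    """Percentage of searches for a keyword that resulted in a click-through."""
    return round(100.0 * clickthroughs / total_searches, 1)

# Hypothetical numbers: 2,400 searches for a keyword, 1,872 result clicks
print(successful_search_score(2400, 1872))
# 78.0
```

A keyword with a score like 78% generates far more actual engagement than one with the same volume but a 30% score, which is why volume alone shouldn’t drive your targeting decisions.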

3. Now You Can Analyze Data From Google, Yahoo, MSN/Bing, and Ask
After adding keywords to one of your projects, Keyword Discovery enables you to analyze those keywords to view a number of key metrics. For example, you can see the number of searches in the database, successful searches (mentioned above), the number of results in each engine for that keyword, and the KEI (or Keyword Effectiveness Index). Keyword Discovery recently broke down this information by core search engine, including Google, Yahoo, MSN/Bing, and Ask. Having all of this information at your fingertips enables you to analyze keywords across the core engines, in order to make smart decisions about which keywords to target. This data helps you understand how competitive each keyword is so you can target the right keywords for the task at hand.

Click the image below to see a larger version:
Analyze core search engine information in keyword discovery.
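As an aside, KEI is typically computed from the same two inputs this report shows. One common formulation (Keyword Discovery’s exact formula may differ) squares search volume and divides by the number of competing results, so popular keywords with little competition score highest. A quick sketch with invented numbers:

```python
def kei(searches, competing_results):
    """Keyword Effectiveness Index, one common formulation:
    search popularity squared, relative to competition."""
    return searches ** 2 / competing_results

# Hypothetical keywords: (searches in the database, competing pages indexed)
print(kei(5400, 1_200_000))  # high volume, heavy competition -> 24.3
print(kei(880, 20_000))      # modest volume, light competition -> 38.72
```

Note how the lower-volume keyword can come out ahead: that’s the whole point of weighing competition alongside popularity when choosing targets.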

4. Competitors Feature
Checking this box when conducting a search in Keyword Discovery will display the top websites receiving search engine traffic for that keyword. There are some great competitive analysis tools on the market, and I use several of them on a regular basis, but it’s great to have some base level data at your fingertips while performing keyword research. For example, I entered mens shoes in KD and it displayed the top 100 sites receiving search engine traffic for that keyword (based on Trellian’s Competitive Analysis User Path Data). Your list might start with some obvious players, but as you scan down the results you might find some interesting competitors. And, you can use the results to start performing a deeper competitive analysis.

Click the image below to see a larger version:
Competitor search feature in keyword discovery.

This won’t be my last post about keyword research or Keyword Discovery…
So there you have it. Four new features in Keyword Discovery that can help you select the right keywords for the project at hand. I plan to write more about KD in the future as Trellian adds more features. Actually, there are some features that warrant an entire blog post, so look for more posts in the near future!

I’ll end this post with a Glenn Gabe public service announcement:

Please don’t base your SEO efforts on opinion. Perform extensive keyword research and have that research fuel your projects. A keyword is a terrible thing to waste. :)

