SEO, Forms, and Hidden Content – The Danger of Coding Yourself Into Search Obscurity

Glenn Gabe

google, SEO

How forms and web applications can hide content from the search engines.

When I perform a competitive analysis for a client, I often uncover important pieces of information about the range of websites they are competing with online. Sometimes that information is about traffic, campaigns, keywords, content, inbound links, etc. There are also times I uncover specific practices that are either beneficial or problematic for a competitor. For example, they might be doing something functionality-wise that could be inhibiting the overall performance of the site. If I uncover something like that, I usually dig much deeper to learn more about the problem and to ensure my clients don’t make the same mistakes. So, I was analyzing a website last week and I uncovered an interesting situation. On the surface, the functionality the site provided was robust and a definite advantage for the company, but that same functionality was a big problem SEO-wise. Needless to say, I decided to dig deeper to learn more.

Slick Web Application Yielding Hidden Content

As part of the competitive analysis I was completing, I came across a powerful web application for finding a variety of services based on a number of criteria. The application relied heavily on forms to receive information from users, included fairly elaborate pathing, and prompted me to clarify my answers in order to provide the best recommendations possible. After gathering enough information, it presented me with dozens of targeted service listings, each linking to more information (to more webpages on the site). So you might be thinking, “That sounds like a good thing, Glenn. What’s the problem?” The problem is that the web application, including the robust form functionality, essentially hid all of that content from the search engines. In this case, we are talking about more than 2,000 pages of high quality, high demand content. I say “high demand” because I completed extensive keyword research for this category and know what people are searching for. Unfortunately for this company, the application yielded results that simply are not crawlable, which means the site has no chance to rank for competitive keywords related to the hidden pages. And by all accounts, the site should rank for those keywords. For those of you asking, “But isn’t Google crawling forms?”, I’ll explain more about that below. For this application, none of the resulting content was indexed.

Losing Visitors From Natural Search and Missing Opportunities For Gaining Inbound Links

Let’s take a closer look at the problem from an SEO standpoint. Forms provide a robust way to receive user input and then serve tailored information based on the data collected. However, forms can also hide that content from the search engine bots. Although Google has made some strides in executing forms to find more links and content, it’s still not a perfect situation. Google doesn’t guarantee that your forms will be crawled, it limits its crawling to GET forms (versus POST), and some of the form input is generated from common keywords found on the page (for text boxes). That’s not exactly a perfect formula.
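To make the GET versus POST distinction concrete, here is a simplified sketch (the URLs and field names are hypothetical, not from the site I analyzed):

```html
<!-- A GET form: submitting it produces a distinct, crawlable URL, e.g.
     /search?service=cleaning&city=boston -->
<form method="get" action="/search">
  <input type="text" name="service">
  <input type="text" name="city">
  <input type="submit" value="Find Services">
</form>

<!-- A POST form: the parameters travel in the request body, the URL never
     changes, and Google won't submit it -->
<form method="post" action="/search">
  <input type="text" name="service">
  <input type="submit" value="Find Services">
</form>
```

Even in the GET case, Google fills text boxes with words it finds on the page, so there is no guarantee it will generate the combinations your users actually care about.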

Using forms, you might provide an incredible user experience, but you might also be limiting the exposure of your web application in natural search, and the traffic that comes with it. I come across this often when conducting both SEO technical audits and competitive analyses for clients. In this case, over 2,000 pages of content remain unindexed. And if the content is not indexed, there is no way for the engines to rank it highly (or at all).

The Opportunity Cost

Based on the keyword research I performed, a traffic analysis of competing websites, and a comparison of that data against the 2,000 or so pages of hidden content, I estimate that the site in question is missing out on approximately 10-15K highly targeted visitors per day. Even at a conservative conversion rate of 2-3 percent, that additional traffic could easily yield 300-400 conversions per day, if not more, based on the type of content the site provides.

In addition to losing targeted traffic, the site is missing a huge opportunity to gain powerful inbound links, which could boost its search power. The content provided (yet hidden) is so strong, and so in demand, that I can’t help but think those 2,000 pages would attract many valuable inbound links. That would strengthen both the domain’s overall SEO power and the power of the specific pages (since the more powerful and relevant the inbound links your site receives, the stronger it becomes SEO-wise).

Usability Is Also Hindered

Let’s say you find this form and take the time to answer all of the questions. After completing the final step, you’re presented with a list of quality results based on your input. You find the best result, click through for more information, and then want to bookmark it so you can return later. Unfortunately, you can’t. The web application doesn’t provide permanent URLs for each result. Yes, the form is slick and its algorithm is great, but you don’t have a static page that you can bookmark, email to someone else, etc. How annoying is that? If you want to return to the listing in question, you are forced to go back through the entire form again! It’s another example of how SEO and usability are often closely related.
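To illustrate the difference (with made-up URLs), here’s what the application serves versus what a bookmarkable result page might look like:

```html
<!-- Every result is rendered at the same application URL, so there is
     nothing distinct to bookmark, email, or link to: -->
<!-- https://www.example.com/find-services.aspx -->

<!-- A permanent, bookmarkable URL per result might instead look like: -->
<a href="/services/home-cleaning/boston/acme-cleaning/">Acme Cleaning in Boston</a>
```

Permanent URLs solve the usability problem and the SEO problem at the same time: users can bookmark and share them, and the bots can crawl and index them.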

SEO and Forms: A Developer’s Perspective

I started my career as a developer, so I fully understand why you would want to create a dynamic and powerful form-based application. This specific form was developed in ASP.NET, which utilizes postback: the form posts its information back to the same page, the URL doesn’t change, and the programmer can access all of the submitted variables right there. Coding-wise, this is great. SEO-wise, it produces one URL that handles thousands of different pieces of content. Although you might have read that Google started crawling HTML forms in 2008, it’s a work in progress and you can’t guarantee that all of your forms will be crawled (to say the least). On that note, you should really perform a thorough analysis of your own forms to see what Google is crawling and indexing. You might be surprised by what you find (good or bad). So, the application I analyzed (including the forms) isn’t being crawled, the URL never changes, the page optimization never changes, and the content behind the form is never found. Not good.
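For those who haven’t worked with ASP.NET Web Forms, here is a minimal sketch of the postback pattern (the page and control names are hypothetical). No matter what the user selects, the browser POSTs back to the same page, so every set of results is served from a single URL:

```html
<!-- find-services.aspx renders a form that posts back to itself -->
<form method="post" action="find-services.aspx" id="form1">
  <!-- ASP.NET round-trips page state in a hidden field -->
  <input type="hidden" name="__VIEWSTATE" value="...">
  <select name="ddlServiceType">
    <option>Home Cleaning</option>
    <option>Landscaping</option>
  </select>
  <input type="submit" name="btnNext" value="Next">
</form>
```

Because the action is a POST back to the same URL, the engines never see a distinct, crawlable address for any of the thousands of result pages the application can generate.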

If I were advising the company using this application, I would absolutely recommend providing another way for the bots to reach all of this high quality content. They should definitely keep their robust web application, but they should also provide an alternative path for the bots, and then optimize all of the resulting webpages so they can rank for targeted queries. I would also disallow the application in robots.txt, blocking the bots from crawling any URLs that would be generated via the form (just in case). With the right programmer, this wouldn’t take very long and could produce serious results from natural search.
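As a rough sketch (all paths are hypothetical), the alternative path could be as simple as plain, crawlable category links, while robots.txt keeps the bots out of the form application itself:

```html
<!-- A "Browse All Services" section of plain links the bots can follow,
     leading to the same optimized result pages the form produces -->
<a href="/services/home-cleaning/">Home Cleaning</a>
<a href="/services/home-cleaning/boston/">Home Cleaning in Boston</a>
<a href="/services/landscaping/">Landscaping</a>
```

```
# robots.txt (hypothetical path): keep bots away from the form application
User-agent: *
Disallow: /find-services.aspx
```

The key design point is that every piece of content ends up on its own permanent, optimized URL reachable through simple links, with the form serving as an enhancement rather than the only doorway.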

The Most Basic SEO Requirement: Your Content Needs to Be Found In Order to Rank

It sounds obvious, but I run into this problem often as I perform SEO technical audits. Your killer content will not rank just because it’s killer content. It needs to be crawled and indexed in order to rank highly for target keywords. In this case, the site should definitely keep providing its outstanding functionality, but it should also think seriously about the search implications (and provide an easy way for the bots to find optimized content).

The bad news for my client’s competitor is that I believe they aren’t aware of the severity of the problem and how badly it’s impacting their natural search traffic. However, the good news for my client is that they know about the problem now, and won’t make the same mistake as their competitor. That’s the power of a competitive analysis. :)

GG
