The Internet Marketing Driver


When The Hammer Falls – Analyzing Lyrics in the Google SERPs and Its Impact on Traffic [Case Study]

February 4, 2015 By Glenn Gabe 7 Comments

Summary: In the fall of 2014, both Bing and Google began surfacing song lyrics directly in the search engine results pages (SERPs). Since users could now find lyrics immediately in the SERPs, many wondered what would happen to lyrics websites that provided the same information but required a click through to view the lyrics. This post provides findings from analyzing three large-scale lyrics websites to determine the traffic impact of lyrics in the SERPs.

Song Lyrics Displayed In The Google Search Results

Article Contents and Quick Jumps:

  • Introduction
  • Lyrics Show Up in the SERPs
  • Analysis Process and Caveats
  • The Results – The Impact of Lyrics in the SERPs
  • What Could Lyrics Web Sites Do?


Introduction
In April of 2014, I picked up a major algorithm update that heavily impacted lyrics web sites. The drop in traffic to many key players in the niche was substantial, with some losing 60%+ of their Google organic traffic overnight. For those of you familiar with Panda or Penguin hits, you know what this looks like.

Lyrics Web Sites Hit By Google Algorithm Update in April of 2014

I ended up digging in heavily and analyzing the drop across the entire niche. I reviewed a number of lyrics sites across several countries that got hit and wrote a post covering my findings (linked to above). After writing that post, I had a number of lyrics sites reach out to me for more information. They wanted to know more about what I surfaced, what the problems could be, and if I could help rectify the situation. It was a fascinating algo hit to analyze and I absolutely wanted to take on the challenge of helping the sites recover. So I began helping several of the lyrics sites that were heavily impacted.

2014 – A Crazy Year for Lyrics Sites
I took on several of the lyrics sites as clients and began heavily analyzing and auditing the negative impact. That included a deep crawl analysis of each site, a heavy-duty technical SEO analysis, and a thorough content analysis, while also using every tool in my arsenal to surface SEO-related problems.

I won’t sugarcoat my findings: there were many problems I surfaced across content, technical SEO, and even links (in certain situations). It was hard to say if the specific update in April was Panda, a separate algo update that hammered lyrics sites, or something else. But I tackled the situation by covering as many bases as I could. Each remediation plan was extensive and covered many ways to tackle the problems I surfaced. As time went on, and many changes were implemented, the sites started to recover. Some recovered sooner than others, while other sites took many more months to surge back.

Lyrics Website Recovering During Panda Update

On that note, many of the large lyrics sites have ridden the Panda roller coaster for a long time. And that’s common for large-scale websites that haven’t focused on Panda-proofing. Over time, insidious thin content builds on the site like a giant layer of bamboo. And as the bamboo thickens, Panda smells dinner. And before you know it, boom, Panda hits the site (and for these sites, it hit them hard).

After recovering, each site would hold its collective breath while subsequent Panda updates rolled out. Of the lyrics websites I have assisted, only one has fallen to Panda again. The others have remained out of the gray area and are doing well traffic-wise. Unfortunately, the site that fell saw only a temporary recovery, and that recovery arrived relatively quickly (almost too quickly). Quick recoveries are rare when you’re dealing with Panda, so I found that one odd. It typically takes months before you see a major surge after being pummeled by Panda. The site surged during the 9/5 update and then got hammered again during the 10/24 update. And Panda has not rolled out since 10/24/14, so we’re still waiting to see if the site comes back.

Lyrics Website Temporary Recovery from Panda

But enough about Panda for now. Actually, Google Panda could pale in comparison to what showed up in late fall 2014. We all knew it was possible, considering Google’s ambition to provide more and more data in the search engine results pages (SERPs). But it’s another story when you actually see it happen. I’m referring to the search engines adding lyrics directly in the SERPs. You know, when someone searches for song lyrics, and boom, the lyrics show up right in the desktop or mobile SERPs. No click through needed. I’ll cover how this unfolded next.


Lyrics Show Up in the SERPs
Bing was the first to add lyrics in the SERPs on October 7, 2014. That was the first bomb dropped on lyrics sites. It was a small bomb, considering lyrics were only showing in the United States and Bing has approximately 19.7% market share (according to comScore’s December 2014 stats). Bing also powers Yahoo search (organic and paid), but lyrics are not showing in Yahoo yet.

Lyrics in Bing SERPs

But the writing was on the wall. Lyrics were coming to Google, and sooner than later. When lyrics hit Bing, I sent emails to all of my lyrics clients explaining the situation, providing screenshots and sample searches. Not every song would yield lyrics in the SERPs, but this was still a major event for the lyrics industry.

Next up was the first move by Google. On October 24, 2014, if you searched for a specific song, Google began providing a YouTube video with some song and artist information at the top of the SERPs. And near the bottom of that unit was a line or two from the lyrics and then a link to Google Play for the full lyrics. Whoa, so Google was beginning their assault on lyrics by simply linking to Google Play to view the lyrics. Again, I immediately emailed my clients and explained the situation, knowing lyrics were coming to the main SERPs soon.

Lyrics in Google SERPs Linking To Google Play

 

December 19, 2014 – The Hammer Falls
And then this happened:

Lyrics in Google SERPs Finally Arrive on December 19, 2014

And here was my Google+ share, which ended up getting a lot of attention:

Google Plus Share of Lyrics in the Google SERPs

 

I shared this screenshot of Google including lyrics directly in the SERPs, and the G+ post got noticed, a lot. That share was mentioned on a number of prominent websites, including Search Engine Roundtable, TechCrunch, Billboard, and more.

To clarify what was happening search-wise, on December 19, 2014 Google began showing song lyrics for users in the United States, and only for certain songs. I’m assuming the limit on songs and geography was based on licensing, so this doesn’t impact every song available. I’ll cover more about the impact of those limitations soon when I dig into some stats, but it’s an important note.

For example, if you search for “bang bang lyrics” in the United States, you get this:

Bang Bang Lyrics in US Google SERPs

But if you search for “you shook me all night long lyrics”, you won’t see lyrics in the SERPs. Clearly Google doesn’t have the rights to present the lyrics to all AC/DC songs, but it does for “Bang Bang”.

You Shook Me All Night Long Without Lyrics in US Google SERPs

And by the way, that’s for the desktop search results. This is also happening in mobile search, in the United States, and for certain songs. Talk about dominating the mobile SERPs; check out the screenshot below. Whereas on desktop you get the lyrics but (typically) still see links to lyrics websites above the fold, mobile is another story.

Check out the search for “bang bang lyrics” on my smartphone:

Bang Bang Lyrics in the Mobile U.S. Google SERPs

Can you see the massive difference? It’s just lyrics, and nothing else. And to add insult to injury, the percentage of users searching for lyrics is heavily skewed toward mobile. And that makes sense. Those users are on the go, hear a song, want to know the lyrics, and simply search on their phones. Or they are in a situation where their phone is their computer, so their searches will always be mobile.

Mobile Heavy Queries for Lyrics Globally

 

Death to Lyrics Websites?
Based on what I’ve explained so far, you know that Panda loves taking a bite out of lyrics web sites and you also know that both Google and Bing are providing lyrics directly in the SERPs (in the US and for certain songs). And you might guess that all of this means absolute death for lyrics websites. But wait, does it? I wouldn’t jump to conclusions just yet. There are definitely nuances to this situation that require further analysis and exploration.

For example, how much of a hit have the lyrics sites taken based on lyrics in the SERPs? How much traffic dropped for each song that yields lyrics in the SERPs? Was there an impact just in the United States or around the world too? And what about the difference between desktop and mobile? All of these were great questions, and I was eager to find answers.

So, I reached out to several of my lyrics clients and asked if I could analyze the changes and document the data in this post (anonymously of course). The post isn’t meant to focus on the sites in particular, but instead, focus on the impact that “lyrics in the SERPs” have made to their traffic. The lyrics websites I’ve been helping generate revenue via advertising, so a massive drop in traffic means a massive drop in revenue. It’s pretty much that simple at this point. That’s why Panda strikes fear in every lyrics web site owner and why lyrics in the SERPs can strip away visits, pageviews, and ad dollars. It’s a new one-two punch from Google.


Analyzing Three Large-Scale Lyrics Websites
Three of my clients were nice enough to let me move forward with the analysis. And I greatly appreciate having clients that are awesome, and are willing to let me analyze and share that data. The three sites I analyzed for this post are large-scale lyrics sites. Combined, they drive more than 30 million visits from Google organic per month and have approximately 6 million lyrics pages indexed. And as I explained earlier, a lot of that traffic is from users on mobile devices. Approximately 40-50% of all Google organic traffic is from mobile devices (across all three sites).

Process:
My goal with the analysis was to understand the impact of lyrics in the SERPs from a click-through and traffic standpoint. I dug into search queries driving traffic over time to all three sites while also checking impressions and clicks in the SERPs (via Google Webmaster Tools, both desktop and mobile). Then I also checked Google Analytics to determine the change in traffic levels to song pages since the lyrics hit the SERPs.

For example, if a query saw a similar number of impressions since the launch of lyrics in the SERPs, but clicks dropped off a cliff, then I could dig in to analyze the SERPs for that query (both desktop and mobile). I found some interesting examples for sure, which I’ll cover below.

An example of stable or increasing impressions, but clicks dropping off a cliff: 

Google Webmaster Tools Impressions and Clicks for Lyrics Queries

 

Caveats:
My analysis measured the impact right after lyrics hit the SERPs (from December 19, 2014 through the end of January 2015). The holidays were mixed in, which I tried to account for as best I could. Some of the lyrics sites saw steady traffic during the holidays, while one dipped and then returned as the New Year approached. The songs I analyzed and documented were not holiday-focused, since I made sure to isolate songs that would not be impacted by the holidays. Also, Google Webmaster Tools data was sometimes wonky. I’m sure that’s no surprise to many of you working heavily in SEO, but it’s worth noting. I tried my best to exclude songs where the data looked strange.

Google Webmaster Tools & Advanced Segmentation in GA
When I began my analysis, I quickly found out that the straight reporting in both Google Webmaster Tools and Google Analytics wouldn’t suffice. Overall Google organic traffic wouldn’t help, since lyrics only rolled out in the SERPs in the United States. When checking traffic since the rollout, you really couldn’t see much overall change. But the devil is in the details as they say. So I used the functionality available to me in both GWT and GA to slice and dice the data. And that greatly helped me understand the impact of lyrics in the SERPs.

In Google Webmaster Tools, the search queries reporting enables you to filter the results. This was incredibly helpful, as I was able to isolate traffic from the United States and also view web versus mobile traffic. But there was another nifty filter I used that really helped. You see, many people visit lyrics websites for the meaning of the lyrics, and not just to see the lyrics. For example, “take me to church meaning” or “meaning of hallelujah lyrics”.

The reason I wanted to weed those queries out is that, as of now, Google does not provide lyrics in the SERPs for “meaning”-focused queries. And that’s good for my clients, by the way. So by adding the filters per site, I was able to isolate songs that could be impacted.

Filtering GWT Search Queries by Search Property, Location, and Negative Query:

Google Webmaster Tools Filters for Property, Location, and Query

After setting the filters, I was able to search for queries that yielded relatively stable impressions, but saw a drop in clicks and click through rate. And I always kept an eye on average position to make sure it didn’t drop heavily.

From a Google Analytics standpoint, I ran into a similar problem. Top-level statistics wouldn’t cut it. I needed Google organic traffic from the United States only. And then I wanted both Desktop and Mobile Google organic traffic from the United States only (separated). That’s where the power of advanced segments comes in.

I built segments for Desktop Google organic traffic from the United States and Mobile Google organic traffic from the United States. By activating these segments, my reporting isolated that traffic and enabled me to identify trends and changes based on those segments alone. By the way, I wrote a tutorial for how to use segments to analyze Panda hits. You should check that out if you aren’t familiar with segments in GA. You’ll love them, believe me.

Filtering Google Organic Traffic from the United States in GA Using Segments:

Google Analytics Segments for U.S. Desktop Google Organic Traffic

 

So, with the right tools and filters in place, I began to dig in. It was fascinating to analyze the queries leading to all three sites now that lyrics hit the SERPs. I cover what I found next. By the way, this post focuses on Google and not Bing. I might write another post focused on Bing’s lyrics in the SERPs, but I wanted to start with Google.


The Impact of Lyrics in the SERPs – The Data
With multiple computers up and running, two phones, and two tablets, I began to dig in. I wanted to find queries and songs that typically drove traffic to the three sites that now yielded lyrics in the SERPs. And then I wanted to see what happened once those lyrics hit the SERPs, the impact on clicks, traffic, etc. I have documented a number of examples below. By the way, there are many more examples, but I wanted to just provide a sampling below. Here we go…

 

All figures compare the period after lyrics hit the SERPs to the period before (Google organic, United States only):

Song (Artist)                                       Desktop US Traffic   Mobile US Traffic   GWT Clicks
Spill The Wine (War)                                Down 73%             Down 65%            Down 56%
Sister Ray (The Velvet Underground)                 Down 73%             Down 56%            Down 84%
Rude (Magic!)                                       Down 41%             Down 32%            Down 55%
Bang Bang (Jessie J, Nicki Minaj, Ariana Grande)    Down 32%             Down 47%            Down 66%
Fireproof (One Direction)                           Down 44%             Down 40%            Down 29%
All of Me (John Legend)                             Down 39%             Down 14%            Down 61%
Country Road (John Denver)                          Down 62%             Down 45%            Down 36%
Come Sail Away (Styx)                               Down 43%             Down 27%            Down 55%
Midnight Special (Huddie William Ledbetter)         Down 53%             Down 85%            Down 33%
Comfortably Numb (Pink Floyd)                       Down 46%             Down 17%            Down 43%
 

Yes, There’s A Serious Impact
As you can see from the statistics above, both desktop and mobile traffic to the song pages dropped significantly since lyrics hit the SERPs (for songs that yield lyrics in the SERPs). Again, these songs showed stable impressions during the timeframe, yet showed large drops in clicks from the SERPs, and subsequent traffic to the three lyrics sites I analyzed.

Some users were clearly getting what they wanted when searching for lyrics and finding that information in the SERPs. And in mobile search, the lyrics take up the entire results page. So it’s no surprise to see some mobile numbers absolutely plummet after lyrics hit the SERPs.


What Could Lyrics Sites Do?
Above, I provided a sampling of what I saw while analyzing the impact of lyrics in the U.S. Google SERPs. Clearly there’s a large impact. The good news for lyrics sites is that there are several core factors helping them right now.

  • This is only in the United States.
  • The lyrics only trigger when the query is structured in certain ways. For example, “magic rude lyrics” yields lyrics whereas “rude lyrics magic” does not. Also, if additional words are entered in the query, lyrics will not be shown (like “meaning”, which I explained earlier).
  • Not all songs are impacted (yet). I found many examples of songs that did not yield lyrics in the SERPs. Again, this is probably due to licensing issues.

If you look at the overall traffic numbers for the sites I analyzed (and the other sites I have access to), Google organic traffic overall has not been heavily impacted. Taking all global Google organic traffic into account, and across all songs, you clearly don’t see the huge drop like I showed you for the songs listed above. That said, this is still a grave situation for many lyrics sites. The content they have licensed and provided on their sites is now being surfaced directly in the SERPs. If this expands to more songs, more countries, and for additional queries, then it can have a massive impact on their businesses. Actually, it could very well end their businesses.

Moving forward, lyrics sites need to up their game from a functionality and value proposition standpoint. If Google can easily add lyrics to the SERPs, then lyrics sites need to keep driving forward with what Google can’t do (at least for now). They should develop new functionality, strengthen community engagement, provide member benefits, include more data and media for artists and songs, provide a killer mobile experience, etc.

Remember, there are many people searching for additional information related to songs. For example, people want to know the meaning of lyrics and seem to enjoy the community engagement about learning what each lyric means. And lyrics don’t trigger in the SERPs for those queries (yet).

And then you have the next generation of devices, social networks, messaging apps, gaming consoles, connected cars, etc. I would start thinking about how people are going to search for lyrics across new devices and in new environments. That’s a new frontier and it would be smart to begin building and testing lyrics applications that can work in those new environments. Mobile, wearables, voice search, cars, etc. provide a wealth of opportunity for business owners focused on music. It just takes the right ideas, time, resources, and of course, money.

But I’ll stop there. I think that topic can be an entire post and this one is getting too long already. :)

 

Summary – Moving Forward With (Expanding) Lyrics in the SERPs
In the short-term, it’s hard to say how this will expand. Google and Bing might drop the effort and keep things as-is, or they could keep expanding lyrics in the SERPs until every song and every country is covered.

Based on the current song and geography limits in Google and Bing, lyrics websites are still surviving, and especially for searches outside the United States. It will be interesting to watch this space over time, especially since I have several clients adapting to the new lyrics world as I write this post.

From an SEO standpoint, between Google Panda and content surfacing in the SERPs, lyrics web sites are fighting a battle on two fronts. If it’s not Panda attacking the site one night, it’s the Knowledge Graph pushing song lyrics front and center in the SERPs. And in this day and age, wars are won by technology, not brute strength. So lyrics sites need to up their engineering prowess, think two to three steps ahead of the industry, and then execute quickly and at a very high level.

That’s how they can survive and prosper in the coming years. Of course, that’s until we have a Google chip implanted in our brains that instantly provides the lyrics to every song ever written, from around the world, since the beginning of time. Think about that for a second.

GG

 

Filed Under: bing, google, seo, web-analytics

Panda, Penguin, and Manual Actions – Questions, Tips, and Recommendations From My SES Atlanta Session

July 14, 2014 By Glenn Gabe 12 Comments

SES Atlanta Panda

{Important Update About Penguin: Read John Mueller’s latest comments about the Penguin algorithm.}

I just returned from SES Atlanta, where I presented “How To Avoid and Recover From Panda, Penguin, and Manual Actions”. The conference was outstanding, including a killer keynote by Duane Forrester and sessions packed with valuable information about SEO and SEM. By the way, when I entered my hotel room in Atlanta, I immediately saw a magazine on the desk. The photo above is the cover of that magazine! Yes, a Panda was on the cover. You can’t make this stuff up. :)

During (and after) my presentation about algorithm updates and penalties, I received a number of outstanding questions from audience members. And later in the day, I led a roundtable that focused on Panda and Penguin. There were also some great conversations during the roundtable with business owners and marketers across industries. It’s always interesting to hear top marketer concerns about major algorithm updates like Panda and Penguin (and especially Panda 4.0, which just rolled out in late May). We had a lively conversation for sure.

On the plane flight home, I started thinking about the various questions I was asked, which areas were the most confusing for marketers, and the tips and recommendations I was sharing.  And based on that list, I couldn’t help but think a Q&A style blog post could be very helpful for others dealing with Panda, Penguin, and manual actions. So, I decided to write this post covering a number of those questions. I can’t cover everything that I spoke about at SES Atlanta (or this post would be huge), but I can definitely provide some important tips and recommendations based on questions I received during the conference.  Let’s jump in.

Algorithm Updates and Manual Actions – Q&A From SES Atlanta

Question: I’ve been hit by Panda 4.0. What should I do with “thin content” or “low-quality” content I find on my website?  Is it better to nuke the content (404 or 410), noindex it, or should I redirect that content to other pages on my site?

Glenn: I hear this question often from Panda victims, and I know it’s a confusing topic. My recommendation is to remove thin and low-quality content you find on your site. That means 404 or 410 the content or noindex the content via the meta robots tag. When you have a content quality problem on your site, you need to remove that content from Google’s index. In my experience with helping companies recover from Panda, this has been the best path to take.
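
For reference, the noindex route is just a single tag in the head of each page you want removed from the index (note that the page must remain crawlable so Googlebot can actually see the tag):

<meta name="robots" content="noindex">

Once Google recrawls those pages, they should drop out of the index over time.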

That said, if you find content that’s thin, but you feel you can enhance that content, go for it. If you believe the content could ultimately hold information that people are searching for, then beef it up. Just make sure you do a thorough job of developing the additional content. Don’t replace thin content with slightly thin content. Create killer content instead. If you can’t, then reference my first point about nuking the content.

Also, it’s important to ensure you are removing the right content… I’ve seen companies nuke content that was actually fine thinking it was low-quality for some reason. That’s why it’s often helpful to have an objective third party analyze the situation. Business owners and marketers are often too close to their own websites and content to objectively rate it.

Panda Decision Matrix

 

Question: How come I haven’t seen a Panda recovery yet even though I quickly made changes? I was expecting to recover during the next Panda update once the changes were implemented.

Glenn: This is another common question from Panda victims. It’s important to understand that completing the changes alone isn’t enough. Google first needs to recrawl the site and the changes you implemented.  Then it needs to better understand user engagement based on the changes. I’ve explained many times in my blog posts about Panda that the algorithm is heavily focused on user engagement. So just making changes on your site doesn’t provide Google enough information.

Panda recovery can take time. Just read my case study about 6 months with Panda. That was an extreme situation in my opinion, but it’s a great example of how long it can take to recover.

Second, Panda roughly rolls out once per month. You need an update to occur before you can see changes. But that’s not a hard rule. John Mueller from Google clarified the “Panda Tremors” I have been seeing since Panda 4.0, and explained that there isn’t a fixed frequency for algorithm updates like Panda. Instead, Google can continue to tweak the algo to ensure it yields the desired results. Translation: you might see turbulence after a Panda hit (and you may see increases or decreases as the tremors continue).

Panda Tremors John Mueller

And third, you might see smaller recoveries over time during subsequent updates (versus a full recovery in one shot). I’ve had several clients increase with subsequent Panda updates, but it took 4-5 updates for them to fully recover. So keep in mind that you might not see full recovery in one shot.

 

Question:  We know we have an unnatural links problem, and that we were hit by Penguin, but should we tackle the links problem or just build new links to balance out our link profile?

Glenn: I’ve seen many companies that were hit by Penguin avoid tackling the root problem, and instead, just try and build new links to balance out their link profile. In my opinion, that’s the wrong way to go. I always recommend aggressively handling the unnatural links situation, since that’s the most direct path to Penguin recovery.

And to clarify, you should still be pumping out killer content, using Social to get the word out, etc. I always tell clients impacted by Penguin or Panda to act like they aren’t impacted at all. Keep driving forward with new content, sharing via social media, connecting with users, etc. Fresh links and shares will be a natural side effect, and can help the situation for sure. And then the content they are building while under the Penguin filter could end up ranking well down the line. It’s hard to act like you’re not hit, but that’s exactly what you need to do. You need to be mentally tough.

Address Unnatural Links for Penguin

 

Question: Is it ok to remove content from Google’s index? Will that send strange signals to the engines?

Glenn: Nuke it. It’s totally fine to do so, and I’ll go even further and say it could be a great thing to do. I mentioned this several times in my Panda 4.0 findings, but the right indexation is more important than high indexation. In other words, make sure Google has your best content indexed, and not thin, duplicate, or other low-quality content.

I had one client drop their indexation by 83% after being impacted by Phantom and Panda, and they are doing extremely well now Google organic-wise. I love the screenshot below. It goes against what many marketers would think. Lower indexation = more Google traffic. That’s awesome.

Indexation and Panda

 

Question: We consume a lot of syndicated content. What’s the best way to handle attribution?

Glenn: I saw a number of sites get smoked during Panda 4.0 that were consuming a lot of syndicated content and not handling it properly SEO-wise. The best way to handle attribution for syndicated content is to use the cross-domain canonical URL tag pointing to the original article. If you can’t do that (or don’t want to), then you can keep the content out of Google’s index by noindexing it via the meta robots tag.
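
For example, a syndicated article could include a cross-domain canonical in its head pointing back to the source (the URL below is just a placeholder):

<link rel="canonical" href="http://www.original-publisher.com/original-article/">

That tells Google which version of the content should receive credit in the search results.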

It’s not your content, so you shouldn’t be taking credit for it.  That said, if set up correctly, it’s fine to have syndicated content on your site for users to read. But the proper attribution is important or it can look like you are copying or scraping content. I know that won’t go over well for ad teams looking to rank in organic search (to gain more pageviews), but again, it’s not your content to begin with.

Syndication and Panda

 

Question: Why hasn’t there been a Penguin update since October of 2013? What’s going on? And will there ever be another update?

Glenn: It’s been a long time since the last Penguin update (October 4, 2013). Like many others heavily involved with Penguin work, I’m surprised it has taken so long for another update.

Penguin 2.1 on October 4, 2013

Matt Cutts recently explained at SMX Advanced that they have been heavily working on Panda 4.0, so Penguin has taken a back seat. But he also said that an engineer came up to him recently and said, “it’s probably time for a Penguin update”. That situation is both positive and scary at the same time.

On the one hand, at least someone is thinking about Penguin on the webspam team! But on the flip side, they clearly haven’t been focusing on Penguin for some time (while many Penguin victims sit waiting for an update). On that note, there are many webmasters who have rectified their unnatural link problems, disavowed domains, urls, etc., and are eagerly awaiting a Penguin update. It’s not exactly fair that Google has been making those business owners wait so long for Penguin to roll out again.

Now, there’s always a possibility that there is a problem with the Penguin algorithm. Let’s face it, there’s no reason it should take so long in between updates. I’m wondering if they are testing Penguin and simply not happy with the results. If that’s the case, then I could see why they would hold off on unleashing a new update (since it could wreak havoc on the web). But that’s just speculation.

In my opinion, it’s not cool to let Penguin victims that have worked hard to fix their link problems sit in Penguin limbo. So either Google is seriously punishing them for the long-term, they have put the algo on the back burner while focusing on other algos like Panda, or Penguin is not up to par right now. Remember, if Google isn’t happy with the results, then they don’t have to push it out. And if that’s the case, Penguin victims could sit in limbo for a long time (even longer than the 9 months they have waited so far). Not good, to say the least.


Important Penguin Update: Google’s John Mueller provided more information about the Penguin algorithm on today’s Webmaster Central Office Hours Hangout.

John was asked if Penguin would be released again or if it was being retired. And if it was being “retired”, would Google at least run it one more time to free those webmasters that had cleaned up their link profiles? John explained that Penguin was not being retired. Let me say that again. He said Penguin is not being retired. John explained that it can sometimes take longer than expected to prepare the algorithm and update the necessary data. He also explained that if Google were to retire an algorithm, then they would “remove it completely” (essentially removing any effect from the algorithm that was in place).

So we have good news on several fronts. Penguin is still alive and well. And if Google did retire the algo, then the effect from Penguin would be removed. Let’s hope another Penguin update rolls out soon.

You can view the video below (starting at 5:16) or you can watch on YouTube -> https://www.youtube.com/watch?v=8r3IIPCHt0E&t=5m16s

 

Question: We’ve been hit by both Panda and Penguin. We don’t have a lot of resources to help with recovery, so which one do we tackle first?

Glenn: I’ve helped a number of companies with Pandeguin problems over the years, and it’s definitely a frustrating situation for business owners. When companies don’t have resources to tackle both situations at the same time, then I’ve always been a big fan of tackling the most acute situation first, which is Penguin.

Pandeguin Hit

Panda is a beast, and has many tentacles. And Penguin is all about unnatural links (based on my analysis of over 400 sites hit by Penguin since April 24, 2012). That’s why I recommend focusing on Penguin first (if you can’t knock out both situations at once). I recommend aggressively tackling unnatural links: remove as many spammy links as you can, and then disavow the remaining ones you can’t get to manually. Then set up a process for monitoring your link profile over time (to ensure new unnatural links don’t pop up).

After which, you can tackle the Panda problem. I would begin with a comprehensive Panda audit, identify the potential problems causing the Panda hit, and aggressively attack the situation (the bamboo). Move quickly and aggressively. Get out of the gray area of Panda (it’s a maddening place to live).

 

Question: Is linkbuilding dead? Should I even focus on building links anymore and how do I go about doing that naturally?

Glenn: Links are not dead! The right links are even more important now. I know there’s a lot of fear and confusion about linkbuilding since Google has waged war on unnatural links, but to me, that makes high quality links even more powerful.

Duane Forrester wrote a post recently on the Bing Webmaster Blog where he explained that if you know where a link is coming from prior to gaining that link, then you are already going down the wrong path. That was a bold statement, but I tend to agree with him.

Duane Forrester Quote About Linkbuilding

I had several conversations about this topic at SES Atlanta. To me, if you build killer content that helps your target audience, that addresses pain points, and teaches users how to accomplish something, then there’s a good chance you’ll build links. It’s not the quantity of links either… it’s the quality. I’d rather see a client build one solid link from a site in their niche versus 1000 junky links. The junky links are Penguin food, while the solid link is gold.

 

Question: I was hit by Panda, but my core competitors have the same problems we do. We followed what they were implementing, and we got hit. Why didn’t they get hit? And moving forward, should we follow others that are doing well SEO-wise?

Glenn: I can’t tell you how many times companies contact me and start showing me competitors that are doing risky things SEO-wise, yet those sites are doing well in Google. They explain that they tried to reproduce what those competitors were doing, and then they ended up getting hit by Panda. That situation reinforces what I’ve told clients for a long time. Competitive analyses can be extremely beneficial for gathering the right intelligence about your competitors, but don’t blindly follow what they are doing. That’s a dangerous road to travel.

Instead, companies should map out a strong SEO strategy based on their own research, expertise, target audience, etc. Ensure you are doing the right things SEO-wise for long-term success. Following other companies blindly is a dangerous thing to do. They could very easily be headed towards SEO disaster and you’ll be following right along.

For example, I had a client always bring up one specific company to me that was pushing the limits SEO-wise (using dark grey hat tactics). Well, they finally got hit during a Panda update in early 2014 and lost a substantial amount of traffic. I sent screenshots to my client which reinforced my philosophy. My client was lucky they didn’t follow that company’s tactics… They would have jumped right off the SEO cliff with them. The screenshot below shows an example of a typical surge in Google before a crash.

Surge in Traffic Before Algo Hit

 

Question: We’ve been working hard on a manual action for unnatural links, but right before filing reconsideration, it expired. What should we do?

Glenn: I’ve seen this happen with several clients I was helping with manual actions. It’s a weird situation for sure. You are working on fixing problems based on receiving a manual action, and right before you file a reconsideration request, the manual action disappears from Google Webmaster Tools. When that happens, is the site ok, do you still need to file a reconsideration request with Google, should you wait, or should you continue working on the manual action?

It’s important to know that manual actions do expire. You can read the article by Marie Haynes for more information about expiring manual actions. Google has confirmed this to be the case (although the length of each manual action is variable). But those manual actions can return if you haven’t tackled the problem thoroughly… So don’t think you’re in the clear so fast.

Expiring Manual Actions

 

That said, if you have tackled the problem thoroughly, then you are probably ok. For example, I was helping a company with a manual action for unnatural links and we had completed the process of removing and disavowing almost all of their unnatural links. We had already written the reconsideration request and were simply waiting on a few webmasters that were supposed to take down more links before filing with Google.

As we were waiting (just a few extra days), the manual action disappeared from Google Webmaster Tools. Since we did a full link cleanup, we simply drove forward with other initiatives. That was months ago and the site is doing great SEO-wise (surging over the past few months).

Just make sure you thoroughly tackle the problem at hand. You don’t want a special surprise in your manual action viewer one day… which would be the return of the penalty. Avoid that situation by thoroughly fixing the problems causing the penalty.

 

Summary – Clarifying Panda and Penguin Confusion
As you can see, there were some outstanding and complex questions asked at SES Atlanta. It confirms what I see every day… that business owners and webmasters are extremely confused with algorithm updates like Panda and Penguin and how to tackle penalties. And when you combine algo updates with manual actions, you have the perfect storm of SEO confusion.

I hope the Q&A above helped answer some questions you might have about Panda, Penguin, and manual actions. And again, there were several more questions asked that I can’t fit into this post! Maybe I’ll tackle those questions in another post. So stay tuned, subscribe to my feed, and keep an eye on my Search Engine Watch column.

And be prepared, I felt a slight chill in the air this past weekend. The next Penguin update could (and should) be arriving soon. Only Google knows, but I hope they unleash the algo update soon. Like I said in my post, there are many webmasters eagerly awaiting another Penguin rollout. Let’s hope it’s sooner than later.

GG

 

Filed Under: algorithm-updates, bing, google, seo

How Bing Pre-Renders Webpages in IE11 and How Marketers Can Use The Pre-Render Tag for CRO Today

November 2, 2013 By Glenn Gabe Leave a Comment

Bing, IE11, and Pre-rendering Webpages

Bing recently announced it is using IE11’s pre-render tag to enhance the user experience on Bing.com.   Pre-rendering enables Bing to automatically download the webpage for the first search result before you visit that page.  Note, this only happens for “popular searches”, and I’ll cover more about that below.  Pre-rendering via Bing means the destination page will load almost instantaneously when you click through the first search result.  Bing explained that over half of users click the first result, and using IE11’s pre-render tag can enhance the user experience by loading the destination page in the background, after the search is conducted.

A Quick Pre-Render Example:
If I search Bing for “Samsung” in IE11, the first result is the U.S. Samsung website.  When clicking through to the website, the first page loads immediately without any delay (including all webpage assets, like images, scripts, etc.)  Checking the Bing search results page reveals that Bing was using pre-render for the Samsung website homepage.  You can see this via the source code.  See the screenshots below.

Search Results and Sitelinks for Samsung

Checking the source code reveals Bing is pre-rendering the U.S. Samsung homepage:

Bing Source Code Pre-Render Tag

 

Yes, Google Has Been Doing This With “Instant Pages”
In case you were wondering, Google has been accomplishing this with “Instant Pages” in Chrome since 2011, but it’s good to see Bing roll out pre-rendering as well.  My guess is you’ve experienced the power of pre-rendering without even realizing it.  When Bing and Google have high confidence that a user will click the first search result, they will use the pre-render tag to load the first result page in the background.  Then upon clicking through, the page instantaneously displays.  That means no waiting for large photos or graphics to load, scripts, etc.  The page is just there.

Testing Bing’s Pre-Render in IE11
Once Bing rolled out pre-render via IE11, I began to test it across my systems.  When it kicked in, the results were impressive.  The first result page loaded as soon as I clicked through.  I was off and running on the page immediately.

But when did Bing actually pre-render the page and why did some search results not spark Bing to pre-render content?   Good questions, and I dug into the search results to find some answers.

Identifying Pre-rendering with Bing and IE11
During my testing, I began to notice a trend.  Pre-rendering was only happening when sitelinks were provided for a given search result.  So, if I searched for “apple ipad”, which Bing does not provide sitelinks for, then pre-rendering was not enabled.  But if I searched for just “Apple”, and Bing did provide sitelinks, then pre-render was enabled.  If I searched for “Acura”, sitelinks were provided for the branded search, and the first result was pre-rendered.

A Bing search for “Acura” yields sitelinks:
Search Results and Sitelinks for Acura

 

Checking the source code reveals Bing is pre-rendering the first search result for “Acura”:
Bing Source Code Pre-Render Tag for Acura

 

A Bing search for “Derek Jeter” does not yield sitelinks:
Bing Search Results for Derek Jeter
Checking the source code reveals Bing is not pre-rendering the first search result for “Derek Jeter”:
Bing Source Code for Derek Jeter Without Pre-render

 

So, Bing clearly needed high confidence that I would click through the first listing in order to use pre-render.  In addition, there was a high correlation between sitelinks and the use of the pre-render tag.  For example, “how to change oil” did not yield pre-rendering, “Derek Jeter” did not trigger pre-rendering, and “weather” did not trigger pre-rendering.  But “Firefox” did trigger sitelinks and the use of pre-render.

How Can You Tell If Pre-Rendering is Taking Place
You need an eagle eye like me to know.  Just kidding.  :)  I simply viewed the source code of the search result page to see if the pre-render tag was present.  When it was, you could clearly see the “url0=” parameter and the value (which was the webpage that was being pre-rendered).  You can see this in the screenshots listed above.

And for Chrome, you could check task manager and see if a page is being pre-rendered.  It’s easy to do and will show you if the page is being pre-rendered and the file size.

Using Chrome’s Task Manager to view Pre-rendered Pages
Using Chrome Task Manager to Check Pre-render

 

How Marketers Can Use Pre-Render On Their Own Websites for CRO Today
Yes, you read that correctly.  You can use pre-render on your own website to pre-load pages when you have high confidence that a user will navigate to that page.  I’m wondering how many Conversion Rate Optimization (CRO) professionals have tried that out!  Talk about speeding up the user experience for prospective customers.

Imagine pre-loading the top product page for a category, the first page of your checkout process, the lead generation form, etc.  Pre-rendering content is supported by Chrome, IE11, and Firefox, so you can actually test this out today.

I’ve run some tests on my own and the pre-rendered pages load in a flash.  But note, Chrome and IE11 support prerender, while Firefox supports prefetch.  That’s important to know if you’re a developer or designer.  Also, I believe you can combine prerender and prefetch in one link tag to support all three browsers, but I need to test it out in order to confirm the combination works.  Regardless, I recommend testing out pre-rendering on your own site and pages to see how it works.

You can analyze visitor paths and determine pages that overwhelmingly lead to other pages.  And when you have high confidence that a first page will lead to a second page, then implement the pre-render tag.  Heck, split test this approach!  Then determine if there was any lift in conversion based on using pre-render to speed up the conversion process.

Analyzing Behavior Flow in Google Analytics to Identify “Connected Pages”:
Analyzing Behavior Flow to Identify Connected Pages

 

An Example of Using Pre-Render
Let’s say you had a killer landing page that leads to several other pages containing supporting content.  One of those pages includes a number of testimonials from customers, and you notice that a high percentage of users click through to that page from the initial landing page.  Based on what I explained earlier, you want to quicken the load time for that second page by using pre-render.  Your hope is that getting users to that page as quickly as possible can help break down a barrier to conversion, and hopefully lead to more sales.

All that you would need to do is to include the following line of code in the head of the first document:

<link rel="prerender" href="http://www.yourdomain.com/some-page-here.htm">

Note, that will work in Chrome and IE11.  If you combine prerender with prefetch, then I believe that will work across Chrome, IE11, and Firefox.
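
For those who want to try the combined approach, the tag would presumably look something like this (again, I haven’t confirmed the combination across all three browsers yet, so treat it as a sketch):

<link rel="prerender prefetch" href="http://www.yourdomain.com/some-page-here.htm">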

When users visit the landing page, the second page will load in the background.  When they click the link to visit the page, that page will display instantaneously.  Awesome.

 

Summary – Pre-Render is Not Just For Search Engines
With the release of IE11, Bing is starting to pre-render pages in the background when it has high confidence you will click the first search result.  And Google has been doing the same with “Instant Pages” since 2011.  Pre-rendering aims to enhance the user experience by displaying pages extremely quickly upon click-through.

But pre-render is not just for search engines.  As I demonstrated above, you can use the technique on your own pages to reduce a barrier to conversion (the speed at which key pages display for users on your website).  You just need to determine which pages users visit most often from other key landing pages, and then implement the pre-render tag.  And you can start today.  Happy pre-rendering.  :)

GG

 

Filed Under: bing, cro, google

Robots.txt and Invisible Characters – How One Hidden Character Could Cause SEO Problems

May 13, 2013 By Glenn Gabe 1 Comment

How syntax errors in robots.txt can cause SEO problems.

If you’ve read some of my blog posts in the past, then you know I perform a lot of SEO technical audits.  As one of the checks during SEO audits, I always analyze a client’s robots.txt file to ensure it’s not blocking important directories or files.  If you’re not familiar with robots.txt, it’s a text file that sits in the root directory of your website and should be used to inform the search engine bots which directories or files they should not crawl.  You can also add autodiscovery for your xml sitemaps (which is a smart directive to add to a robots.txt file).

Anyway, I came across an interesting situation recently that I wanted to share.  My hope is that this post can help some companies avoid a potentially serious SEO issue that was not readily apparent.  Actually, the problem could not be detected by the naked eye.  And when a problem impacts your robots.txt file, the bots won’t follow your instructions.  And when the bots don’t follow instructions, they can potentially be unleashed into content that should never get crawled.  Let’s explore this situation in greater detail.

A sample robots.txt file:

Sample Robots.txt File
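
For illustration, a simple robots.txt along those lines might look like this (the directories and domain are placeholders, not recommendations):

User-agent: *
Disallow: /admin/
Disallow: /scripts/

Sitemap: http://www.yourdomain.com/sitemap.xml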

Technical SEO – Cloaked Danger in a Robots.txt File
During my first check of the robots.txt file, everything looked fine.  There were a number of directories being blocked for all search engines.  Autodiscovery was added, which was great.  All looked good.  Then I checked Google Webmaster Tools to perform some manual checks on various files and directories (based on Google’s “Blocked URLs” functionality).  Unfortunately, there were a number of errors showing within the analysis section.

The first error message started with the User-agent line (the first line in the file).  Googlebot was choking on that line for some reason, but it looked completely fine.  And as you can guess, none of the directives listed in the file were being adhered to.  This meant that potentially thousands of files would be crawled that shouldn’t be crawled, and all because of a problem that was hiding below the surface…  literally.

Blocked URLs reporting in Google Webmaster Tools:

Blocked URLs in Google Webmaster Tools

 

Word Processors and Hidden Characters
So I started checking several robots.txt tools to see what they would return.  Again, the file looked completely fine to me.  The first few checks returned errors, but wouldn’t explain exactly what was wrong.  And then I came across one that revealed more information.  The tool revealed an extra character (hidden character) at the beginning of the robots.txt file.  This hidden character was throwing off the format of the file, and the bots were choking on it.  And based on the robots syntax being thrown off, the bots wouldn’t follow the instructions.  Not good.

Invisible Character in Robots.txt

I immediately sent this off to my client and their dev team tracked down the hidden character, and created a new robots.txt file.  The new file was uploaded pretty quickly (within a few hours).  And all checks are fine now.  The bots are also adhering to the directives included in robots.txt.
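
As a side note, the usual suspect when an invisible character sits at the start of a file is a UTF-8 byte order mark (BOM), three hidden bytes (EF BB BF) that some editors silently add when saving. I can’t say every case traces back to a BOM, but it’s an easy thing to check from the command line:

head -c 3 robots.txt | hexdump -C

If the output starts with “ef bb bf”, the file has a BOM and should be re-saved without one.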

 

The SEO Problems This Scenario Raises
I think this simple example underscores the fact that there’s not a lot of room for error with technical SEO… it must be precise.  In this case, one hidden character in a robots.txt file unleashed the bots on a lot of content that should never be crawled.  Sure, there are other mechanisms to make sure content doesn’t get indexed, like the proper use of the meta robots tag, but that’s for another post.  For my client, a robots.txt file was created, it looked completely fine, but one character was off (and it was hidden).  And that one character forced the bots to choke on the file.

 

How To Avoid Robots.txt Formatting Issues
I think one person at my client’s company summed up this situation perfectly when she said, “it seems you have little room for error, SEO seems so delicate”.  Yes, she’s right (with technical SEO).  Below, I’m going to list some simple things you can do to avoid this scenario.   If you follow these steps, you could avoid faulty robots.txt files that seem accurate to the naked eye.

1. Text Editors
Always use a text editor when creating your robots.txt file.  Don’t use a word processing application like Microsoft Word.  A text editor is meant to create raw text files, and it won’t throw extra characters into your file by accident.

2. Double and Triple Check Your robots.txt Directives
Make sure each directive does exactly what you think it will do.  If you aren’t 100% sure you know, then ask for help.  Don’t upload a robots.txt file that could potentially block a bunch of important content (or vice versa).

3. Test Your robots.txt File in Google Webmaster Tools and Via Third Party Tools
Make sure the syntax of your robots.txt file is correct and that it’s blocking the directories and files you want it to.  Note, Google Webmaster Tools enables you to copy and paste a new robots file into a form and test it out.  I highly recommend you do this BEFORE uploading a new file to your site.

4. Monitor Google Webmaster Tools “Blocked URLs” Reporting
The blocked urls functionality will reveal problems associated with your robots.txt file under the “analysis” section.  Remember, this is where I picked up the problem covered in this post.

 

Extra Characters in Robots.txt – Cloaked in Danger
There you have it.  One hidden character bombed a robots.txt file.  The problem was hidden to the naked eye, but the bots were choking on it.  And depending on your specific site, that one character could have led to thousands of pages getting crawled that shouldn’t be.  I hope this post helped you understand that your robots.txt format and syntax are extremely important, that you should double and triple check your file, and that you can test and monitor that file over time.  If the wrong file is uploaded to your website, bad things can happen.  Avoid this scenario.

GG

 

Filed Under: bing, google, seo, tools
