Archives for June 2015

How To Identify and Avoid Technical SEO Optical Illusions

June 17, 2015 By Glenn Gabe


Without a clean and crawlable website structure, you’re dead in the water SEO-wise. If you don’t have a solid SEO foundation, you can end up creating serious obstacles for both users and search engines. And that’s never a good idea. Even if you have a clean and crawlable structure, problems with various SEO directives can throw a wrench into the situation. And those problems can lie beneath the surface just waiting to kill your SEO efforts. That’s one of the reasons I’ve always believed that a thorough technical audit is one of the most powerful deliverables in all of SEO.

The Power of Technical SEO Audits: Crawls + Manual Audits = Win
“What lies beneath” can be scary. Really scary… The reality for SEO is that what looks fine on the surface may have serious flaws. And finding those hidden problems and rectifying them as quickly as possible can help turn a site around SEO-wise.

When performing an SEO audit, it’s incredibly important to manually dig through a site to see what’s going on. That’s a given. But it’s also important to crawl the site to pick up potential land mines. In my opinion, the combination of both a manual audit and extensive crawl analysis can help you uncover problems that may be inhibiting the performance of the site SEO-wise. And both might help you surface dangerous optical illusions, which is the core topic of my post today.

Uncovering Optical SEO Illusions
Optical illusions can be fun to check out, but they aren’t so entertaining when they negatively impact your business. When your eyes play tricks on you and your website takes a Google hit due to that illusion, the fun stops quickly.

The word “technical” in technical SEO is important to highlight. If your code is even one character off, it could have a big impact on your site SEO-wise. For example, if you implement the meta robots tag on a site with 500,000 pages, the wrong directives could wreak havoc on your site. Or maybe you are providing urls in multiple languages using hreflang, and those alternate versions add 30,000 urls to your site. You would definitely want to make sure those hreflang tags are set up correctly.

But what if you thought those directives and tags were set up perfectly when in fact, they aren’t set up correctly? They look right at first glance, but there’s something just not right…

That’s the focus of this post today, and it can happen more easily than you think. I’ll walk through several examples of SEO optical illusions, and then explain how to avoid or pick up on those illusions.

Abracadabra, let’s begin. :)

Three Examples of Technical SEO Optical Illusions
First, take a quick look at this code:

[Image: Technical SEO problem with hreflang]

Did you catch the problem? The code uses “alternative” versus “alternate”. And that was on a site with 2.3M pages indexed, many of which had hreflang implemented pointing to various language pages.

[Image: Hreflang using “alternative” vs. “alternate”]
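
Since the screenshot isn’t reproduced here, here’s an illustrative reconstruction of that mistake (the urls and language values are placeholders, not from the actual site):

  <!-- Flawed: "alternative" is not a valid rel value, so the hreflang annotation gets ignored -->
  <link rel="alternative" hreflang="es" href="https://www.example.com/es/" />

  <!-- Correct -->
  <link rel="alternate" hreflang="es" href="https://www.example.com/es/" />

One extra syllable looks close enough to slip past a quick visual check, which is exactly the kind of illusion this post is about.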

Now take a look at this code:

[Image: Technical SEO problem with rel canonical]

All looks ok, right? At first glance you might miss it, but the code uses “content” versus “href”. If rolled out to a website, it means rel canonical won’t be set up correctly for any pages using the flawed directive. And on sites where rel canonical is extremely important, like sites with urls resolving multiple ways, this can be a huge problem.

[Image: Technical SEO problem with rel canonical]
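
For illustration (the url is a placeholder), here is the flawed canonical tag versus the correct one:

  <!-- Flawed: the url sits in a "content" attribute, so there is no canonical url for search engines to honor -->
  <link rel="canonical" content="https://www.example.com/category/widgets/" />

  <!-- Correct: rel canonical uses the "href" attribute -->
  <link rel="canonical" href="https://www.example.com/category/widgets/" />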

Now how about this one?
[Image: Technical SEO problem with meta robots]

OK, so you are probably getting better at this already. The correct value should be “noindex” and not “no index”. So if you thought you were keeping those 75,000 pages out of Google’s index, you were wrong. Not a good thing to happen while Pandas and Phantoms roam the web.

[Image: Meta robots problem using “no index” vs. “noindex”]
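
Again for illustration (the directive value is the point, not the rest of the tag):

  <!-- Flawed: "no index" with a space is not a recognized directive, so the page remains indexable -->
  <meta name="robots" content="no index" />

  <!-- Correct -->
  <meta name="robots" content="noindex" />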

I think you get the point.

How To Avoid Falling Victim To Optical Illusions?
As mentioned earlier, using an approach that leverages manual audits, site-wide crawls, and then surgical crawls (when needed) can help you nip problems in the bud. And leveraging reporting in Google Search Console (formerly Google Webmaster Tools) is obviously a smart way to proceed as well. Below, I’ll cover several things you can do to identify SEO optical illusions while auditing a site.

SEO Plugins
From a manual audit standpoint, using plugins like MozBar, SEO Site Tools, and others can help you quickly identify key elements on the page. For example, you can easily check rel canonical and the meta robots tag via both plugins.

[Image: Using MozBar to identify technical SEO problems]

Crawlers
From a crawl perspective, you can use DeepCrawl for larger crawls and Screaming Frog for small to medium size crawls. I often use both DeepCrawl and Screaming Frog on the same site (using “The Frog” for surgical crawls once I identify issues through manual audits or the site-wide crawl).

Each tool provides data about key technical SEO components like rel canonical, meta robots, rel next/prev, and hreflang. Note, DeepCrawl has built-in support for checking hreflang, while Screaming Frog requires a custom search.

[Image: Using DeepCrawl to identify technical SEO problems]

Once the crawl is completed, you can double-check the technical implementation of each directive by comparing what you are seeing during the manual audit to the crawl data you have collected. It’s a great way to ensure each element is ok and won’t cause serious problems SEO-wise. And that’s especially the case on larger-scale websites that may have thousands, hundreds of thousands, or millions of pages.

Google Search Console Reports
I mentioned earlier that Google Search Console reports can help identify and avoid optical illusions. Below, I’ll touch on several reports that are important from a technical SEO standpoint.

Index Status
Using Index Status, you can identify how many pages Google has indexed for the site at hand. And by the way, this can be checked at the directory level (which is a smart way to go). Index Status reporting will not identify specific directives or technical problems, but it can help you understand if Google is over- or under-indexing your site content.

For example, if you have 100,000 pages on your site, but Google has indexed just 35,000, then you probably have an issue…

[Image: Using Index Status in Google Search Console to identify indexation problems]

International Targeting
Using the international targeting reporting, you can troubleshoot hreflang implementations. The reporting will identify hreflang errors on specific pages of your site. Hreflang is a confusing topic for many webmasters and the reporting in GSC can get you moving in the right direction troubleshooting-wise.

[Image: Using International Targeting reporting in GSC to troubleshoot hreflang]

Fetch as Google

Using Fetch as Google, you can see exactly what Googlebot is crawling and the response code it is receiving. This includes viewing the meta robots tag, rel canonical tags, rel next/prev, and hreflang tags. You can also use fetch and render to see how Googlebot is rendering the page (and compare that to what users are seeing).

[Image: Using Fetch as Google to troubleshoot technical SEO problems]
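
When you review a fetch, you’re essentially scanning the returned HTML for the directives covered above. Here’s a trimmed, hypothetical head section showing the kinds of tags you’d want to verify (the urls and values are illustrative):

  <head>
    <!-- A stray noindex like this is exactly what a fetch can surface -->
    <meta name="robots" content="noindex" />
    <link rel="canonical" href="https://www.example.com/category/widgets/" />
    <link rel="alternate" hreflang="es" href="https://www.example.com/es/widgets/" />
  </head>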

Robots.txt and Blocked Resources
The new robots.txt Tester in Google Search Console enables you to test your current set of robots.txt directives against your actual urls (to see what’s blocked and what’s allowed). You can also use the Tester as a sandbox to change directives and retest urls. It’s a great way to identify current problems with your robots.txt file and see if future changes will cause issues.

[Image: Using the robots.txt Tester to troubleshoot technical SEO problems]
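
To tie this back to optical illusions: a single character in robots.txt can block far more than intended. Here’s a quick hypothetical example you could paste into the Tester (the paths are made up):

  # Flawed: a bare slash blocks every url on the site
  User-agent: *
  Disallow: /

  # Intended: only block internal search result urls
  User-agent: *
  Disallow: /search/

Testing a url like https://www.example.com/widgets/ against each version shows the difference immediately (blocked in the first case, allowed in the second).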

Summary – Don’t Let Optical Illusions Trick You, and Google…
If there’s one thing you take away from this post, it’s that technical SEO problems can be easy to miss. Your eyes can absolutely play tricks on you when directives are even just a few characters off in your code. And those flawed directives can cause serious problems SEO-wise if not caught and corrected.

The good news is that you can start checking your own site today. Using the techniques and reports I listed above, you can dig through your own site to ensure all is coded properly. So keep your eyes peeled, and catch those illusions before they cause any damage. Good luck.

GG

 

Filed Under: google, seo, tools

More Findings From Google’s Phantom 2 Update (“The Quality Update”) – Panda Overlap, Long-Term Approach, URL Tinkering, and More

June 5, 2015 By Glenn Gabe

 


{Update November 2015: Google rolled out a significant algorithm update on November 19, 2015 that had a strong connection to Phantom 2 from May 2015. Many sites that were impacted during Phantom 2 in May were also impacted on November 19 during Phantom 3. And a number of companies working to rectify problems saw full or partial recovery. You can learn more about Google’s Phantom 3 Update in my post covering a range of findings.}

My Google Richter Scale was insanely active the week of April 27, 2015. That’s when Google rolled out a change to its core ranking algorithm to better assess quality signals. I called the update Phantom 2, based on the mysterious nature of the update (and since Google would not confirm it at the time). It reminded me a lot of the original Phantom update, which rolled out on May 8, 2013. Both were focused on content quality and both had a big impact on websites across the web.

I heavily analyzed Phantom 2 soon after the update rolled out, and during my analysis, I saw many sites swing 10-20% either up or down, while also seeing some huge hits and major surges. One site I analyzed lost ~80% of its Google organic traffic overnight, while another surged by 375%. And no, I haven’t seen any crazy Phantom 2 recoveries yet (which I never thought would happen so quickly). More about that soon.

[Image: Phantom 2 drop sustained]

Also, based on my first post, I was interviewed by CNBC about the update. Between my original post and the CNBC article, the response across the web was amazing. The CNBC article has now been shared over 4,500 times, which confirms the level of impact Phantom had across the web. Just like Phantom 1 in 2013, Phantom 2 was significant.

Phantom and Panda Overlap
While analyzing the impact of Phantom 2, it wasn’t long before I could clearly see that the update heavily targeted content quality. I was coming across serious low quality content, user engagement problems, advertising issues, etc. Many (including myself) initially thought it was a Panda update based on seeing the heavy targeting of content quality problems.

And that made sense timing-wise, since the last Panda update had rolled out over seven months earlier (10/24/14). Many have been eagerly awaiting an update or refresh since they have been working heavily to fix Panda-related problems. It’s not cool that we’ve gone over seven months without one when Panda used to roll out frequently (usually about once per month).

But we have good news out of SMX Advanced. Gary Illyes explained during his AMA with Danny Sullivan that the next Panda refresh would be within the next 2-4 weeks. That’s excellent news for many webmasters that have been working hard on Panda remediation. So definitely follow me on Twitter since I’ll be covering the next Panda refresh/update in great detail. Now back to Phantom.

After Phantom rolled out, Google denied it was Panda (or Penguin) and simply said it was a “normal update”. And then a few weeks later, they explained a little more about our ghostly friend. Gary Illyes said it was a change to Google’s core ranking algorithm in how it assesses “quality signals”. Ah, that made a lot of sense based on what I was seeing…

So, it wasn’t Panda, but it focused on quality. And as I’ve said many times before, “content quality” can mean several things. You have thin content, low quality affiliate content, poor user experience, advertising obstacles, scraped content, and more. Almost every problem I came across during my Phantom analysis would have been something I would have targeted from a Panda standpoint. So both Phantom and Panda seem to chew on similar problems. More about this soon.

[Image: The ultimate Panda and Phantom cocktail]

Phantom Recovery and A Long-Term Approach
So, Panda targets low quality content and can drag an entire site down in the search results (it’s a domain-level demotion). And now we have Phantom, which changed how Google’s core ranking algorithm assesses “quality signals”. The change can boost urls with higher quality content (which of course means urls with lower quality content can drop).

It’s not necessarily a filter like Panda, but can act the same way for low quality content on your site. Google says it’s a page-level algorithm, so it presumably won’t drag an entire domain down. The jury is still out on that… In other words, if you have a lot of low quality content, it can sure feel like a domain-level demotion. For example, I’ve seen some websites get obliterated by the Phantom update, losing 70%+ of their traffic starting the week of April 27th. Go tell the owner of that website that Phantom isn’t domain-level. :)

[Image: Huge Phantom 2 hit]

URL Tinkering – Bad Idea
So, since Phantom is page-level, many started thinking they could tinker with a page and recover the next time Google crawled the url. I never thought that could happen, and I still don’t. I believe it’s way more complex than that… In addition, that approach could lead to a lot of spinning wheels SEO-wise. Imagine webmasters tinkering endlessly with a url trying to bounce back in the search results. That approach can drive most webmasters to the brink of insanity.

I believe it’s a complex algorithm that also takes other factors into account (like user engagement and some domain-level aspects). For example, there’s a reason that some sites can post new content, have that content crawled and indexed in minutes, and even start ranking for competitive keywords quickly. That’s because they have built up trust in Google’s eyes. And that’s not page-level… it’s domain-level. And from an engagement standpoint, Google cannot remeasure engagement quickly. It needs time, users hitting the page, understanding dwell time, etc. before it can make a decision about recovery.

It’s not binary like the mobile-friendly algorithm. Phantom is more complex than that (in my opinion).

John Mueller’s Phantom Advice – Take a Long-Term Approach To Increasing Quality
Back to “url tinkering” for a minute. Back when Phantom 1 hit in May of 2013, I helped a number of companies deal with the aftermath. Some had lost 60%+ of their Google organic traffic overnight. My approach was very similar to Panda remediation. I heavily analyzed the site from a content quality standpoint and then produced a serious remediation plan for tackling those problems.

I didn’t take a short-term approach, I didn’t believe it was page-level, and I made sure any changes clients would implement would be the best changes for the long-term success of the site. And that worked. A number of those companies recovered from Phantom within a six month period, with some recovering within four months. I am taking the same approach with Phantom 2.

And Google’s John Mueller feels the same way. He was asked about Phantom in a recent webmaster hangout video and explained a few key points. First, he said there is no magic bullet for recovery from a core ranking change like Phantom. He also said to focus on increasing the quality on your site over the long-term. And that one sentence has two key points. John used “site” and not “page”. And he also said “long-term”.

So, if you are tinkering with urls, fixing minor tech issues on a page, etc., then you might drive yourself insane trying to recover from Phantom. Instead, I would heavily tackle major content quality and engagement problems on the site. Make big improvements content-wise, improve the user experience on the site, decrease aggressive advertising tactics, etc. That’s how you can exorcise Phantom.

More Phantom Findings
Over the past month, I’ve had the opportunity to dig into a number of Phantom hits. Those hits range from moderate drops to severe drops. And I’ve analyzed some sites that absolutely surged after the 4/29 update. I already shared several findings from a content quality standpoint in my first post about Phantom 2, but I wanted to share more now that I have additional data.

Just like with Panda, “low quality content” can mean several things. There is never just one type of quality problem on a website. It’s usually a combination of problems that yields a drop. Here are just a few more problems I have come across while analyzing sites negatively impacted by Phantom 2. Note, this is not a full list, but just additional examples of what “low quality content” can look like.

Thin Content and Technical Problems = Ghostly Impact
An ecommerce site that was heavily impacted by Phantom reached out to me for help. Once I dug into the site, the content quality problems were clear. First, the site had a boatload of thin content. Pages were extremely visual with no supporting content. The only additional content was triggered via a tab (and there wasn’t much added to the page once triggered). The site has about 25-30K pages indexed.

The site also used infinite scroll to display the category content. If set up properly SEO-wise, this isn’t a huge deal (although I typically recommend not to use infinite scroll). But this setup had problems. When I dug into Google’s cache to see how the page looked, the page would continually refresh every second or so. Clearly there was some type of issue with how the pages were coded. In addition, when checking the text-only version, the little content that was provided on the page wasn’t even showing up. Thin content became even thinner…

So, you had extremely thin content that was being cut down to no content due to how the site was coded. And this problem was present across many urls on the site.

[Image: Ecommerce site with Phantom problems]
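
For reference, the common recommendation at the time was to pair infinite scroll with crawlable, paginated component pages that contain the same content, connected via rel next/prev. A rough sketch (the urls are placeholders, not from the site I reviewed):

  <!-- In the head of /category/page/2/, a paginated component page behind the infinite scroll -->
  <link rel="prev" href="https://www.example.com/category/page/1/" />
  <link rel="next" href="https://www.example.com/category/page/3/" />

That way, even if the JavaScript-driven scroll misbehaves (or the cached page refreshes endlessly, as it did here), search engines still have standard urls with the full content to crawl and index.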

Directory With Search Results Indexed + Followed Links
Another Phantom hit involved a large directory and forum. There are hundreds of thousands of pages indexed and many traditional directory and forum problems are present. For example, the directory listings were thin, there was a serious advertising issue across those pages (including the heavy blending of ads with content), and search results were indexed.

In addition, and this was a big problem, all of the directory links were followed. Since those links are added by business owners, and aren’t natural, they should absolutely be nofollowed. I estimate that there are ~80-90K listings in the directory, and all have followed links to external websites that have set up the listings. Not good.

An example of a low quality directory page with followed links to businesses:

[Image: Directory with Phantom 2 problems]
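
To make the fix concrete, nofollowing those listing links is a one-attribute change (the url and anchor text are placeholders):

  <!-- Followed link added by the business owner (passes PageRank) -->
  <a href="https://www.example-business.com/">Joe's Plumbing</a>

  <!-- Nofollowed version, which is what self-submitted listing links should use -->
  <a href="https://www.example-business.com/" rel="nofollow">Joe's Plumbing</a>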

Rogue Low Quality Content Not Removed During Initial Panda Work
A business owner who had been heavily impacted by Panda and Penguin in the past reached out to me. They had worked hard to revamp the site, fix content quality problems, etc., and ended up recovering (and surging) during Panda 4.1. But they were negatively impacted by Phantom (not a huge drop, but about 10%).

Quickly checking the site after Phantom 2 rolled out revealed some strange legacy problems (from the pre-recovery days). For example, I found some horrible low quality content that contained followed links to a number of third party websites. The page was part of a strategy employed by a previous SEO company (years ago). It’s a great example of rogue low quality content that can sit on a site and cause problems down the line.

Note, I’m not saying that one piece of content is going to cause massive problems, but it surely doesn’t help. Beyond that, there were still usability problems on the site, mobile problems, pockets of thin content, and unnatural-looking exact match anchor text links weaved into certain pages. All of this together could be causing Phantom to haunt the site.

[Image: Low quality content with followed links]

No, Phantoms Don’t Eat Tracking Codes
I’ll be quick with this last one, but I think it’s important to highlight. I received an email from a business owner who claimed their site dropped 90%+ after Phantom rolled out. That’s a huge Phantom hit, so I was eager to review the situation.

Quickly checking third party tools revealed no drop at all. All keywords that were leading to the site pre-Phantom 2 were still ranking well. Trending was strong as well. And checking the site didn’t reveal any crazy content quality problems either. Very interesting…

[Image: Phantom 2 no impact]

So I reached out to the business owner and asked if it was just Google organic traffic that dropped, or if they were seeing drops across all traffic sources. I explained I wasn’t seeing any drops via third party tools, I wasn’t seeing any crazy content quality problems, etc. The business owner quickly got back to me and said it was a tracking code issue! Google Analytics wasn’t firing, so it looked like there was a serious drop in traffic. Bullet avoided.

Important note: When you believe you’ve been hit by an algo update, don’t simply rely on Google Analytics. Check Google Search Console (formerly Google Webmaster Tools), third party tools, etc. Make sure it’s not a tracking code issue before you freak out.

Quality Boost – URLs That Jumped
Since Google explained that Phantom 2 is actually a change to its core ranking algorithm in how it assesses quality, it’s important to understand where some pages fall flat, as well as where other pages excel. For example, some pages will score higher, while others will score lower. I’ve spent a lot of time analyzing the problems Phantom victims have content quality-wise, but I also wanted to dig into the urls that jumped ahead rankings-wise. My goal was to see if the Quality Update truly surfaced higher quality pages based on the query at hand.

Note, I plan to write more about this soon, and will not cover it extensively in this post. But I did think it was important to start looking at the urls that surged in greater detail. I began by analyzing a number of queries where Phantom victim urls dropped in rankings. Then I dug into which urls used to rank on page one and two, and which ones jumped up the rankings.

A Mixed Bag of Quality
Depending on the query, there were times when the urls that jumped up were higher quality, covered the subject in greater detail, and provided an overall better user experience. For example, a search for an in-depth resource on a specific subject yielded some new urls in the top ten that provided a rich amount of information, were organized well, and had no barriers from a user engagement standpoint. It was a good example of higher quality content overtaking lower quality, thinner content.

Questions and Forums
For some queries I analyzed that were questions, Google seemed to be providing forum urls that contained strong responses from people that understood the subject matter well. I’m not saying all forums shot up the rankings, but I analyzed a number of queries that yielded high quality forum urls (with a lot of good answers from knowledgeable people). And that’s interesting since many forums have had a hard time with Panda over the years.

Navigational Queries and Relevant Information
I saw some navigational queries yield urls that provided more thorough information than just profile data. For example, I saw some queries where local directories all dropped to page two and beyond, while urls containing richer content surfaced on page one.

Local Example
From a pure local standpoint (someone searching for a local business), I saw some ultra-thin local listings drop, while other listings with richer information increased in rank. For example, pages with just a thumbnail and business name dropped, while other local listings with store locations, images, company background information, hours, reviews, etc. hit page one. Note, these examples do not represent the entire category… They are simply examples based on Phantom victims I am helping now.

In Some Cases, The “Lower Quality Update”?
The examples listed above show higher quality urls rising in the ranks, but that wasn’t always the case. I came across several queries where some of the top listings yielded lower quality pages. They did not cover the subject matter in detail, had popups immediately on load, weren’t organized particularly well, etc.

Now, every algorithm will contain problems, yield some inconsistent results, etc., but I just found it ironic that the “quality update” sometimes surfaced lower quality urls on page one. Again, I plan to dig deeper into the “quality boost” from Phantom 2 in future posts, so stay tuned.

Next Steps with Phantom:
As mentioned earlier, I recommend taking a long-term approach to Phantom remediation. You need to identify and then fix problems riddling your sites from a content quality and engagement standpoint. Don’t tinker with urls. Fix big problems. And if Phantom 2 is similar to Phantom 1 from 2013, then that’s exactly what you need to focus on.

Here is what I recommend:

  • Have a thorough audit conducted through a quality lens. This is not necessarily a full-blown SEO audit. Instead, it’s an audit focused on identifying content quality problems, engagement issues, and other pieces of bamboo that both Panda and Phantom like to chew on.
  • Take the remediation plan and run with it. Don’t put band-aids on your website, and don’t implement 20% of the changes. Fix as many quality problems as you can, and as quickly as you can. Not only will Google need to recrawl all of the changes, but I’m confident that it will need to remeasure user engagement (similar to Panda). This is one reason I don’t think you can bounce back immediately from a Phantom hit.
  • Have humans provide feedback. I’ve brought this up before in previous Panda posts, and it can absolutely help with Phantom too. Ask unbiased users to complete an action (or set of actions) on your site and let them loose. Have them answer questions about their visit, obstacles they came across, things they hated, things they loved, etc. You might be surprised by what they say. And don’t forget about mobile… have them go through the site on their mobile phones too.
  • Continue producing high quality content on your site. Do not stop the process of publishing killer content that can help build links, social shares, brand mentions, etc. Remember that “quality” can be represented in several ways. Similar to Panda victims, you must keep driving forward like you aren’t being impacted.
  • What’s good for Phantom should be good for Panda. Since there seems to be heavy overlap between Phantom and Panda, many of the changes you implement should help you on both levels.

 

Summary – Exorcise Phantom and Exile Panda
If you take one point away from this post, I hope you maintain a long-term view of increasing quality on your website. If you’ve been hit by Phantom, then don’t tinker with urls and expect immediate recovery. Instead, thoroughly audit your site from a quality standpoint, make serious changes to your content, enhance the user experience, and set realistic expectations with regard to recovery. That’s how you can exorcise Phantom and exile Panda.

Now go ahead and get started. Phantoms won’t show themselves the door. You need to open it for them. :)

GG

 

Filed Under: algorithm-updates, google, seo
