UTF-8 BOM and SEO: How to find, clean, and fix an invisible character in your robots.txt file

October 2, 2016 | By Glenn Gabe

I’ve written in the past about how a robots.txt file can look fine, but actually not be fine. For example, maybe you add your directives and a sitemap file (or sitemap index file), and then upload the file to your site. You think all is good, but then you find that your directives are not being adhered to and a boatload of urls are being crawled that shouldn’t be. When that happens, it can have a big impact on SEO (especially on large-scale sites with many urls that should never be crawled).

So why does this sinister robots.txt problem happen? It often comes down to a single character. Literally. Sure, it’s an invisible character, but a character nonetheless. It’s called the UTF-8 BOM, and I’ll explain more about it in this post. Unfortunately, I’ve come across this issue many times during audits and while helping companies with technical SEO. It’s sinister, since it’s invisible. But the results are extremely visible (and can be alarming).

Below, I’ll cover what UTF-8 BOM is, how it can impact your robots.txt file, how to check for it, and then how to fix the problem. So, if your robots.txt file is bombing, and you are still scratching your head wondering what’s going on, then this post is for you.

What is UTF-8 BOM?
BOM stands for byte order mark and it’s used to indicate the byte order for a text stream. It’s an invisible character that’s located at the start of a file (and it’s essentially meaningless from an SEO perspective). Some programs will add the BOM to a text file, which again, can remain invisible to the person creating the text file. And the BOM can cause serious problems when Google tries to read the file. Actually, the UTF-8 BOM can make your robots.txt file, well, bomb… Sorry for the play on words here, but I couldn’t resist. :)
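
For reference, the UTF-8 BOM is simply the three-byte sequence EF BB BF (the character U+FEFF encoded as UTF-8) sitting at the very start of the file. If you want to peek under the hood yourself, here’s a minimal Python sketch (not part of any tool mentioned in this post, and the file path is just an example) that checks whether a local robots.txt file begins with those bytes:

```python
# Minimal sketch: check whether a local robots.txt begins with the UTF-8 BOM.
# The UTF-8 BOM is the three-byte sequence EF BB BF (U+FEFF encoded as UTF-8).

BOM = b"\xef\xbb\xbf"

with open("robots.txt", "rb") as f:  # read raw bytes, not decoded text
    first_bytes = f.read(3)

if first_bytes == BOM:
    print("UTF-8 BOM found at the start of robots.txt")
else:
    print("No UTF-8 BOM detected")
```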

What can happen to a robots.txt file when UTF-8 BOM is present?
As mentioned above, when your robots.txt file contains the UTF-8 BOM, Google can choke on the file. That means the first line (often the user-agent line) will be ignored. And when there’s no user-agent, all of the other lines (all of your directives) will return as errors. And when they are seen as errors, Google will ignore them. If you’re trying to disallow key areas of your site, that could end up as a huge SEO problem.

For example, here’s what a robots.txt file looks like in GSC when it contains UTF-8 BOM:
[Screenshot: UTF-8 BOM errors in the robots.txt Tester in GSC]

Needless to say, all of the directories that should be disallowed are not being disallowed. And that means many urls that shouldn’t be crawled are being crawled (and many are being indexed). This can lead to all sorts of nasty SEO problems. And that could include quality problems, as well, depending on what is being crawled and indexed.
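
To make the mechanics concrete, here’s a tiny Python illustration (a simplified sketch, not Google’s actual parser) of why that first line gets ignored: once the BOM is decoded as text, the line starts with the invisible character U+FEFF instead of “User-agent”, so a parser that matches the field name literally won’t recognize the directive:

```python
# Illustration only: after the BOM is decoded as text it becomes the invisible
# character U+FEFF, so the first line no longer starts with "User-agent".
line_clean = "User-agent: *"
line_with_bom = "\ufeffUser-agent: *"  # what a BOM-prefixed first line looks like

print(line_clean.lower().startswith("user-agent"))     # True
print(line_with_bom.lower().startswith("user-agent"))  # False -> directive ignored
```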

How to identify UTF-8 BOM:
Right now, you might be sweating a little. Maybe you’ve seen problems with your robots.txt file and subsequent indexation, and you’re now wondering if UTF-8 BOM is the problem. Don’t worry, I’ll quickly walk you through how to check your robots.txt file now.

1. First, fire up GSC and use the robots.txt Tester. When you view the report, does it look like the screenshot above? Is the first line showing a red X next to it? If so, hover over the X and you might see a hint that says, “Syntax not understood”. That’s a good sign you’ve got the UTF-8 BOM situation I’ve been explaining.

[Screenshot: Syntax error when testing a robots.txt file in GSC]

2. Next, visit the W3C Internationalization Checker. This tool enables you to upload your robots.txt file and check for the presence of the UTF-8 BOM.

3. Next, click the tab labeled “By File Upload”:
[Screenshot: The “By File Upload” tab in the W3C Internationalization Checker]

4. Next, click “Choose File” and select your robots.txt file. Then click the “Check” button:
[Screenshot: Choosing a robots.txt file to check for the UTF-8 BOM]

The tool will return the results, which will include a line about UTF-8 BOM. If you see that in the results, you know what the problem is (which is great). That’s the smoking gun.
[Screenshot: W3C checker results showing the UTF-8 BOM]
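
By the way, if you’d rather check programmatically (or just want a second opinion), here’s a small Python sketch that fetches a live robots.txt file and inspects its first three bytes. The URL below is just a placeholder, so swap in your own domain:

```python
# A quick programmatic alternative to the W3C checker (a sketch, not an official tool).
# Fetch the live robots.txt and inspect its first three raw bytes.
from urllib.request import urlopen

BOM = b"\xef\xbb\xbf"
url = "https://www.example.com/robots.txt"  # placeholder URL; use your own site

with urlopen(url) as response:
    first_bytes = response.read(3)

print("UTF-8 BOM present" if first_bytes == BOM else "No UTF-8 BOM detected")
```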

How to fix UTF-8 BOM in your robots.txt file:
Fixing the issue is pretty easy. I recommend using a text editor like Textpad to create your new robots.txt file. When saving the file, ensure that BOM is not selected (some text editor programs have an option for adding the BOM).

[Screenshot: Checking the UTF-8 BOM option when saving a file in a text editor]

Also, make sure you’re not using a word processing application like Microsoft Word for creating your robots.txt file. I’ve seen that cause problems too. You should be using a pure text editor for creating your robots.txt file, .htaccess file, xml sitemaps, etc. I’m a big Textpad fan, but there are many others you can use as well.
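
And if you’re comfortable running a quick script, you could also strip the BOM programmatically instead of re-saving the file. Here’s a minimal Python sketch (the file path is just an example):

```python
# Minimal sketch: strip a leading UTF-8 BOM from robots.txt and write the
# cleaned bytes back out. Everything after the BOM is left untouched.

BOM = b"\xef\xbb\xbf"

with open("robots.txt", "rb") as f:
    data = f.read()

if data.startswith(BOM):
    with open("robots.txt", "wb") as f:
        f.write(data[len(BOM):])  # drop the first three bytes, keep the rest
    print("BOM removed from robots.txt")
else:
    print("No BOM found; file left unchanged")
```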

Once you make the changes, use the W3C Internationalization Checker again to check the revised file. If the BOM doesn’t show up, you’re good to go. If it’s still there, you are doing something wrong while creating the robots.txt file. Go back and start over using a pure text editor.

After you’ve fixed the problem, head back to the robots.txt Tester in GSC. The tool enables you to submit a request for Google to retrieve your latest robots.txt file (after you upload the new one).

[Screenshot: Submitting a new robots.txt file to Google via the robots.txt Tester in GSC]

If you’ve done everything correctly, the errors should be removed from the robots.txt Tester and your directives will now work (blocking directories and files that should not be crawled).

Side note: Blocking does not mean “remove from index” (usually):
If you’ve been experiencing robots.txt issues due to the UTF-8 BOM problem I’ve covered here, your work might not be done. If many pages have been indexed, then just blocking via robots.txt will not remove those pages from the index. Over time, Google can remove them, but I would try and get those undesirable urls out of Google’s index quickly.

For example, you could add the meta robots tag using “noindex” and submit an xml sitemap to Google that contains all of the urls you want deindexed. Then once you’re sure those urls are deindexed, you could block those directories again via robots.txt. But remember, if you add noindex to pages that are still being blocked via robots.txt, Google will never be able to crawl them and see the meta robots tag… So remove the disallow rules first, let Google recrawl those pages and pick up the noindex, and then you can start blocking the files again. That’s a confusing subject for many in SEO, but it’s a really important one.

Summary – Don’t Let UTF-8 BOM Turn Into An SEO Bomb
There are several hidden and sinister problems that can rear their ugly heads in SEO. The UTF-8 BOM is one of them. If your robots.txt file is not working as expected, throwing errors, and causing serious headaches, then follow the instructions in this post to test for UTF-8 BOM. You might find that a hidden character is the gremlin causing major SEO problems. Then it’s up to you to remove that problem by “disarming the BOM”. :)

GG

 
