The Internet Marketing Driver


Archives for April 2016

How Does The Facebook News Feed Work? – Notes From Adam Mosseri’s 2016 F8 Presentation (Head of Product for News Feed)

April 25, 2016 By Glenn Gabe Leave a Comment

Adam Mosseri - Facebook's VP of Product for News Feed

Although many people associate algorithms with Google, Facebook also has many algorithms running on a regular basis. And I can’t think of one more important than its News Feed algorithm. That’s what determines the posts and updates you see when you log into your account. With most people having hundreds of “friends”, while also liking many Pages and publishers, the News Feed has gotten very crowded over the past few years. So, Facebook uses an algorithm to determine what shows up, and in which order. And that order can have a serious impact on businesses trying to reach a targeted audience.

For example, if your posts end up ranking highly in the News Feed for many people you are connected with, you can get in front of a lot of eyeballs and end up with a boatload of targeted traffic. That can impact ad revenue, sales, sign-ups, or any number of conversions that apply to your business. If your posts end up ranking low in the News Feed, then you have little chance at gaining those eyeballs.

As you can guess, many wonder how the News Feed algorithm actually works. In other words, what does Facebook take into account when ordering the News Feed? Well, at Facebook’s F8 conference this year, Adam Mosseri, the VP of Product Management for News Feed, presented that very topic. And even better, Facebook recorded a video of his presentation. The presentation was short, but contained some key pieces of information about how the News Feed works. So I decided to take some notes based on Adam’s slides, including the time of each statement.

I highly recommend you watch the video for yourself. There are some great bits of information about how Facebook’s News Feed ticks. Below, you’ll find my notes along with the timecode of each section (so you can jump directly to those parts).

Notes from Adam Mosseri’s presentation about how the News Feed works:

0:50 When The News Feed Algorithm Kicks In
Adam explained that when a publisher first posts a story on Facebook, nothing happens from a News Feed algo perspective. The algo only kicks in when someone who is your friend, or who follows your Page, logs in and has an opportunity to see your post. That’s when Facebook looks at that story, and all of the other stories that could be shown to that person. Then they are all evaluated by the News Feed algorithm.

1:20 The Number One Input (and it makes sense)
Yes, basic and obvious, but worth noting. The number one input in the News Feed algo is who you friend and which publishers you follow. This is pretty obvious, since that’s the basis for seeing any update on Facebook. For example, if you don’t follow a publisher or have someone as a friend on Facebook, you can’t see their post (or a post they have engaged with) in your News Feed.

Friends and Publishers on Facebook

2:10 How The News Feed Algorithm Analyzes Updates
Each story (update) is looked at one at a time. The algo tries to determine if you will be interested in that post based on a number of factors/signals. For example, will you like the story, read the story, share it, etc.? Those actions are proxies for knowing if a user will be interested in the story.

Facebook News Feed Stories

2:44 Strongest Facebook News Feed Signals/Factors
Facebook then predicts if a user will take these actions based on a number of factors. The top signals include who posted the update, the amount of interactions on a post (i.e. likes and comments), the type of content (photos, video, etc.), and your previous actions with that entity on Facebook (your friend or publisher you follow). For example, how much has a user engaged with that person or publisher?

And the News Feed algorithm definitely takes recency into account (although the importance of a post can trump recency). “Recency is a really important signal for relevance, but it’s not the only important signal.” Adam gave a simple example: his cousin got engaged on a Friday, but he hadn’t been on Facebook since then (a few days). Then this morning his brother-in-law posted his breakfast sandwich. Facebook’s News Feed algorithm would determine that he was more interested in his cousin’s engagement versus the breakfast sandwich. So relevance beats recency in a number of situations.

Strongest News Feed Signals For Ranking

4:00 Relevancy Score Calculation
A relevancy score is calculated for each story and then the News Feed for a given user is ordered by those relevancy scores. Relevancy scores are specific to each Facebook user. This happens each time someone logs into Facebook (or presumably whenever someone refreshes their News Feed). So, you log in, all of the possible posts that can show in your News Feed are scored. Then they are ordered by relevancy score.

News Feed Relevancy Score
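The scoring and ordering Adam describes can be sketched in a few lines. To be clear, this is a hypothetical illustration: the signal names, weights, and time-decay factor below are all invented assumptions, since Facebook has never published its actual formula. The sketch simply shows the shape of the process: predict action probabilities per story, combine them into a relevancy score, and sort the candidate stories per user.

```python
# Hypothetical sketch of per-user News Feed ordering. All signal names,
# weights, and the decay constant are illustrative assumptions -- Facebook
# has not published its actual formula.
import math
import time

# Assumed weights for predicted actions (invented for illustration).
WEIGHTS = {"p_like": 1.0, "p_comment": 4.0, "p_share": 8.0}

def relevancy_score(story, now):
    """Weighted sum of predicted-action probabilities, decayed by age."""
    base = sum(WEIGHTS[k] * story[k] for k in WEIGHTS)
    age_hours = (now - story["posted_at"]) / 3600
    # Recency matters, but a strong base score can outweigh freshness.
    return base * math.exp(-0.01 * age_hours)

def order_feed(stories, now=None):
    now = now or time.time()
    return sorted(stories, key=lambda s: relevancy_score(s, now), reverse=True)

now = 1_000_000.0
stories = [
    # Cousin's engagement: posted 3 days ago, high predicted interest.
    {"id": "engagement", "posted_at": now - 3 * 86400,
     "p_like": 0.9, "p_comment": 0.6, "p_share": 0.3},
    # Breakfast sandwich: posted an hour ago, low predicted interest.
    {"id": "sandwich", "posted_at": now - 3600,
     "p_like": 0.1, "p_comment": 0.02, "p_share": 0.0},
]
feed = order_feed(stories, now)
print([s["id"] for s in feed])  # ['engagement', 'sandwich']
```

Note how the three-day-old engagement still outranks the fresh breakfast sandwich under these assumed weights, which mirrors Adam’s relevance-beats-recency example.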

5:00 Facebook Receiving Feedback About News Feed Quality
Facebook has two primary mechanisms for receiving important feedback about the quality of the News Feed. It looks at what people do and then what people say. First, Facebook can see what users are doing. Are they liking and engaging with the content? Are they spending time on the site, and with specific posts? How much value are users gaining from the content in their News Feed?

But, Facebook knows there are certain types of posts where engagement might not happen. For example, a post about someone’s dog that passed away. A person might not engage with that post, but it’s still important to them. That’s where Adam brought up the Feed Quality Program, which has two components.

The first is the Feed Quality Panel, which includes many people who order their own stories based on interest. Facebook can then compare their ordering to the order the News Feed would have picked. For those heavily involved in SEO, this is very similar to Google’s Quality Raters.

The second component includes online surveys that happen on Facebook every day (for ordinary users). There are tens of thousands of surveys per day asking users how interested they are in a specific story in the News Feed. This happens in over thirty languages, and all around the world geographically. Then Facebook can compare how interested they were to how interested the News Feed algo thought they would be (as a quality measure).

Facebook Survey for Users
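The feedback loop Adam describes boils down to comparing predicted interest against stated interest. Here is a minimal sketch of that comparison, with the field names, sample values, and choice of mean absolute error as the quality metric all assumed for illustration (Facebook hasn’t said what metric it actually uses):

```python
# Illustrative sketch of the survey feedback loop: compare how interested
# the algorithm predicted a user would be in a story with the interest the
# user reported in a survey. Field names and the error metric are assumptions.

def prediction_error(samples):
    """Mean absolute gap between predicted and surveyed interest (both 0-1)."""
    gaps = [abs(s["predicted"] - s["surveyed"]) for s in samples]
    return sum(gaps) / len(gaps)

survey_samples = [
    {"story": "a", "predicted": 0.8, "surveyed": 0.9},
    {"story": "b", "predicted": 0.3, "surveyed": 0.7},  # under-predicted
    {"story": "c", "predicted": 0.6, "surveyed": 0.5},
]
print(round(prediction_error(survey_samples), 3))  # 0.2
```

A lower error means the algo’s guesses about interest line up with what users actually say, which is exactly the quality measure the surveys enable.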

6:30 News Feed Controls For Users
Facebook knows it doesn’t always get it right… which is why Facebook introduced controls for users. Adam explained the various controls Facebook has created for users (so they can tailor their own News Feeds). For example, he explained the importance of following and friending (and that it’s very clear how they work). Second, Adam explained that unfollowing is important too. By doing so, your News Feed experience “will get more interesting”. You can also hide any post in the News Feed, and Facebook will try to show you less of that type of post in the future. Again, making your News Feed experience more interesting. Last, he mentioned “see first” (which I love by the way). Using “see first”, you can select the people or publishers that you find the most interesting (and who you want to show up at the top of your News Feed each time).

Adam explained these controls can help “correct” Facebook and make the News Feed more relevant over time. I thought that was a great statement. Hey, anyone in SEO can tell you that algos aren’t perfect. :)

News Feed Controls For Users

7:30 What This Means For Publishers
Adam ended his presentation with some tips for publishers. First, he said to create compelling headlines. He made sure to say that he did not mean click bait headlines… He said to give people a real sense of the content behind the headline. For example, be clear and honest (but still compelling). He explained Facebook knows that “people really enjoy this type of content and that Facebook does what it can to make sure it does well in the system.”

Second, he said to avoid overly promotional content. Facebook knows people don’t like getting bombarded with overly promotional content, so try not to do that. Your audience might get less interested in your content over time. Facebook has brought this up before, as heavy promotion can turn off users.

Third, and most important according to Adam, he explained to experiment and try new things. “What’s best for your audience is probably not what’s best for a different publisher’s audience.” He said to experiment with long-form content, short-form content, video, images, try different tones, etc. The point is to see how those changes work for your specific audience, which leads to Adam’s final tip about publisher tools — analysis.

In order to analyze how well your efforts are working, Facebook provides a set of publisher tools that can help you understand what’s resonating with your audience. For example, analyze posts to view engagement, reach, and more.

Facebook's News Feed Tips for Publishers
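To make the “experiment, then analyze” loop concrete, here is a small sketch of the kind of analysis publisher tools support: computing an engagement rate per post and ranking formats by what resonates. The post data and this particular definition of engagement rate (interactions divided by reach) are illustrative assumptions, not Facebook’s official metric:

```python
# Sketch of per-post engagement analysis. The post data is fabricated, and
# "engagement rate" here (interactions / reach) is an assumed definition.

def engagement_rate(post):
    interactions = post["likes"] + post["comments"] + post["shares"]
    return interactions / post["reach"] if post["reach"] else 0.0

posts = [
    {"title": "long-form guide", "likes": 120, "comments": 30, "shares": 50, "reach": 4000},
    {"title": "short video", "likes": 300, "comments": 80, "shares": 120, "reach": 5000},
    {"title": "promo post", "likes": 10, "comments": 2, "shares": 1, "reach": 3000},
]
# Rank formats by what resonates with this specific audience.
ranked = sorted(posts, key=engagement_rate, reverse=True)
print([p["title"] for p in ranked])
```

Run against your own exported post data, a comparison like this shows which formats your specific audience responds to, which is the whole point of Adam’s experimentation tip.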

8:30 Adam Highlights Audience Optimization Tool (Part of publisher tools.)
After running through his tips, Adam called out Facebook’s Audience Optimization Tool as a great way to help tailor your posts to specific segments of your audience. You can tell Facebook what your specific update is about and then Facebook will use that information when considering how interesting the post will be for specific followers. In other words, Facebook will try and match the post up with people that have already shown an interest in that specific topic.

And you can use Audience Restrictions to do the opposite. You can tell Facebook who you don’t think will be interested in the content. By using both pieces of functionality that are part of audience optimization, you can help Facebook focus your update on who will actually be interested in the content.

And last, you can analyze insights based on using the Audience Optimization tool to see what’s working and what’s not. And then based on the performance, you can experiment with refining your targeting to make sure you are driving the most engagement from your posts.

Facebook's Audience Optimization Tool

Summary – News Feed Information Directly From The Source
I love when people directly from Google or Facebook explain how their algorithms work. Adam’s video, although short, provided some interesting information directly from the person leading product for Facebook’s News Feed. Again, I recommend watching the video, reading through my notes, and then mapping out a plan of attack for your own efforts on Facebook. And if you have any questions or comments, feel free to provide them below!

GG

 


Filed Under: facebook, social-media

Is Your Killer Content Still Killer? How To Update Older and Outdated Content To Maintain High Search Rankings (Case Study)

April 17, 2016 By Glenn Gabe 3 Comments

Improving Outdated Content

You’ve heard it before a thousand times. “Produce killer content to win at SEO.” And I definitely agree with that statement. Because if you do, then you can get in front of more eyeballs, gain a following, naturally build powerful inbound links, and there’s a good chance that Google will reward you with strong rankings over time. And with strong rankings and targeted traffic, conversions and revenue can jump (whether that’s sales, sign-ups, memberships, ad revenue, etc.).

Now, although killer content can help drive strong SEO, and you might even be executing at a high level right now, there’s a hidden danger that publishers need to be aware of. And it can slowly creep its way into becoming a big problem. So if you are executing a content marketing strategy, and you think you’re killing it, then definitely read the rest of my post. It could end up saving you some grief.

The Natural Evolution For Some Content – It Becomes Outdated
Let’s say you’ve been producing high quality content that helps people thoroughly learn how to do something (i.e. how-to content). You meticulously planned each post, you are extremely knowledgeable in your niche, and you continually produce killer content that helps users. That’s all good, but I’ve found too many people hit publish and never revisit that killer content again. They move on, thinking that Google will forever love them for producing such amazing content.

But what if those articles start to get dusty over time? What if some fundamental changes occur to the subject matter you covered? What if product names change, visuals change, resources you are linking to change, and steps change in the tutorial you mapped out for users?

If that’s the case, then how “killer” is that content over time? For example, is it up-to-date? Do the step-by-step instructions still apply? Are the screenshots current? So on and so forth.

After reading that last paragraph, many of you might look like this right now, thinking about your own content:
Woman shocked.

And that’s exactly my point. It’s very easy to produce a high-quality article or tutorial and build amazing Google traffic, but then never revisit the post (thinking Google will ALWAYS love that piece of content). And if you don’t revisit the post over time, and it becomes outdated, bad things can happen Google-wise. I have a quick case study that I’ll document in this post that explains that very situation. It’s a happy, sad, happy story that can apply to many publishers living in Google Land. Let’s dig in.

The Problem – Clear As Day
I was digging through data for a client when I noticed a dip in Google organic traffic. It wasn’t horrible, but there was definitely a dip.

I isolated the drop from a query and landing page perspective and the problem was as clear as day. It was one of my client’s best-performing posts, and it dropped in traffic for a number of popular queries. Actually, it seemed to drop for broader terms related to the subject matter (head terms), while still ranking for some very specific keywords (longer-tail terms). I found that interesting and I dug in further.
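The isolation step can be sketched in code: compare clicks per query for the landing page across two periods and flag the biggest losses. The queries and click counts below are fabricated for illustration; in practice you would export this data from Google Search Console.

```python
# Sketch of isolating a traffic drop per query for one landing page.
# The query data is fabricated for illustration (a real analysis would
# use clicks exported from Google Search Console for two date ranges).

def biggest_drops(before, after, min_drop=1):
    """Return (query, clicks_lost) pairs sorted by clicks lost."""
    drops = []
    for query, clicks in before.items():
        lost = clicks - after.get(query, 0)
        if lost >= min_drop:
            drops.append((query, lost))
    return sorted(drops, key=lambda pair: pair[1], reverse=True)

clicks_before = {"widget tutorial": 900, "how to use widgets": 400, "widget v2 error 37": 60}
clicks_after = {"widget tutorial": 250, "how to use widgets": 120, "widget v2 error 37": 55}

for query, lost in biggest_drops(clicks_before, clicks_after):
    print(query, lost)
# The broad head terms lose far more clicks than the specific long-tail query.
```

In this fabricated example the head terms shed hundreds of clicks while the narrow long-tail query barely moves, which is exactly the pattern described above.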

When reviewing many of the queries leading to this killer how-to post, it was clear the article wasn’t as killer as it was before… Remember what I explained earlier about posts becoming outdated? Well, everything from the title to the screenshots to the references were now out-of-date (or partially out-of-date).

Here are some of the core elements that needed to be updated:

Updating killer content that's out of date.

When a post doesn’t meet user expectations anymore, user unhappiness increases. And when users are unhappy, it’s not long before Google is unhappy. In addition, if the older post doesn’t contain the updated subject matter, and related copy, then the post can’t possibly rank for queries related to that new content. So between user unhappiness and outdated content, it’s not long before Google can pick up on the problem. And when that happens, rankings drop and traffic takes a dive. That’s exactly what happened here.

Head Terms Drop, Longer-Tail Queries Remain
As I explained earlier, the post still ranked for longer-tail queries directly related to the older subject matter, but not the broader head terms. To clarify, the post still ranked for queries that referenced the out-of-date content (which some people were still searching for), but dropped for users searching for the subject matter in general. And the post definitely didn’t rank for queries that referenced the most recent information about the subject (since the older post didn’t even contain that content at all!)

Queries that dropped in rankings.

Taking Data To The Client

Unfortunately, it’s easier than you think to let this happen… The site has many in-depth articles and how-to posts. All of them are high quality. This just happened to be a case of a post that desperately needed an update. So I presented the findings to my client, the dip in traffic, the signs of user unhappiness, and the gap between queries and content. Everyone agreed that this should be fixed, and quickly.

After reviewing the situation, my client realized it wouldn’t take long to update the post. Then once it was updated, we just needed to see how Google would respond (and how quickly). We got to work.

The Fix – Less Than One Day Of Work
Yep, less than one day of work was needed to fix the post. The original author updated the title, the post details, the steps involved, links to supporting articles, and the screenshots. In addition, the article now included a quick blurb about the update (making sure both users and Google knew the date of the update and why the update was needed).

Then I used Fetch as Google and submitted the post to the index. It took two days for the update to be indexed, which is common when you fetch and submit an article that’s already published. I wish Google would speed up the process there, but hey, at least you can resubmit.

Using fetch and submit in Google Search Console (GSC).

Then it was time to analyze changes in rankings and traffic based on the update. And let me tell you, we were pleasantly surprised with the changes (and how quickly that happened).

The Results – Wow
Quickly after the post had been reindexed, rankings began to jump across a number of relevant queries (including queries that had dropped). Now that the post, its title, and the steps and screenshots were accurate, almost all of the original rankings returned (for head terms related to the subject matter).

It was almost like Google was trying to speak to us through the drop in rankings. Sure, I’d rather have a signal in Google Search Console (GSC) that says, “hey, this post looks outdated”, but I’m not complaining if rankings can return quickly! :)

A sample of queries that rebounded:

Improved rankings based on updating older content.

Traffic to the post jumped with the increase in rankings and my client was humming along again. Now, it’s important to understand that this is a trusted site in Google’s eyes, so I’m not saying every site can see this type of behavior. But, it’s an important case study since many sites have content that ages naturally over time, and requires an update every now and then.

Remember, users are reading your content with the hope of learning how to do something, how to fix something, etc. If you provide the wrong information, or information that’s out-of-date, then they will not be happy. And Google can pick up on that. It clearly did here and rankings dropped… And then you have users searching for the most recent content related to the subject matter, which outdated posts don’t even contain! It’s a double whammy from an SEO perspective.

But, it was awesome to see how quickly rankings returned for the post once the content was updated. And it took less than one day to update.

What You Can (And Should) Do Now
You need to fully understand your top content. Understand what users are searching for and make sure you are meeting and/or exceeding user expectations. If that sounds familiar, then you’ve been paying attention algorithm-wise. That’s exactly how Gary Illyes explained Panda recently. I mentioned this in my post about the March algorithm updates, and I’ll mention it again here.

Gary explained that Panda is not about penalizing a site. It’s simply trying to match users with the right content based on their query. When sites become overly prominent for queries they can’t answer, then Panda needs to “adjust those sites” that aren’t meeting expectations. It’s one of the reasons I believe that the March algo updates were Panda. I saw a lot of that during my analysis of the initial update on 3/3 and the subsequent tremors on 3/14, 3/21, 3/26, etc.

Back to outdated content. If you want to track down queries by url in GSC, then you can follow my Search Engine Land column that explains how to do that. By doing so, you can uncover the queries leading to each of your top posts. Then you should objectively read the content and make sure you are meeting and/or exceeding expectations. There are times you might still be doing a great job with older content, but there might be times when you cringe as you read through an older post and see how outdated it is now. But you won’t know until you dig in.

Checking queries by url in Google Search Console (GSC):

Finding queries per url in Google Search Console (GSC).
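Once you have that data exported, the grouping itself is simple. Here is a small sketch of organizing Search Analytics rows by landing page so you can review the queries leading to each top post. The rows mimic a GSC export, but the exact fields (and the URLs and queries shown) are assumptions for illustration:

```python
# Sketch of grouping Search Analytics rows by landing page. The rows mimic
# a GSC export; the page paths, queries, and clicks are fabricated.
from collections import defaultdict

rows = [
    {"page": "/killer-post/", "query": "widget tutorial", "clicks": 900},
    {"page": "/killer-post/", "query": "widget v2 error 37", "clicks": 60},
    {"page": "/other-post/", "query": "gadget review", "clicks": 120},
]

def queries_by_url(rows):
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["page"]].append((row["query"], row["clicks"]))
    # Sort each page's queries by clicks so the top queries surface first.
    return {page: sorted(qs, key=lambda q: q[1], reverse=True)
            for page, qs in grouped.items()}

report = queries_by_url(rows)
print(report["/killer-post/"][0])  # ('widget tutorial', 900)
```

With the queries grouped per URL, the objective read-through becomes straightforward: open each top post, scan its top queries, and ask whether the content still meets the expectations behind them.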

Summary – Brush The Dust Off Your Killer (But Older) Posts
Based on the case study I documented above, it’s clear that updating aging killer content is critically important. Unfortunately, it’s easy to let older posts collect dust over time, which can lead to user unhappiness and content that doesn’t cover the latest subject matter. When that happens, Google can easily pick up on it. And when it does, rankings can drop, and traffic will follow suit.

You can avoid this situation by following the recommendations I provided above. Analyze your posts now before rankings take a dip. That’s one way you can stay afloat in an ever-changing Google world. Now get ready to dust off those older posts. They might just need it.

GG

 

 


Filed Under: google, seo


Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved. | Privacy Policy