Defense against the dark arts of the internet – the Facebook News Feed

In the world of Harry Potter, there is a special mirror called “The Mirror of Erised.” When someone looks into it, they see their deepest, most desperate desire fulfilled. Dumbledore finds Harry in front of this mirror and warns him that it gives neither knowledge nor truth, and that men have wasted their lives away staring into it.

It does not do to dwell on dreams Harry, and forget to live.

It’s fun to imagine what you would see in this mirror, to wonder what your deepest desire is and how you would be with it fulfilled. What if, instead of your deepest desires, it played to your more basic ones? Imagine a mirror that reinforced whatever you believed to be true about the world. What if it took the shape of your “enemies,” people who disagree with you or think differently, but instead of portraying them as complicated beings like yourself, it portrayed them as idiotic, shallow and obviously wrong?

Well, my friends, this creature exists, and it’s been hard at work sinking its tentacles into you. Its name is Facebook, and it uses the power of an algorithm to learn what you like, what you believe and what you care about. It will tell you whatever you want to hear if you just keep scrolling through your News Feed.

What does Facebook want?

If you ask a friend or family member, “Why do you use Facebook?” they would likely say something along the lines of:

“I want to keep in touch with my friends and family that I don’t get to see often.”

Me too. It helps me keep connections, and I use it all the time. It’s amazing that it’s free. Well, it’s not exactly free; the cost is hidden. Here’s what Facebook wants from you:

  • Scrolling more – Facebook’s main goal is to keep your eyeballs glued to the screen. It’s awfully good at it, too. Have you ever opened a new tab to do a quick Facebook check, then, in what seems like the blink of an eye, realized you’ve spent an hour scrolling through the News Feed?
  • Engaging more – Clicks, likes, comments, even the amount of time spent viewing a video or reading a post counts as engagement. Each action you take means more data for Facebook and more time you have spent on the site.
  • Clicking ads – The more you’re scrolling the more ads you see, and the more likely you are to click on one. Even if you don’t click on an ad, it’s learning more and more about you so it can send you ads you’re more likely to click on.

Let’s be clear and outline a few things that Facebook does not care about:

  • Truth – A crap post that’s filled with lies but triggers an emotional response will outperform a true yet bland post every time.
  • Objectivity – When you ask most people where they get their news these days, they say “my friends,” which really means “social media” or “Facebook.” But Facebook does nothing to create a balanced or accurate view of the world or what’s happening in it. You may trust your friends and know that they’re good and honest people, but that doesn’t mean your News Feed shows you how all the good and honest people of the world think.
  • Long-term importance – Facebook does nothing on its own to prioritize things that are meaningful, useful or likely to have a long-term impact on your life.

Note that it’s not impossible to have a News Feed that embodies these things, but it’s up to you to create and curate that experience for yourself.

The bubble

Though the content that rises to the top of your Facebook feed is tailored for you, it’s not necessarily in your best interests. Facebook will boost articles it thinks you will like in your News Feed and avoid placing ones it thinks you won’t.

This is how the bubble is created: Facebook gives you a lens on the world where everyone agrees with you. You see articles that reinforce and validate your beliefs and demonize your enemies. This is great for Facebook; it keeps you clicking and sharing. It’s great for advertisers and journalists, because they know just what message to send you.

But is it good for you?

How Facebook decides what to put in your News Feed

Most people have hundreds of friends on Facebook. Though they are no doubt lovely people, you probably don’t want to see every post that every person makes. So Facebook does its best to guess what you want to see.

Facebook carefully tracks and measures everything you do while on the News Feed and other elements of their site. This helps them determine what ads they should place in your News Feed as well as what posts you see from your friends. Here are a few things they track:

  • Posts you like and comment on
  • Posts you view
  • Personalities and brands you follow
  • What technology or device you are using to access Facebook
  • What apps you have installed on your device
  • Third-party apps you have connected to your Facebook account
  • Links that you click on that take you off their site
  • Things you search for in Facebook’s search tool
  • Things that many of the friends you are engaging with are liking or commenting on

Facebook gathers all of this and constructs a profile of you from it. If anything is missing (maybe you’re not posting a lot of political ideas on your feed), it can guess your affinity based on your other likes.

“Even if you do not like any candidates’ pages, if most of the people who like the same pages that you do — such as Ben and Jerry’s ice cream — identify as liberal, then Facebook might classify you as one, too.” – Liberal, Moderate or Conservative? See How Facebook Labels You

From your desktop computer, you can see how Facebook has categorized you at:

https://www.facebook.com/ads/preferences

Facebook allows people to target you based not only on your political affinity but also on your ethnic affinity. These two affinities combined can single you out with messaging.

Easy targets for voter suppression

In the recent election, one of the Trump campaign’s most powerful tools was Project Alamo, a digital database of over 220 million Americans. It was used at first to cultivate supporters of the Trump campaign, but was used in the final weeks as a last-ditch effort to target key groups of Clinton supporters.

They created satirical ads and cartoons targeting blacks, young women and “idealistic liberals,” then used “dark posts” to make them appear in the feeds of their target audience while remaining invisible to everyone else.

“Campaigns typically spend millions on data science to understand their own potential supporters — to whom they’re likely already credible messengers — but Trump was willing to take a risk and speak to his opponent’s supporters. In the end, Trump’s risky bet on micro-targeted Facebook ads to discourage African Americans and young women from voting was handsomely rewarded with a presidential campaign victory.” – How the Trump Campaign Built an Identity Database and Used Facebook Ads to Win the Election

Though articles about Trump’s tactics have become more visible since his victory, there’s no doubt that Clinton used similar tactics in her campaign. The point is, no matter who you are or who you supported, you can and will be targeted. And not just for elections or ads: these tools can be used to persuade you of anything.

Like-farming, fake news and sensationalism

When a marketer is choosing a headline for an article or creating spin on a story, they think of ways to get more likes and shares on the article. This is nothing new; it’s been happening since the early days of journalism. The article below may have started the Spanish-American War.

[Image: a sensationalized newspaper article]

But because of the way Facebook works, sensational information can spread faster than ever before. The more likes and shares a post has, the more likely it will show up in other people’s News Feeds.

Articles with headlines like “Bernie Sanders Could Replace President Trump With Little-Known Loophole” are conjured up by marketers and journalists who know their audience and want to speak directly to whatever emotion they are feeling (or feeding). Often articles like this get massive amounts of likes and shares without even being read or double checked.

Here’s another fake news article, with a title and statement that require a great deal of stretching to be considered “true” and a flat-out wrong picture of the electoral map. According to an analysis by BuzzFeed (the current masters of driving and analyzing social media traffic), fake news spread faster and was shared more than “true” articles written by major media outlets.

[Image: Breitbart article with an incorrect electoral map]

Articles like this one are easy to share because they provoke a strong emotional response and provide validation. They get shared faster than readers can question them. Breitbart fixed the map after receiving criticism, but claimed its Facebook page is managed by someone else and that it had no idea the article was circulating across the internet.

“We see that Facebook drives anywhere between 30 to 40 percent of traffic,” said Dan Valente, chief data scientist at web analytics firm Chartbeat. “My guess is that [Breitbart is] on the higher end of Facebook publishers, because they get more traffic from Facebook than other sources because of the content that they post.” – Breitbart’s phony election map shows how hard it is to stamp out fake news

Here’s a handy image that breaks down some common news sources and shows both the quality of their information and their political leanings.

[Image: a breakdown of different news sites and their quality]

This tactic can be used by scammers as well. A common strategy is to create some content that gets likes and shares fast. In the beginning, the content is authentic and relatively safe. As the post gets more momentum on the News Feed, scammers start to change the content and add in something malicious.

Here are a few tactics that scammers use:

  • Giveaways – These are often too-good-to-be-true giveaways, like two free airline tickets or the newest iPhone or iPad. They’ll often ask you to share the post in order to complete the giveaway process. By the time you realize it’s a scam, you’ve shared it with friends and family and roped a few more people into it.

[Image: a Facebook News Feed giveaway scam]

  • Brain teasers – Posts with headlines like, “98% of people can’t read this scrambled sentence,” with an image of a jumbled sentence or a backwards word are common scams. People read it easily, think, “Oh my god! I’m a genius!” and share it without even checking the link.
  • Competition – Headlines like, “I bet BYU can get more likes than the UofU,” or, “If I can get X likes, then I’ll donate a puppy to a homeless person.”
  • Who viewed your profile – Anyone promising to tell you who’s been visiting your Facebook profile should not be trusted.

Defense against the dark arts of the Facebook News Feed – how to clean your history and protect your mind

We’ve all made mistakes, fallen for one of these scams or liked something impulsively.

Luckily, you can review what you have liked, commented on and most of your other activity in the activity log on Facebook.

[Image: the Facebook activity log]

Your likes, comments and statuses can all be modified or deleted individually. This may take a lot of time and effort to manage, but there are a few easy ways to clean up your history.

Searches – Every search you’ve made is recorded in your activity log. This is one of the few things you can clear completely.

[Image: clearing search history in the Facebook activity log]

Videos you’ve watched – Even if you didn’t like or comment on the video, it’s still recorded here. Luckily, this is another one that you can completely clear with a click.

Be aware of your emotions / Emotions that people like to target

There may not be a perfect way to protect your information from being collected, but you can still protect your mind.

The best way to defend yourself is to be aware of the emotions that scammers, advertisers and journalists like to target. You should pause anytime you see a story with a clear “good guy” and an obvious “bad guy,” or something happening that fills you with a righteous sense of anger or injustice. Take a moment, and realize someone is selling something to you.

The stronger and more immediate your reaction to something is, the more likely it is that someone wants you to feel that way. Maybe it will get more shares or views; maybe it will persuade you to do or not do something.

Get uncomfortable, and talk with some people who disagree with you

The year 2016 revealed plenty of reasons why it is dangerous to live in a bubble. If we continue to be impulsive with the information we choose to like, share and believe, we hand over control of our lives and our worldviews to people who don’t have our best interests in mind. This will only lead to more division, more tension and more anger.

Do we really need to hear any more accusations of “Misogynists, spoiled baby hipsters, racists, losers, cop killers, communists, fascists”?

I challenge you to find someone on the other end of the argument and listen to them, both online and offline. Don’t react; you don’t have to agree, but don’t immediately write them off or unfriend them. There’s often more to the story, something that’s important to them that you can understand.

Take some time to populate your Facebook News Feed and your social life with a diversity of ideas and perspectives. Clearly the “getting news from our friends” approach is not working for anyone except the people who profit from oversimplified stories and sensational headlines.

Kyle Gray

Kyle Gray is the founder of Conversion Cake, where he helps small businesses and startups with content marketing strategy and sales funnels. He is also the author of “The College Entrepreneur,” a guide that teaches students how to build an entrepreneurial skill set while in school and use their university’s resources to help them build something amazing.