Are News Algorithms Friend or Foe?


Our generation is notorious for information overload. Our lives are intertwined with social media platforms like Snapchat, Facebook, and Instagram, and at this point it is impossible to open your phone without seeing either a local news notification on your home screen or a Snapchat or Facebook news bite (which more than one person has told me is increasingly how they get their news).

Facebook’s “Trending Topics” feature, on the side of your newsfeed, is described by the Facebook team as being there “to help people discover content that is both popular in the world and meaningful to them…Meant to ensure a high-quality product, consistent with Facebook’s deep commitment to being a platform for people of all viewpoints.” The algorithm, written by Facebook’s employees, surfaces what is being talked about most on Facebook; the employees who work on Trending Topics then write a topic description and, weighing the user’s personal preferences against the topic’s global importance, decide how prominently it is shown, or whether it is shown at all.
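To make that process concrete, here is a rough sketch in Python of how a ranking like that could blend raw popularity with personal relevance and an editorial weight. The names, weights, and scoring rule are my own invention for illustration, not Facebook’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    mention_count: int        # how much the topic is being talked about globally
    global_importance: float  # hypothetical editor-assigned weight, 0.0 to 1.0
    description: str = ""     # written by the Trending Topics team

def personal_relevance(topic: Topic, user_interests: set[str]) -> float:
    """Crude stand-in for 'the personal preferences of the user'."""
    return 1.0 if topic.name.lower() in user_interests else 0.1

def trending_score(topic: Topic, user_interests: set[str]) -> float:
    # A topic can surface because the user cares about it, or because it is
    # deemed globally important, or both; popularity scales the result.
    relevance = max(personal_relevance(topic, user_interests), topic.global_importance)
    return topic.mention_count * relevance

def trending_feed(topics: list[Topic], user_interests: set[str], limit: int = 5) -> list[Topic]:
    """Return the topics the user actually sees, highest score first."""
    return sorted(topics, key=lambda t: trending_score(t, user_interests), reverse=True)[:limit]
```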

Gizmodo published an article this past May exposing how far Facebook’s Trending Topics section strays from that description. Several journalists admitted that Facebook’s newsroom is not as cut and dried as the company likes to make it seem. In reality, the items that appear in Trending Topics vary with each employee’s viewpoint, and much of the time this has meant conservative news being ignored even when it was statistically trending.

Facebook is now doing everything it can on its website to deny this possibility. Whether true or not, the scandal brings to light some very real concerns, and as the New York Times pointed out, these concerns go much further than who Facebook hires to work in its newsroom. They revolve around whether the algorithms themselves are biased, given how extraordinarily complex they have become. Now that we can build programs that learn, what you “like” (and what you don’t) is folded into what gets served back to you. But while algorithms can learn, that does not make them able to judge the validity of a source, which is another vast problem. The Washington Post recently reported that Russian propaganda seeded fake news throughout the election, some of which spread on big-name platforms such as Facebook and was read by “more than 15 million American viewers”. A program can tell what you’re interested in, but it can’t weigh the sources, the views, or the validity of the information. And the more our news consumption is shaped by algorithms, the more political opinions will be swayed by them.
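As a toy illustration of that feedback loop (my own simplification, not any platform’s real model), here is a scorer that only tracks what you have liked. Notice that nothing in it checks who published a story or whether it is true.

```python
from collections import Counter

class LikeDrivenFeed:
    """Toy personalization loop: every 'like' boosts similar stories.

    What is missing is the point: there is no notion of source reliability
    or factual accuracy anywhere in the scoring.
    """

    def __init__(self):
        self.liked_topics = Counter()

    def record_like(self, story: dict) -> None:
        for topic in story["topics"]:
            self.liked_topics[topic] += 1

    def score(self, story: dict) -> int:
        # A story scores higher the more it resembles what you already liked,
        # regardless of where it came from.
        return sum(self.liked_topics[t] for t in story["topics"])

    def rank(self, stories: list[dict]) -> list[dict]:
        return sorted(stories, key=self.score, reverse=True)


# After a couple of likes, a fabricated story from an unknown source outranks
# a fact-checked one simply because it matches the reader's history.
feed = LikeDrivenFeed()
feed.record_like({"topics": ["election", "candidate_a"]})
feed.record_like({"topics": ["election"]})
stories = [
    {"title": "Fact-checked report", "source": "major outlet", "topics": ["economy"]},
    {"title": "Fabricated scoop", "source": "unknown site", "topics": ["election", "candidate_a"]},
]
print([s["title"] for s in feed.rank(stories)])  # the fabricated story ranks first
```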

More and more, our activity online is being tracked and noted by various algorithms, as we have all noticed when the same advertisement for Aritzia or Nike follows us across every site we visit. This has its perks: when our activity and habits are tracked, we get well-targeted recommendations for other brands or sources our browser knows full well we’ll like. Look at Amazon Prime: after you order a handful of books or skincare products, it can tell that you are someone who likes to read and likes to take care of their skin, and it will recommend products accordingly. This can also open up more opportunities for smaller businesses. While many predicted that Amazon would be the end of the small bookstore, independent sellers’ revenues were actually boosted by Amazon’s recommendations; the algorithm has no bias toward big or little names, it simply sees that you would be interested. This has created more equality in business and eroded, if only a little, the tradition of brand loyalty among shoppers.
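A bare-bones version of that kind of recommender, roughly the “customers who bought this also bought” idea, shows why it has no built-in preference for big brands: it only counts what gets bought together. The orders and product names below are invented for illustration, and real systems are far more sophisticated than this sketch.

```python
from collections import defaultdict
from itertools import combinations

def build_co_purchase_counts(orders: list[set[str]]) -> dict:
    """Count how often pairs of products appear in the same order."""
    counts = defaultdict(lambda: defaultdict(int))
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            counts[a][b] += 1
            counts[b][a] += 1
    return counts

def recommend(product: str, counts: dict, top_n: int = 3) -> list[str]:
    # Rank by co-purchase frequency alone: a small press and a bestseller are
    # treated identically if they are bought alongside the same items.
    related = counts.get(product, {})
    return [p for p, _ in sorted(related.items(), key=lambda kv: kv[1], reverse=True)[:top_n]]

# Invented example orders mixing books and skincare
orders = [
    {"novel_a", "face_cream"},
    {"novel_a", "novel_b"},
    {"novel_b", "indie_press_title"},
    {"novel_a", "indie_press_title"},
]
counts = build_co_purchase_counts(orders)
print(recommend("novel_a", counts))  # the indie title surfaces alongside the bestseller
```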

While the recommendation algorithm has so far been a great success in online commerce, this past year has also shown that algorithms cut both ways on our everyday social platforms, and neither side is neutral.

Written By:

Carlotta Esposito


Sources:

Information About Trending Topics

http://gizmodo.com/former-facebook-workers-we-routinely-suppressed-conser-1775461006

https://www.washingtonpost.com/business/economy/russian-propaganda-effort-helped-spread-fake-news-during-election-experts-say/2016/11/24/793903b6-8a40-4ca9-b712-716af66098fe_story.html?wpisrc=nl_headlines&wpmm=1

