Fake News in Social Media: Bad Algorithms or Biased Users?
Although fake news has been present throughout human history, deceptive information spread via social media now affects society more strongly than before. This article answers two research questions: (1) Is the dissemination of fake news supported by machines through the automatic construction of filter bubbles? and (2) Are echo chambers of fake news man-made, and if so, what are the information behavior patterns of the individuals reacting to fake news? We discuss the role of filter bubbles by analyzing social media’s ranking and results-presentation algorithms. To understand the roles of individuals in making and cultivating echo chambers, we empirically study the effects of fake news on the information behavior of the audience in a case study, applying quantitative and qualitative content analysis to online comments and replies (on a blog and on Reddit). We did find hints of filter bubbles; however, they are fed by the users’ information behavior and only amplify users’ behavioral patterns. Reading fake news and eventually drafting a comment or a reply may result from users’ selective exposure to information, which leads to a confirmation bias; i.e., users prefer news (including fake news) that fits their pre-existing opinions. However, not all information behavior patterns following fake news can be explained by the theory of selective exposure; they also reflect a variety of further individual cognitive structures, such as non-argumentative or off-topic behavior, denial, moral outrage, meta-comments, insults, satire, and the creation of new rumors.
Main Authors: | Franziska Zimmer (Heinrich Heine University), Katrin Scheibe (Heinrich Heine University), Mechtild Stock (Stock-Kerpen), Wolfgang G. Stock (Heinrich Heine University) |
---|---|
Format: | Article |
Language: | English |
Published: | Korea Institute of Science and Technology Information, 2019-06-01 |
Series: | Journal of Information Science Theory and Practice, 7(2), 40-53 |
ISSN: | 2287-9099, 2287-4577 |
Subjects: | fake news; truth; information behavior; social media; filter bubble; echo chamber |
Online Access: | https://doi.org/10.1633/JISTaP.2019.7.2.4 |