Bowie 1

Michael Bowie
Dr. Sarah Klein
MCWP 50
16 March 2018

Blowing a Filter Bubble: How Do We Approach Filter Bubbles?

In the age of information and technology, information has never been easier to access and learn from. However, despite the abundance of information and the relative ease with which many people can access it, people still seem to end up in what are known as "filter bubbles". A filter bubble is a state in which a person is intellectually isolated by personalized search results, often because internet algorithms predict what that person would most like to see. Because the human mind naturally drifts toward cognitive ease, and because social media algorithms heavily enable the consumption of content their users find desirable, it becomes far too easy for these filter bubbles to form.

Firstly, the process by which filter bubbles form feeds into itself and can easily capture many people as it grows larger and faster. A filter bubble begins forming when a person has a certain lived experience and begins to engage with resources to test whether they are right or wrong; people have a natural tendency to seek out information that proves their correctness rather than to first seek out information that proves them incorrect. Algorithms on social media, along with friends who hold similar views, shape what they commonly search for and the type of material they engage with, so they are often shown only content that affirms their views and never see news that offers a counterpoint. Once in this

part of the process, it is difficult to challenge their views, as those views have been affirmed to a great extent, and changing perception requires effortful cognitive thinking, something the human mind is fairly adept at avoiding. With the process explained, we must now discuss the elements of critical-thinking and what allows people to discern reliable from unreliable information within the context of media consumption. This is the crux of the conversation: understanding why people struggle to reliably distinguish information that can be counted upon from something created for an agenda, or created to prey upon the fact that some people cannot properly scrutinize information. According to King and Kitchener, reflective judgement involves taking in relevant information while recognizing that not all problems can be described with complete certainty; being able to realize this and process the most relevant information from reasonable sources is an important skill (King and Kitchener 9-11). People can get stuck thinking that every problem has a certain solution, and when enough news sources proclaim something to be true, some people are unable to engage with the source skeptically enough to question the information. They will likely assume that a news source would not lie, and if they have some lived experience, observation, or previously taught belief in line with what is being said, they will likely not question it, especially if the people around them agree. Reflective judgement is an important part of critical-thinking that is often neglected, as people tend to believe that everything can be proven right or wrong.
Another important part of critical-thinking is being aware of one's biases and how they can lead us astray and influence our thinking when engaging with content. Our mind

tends to lean toward easy justifications and steer away from "effortful cognitive work", as Moore discusses in his journal article. He goes on to mention that humans are better at critically assessing others than themselves, as this is an unavoidable feature of how we think, and that critical-thinking at its fullest requires "effortful cognitive work"; that is, it actually requires one to expend energy (Moore 391). Because of this, it is much easier for people to take information at face value: their minds make quick justifications and interpretations without really giving it a second thought. Social media has made it easy to consume a mass amount of information quickly, and if it is all catered to an individual's ideology, it becomes very easy to reaffirm everything they already hold to be true. Moore goes on to describe two figurative systems of thinking, which he names System 1, the intuitive and automatic thought process, and System 2, effortful cognitive work. Problems arise when the systems interact: System 2 often rationalizes the intuitive responses our brain generates from its associative memory so that it does not have to work as much (Moore 391-392). This is noteworthy because it is another reason why, when people are told to really think something over and challenge their ideas on a topic, they believe they are doing so, when in fact their mind is pulling from all the validation they have been receiving rather than considering an alternative option. Moore goes on to discuss how surprise can take a person out of their natural thought process and make them reconsider their views. He mentions that surprising statistics are sometimes enough, but for more deeply rooted and intuitive beliefs, it takes personal experience to have a chance at changing them (Moore 392-393).
Due to the nature of the media we consume online, a person is much more likely to consume a news story about a specific event than to hear "surprising statistics" anyway, and it is much easier to consume trending videos that show anecdotal evidence. There is also the possibility that the friend groups they interact with on

social media share videos and events that push their own agenda, and that they later seek out articles and statistics to further reassure themselves that they are correct. Humans have intuitive and instinctual beliefs that are heavily influenced by what they can see and personally experience, often more so than by what statistics can provide. Another aspect of this cognitive work ties into cognitive dissonance. Cognitive dissonance occurs when a person must "consider the negative implications of their selected choice". Liao conducted a study observing who was more typically "challenge-averse" versus "diverse-seeking", having found that some people were not as averse to challenging ideas as others. The findings showed that more educated people (university students, graduates, etc.) tended to be more "diverse-seeking", particularly for topics of high importance, meaning topics strongly linked to their core values with a relevant outcome (the study used political topics such as gay marriage) (Liao 2360-2361). It is likely that the more educated an individual is, the more time they have had to hone their critical-thinking skills, and the more time they have had to sit and analyze different points of view and discuss them. With the knowledge that nothing in the world should be taken at face value, it is reasonable to say that they would be the ones to seek out information that is diverse in nature. However, for topics with low topic-involvement, selective exposure was still pronounced (Liao 2366). This could mean that for topics that do not seem to directly affect certain people, it can be hard for them to bother to change their views, because the outcome does not directly affect them. This could play into why some people in America do not see sexism and racism as issues, as they might not be directly affected by them. Topic-involvement complicates selective exposure,

but there is a general trend that the more educated a person is and the more they care about a topic, the more likely they are to engage with a diverse set of material. It is important to factor in education when it comes to the amount of selective exposure, as in Garrett's findings, people in general seek information that validates their views, consistent with cognitive dissonance theory (Garrett). Rather than going through the struggle of changing their perceptions, it is easier for them to find validating views and do away with cognitive dissonance. His findings somewhat complicate Liao's, as the group he tested often sought opinion-reinforcing information and spent time with sources they knew would affirm their views; however, he noted that his sample was disproportionately male and white, a population that leans more conservative, while the female participants who were present were typically more liberal. This is important to note because politics often involves ideology and practices that might drive certain groups away from one side or the other because of its perceived beliefs. It makes sense that the sample skewed male, for example, if conservative views seem not to have women's best interests in mind. Considering this, it is likely that we need to make an effort to educate people more on media literacy. Garrett advocates for this, as it helps decrease selective exposure and allows people to be more critical of the information they engage with on the internet. Now that the cognitive and psychological aspects have been addressed in a general manner, we can take a deeper look into how behavior interacts with social media, and the way social media leads to the creation of filter bubbles.
What is seen on social media is mostly content that is either shared by your friends, who likely hold views similar to yours, or something the algorithm has predicted you might like, meaning that if it is news, it is news tailored to your

political leanings. This raises concern because it enables the consumption of agreeable content with such ease. In a study of Facebook users, Eytan Bakshy et al. found that the amount of "cross-cutting content", content that differs from the political beliefs held by a person, is not greatly influenced by what is actually shared by their friends, as 70% of cross-cutting content gets ignored due to user choice, in other words, selective exposure. This is far more than the 15% of cross-cutting content that goes unnoticed because it never actually appears in the feed (Bakshy). It is important to consider that a lot of content may be ignored due to selective exposure rather than solely through the fault of the algorithm, though the algorithm's impact is still worth noting. These findings apply only within the context of Facebook, and they do not fully account for users who never follow through on an article link. Garrett's findings mention that a user is less likely to read, or will spend less time reading, an article they know opposes their views. This matters because platforms like Facebook have comment sections, places where people can still get information about an article and discuss their opinions as friends do. Because of this, there is more incentive to seek out the comments to save time and avoid the frustration of engaging with an article whose views differ from your own, but the comments are likely to be biased. Algorithms are not the main culprit but play a more subtle and covert role in facilitating the creation of filter bubbles. It is up to the user to decide whether the information they consume is true, and when a fair number of people are not capable of performing the effortful cognitive work and reflective thinking required to sift through and pass proper judgement on the information they engage with, it leaves room open for exploitation.
People with malicious intent can use this fact to push out stories people will believe in service of a political agenda, especially through the use of bots. While often benign or even helpful, social bots can still be used to trick and deceive people on

social media. They have been used to manipulate and upset political discourse and spread misinformation. They can even generate artificial support for a particular political candidate through multitudes of fake accounts that look real to the average person. Ferrara et al. developed a method in which they created bots themselves that sent out tweets of completely nonsensical, random text, yet noticed that some accounts followed them anyway. They were able to determine that many of these accounts were being used to spread misinformation (Ferrara 96-98). This is dangerous because most people have no way to tell a real person from a fake one on social media, especially if they do not look into it too deeply. Most people will not have the time or care enough, and will take most things at face value, as established earlier. This can be exploited, as these bots, particularly on Twitter, can follow people in an attempt to expand their social circle and spread more misinformation. This could even be seen in the most recent election cycle, when Twitter took notice of a large number of bots and purged them from its servers, causing many conservatives to complain about losing followers (Rosenberg). This suggests that there could be a greater force at work, and being able to use social media to intentionally create these bubbles is harmful to the democratic process. Many people are unaware and are seemingly worried about losing the followers and support they rely on to affirm their political opinions, which is dangerous to the state of the country. Filter bubbles are an alarming concern because they prevent people from openly engaging with diverse knowledge and with people who differ from them. They form due to psychological factors, including the mind's tendency to avoid cognitive effort and dissonance.
Social media algorithms prey on a person’s desire to be validated and affirmed, and this can be dangerous as it can be exploited. Looking into ways to educate people in media

literacy and combat the abuse of these algorithms is important if we are to keep people accountable and to ensure democracy is not greatly affected.


Works Cited

Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. "Exposure to Ideologically Diverse News and Opinion on Facebook." Science 348.6239 (2015): 1130-1132.

Ferrara, Emilio, et al. "The Rise of Social Bots." Communications of the ACM 59.7 (2016): 96-104.

Garrett, R. Kelly. "Echo Chambers Online?: Politically Motivated Selective Exposure among Internet News Users." Journal of Computer-Mediated Communication 14.2 (2009): 265-285.

King, Patricia M., and Karen Strohm Kitchener. Developing Reflective Judgment: Understanding and Promoting Intellectual Growth and Critical Thinking in Adolescents and Adults. Jossey-Bass Publishers, 1994.

Moore, David Cooper. "Thinking, Fast and Slow (2011)." Journal of Media Literacy Education 5.2 (2013): 6.

Rosenberg, Eli. "Twitter Suspends Thousands of Suspected Bot Accounts, and the pro-Trump Crowd Is Furious." The Washington Post, WP Company, 21 Feb. 2018, www.washingtonpost.com/news/the-switch/wp/2018/02/21/twitter-suspends-thousands-of-suspected-bots-and-the-pro-trump-crowd-is-furious/?utm_term=.ba8fb71ecf1f.
