The diffusion of social media caused a paradigm shift in the creation and consumption of information: we passed from a mediated selection process (e.g., by journalists) to a more disintermediated one. Such disintermediation elicits users' tendencies to a) select information adhering to their system of beliefs, i.e., confirmation bias, and b) form groups of like-minded people where they polarize their views, i.e., echo chambers. Polarized communities emerge around diverse and heterogeneous narratives, often reflecting extreme disagreement with mainstream news and recommended practices. The emergence of polarization in online environments might reduce viewpoint heterogeneity, which has long been viewed as an important component of democratic societies. Confirmation bias has been shown to play a pivotal role in the diffusion of rumors online. However, on online social media, different algorithms foster personalized contents according to user tastes, i.e., they show users viewpoints that they already agree with.

The role of these algorithms in influencing the emergence of echo chambers is still a matter of debate; indeed, little is known about the factors affecting their outcomes. Facebook promotes posts according to the News Feed algorithm, which helps users see more stories from the friends they interact with most; the number of comments and likes a post receives, as well as the kind of story it is (e.g., photo, video, status update), can also make a post more likely to appear. Conversely, YouTube promotes videos through Watch Time, which prioritizes videos that lead to a longer overall viewing session over those that receive more clicks. Not much is known about the role of cognitive factors in driving users to aggregate in echo chambers supporting their preferred narrative. Recent studies suggest confirmation bias as one of the driving forces of content selection, which eventually leads to the emergence of polarized communities where users acquire confirmatory information and ignore dissenting content. To shed light on the role of content-promotion algorithms in the emergence of echo chambers, we analyze the behavior of users exposed to the same contents on different platforms: we focus on Facebook posts linking YouTube videos reported on Science and Conspiracy pages, and we then compare users' interaction with these videos on both platforms. We limit our analysis to Science and Conspiracy for two main reasons: a) scientific news and conspiracy-like news are two very distinct and conflicting narratives, and b) scientific pages share the main mission of diffusing scientific knowledge and rational thinking, while the alternative ones resort to unsubstantiated rumors. Indeed, conspiracy-like pages disseminate myth narratives and controversial information, usually lacking supporting evidence and most often contradicting the official news.

Moreover, the spreading of misinformation on online social media has become so widespread that the World Economic Forum listed massive digital misinformation as one of the main threats to modern society. In spite of different debunking strategies, unsubstantiated rumors (e.g., those supporting anti-vaccine claims, climate change denial, and alternative medicine myths) keep proliferating in polarized communities emerging in online environments, leading to a climate of disengagement from mainstream society and recommended practices. A recent study pointed out the inefficacy of debunking and the concrete risk of a backfire effect among the usual and most committed consumers of conspiracy-like narratives. We believe that additional insights into the cognitive factors and behavioral patterns driving the emergence of polarized environments are crucial to understanding, and developing strategies to mitigate, the spreading of online misinformation.

In this paper, using a quantitative analysis of a massive dataset (12M users), we compare consumption patterns of videos supporting scientific and conspiracy-like news on Facebook and YouTube. We extend our analysis by investigating the polarization dynamics, i.e., how users become polarized comment after comment. On both platforms, we observe that some users interact only with a specific kind of content from the beginning, whereas others start their commenting activity by switching between contents supporting different narratives. The vast majority of the latter, after the initial switching phase, start consuming mainly one type of information, becoming polarized towards one of the two conflicting narratives. Finally, by means of a multinomial logistic model, we are able to predict with good precision whether a user will become polarized towards a given narrative or will continue to switch between information supporting competing narratives.
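The kind of per-user polarization measure and multinomial logistic prediction described above can be sketched in code. Everything in this sketch is an illustrative assumption rather than the paper's actual pipeline: the single "early leaning" feature, the 0.95 one-sidedness threshold, and the synthetic training data are hypothetical.

```python
import math

def leaning(labels):
    """Mean leaning over a user's comments; -1 = science, +1 = conspiracy."""
    return sum(labels) / len(labels)

def classify(labels, rho=0.95):
    """Call a user polarized if a fraction >= rho of comments is one-sided
    (rho = 0.95 is an illustrative threshold, not the paper's)."""
    frac_conspiracy = sum(1 for x in labels if x > 0) / len(labels)
    if frac_conspiracy >= rho:
        return "conspiracy"
    if frac_conspiracy <= 1 - rho:
        return "science"
    return "switcher"

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def train_softmax(data, classes=3, lr=0.2, epochs=2000):
    """Fit a one-feature multinomial logistic (softmax) model by
    stochastic gradient descent on cross-entropy loss."""
    w = [0.0] * classes  # slope per class
    b = [0.0] * classes  # intercept per class
    for _ in range(epochs):
        for x, y in data:
            p = softmax([w[k] * x + b[k] for k in range(classes)])
            for k in range(classes):
                g = p[k] - (1.0 if k == y else 0.0)
                w[k] -= lr * g * x
                b[k] -= lr * g
    return w, b

def predict(w, b, x):
    """Most probable final class given the early-activity leaning x."""
    p = softmax([w[k] * x + b[k] for k in range(len(w))])
    return max(range(len(p)), key=p.__getitem__)

# Synthetic training set: early-activity leaning -> final class
# (0 = science-polarized, 1 = switcher, 2 = conspiracy-polarized).
data = [(-1.0, 0), (-0.9, 0), (-0.1, 1), (0.0, 1), (0.1, 1), (0.9, 2), (1.0, 2)]
w, b = train_softmax(data)
```

In this toy setup a strongly negative early leaning maps to the science-polarized class, a strongly positive one to the conspiracy-polarized class, and a near-zero leaning to the switcher class; the real model in the paper would operate on the observed commenting features rather than this single synthetic summary.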