To Participate on Thurstonblog

Email yyyyyyyyyy58@gmail.com, provide profile information, and we'll email your electronic membership.


Tuesday, January 12, 2016

"... Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject."

...................................................................................................................................................................
COMMENTS: 
*  Facebook is just a tool to hoover up information about people. Info useful for selling something is a welcome side effect. The fact that nearly all networks offer a log-in through Facebook, and some exclusively through Facebook, tells you something. You also have to use your real name and birthday. Banks warn about publishing your birthday, and I know IT people who use a fictional one for safety. I'm not on it, but I imagine they have a record about me.
*  Confirmation bias applies to scientists as much as to any demographic group. The challenge of looking at information objectively and outside the box is a skill to develop, not a trait to inherit. However, this article is about how social platforms like Facebook or Google expand the volume of misinformation by allowing like-minded people to form clusters. Essentially, the view is that ideas looking for affirmation get it through the virtual world. Some of that information is disseminated by hackers, paid people, and bots, but also by people doing research and building knowledge from the ground up. And for now, most of that information gets equal treatment. As a result, there is no visible universal quality metric to mark its value. Most of it is shared without classification. Less is validated. More threads are started to attract attention than for accuracy.
*  In this regard, nothing has changed at all. Everyone believes 100% that their version of the truth is the 100% truth, and that the other guy is lying. I remember hearing two versions of a local schoolyard fight. One would swear that neither of the guys telling me the story was even there.
*  "ABC, CBS, NBC, CNN and MSNBC - all FAILED to make the Top 10 in ratings."  Ratings determine what's accurate? You've just proven the point of the commentary.
*  I'm sure there is a herd mentality within any group of people going along to get along. But when there are no standards or accountability holding media or political candidates to providing something other than bogus information, why would you expect the public to know the difference? Corporations also lie to us about the safety of cars and about meeting health standards for the products they produce. If you live in a society full of people who get media attention all the time, or who lie repeatedly and get rewarded for it, why wouldn't you create a society that doesn't know how to tell the difference?
*  The only part of this article that I disagree with is the title, which suggests this is a light matter. I think this is a very serious matter, and it explains a great deal of the hate expressed in terrible ways that we see today. The speed of communication we now enjoy has many great benefits, but there are also problems, as described here. The solution is for people to act with responsibility, but that is not going to happen without some serious work from someone, though I do not know who.
...................................................................................................................................................................
How Facebook Makes Us Dumber
By Cass R. Sunstein, January 8, 2016

Why does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behavior of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: When people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages). Their data set is massive: It covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.

In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.

More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it.

As Del Vicario and her coauthors put it, “users mostly tend to select and share content according to a specific narrative and to ignore the rest.” On Facebook, the result is the formation of a lot of “homogeneous, polarized clusters.” Within those clusters, new information moves quickly among friends (often in just a few hours).

The consequence is the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.” And while the study focuses on Facebook users, there is little doubt that something similar happens on other social media, such as Twitter -- and in the real world as well.

Striking though their findings are, Del Vicario and her coauthors do not mention the important phenomenon of “group polarization,” which means that when like-minded people speak with one another, they tend to end up thinking a more extreme version of what they originally believed. Whenever people spread misinformation within homogeneous clusters, they also intensify one another’s commitment to that misinformation.

Of the various explanations for group polarization, the most relevant involves a potentially insidious effect of confirmation itself. Once people discover that others agree with them, they become more confident -- and then more extreme.

In that sense, confirmation bias is self-reinforcing, producing a vicious spiral. If people begin with a certain belief, and find information that confirms it, they will intensify their commitment to that very belief, thus strengthening their bias.

Suppose, for example, that you think an increase in the minimum wage is a sensational idea, that the nuclear deal with Iran is a mistake, that Obamacare is working well, that Donald Trump would be a fine president, or that the problem of climate change is greatly overstated. Arriving at these judgments on your own, you might well hold them tentatively and with a fair degree of humility. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty -- and perhaps real disdain for people who do not see things as you do.

On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored -- and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.

Can anything be done? The best solution is to promote a culture of humility and openness. Some people, and some communities, hold their own views tentatively; they are interested in refutation, not just confirmation. Moreover, those who manage social media (such as Google) can take steps to allow people to assess the trustworthiness of what they are seeing, though these efforts might be controversial and remain in a preliminary state.

In the midst of World War II, a great federal judge, Learned Hand, said that the spirit of liberty is “that spirit which is not too sure that it is right.” Users of social media are certainly exercising their liberty. But there is a real risk that when they fall prey to confirmation bias, they end up compromising liberty’s spirit -- and dead wrong to boot.
...................................................................................................................................................................