Databite Discussion: Digital Technology and Democratic Theory
In this episode of a video series from the Data & Society Research Institute, the authors of Digital Technology and Democratic Theory discuss their new book. Of special interest is Archon Fung's discussion of the "Public Sphere," a concept closely related to the information ecosystem, and how its structure has changed in the internet age.
These organizations are doing research into the way our information ecosystem works.
Data & Society researches the ways that data-driven technologies and automation, including the kind found in social media and online recommendation algorithms, influence society. This section of their site features their research on media.
The Dangerous Speech Project studies forms of human expression that promote violence between groups of people, and investigates ways to mitigate the risk of violence while protecting freedom of expression.
This brief video explains how recommendation algorithms like YouTube's are designed to keep users on the platform as long as possible. It explains how this design can inadvertently amplify misinformation, and how bad actors can exploit the system to deliberately spread disinformation.
In this lecture from the Just Infrastructure speaker series, Dr. J. Nathan Matias discusses the impact of content moderation algorithms on freedom of expression and the spread of misinformation. Like recommendation algorithms, existing content moderation algorithms have been designed by tech companies with profit in mind. Matias discusses what content moderation algorithms designed with an eye toward promoting a healthy democracy might look like instead.
The Misinformation Age
Here, Dr. Cailin O'Connor argues that social factors contribute more to the spread and persistence of false beliefs than individual psychology. In other words, people are likely to believe what those in their personal social networks believe, and these social affiliations are better predictors of belief in conspiracy theories and anti-science claims than individual intelligence or access to education. She discusses her book in this talk from September 2019.
More Resources for Understanding our Information Ecosystem
This ongoing series hosted by the University of Illinois explores the complex interactions between people, systems, and algorithms that affect our information ecosystem. Check out the most recent lecture by Joan Donovan on media manipulation.
In this excerpt from his book Why We're Polarized, author Ezra Klein discusses the ways in which the attention economy, a financial structure in which engagement equals profit for online media companies, leads to increased polarization. Consider the following quote: since the early 2000s, "the internet has become much better at learning what we want and giving us more of it...[a]nd all of this has changed both how political news is produced and how it's consumed."