This paper was written for the course "New Media & Politics."
The Information Flood & Its Discontents: Postmodern Politics in Highly Mediated Systems
“While probably no former time tolerated so many diverse opinions on religious or philosophical matters, factual truth, if it happens to oppose a given group’s profit or pleasure, is greeted today with greater hostility than ever before.”
-Hannah Arendt, “Truth and Politics”
“The simulacrum is never that which conceals the truth—it is the truth which conceals that there is none. The simulacrum is true.”
-Jean Baudrillard, “Simulacra and Simulation”
In the millions of years between the emergence of the genus Homo at the beginning of the Stone Age and the year 1999 AD, humans produced an estimated 12 exabytes, or 10^18 bytes, of information. In the three years that followed, from 2000 to 2002, that total doubled to 24 exabytes. By 2006, the Internet alone contained 200 exabytes; by 2009, 500 exabytes; and by 2013 it had grown to a full zettabyte, or 10^21 bytes (Batalla and Ledzinska). In other words, across the whole arc of human history, stretching back to our evolutionary emergence, more data has been produced in the past couple of decades than in the millions of years preceding them. Not just significantly more, but exponentially more, and the pace is only accelerating. The advent of computing, and the spread of the Internet and information and communications technologies (ICTs), has facilitated an explosion of information production. We live our lives in a sea of data, and increasingly must navigate overwhelming amounts of information in order to make decisions about our lives. Information fuels our economies, informs our politics, monitors our health, and controls our social lives.
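A rough back-of-the-envelope calculation, sketched below using the figures cited above, illustrates what "exponentially more" means in practice; the smooth-growth model and the rounding are my own simplifications, not part of the cited estimates.

```python
import math

# Approximate cumulative information volumes, in exabytes (1 EB = 10^18 bytes),
# taken from the estimates cited above (Batalla and Ledzinska). Treating these
# rough figures as a smooth exponential is a simplification for illustration.
volumes_eb = {1999: 12, 2002: 24, 2006: 200, 2009: 500, 2013: 1000}

# Implied average annual growth rate between 1999 and 2013
start, end = 1999, 2013
growth = (volumes_eb[end] / volumes_eb[start]) ** (1 / (end - start)) - 1
print(f"Implied average growth: ~{growth:.0%} per year")   # roughly 37% per year

# Implied doubling time at that rate
doubling = math.log(2) / math.log(1 + growth)
print(f"Implied doubling time: ~{doubling:.1f} years")      # roughly 2.2 years
```

At that pace, the total volume of recorded information doubles roughly every two years, which is why each new decade dwarfs everything produced before it.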
Such information-rich environments are prone to information overload, a phenomenon in which an overabundance of information renders an individual, machine, or group unable to process it effectively. This can produce a range of stressors, with consequences at both the individual and group levels: psychological strain, "analysis paralysis," and reduced decision-making quality, for example (Lincoln). But the implications of information overload are not just incidental; they can also be weaponized by actors who wish to use the affordances of an increasingly information-rich world for their own ends. Dictators, rather than risk being accused of top-down censorship, can flood the information market with an overwhelming amount of content in order to "bury" any information they find undesirable. Hackers use denial-of-service (DoS) attacks to incapacitate network resources, flooding a site with traffic, overwhelming the host, and preventing real clients from accessing it. Social media harassment campaigns coordinate large collections of both bots and real people to target a specific victim. In these ways, actors, political and otherwise, can use the affordances of information overload, and the information economy in general, to control and manipulate political ideology and the experience of politics more broadly. They can also exploit the sense of confusion and chaos that often accompanies information overload, as they seek to manipulate a public that has become disillusioned with traditional notions of truth and the conduits through which they operate.
In this paper, I will first lay out the contours of the information economy and of information overload as a phenomenon, providing examples of how it has been weaponized by various actors in different contexts and situating it within a discussion of information control more generally. Next, I will explore how modern media systems express symptoms of information overload, including polarization and fragmentation. I will then provide some background on postmodern theory, linking it to the study of media and politics. Focusing on Russia specifically, I will examine its unique situation with regard to information control, applying concepts from postmodernism in order to understand the Russian media environment. In this section I explore how the Russian state has used techniques of information manipulation to construct spectacles of political theater, creating the impression of democracy while maintaining tight control over political narratives behind the scenes. Finally, I theorize about a potential consequence of information overload as a form of manipulation: the development of a "post-truth" society in which strong notions of "truth" lose favor among the general population in the face of destabilizing amounts of information. Amid the chaos and confusion of a world in which the very nature of objective reality is up for debate, I theorize that the vacuum of political organization is filled by authoritarian leaders. Rather than coalescing around party, ideology, or shared history, I believe people may rally around a strongman figure: a "postmodern authoritarian."
I. The Information Flood
Though the term was codified relatively recently, information overload is not a new phenomenon, nor is popular anxiety about its potential effects. The author of Ecclesiastes in the Hebrew Bible laments that "of making books there is no end" (Ecclesiastes 12:12). The first-century Roman philosopher Seneca remarked that "the abundance of books is distraction." This sentiment became particularly common in response to the explosion of writing that followed the invention of the printing press in the 15th century AD, as well as the increased book production of 18th-century Europe. Each time this kind of "information anxiety" set in, society developed ways of managing the growing volume of information; in post-printing press Europe, this manifested in the prevalence of encyclopedias and reference books (Blair).
Before turning to information overload's current manifestation, it is important to define and clarify terms. Often, "data" and "information" are used interchangeably, though there is an important distinction: the former refers to single, isolated facts detached from context, whereas the latter refers to collections of organized, interpreted data with applied meaning ("Data vs. Information"). This paper will use the terms "information" and "information overload," recognizing that these terms are more all-encompassing than "data." The literature on information also sometimes uses the term "knowledge," particularly when referring to the "knowledge economy" as a feature of the "information age" (Smith). Though "information overload" has many definitions and conceptualizations, this paper will adopt Speier et al.'s (1999) view that "information overload occurs when the amount of input to a system exceeds its processing capacity. Decision makers have fairly limited cognitive processing capacity. Consequently, when information overload occurs, it is likely that a reduction in decision quality will occur."
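Speier et al.'s definition can be restated as a simple capacity model. The sketch below is a minimal, hypothetical illustration (the numbers and the "fraction processed" proxy are my own assumptions, not drawn from their study): once the rate of incoming information exceeds processing capacity, a growing share of the input is simply never examined, which is one mechanism by which decision quality degrades.

```python
def fraction_processed(input_per_hour: int, capacity_per_hour: int, hours: int = 8) -> float:
    """Toy model: a decision maker can examine at most `capacity_per_hour` items
    per hour; anything beyond that piles up in a backlog and may never be read."""
    backlog = 0
    processed = 0
    for _ in range(hours):
        backlog += input_per_hour                   # new information arrives
        handled = min(backlog, capacity_per_hour)   # limited cognitive capacity
        processed += handled
        backlog -= handled
    return processed / (input_per_hour * hours)

print(fraction_processed(input_per_hour=20, capacity_per_hour=30))  # 1.0: everything gets examined
print(fraction_processed(input_per_hour=60, capacity_per_hour=30))  # 0.5: half the input is never seen
```

The point of the toy model is simply that overload is relational: the same volume of information is manageable or crippling depending on the processing capacity of the person, group, or machine receiving it.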
Information overload, while certainly not novel, has expanded in the last few decades as a result of the growth and spread of ICTs. The amount of information being produced is growing exponentially as we develop ever more ways of capturing, measuring, and expressing data. This can have a number of negative consequences for the body, including increased cardiovascular stress, psychological stress, frustration, and confusion (Shenk). It can also bring about a host of other mental pathologies: the inability to separate relevant information from irrelevant, worse decision-making, and "analysis paralysis," in which an overemphasis on analyzing information leads to the inability to make a decision (Melinat et al.). Faced with a torrent of incoming information, there seem to be two general reactions that can prove problematic: first, a filtering mechanism that excludes critical information, leading to worse outcomes; and second, a generalized anxiety and confusion about what is important or even true.
II. Information Overload as a Mechanism for Information Control
The ICT revolution had a significant effect on the news media system, which had traditionally been dominated by a small number of large newspapers and broadcast networks. In the U.S. and around the world, as social media provided a platform for ordinary citizens to voice their opinions and document events as they unfolded, journalists had to renegotiate their role as gatekeepers of information in the news environment (Chadwick). In this new networked system, essentially anyone with access can be an information provider, and the line between producer and consumer becomes blurred (Batalla and Ledzinska). This challenged the monopoly on information previously enjoyed by traditional media organizations, opening up channels for independent and citizen journalists, alternative news sources, and other actors (such as brands, politicians, and NGOs) who wished to contribute to and direct the conversation. The result was a weakening of trust in "old" media institutions and an influx of many, diverse "new" media actors. On top of this, the affordances of new media tools allowed for ubiquitous media consumption: 24-hour news cycles, constant monitoring of social feeds, and the expectation of always being "plugged in."
These changes made attention an incredibly scarce resource in the media system, and competition grew around attracting it. The ability to capture and maintain attention determined, to some extent, who was successful in this new paradigm (Goldhaber). It may also have contributed to the polarization and fragmentation of the media environment, and it opened up new forms of information control. As previously discussed, information overload necessitates mechanisms through which people can manage all the information; in the digital age, these include search engines, curators, thought leaders and influencers, and "listicles." The processes of filtering and curating information exclude information that may be important or relevant, or that provides balanced, contextual perspectives. Though empirical research so far is mixed, it has been theorized that this process may create a "bubble" effect in which media consumers stay siloed within communities of like-minded media participants, reinforcing and potentially intensifying their beliefs. When this occurs along the political spectrum, it leads to polarization between political ideologies (Kellogg Insight, "Surprising Speed"). Similarly, fragmentation refers to the wide range of beliefs people develop from the media system as a result of a diverse set of content providers. Fragmentation allows people to personalize their information diets, and thus makes mass political behavior difficult to predict (Tewksbury and Rittenberg).
But most importantly, these changes to the media system created new methods and opportunities for information control and manipulation, available both to ordinary people and to those in power. The affordances of new and digital media narrow the distance between media platforms and their consumers, and they produce a large amount of data that is easily exploitable. As just one example, Internet service providers can sell Internet browsing history to companies, which hyper-target their marketing down to the individual level (Gillula and Tummarello). In contrast to traditional media, new media gives more actors the opportunity to manipulate information, and it creates more opportunities for information to be manipulated in the first place. In the wake of the Arab Spring and other popular movements for democracy around the world that were aided by new media, it was a common refrain in the popular press that social media had a democratizing or levelling effect. As Howard and Hussain point out in Democracy's Fourth Wave?, attempts to censor or completely shut down the Internet during the Arab Spring protests usually failed, as citizens created work-arounds (Howard and Hussain). But this view neglects the many ways in which government leaders, authoritarian or otherwise, use the affordances of new media and digital technologies to serve their own goals. Increased social media use, particularly when used to coordinate political activities like protests, means more opportunities to monitor and track dissidents and opponents. Even if governments are not able to successfully limit or shut down Internet access, they can use online data to monitor and, in some cases, prosecute political actors they see as threatening (Morozov). This is what Rebecca MacKinnon terms "networked authoritarianism," a dynamic Sarah Oates examines in the post-Soviet context in Revolution Stalled (MacKinnon; Oates). When the Turkish intelligence agency hacked into the database of the messaging app ByLock months before the 2016 coup attempt, it was able to identify a network of dissidents that it then prosecuted (Reuters). During the coup attempt itself, President Erdogan used social media networks, which he had previously derided, to call for resistance against the coup, a real-time call to action that would have been difficult before digital media (Stein).
More specifically, information overload has created new opportunities for manipulation. For example, denial-of-service (DoS) attacks seek to take a site or network offline by sending an overwhelming number of service requests to the host, so that the server is jammed and legitimate traffic is prevented from reaching the resource. These attacks turn the easy, cheap nature of digital communication, and the data it can generate, into a weapon against online tools that embrace the open nature of the web. They are used in hacking and theft, but the technique is available to anyone with the technical know-how. Another technique that exploits information overload is astroturfing, in which an actor floods an information market with messages that support their views in order to give the impression of popularity. The messages can be genuine or manufactured (as with the use of bots), but the intention is the same: to appear more popular than may actually be the case (MacKinnon). Similarly, in what is known as third-generation Internet control, an actor can use an overwhelming volume of messages or information to bury or invalidate unwanted information (Oates). This technique focuses "less on denying access than successfully competing with potential threats through effective counter-information campaigns that overwhelm, discredit, or demoralize opponents" (Deibert et al.). Moreover, as Margaret Earling Roberts (2014) observes, states can engage in information flooding, in which they use the viral nature of new media to amplify the spread of pro-regime content, or information friction, in which they use the gatekeeping affordances of new media to make anti-regime sentiment inaccessible or disaggregated.
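To make the logic of flooding concrete, the toy simulation below (entirely my own illustration, not a description of any specific campaign) models a reader who only scans the top of a randomly ordered feed. Burying does not require deleting anything: injecting enough competing content is sufficient to push unwanted posts out of the reader's limited field of attention.

```python
import random

def chance_of_seeing(organic_posts: int, flood_posts: int,
                     attention_span: int = 50, trials: int = 2_000) -> float:
    """Estimate the probability that at least one organic (e.g. dissenting) post
    appears among the first `attention_span` items of a randomly ordered feed."""
    seen = 0
    for _ in range(trials):
        feed = ["organic"] * organic_posts + ["flood"] * flood_posts
        random.shuffle(feed)
        if "organic" in feed[:attention_span]:
            seen += 1
    return seen / trials

print(chance_of_seeing(organic_posts=5, flood_posts=100))     # ~0.96: the posts are usually visible
print(chance_of_seeing(organic_posts=5, flood_posts=10_000))  # ~0.02: the same posts are effectively buried
```

The same five posts go from nearly always visible to almost never seen, even though nothing has been censored in the traditional, top-down sense.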
It is also important not to ignore the wider consequences of a media system characterized by overwhelming amounts of information from an increasing number of sources and platforms. As previously discussed, mechanisms for narrowing down all this content do develop, and they have issues of their own. But the condition of being overwhelmed by information, particularly political and current-events information, may in and of itself be a flashpoint for manipulation. Sun Kyong Lee et al. (2016) show that those who pay attention to news through new and digital media platforms are more likely to feel the effects of information overload. Is it possible that this fosters a general malaise with regard to news media, and a disillusionment with its ability to deliver accurate, relevant information? Even more importantly, does it call into question the importance, and even the existence, of notions of "truth" in relation to politics?
III. Postmodern Authoritarianism: Power in a Post-Truth World
To explore this potential phenomenon, I turn to postmodern theory to examine conceptions of reality that diverge from scientific, rational certainty, particularly in heavily mediated environments. Broadly speaking, postmodernism is an intellectual movement that challenged modernism's reliance on objective truth, rationality, reason, and Enlightenment values. In its place, postmodernism emphasized pluralism, relativism, intertextuality, and skepticism: generally speaking, a disillusionment with hegemonic notions of reality. Within this paradigm, media is central to the construction of reality, as it increasingly becomes the lens through which we view and understand what is happening around us, just as Fredric Jameson conceived of postmodernism as "the transformation of reality into images" (Jameson). The postmodern condition, skeptical of universal objectivity, sees the construction of reality through media, which provide an interface between people and "what really happens," as the primary way in which people develop consensus.
The paradox here is that, in such a mediated world, the distinction between the medium and "reality" (if such a thing even exists) becomes blurred. In hyperreality, people are unable to distinguish between reality and a simulation of reality. According to Baudrillard, a simulation is the mixing of "reality" and representation, such that one cannot tell where one ends and the other begins; the simulacrum, on the other hand, is a copy without an original (Baudrillard). In a heavily mediated world, as media build on top of and in reference to one another, the supposed object of representation loses importance, and it is mediations of that object that become reality. In this way, notions of reality are developed through representations of it, in a manner that makes the original object irrelevant. Within such a system, a significant element of politics is the spectacle or performance of politics. Debord, in The Society of the Spectacle, highlights how many aspects of life are no longer experienced as they are, but are viewed through mediated forms or spectacles (Debord). Politics is likewise highly mediated: presented as entertainment on 24-hour news channels and social media feeds, blended into television commercials for companies that wish to stake out a social stance, and played out in politicians' photo ops with factory workers, small business owners, and constituents. Such highly mediated pseudo-events exist to represent the workings of politics, but in practice become the politics themselves (Boorstin). According to postmodern theorists, the blurring of lines between "the image" of politics and its actuality can prevent the general public from engaging meaningfully in politics, and it can foster disillusionment with dominant conceptions of reality. Jameson warns of a condition of "schizophrenia" in which the overwhelming, jumbled nature of modern media in a system of "late capitalism" results in the inability to form genuine self-identities, leading to cultural confusion and general malaise (Jameson).
To illustrate these postmodern concepts as applied to media systems, and to link them back to the concept of information overload, I turn to Russia's unique political and media environments. Without providing a full diagnosis of Russia's media system, which would have to take into account complicated historical circumstances and current policy, I use Russia merely as a case study to highlight manifestations of the weaponization of information and information overload. The underlying assumption is that Russia, while nominally a democracy, functions more like an authoritarian state (Sakwa). Thus, I argue, the state is able to co-opt ICTs and exploit their effects in order to maintain a tight monopoly on political power. Most importantly, I link information manipulation and information overload to the concept of "postmodern authoritarianism": the consolidation of power as an effect of a postmodern media and political environment.
Peter Pomerantsev has called modern-day Russia a "postmodern dictatorship," in which the state uses "simulated institutions and simulated narratives" to create the illusion of democracy and freedom (Pomerantsev, "Postmodern Dictatorship"). This cynicism, inherited from the late and post-Soviet periods, reflects a deeper disillusionment with notions of truth: many Soviet citizens had to juggle the public profession of belief in the Soviet model with their private insecurities and resentments. The swift change from Soviet-style communism to Western free-market capitalism, two polarized extremes, each requiring its own very different narrative, fostered a general confusion about how society should operate. On top of this, holdover power structures centered on the Kremlin continued to influence politics in the newly "democratic" Russia. The result was managed democracy, in which the centralized state held a monopoly on political power but used the trappings of democracy, including supposedly free and fair elections, an active media environment, and competing political parties, to give the impression of freedom (Pomerantsev, "Postmodern Dictatorship").
Russia's media system may be active, but this does not mean it is not used as a mechanism through which the state strengthens its control over the country. In fact, it often actively presents politics as spectacle, allowing the impression of democratic participation while carefully maintaining a tight script over political events. Such was the case with the Snob/Zhivi media house, founded by the oligarch Mikhail Prokhorov as a politically oppositional media company appealing to Westernized Russians. Snob was allowed to exist because it presented an easily co-opted political alternative to President Putin and the ruling party, allowing dissidents to "let off steam" without posing a real threat to existing power structures (Pomerantsev, "Postmodern Dictatorship"). The phenomenon of spectacle also manifests in the state's creation of political parties. Vladislav Surkov, one of Putin's top aides and known as the "political technologist" of managed democracy, has been involved in the creation and funding of various political parties across the spectrum. These include United Russia, the ruling center-right party, as well as the center-left, social-democratic Just Russia (Sakwa). There is also evidence that he has supported both far-left and far-right political movements, as well as both human-rights NGOs and nationalist organizations, raising the question of how the state benefits from supporting oppositional parties, particularly those on the extremes of the political spectrum (Pomerantsev, "Putinism"). Keeping in mind the concepts of spectacle and simulacrum, it becomes clear that the state supports all of these competing narratives in an attempt to own all of them. Rather than suppress such oppositional sentiments within society, it is far more effective to allow them to exist, but in highly managed ways. Politics is thus not a competition between factions with different visions, but merely a reenactment, a performance of ideology existing within the framework of an all-encompassing Russian state.
We also see the simulation of politics in what Gregory Asmolov terms "virtual Potemkin villages," the illusion of statehood created through the use of ICTs (Asmolov). In the wake of the Soviet Union's collapse, Russia was highly corrupt, and the central government had little power to enforce its decisions in the vast regions beyond Moscow. To centralize and consolidate power, Putin created a "power vertical" that gave him almost full control of the country. While intended to improve the Kremlin's ability to govern and provide services to the whole country, it actually highlighted the government's inability to manage so large a nation. Instead, the state uses ICTs to create the appearance that it is providing necessary services when it is not, falling into a long tradition of the Russian state using spectacle as political strategy (Asmolov).
The philosophy behind this form of managed democracy is distinctly postmodern. According to Richard Sakwa, "Surkov's philosophy is that there is no real freedom in the world, and that all democracies are managed democracies, so the key to success is to influence people, to give them the illusion that they are free, whereas in fact they are managed. In his view, the only freedom is 'artistic freedom'" (Sakwa). An information-rich media system lends itself to such a condition, because a highly mediated environment provides more opportunities to influence consumers. In discussing third-generation Internet control, Sarah Oates highlights how, by increasing its citizens' access to the Internet, the Russian state is actually helping to inculcate pro-regime sentiment, contrary to the popular view that the Internet inevitably breeds political opposition and rebellion. This is because those who do access the Internet are exposed to overwhelmingly pro-regime content, increasing their sympathies for the government. Of course, the Internet is also regulated in ways other than top-down censorship in order to create a generally regime-friendly atmosphere, another manifestation of managed democracy and the simulation of politics (Oates).
Instead of using top-down censorship to remove unwanted political information from the media environment, as most regimes do, the Russian state uses the affordances of new media to carefully control the space. Rather than attack the supply, the state manipulates the flow of information and influences the demand for certain content. As mentioned before, this can include astroturfing, in which the state floods the information market with overwhelming amounts of pro-regime sentiment to create the illusion of support and drown out any unwanted content. It can also run extensive targeted campaigns to discredit, embarrass, or overwhelm people posting anti-regime content, discouraging such posts in the first place (self-censorship) and creating the impression of a unified pro-regime public. Instead of trying to eliminate "bad" information from the space, the Russian regime understands that dominating the Internet with extensive counter-narrative campaigns is the more effective way to win the information war. This approach also provides the appearance of freedom, lending credence to Russia's claim to be a democracy, while in reality nurturing an environment that is extremely hostile to anti-regime sentiment (Oates).
So far, I have discussed the direct, first-order effects of the use of information and information overload in the media space, including how the various affordances of ICTs and the new media system facilitate the use of information as a tool in political conflict. But I would like to go a step further and theorize about the potential indirect, second-order effects of information overload in media ecosystems, particularly with regard to conceptions of "truth" in society. The crux of this theory is that, over time, overwhelming amounts of information in the media system foster an anxiety about our inability to digest and comprehend the vast amounts of information in our world, similar to Jameson's conception of "schizophrenia" with regard to cultural identities. This could lead either to the splintering and fragmentation of society, as people personalize their understanding of the world around them, or to a sort of postmodern ennui, a disillusionment with the concept of objective truth more generally. In both cases, no overarching societal narrative acts as a cohesive force for a wildly diverse population. Social strife resulting from a lack of civic unity, coupled with shaky notions of objective "truth," leaves the population particularly vulnerable to social control and manipulation. In such a "post-truth" world, what is the organizational unit of politics, if not ideology or party? Is it the "postmodern authoritarian," who believes in nothing but power?
Modern media environments rely on vast amounts of information, which can often be manipulated. In our attention economy, those who are most able to receive, maintain, and direct attention wield more power than ever before. According to danah boyd, "A new form of information manipulation is unfolding in front of our eyes. It is political. It is global. And it is populist in nature. The new media is being played like a fiddle, while decentralized networks of people are leveraging the ever-evolving networked tools around them to hack the attention economy" (boyd). There is a world of difference between the decentralized hackers of whom boyd speaks and a centralized authoritarian state such as Russia's. But I believe these techniques can be co-opted by authoritarians, just as other new media methods of information manipulation are easily co-opted and re-appropriated. And as these techniques evolve, what will be the public's reaction to the awareness of manipulation, to the realization that their politics is a highly mediated spectacle? I am limited by a lack of empirical research on this potential trend, which makes it impossible to draw a firm conclusion. It may even be the case that media polarization creates stronger adherence to notions of "truth" (just notions of "truth" that are sharply opposed to one another). But my hypothesis is that one response to a "post-truth" media environment is, much like Jameson's conception of cultural "schizophrenia," a postmodern skepticism about grand narratives of "truth." The result would be a decline in the importance of political ideology and a general reorientation of politics around something that can provide the certainty, stability, and visionary narrative people seek in its place: charismatic authoritarian leaders.
IV. Conclusion
The proliferation of ICTs has been instrumental in the creation of media systems characterized by vast amounts of information. One side effect has been to exacerbate information overload, in which an individual or group is unable to process all of the information fed to it, leading to analysis paralysis and poor decision-making. Information overload, and the proliferation of information more generally, can be weaponized by various actors within society, including the state, often through techniques like astroturfing and third-generation Internet control, which rely on the affordances of new media to manipulate information. The Russian state, specifically, has utilized these methods of information control, and it provides a real-life illustration of the kinds of political manipulation theorized in postmodernism, such as spectacle, simulacrum, and simulation. Using highly mediated and information-rich environments, the state has created the impression of democracy while maintaining tight political control behind the scenes. I theorize that, following principles of postmodernism, one potential long-term reaction to the confusion and anxiety wrought by information overload and continued information manipulation could be a disillusionment with notions of "truth" and objectivity in politics, and the realignment of politics, in the absence of ideology or party, around authoritarian leaders: the "postmodern authoritarian."
Works Cited
Arendt, Hannah. “Truth and Politics.” The New Yorker, New York, NY, 25 Feb. 1967.
Asmolov, Gregory. “The Kremlin’s Cameras and Virtual Potemkin Villages: ICT and the Construction of Statehood.” Bits and Atoms: Information and Communication Technology in Areas of Limited Statehood, edited by Steven Livingston and Gregor Walter-Drop, Oxford University Press, 2014.
Batalla, Jordi Mongay, and Maria Ledzinska. “On Reducing the Detrimental Information Flood in the Use of the Internet.” Problems of Education in the 21st Century, Vol. 28, 2011, pp. 84-95, http://www.scientiasocialis.lt/pec/files/pdf/vol28/84-95.Batalla_Vol.28.pdf.
Baudrillard, Jean. Simulacra and Simulation. University of Michigan Press, Ann Arbor, MI, 1994.
Blair, Ann. “Information Overload, Then and Now.” The Chronicle of Higher Education, 28 Nov. 2010.
Boorstin, Daniel J. The Image: A Guide to Pseudo-Events in America. Vintage Books, New York, NY, 1992.
boyd, danah. “Hacking the Attention Economy.” Points.datasociety.net, 5 Jan. 2017, https://points.datasociety.net/hacking-the-attention-economy-9fa1daca7a37#.63deal7xi.
Chadwick, Andrew. The Hybrid Media System: Politics and Power. Oxford University Press, Oxford, 2013.
“Data vs. Information.” Diffen, http://www.diffen.com/difference/Data_vs_Information.
Debord, Guy. The Society of the Spectacle. Black & Red, Detroit, MI, 1970.
Deibert, Ronald, John Palfrey, Rafal Rohozinski, and Jonathan L. Zittrain. Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. The MIT Press, Cambridge, MA, 2010.
Gillula, Jeremy and Kate Tummarello. “Hollow Privacy Promises from Major Internet Service Providers.” Electronic Frontier Foundation, 18 Apr. 2017.
Goldhaber, Michael H. “The Attention Economy and the Net.” First Monday, vol. 2, num. 4, 7 Apr. 1997, http://firstmonday.org/ojs/index.php/fm/article/view/519/440.
Howard, Philip N. and Muzammil M. Hussain. Democracy’s Fourth Wave?: Digital Media and the Arab Spring. Oxford University Press, New York, NY, 2013.
Jameson, Fredric. “Postmodernism and Consumer Society.” Reprinted from a lecture delivered at the Whitney Museum, 1982.
Lee, Sun Kyong, Kyun Soo Kim, and Joon Koh. “Antecedents of News Consumers’ Perceived Information Overload and News Consumption Patterns in the USA.” International Journal of Contents, Vol. 12, Issue 3, Sep. 2016, pp. 1-11.
Lincoln, Anthony. “FYI: TMI: Toward a holistic social theory of information overload.” First Monday, vol. 16, num. 3, 7 Mar. 2011, http://firstmonday.org/ojs/index.php/fm/article/view/3051/2835.
MacKinnon, Rebecca. “China’s ‘Networked Authoritarianism.’” Journal of Democracy, Vol. 22, Issue 2, 2011, pp. 32-46.
Melinat, Peter, Tolja Kreuzkam, and Dirk Stamer. “Information Overload: A Systematic Literature Review.” Perspectives in Business Informatics Research, BIR 2014, vol. 194, pp. 72-86, doi: 10.1007/978-3-319-11370-8_6.
Morozov, Evgeny. The Net Delusion: The Dark Side of Internet Freedom. PublicAffairs, New York, NY, 2012.
Oates, Sarah. Revolution Stalled: The Political Limits of the Internet in the Post-Soviet Sphere. Oxford University Press, New York, NY, 2013.
Pomerantsev, Peter. “Russia: A Postmodern Dictatorship?” Legatum Institute Transitions Lecture Series, Oct. 2013.
Pomerantsev, Peter. “The Hidden Author of Putinism.” The Atlantic, The Atlantic Monthly Group, 7 Nov. 2014.
Reuters. “Turkey coup plotters' use of 'amateur' app helped unveil their network.” The Guardian, Guardian News and Media Limited, 3 Aug. 2016.
Roberts, Margaret Earling. “Fear, Friction, and Flooding: Methods of Online Information Control.” Doctoral dissertation, Harvard University, 2014.
Sakwa, Richard. “Surkov: dark prince of the Kremlin.” openDemocracy, 7 Apr. 2011, https://www.opendemocracy.net/od-russia/richard-sakwa/surkov-dark-prince-of-kremlin.
Shenk, David. Data Smog: Surviving the Information Glut. HarperCollins Publishers, New York, 1997.
Smith, Keith. “What is the ‘Knowledge Economy’? Knowledge Intensity and Distributed Knowledge Bases.” The United Nations University Institute for New Technologies (INTECH) Discussion Paper Series, 2002, http://www.intech.unu.edu/publications/discussion-papers/2002-6.pdf.
Speier, Cheri, Joseph S. Valacich, and Iris Vessey. “The Influence of Task Interruption on Individual Decision Making: An Information Overload Perspective.” Decision Sciences: A Journal of the Decision Sciences Institute, Vol. 30, Issue 2, Mar. 1999, pp. 337-360, doi: 10.1111/j.1540-5915.1999.tb01613.x.
Stein, Aaron. “Inside a Failed Coup and Turkey’s Fragmented Military.” War on the Rocks, 20 Jul. 2016.
Tewksbury, David and Jason Rittenberg. News On the Internet: Information and Citizenship in the 21st Century. Oxford University Press, New York, NY, 2012.
The Bible. Ecclesiastes 12:12. The New Oxford Annotated Bible, 3rd ed., Oxford UP, 2001.
“The Surprising Speed with Which We Become Polarized Online.” Kellogg Insight, Kellogg School of Management, 6 Apr. 2017.