Digital pollution | The Indian Express
Ongoing police investigations to identify the culprits behind the condemnable "Bulli Bai" and "Sulli Deals" apps, which "auctioned" several prominent and vocal Muslim women, involve people born around the turn of the century. On the face of it, this indicates that digital natives are not immune to issues such as misinformation, hate speech and the potential for radicalization that plague our information spaces. But placed in the larger context of decreasing social cohesion in Indian society, the fact that such apps were created at all forces us to frame our understanding in a way that can steer us towards the right set of interventions over the long term.
To understand how we got here, we must begin by examining the effect of new media technologies developed over the past 20 years on our collective behaviors and identities. These technologies have changed the scale and structure of human networks and have led to an abundance and virality of information. Social scientists hypothesize that these rapid transitions alter the way individuals and groups influence each other within our social systems. The pace of technological change, coupled with the speed at which these influences diffuse, also means that we do not fully understand the changes, nor can we predict their outcomes. Other scholars have emphasized the effects of these technologies on the evolution of individual, political, social and cultural identities. These identities can be consciously or unconsciously shaped by our interactions, and therefore affect how we process information and react to events in digital and physical spaces.
Our identities are ultimately about our cognitive processes: arguments against our defining values can activate the same neural pathways as the threat of physical violence. The rise of social media has been linked to the strengthening of personal social identities at the cost of increased intergroup divisions. Some have suggested that personalized streams in new media technologies trap us in "echo chambers," reducing exposure to other viewpoints, while other empirical work shows that people on social media gravitate towards like-minded people despite frequent interaction with ideas and people they disagree with. People can also self-select into groups that reinforce their beliefs and validate their actions. We still need to better understand the broader psychosocial effects, especially in the Indian context. Experience, however, suggests that when these beliefs are rooted in prejudice and resentment against a specific group of people, the feedback loops of social confirmation and validation can lead to violence. Even pockets of disconnected action, when repeated and widespread, can destabilize delicate socio-political relationships built over decades.
The harms resulting from increasing levels of polarization and radicalization are primarily analyzed through the prism of disinformation and hate speech, a framing that gives primacy to motives. This leaves room for certain actors to escape responsibility, since motivations can be judged subjective, and for others to remain unaware of the downstream consequences of their actions; often, even actions taken with good intentions can have unpredictable and harmful results. The information ecosystem metaphor, proposed by Whitney Phillips and Ryan M. Milner, likens today's information dysfunction to environmental pollution. It encourages us to prioritize results over motives: we should be concerned with how pollution spreads, not with whether someone intended to pollute. It also makes us realize that the effects of pollution worsen over time, and that attempts to ignore, or worse, exploit this pollution only exacerbate the problem, not just for those who suffer from it, but for everyone.
We usually focus on those who command the most ratings, have the loudest voices, or say the most egregious things. While they are important, ignoring or minimizing the role of everyone else, or viewing them as passive and malleable audiences, risks overlooking the participatory nature of our current situation. Big and small polluters feed off the actions and content of others on social media, in traditional media and in physical spaces. Distinctions between 'online' and 'offline' effects or harms are often neither clearly categorizable nor easily distinguishable: 'online' bullying is bullying. Actors as varied as bored college students, local political aspirants, content creators and influencers, national-level politicians and others seeking influence all participate in the information ecosystem. Their underlying motivations can range from the mundane (FOMO, seeking entertainment, fame) to the sinister (the organized, systematic and collaborative dissemination of propaganda and hate) to the performative (virtue signaling, projection of power, ability or expertise). The interactions of these disparate sets of actors and motivations result in a complex and unpredictable system, composed of multiple reinforcing and self-dampening cycles, in which untested interventions can have unexpected and unintended consequences.
Several have called for action by platforms to combat hate speech. But content moderation should be considered a late-stage intervention. Individuals must be stopped early on the path to radicalization and extremist behavior to prevent the creation of apps such as Bulli Bai. This is where steps such as counter-speech, tactics that counter hate speech by presenting an alternative narrative, can play a role and should be further explored in the Indian context. Counter-speech could take the form of messages aimed at creating empathy by humanizing the people targeted; enforcing social norms around respect or openness; or defusing a dialogue. Notably, this excludes fact-checking: when people have strong ideological dispositions, challenging their narratives on grounds of accuracy alone can have limited effectiveness. Since behaviors in online and physical spaces are linked, community action and in-person outreach can also help. Social norms can be transmitted through families, friends and educational institutions. Influencers and people in leadership positions can have a significant impact on the development of these norms. At such times, the signals that political leaders and state institutions send are particularly important.
Prabhakar is Head of Research at Tattle Civic Tech. Waghre is a researcher at the Takshashila Institution, where he studies India’s information ecosystem and the governance of digital communication networks.