The Good Web

Over the past two decades, social media has moved from a space of aspiration for people who study social change to a space of profound anxiety. If we believe the news headlines, social media is responsible for many of the major scourges of modern life. It is coarsening our political discourse, leading to polarization and division. It exposes vulnerable people to extremist ideology, driving them toward violence. It is addictive and can damage our body image and sense of self-worth.

If all this is indeed true, it’s odd that our society has not chosen to ban this dangerous new technology. In reality, social media is complicated. Professor Casey Fiesler of the University of Colorado, Boulder, has observed, “Social media is really good for people, and social media is really bad for people. Those two things can be true at the same time.” My research at the University of Massachusetts Amherst, on a project called the Initiative for Digital Public Infrastructure, suggests that we can work to shape social media into a pro-social force, not simply push back against the excesses of the form.

Putting the Public Interest in Front of Technology
For people isolated and lonely, social media presents a crucial lifeline and connections to other people. It can be life-affirming and transformative for people whose gender identity, sexuality, or interests are not well supported by their local community. It has helped amplify the voices of people historically excluded from media dialogue, including people of color, queer people, and people with disabilities. In notable cases like the Arab Spring, social media has provided support for revolutionary political movements, whose participants have used online tools to expose and oppose authoritarian dictatorships.

The potentials of social media, and the distance between the real harms and potential benefits, mean a focus on improving the space is a high priority for advancing social justice broadly and the emerging field of public interest technology specifically. As this series has pointed out, many people working within public interest tech find themselves engaged in the complex work of trust and safety, trying to minimize known harms of these online spaces. Others in the field research the algorithms that increasingly shape our experience of social media platforms. Some work with regulators and legislators to design guidelines that could blunt the most damaging extremes of social media.

Pending legislation in the European Union, like the Digital Services Act (DSA) and the Digital Markets Act (DMA), seeks to make social media platforms more transparent, giving users and legislators insight into the algorithms that promote some content and demote other content. Proposed regulations also seek to outlaw so-called dark patterns: psychologically manipulative techniques designed to make it difficult to leave a service or to keep users engaged with social media platforms, in much the same way that gambling machines seek to keep gamblers in the casino.

This focus on mitigating the harms of social media reflects a recognition that platforms like Twitter, Facebook, and YouTube now play a central role in our public sphere. Social networks are now the spaces in which people process, unpack, interpret, discuss, and debate the events of the day and what we as a society should do in reaction to them. As such, they have taken on a key role in our democracy.

Yet what well-meaning regulatory proposals lack is a vision of social media that could be good for society. At best, these regulatory approaches seek to make existing social media less awful. But an emerging movement that we might call “the Good Web” envisions the possibility of social media that has a salutary role in the public sphere. What’s less clear is which of several dueling visions of the Good Web might lead us to a healthy social media environment.

My interest in the future of social networks connects to my personal history. In the late 1990s, I helped build one of the internet’s first communities centered on user-generated content, Tripod, which served as a precursor to early social networks like MySpace. Some of the decisions my team and I made in the 1990s—from how we moderated content to the creation of the hated pop-up ad—contributed to the problems of contemporary social media. My work now as a scholar is focused on finding ways social media might meet its original goals of opening the internet to widespread and diverse participation. In this article, I explore four visions for the Good Web and draw lessons from each that can help social change leaders improve society.

Facebook Knows Best: The Centralized Web
One group with a great deal at stake in this reimagining of the social media landscape is the existing platforms. In their preferred future, they would continue to be in charge of the most popular social media spaces online. The Good Web would be realized through renovations of existing spaces and a new wave of completely novel ones. You can see this in Facebook's rebranding as Meta. One way to understand Mark Zuckerberg’s fascination with the Metaverse is as a desire to be rid of the existing problems with Facebook: spam, extreme speech, mis- and disinformation, and conflicts between users. Zuckerberg imagines a 3D future in which users wearing headsets made by his company interact with software made by his company, to buy games and other digital goods sold by his company in a universe entirely controlled by his company.

It's unclear whether Zuckerberg understands how to solve any of the new problems likely to arise in 3D interaction. Indeed, a few weeks after Facebook released a beta version of its Metaverse, users started reporting novel forms of online harassment, including virtual groping, in which male avatars would sexually assault female avatars. It seems obvious that platforms that have challenges moderating their existing online spaces would face new and more severe challenges as they move into new frontiers. And given a long history of misogyny in online gaming spaces, as manifest in harassment campaigns like Gamergate, it did not require much expertise to predict that violence against women would be a major problem in a game-centered Metaverse platform.

Even with these challenges, existing platforms have an enormous advantage over new entrants to the field. Facebook has actually done a remarkable job of policing certain types of content, building costly and hard-to-replicate infrastructure in the process. Child sexual abuse material (CSAM, sometimes incorrectly termed “child porn”) is frequently posted to social media. Facebook, along with other major platform providers, has created databases of fingerprints of known CSAM imagery. These databases allow Facebook and partner companies to quickly identify, block, and report CSAM to law enforcement.
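To make the mechanism concrete, here is a minimal sketch of how matching uploads against a shared fingerprint database works. The `KNOWN_FINGERPRINTS` set and function names are hypothetical, and a plain cryptographic hash stands in for the perceptual hashes (such as PhotoDNA) that real systems use to match altered copies of an image:

```python
import hashlib

# Hypothetical database of fingerprints of known abusive imagery, standing in
# for the shared industry databases described above. Real systems use
# perceptual hashes that survive resizing and re-encoding; SHA-256 is used
# here only to keep the sketch self-contained.
KNOWN_FINGERPRINTS = {
    "3a1f9c...",  # placeholder entries; real databases hold millions of hashes
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (SHA-256 as a stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_and_report(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    blocked and reported to law enforcement rather than published."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```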

There is the possibility of abuse of these databases. Civil libertarians note that a similar database of violent extremist content needs to be carefully monitored to ensure that it is not trampling freedom of expression. But these large platforms have demonstrated the ability to combat some of the worst content in social media at scale. Recently, Amazon’s video streaming service Twitch identified and removed a live stream of the Buffalo, New York, shooter within two minutes, evidence that these platforms are beginning to handle extreme content at scale. What is less clear is whether, as Facebook moves into the Metaverse and Twitter moves into audio conversation spaces, these platforms understand how to tackle the everyday hostility and harassment that characterize them.

Put Us in Charge: The Deplatformed Web
A second group of Good Web innovators are the deplatformed: groups that have been kicked off mainstream platforms because their content is not acceptable to the guardians of those spaces. The Australian activist group Assembly Four is a collective of sex workers and programmers who came together to create an online space for sex workers in the wake of US legislation. SESTA-FOSTA, the legislation that sought to combat sex trafficking in the United States, had the side effect of making US-hosted social media platforms hostile to sex workers even in countries where sex work is legal. In the part of Australia where Assembly Four operates, sex work is legal, and the people behind Assembly Four built an alternative platform called Switter.At, short for “Sex Worker Twitter,” which at its peak was used by 420,000 people. Critically, this platform was designed with and for sex workers, with consideration of their health and safety in its design. It provided an essential space for sex workers to share information about dangerous clients and protect each other’s welfare. Unfortunately, recent Australian legislation forced the closure of the platform in early 2022, but it remains an example of why it can be so valuable for groups chased off of existing platforms to be able to create their own social media spaces.

Unfortunately, the same techniques that worked for Assembly Four work for more problematic groups. Gab.Ai, a site designed to host far-right extremists who have been banned from Twitter and other social networks, uses the same software and architecture used by the creators of Switter. It has quickly become an extreme and troubling online space.

Some of the language Gab.Ai’s founders have used to describe the project is deeply familiar to critics of the economic models of social media platforms, including the focus on finding alternatives to “surveillance capitalism” and hopes to support the platform through a combination of non-targeted advertising and subscription revenue. Gab’s intent to create a surveillance-free social media space supported by subscription and broadly targeted advertising is laudable, even as many of the views presented on the site are utterly reprehensible.

Parts of Gab.Ai's model have been adopted by others, including former President Trump, who founded Truth.Social after being banned from numerous social networks in the wake of comments inciting the violence that unfolded on January 6, 2021. Such networks promise resistance to censorship but have often limited the speech of ideological rivals. What matters most in these communities is who is in charge and what rules they choose to propagate.

Despite the variability in governance and intent, the Deplatformed Web deserves attention because its users are deeply passionate and engaged in their communities. Often these new communities are able to scale rapidly because they are meeting the needs of users who can’t congregate elsewhere. At best, these platforms learn from the needs of their communities and create novel forms of interaction to support them. But another branch of the Good Web worries about the dangers of putting opinionated leaders in charge of a community’s rules and their enforcement.

Put Nobody in Charge: Web3
A third group attempting to build the Good Web is a set of cryptocurrency enthusiasts who have recently branded themselves Web3. Tim O’Reilly popularized the term Web 2.0 to describe the participatory internet enabled by platforms like Facebook and Twitter. In Web3, the users of platforms will also be their owners, and therefore, proponents believe, the problems of Web 2.0—notably the concentration of corporate power—will be undone by a novel system of token-based democracy.

In Web3 systems, platforms are managed democratically, and votes are attached to tokens tracked on a blockchain. These tokens are awarded to individuals who support these communities, donating resources like computing power, producing high-quality content, or writing code. Tokens are both votes in a system called a Distributed Autonomous Organization (DAO), as well as coins that can be traded and sold on exchanges.
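As a concrete illustration of the token-weighted voting described above, here is a minimal sketch. The member names, balances, and ledger-as-dictionary are assumptions made for the example; a real DAO records all of this on a blockchain:

```python
from collections import defaultdict

# Token balances each member has earned or purchased; in a real DAO these
# would be recorded on a blockchain rather than in a dictionary.
balances = {"alice": 100, "bob": 40, "carol": 5}

def tally(votes: dict[str, str]) -> dict[str, int]:
    """votes maps each member to a proposal choice; every token counts as one vote."""
    totals = defaultdict(int)
    for member, choice in votes.items():
        totals[choice] += balances.get(member, 0)
    return dict(totals)

# Alice's 100 tokens outweigh Bob and Carol combined, showing how
# "one token, one vote" can diverge from "one person, one vote".
print(tally({"alice": "reject", "bob": "approve", "carol": "approve"}))
# {'reject': 100, 'approve': 45}
```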

The vision underlying Web3 is not a particular model of social media. There is no dominant vision of what rules should apply. Instead, it is a vision about the value of markets in helping individuals create community rules. The assumption behind Web3 is that the market will create rival platforms, and users will gravitate toward the platforms that best meet their needs, where they feel they can steer the conversation through their tokens.

There’s very little evidence that adding market mechanisms to social media makes things better. Indeed, in general, adding markets into social media seems to be responsible for some of the most serious problems, including spam. Moreover, the democratic vision behind Web3 is a particular form of democracy, far from one person, one vote. It is a corporatized democracy, in which one share equals one vote. It is not hard to imagine situations in which platforms have their rules hijacked by someone who purchased enough votes to be able to steer the direction of the community. Indeed, this has already happened with a prominent platform, Steemit.

Still, Web3 proponents have two things going for them. First, their communities are self-financing: there is sufficient interest in these token systems that projects are springing up based on people’s willingness to invest in the currencies that support them. Second, there is a large, passionate group of young software developers and entrepreneurs who feel alienated from existing social media systems and are ideologically invested in creating this next generation of tools. It’s unclear whether these advantages can overcome the substantial downsides of blockchain-based systems, which include their egregious environmental footprints and a culture of fraud and grift. But it seems unwise to dismiss the Web3 movement, given the energy and passion its adherents are bringing to the task of reforming social media.

Think Small: Decentralized Social Networks
Finally, there are advocates for small networks. Unlike the Web3 camp, which believes in distributed networks in which no one is in charge, this group envisions a world of many small networks, each of which is controlled by an individual, an institution, or a small group steering the community’s rules and culture.

PubHub, a new network being created in the Netherlands by Dutch academics, envisions a system of networks based on important institutions in people’s lives. Your town’s local government might maintain a PubHub, or your local soccer club, or your children’s school. Each group makes its own rules of the road for social interaction, and a different group of people could be in charge, from someone appointed by your local government to a group of volunteers who take on management of the online community, in the same way that parents volunteer for the parent-teacher association.

These small networks are not meant to replace Facebook or Twitter but to complement them, creating spaces in which focused civic discussions can occur with a high degree of privacy and outside of the existing surveillance economy. What is unclear about these small networks is whether they will be able to attract substantial participation. Many small networks have arisen before to challenge the dominance of global networks; they tend to thrive briefly and die. Almost by definition, small networks produce less content than large ones—a thriving town discussion group might produce a dozen messages a day, while Twitter might serve me thousands of tweets in the same period. Users often forget to check these networks and abandon them when they prove less entertaining than large social networks, since their goal is not entertainment but strengthening community ties.

One possibility is that these small social networks could invest in building tools that allow a single client program on a user’s phone or laptop to interact with both these new networks and existing ones, giving the user more control of her presence across the social media sphere. Such a client could let her choose the algorithms that deliver social media content while integrating these new small networks with the large, established ones. For this vision to succeed, existing networks would need to make their software interoperable with the new networks, something that is unlikely to occur without legislation or regulation. My lab at UMass Amherst works on software called Gobo, designed to give users more control over the algorithmic filtering present on centralized social networks and, potentially, to integrate small social networks.
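As an illustration of what such a client might look like, here is a minimal sketch. The `Post` fields, network adapters, and ranking rule are assumptions made for the example; they do not reflect Gobo’s actual code or any platform’s real API:

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Post:
    network: str    # e.g., a town hub or a large commercial platform
    author: str
    text: str
    timestamp: float

def merged_timeline(
    feeds: Iterable[Callable[[], list[Post]]],
    rank: Callable[[Post], float],
) -> list[Post]:
    """Pull posts from each network adapter and order them with a
    user-chosen ranking function rather than a platform-chosen one."""
    posts = [post for fetch in feeds for post in fetch()]
    return sorted(posts, key=rank, reverse=True)

# Example (hypothetical adapters): a strictly chronological feed that mixes
# a small town network with a large established platform.
# timeline = merged_timeline([fetch_town_hub, fetch_big_platform],
#                            rank=lambda post: post.timestamp)
```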

Everyone Believes They Are Building the Good Web, and They Are.
Although my lab and I are heavily invested in small social networks and clients to allow interoperability, the point in outlining these visions for the Good Web is not to advocate for one over another. Instead, the goal is to help everyone focused on social change—from funders supporting new tech ventures to the public interest technologists who will help design and create future technology—understand that building better social media is both possible and essential.

Rather than argue about which of these visions for the future of social media is most promising and deserves support, perhaps we can find consensus around the idea that we need not just to fix existing social platforms but to envision and build new ones. Refocused on this mission, we can take lessons from all four of these rival camps.

From the large, centralized platforms, we have the ability to learn from billions of points of data and to create infrastructures that could help us combat existing problems of social media, like child sexual abuse material and violent extremism. From the deplatformed, we have examples of passionate communities, designing tools to meet their own needs, that would otherwise be unanticipated and unmet by mainstream platforms. From the Web3 camp, we have the important reminder that economic models matter and that giving a community ownership of the place where it engages in speech is a way to ensure ongoing investment and participation in those conversations.

Finally, from the small social networks, we have a reminder of how human beings actually socialize. There are no communities of three billion people, as Facebook likes to describe its user base. Instead, we are members of dozens of small communities, each with its own rules, practices, and governance structures. Creating social media that works more like human society may give us clues on how to create these healthy spaces.

The common ground behind the Good Web is the idea that social media must be taken seriously, not just as a problematic space in need of regulation, but as a space that could ultimately help us be better neighbors, voters, and advocates for social change. What we must take from this conversation is the notion that it is not enough to fix existing social media. Instead, we must imagine, experiment with, and build social media that can be good for society.
