Build your own community on Discord
Initially meant as a platform for online gaming, the text and voice chat app Discord has become much more than that. It allows communities to sprout and conversations to spark. But where do we draw the line on the self-sovereignty that comes with it?
Discord’s promotional video lays down the basic principles of the platform: a space for users to create “your server, with your rules, with the people you want”. Positioned as an antidote to social media, Discord proposes a new way to converse online. Although primarily focused on the gaming community, it has grown well beyond it.
The platform originates from the need for voice chat in online gaming. During multiplayer games, teams or individual players benefit from communicating during gameplay: they can discuss strategies, ask for directions within the game, or simply chat with like-minded gamers. While an application like Skype offers the basics for this, Discord decided to step it up a notch, offering features that support gameplay and conversation at the same time.
Besides voice chat, Discord offers much more: it is both a text and voice application. Similar to the workplace application Slack, conversations are divided into channels. Upon joining a server, users are presented with whichever text and voice channels the moderators have set up. Users can be assigned roles, which grant or deny access to certain channels and actions. In addition, a multitude of bots can be added to servers; for example Rythm, a bot for playing music.
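To make the server–channel–role structure described above concrete, here is a minimal illustrative sketch in Python. It is not Discord's actual implementation or API; all class and variable names are hypothetical, and it only models the idea that a member's roles grant or deny access to particular channels.

```python
# Hypothetical sketch of Discord-style access control: roles map to the
# channels they may access, and a member can enter a channel if any of
# their roles allows it. Names and structure are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Channel:
    name: str
    kind: str  # "text" or "voice"


@dataclass
class Role:
    name: str
    allowed_channels: set = field(default_factory=set)  # channel names this role may access


@dataclass
class Member:
    name: str
    roles: list = field(default_factory=list)

    def can_access(self, channel: Channel) -> bool:
        # Access is granted if any assigned role allows this channel.
        return any(channel.name in role.allowed_channels for role in self.roles)


# Example: a moderator sets up a server with a public and a mod-only channel.
general = Channel("general", "text")
mod_chat = Channel("mod-chat", "text")

everyone = Role("everyone", {"general"})
moderator = Role("moderator", {"general", "mod-chat"})

alice = Member("alice", [everyone])             # regular member
bob = Member("bob", [everyone, moderator])      # also a moderator

print(alice.can_access(mod_chat))  # False
print(bob.can_access(mod_chat))    # True
```

The point of the sketch is the design choice it reflects: access is organized entirely by the server's own roles, which is what gives moderators the fine-grained, self-governed control the essay discusses.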
Online community building
“People use Discord to build communities not just for games”, as Discord states in their promotional video. Although the platform was initially built around gaming, its user community has branched out into different subjects. Described as a ‘real-time Reddit’[1], Discord offers a space for all kinds of people. Every user can start their own server, with a self-chosen subject and guidelines, and decide whether to make it open to anybody or invite-only. In short, communities can be built easily.
Thanks to this openness and ease of community-building, Discord has come to be associated with all sorts of things, whether intended by the platform or not. In particular, it has been associated with the rise of alt-right movements online[2][3]. While Discord on its own might not cause extremist behavior or thought, it can serve as a bridge in a network that reinforces existing radical beliefs (Munn).
Servers can be set up by anyone, with any intention and about any subject. Much like 4chan and Reddit, Discord provides a venue for discussion. Unfortunately, all of these platforms have faced criticism over their moderation, or, more accurately, the lack thereof. While they have made changes to gain more control over what content is posted (Munn), it can be argued that more remains to be done.
Discord has few, if any, rules about how users should behave within the communities that are built. The makers behind the application continuously stress that the platform is open to anyone; it relies on users to report problematic behavior and to moderate content on servers (Reyman & Sparby).
With great freedom comes great responsibility
(Self-)praised for openness and accessibility, Discord is transparent about updates and changes made to the platform[4]. It gives users and developers considerable freedom to build upon the platform and create their own communities. This type of freedom on social media can lead to ‘a dark social web’ (Munn): bullies or trolls gain access to a server and are able to disrupt conversation and infiltrate a community. Some solutions to this unwanted behavior, such as containment rooms, have emerged from the users themselves (Reyman & Sparby).
Discord does intervene in some ways, such as setting verification levels, removing servers, and issuing warnings to users (Reyman & Sparby). Still, the issue raises the question of who is responsible for ensuring the platform is a safe space and does what it intends to do. The platform can be seen as a mere enabler of communication; yet it can also be argued that platforms, however neutral they may seem, are always ideological in the way they organize communities (Bratton).
Users who engage in behavior that disrupts communities seem to bear full responsibility on Discord. This becomes a problem when such users are not aware of the impact of their negative behavior until after they have harmed others (Sparby). Should Discord behave like a kindergarten teacher, who teaches children about respect and tries to break up fights between them? Or can the platform’s users be granted full responsibility for their actions?
In essence, there is nothing wrong with providing spaces for anyone to join and discuss issues. People have always been able to share controversial thoughts, long before the rise of social media. Platforms such as Discord simply make it easier for people from all over the world to come together. Does this mean they are responsible for content posted by users? Moderation on Discord is carried out by users themselves, which allows communities to filter out unwanted users and content.
It would be easy to blame Discord for negative behavior on its platform and demand that it make changes. But that would most likely not put an end to negative online behavior altogether: movements can always organize themselves elsewhere, as has been shown before[5].
Discord’s philosophy is built on freedom and accessibility, so strictly moderating content and users would seem paradoxical. Yet giving users all responsibility is risky and possibly puts the platform’s future at stake. A middle road, taking action when needed but mostly trusting communities to build and rebuild themselves, may be the safest option for Discord.
Bratton, Benjamin H. The Stack: On Software and Sovereignty. MIT Press, 2016.
Munn, Luke. “Algorithmic Hate: Brenton Tarrant and the Dark Social Web.” Institute of Network Cultures 19.3 (2019).
Munn, Luke. “Alt-right pipeline: Individual journeys to extremism online.” First Monday 24.6 (2019).
Reyman, Jessica, and Erika M. Sparby, eds. Digital Ethics: Rhetoric and Responsibility in Online Aggression. Routledge, 2019.
Sparby, Erika M. “Digital social media and aggression: Memetic rhetoric in 4chan’s collective identity.” Computers and Composition 45 (2017): 85-97.