A Sense of Community
The other day, an experienced community moderator who follows developments in her field told me she was seeing significant growth of interest from all kinds of businesses not just in protecting their brand online but also in protecting their customers online. They see safety as part of brand protection, of course, but more and more, community safety is becoming a concern in its own right.
She made me think of the city metaphor. Sure, if a neighborhood isn’t safe, well-lit and pleasant to be in, people won’t hang around, but there’s more to it than that. The appeal of the neighborhood isn’t just the city’s responsibility. City services play a major role, of course, but so do the residents, businesses and visitors. We really are talking about “community” in the strictest sense – lots of participants and roles, each playing an essential part in the overall feeling of the place. No matter how pretty a physical or digital space is, it doesn’t appeal if it’s not safe, and no one thing – the police or anything else – makes it safe.
(If you’re wondering what she and I mean by “safety,” it’s a whole lot of things, but – online – mostly safety from hate speech, social cruelty and other behaviors that threaten people’s emotional or psychological wellbeing.)
The Shared Responsibility of Safety
I’ve been writing about online safety for the better part of two decades and, like my colleague the community manager, I’m seeing something happening too: As more and more people of all ages are using social media on phones and every other kind of device, it’s beginning to sink in that safety is different in a user-driven, social media environment. It’s a shared proposition.
By definition, the provider of an online environment can’t make safety happen all by itself. Safety is co-created by the provider and the participants. But unlike a city government, the provider can do a lot to create an environment that encourages and supports safety – though not once and for all. Safety needs to be baked in, modeled and fostered day by day, because participants change, communities evolve and the host and underlying technology do too. It’s not only “safety by design” – which does create the necessary infrastructure but is static and only one part of the equation – but safety that’s continuously in process, crowd-sourced as well as fostered by the community provider.
Properties of Online Community Safety
So what’s the whole equation? Coming up with the right formula for co-creating safety is a work in progress, partly because social media is so new on the planet and we’re still figuring out how it impacts us and we impact it. But as I’ve watched various kinds of solution developers over the years – e.g., parental control tech developers, community managers, safety advocates, law enforcement, policymakers – game designers and professors of digital game design have taught me the most. Based on what I’ve learned from them and the global discussion in general, here are the properties of social-media safety:
- Agency: A degree of free will, choice or empowerment for participants increases their safety by allowing them to be stakeholders in the success of outcomes and in the wellbeing of themselves, their fellow participants and their community (the dictionary definition of “agency” is “the capacity to act or exert power”). This is the collaborative safety of a community and a social media environment: Providers can empower users to be part of the solution, co-keepers of community safety. In her talk at the 2012 South by Southwest conference, game and school designer and educator Katie Salen cited Media Molecule founder and game designer Mark Healey on the need to give players powers to do something – powers that create a kind of “force that flows through your veins and makes you feel like you can change the world around you.”
- Literacy: Sometimes referred to as “competency” or “mastery,” literacy not only enables trust and confidence, it enhances safety the same way that being informed protects against misinformation, propaganda, fraud, manipulation, etc. The three literacies of social digital media are digital literacy, media literacy and social literacy. Community providers can consider following Facebook’s lead in working to foster social literacy even in the abuse-reporting flow for teen users (see this blog post from a participant in Facebook’s 4th “Compassion Research Day”).
- Community: A functioning community fosters social norms and a sense of belonging, which are both protective. Just as in offline spaces, an online community’s participants organically (in the process of participating) develop the social norms that support communication and collaboration and marginalize behaviors that hinder them. When there’s a lack of community, such as on YouTube or huge news sites that are too vast and diverse to be communities, it’s much more common to see the cruel or moronic comments too often associated with social media (even though these sites, too, have people who see themselves as stakeholders). And this is important: A sense of belonging mitigates hurtful behavior. My friend and adviser Patricia Agatston, PhD, a risk prevention specialist for Atlanta-area schools, wrote me, “When we feel like we belong and are accepted, behavior is usually productive. When we doubt our belonging and acceptance we may act in less helpful ways.”
- Purpose: Sometimes called “learning objectives” – whether for a core-curriculum subject in school, an avidly shared interest in any space, the end result of a quest in a game, etc. – purposefulness fuels the work of a community by offering motivation or even inspiration. It fuels personal and collective growth and collaboration. It tends to increase safety by eclipsing random moronic or cruel behavior with collaborative actions toward a shared goal. Also, “the fastest way to improve someone’s everyday quality of life is to ‘bestow on a person a specific goal,’” University of California, Riverside, psychology professor Sonja Lyubomirsky says.
- Guidance: This is the moderator part – the all-important human touch. It needs to be respectful, not coercive or over-reactive: the kind of guidance that models appropriate behavior as much as it enforces it. The best kind of guide is invested in the community, is energized by it, learns from it (from fellow participants) and evolves his/her role as the community evolves. A good community manager focuses more on communication than control but knows when s/he needs to exercise control in unacceptable situations.
- Infrastructure: The infrastructure of an online community is cultural (or philosophical) as much as technological. The provider’s technology creates the latter and sets the rules and tone of the infrastructure, but everybody co-creates the cultural part, which changes faster than the technology. Game designer Salen said that a healthy gaming community has a lot to do with the game company that hosts it. She pointed to UK-based Media Molecule as an example: It’s a “very flat and respectful” company whose “gaming community is an extension of the company’s sense of community” – certainly a community that fosters communication and collaboration. This is the kind of digital space where users are most likely to feel safe and help others stay safe.