What’s the worst online community you could imagine? That’s what we asked participants at SRCCON this year, in our session Designing Digital Communities for News.
The responses were enlightening — we learned a great deal about the variety and frequency of certain concerns. In addition to those about trolling and safety, many of the responses were focused on what content was visible and which community members were able to contribute that content. Here are some of our findings:
Moderation came up in almost every group, and many groups included both no moderation and too much or unpredictable moderation in their worst communities. Moderation without rationale (e.g. getting banned without being told why) also came up.
Expertise was a concern — the idea that users could inflate their own expertise on an issue and that it’s easy for false information to propagate.
Several groups mentioned dominant personalities, i.e. communities in which the loudest voices have the most traction and quiet, meaningful voices are overpowered.
Contribution quality ranking came up several times, in terms of arbitrary ranking systems (e.g. comments with the most words go to the top) or lack of any quality heuristics at all.
Lack of threading came up several times; it was the most common UI concern across the groups. One group mentioned lack of accessibility as a problem.
Ambiguity came up as a problem in many different ways: lack of explicit and visible community values/guidelines, an unclear sense of purpose or focus, and even concerns about intellectual property — who owns the contributions?
Some groups brought up the collection of arbitrary and excessive personal information as problematic. Similarly, unnecessarily restrictive or permissive terms of service/licenses were also a concern (“perpetual license to do anything [with your data]”), as was the unnecessary display of personal information to others.
Complete anonymity (e.g. no user accounts or history) was mentioned a few times as something to be avoided. Only one group mentioned forced IRL identities (e.g. using your SSN as a username).
No control over your own contributions came up a couple of times: not being able to edit, delete, or export your own submissions.
The worst communities would be powered by slow software or supported by disruptive advertisements, the groups said — all the better to ratchet up feelings of frustration.
Even though participants weren’t restricted to existing sites, some communities were cited repeatedly as representative of many of these problems, including Reddit, YouTube, and LinkedIn.
After the dust settled, attendees designed their ideal online community. As you might expect, some of the points here were simply the opposite of what came up in the worst-community exercise. But interestingly, several new concerns arose. Here are some of the results:
Clarity of purpose and of values — some explicitly mentioned a “code of conduct”.
Accessibility across devices, internet connections, and abilities.
There was a focus on ease of onboarding. One group pointed to this GitHub guide, which provides a non-technical and accessible introduction to a technical product, as an example.
A couple of groups mentioned providing good tools that users could use for self-moderation and conflict resolution.
Transparency, in terms of public votes and moderation logs.
Increasing responsibility/functionality as a user demonstrates quality contributions.
Retaining a sense of small-community intimacy, even at scale.
A space where respectful disagreement and discussion are possible.
Options to block/mute people, and share these block lists.
Graduated penalties: temporary time blocks, then bans, and so on (a rough sketch of this idea follows the list).
Some kind of thematic unity.
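To make the “graduated penalties” idea above a bit more concrete, here is a minimal sketch in Python. It assumes a hypothetical moderation system that tracks violations per member and escalates from a warning, to timed blocks, to a permanent ban; the specific ladder, durations, and names (PENALTY_LADDER, next_penalty) are illustrative assumptions, not something the groups specified.

```python
from dataclasses import dataclass

# Illustrative escalation ladder: a warning first, then timed blocks of
# increasing length, then a permanent ban. Steps and durations are
# assumptions for this sketch, not session output.
PENALTY_LADDER = [
    ("warning", None),     # first offense: warning only
    ("timeout", 24),       # second offense: 24-hour block
    ("timeout", 24 * 7),   # third offense: one-week block
    ("ban", None),         # fourth and later offenses: permanent ban
]

@dataclass
class MemberRecord:
    username: str
    violations: int = 0

def next_penalty(record: MemberRecord):
    """Record a new violation and return the (action, hours) it triggers."""
    record.violations += 1
    step = min(record.violations - 1, len(PENALTY_LADDER) - 1)
    return PENALTY_LADDER[step]

# Example: a member with one prior violation earns a 24-hour timeout.
member = MemberRecord("example_user", violations=1)
print(next_penalty(member))  # -> ('timeout', 24)
```

The point of the ladder structure is that consequences are predictable and proportionate, which also pairs naturally with the transparency and moderation-log ideas above.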
Some findings from our other research were echoed here, which helps validate where we might focus our efforts.
It was interesting to see a lot of consensus around what constitutes a bad online community, but a wider range of opinions about what a good community could look like. We have no shortage of starting places or possible directions, and we’re looking forward to building some of these ideas out in the next few months!
By Francis Tseng & Tara Adiseshan for the Coral Project.