Anonymous on the internet
elizabeth tobey

nderground is a social network that supports anonymity. Unlike Facebook, we have no “real name” requirement. We also have very liberal “community standards” compared to Facebook and Instagram.

nderground users are not allowed to post content owned by someone else. Users are also not allowed to threaten violence, or to plot violence or other acts that are illegal under United States law. Currently that’s it.

My hope is that nderground will develop fewer of the toxic effects of anonymity that we have seen on other sites. nderground is a closed social network: nothing posted on nderground can be seen by anyone who is not logged in. Further, access to posted material is limited to the people in your Karass (your friends and family). No one outside this group can see the material you post. Even within this group, access is one-to-one.

For example, if Bob is a member of my Karass, I can see material posted by Bob on his page, and Bob can see and comment on my material on my page. If Bob’s Karass includes Alice, and Alice is not in my Karass, my connection to Bob does not give me access to anything Alice posts. This avoids a problem seen on other social networks, where a post can be seen by second-level connections that were never intended to see it.
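
To make the access rule concrete, here is a minimal sketch in Python. The Member class and can_view function are hypothetical illustrations, not nderground’s actual code: a viewer can see an owner’s posts only through a direct Karass link, and that access is never inherited through a mutual connection.

```python
class Member:
    """A hypothetical nderground member holding a set of direct Karass links."""

    def __init__(self, name):
        self.name = name
        self.karass = set()  # members directly linked to this one

    def link(self, other):
        # Karass links are mutual: each member appears in the other's Karass.
        self.karass.add(other)
        other.karass.add(self)


def can_view(viewer, owner):
    """A viewer sees an owner's posts only through a direct Karass link.

    There is no transitive access: a link to Bob grants nothing about
    the other people in Bob's Karass.
    """
    return viewer is owner or viewer in owner.karass


# Bob is in my Karass; Alice is only in Bob's Karass.
me, bob, alice = Member("me"), Member("Bob"), Member("Alice")
me.link(bob)
bob.link(alice)

assert can_view(me, bob)        # I can see Bob's posts
assert can_view(bob, me)        # Bob can see mine
assert not can_view(me, alice)  # my link to Bob gives me no access to Alice
```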

My hope is that the architecture of nderground will limit abuse. If someone becomes abusive on nderground, the person being abused can break the link, removing the abuser from their Karass. At that point the two people become invisible to each other (at least on nderground). The link can only be re-formed by an invitation, which must be accepted.
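
Continuing the sketch above (same hypothetical Member class), breaking a link might look like the following: removing someone from your Karass removes the link in both directions, and it can only be re-formed through an invitation that the other person accepts.

```python
def unlink(a, b):
    # Breaking the link removes it in both directions: the two members
    # become invisible to each other on the network.
    a.karass.discard(b)
    b.karass.discard(a)


class Invitation:
    """A hypothetical invitation: the only way to re-form a broken link."""

    def __init__(self, sender, recipient):
        self.sender = sender
        self.recipient = recipient

    def accept(self):
        # The link is restored only when the recipient accepts.
        self.sender.link(self.recipient)


unlink(me, bob)
assert not can_view(me, bob) and not can_view(bob, me)

invitation = Invitation(bob, me)
invitation.accept()
assert can_view(me, bob)  # visible again only after the accepted invitation
```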

We have constructed nderground to scale. Human oversight is not something that will scale. My hope is that, over time, we can implement software to identify people who are abusing the anonymity of nderground. We do not want to provide a venue for organizations like ISIS or other groups involved in violence.

How effective software methods can be remains to be seen. For example, if nderground scans posts against a dictionary of words associated with violence, will this be effective? Will there be too many false positives?

For example, someone who writes “Sometimes I feel like I could just kill my boyfriend” usually means something very different from someone who writes “I am planning to kill my boyfriend”. Developing software that can tell the difference is a challenge, and may not be practical.
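
To illustrate why, here is a naive keyword scan of the sort described above; the word list and function name are made up for this example. Both sentences trigger the filter, which is exactly the false-positive problem.

```python
import re

# A made-up dictionary of words associated with violence.
VIOLENCE_WORDS = {"kill", "bomb", "shoot", "attack"}


def flags_violence(text):
    """Return True if any word from the violence dictionary appears in the text."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return not words.isdisjoint(VIOLENCE_WORDS)


# Both posts are flagged, although only the second expresses intent.
print(flags_violence("Sometimes I feel like I could just kill my boyfriend"))  # True
print(flags_violence("I am planning to kill my boyfriend"))                    # True
```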