This blog post summarizes a paper that investigates how Airbnb hosts learn the intricacies of their work by becoming integrated members of Facebook Groups dedicated to Airbnb hosting. This paper will be presented at the 22nd ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) in Austin, Texas.

The sharing economy has been growing at a rapid pace, with emerging platforms such as Uber, Lyft, and Airbnb revolutionizing how individuals interact with services long provided by established industries. One crucial aspect that distinguishes Airbnb from traditional hotels is its workforce: while hotels are typically operated by lodging professionals, accommodation providers on Airbnb form an unconventional workforce that consists largely of “amateurs” occasionally renting out their apartments. Hosting on Airbnb can provide a substantial source of income, but hosts’ ability to consistently earn this income depends on how well they learn to serve their guests in competitive marketplaces while remaining cautious and efficient with their expenses. Sharing economy platforms like Airbnb represent a general paradigm shift towards nonprofessional service providers. …

This blog post summarizes a paper that investigates how removal explanations affect future user activity on the social media site Reddit. This paper will be presented at the 22nd ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) in Austin, Texas. This paper received a Best Paper Award at CSCW.

Social media platforms usually make content moderation decisions without explaining to the end-users why those decisions were made. Prior research suggests that this secretiveness often frustrates users who suspect that the platforms are biased in some ways. Would it help platforms to instead be transparent about their processes? …

This blog post summarizes a paper on understanding fairness in content moderation from the perspectives of end-users that will be presented at the 22nd ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) in Austin, Texas. This paper received a Best Paper Honorable Mention Award at CSCW.

“I feel sad that my effort in making that post was for nothing, and that no one will see it and no one will reply with any help or advice.” — P254

How do users feel when their content is removed from online communities? Does it deter them from posting again? Does it change their attitude about the community? Individuals have a range of motivations for posting, and these shape their reactions to content removal. In some cases (like P254 above), a user might really need advice. In others, a user might annoy the moderators on purpose, intending to provoke a removal. How does the effort invested in creating content affect the way users perceive its removal, and does receiving an explanation for the removal matter? …

This blog post summarizes a TOCHI paper about the use of automated tools for content moderation on the social media website Reddit that will be presented at the 22nd ACM Conference on Computer-Supported Cooperative Work and Social Computing in Austin, Texas.

Moderating content on social media websites involves trade-offs between achieving high efficiency and keeping costs low. A moderation system could prevent much of the spam, harassment, and other abuse in a large community if enough expert human moderators were available to carefully review each post, but that would drive the costs of moderation to unacceptable levels. …

This blog post summarizes a paper about online harassment and content moderation on the social media website Twitter that will be presented at the 21st ACM Conference on Computer-Supported Cooperative Work and Social Computing.

With harassment growing on many online platforms, what tools do users have to protect themselves? Are those tools effective? Do tools designed to protect users ever block too much content, or block unfairly? In this study, we interviewed people who use Twitter blocklists, a third-party mechanism developed by volunteer Twitter users to address the problem of online harassment. We also interviewed people blocked by Twitter blocklists. Are they really harassers? …

About

Shagun Jhaver

CS PhD Candidate @GeorgiaTech | https://shagunjhaver.com
