Airbnb is using AI to screen its guests
The tool, developed by Trooly, a startup Airbnb acquired in 2017, scans the internet to judge the suitability of customers
As excited as I am about the use of Artificial Intelligence in fields like healthcare, where it has the potential to revolutionize disease detection & prevention, it does scare me a little when similar tools are used to replace human analytical decision making. An example of the latter is companies using AI to interview and hire you.
With the meltdown we saw in the unicorn IPO market, Airbnb might come as a savior of the market in 2020, as it plans a direct listing on the New York Stock Exchange. For one, it has a much more capable management bench to handle the pressures of being a public company. More recently, however, the company has been in the news for an entirely different reason.
According to patent documents, the online rental service is using an AI tool that analyzes customers' online personalities to calculate the risk of their trashing a host's house, and then determines whether a guest is suitable to rent the property. According to the Evening Standard, this comes after complaints from London hosts, including a woman who rented out her £2.5 million apartment for a "baby shower" and ended up having it wrecked by a rowdy party crowd.
“Every Airbnb reservation is scored for risk before it’s confirmed. We use predictive analytics and machine learning to instantly evaluate hundreds of signals that help us flag and investigate suspicious activity before it happens.” ~ Airbnb website
The tool was developed by Trooly, a background-check startup that Airbnb acquired in 2017. Airbnb said the patent (figure above) was a continuation of its work after acquiring Trooly. However, the company declined to comment on the extent to which the tool is actually used.
The machine learning algorithm marks down people found to be linked with fake social media accounts or who give out false information. Any image or video files showing an individual associated with drugs, alcohol, hate speech, etc. also result in poor scores.
The algorithm also takes into account crime-related articles, blog postings & news websites to form a "person graph" that determines the suitability of the guest. How accurately the AI can predict a person's behavior from their online presence is an open question.
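To make the idea concrete, here is a minimal sketch of how such a signal-based risk score might work. This is purely illustrative: the signal names and weights are my own assumptions, not Airbnb's or Trooly's actual model, which presumably uses far richer machine-learned features.

```python
from dataclasses import dataclass

@dataclass
class GuestSignals:
    """Hypothetical online signals a screening tool might collect."""
    fake_social_accounts: bool = False  # linked to fake social media profiles
    false_information: bool = False     # gave out false personal details
    flagged_media: int = 0              # images/videos tied to drugs, alcohol, hate speech
    negative_articles: int = 0          # crime-related articles or posts mentioning the person

def risk_score(signals: GuestSignals) -> float:
    """Combine signals into a 0-100 risk score (higher = riskier).

    The weights and caps below are invented for illustration only.
    """
    score = 0.0
    if signals.fake_social_accounts:
        score += 30
    if signals.false_information:
        score += 25
    score += min(signals.flagged_media * 10, 25)    # cap media penalty at 25
    score += min(signals.negative_articles * 5, 20) # cap article penalty at 20
    return min(score, 100.0)

# Example: a guest linked to one fake account and two flagged images
print(risk_score(GuestSignals(fake_social_accounts=True, flagged_media=2)))  # 50.0
```

A real system would learn these weights from historical booking outcomes rather than hand-tuning them, and would feed the score into a human review queue rather than an automatic rejection.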
Airbnb has been dealing with guests trashing hosts' homes for quite some time now, and the company seems eager to implement a system that weeds out some, if not all, of the troublemakers. My concern remains the same: I hope we are not rushing into relying on machines to make judgment calls that have normally been a prerogative of humans.