Automatic for the peaceful

by Helena Puig Larrauri, Build Up’s Managing Director

All cute robot pictures by Jacob Lefton

I think social media bots can contribute to building peace, and here is why (and a bit of how).

There is a growing body of research showing how the tools and structure of online communications often damage peace. Part of this is deliberate: people and groups pushing to spread hate and drown out inconvenient truths have refined aggressive tools and approaches, leveraging social media bots to exploit the relatively open, algorithmic qualities of online spaces and push for division and violence. The structure of the internet reinforces these tactics: algorithm-driven news and social media platforms intensify echo chambers and bias effects (so-called filter bubbles), segmenting people with different views and reducing opportunities for cross-cutting engagement.

In short, all this automation is making us angrier, more disconnected, less likely to imagine peace.

What’s worse, no similar tools or approaches have been identified that push as aggressively for the creation of third poles: shared values that connect people and give them safe spaces to discuss and debate different positions respectfully. Yet, contrary to their current divisive uses, online platforms have the potential to offer unique opportunities to flatten hierarchies, remove barriers to communication between diverse groups, and create the civic conversations that are essential to peaceful communities. We just have to figure out how.

At Build Up, we’ve been asking ourselves how automation can contribute to peacebuilding. And since we like thinking by doing, we’re piloting two approaches to start a conversation on this topic.

Peacebots for the win

For International Peace Day 2017, Build Up and International Alert are partnering to engage the general public in building a flock of robots that will share messages of peace on Twitter. The aim of this Robots for Peace campaign is to reach as many people as possible with peace messages and get #peaceday trending on Twitter in as many places as possible on or around the UN International Day of Peace on September 21.

We’ve put together a simple website where people can get advice on how to build a bot. The website also has a code of conduct, a form to register bots, and a short guide on how to promote the campaign on social media. We’ll be monitoring the hashtags and bot handles to understand reach and overall campaign effectiveness.
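For those curious about the mechanics, here is a minimal sketch of what a peacebot could look like, assuming Python and the Tweepy library (3.x interface); the credentials, messages and posting schedule are placeholders, not the campaign’s actual code:

```python
# A minimal peacebot sketch using Python and the Tweepy library (3.x).
# Credentials, messages and schedule below are placeholders.
import random
import time

import tweepy

# Authenticate a registered bot account (keys come from apps.twitter.com).
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Illustrative peace messages; a real bot should follow the campaign's
# code of conduct and write its own.
MESSAGES = [
    "Peace begins with a conversation. #peaceday",
    "Imagine what your community looks like at peace. #peaceday",
    "Small acts of connection add up. Share yours. #peaceday",
]

# Post one message an hour over International Peace Day.
for _ in range(24):
    api.update_status(random.choice(MESSAGES))
    time.sleep(60 * 60)

# A rough reach check: sample recent tweets carrying the hashtag.
recent = api.search(q="#peaceday", count=100)
print("Recent #peaceday tweets sampled:", len(recent))
```

A bot like this would still need to be registered on the campaign website so its handle can be included in our monitoring.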

Road to hell, good intentions

The idea for the Robots for Peace campaign came from an unusual (and possibly risky) source of inspiration: an article examining uses of social media bots by the Trump campaign to “manufacture consensus”: essentially, to make it look on social media like a lot of people care about a topic so that traditional media outlets will cover it.

Adopting a strategy that has been used against peace and re-purposing it to amplify messages of peace is not problematic per se, especially since we have a distinctly different aim: we are not trying to manufacture consensus but rather to guide the public towards greater awareness of and appreciation for International Peace Day, which has depth and purpose. We see automation not as a cancer you inject and watch spread, but as part of a broader process of engagement.

Still, we do open ourselves to two important risks. First, Sanjana Hattotuwa kindly commented on this idea and shared:

“Peace by this sort of simplistic gamification makes me very uneasy because it is extremely easy to usurp and undermine by those opposed to our ideals and ideas.”

To a certain extent, we are replicating a strategy that could also be used to mindlessly share hate or misinformation. The Robots for Peace campaign could easily be targeted and derailed, a tactic used on Twitter by online activists of all persuasions.

Second (and this one worries me more, personally), there is a fine line between amplifying a message so it receives the attention we believe it deserves (as we are trying to do) and manufacturing consensus to a point where it loses credibility (as recently happened during the FCC online consultation on net neutrality).

The Robots for Peace campaign is vulnerable to these two risks because it is a simplistic tool, a megaphone with limited strategy. I think there is a place for this kind of automation tool that engages the public on a flash campaign, but it’s important to think beyond this — there is more we can do to use automation for deeper engagement.

A community of robots and people crossing divides on social media

This urge to explore deeper forms of engagement that leverage social media bots is the driving force behind The Commons, a pilot project that Build Up is running this fall (with funding from the Cross-Over Fund Innovations for Peace and Justice). The pilot tests an approach to filter bubbles that have become destabilising to civic conversations in the USA. Drawing on frameworks from prior research and peacebuilding practice, it proceeds in three steps: first, identify polarising filter bubbles on Twitter and Facebook; second, use social media bots to engage with people who display certain behaviours in these bubbles; and finally, organise a network of trained volunteers to move identified users towards constructive engagement with each other and with the phenomenon of polarisation.
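To make the first two steps more concrete, here is a hypothetical Python sketch of one way to score users by the partisan hashtags they use and flag those deep inside one bubble for bot-initiated, volunteer-mediated outreach. The hashtag lists, scoring rule and thresholds are illustrative assumptions, not The Commons’ actual method:

```python
# Hypothetical sketch: flag users deep inside one filter bubble.
# The hashtag lists, scoring rule and thresholds are illustrative only.

# Hashtags treated here as markers of two opposed bubbles (assumption).
BUBBLE_A = {"#maga", "#buildthewall"}
BUBBLE_B = {"#resist", "#notmypresident"}

def polarisation_score(hashtags):
    """Return a score in [-1, 1]: -1 means entirely bubble A,
    +1 entirely bubble B, 0 balanced or no bubble hashtags at all."""
    a = sum(1 for h in hashtags if h.lower() in BUBBLE_A)
    b = sum(1 for h in hashtags if h.lower() in BUBBLE_B)
    total = a + b
    return 0.0 if total == 0 else (b - a) / total

def flag_for_outreach(users, threshold=0.8, min_activity=10):
    """Yield handles that sit deep inside one bubble: candidates for a
    bot-initiated, volunteer-mediated invitation to cross-cutting talk."""
    for handle, hashtags in users.items():
        score = polarisation_score(hashtags)
        if abs(score) >= threshold and len(hashtags) >= min_activity:
            yield handle, score

# Toy example: handles mapped to the hashtags they recently used.
users = {
    "@deep_in_bubble_a": ["#maga"] * 12,
    "@mixed_user": ["#maga"] * 5 + ["#resist"] * 6,
}
for handle, score in flag_for_outreach(users):
    print(handle, round(score, 2))  # only @deep_in_bubble_a is flagged
```

In practice, what happens after flagging matters most: the handoff to trained volunteers, rather than the bot itself, is what carries the engagement.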

The project is explicitly non-partisan and operates through a peacebuilding lens with a broad agenda of transforming destructive interactions into mutually desired systems and relationships. Building on applications of the Do No Harm framework to conflict contexts, we know that for peaceful coexistence to be possible in a society, connectors must be strengthened. Connectors are dynamics and structures that help people develop a shared language for understanding and processing emotions, shift a focus from power-keeping to understanding, and provide a third pole of shared values.

We believe that a majority of people in the USA are not actively driving polarisation; rather, polarisation is happening to them. Moving people from passively accepting a context that escalates conflict to constructively engaging in mediating dialogue in their society is an enormous challenge. A plethora of initiatives already leverage ICTs to encourage constructive, cross-cutting engagement and the creation and promotion of shared values. However, many of these initiatives reach very few people, mostly people who are already predisposed to depolarised behaviours.

By analysing filter bubbles and leveraging automation, we can reach more people who are at risk of polarisation. We do not expect all people targeted by our pilot intervention to become active “connectors”. Some will just passively experience cross-cutting engagement and receive shared values content, increasing their awareness of other poles. Others may find that our volunteer network offers a safe space to explore depolarisation, and a pathway to engaging with other existing depolarisation initiatives in their communities.

Join us in a conversation on peacebuilding and automation

Both Robots for Peace and The Commons are experimental projects, and we are documenting the process and results so we can share them publicly for scrutiny and feedback. We want to emphasize transparency and open engagement, countering the anonymity, misinformation and deceit that have become most associated with social media automation. And we think that if ethical uses of bots are loud and broad enough, they can, in a way, set the agenda for how bots are used.

We also know we’re not the only ones experimenting in this area: International Alert is building a Facebook chatbot to encourage people to take everyday peace actions; Moonshot CVE is using Facebook ads to counter recruitment into violent groups. And we know we’re only touching the edges of what is possible with automation: none of this work uses Natural Language Processing, for example.

We’re hoping our projects and other similar ones will be the start of a conversation about the opportunities and challenges of using online automation for peacebuilding. On September 23, we’ll be at International Alert’s peacehack to talk about the ethics of peacebots. On December 4–6, we will be running a workshop with Creative Associates at the Build Peace conference on social media and automation for peacebuilding. And in early March, we’re organising a workshop at MIT to present the results of The Commons project and discuss implications for other similar work using social media bots.

We hope you join us at these events or comment on this post to share your ideas.

Follow us on Facebook and Twitter to see these and other projects in action.
