Social Media Bots for Peace
Zeit Wissen Nr. 2 2017 published a story on „Digitale Selbstverteidigung“ (digital self-defence). The internet security company Incapsula had conducted a study of internet traffic: half of all web traffic was produced by bots, and half of that bot traffic was built to cause damage to somebody – anti-social media bots. So I searched for social media bots for peace and found a story published on Sep 7, 2017 by Helena Puig Larrauri, Build Up’s Managing Director, titled:
I think social media bots can contribute to building peace, and here is why (and a bit of how).
Build Up is an organisation that recognises and catalyses better local peacebuilding through innovation. Its goal is to organise peace – @howtobuildpeace
Research: how the tools and structure of online communications damage peace
There is a growing body of research showing how the tools and structure of online communications are often damaging to peace.
This is in part driven by people and groups pushing to spread hate and drown out inconvenient truths, writes Larrauri. Many of them have refined aggressive tools and approaches, leveraging social media bots to strategically take advantage of the relatively open and algorithmic qualities of online spaces to push for division and violence.
These aggressive tools are reinforced by the structure of the internet:
- algorithm-driven news and
- social media platforms intensify echo chambers and
- reinforce bias effects – so-called filter bubbles
This results in segmentation of people with different views, reducing opportunities for cross-cutting engagement.
All this automation is making us
- more disconnected,
- less likely to imagine peace.
What’s worse, in 2017 there were almost no identified tools or approaches pushing as aggressively for the creation of third poles: shared values that connect people and allow them safe spaces to discuss and debate different positions respectfully.
Larrauri suggests that, „contrary to their current divisive nature“, we should use online platforms’ unique opportunities to flatten hierarchies, remove barriers of communication between diverse groups, and create civic conversations that are essential to peaceful communities. Then she explains how:
How automation can contribute to peacebuilding
Build Up started piloting two approaches to open a conversation on this topic.
Peacebots for the win
For International Peace Day 2017, Build Up and International Alert partnered to engage the general public in building a flock of robots that would share messages of peace on Twitter. The aim of this Robots for Peace campaign was
- to reach as many people as possible with peace messages, and
- get #peaceday trending on Twitter in as many places as possible on or around the UN International Day of Peace on September 21.
They put together a simple website where people could get advice on how to build a bot.
The website was also to include
- a code of conduct,
- a form to register bots, and
- a short guide on how to promote the campaign on social media.
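The kind of simple Twitter bot the campaign website pointed people towards can be sketched in a few lines. The messages, hashtag handling, and character limit below are illustrative assumptions, not the campaign's actual code; actually posting would additionally require a registered Twitter API client.

```python
PEACE_MESSAGES = [
    "Peace begins with a conversation.",
    "Across every divide, there is common ground.",
    "Today and every day: choose dialogue over division.",
]

def compose_tweets(messages, hashtag="#peaceday", limit=280):
    """Append the campaign hashtag to each message, skipping any
    tweet that would exceed Twitter's character limit."""
    tweets = []
    for msg in messages:
        tweet = f"{msg} {hashtag}"
        if len(tweet) <= limit:
            tweets.append(tweet)
    return tweets

# Posting each tweet would use a Twitter client library such as
# Tweepy (assumed here, not shown), e.g. one API call per message,
# spread out over the day to respect rate limits.

if __name__ == "__main__":
    for t in compose_tweets(PEACE_MESSAGES):
        print(t)
```

Keeping composition separate from posting also makes it easy to review every message against the campaign's code of conduct before anything goes live.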
The promoters intended to
- monitor the hashtags and
- bot handles to understand reach and overall campaign effectiveness.
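Monitoring reach for such a campaign can be approximated from collected tweet records. The sketch below is a minimal illustration under assumed data (author handle, tweet text, follower count per record); the actual monitoring setup is not described in the source.

```python
def campaign_reach(tweets, hashtag="#peaceday"):
    """Estimate campaign reach from tweet records given as
    (author_handle, text, follower_count) tuples: count tweets using
    the hashtag, distinct participating accounts, and the summed
    follower counts of those accounts (an upper bound on impressions)."""
    followers_by_author = {}
    n_tweets = 0
    for author, text, followers in tweets:
        if hashtag.lower() in text.lower():
            n_tweets += 1
            followers_by_author[author] = followers
    return {
        "tweets": n_tweets,
        "accounts": len(followers_by_author),
        "potential_reach": sum(followers_by_author.values()),
    }
```

Follower counts only bound potential impressions; real exposure would need engagement data from the platform itself.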
Road to hell, good intentions
The idea for the Robots for Peace campaign came from an unusual (and possibly risky) source of inspiration: an article examining uses of social media bots by the Trump campaign to “manufacture consensus” — essentially, to make it look on social media like a lot of people care about a topic, in order for it to be covered by traditional media outlets.
Adopting a strategy that has been used against peace and re-purposing it to amplify messages of peace is not problematic per se. Especially since we have a distinctly different aim — we are not trying to manufacture consensus but rather to guide the public towards a greater awareness and appreciation for International Peace Day, which has depth and purpose. We see automation not as a cancer you inject and watch spread, but rather as a part of a broader process of engagement.
Build Up was ready to take two important risks.
- First, Sanjana Hattotuwa kindly commented on this idea and shared:
“Peace by this sort of simplistic gamification makes me very uneasy because it is extremely easy to usurp and undermine by those opposed to our ideals and ideas.”
To a certain extent, we are replicating a strategy that could also be used to mindlessly share hate or misinformation. The robots for peace campaign could be easily targeted and derailed, a tactic used on Twitter by online activists of all sorts of persuasions.
- Second (and this one worried Larrauri more, personally), there is a fine line between amplifying a message so it receives the attention we believe it deserves (as we are trying to do) and manufacturing consensus to a point where it loses credibility (as recently happened during the FCC online consultation on net neutrality).
The Robots for Peace campaign is vulnerable to these two risks because it is a simplistic tool, a megaphone with limited strategy. I think there is a place for this kind of automation tool that engages the public on a flash campaign, but it’s important to think beyond this — there is more we can do to use automation for deeper engagement.
A community of robots and people crossing divides on social media
This urge to explore deeper forms of engagement that leverage social media bots is the driving force behind The Commons, a pilot project that Build Up ran in Fall 2017 (with funding from the Cross-Over Fund Innovations for Peace and Justice) that tested an approach to address filter bubbles that have become destabilising in civic conversations in the USA. Drawing from successful frameworks of prior research and peacebuilding practice, the pilot
- identified polarising filter bubbles on Twitter and Facebook, then
- used social media bots to engage with people who display certain behaviours in these bubbles, and finally
- organised a network of trained volunteers to move identified users towards constructive engagement with each other and with the phenomenon of polarisation.
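The source does not detail how the pilot identified filter bubbles, but one simple, illustrative approximation is to compare each account's hashtag usage against seed hashtag sets characteristic of known bubbles. Everything below (user data shape, seed sets, the Jaccard threshold) is a hypothetical sketch, not The Commons' actual method.

```python
def jaccard(a, b):
    """Jaccard similarity of two hashtag sets (0.0 when both are empty)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def assign_bubbles(users, seeds, threshold=0.3):
    """users: {handle: set of hashtags used}; seeds: {bubble_name: set of
    characteristic hashtags}. Assign each user to the most similar seed
    bubble, or None if no similarity clears the threshold."""
    assignment = {}
    for handle, tags in users.items():
        scores = {name: jaccard(tags, seed_tags)
                  for name, seed_tags in seeds.items()}
        best = max(scores, key=scores.get)
        assignment[handle] = best if scores[best] >= threshold else None
    return assignment
```

Accounts assigned to a bubble would then be candidates for the bot-initiated engagement described above, while unassigned accounts are left alone.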
The project is explicitly non-partisan.
Building on applications of the Do No Harm framework to conflict contexts, we know that for peaceful coexistence to be possible in a society, connectors must be strengthened. Connectors are dynamics and structures that help people develop a shared language for understanding and processing emotions, shift a focus from power-keeping to understanding, and provide a third pole of shared values.
Build Up believes „that a majority of people in the USA are not actively driving polarization. Rather, polarization is happening to them. Moving people from passively accepting a context that escalates conflict to constructively engaging in mediating dialogue in their society is an enormous challenge.“ Build Up had already observed a plethora of initiatives that leverage ICTs to
- encourage constructive, cross-cutting engagement and
- the creation / promotion of shared values.
But many of these initiatives
- reach very few people, and
- mostly people who are already predisposed to depolarized behaviors.
Analysing filter bubbles and leveraging automation
Peace initiatives can reach more people who are at risk of polarisation when they analyse filter bubbles and leverage automation. Build Up did not expect all people targeted by the pilot intervention to become active “connectors”. Build Up assumed that some would
- just passively experience cross-cutting engagement and
- receive shared values content,
- thus increasing their awareness of other poles.
Others would find that our volunteer network offers
- a safe space to explore depolarisation, and
- a pathway to engaging with other existing de-polarization initiatives in their communities.
Larrauri suggests: „Join us in a conversation on peacebuilding and automation“
Both Robots for Peace and The Commons are experimental projects, and we are documenting the process and results in order to share them publicly for scrutiny and feedback. We recognize and emphasize the importance of transparency and open engagement, combating the negative use of anonymity, misinformation and deceit that has become most associated with social media automation. And we think that if an ethical use of bots is loud and broad enough, in a way this is ‘agenda setting’ for the use of bots.
Build Up also knows that they are not the only ones experimenting in this area:
- International Alert is building a Facebook chatbot to encourage people to take everyday peace actions;
- Moonshot CVE is using Facebook ads to counter recruitment into violent groups.
And we know we’re only touching the edges of what is possible with automation — none of this work uses Natural Language Processing, for example.
Build Up is hoping these projects and other similar ones will be the start of a conversation about the opportunities and challenges of using online automation for peacebuilding.
On September 23, 2017, Build Up took part in International Alert’s peacehack to talk about the ethics of peacebots.
On December 4–6, 2017, Build Up organized a workshop with Creative Associates at the Build Peace conference on social media and automation for peacebuilding.
In March 2018, Build Up organised a workshop at MIT to present the results of The Commons project and discuss implications for other similar work using social media bots.
Build Up hopes that you will join them at these events or share your ideas on these topics.
Hacks to build peace | International Alert
International Alert | Peace is within our power https://www.international-alert.org/