Image courtesy of flickr user Arthur Caranta

Beware of the bots – turbocharged messages, mindshare and influence

By Tessa Curtis, Principal & Director, Tessa Curtis Associates

The fallout continues not just from the US elections but also from those in France, where fake news and allegations of Russian interference loomed large.

Facebook is waking up. Aside from using external fact checkers (albeit with disputed levels of success), it is now advertising to educate us about fake news and how to spot it. Ironically, for a company accused of undermining the editorial integrity of “old media”, Facebook has placed these ads in print.

New doors of great interest to corporate communicators are opening. Fake news raises questions about the reliability of content, whilst social media profiling – on sale from Facebook, Twitter and others – promises better targeting and influence. New tools include “bots” – programmable en masse and used to build psychological profiles of individuals based on their social media activity.

By harnessing metadata and the power of algorithms, bots have greater potential to reach and sway people in vast numbers than has ever previously been possible. Every major election delivers lessons for communications professionals, along with new tools and techniques. This looks like the latest.

In the previous US election, Obama’s campaign demonstrated the power of engaging with voters and the grassroots via social media. Trump’s victory has brought into the mainstream how automation can turbocharge the process of gathering insight and building influence via social media.

Where politicians lead in the use of media to sway mass opinion, corporate communicators often follow. Advertisers and product marketers have long made enthusiastic use of metadata to maximise impact. And problems with bots have already been surfacing across businesses ranging from online ticketing to financial trading and gaming/betting.

A more opaque and challenging area, however, is how bots and similar “turbo tools” might be used by commercial organisations to build mindshare and sway public opinion on major issues and projects.

The Russians have shown how, in politics, there can be a dark side to bots. As in the TV show “Homeland”, underground armies of social media propagandists might identify, and groom, users to accept news which is false – and then rocket-propel its reach and impact by using bots. Unchecked, this scenario could transform corporate communications as we know it.

It’s uncertain how far organisations might go in their use of bots and related tools to “educate” the public or build support on controversial issues. I have no inside knowledge, but consider overcoming resistance to a new runway at Heathrow, for example, to fracking, or even to investment in nuclear weapons.

In the battle for mindshare, truth is already a relative concept. Each side advances experts, arguments and material to support its case, creating a wealth of content. What’s “true” and what’s “fake” can be hard to discern. Much depends on where you sit.

But winning hearts and minds is also a numbers game. Bots have the capacity to turbocharge an argument. Fighting a PR battle when one party is backed by bots could soon become a hopeless task, just as it might once have been hopeless to ignore online or broadcast media.

In the 90s, Monsanto used PR to make the case for the introduction of genetically modified crops in Europe. There was an outcry. At that time, the company had only conventional advertising, marketing and PR tools at its disposal. But suppose that, having failed with these, it could have unleashed the bots?

If a company with a controversial message today approached Facebook or another social media giant for the best rocket propulsion money can buy, would that company be cautioned, or turned away? If the giants of new media are failing to filter fake news effectively, how good will they be at filtering this?

Of course, innovations and new tools are to be welcomed and usher in new and exciting eras in communications. All professionals welcome and benefit from this, but we mustn’t be blind to the dangers. Old media – heavily regulated – can help by talking about this. Meanwhile, regulators need to wise up, fast.

Tessa Curtis is founder and principal of Tessa Curtis Associates, an independent corporate communications consultancy. Previously head of corporate PR at global agencies including Trimedia and Weber Shandwick Worldwide, she was originally a journalist, latterly as Business Correspondent at the BBC and before that at the Daily Telegraph.
