Facebook is building a 'shadow' social network just for bots

'Bots must be suitably isolated from real users to ensure that the simulation does not lead to unexpected interactions with real users,' researchers warn

Anthony Cuthbertson
Thursday 16 April 2020 12:06 BST
A new breed of opportunistic criminals are using major technology platforms for drug dealing, money laundering, human trafficking and even terrorism.

Facebook has developed a shadow social network inhabited entirely by bots in an effort to better understand how trolls and scammers operate on its platform.

The Web-Enabled Simulation (WES) was revealed in a research paper that explains how artificial intelligence sims that mimic human behaviour are being deployed on a hidden version of Facebook.

The researchers hope to learn through the bots' interactions how people abuse the social network to scam other users or exploit their personal information.

The simulation allows the bots to perform the same kinds of actions that a regular Facebook user can, such as liking posts and sending friend requests.

Each bot is modelled on different personality types that might use Facebook, meaning some may be built to seek out targets, while others will include traits that make them susceptible to scams.
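The paper does not publish Facebook's simulation code, but the idea described above can be sketched in a few lines. The following is a hypothetical illustration, not Facebook's implementation: bot names, the personality labels, and the actions are all assumptions made for the example.

```python
# Hypothetical sketch of a WES-style bot simulation (illustrative only,
# not Facebook's actual code). Personality types and actions are assumed.
import random

class Bot:
    def __init__(self, name, personality):
        self.name = name
        self.personality = personality  # e.g. "scammer" or "susceptible"

    def act(self, others):
        # A "scammer" bot seeks out targets with friend requests;
        # other bots perform ordinary actions such as liking posts.
        target = random.choice(others)
        if self.personality == "scammer":
            return ("friend_request", self.name, target.name)
        return ("like_post", self.name, target.name)

def run_simulation(bots, steps):
    """Run every bot for a number of steps and log each action."""
    log = []
    for _ in range(steps):
        for bot in bots:
            others = [b for b in bots if b is not bot]
            log.append(bot.act(others))
    return log

bots = [Bot("A", "scammer"), Bot("B", "susceptible"), Bot("C", "susceptible")]
events = run_simulation(bots, steps=2)
print(len(events))  # 3 bots x 2 steps = 6 logged actions
```

In the real system these interactions run on Facebook's production infrastructure rather than a toy loop, which is what distinguishes WES from traditional offline simulation.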

"It uses a software platform to simulate real-user interactions and social behaviour on the real platform infrastructure... Unlike traditional simulation, in which a model of reality is created, a WES system is built on a real-world software platform," the research paper states.

"The promise of WES is realistic, actionable, on-platform simulation of complex community interactions that can be used to better understand and automatically improve deployments of multi-user systems."

Facebook researchers developing the simulation said it would help detect bugs within the world's largest social network, which counts roughly 2.5 billion users worldwide.

Thousands of different scenarios can run simultaneously on the simulation, which will be used to automatically recommend updates and changes that could improve a human user's experience.

It is only lines of computer code that separate the AI bots from real Facebook users, though the researchers noted the risk of the experiment spilling over into the public version of the social network.

"Bots must be suitably isolated from real users to ensure that the simulation, although executed on real platform code, does not lead to unexpected interactions between bots and real users," the paper states.

"Despite this isolation, in some applications bots will need to exhibit high end user realism, which poses challenges for the machine learning approaches used to train them.
