Facebook live shooting video has social media companies scrambling

One of the shooters appears to have livestreamed the attack on Facebook (FB). The disturbing video, which has been verified by CNN, purportedly shows a gunman walking into a mosque and opening fire.

"New Zealand Police alerted us to a video on Facebook shortly after the live broadcast and we quickly removed both the Facebook and Instagram accounts and the video," Mia Garlick, Facebook's director of policy for Australia and New Zealand, said in a statement.

Hours after the attack, however, copies of the video continued to appear on Facebook, YouTube and Twitter, raising new questions about the companies' ability to manage harmful content on their platforms.

Facebook is "removing any praise or support for the crime and the shooter or shooters as soon as we're aware," Garlick said.

Twitter (TWTR) said it suspended an account related to the shooting.

YouTube, which is owned by Google (GOOGL), removes "shocking, violent and graphic content" as soon as it is made aware of it, according to a Google spokesperson.

New Zealand police asked social media users to stop sharing footage of the shooting and said they were seeking to have it taken down.

CNN has decided not to publish the video.

Tech firms 'don't see this as a priority'

This is the latest case of social media companies being caught off guard by perpetrators broadcasting videos of their crimes, and then by users sharing the disturbing footage. It has happened in the United States, Thailand, Denmark and other countries.

Friday's video reignites questions about how social media platforms handle offensive content: Are the companies doing enough to try to catch this type of content? How quickly should they be expected to remove it?

"While Google, YouTube, and Facebook all say that they're cooperating and acting in the best interests of citizens to remove this content, they are actually not because they're allowing these videos to reappear all the time," said Lucinda Creighton, a senior adviser at the Counter Extremism Project, an international policy organization.


Facebook's artificial intelligence tools and human moderators were apparently unable to detect the livestream of the shooting. The company says it was alerted to it by New Zealand police.

"The tech companies basically don't see this as a priority, they wrap their hands, they say this is terrible," Creighton said. "But what they're doing is preventing this from reappearing."

John Battersby, a counter-terrorism expert at Massey University in New Zealand, said the country had been spared mass terrorist attacks, partly because of its isolation. Social media had changed that.

"This fellow is a streamed shooter and supporter of some of them," he said. "Unfortunately once it's out there, it's downloaded, it can still be (online)," he added.

CNN law enforcement analyst Steve Moore, a retired supervisory special agent for the FBI, said the spread of the video could inspire copycats.

"What I tell the public is: Do you want to help terrorists? Because if you do, sharing this video is exactly how you do it," Moore said.

"Do not share the video or you are part of this," he added.

Hadas Gold, Donie O'Sullivan, Samuel Burke and Paul Murphy contributed to this report.
