Facebook said it took down a livestream of the shootings and removed the shooter's Facebook and Instagram accounts after being alerted by police. At least 49 people were killed at two mosques in Christchurch, New Zealand's third-largest city.
Facebook is "removing any praise or support for the crime and the shooter or shooters as soon as we're aware," she said. "We will continue working directly with New Zealand Police as their response and investigation continues."The furor highlights once again the speed at which graphic and disturbing content from a tragedy can spread around the world and how Silicon Valley tech giants are still grappling with how to prevent that from happening.
The video's spread underscores the challenge Facebook faces even after stepping up efforts to keep inappropriate and violent content off its platform. In 2017 the company said it would hire 3,000 more people to review videos and other posts, on top of the 4,500 already tasked with identifying criminal and other questionable material for removal.
Facebook and YouTube were designed to share pictures of babies, puppies and other wholesome things, he said, "but they were expanded at such a scale and built with no safeguards such that they were easy to hijack by the worst elements of humanity."
At one point, the shooter even paused to give a shout-out to one of YouTube's top personalities, the Swedish vlogger known as PewDiePie, who has made jokes criticized as anti-Semitic and posted Nazi imagery in his videos. The seemingly incongruous reference to the vlogger, known for his video game commentaries as well as his racist references, was instantly recognizable to many of his 86 million followers.
One social media user responded: "And yet these kinds of videos still exist after how long on YouTube. The hypocrisy just stinks."