Facebook Inc. said it’s reviewing its livestreaming policy after the terrorist attack at a mosque in Christchurch, New Zealand, that left 50 people dead was streamed live on the platform. The social media giant acknowledged that the attacker’s livestream was not taken down fast enough and that its artificial intelligence systems failed to flag the video at first. The company said its systems should flag videos containing suicidal or harmful acts, but that more needs to be done.

“AI has made massive progress over the years and in many areas, which has enabled us to proactively detect the vast majority of the content we remove,” Guy Rosen, Facebook’s vice president of product management, said Wednesday in a blog post. He added, unnecessarily: “But it’s not perfect.”

The video was seen live for about two minutes by fewer than 200 people before it was taken down after New Zealand police contacted Facebook, according to the company. No one reported it during the live broadcast. In total, Facebook said, the video was viewed about 4,000 times; the first report came in 29 minutes after the video was posted, 12 minutes after the livestream ended. Facebook said in the next 24…