Remove terror content in 1 hour or else, EU warns tech giants


"Terrorist content online poses a particularly grave risk to the security of Europeans, and its proliferation must be treated as a matter of the utmost urgency", the commission said in a statement, and urged the removal of all terror threats and propaganda within one hour of detection.

All platforms have been advised that they must remove terrorist content within one hour of its initial posting, and internet companies should implement automatic detection systems to "disable terrorist content and stop it from reappearing once it has been removed".

The Commission said that illegal content online not only undermines the trust of European Union citizens, but also poses security threats.

Internet companies have faced increased pressure from authorities to speed up the takedown of terrorist content and hate speech after a number of deadly terrorist attacks in Europe in recent years.

Significantly, the Commission has given internet firms one hour to remove illegal content, in an effort to stop unsafe and harmful propaganda and hate speech from spreading like wildfire. If sites fail to follow the rule, the European Union will work towards formal regulation, the WSJ noted.

One hour to take down terrorist content is too short, the Computer & Communications Industry Association, which speaks for companies such as Google and Facebook, said in a statement criticizing the EU's plans as harmful to the bloc's technology economy. "We share the goal of the European Commission to fight all forms of illegal content", the group said.

"The rule of law applies just as much online as it does offline", said Vice-President for the Digital Single Market Andrus Ansip, adding that online platforms are self-regulating the removal of more illegal content than ever before.

When it comes to risky content online, the European Union is not messing around.

But internet activists expressed concern the voluntary measures announced Thursday bypass those very rules that protect companies from having to actively monitor their websites and could lead to excessive blocking of free expression.

For terrorist and child sexual content in particular, the Commission said, it should be possible to automatically detect and remove material "which does not need contextualization to be deemed illegal".

"The European Commission is pushing "voluntary" censorship to internet giants to avoid legislation that would be subject to democratic scrutiny and judicial challenge", said Joe McNamee, executive director of the nonprofit European Digital Rights, or EDRI.

Social media sites and other platforms where content is provided by users should review their flagging processes to ensure they can meet the guidelines.