The European Parliament’s civil liberties committee (Libe) voted yesterday to back proposed legislation for a one-hour takedown rule for terrorist content online that critics argue will force websites to filter uploads.
The committee’s MEPs also backed large penalties for service providers who consistently and persistently fail to comply with the law, agreeing that they could be penalized with up to 4% of their global turnover, according to the Commission’s original proposal.
However, the committee rejected a push from the EU executive for the law to include the so-called “duty of care obligation” according to which internet companies would have to take proactive measures, including the use of automated detection tools. Critics have suggested that this would create a general obligation on platforms to monitor content and filter uploads.
Libe voted against the general obligation of hosts to monitor the information they transmit or store, and against having to actively look for facts that indicate illegal activity.
“If a company has been subject to a substantial number of removal orders, the authorities may request that it implement additional specific measures (for example, regularly informing the authorities or increasing human resources). The committee voted to exclude from these measures any obligation to monitor uploaded content and any use of automated tools,” the committee noted in a press release after the vote, which passed by 35 votes in favor to 1 against, with 8 abstentions.
“Furthermore, any decision in this regard must take into account the size and economic capacity of the company and the freedom to receive and impart information and ideas in an open and democratic society,” the committee added.
Nonetheless, critics argue that a one-hour takedown rule for terrorist content will bring in filters through the back door and/or push smaller websites onto larger platforms to avoid having to comply with a strict, one-size-fits-all deadline.
The Commission presented its proposals for new rules on removing terrorist content online last fall, although social media platforms had already been operating under an informal one-hour rule for removing illegal content in the region for over a year.
The bill seeks to convert the previous one-hour rule into formal legislation. But it would also apply to any Internet business that receives a terrorist content removal notice from a competent national authority, regardless of size. Hence it draws criticism for the burden it could place on smaller website operators.
Libe’s committee made some changes to the proposals intended to help smaller websites.
Specifically, it decided that the competent authority should contact companies that have never received a takedown order to provide them with information on procedures and timelines, and do so at least 12 hours before issuing the first order to remove the content they are hosting.
Commenting in a statement, Daniel Dalton (ECR, UK), rapporteur on the EP proposal, said: “Any new legislation must be practical and proportionate if we are to safeguard freedom of expression. Without a fair process, we run the risk of content being over-removed, as companies would understandably take a safety-first approach to protect themselves. It also absolutely cannot lead to general monitoring of content by the back door.”
However, after Libe’s vote, a vocal critic of the bill, Pirate Party member and MEP Julia Reda, argued that Libe’s 12-hour rule will do little to help website owners.

“That’s not enough to turn off your phone for the weekend,” she wrote, calling the proposal “a catastrophe for the work-life balance of small business owners and hobbyist websites.”
Only website owners who have never received a takedown order before get an additional 12 hours to react, once. That is not long enough to be able to turn off your phone for the weekend. A catastrophe for the work-life balance of small business owners and hobbyist websites. #TERREG
– Julia Reda (@Senficon) April 4, 2019
There is also the question of how terrorist content is defined online.
The Commission’s proposal says it refers to material and information posted online that “incites, encourages, or promotes terrorist crimes, provides instructions on how to commit such crimes, or promotes participation in activities of a terrorist group.”
“When assessing whether online content constitutes terrorist content, responsible authorities and hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made, including whether the material was released for educational, journalistic or research purposes, and the potential to lead to harmful consequences,” a Commission question-and-answer document on the September bill states.
The committee endorsed protections for terrorist content disseminated for educational, journalistic or research purposes, and agreed with the Commission’s earlier warning that the expression of polemic or controversial views on sensitive political issues should not be considered terrorist content.
Although, again, critics are not convinced the legislation won’t end up chilling speech across the bloc as platforms and websites seek to shrink their compliance risk.
The European Parliament as a whole will vote on the bill next week. After that, a new parliament, elected next month, will be responsible for negotiating with member state representatives in the Council of Ministers, a process that will determine the final shape of the legislation.