Tech giants, under more pressure than ever to police content on their platforms, are facing sweeping guidelines announced by European regulators Thursday.

Most notably, the European Commission is asking Facebook, Google and Twitter to remove terrorist and other illegal content within an hour of being notified of its presence.

"While several platforms have been removing more illegal content than ever before -- showing that self-regulation can work -- we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights," said Andrus Ansip, vice president for the digital single market for the Brussels-based commission, in a statement.

Is it possible to remove such content within an hour?

Facebook said it removes 83 percent of terrorism-related content within an hour of upload.

"As the latest figures show, we have already made good progress removing various forms of illegal content," a Facebook spokeswoman said Thursday. "We continue to work hard to remove hate speech and terrorist content while making sure that Facebook remains a platform for all ideas."

Facebook also said it removes 99 percent of ISIS- and Al-Qaeda-related content before it is flagged by users.

The commission also is asking the tech companies to establish systems to proactively detect and remove such content, and wants the companies to report on their progress every three months.

The commission said it will assess the companies' progress on terrorist content within three months, and on other illegal content within six months; if it isn't satisfied, legislation could follow. The EU guidelines are non-binding but can be referenced in court.

European regulators have been pressing tech companies over extremist content for years, but the stepped-up efforts come in the wake of recent terrorist attacks on the continent -- and amid Russian-linked political misinformation campaigns in Europe and the United States.

Tech giants use a combination of automated systems and human moderators to vet content. Facebook and Google have each said they have about 10,000 people reviewing content, and Facebook said it plans to hire 10,000 more by the end of the year.

Google and Twitter did not respond to requests for comment.