The private messages Facebook users send to each other through Messenger aren't so private. Facebook, already under fire for how it handles profile data, confirmed to Bloomberg on Wednesday that it scans the text and images people send to one another on Messenger to make sure they comply with the company's content rules.

And it blocks messages that don't comply.

The company said it uses the same automated tools to scan Messenger for abuse as it does on Facebook in general.

"For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses," a Facebook Messenger spokeswoman said in a statement to Bloomberg. "Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform."

The company told Bloomberg it doesn't use data from scanned messages for advertising.

Concerns about whether Facebook was snooping on Messenger rose this week after the site's founder, Mark Zuckerberg, alluded to the practice in an interview with Vox's Ezra Klein. Zuckerberg told Klein that Facebook had blocked sensational messages about ethnic cleansing in Myanmar from being sent through Messenger.

Facebook launched Messenger as a stand-alone app in 2014. That same year, Facebook paid $19 billion for WhatsApp, a chat app similar to Messenger. Messenger topped 1 billion monthly users in 2017. WhatsApp had 1 billion daily users in 2017.

WhatsApp encrypts messages end to end, so the company itself cannot read them, according to Bloomberg.

Facebook has been under intense scrutiny since news broke that private information from about 50 million users was accessed by Cambridge Analytica, a political consulting firm with connections to President Donald Trump's 2016 campaign. Zuckerberg agreed to testify before Congress next week.

The company also announced Wednesday a slew of changes to its privacy policies.

The new privacy policy aims to explain more clearly what data Facebook gathers about users -- but doesn't actually change what the company collects and shares. Facebook says the changes aren't prompted by recent events or by tighter privacy rules coming from the EU.

As Facebook evolved from a closed Harvard-only network with no ads to a giant corporation with $40 billion in advertising revenue and huge subsidiaries like Instagram and WhatsApp, its privacy policy has also shifted -- over and over.

Almost always, critics say, the changes meant a move away from protecting user privacy and toward pushing openness and more sharing. On the other hand, regulatory and user pressure has sometimes led Facebook to pull back on its data collection and use, and to explain things in plainer language -- in contrast to the dense legalese many other internet companies circulate.

Among the changes:

--Facebook has added a section explaining that it collects people's contact information if they choose to "upload, sync or import" this to the service. This may include users' address books on their phones, as well as their call logs and text histories. The new policy says Facebook may use this data to help "you and others find people you may know."

--The previous policy did not mention call logs or text histories. Several users were surprised to learn recently that Facebook had been collecting information about whom they texted or called and for how long, though not the actual contents of text messages. It seemed to have been done without explicit consent, though Facebook says it only collected such data from Android users who specifically allowed it to do so -- for instance, by agreeing to permissions when installing Facebook.

--The new policy also makes it clear that WhatsApp and Instagram are part of Facebook and abide by the same privacy policy as their parent. The two were not mentioned in the previous policy. While WhatsApp still doesn't show advertisements, Instagram long has, and the policy consolidation could be a sign of things to come for WhatsApp as well.