Meta is doubling down on its efforts to prevent the spread of intimate photos and videos of young people on its apps. The company announced on Monday that Facebook and Instagram are founding members of Take It Down, a new initiative from the National Center for Missing and Exploited Children (NCMEC) that will make it easier for young users and their parents to remove their intimate photos posted online. French social network Yubo, adult site PornHub, and content subscription service OnlyFans are also members of the initiative.
“Take It Down works by assigning a unique digital fingerprint, called a hash value, to nude, partially nude, or sexually explicit images or videos of people under the age of 18,” NCMEC explains. Anyone who wants to take down an explicit photo or video can visit the Take It Down website and select the file from their phone or computer. Note that this does not upload the file anywhere. Instead, Take It Down analyzes the file on-device to create a “hash,” or digital fingerprint, of the file. This hash can identify an exact copy of that photo or video across supported online platforms.
Once the hash is created, Take It Down adds it to a secure list maintained by NCMEC. Participating companies can use that list to scan their platforms. If the system returns a match, i.e., finds an exact copy of the photo or video you want to take down, it initiates action to limit its spread. The platform may eventually remove the file and block the account that posted it. Throughout this process, your identity remains anonymous, protecting your privacy. Note that Take It Down retains the hash so it can block future uploads of the same explicit photo or video.
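The mechanism described above — fingerprint a file locally, share only the hash, then check uploads against a hash list — can be illustrated with a minimal sketch. This is an assumption-laden simplification, not Take It Down's actual implementation: it uses a plain SHA-256 cryptographic hash, which (like the article's description) matches only exact byte-for-byte copies of a file. The function names `fingerprint` and `is_blocked` are hypothetical.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a digital fingerprint (hash) of a media file on-device.

    Only this hex string would ever be shared with a hash list;
    the file itself never leaves the device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_blocked(path: str, hash_list: set[str]) -> bool:
    """How a participating platform might screen an upload:
    hash it locally and check for membership in the shared list."""
    return fingerprint(path) in hash_list
```

Because a cryptographic hash changes completely if even one byte differs, real systems for this problem typically also use perceptual hashes (such as Meta's open-source PDQ) that survive resizing and re-encoding; the sketch above only captures the exact-copy case the article describes.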
Meta financially supported the development of this initiative
In a press release, Meta’s Global Head of Safety Antigone Davis said that the company financially supported the development of Take It Down. It is now working with NCMEC to promote the service across its platforms, including Facebook and Instagram. The social media giant also plans to integrate the service with these platforms, making it easier for users to report potentially violating content. Take It Down will likely become part of the existing content reporting system on Facebook and Instagram.
This is the latest in a string of efforts Meta has made over the years to make its platforms safer for young people. The company says it has developed more than 30 tools to support the safety of teens and families across its apps. In November last year, it blocked “suspicious adults” from messaging teens on Instagram and Facebook. It has also joined forces with various other companies in the past to combat sextortion and prevent the sharing of any form of child sexual abuse material (CSAM) online.
2023-02-28 15:05:52