
Technology Tool Aims to Let Young People Remove Explicit Images from Internet

A new tool called Take It Down promises to remove explicit pictures that young people put on the internet before they were 18 years old.

The National Center for Missing and Exploited Children (NCMEC) is a non-profit organization that operates Take It Down. The tool is partly funded by Meta Platforms, which owns social media services Facebook and Instagram.

"Once you send that photo, you can't take it back," is a warning that teenagers who upload photos to the internet often hear from adults. But some young people feel pressure to take and send explicit photos of themselves. They might not fully understand the bad things that result from sending such pictures.

Users of the site can create a digital “fingerprint” of the image without uploading the actual picture, so the process stays private. The fingerprint is a unique set of numbers called a “hash.” The numbers go into a database. Technology companies that take part in the program then remove the images from their websites.
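
The article does not say exactly which hashing method Take It Down uses. As a rough sketch only, the short Python example below uses the open-source Pillow and ImageHash libraries to turn a picture into a perceptual hash on the user's own computer. The file name is made up for the example, and only the resulting number, not the photo, would ever be shared.

    # Illustration only: compute an image "fingerprint" locally.
    # The real Take It Down service may use a different hashing method.
    from PIL import Image     # pip install pillow
    import imagehash          # pip install ImageHash

    # The picture is opened on this computer and is never uploaded.
    photo = Image.open("my_photo.jpg")      # hypothetical file name
    fingerprint = imagehash.phash(photo)    # 64-bit perceptual hash
    print(fingerprint)                      # only this short string would be shared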

But there are already a few problems with the tool.

Only a few social media platforms have joined the program so far: Meta’s Facebook and Instagram and another site called Yubo. Two adult sites are also taking part.

Another problem happens if the image or video is changed in any way, for example, if a person changes the size or turns the image into a meme. The changed image gets a new hash number. A photo that has a filter on it will have a similar hash that is usually different by only one letter or number.
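
The description above suggests a perceptual hash, in which a lightly edited copy still produces a nearby number while bigger changes push the numbers further apart. As a hedged sketch under that assumption, and with invented file names, the lines below compare the hashes of an original photo and an edited copy.

    # Illustration only: compare perceptual hashes of an original photo
    # and an edited copy. File names are hypothetical examples.
    from PIL import Image
    import imagehash

    original = imagehash.phash(Image.open("original.jpg"))
    edited = imagehash.phash(Image.open("with_filter.jpg"))

    # Subtracting two hashes gives the number of bits that differ.
    print(original - edited)
    # A small difference suggests the same picture with light edits; heavy
    # cropping or meme-style changes can make the difference much larger,
    # which is why changed images can slip past the matching system.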

Gavin Portnoy is a spokesperson for the NCMEC. He said Take It Down is made for teens who think their images could be or are already out on the internet. The release could have happened through sharing, or the young person might have felt forced to put it on the internet.

Portnoy described it this way: “You're a teen and you're dating someone, and you share the image. Or somebody extorted you…”

Portnoy said the privacy of the site helps teens feel more at ease than if they go to law enforcement, which would not be private.

"To a teen who doesn't want that level of involvement, they just want to know that it's taken down, this is a big deal for them,” Portnoy said.

NCMEC said it has seen a 35 percent increase in the number of reports of internet abuse of children from 2020 to 2021. The group operates a tipline, called CyberTipline, where people can report internet child abuse. NCMEC said it received 29.3 million reports in 2021.

Similar tools have been created and used, but with less success.

In 2017, Facebook released a similar tool for adults. The site asked people to send their encrypted explicit photos to the company. The service was tested in Australia for a short period of time, but people did not trust the site or Facebook.

In 2021, Facebook helped launch another tool for adults called StopNCII, or Stop Nonconsensual Intimate Images. A British non-profit group operates the website, but anyone in the world can use it.

Since then, online sexual abuse has gotten much worse for children, teens and adults.

Many tech companies already use the hash system to remove and report images of child sexual abuse. The goal is to have more companies sign up for the program.

Portnoy said, “We never had anyone say no.”

Two companies, Twitter and TikTok, have not yet joined the project. Neither company had commented to the Associated Press as of Sunday.

Meta’s global head of safety, Antigone Davis, said that Take It Down is just one of many tools the company uses to prevent and report child sexual abuse and exploitation.

Davis said Meta has supported the development of the tool and has taken several steps of its own. Davis said the company does not permit “unconnected adults” to send messages to children under the age of 18.

She also said that the site works both with real images and with images created using artificial intelligence, known as deepfakes. Deepfakes are pictures or videos that show real people saying or doing things that they did not actually say or do.

Words in This Story

explicit – adj. showing or talking about sex or violence in a very detailed way

fund – v. to provide with money for a particular purpose

fingerprint – n. a distinctive or identifying mark or characteristic

unique – adj. one of a kind; like no other

meme – n. a picture, video or other kind of image spread widely through the internet that people find funny, interesting or meaningful

filter – n. a tool for selecting or removing a particular kind of information

extort – v. to get money or something from someone by threatening harm

dating – v. to make a usually romantic social arrangement to meet with someone; to have a date with

encrypt – v. to change (information) from one form to another to hide its meaning

nonconsensual – adj. without the willing agreement of all the people involved

intimate – adj. marked by a warm friendship and close physical or sexual connection

https://learningenglish.voanews.com/a/technology-tool-aims-to-let-young-people-remove-explicit-images-from-internet/6985739.html