
X confirms it restricted Taylor Swift searches to 'prioritize safety'

Following the discovery of pornographic deepfakes of the singer on the platform this week, X has confirmed that it is blocking users from searching for Taylor Swift. On Saturday, users began to notice that certain searches involving Swift's name returned only an error message. "This is a temporary action and done with an abundance of caution as we prioritize safety on this issue," Joe Benarroch, head of business operations at X, said in a statement to the Wall Street Journal.

The restriction comes days after the situation first came to light, and X has been criticized for responding slowly to the spread of the nonconsensual, sexually explicit content. After the images went viral, Swift's fans took matters into their own hands, mass-reporting the accounts that posted them and flooding Swift-related hashtags with supportive messages to push the images out of view and get them removed, as NBC News reported earlier this week. Some of the offending posts were viewed millions of times before the accounts were suspended; one post drew over 45 million views, according to The Verge.

Later that day, X released a statement on its platform that read, "We have a zero-tolerance policy towards such content and strictly prohibit the posting of Non-Consensual Nudity (NCN) images on X." The statement added that its teams were actively removing all identified images and penalizing the accounts that posted them, and that the company was monitoring the situation closely to ensure further violations were addressed immediately, with the goal of keeping the environment civil and safe for all users. Days later, however, it was still possible to find the images.
404 Media determined that the images most likely came from a Telegram group known for using free tools, such as Microsoft Designer, to create nonconsensual AI-generated images of women. In an interview with NBC News's Lester Holt, Microsoft CEO Satya Nadella said the problem underscores the company's responsibility and "all of the guardrails that we need to place around the technology so that there's more safe content that's being produced."

By Omal J

Omal J worked for both print and electronic media as a feature journalist. Writing, traveling, and DIY sum up her life.
