
Layoffs Have Gutted Twitter’s Child Safety Team


Eliminating child exploitation is "priority #1," Twitter's new owner and CEO, Elon Musk, declared last week. But at the same time, following widespread layoffs and resignations, just one staff member remains on a key team dedicated to removing child sexual abuse content from the site, according to two people with knowledge of the matter, both of whom requested anonymity.

It’s unclear how many people were on the team before Musk took over. On LinkedIn, WIRED identified four Singapore-based employees specializing in child safety who publicly announced that they had left Twitter in November.

The importance of in-house child safety professionals cannot be overstated, researchers say. Based at Twitter's Asian headquarters in Singapore, the team enforces the company's ban on child sexual abuse material (CSAM) across the Asia Pacific region. That group now has just one full-time employee. The Asia Pacific region is home to about 4.3 billion people, roughly 60 percent of the world's population.

The Singapore team is responsible for some of the platform's busiest markets, including Japan. Twitter has 59 million users in Japan, second only to the United States, according to data aggregator Statista. Yet the Singapore office has also been hit by the widespread layoffs and resignations that followed Musk's takeover. Over the past month, Twitter laid off half its workforce and then emailed remaining employees asking them to choose between committing to working "long hours at high intensity" or accepting a severance package of three months' pay.

Carolina Christofoletti, a CSAM researcher at the University of São Paulo in Brazil, says the impact of the layoffs and resignations on Twitter's ability to tackle CSAM is "very disturbing." "It's an illusion to think that if the people working on child safety inside Twitter are laid off or allowed to resign, there will be no impact on the platform," she says. Twitter did not immediately respond to a request for comment.

Twitter's child safety experts do not fight CSAM on the platform alone. They get help from organizations such as the UK-based Internet Watch Foundation (IWF) and the US-based National Center for Missing & Exploited Children, which also scour the internet to identify CSAM being shared on platforms like Twitter. The IWF says the data it sends to tech companies can be automatically removed by those companies' systems, without requiring human review. "This ensures that the blocking process is as efficient as possible," says Emma Hardy, IWF communications director.

But these external organizations focus on the end product and lack access to Twitter's internal data, Christofoletti says. She describes internal dashboards as playing an important role in analyzing metadata to help the people writing detection code identify CSAM networks before content is shared. "The only people who can see that [metadata] are the ones inside the platform," she says.

Twitter's efforts to crack down on CSAM are complicated by the fact that it allows people to share consensual pornography. According to Arda Gerkens, who runs EOKM, a Dutch organization for reporting CSAM online, the tools platforms use to scan for child abuse struggle to differentiate between a consenting adult and a nonconsenting child. "The technology is still not good enough," she says, adding that this is why human staff are so important.

Twitter's battle to stop the spread of child sexual abuse material on its site predates Musk's takeover. In its latest transparency report, covering July to December 2021, the company said it suspended more than half a million accounts for CSAM, a 31 percent increase over the previous six months. In September, brands including Dyson and Forbes suspended ad campaigns after their ads appeared alongside child abuse content.

Twitter was also forced to delay its plan to monetize consensual adult content and compete with OnlyFans, over concerns that doing so would worsen the platform's CSAM problem. "Twitter cannot accurately detect child sexual exploitation and nonconsensual nudity at scale," said an internal report from April 2022 obtained by The Verge.

Researchers are worried about how Twitter will deal with CSAM under its new ownership. Those concerns were only heightened when Musk asked his followers to "reply in the comments" if they saw any issues on Twitter that needed addressing. "This question shouldn't be a Twitter thread," Christofoletti says. "That's exactly the question he should be asking the child safety team that he laid off. That's the contradiction here."
