Wall Street Journal 해석

Instagram Serves Up Toxic Video Mix : 인스타그램은 유해한 비디오 믹스 제공

춰리 2023. 11. 28. 21:31


WSJ test accounts set to follow young gymnasts got risqué footage of children
어린 체조 선수를 팔로우하도록 설정된 WSJ 테스트 계정에 어린이의 외설적인 영상이 제공되었습니다.

 

Social-media platforms like videos because they hold user attention longer. Meta CEO Mark Zuckerberg in September. GODOFREDO A. VÁSQUEZ/ASSOCIATED PRESS _ 소셜 미디어 플랫폼은 사용자의 관심을 더 오래 붙잡아 두기 때문에 비디오를 선호합니다. 9월의 메타 CEO 마크 저커버그(Mark Zuckerberg). 고도프레도 A. 바스케스(Godofredo A. Vásquez)/AP통신

 

Instagram owner Meta created Reels to compete with the video-sharing platform TikTok. KORI SUZUKI FOR THE WALL STREET JOURNAL _ 인스타그램 소유자 메타는 동영상 공유 플랫폼 틱톡과 경쟁하기 위해 릴스를 만들었습니다. 월스트리트 저널의 코리 스즈키

 

BY JEFF HORWITZ AND KATHERINE BLUNT

Instagram’s Reels video service is designed to show users streams of short videos on topics the system decides will interest them, such as sports, fashion or humor.
인스타그램의 릴스 비디오 서비스는 사용자들에게 스포츠, 패션, 유머 등 시스템이 관심을 가질 것이라고 결정한 주제에 대한 짧은 비디오 스트림을 보여주기 위해 고안되었습니다.

The Meta Platforms-owned social app does the same thing for users its algorithm decides might have a prurient interest in children, testing by The Wall Street Journal showed.
월스트리트 저널의 테스트 결과, Meta Platforms가 소유한 이 소셜 앱은 알고리즘이 아동에 대한 외설적 관심을 가질 수 있다고 판단한 사용자들에게도 똑같은 방식으로 작동하는 것으로 나타났습니다.

The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.
저널은 플랫폼에서 활동하는 어린 체조 선수, 치어리더, 기타 10대 및 10대 초반 인플루언서만 팔로우하도록 설정된 테스트 계정에 인스타그램의 릴스 알고리즘이 무엇을 추천하는지 확인하고자 했습니다.

Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.
인스타그램 시스템은 어린이의 외설적인 영상과 노골적으로 성적인 성인용 동영상, 그리고 미국 최대 브랜드들의 광고가 뒤섞인 충격적인 분량의 선정적 콘텐츠를 해당 테스트 계정에 제공했습니다.

The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.
저널은 이러한 청소년 계정의 수천 명의 팔로워 중에 성인 남성이 다수 포함되는 경우가 많고, 그 어린이들을 팔로우하는 계정 중 상당수가 어린이와 성인 모두와 관련된 성적 콘텐츠에 관심을 보인 것을 관찰한 후 테스트 계정을 설정했습니다. 저널은 또한 테스트 계정이 그러한 사용자 일부를 팔로우한 후 알고리즘이 무엇을 추천하는지도 테스트했는데, 그 결과 광고 사이사이에 더욱 충격적인 콘텐츠가 나타났습니다.

In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.
인스타그램이 추천한 비디오 스트림에서, 누군가가 실물 크기 라텍스 인형의 얼굴을 쓰다듬는 비디오와 얼굴이 디지털로 가려진 어린 소녀가 셔츠를 들어 올려 배를 드러내는 비디오 사이에 데이트 앱 범블(Bumble)의 광고가 나타났습니다. 또 다른 스트림에서는 자막에 10세 소녀라고 적힌 아이를 팔로 감싸 안고 침대에 누워 있는 남성의 비디오 뒤에 피자헛(Pizza Hut) 광고가 이어졌습니다.

The Canadian Centre for Child Protection, a child-protection group, separately ran similar tests on its own, with similar results.
아동 보호 단체인 캐나다 아동보호센터(Canadian Centre for Child Protection)도 자체적으로 비슷한 테스트를 실시해 비슷한 결과를 얻었습니다.

Meta said the Journal’s tests produced a manufactured experience that doesn’t represent what billions of users see. The company declined to comment on why the algorithms compiled streams of separate videos showing children, sex and advertisements, but a spokesman said that in October it introduced new brand safety tools that give advertisers greater control over where their ads appear, and that Instagram either removes or reduces the prominence of four million videos suspected of violating its standards each month.
메타는 저널의 테스트가 수십억 명의 사용자가 실제로 보는 것과는 다른 인위적인 경험을 만들어냈다고 말했습니다. 회사는 알고리즘이 어린이, 성적 콘텐츠, 광고를 보여주는 별개의 비디오들을 하나의 스트림으로 엮은 이유에 대해서는 언급을 거부했지만, 대변인은 10월에 광고주가 광고 게재 위치를 더 잘 통제할 수 있는 새로운 브랜드 안전 도구를 도입했으며, 인스타그램은 매달 기준 위반이 의심되는 4백만 개의 비디오를 삭제하거나 노출을 줄이고 있다고 말했습니다.

The Journal reported in June that algorithms run by Meta, which owns both Facebook and Instagram, connect large communities of users interested in pedophilic content. The Meta spokesman said a task force set up after the Journal’s article has expanded its automated systems for detecting users who behave suspiciously, taking down tens of thousands of such accounts each month. The company also is participating in a new industry coalition to share signs of potential child exploitation.
저널은 지난 6월, 페이스북과 인스타그램을 모두 소유한 메타(Meta)가 운영하는 알고리즘이 소아성애적 콘텐츠에 관심이 있는 대규모 사용자 커뮤니티들을 서로 연결해 준다고 보도했습니다. 메타 대변인은 저널의 기사 이후 구성된 태스크포스(TF)가 수상한 행동을 보이는 사용자를 감지하는 자동화 시스템을 확장해 매달 수만 개의 계정을 삭제하고 있다고 말했습니다. 이 회사는 또한 잠재적 아동 착취의 징후를 공유하기 위한 새로운 업계 연합에도 참여하고 있습니다.

Companies whose ads appeared beside inappropriate content in the Journal’s tests include Disney, Walmart, online dating company Match Group, Hims, which sells erectile-dysfunction drugs, and The Wall Street Journal itself. Most brand-name retailers require that their advertising not run next to sexual or explicit content.
저널의 테스트에서 부적절한 콘텐츠 옆에 광고가 등장한 회사로는 디즈니, 월마트, 온라인 데이트 회사 매치 그룹, 발기부전 치료제를 판매하는 힘스(Hims), 그리고 월스트리트 저널 자체가 있습니다. 대부분의 유명 브랜드 소매업체는 자사 광고가 성적이거나 노골적인 콘텐츠 옆에 게재되지 않도록 요구합니다.

“Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” said Samantha Stetson, a Meta vice president who handles relations with the advertising industry. She said the prevalence of inappropriate content on Instagram is low, and that the company invests heavily in reducing it.
광고 산업과의 관계를 다루는 메타 부사장 사만다 스테슨(Samantha Stetson)은 "우리의 시스템은 유해한 콘텐츠를 줄이는 데 효과적이며 안전, 보안 및 브랜드 적합성 솔루션에 수십억 달러를 투자했습니다."라고 말했습니다. 그녀는 인스타그램에서 부적절한 콘텐츠의 보급률이 낮고, 회사가 이를 줄이는 데 많은 투자를 한다고 말했습니다.

After the Journal contacted companies whose ads appeared in the testing next to inappropriate videos, several said that Meta told them it was investigating and would pay for brand-safety audits from an outside firm.
저널이 테스트에서 부적절한 동영상 옆에 광고가 게재된 회사에 연락한 후 몇몇은 Meta가 조사 중이며 외부 회사의 브랜드 안전 감사 비용을 지불할 것이라고 말했습니다.

Ad problem
광고 문제


Following what it described as Meta’s unsatisfactory response to its complaints, Match began canceling Meta advertising for some of its apps, such as Tinder, in October. It has since halted all Reels advertising and stopped promoting its major brands on any of Meta’s platforms. “We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” said Match spokeswoman Justine Sacco.
메타가 자사의 항의에 만족스럽게 대응하지 않았다고 판단한 매치는 10월부터 틴더 등 일부 앱에 대한 메타 광고를 취소하기 시작했습니다. 이후 모든 릴스 광고를 중단하고 메타의 어떤 플랫폼에서도 자사의 주요 브랜드를 홍보하지 않고 있습니다. 매치의 대변인 저스틴 사코(Justine Sacco)는 "우리는 우리 브랜드를 포식자들에게 마케팅하거나 이런 콘텐츠 근처 어디에든 광고를 게재하기 위해 메타에 돈을 지불할 생각이 없습니다."라고 말했습니다.

Robbie McKay, a spokesman for Bumble, said it “would never intentionally advertise adjacent to inappropriate content,” and that the company is suspending its ads across Meta’s platforms.
범블의 대변인 로비 맥케이는 "절대 의도적으로 부적절한 콘텐츠에 인접한 광고를 하지 않을 것"이라며 메타의 플랫폼 전반에 걸쳐 광고를 중단하고 있다고 말했습니다.

Charlie Cain, Disney’s vice president of brand management, said the company has set strict limits on what social media content is acceptable for advertising and has pressed Meta and other platforms to improve brand safety features. A company spokeswoman said that since the Journal presented its findings to Disney, the company had been working on addressing the issue at the “highest levels at Meta.”
디즈니의 브랜드 관리 부사장인 찰리 케인(Charlie Cain)은 회사가 광고를 허용할 수 있는 소셜 미디어 콘텐츠에 엄격한 제한을 두고 있으며, 메타를 비롯한 플랫폼들에 브랜드 안전 기능을 개선하라고 압박해 왔다고 말했습니다. 회사 대변인은 저널이 조사 결과를 디즈니에 전달한 이후 회사가 "메타의 최고위층"과 이 문제 해결에 힘써 왔다고 말했습니다.

Walmart declined to comment, and Pizza Hut didn’t respond to requests for comment.
월마트는 논평을 거부했고, 피자헛은 논평 요청에 응답하지 않았습니다.

Hims said it would press Meta to prevent such ad placement, and that it considered Meta’s pledge to work on the problem encouraging.
Hims는 그러한 광고 게재를 막도록 메타를 압박할 것이며, 메타가 이 문제 해결에 나서겠다고 약속한 것을 고무적으로 본다고 말했습니다.

The Journal said that it was alarmed that its ad appeared next to a video of an apparent adult sex act and that it would demand action from Meta.
저널은 자사 광고가 명백한 성인 성행위로 보이는 영상 옆에 게재된 것에 경악했으며 메타에 조치를 요구할 것이라고 밝혔습니다.

Meta created Reels to compete with TikTok, the video sharing platform owned by Beijing-based ByteDance. Both products feed users a nonstop succession of videos posted by others, and make money by inserting ads among them. Both companies’ algorithms show to a user videos the platforms calculate are most likely to keep that user engaged, based on his or her past viewing behavior.
Meta는 베이징에 본사를 둔 ByteDance가 소유한 비디오 공유 플랫폼인 TikTok과 경쟁하기 위해 Reels를 만들었습니다. 두 제품 모두 다른 사람이 게시한 동영상을 끊임없이 이어서 사용자에게 제공하고, 그 사이에 광고를 삽입해 수익을 창출합니다. 두 회사의 알고리즘은 사용자의 과거 시청 행동을 바탕으로, 해당 사용자의 참여를 붙잡아 둘 가능성이 가장 높다고 계산된 비디오를 보여줍니다.

The Journal reporters set up the Instagram test accounts as adults on newly purchased devices and followed the gymnasts, cheerleaders and other young influencers. The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch.
저널 기자들은 새로 구입한 기기에서 성인 명의로 인스타그램 테스트 계정을 설정하고 체조 선수, 치어리더 및 기타 어린 인플루언서들을 팔로우했습니다. 테스트 결과, 어린 소녀들만 팔로우해도 인스타그램은 성인 성적 콘텐츠를 홍보하는 계정의 동영상을 주요 소비자 브랜드 광고와 함께 제공하기 시작하는 것으로 나타났습니다. 예컨대 가랑이를 노출한 여성의 동영상 뒤에 월마트 광고가 이어졌습니다.

When the test accounts then followed some users who followed those same young people’s accounts, they yielded even more disturbing recommendations. The platform served a mix of adult pornography and child sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.
그런 다음 테스트 계정이 그 어린이들의 계정을 팔로우하는 사용자 일부를 팔로우하자, 훨씬 더 충격적인 추천이 나타났습니다. 플랫폼은 옷을 입은 소녀가 자신의 몸통을 쓰다듬는 영상과 어린이가 성행위를 흉내 내는 영상 등, 성인 포르노와 아동을 성적 대상화하는 자료를 뒤섞어 제공했습니다.

Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.
알고리즘 추천 시스템 전문가들은 저널의 테스트가 보여주는 바에 대해, 체조는 무해한 주제로 보일 수 있지만 Meta의 행동 추적 시스템은 10대 초반 소녀를 팔로우하는 일부 인스타그램 사용자가 어린이를 성적으로 묘사하는 동영상을 보고 싶어 한다는 것을 파악했고, 그런 콘텐츠를 해당 사용자들에게 보내고 있다고 말했습니다.

“Niche content provides a much stronger signal than general interest content,” said Jonathan Stray, senior scientist for the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley.
버클리 캘리포니아 대학 인간 호환 인공 지능 센터의 선임 과학자인 조나단 스트레이(Jonathan Stray)는 “틈새 콘텐츠는 일반적인 관심 콘텐츠보다 훨씬 더 강력한 신호를 제공합니다.”라고 말했습니다.

Current and former Meta employees said in interviews that the tendency of Instagram algorithms to aggregate child sexualization content from across its platform was known internally to be a problem. Once Instagram pigeonholes a user as interested in any particular subject matter, they said, its recommendation systems are trained to push more related content to them.
현직 및 전직 Meta 직원들은 인터뷰에서, 인스타그램 알고리즘이 플랫폼 전반의 아동 성적 대상화 콘텐츠를 한데 모으는 경향이 있다는 것이 내부적으로도 문제로 알려져 있었다고 말했습니다. 인스타그램이 사용자를 특정 주제에 관심 있는 것으로 분류하고 나면, 추천 시스템은 그와 관련된 콘텐츠를 더 많이 밀어주도록 훈련되어 있다고 그들은 말했습니다.

Preventing the system from pushing noxious content to users interested in it, they said, requires significant changes to the recommendation algorithms that also drive engagement for normal users. Company documents reviewed by the Journal show that the company’s safety staffers are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount.
시스템이 유해한 콘텐츠를 그에 관심 있는 사용자에게 밀어주는 것을 막으려면, 일반 사용자의 참여도 함께 끌어올리는 추천 알고리즘 자체를 크게 바꿔야 한다고 그들은 말했습니다. 저널이 검토한 회사 문서에 따르면, 회사의 안전 담당 직원들은 일일 활성 사용자 수를 측정 가능한 수준으로라도 줄일 수 있는 플랫폼 변경을 대체로 할 수 없게 되어 있습니다.

The test accounts showed that advertisements were regularly added to the problematic Reels streams. Ads encouraging users to visit Disneyland for the holidays ran next to a video of an adult acting out having sex with her father, and another of a young woman in lingerie with fake blood dripping from her mouth. An ad for Hims ran shortly after a video depicting an apparently anguished woman in a sexual situation along with a link to what was described as “the full video.”
테스트 계정에서는 문제성 Reels 스트림에 광고가 정기적으로 삽입되는 것으로 나타났습니다. 연휴에 디즈니랜드 방문을 권유하는 광고는 성인이 자신의 아버지와 성관계하는 상황을 연기하는 영상, 그리고 입에서 가짜 피를 흘리며 란제리를 입은 젊은 여성의 영상 옆에 게재되었습니다. Hims 광고는 성적인 상황에서 고통스러워하는 듯한 여성을 묘사한 영상과 "전체 영상"이라고 설명된 링크 직후에 게재되었습니다.

Even before the 2020 launch of Reels, Meta employees understood that the product posed safety concerns, according to former employees.
전(前) 직원에 따르면 2020년 Reels 출시 이전에도 Meta 직원은 제품이 안전 문제를 야기한다는 것을 이해했습니다.

Part of the problem is that automated enforcement systems have a harder time parsing video content than text or still images. Another difficulty arises from how Reels works: Rather than showing content shared by users’ friends, the way other parts of Instagram and Facebook often do, Reels promotes videos from sources they don’t follow.
문제의 일부는 자동화된 단속 시스템이 텍스트나 정지 이미지보다 비디오 콘텐츠를 분석하기가 더 어렵다는 점입니다. 또 다른 어려움은 Reels의 작동 방식에서 비롯됩니다. 인스타그램과 페이스북의 다른 영역이 흔히 그러듯 사용자의 친구가 공유한 콘텐츠를 보여주는 대신, Reels는 사용자가 팔로우하지 않는 출처의 비디오를 추천합니다.

Internal concerns
내부 우려


In an analysis conducted shortly before the introduction of Reels, Meta’s safety staff flagged the risk that the product would chain together videos of children and inappropriate content, according to two former staffers. Vaishnavi J, Meta’s former head of youth policy, described the safety review’s recommendation as: “Either we ramp up our content detection capabilities, or we don’t recommend any minor content,” meaning any videos of children.
두 명의 전직 직원에 따르면, Reels 도입 직전에 수행된 분석에서 Meta의 안전 담당 직원들은 이 제품이 어린이 동영상과 부적절한 콘텐츠를 연이어 보여줄 위험을 지적했습니다. Meta의 전 청소년 정책 책임자인 Vaishnavi J는 안전 검토의 권고를 "콘텐츠 감지 능력을 대폭 강화하거나, 아니면 미성년자 콘텐츠, 즉 어린이가 등장하는 모든 동영상을 일절 추천하지 않아야 한다"는 것이었다고 설명했습니다.

At the time, TikTok was growing rapidly, drawing the attention of Instagram’s young users and the advertisers targeting them. Meta didn’t adopt either of the safety analysis’s recommendations at that time, according to J.
당시 틱톡은 급속히 성장하며 인스타그램의 젊은 사용자들과 이들을 겨냥한 광고주들의 관심을 끌어가고 있었습니다. J에 따르면 Meta는 당시 안전 분석의 권고 중 어느 것도 채택하지 않았습니다.

Stetson, Meta’s liaison with digital ad buyers, disputed that Meta had neglected child safety concerns ahead of the product’s launch. “We tested Reels for nearly a year before releasing it widely, with a robust set of safety controls and measures,” she said.
Meta의 디지털 광고 구매자 담당인 Stetson은 Meta가 제품 출시 전에 아동 안전 문제를 소홀히 했다는 주장을 반박했습니다. "우리는 Reels를 널리 출시하기 전에 강력한 안전 통제 및 조치를 갖추고 거의 1년간 테스트했습니다."라고 그녀는 말했습니다.

Video-sharing platforms appeal to social media companies because videos tend to hold user attention longer than text or still photos do, making them attractive for advertisers.
비디오 공유 플랫폼은 소셜 미디어 회사에게 매력적입니다. 비디오는 텍스트나 정지 사진보다 사용자의 관심을 더 오래 유지하는 경향이 있어 광고주에게 매력적이기 때문입니다.

 

After initially struggling to maximize the revenue potential of its Reels product, Meta has improved how its algorithms recommend content and personalize video streams for users.
처음에는 Reels 제품의 수익 잠재력을 극대화하기 위해 고군분투한 후 Meta는 알고리즘이 콘텐츠를 추천하고 사용자를 위해 비디오 스트림을 개인화하는 방식을 개선했습니다.

Social media platforms and digital advertising agencies often describe inappropriate ad placements as unfortunate mistakes. But the test accounts run by the Journal and the Canadian Centre for Child Protection suggest Meta’s platforms appeared to target some digital marketing at users interested in sex.
소셜 미디어 플랫폼과 디지털 광고 대행사들은 부적절한 광고 게재를 흔히 불운한 실수로 설명합니다. 그러나 저널과 캐나다 아동보호센터가 운영한 테스트 계정들은 Meta의 플랫폼이 섹스에 관심 있는 사용자들을 겨냥해 일부 디지털 마케팅을 노출하는 것으로 보인다는 점을 시사합니다.

Among the ads that appeared regularly in the Journal’s test accounts were those for “dating” apps and livestreaming platforms featuring adult nudity, massage parlors offering “happy endings” and artificial intelligence chatbots built for cybersex. Meta’s rules are supposed to prohibit such ads.
저널의 테스트 계정에 정기적으로 게재되는 광고 중에는 성인 누드를 특징으로 하는 "데이트" 앱과 라이브 스트리밍 플랫폼, "해피 엔딩"을 제공하는 마사지 팔러 및 사이버 섹스를 위해 제작된 인공 지능 챗봇에 대한 광고가 있었습니다. Meta의 규칙은 그러한 광고를 금지하도록 되어 있습니다.

The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere.
저널은 8월에 Meta에 테스트 결과를 알렸습니다. 그 이후 몇 달간 저널과 캐나다 아동보호센터의 테스트에 따르면, 플랫폼은 어린 아이들이 등장하는 비디오, 성인용 콘텐츠, 그리고 다른 곳에 호스팅된 아동 성 착취물에 대한 홍보로 보이는 것들을 연이어 계속 제공한 것으로 나타났습니다.

As of mid-November, the center said Instagram is continuing to steadily recommend what the nonprofit described as “adults and children doing sexual posing.”
센터는 11월 중순 현재 인스타그램이 비영리단체에서 설명하는 '성적인 포즈를 취하는 성인과 어린이'를 꾸준히 추천하고 있다고 밝혔습니다.

After the Journal began contacting advertisers about the placements, and those companies raised questions, Meta told them it was investigating the matter and would pay for brand safety auditing services to determine how often a company’s ads appear beside content it considers unacceptable.
저널이 배치에 대해 광고주들에게 연락하기 시작하고 해당 회사들이 질문을 제기한 후 Meta는 문제를 조사하고 있으며 회사의 광고가 허용되지 않는 것으로 간주되는 콘텐츠 옆에 얼마나 자주 표시되는지 확인하기 위해 브랜드 안전 감사 서비스 비용을 지불할 것이라고 말했습니다.

Meta hasn’t offered a timetable for resolving the problem or explained how in the future it would restrict the promotion of inappropriate content featuring children.
Meta는 문제 해결을 위한 시간표를 제공하지 않았으며 향후 어린이가 등장하는 부적절한 콘텐츠 홍보를 어떻게 제한할지 설명하지 않았습니다.

The Journal’s test accounts found that the problem even affected Meta-related brands. Ads for the company’s WhatsApp encrypted chat service and Meta’s Ray-Ban Stories glasses appeared next to adult pornography. An ad for Lean In Girls, the young women’s empowerment nonprofit run by former Meta Chief Operating Officer Sheryl Sandberg, ran directly before a promotion for an adult sex-content creator who often appears in schoolgirl attire. Sandberg declined to comment.
저널의 테스트 계정에서는 이 문제가 메타 관련 브랜드에도 영향을 미치는 것으로 나타났습니다. 회사의 암호화 채팅 서비스 WhatsApp과 Meta의 Ray-Ban Stories 안경 광고가 성인 음란물 옆에 나타났습니다. 전 Meta 최고운영책임자(COO) 셰릴 샌드버그(Sheryl Sandberg)가 운영하는 젊은 여성 역량 강화 비영리 단체 Lean In Girls의 광고는 여학생 복장으로 자주 등장하는 성인 섹스 콘텐츠 제작자의 홍보물 바로 앞에 게재되었습니다. 샌드버그는 논평을 거부했습니다.

Through its own tests, the Canadian Centre for Child Protection concluded that Instagram was regularly serving videos and pictures of clothed children who also appear in the National Center for Missing and Exploited Children’s digital database of images and videos confirmed to be child abuse sexual material. The group said child abusers often use the images of the girls to advertise illegal content for sale in dark-web forums.
캐나다 아동보호센터는 자체 테스트를 통해, 인스타그램이 국립실종착취아동센터(National Center for Missing and Exploited Children)의 데이터베이스에서 아동 성 학대 자료로 확인된 이미지 및 동영상에 등장하는, 옷을 입은 아이들의 동영상과 사진을 정기적으로 제공하고 있다는 결론을 내렸습니다. 이 단체는 아동 학대자들이 다크웹 포럼에서 판매되는 불법 콘텐츠를 광고하기 위해 이 소녀들의 이미지를 자주 사용한다고 말했습니다.

The nature of the content—sexualizing children without generally showing nudity—reflects the way that social media has changed online child sexual abuse, said Lianna McDonald, executive director for the Canadian center. The group has raised concerns about the ability of Meta’s algorithms to essentially recruit new members of online communities devoted to child sexual abuse, where links to illicit content in more private forums proliferate.
캐나다 센터의 전무이사인 리아나 맥도널드(Lianna McDonald)는 대체로 노출 없이 아동을 성적으로 묘사하는 이 콘텐츠의 성격이 소셜 미디어가 온라인 아동 성 학대를 변화시킨 방식을 반영한다고 말했습니다. 이 단체는 Meta의 알고리즘이 사실상 아동 성 학대에 전념하는 온라인 커뮤니티의 신규 회원을 모집하는 역할을 한다는 점에 우려를 제기해 왔으며, 이러한 커뮤니티에서는 더 사적인 포럼의 불법 콘텐츠로 연결되는 링크가 확산됩니다.

“Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” McDonald said, calling it disturbing that ads from major companies were subsidizing that process. —Robbie Whelan contributed to this article.
맥도널드(McDonald)는 "추천 알고리즘이 사용자가 이러한 온라인 아동 착취 커뮤니티를 발견하고 그 안으로 빠져들도록 이끄는 것을 우리는 거듭 목격해 왔습니다."라고 말하며, 주요 기업의 광고가 그 과정에 보조금을 대고 있다는 사실이 충격적이라고 덧붙였습니다. —이 기사에는 Robbie Whelan이 기여했습니다.