Hadley says his team of 20 typically knows about new terrorist content before any of the big platforms do. While tracking verified content from Hamas' military wing or the PIJ, Hadley says the volume of that content on the major social platforms is "very low." "Examples that we can find on large platforms of that official content are in the thousands, so not a lot of examples, compared with significantly more on Telegram," Hadley says. "We are sharing alerts to 60 platforms who have been implicated with [Hamas] and PIJ content to date, but very low volumes are present."

Hamas is banned on Facebook, Instagram, YouTube, Reddit, and TikTok, so it has typically not used these mainstream platforms to any great extent, though X said earlier this month that it had removed hundreds of Hamas-affiliated accounts. "Because the use of Telegram is so widespread and simple, we will assume they will continue to do this because terrorists prioritize platforms they can use most easily," Hadley says. "And why would they bother with anything other than Telegram?"

Meta, X, and YouTube collaborate to share information about what they are seeing on their platforms. While it has blocked some channels in certain jurisdictions, Telegram has typically taken a hands-off approach to content moderation, meaning Hamas and related groups have been largely free to post whatever content they like there. Telegram did not respond to WIRED's questions about how it would deal with execution videos on its platform, nor did X or Twitch. Discord declined to provide an on-the-record comment.

While Hadley sees Telegram as the primary concern, mainstream platforms still risk facilitating the spread of this content. Videos can easily be downloaded from Telegram and reposted on any other platform, and it is there that the mainstream social networks face the challenge of stopping them. When asked for more details about their efforts, Meta and TikTok directed WIRED to policy pages outlining the work they are doing to prevent the spread of these videos. YouTube spokesperson Jack Malon tells WIRED, "Our teams are working around the clock to monitor for harmful footage across languages and locales." A Reddit spokesperson says the company has "dedicated teams with linguistic and subject-matter expertise actively monitoring the conflict and continuing to enforce Reddit's Content Policy across the site."