#tumblr removed from apple app store over abuse images
rahleeyah · 3 years
Text
So let's take a step back. Far right extremists latch on to "democrats are child sex traffickers/murderers" as a cause du jour (pizzagate was 2016, can you believe?). A very real problem - sex trafficking - becomes a dog whistle for neo nazis and q-anoners. It bleeds into the national conversation. Websites like Backpage get targeted in the name of protecting kids, which maybe it does, while also hammering sex workers. Sex workers start looking for other places to build businesses.

Apple, with a chokehold on the mobile app market, goes on a puritanical spree to eliminate sexy content from its apps. To protect the kids, of course. Tumblr gets removed from the app store bc there was a thriving sex-interested community here, and scrambles to remove anything even vaguely sexy, including female-presenting nipples and several pictures of my pet rabbit. Sex workers - and just! People who like sex in general! - go in search of somewhere else to make a living and create content. Again. They go to onlyfans, turn it from nothing into a billion dollar company, but onlyfans "can't attract investors" with all that sexy content, so they decide to kick the sex workers out. Again.

You know what happened to Tumblr when they banned sexual images (or tried to, I promise you there are naked people on my dash at this very moment)? Usage of the site plummeted, along with its valuation, and left them desperately scrambling to find some way to make enough money to keep the lights on. What is gonna happen to onlyfans without sexual content? How many people use it for non-sexual reasons?
Now. I do happen to believe it almost impossible to consume traditional porn ethically; it is almost impossible to know if the actors were plied with drugs or forced into situations they didn't initially consent to or were otherwise coerced or abused and the business is full of those stories. But onlyfans? By and large most of those creators are in business for themselves, and in charge of their creative directions. It provides them with agency. It provides them with a means to control their own income, on their terms.
But we can't have that, can we? We can strip a woman down and force her into uncomfortable, revealing clothes and positions for big budget movies, but we can't have her deciding to do that for herself, can we? We can make art out of her trauma but we hem and haw when she demands respect. Companies can make millions of dollars on their backs and then throw these women away like they're nothing. Corporations, CEOs in big cushy offices raking cash in hand over fist can shit all over the women who made them rich; it is inherently classist. And misogynist; no, not all sex workers are women, not by a mile, but that is the way our society understands this business to operate. The women do the work, the men pay - or get paid - and the women are punished.
The rise of this neo-puritanism (which has also infected the left, make no mistake about it) hurts women, hurts sex workers, and gives young people no option to turn to for exploring the concept of sex but the traditional porn industry or mainstream media, and god the damage that does is quantifiable. There is data on this, on the rising trend of sex related injuries that boil down to "well I saw it in a porno and they looked like they were having fun", bc the porno didn't show the prep work, or didn't care that the acts it was showing were violent and abusive.
We need a public discourse about sex. We need access to different presentations of what sex is, or can be. We need protections for sex workers, instead of continually forcing them into smaller and smaller corners, where the risk gets greater and greater. Sex is not evil! Sex is not bad! Sex can coexist on a platform with other kinds of content without eroding the moral purity or whatever the fuck of that platform. No one was forcing Tumblr users to look at porn (well except for whoever runs the porn bots, which are very much still here) and if a tumblr user doesn't wanna see it then they can take some fucking responsibility and use the tools that are available to shield themselves from it.
Same fucking thing with onlyfans. It's a subscription based website; no one is encountering "sexual nudity" by mistake. But paternal, puritanical, pearl clutching corporations wanna dictate the content we can and can't consume.
This is censorship, plain and simple, and it is and always has been a slippery slope. Every inch of ground given is a loss of freedom. Corporations are taking charge of what we see, using algorithms (make sure you have "Best Stuff First" turned off on your Tumblr) to curate our experiences, steering us in whatever direction will make them the most money. I am thinking now about that exposé about how a reporter could track a right winger's descent into q anon madness through his Facebook likes. Each click dragged him deeper and deeper into that hole until there was no way out.
Do I use onlyfans? No! But this continuing trend of throttling sex-based content is alarming to me, and I don't know how we turn back the clock but holy shit. Bodies are not shameful, and sex is not repugnant, and the more we act like it is the more damage we do to our national psyche.
18 notes
justforsmiles · 5 years
Text
I’m quitting Tumblr
I’m making the decision to leave Tumblr. @staff @support
Having been on Tumblr since 2010, I never really thought about when this day would come, but now more than ever, I don’t see why not. 
It has been a great journey, I won’t forget that. From writing little poems to writing more about my personal experiences to reblogging Cyanide and Happiness comics, to memes, dogs,  motivational quotes, inspirational images and building a positive community of my own, it felt so empowering and gave me something to look forward to during my high school and my college years. 
It was addictive. It felt rewarding when I was featured on Tumblr's "radar", a hand-picked showcase of creative, interesting, and awesome posts. It was my go-to mood booster. I read up on news, learned new terms, saved a ton of hilarious, fun gifs (back when Tumblr did not have a button to easily look up gifs and what not), became heavily invested in social justice issues, posted my ideas, learnings, what I stand for, and had people thank me and my blog for existing.
I connected with other people from all over the world. I met up with two of my Tumblr friends in person. I could share my thoughts, genuine interests, and have it liked and reblogged by thousands? It felt like a platform I was never ever going to see myself exit from. 
Until the start of this year. I don’t feel safe on Tumblr.
On December 17, 2018, Tumblr's ban on adult content went into effect (after Tumblr's official app was removed from Apple's App Store over the discovery of child pornography on the site). That meant that all pictures, GIFs, and videos that feature erotic content would be removed from the site.
A reblogged photo from back in 2011 of a newborn's feet was flagged, but when a porn blog (blacksinasian) decides to share my (fully clothed) self-portraits, write crude things on them, and impersonate me in writing those things to lead people to my Tumblr and social media - that is not okay.
I had to find out through Instagram DMs, when some people let me know that they had seen me on Tumblr, asked if I had a Tumblr, and asked whether I knew that my pictures were being circulated there.
This is screwed up on so many levels. I never agreed to allow my content to be used in this way. The blog decided to repeatedly take my content and twist it in their own nasty words. 
This is unwanted harassment and a violation of community guidelines, but Tumblr decides to do ABSOLUTELY NOTHING.
I sent in a support request on January 21, 2019. I received NO reply whatsoever and followed up on February 5, 2019, to which I received a reply on February 11, 2019 with the following:
Hello,
Thank you for writing in. It sounds like you may be reporting a violation of our Community Guidelines instead of a technical issue on the site. In order to more quickly and efficiently handle this, we ask that you use our online Abuse forms located here:
https://www.tumblr.com/abuse
To report a post in the mobile apps, just click the share icon (that paper airplane) and choose "Report." That'll open the form and you can tell us what you're reporting from there. To report an entire blog, tap the blog's username to view their blog, then tap the little human icon, and then tap "Report."
Please select the form that most closely corresponds to the violation you’re reporting, which will help us correctly route your complaint.
We appreciate you taking the time to write in.
Jay,
Tumblr Community Support
_
That was a slap in the face. It was a response that was like, oh, seems like you sent in your request to the wrong department, here, try again, I won’t bother reading your request and helping you to route it to the right place and resolve it ASAP. That was terrible customer service for someone who has been a loyal user since 2010. Thanks a lot, Justin. Or as you called yourself in the email, Jay. 
After reporting abuse and writing the description of the abuse AGAIN, I received an automatic message from Tumblr Trust & Safety with the below message:
Hello,
Thanks for bringing this to our attention. We're checking out the content and will determine an action appropriate to our policies and procedures.
In the meantime, we do strongly suggest you block this user. If you need help doing so, have a look at the docs (
https://www.tumblr.com/docs/en/social#blockactions
). Blocked users can't follow you, can’t see your posts on their dashboard, reblog your posts, like your posts, or do anything else with your posts. You also won't get Asks or Fan Mail from users you've blocked, and you won’t appear in their search results.
Keep in mind that we don’t notify bloggers that you’ve blocked them—although they may realize it if they try to reblog one of your posts, say, and are prevented from doing so.
Tumblr Trust & Safety
Tumblr.com/abuse
_
Gee, thank you so darn much. Can you tell me something actually valuable in fewer words? The fact that Tumblr could not give less of a crap about this shows just how hopeless Tumblr is. Take a hard look at your own Community Guidelines and tell me that you've done everything you can to uphold all of that. The blacksinasian blog clearly violated some of these guidelines and they're still out there, posting away. No suspended account, no consequences. And I'm left here spending countless amounts of time and effort writing something that Tumblr will most likely ignore for the umpteenth time.
Yeah sure, thank you for being the platform from which I got to experience laughs, meaningful conversations, humor, and encouragement. That can be found elsewhere. This just isn’t worth it. 
Goodbye. 
To all my wonderful mutuals and friends old or new on here, you can find me on Instagram.
467 notes
bethanygamemaster · 6 years
Text
Found this online
The app was delisted from the store last week without a clear explanation, with Tumblr at the time saying they were "working to resolve an issue with the iOS app and hope to be fully functional again soon".
Similar updates were posted over the next two days, but this week a statement on their website seemed to acknowledge the App Store delisting was not a glitch; rather, it was a move to try to stop the spread of exploitative content.
"We're committed to helping build a safe online environment for all users, and we have a zero-tolerance policy when it comes to media featuring child sexual exploitation and abuse," Tumblr's statement read.
PHOTO: A search screen for Tumblr showing no results, alongside a search for Facebook showing 20 results. Any trace of Tumblr seems to have been scrubbed from Apple's store. (ABC News)
"As this is an industry-wide problem, we work collaboratively with our industry peers and partners like NCMEC [National Centre for Missing and Exploited Children] to actively monitor content uploaded to the platform."
The statement said Tumblr cross-checked every image uploaded with a database of known child sexual abuse material, which stops those images from ever reaching the platform.
But it also admitted there was some content live on Tumblr that had not been blocked by that process.
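That scanning process is, at its core, hash matching: each upload is reduced to a fingerprint and checked against a shared list of fingerprints of known material. The sketch below is a minimal, purely illustrative Python version; the empty hash set stands in for the shared industry database, and real systems use perceptual hashes such as PhotoDNA (which still match after resizing or re-encoding) rather than the exact-match cryptographic hash shown here.

```python
import hashlib

# Stand-in for the shared industry database of known-material fingerprints.
# In practice this is populated through NCMEC/industry hash-sharing programs;
# here it is an empty set so the example stays self-contained.
KNOWN_HASHES: set[str] = set()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches known material and should be blocked."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in KNOWN_HASHES

# Anything absent from the database - like the content Tumblr says its routine
# audit later found - passes this check untouched, which is the gap described above.
```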
PHOTO: Tumblr being downloaded, with a download progress bar visible. Tumblr is still available for Android users on Google Play. (ABC News)
Tumblr said that content was "immediately removed", but gave no timeline for when the app would be relisted on the App Store.
"Content safeguards are a challenging aspect of operating scaled platforms," the statement read.
"We're continuously assessing further steps we can take to improve and there is no higher priority for our team."
Tumblr terminating questionable accounts
The service also appears to have started shutting down whole accounts for this sort of content.
One user who routinely posts pornographic cartoons was furious to discover his account had been taken down because it violated the policies around "inappropriate content involving or depicting minors".
"The termination is final and replies to this message are not reviewed," the Tumblr Trust and Safety message read.
"Guess nothing I have to say about this matters," tweeted the user, whose bio says he posts "very NSFW" (not safe for work) art.
Any references to the service appear to have been completely scrubbed from the App Store, with the store returning zero results when Tumblr is searched for.
As of now, it appears Android users are still able to download the microblogging app through Google Play.
5 notes
timboallthetime · 5 years
Text
From Wired.com author Paris Martineau
TUMBLR WAS NEVER explicitly a space for porn, but, like most things on the internet, it is chock full of it anyway. Or at least it was. On Monday, to the shock of the millions of users who had used the microblogging site to consume and share porn GIFs, images, and videos, Tumblr banned the “adult content” that its former CEO, David Karp, had defended five years prior. In the hours after the announcement, sex workers panicked, users threatened to leave, and—in classic Tumblr fashion—online petitions calling for change gained hundreds of thousands of signatures. But Tumblr’s porn ban isn’t about porn or Tumblr at all, really. It’s about the companies and institutions who wield influence over what does and doesn’t appear online.
When Melissa Drew, an adult content creator and model, logged in to Tumblr Monday afternoon, she was greeted by a deluge of unfamiliar posts and notifications. Her usual feed, perfectly curated after nearly a decade of tinkering, was awash with panicked posts from fellow adult models, memes about the policy change, and goodbye posts. Drew’s personal blog, which she had relied on as a public-facing way to tease the content available to her Patreon subscribers, was lit up with notifications from Tumblr informing her that most of her posts violated the new rules.
When Yahoo bought Tumblr for $1.1 billion in 2013, critics warned that premium advertisers wouldn't exactly be clamoring to run ads in a sea of porn. Yahoo CEO Marissa Mayer disagreed, arguing that targeting tools would keep content that isn't "brand-safe" (read: porn) away from ads. In an interview shortly after the acquisition, Karp doubled down on the platform’s openness to porn. “When you have … any number of very talented photographers posting tasteful photography,” said Karp, “I don't want to have to go in there to draw the line between this photo and the behind the scenes photo of Lady Gaga and like, her nip."
But the nip line has indeed been drawn, and it’s a doozy. Though Tumblr’s Monday announcement had technically only prohibited depictions of sex acts, “human genitalia,” and “female-presenting nipples,” a much wider swath of Drew’s feed was quickly caught up in the ban. “I had everything from nude, censored nudity, and lingerie photos flagged,” she told WIRED. “I still haven’t dealt with removing all of them yet—I just sort of heavy sighed and closed the tab.”
Safe Space
In interviews and messages with WIRED, more than 30 sex workers, porn consumers, and creators on Tumblr lamented the loss of what they described as a unique, safe space for curated sexually themed GIFs, photos, and videos. Many users who had used the microblogging site as their primary source for porn were at a loss when asked where they would go after Tumblr’s ban on “adult content” goes into effect on December 17. For the thousands of sex workers who used the site to share their own explicit content in a controlled, relatively contained manner—not to mention the countless others who used that content to fill the hyper-curated feeds of some of the site’s most popular porn blogs—the crackdown’s consequences are even more difficult to unpack. And researchers say the ban could shrink Tumblr’s user base, which already appears in turmoil over the decision.
The move comes less than two weeks after Apple pulled Tumblr from the iOS App Store after child pornography was found on the site. Though the offending illegal content was removed quickly, according to Tumblr, the app has yet to return to the App Store (it was never removed from the Google Play Store). In its most recent blog post, Tumblr stated that its longstanding no-tolerance policy against child pornography should not be conflated with the move to ban adult content. The latter, Tumblr argued, was inspired by a drive to create “a better Tumblr.” But these sorts of decisions aren’t made in a vacuum.
In March, Congress passed the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act (or FOSTA-SESTA for short). Lawmakers hailed the law as a means to give prosecutors more tools to combat sex trafficking. But the statute also tinkered with a bedrock provision of internet law, opening the door for platforms to be held criminally and civilly liable for the actions of their users. The law’s passage immediately led to the closure of several sex-work-related online venues, such as Craigslist’s personals section, numerous subreddits, and Patreon’s support for adult creators.
In interviews, more than a dozen sex workers told WIRED that the support and openness of adult-content creators on Tumblr had attracted them to the site. They described the site as notably more empowering and friendly than more traditional venues for explicit content, like PornHub, and relished Tumblr’s freedom and opportunity for virality. Many used the site to promote their paid content on other sites, interact with fellow sex workers, and screen clients.
Liara Roux, a sex worker and online political organizer, told WIRED via email that the options for finding adult content online are diminishing, and consolidating with big companies “and away from community generated content and independent creator friendly platforms." This, Roux says, is dangerous: "As mainstream sites slowly remove sexual content, which often is how queer and other marginalized communities are able to connect, it will become difficult for both sex workers and these communities to have an online space to exist."
App Store Sway
The ubiquity of the App Store gives Apple considerable influence over online content. Historically, Apple has taken a sex-negative approach to policing, with former CEO Steve Jobs once famously stating that “folks who want porn can buy an Android phone,” but the company’s influence extends beyond the iOS sphere. In the days after it was delisted from the App Store, Tumblr ramped up its moderation efforts, erroneously deleting numerous SFW and NSFW Tumblr accounts unrelated to child exploitation and abuse, and pushed out an update for its Android app that forced all users to have “safe mode” toggled on, restricting access to explicit content. Then came the porn ban.
Apple’s App Store review guidelines prohibit apps "with user-generated content or services that end up being used primarily for pornographic content.” The key word here is “primarily,” as all popular social media platforms are rife with porn, but Apple mostly seems to care about how easily accessible it is. Case in point: Apple pulled a number of Reddit apps from the App Store in 2016, citing issues with an NSFW content toggle, which, when activated, could be used to view pornographic content. The iOS Reddit apps currently available on the App Store have built-in features that prohibit porn subreddits from appearing in search, and make it extremely difficult for users to find NSFW content in-app without labor-intensive workarounds.
Tumblr likely could have instituted a similar content-censorship-based workaround, but it didn’t. Instead, it has gone with what appears to be an all-out-ban of “photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content—including photos, videos, GIFs and illustrations—that depicts sex acts.” Photos of nipples that appear to belong to a female-identifying person—which may be a difficult category to define, judging by the attempts of other platforms, like Instagram, to do the same—may be permitted so long as they are shared as part of a non-sexual context like a post showcasing breastfeeding, pre- or post-birth, post-mastectomy, or gender confirmation surgery. Written erotic content, along with nudity in art—“such as sculptures and illustrations,” says Tumblr—are alright, too. To draw the distinctions, the company says it will use a mix of machine learning and human moderation, and that all appeals for posts erroneously flagged as adult will be reviewed by “real, live human eye(s).”
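Tumblr has not published how that mix of machine learning and human moderation is wired together, but such pipelines generally amount to a classifier score plus thresholds that decide whether a post is auto-flagged, queued for a human, or left alone, with appeals routed back to human reviewers. The sketch below is only an assumption of that shape (every name and threshold is hypothetical, not Tumblr's actual system), and the misfires described next are what happen when the auto-flag threshold catches things it shouldn't.

```python
from dataclasses import dataclass, field

# Hypothetical thresholds for illustration only; Tumblr has not disclosed
# how its real classifier or review pipeline is configured.
AUTO_FLAG_THRESHOLD = 0.8
HUMAN_REVIEW_THRESHOLD = 0.5

@dataclass
class ModerationQueues:
    flagged: list = field(default_factory=list)       # auto-flagged; appeal goes to a human
    human_review: list = field(default_factory=list)  # borderline; a moderator decides

def route_post(post_id: str, adult_score: float, queues: ModerationQueues) -> str:
    """Route a post using a hypothetical classifier score between 0 and 1."""
    if adult_score >= AUTO_FLAG_THRESHOLD:
        queues.flagged.append(post_id)
        return "flagged"
    if adult_score >= HUMAN_REVIEW_THRESHOLD:
        queues.human_review.append(post_id)
        return "needs human review"
    return "allowed"

# Example: a classical painting that happens to score 0.85 gets auto-flagged,
# which is exactly the kind of false positive users were reporting.
queues = ModerationQueues()
print(route_post("post-123", 0.85, queues))  # -> "flagged"
```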
Tumblr users are already finding problems with the flagging system. Classical paintings of Jesus Christ were flagged, as were photos and GIFs of fully clothed people, patents for footwear, line drawings of landscape scenes, discussions about LGBTQ+ issues and more. “In its early stages, Tumblr is using a poor system for flagging content,” says Abigail Oakley, a researcher at Arizona State University with a focus on Tumblr communities. “Having a certain number of flags on your blog (regardless of their validity) also removes the blog from Google searches, which is another form of censorship.”
Casey Fiesler, an assistant professor at the University of Colorado who specializes in fandom culture on platforms like Tumblr, thinks Tumblr’s crackdown will likely lead to a mass exodus of users. Fiesler also worries that the LGBTQ people and sexual assault survivors who use Tumblr as a space for support could inadvertently be affected by the ban. “Even for fandom participants who don't create adult content themselves, this kind of policy feels like an attack on the community,” she said.
Consumers and creators of porn on Tumblr aren't entirely sure where they'll go next. While many sex workers mentioned Reddit and Twitter as two popular alternatives, they lamented the platforms' lack of community and sex-positive culture. Similarly, more than a dozen people who said they had used Tumblr as their primary source of porn for years praised the site's social, judgment-free culture, which many cited as helping them understand their sexual orientation. "The technology is out there to allow users to continue browsing a variety of porn from independent producers, but it will require a shift back in how people think of the internet," said Roux, the sex worker and political organizer. "I'm hoping these kinds of controversies encourage content creators to take as much control over their distribution channels as possible and inspires tech companies to solve the issue of bringing content to users without hosting the content themselves."
1 note
saizoswifey · 7 years
Note
Not the asshole anon, but what can be shared within the rules of Tenka? Or what can't you do?
Hey nonnie! I will link the direct post from Voltage HERE. But basically this:
“So, as long as they are not for profit, we do allow the following in regard to Samurai Love Ballad: PARTY:
★Edited/modified portrait images. For example, Tumblr or blog layouts for personal/fandom use that are distributed in the fandom on a not-for-profit basis which use the portraits as a base, but without using entire portrait images, image-based memes and captions (written directly on the image and unable to be modified), gifsets, icons…we are trusting you to use your personal judgement in this arena. However, please do understand that we reserve the right to do the same, and if it is discovered that this allowance is being abused, we may revoke the right to share even edited portraits on Tumblr.
If you do decide to make an edit of our images in this way, we would greatly appreciate credit for Samurai Love Ballad: PARTY in the post that it is made. Ideally, we would like links to the app download from the Apple App Store and/or Google Play on posts which contain modified portrait images, or at the very least a link to us here at @voltageparty  (though this is not mandatory, just a courtesy to us).
★In-Game Screencaptures (including Story Sections)
We know that one of the most fun parts about having a Tumblr about our games is to share parts of the story you found interesting, exciting or moving–and we do not wish to stop you! You are free to share screencaptures from the story as you wish, as long as spoilers are tagged appropriately and full story sections are not shared (for example, the entirety of an ending, Main Story, Event Story, or Tea Garden story).
Of course, we would appreciate game credits using the links above on these posts as well, but it is not mandatory!
You are absolutely free (and encouraged)! to share in this way, whether it be snippets of story or showing off your in-game character/Fashion Items/Castle/Home Screen!
We love to see how our users are enjoying Samurai Love Ballad: PARTY, which characters and features are favorites, and see that our work is being enjoyed and discussed by our fans!
We hope that this answers all of your questions, and please enjoy Samurai Love Ballad: PARTY!”
-------------
I 100% respect Voltage as a company and I never EVER want to take away from the amazing and hard work they do to bring us these stories. Seriously. Let me get that straight. 
We all play otome games for our own reasons, but these stories seriously help me a lot in my own life as well. I make sure I know the rules, I respect their wishes, and I would hate it if they killed this game. It’s my favorite and I have spent....god, I don’t honestly want to know how many hundreds of dollars playing. 
I could do better, like adding links to my posts, but honestly I make sure that I don’t ever post full stories or chapters, I think I’ve only posted one CG...and I made sure it was hecka cropped. I’m not a jerk, I am not careless about this. I honestly just want to fangirl with fellow players...plain and simple. I think some people have the mindset that just because it’s not available in the US yet that no one should post anything about Tenka...when in fact a good portion play it and just as with any route or ES or battle, there are freak out posts following. Hell, Nobunaga’s second act was all over the place when it released and I hadn’t even gotten halfway through LOL.
It’s just upsetting when I make sure I adhere to rules and someone who disagrees comes at me on anon instead of just being an adult and DMing me. I can handle it. We all love this fucking game, we all don’t want it to be ruined. I respect that. There have been times I get upset seeing videos of gameplay etc. but if I have an issue I go direct with someone. If you have an issue with what I am posting, come to me face to face and tell me “hey dude, this isn’t cool,” and we can hash it out. I’m not an evil person; I am willing to hear anyone out even if they have a differing opinion. I won’t hate you, I won’t lash out at you for confronting me and telling me upfront.
I am not a perfect person, either. I may think I am adhering to rules and you prove me wrong. I make mistakes and I can own up to them so...dang, if I posted something against the rules this hurts me just as much as it does you, so TELL ME and I can REMOVE IT. Don’t just get petty and report me then anon me to let me know....
18 notes
The reason Tumblr vanished from the App Store: Child pornography that slipped through the filters
In November 2018 Tumblr's app mysteriously disappeared from Apple's App Store
This was due to Indecent Images of Children (IIOC) slipping through their filters. Computer Forensics & Mobile Phone Forensics are often used in Police & Defence Cases to investigate any incident involving Indecent Images of Children (IIOC).

Those of you who were looking to download the Tumblr app on your iPhone or iPad were unable to get it in November 2018. But the app's vanishing act isn't the result of a technical issue or glitch. Through independent sources, Download.com learned that the app was removed due to child pornography that got past the site's filters. When Download.com presented these findings to Tumblr, a company spokesperson responded with the following statement:

"We're committed to helping build a safe online environment for all users, and we have a zero tolerance policy when it comes to media featuring child sexual exploitation and abuse. As this is an industry-wide problem, we work collaboratively with our industry peers and partners like (NCMEC) to actively monitor content uploaded to the platform. Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform. A routine audit discovered content on our platform that had not yet been included in the industry database. We immediately removed this content. Content safeguards are a challenging aspect of operating scaled platforms. We're continuously assessing further steps we can take to improve and there is no higher priority for our team."

This confirmation helps explain why the Tumblr app was removed so suddenly and why there was little explanation from Tumblr or from Apple, since child pornography is a matter that needs to be coordinated with law enforcement. In the following statement sent to Download.com, a spokesperson for the NCMEC explained the role that the organization serves in fighting online child pornography and exploitation:

"The National Center for Missing & Exploited Children operates the CyberTipline, which serves as the nation's centralized reporting system for online child sexual exploitation. Members of the public and Electronic Service Providers (ESPs) report instances of online child sexual exploitation to the CyberTipline. Last year, NCMEC received more than 10 million reports to its CyberTipline with the vast majority of those reports submitted by ESPs. NCMEC recognizes that global efforts to reduce the proliferation of online child sexual exploitation requires an industry-wide effort and applauds all ESPs that engage in voluntary efforts to provide content safeguards for their users. In addition to receiving reports from ESPs, any member of the public who comes across suspected child abuse imagery is encouraged to make a CyberTipline report to NCMEC."

In a follow-up statement, the NCMEC spokesperson also explained the specific steps that Electronic Service Providers such as Tumblr take to try to block child pornography:

"NCMEC offers several voluntary initiatives to ESPs who choose to take extra steps to reduce the distribution of online sexual abuse material on their systems. One of NCMEC's initiatives involves facilitating the sharing of hashes of apparent child pornography images among ESPs. Some ESPs and social networks will participate in this initiative to reduce child sexual exploitation images online and some ESPs will rely on other programs and methods to remove such illegal content from their servers."
Tumblr's disappearance was first spotted by PiunikaWeb, which reported on November 16 that users with the iOS parental control features enabled were unable to find the app. Shortly after that, the app vanished completely from the App Store. Tumblr's help center site noted the disappearance in a statement: "We're working to resolve an issue with the iOS app and hope to be fully functional again soon. We really appreciate your patience as we figure this out, and we'll update this article when we have news to share," the company said.

Through November 18, the company's message on its help center was that its team was still working on the issue with the app. After this article was first published on November 19, Tumblr updated it to include the statement above that its spokesperson presented to Download.com. Initially, neither Tumblr nor Apple publicly revealed the nature of the "issue." But speculation from a marketing professional on Twitter and 9to5Mac had suggested the app was removed due to inappropriate content in violation of Apple's App Store guidelines.

This isn't the first time Tumblr has run into this type of problem. In March 2018, the Indonesian government briefly blocked Tumblr over the company's failure to remove pornographic content from its service. In 2017, South Korea asked Tumblr to take down certain pornographic content. The company initially rejected that request but eventually promised to better monitor the spread of adult content.

Download.com contacted Apple and will add to this story with any updates. For now, iOS users who previously downloaded Tumblr may be able to get it again by checking the purchase history on their device. iPhone and iPad owners who want to use Tumblr for the first time will have to wait until the app is reinstated. The app is still available at Google Play for Android users.
Takeaways
The Tumblr app was removed from the Apple App Store because of child pornography that slipped through the company's content filters. Tumblr has removed the offending content and is working with Apple to get the app reinstated. In the meantime, the app remains unavailable to download from the App Store. Computer Forensics & Mobile Phone Forensics are often used in Police & Defence Cases to Investigate any incident involving Indecent Images of Children (IIOC).
https://download.cnet.com/news/the-reason-tumblr-vanished-from-the-app-store-child-pornography-that-slipped-through-the-filters/
0 notes
spicynbachili1 · 6 years
Text
Tumblr removed from Apple app store over abuse images
Picture copyright: Getty Images
Picture caption: Tumblr said it discovered child sexual abuse images during a routine audit
Tumblr has been removed from Apple's app store because it let some users post images of child sexual abuse.
The social network's app was removed on 16 November, but the reason for it being unavailable has only just come to light.
In a statement, it said the illegal images got through because its filters failed to spot them.
It said getting the app re-listed was a "priority" but gave no date for when it might be available.
Tumblr gave more details about why it was taken off Apple's store after being approached by tech news site CNET with a claim that indecent images of children were the cause.
In its statement, it said that all images uploaded to Tumblr pass through a database of "known child sexual abuse material". It added that any matches to the database are detected and deleted – and would never appear on its platform.
In this case, it said, the images being uploaded were not in the industry database, so its filters did not catch them. However, it said, the illegal content was discovered during a routine audit.
"We immediately removed this content," it said, adding: "Content safeguards are a challenging aspect of operating scaled platforms."
Tumblr has a reputation for allowing sexually themed material to be shared on its service. This led to it being banned for a day in Indonesia over the mature content. South Korea has also asked it to do a better job of moderating adult content on the service.
from SpicyNBAChili.com http://spicymoviechili.spicynbachili.com/tumblr-removed-from-apple-app-store-over-abuse-images/
0 notes
ishipfruk · 6 years
Text
Tumblr removed from Apple app store over abuse images
The app was removed from Apple's store after images showing child sexual abuse were discovered. from BBC News - Technology https://ift.tt/2OT12vi
0 notes
Link
Google has scrambled to remove third-party apps that led users to child porn sharing groups on WhatsApp in the wake of TechCrunch’s report about the problem last week. We contacted Google with the name of one of these apps and evidence that it and others offered links to WhatsApp groups for sharing child exploitation imagery. Following publication of our article, Google removed that app and at least five like it from the Google Play store. Several of these apps had over 100,000 downloads, and they’re still functional on devices that already downloaded them.
A screenshot from today of active child exploitation groups on WhatsApp. Phone numbers and photos redacted
WhatsApp failed to adequately police its platform, confirming to TechCrunch that it’s only moderated by its own 300 employees and not Facebook’s 20,000 dedicated security and moderation staffers. It’s clear that scalable and efficient artificial intelligence systems are not up to the task of protecting the 1.5 billion user WhatsApp community, and companies like Facebook must invest more in unscalable human investigators.
But now, new research provided exclusively to TechCrunch by anti-harassment algorithm startup AntiToxin shows that these removed apps that hosted links to child porn sharing rings on WhatsApp were supported with ads run by Google and Facebook’s ad networks. AntiToxin found 6 of these apps ran Google AdMob, 1 ran Google Firebase, 2 ran Facebook Audience Network, and 1 ran StartApp. These ad networks earned a cut of brands’ marketing spend while allowing the apps to monetize and sustain their operations by hosting ads for Amazon, Microsoft, Motorola, Sprint, Sprite, Western Union, Dyson, DJI, Gett, Yandex Music, Q Link Wireless, Tik Tok, and more.
The situation reveals that tech giants aren’t just failing to spot offensive content in their own apps, but also in third-party apps that host their ads and that earn them money. While these apps like “Group Links For Whats” by Lisa Studio let people discover benign links to WhatsApp groups for sharing legal content and discussing topics like business or sports, TechCrunch found they also hosted links with titles such as “child porn only no adv” and “child porn xvideos” that led to WhatsApp groups with names like “Children ” or “videos cp” — a known abbreviation for ‘child pornography’.
WhatsApp has an encrypted child porn problem
In a video provided by AntiToxin seen below, the app “Group Links For Whats by Lisa Studio” that ran Google AdMob is shown displaying an interstitial ad for Q Link Wireless before providing WhatsApp group search results for “child”. A group described as “Child nude FBI POLICE” is surfaced, and when the invite link is clicked, it opens within WhatsApp to a group called “Children ”.  (No illegal imagery is shown in this video or article. TechCrunch has omitted the end of the video that showed a URL for an illegal group and the phone numbers of its members.)
Another video shows the app “Group Link For whatsapp by Video Status Zone” that ran Google AdMob and Facebook Audience Network displaying a link to a WhatsApp group described as “only cp video”. When tapped, the app first surfaces an interstitial ad for Amazon Photos before revealing a button for opening the group within WhatsApp. These videos show how alarmingly easy it was for people to find illegal content sharing groups on WhatsApp, even without WhatsApp’s help.
Zero Tolerance Doesn’t Mean Zero Illegal Content
In response, a Google spokesperson tells me that these group discovery apps violated its content policies and it’s continuing to look for more like them to ban. When they’re identified and removed from Google Play, it also suspends their access to its ad networks. However, it refused to disclose how much money these apps earned and whether it would refund the advertisers. The company provided this statement:
“Google has a zero tolerance approach to child sexual abuse material and we’ve invested in technology, teams and partnerships with groups like the National Center for Missing and Exploited Children, to tackle this issue for more than two decades. If we identify an app promoting this kind of material that our systems haven’t already blocked, we report it to the relevant authorities and remove it from our platform. These policies apply to apps listed in the Play store as well as apps that use Google’s advertising services.”
App | Developer | Ad Network | Estimated Installs | Last Day Ranked
Unlimited Whats Groups Without Limit Group links | Jack Rehan | Google AdMob | 200,000 | 12/18/2018
Unlimited Group Links for Whatsapp | NirmalaAppzTech | Google AdMob | 127,000 | 12/18/2018
Group Invite For Whatsapp | Villainsbrain | Google Firebase | 126,000 | 12/18/2018
Public Group for WhatsApp | Bit-Build | Google AdMob, Facebook Audience Network | 86,000 | 12/18/2018
Group links for Whats – Find Friends for Whats | Lisa Studio | Google AdMob | 54,000 | 12/19/2018
Unlimited Group Links for Whatsapp 2019 | Natalie Pack | Google AdMob | 3,000 | 12/20/2018
Group Link For whatsapp | Video Status Zone | Google AdMob, Facebook Audience Network | 97,000 | 11/13/2018
Group Links For Whatsapp – Free Joining | Developers.pk | StartAppSDK | 29,000 | 12/5/2018
Facebook meanwhile blamed Google Play, saying the apps’ eligibility for its Facebook Audience Network ads was tied to their availability on Google Play and that the apps were removed from FAN when booted from the Android app store. The company was more forthcoming, telling TechCrunch it will refund advertisers whose promotions appeared on these abhorrent apps. It’s also pulling Audience Network from all apps that let users discover WhatsApp Groups.
A Facebook spokesperson tells TechCrunch that “Audience Network monetization eligibility is closely tied to app store (in this case Google) review. We removed [Public Group for WhatsApp by Bit-Build] when Google did – it is not currently monetizing on Audience Network. Our policies are on our website and out of abundance of caution we’re ensuring Audience Network does not support any group invite link apps. This app earned very little revenue (less than $500), which we are refunding to all impacted advertisers.”
Facebook also provided this statement about WhatsApp’s stance on illegal imagery sharing groups and third-party apps for finding them:
“WhatsApp does not provide a search function for people or groups – nor does WhatsApp encourage publication of invite links to private groups. WhatsApp regularly engages with Google and Apple to enforce their terms of service on apps that attempt to encourage abuse on WhatsApp. Following the reports earlier this week, WhatsApp asked Google to remove all known group link sharing apps. When apps are removed from Google Play store, they are also removed from Audience Network.”
An app with links for discovering illegal WhatsApp Groups runs an ad for Amazon Photos
Israeli NGOs Netivei Reshet and Screen Savers worked with AntiToxin to provide a report published by TechCrunch about the wide extent of child exploitation imagery they found on WhatsApp. Facebook and WhatsApp are still waiting on the groups to work with Israeli police to provide their full research so WhatsApp can delete illegal groups they discovered and terminate user accounts that joined them.
AntiToxin develops technologies for protecting online networks from harassment, bullying, shaming, predatory behavior and sexually explicit activity. It was co-founded by Zohar Levkovitz, who sold Amobee to SingTel for $400M, and Ron Porat, who was the CEO of ad-blocker Shine. [Disclosure: The company also employs Roi Carthy, who contributed to TechCrunch from 2007 to 2012.] “Online toxicity is at unprecedented levels, at unprecedented scale, with unprecedented risks for children, which is why completely new thinking has to be applied to technology solutions that help parents keep their children safe,” Levkovitz tells me. The company is pushing Apple to remove WhatsApp from the App Store until the problems are fixed, citing how Apple temporarily suspended Tumblr due to child pornography.
Ad Networks Must Be Monitored
Encryption has proven an impediment to WhatsApp preventing the spread of child exploitation imagery. WhatsApp can’t see what is shared inside of group chats. Instead it has to rely on the few pieces of public and unencrypted data such as group names and profile photos plus their members’ profile photos, looking for suspicious names or illegal images. The company matches those images to a PhotoDNA database of known child exploitation photos to administer bans, and has human moderators investigate if seemingly illegal images aren’t already on file. It then reports its findings to law enforcement and the National Center For Missing And Exploited Children. Strong encryption is important for protecting privacy and political dissent, but also thwarts some detection of illegal content and thereby necessitates more manual moderation.
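Because the message contents are encrypted, that kind of moderation can only look at the public surface of a group. Purely as an illustration of what metadata-only screening looks like, the sketch below checks a group's name against a keyword list and its public photo against a set of known-image fingerprints. The keyword list and the hash set are assumptions; PhotoDNA itself is proprietary and is stood in for by a plain hash lookup; and real flagged items would go to human reviewers rather than be acted on automatically.

```python
import hashlib

# Assumptions for illustration: the terms WhatsApp actually screens for are not
# public, and the hash set stands in for a PhotoDNA-style database of known imagery.
SINGLE_WORD_TERMS = {"cp"}          # matched as whole words, so "cpu" is not flagged
PHRASE_TERMS = ("child porn",)      # matched as substrings of the full group name
KNOWN_IMAGE_HASHES: set[str] = set()

def needs_human_review(group_name: str, group_photo: bytes) -> bool:
    """Flag a group for human investigation based only on unencrypted metadata."""
    name = group_name.lower()
    if SINGLE_WORD_TERMS & set(name.split()):
        return True
    if any(phrase in name for phrase in PHRASE_TERMS):
        return True
    # Public profile photo checked against fingerprints of known material.
    photo_hash = hashlib.sha256(group_photo).hexdigest()
    return photo_hash in KNOWN_IMAGE_HASHES
```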
With just 300 total employees and only a subset working on security or content moderation, WhatsApp seems understaffed to manage such a large user base. It’s tried to depend on AI to safeguard its community. However, that technology can’t yet perform the nuanced investigations necessary to combat exploitation. WhatsApp runs semi-independently of Facebook, but could hire more moderators to investigate group discovery apps that lead to child pornography if Facebook allocated more resources to its acquisition.
WhatsApp group discovery apps featured Adult sections that contained links to child exploitation imagery groups
Google and Facebook, with their vast headcounts and profit margins, are neglecting to properly police who hosts their ad networks. The companies have sought to earn extra revenue by powering ads on other apps, yet failed to assume the necessary responsibility to ensure those apps aren’t facilitating crimes. Stricter examinations of in-app content should be administered before an app is accepted to app stores or ad networks, and periodically once they’re running. And when automated systems can’t be deployed, as can be the case with policing third-party apps, human staffers should be assigned despite the cost.
It’s becoming increasingly clear that social networks and ad networks that profit off of other people’s content can’t be low-maintenance cash cows. Companies should invest ample money and labor into safeguarding any property they run or monetize even if it makes the opportunities less lucrative. The strip-mining of the internet without regard for consequences must end.
from Mobile – TechCrunch https://tcrn.ch/2T8FvB2 ORIGINAL CONTENT FROM: https://techcrunch.com/
0 notes
fmservers · 5 years
Text
Google & Facebook fed ad dollars to child porn discovery apps
Google has scrambled to remove third-party apps that led users to child porn sharing groups on WhatsApp in the wake of TechCrunch’s report about the problem last week. We contacted Google with the name of one of these apps and evidence that it and others offered links to WhatsApp groups for sharing child exploitation imagery. Following publication of our article, Google removed that app and at least five like it from the Google Play store. Several of these apps had over 100,000 downloads, and they’re still functional on devices that already downloaded them.
A screenshot from today of active child exploitation groups on WhatsApp. Phone numbers and photos redacted
WhatsApp failed to adequately police its platform, confirming to TechCrunch that it’s only moderated by its own 300 employees and not Facebook’s 20,000 dedicated security and moderation staffers. It’s clear that scalable and efficient artificial intelligence systems are not up to the task of protecting the 1.5 billion user WhatsApp community, and companies like Facebook must invest more in unscalable human investigators.
But now, new research provided exclusively to TechCrunch by anti-harassment algorithm startup AntiToxin shows that these removed apps that hosted links to child porn sharing rings on WhatsApp were supported with ads run by Google and Facebook’s ad networks. AntiToxin found 6 of these apps ran Google AdMob, 1 ran Google Firebase, 2 ran Facebook Audience Network, and 1 ran StartApp. These ad networks earned a cut of brands’ marketing spend while allowing the apps to monetize and sustain their operations by hosting ads for Amazon, Microsoft, Motorola, Sprint, Sprite, Western Union, Dyson, DJI, Gett, Yandex Music, Q Link Wireless, Tik Tok, and more.
The situation reveals that tech giants aren’t just failing to spot offensive content in their own apps, but also in third-party apps that host their ads and that earn them money. While these apps like “Group Links For Whats” by Lisa Studio let people discover benign links to WhatsApp groups for sharing legal content and discussing topics like business or sports, TechCrunch found they also hosted links with titles such as “child porn only no adv” and “child porn xvideos” that led to WhatsApp groups with names like “Children ” or “videos cp” — a known abbreviation for ‘child pornography’.
WhatsApp has an encrypted child porn problem
In a video provided by AntiToxin seen below, the app “Group Links For Whats by Lisa Studio” that ran Google AdMob is shown displaying an interstitial ad for Q Link Wireless before providing WhatsApp group search results for “child”. A group described as “Child nude FBI POLICE” is surfaced, and when the invite link is clicked, it opens within WhatsApp to a group called “Children ”.  (No illegal imagery is shown in this video or article. TechCrunch has omitted the end of the video that showed a URL for an illegal group and the phone numbers of its members.)
Another video shows the app “Group Link For whatsapp by Video Status Zone” that ran Google AdMob and Facebook Audience Network displaying a link to a WhatsApp group described as “only cp video”. When tapped, the app first surfaces an interstitial ad for Amazon Photos before revealing a button for opening the group within WhatsApp. These videos show how alarmingly easy it was for people to find illegal content sharing groups on WhatsApp, even without WhatsApp’s help.
[Embedded YouTube video]
Zero Tolerance Doesn’t Mean Zero Illegal Content
In response, a Google spokesperson tells me that these group discovery apps violated its content policies and it’s continuing to look for more like them to ban. When they’re identified and removed from Google Play, it also suspends their access to its ad networks. However, it refused to disclose how much money these apps earned and whether it would refund the advertisers. The company provided this statement:
“Google has a zero tolerance approach to child sexual abuse material and we’ve invested in technology, teams and partnerships with groups like the National Center for Missing and Exploited Children, to tackle this issue for more than two decades. If we identify an app promoting this kind of material that our systems haven’t already blocked, we report it to the relevant authorities and remove it from our platform. These policies apply to apps listed in the Play store as well as apps that use Google’s advertising services.”
App | Developer | Ad Network | Estimated Installs | Last Day Ranked
Unlimited Whats Groups Without Limit Group links | Jack Rehan | Google AdMob | 200,000 | 12/18/2018
Unlimited Group Links for Whatsapp | NirmalaAppzTech | Google AdMob | 127,000 | 12/18/2018
Group Invite For Whatsapp | Villainsbrain | Google Firebase | 126,000 | 12/18/2018
Public Group for WhatsApp | Bit-Build | Google AdMob, Facebook Audience Network | 86,000 | 12/18/2018
Group links for Whats – Find Friends for Whats | Lisa Studio | Google AdMob | 54,000 | 12/19/2018
Unlimited Group Links for Whatsapp 2019 | Natalie Pack | Google AdMob | 3,000 | 12/20/2018
Group Link For whatsapp | Video Status Zone | Google AdMob, Facebook Audience Network | 97,000 | 11/13/2018
Group Links For Whatsapp – Free Joining | Developers.pk | StartAppSDK | 29,000 | 12/5/2018
Facebook meanwhile blamed Google Play, saying the apps’ eligibility for its Facebook Audience Network ads was tied to their availability on Google Play and that the apps were removed from FAN when booted from the Android app store. The company was more forthcoming, telling TechCrunch it will refund advertisers whose promotions appeared on these abhorrent apps. It’s also pulling Audience Network from all apps that let users discover WhatsApp Groups.
A Facebook spokesperson tells TechCrunch that “Audience Network monetization eligibility is closely tied to app store (in this case Google) review. We removed [Public Group for WhatsApp by Bit-Build] when Google did – it is not currently monetizing on Audience Network. Our policies are on our website and out of abundance of caution we’re ensuring Audience Network does not support any group invite link apps. This app earned very little revenue (less than $500), which we are refunding to all impacted advertisers.”
Facebook also provided this statement about WhatsApp’s stance on illegal imagery sharing groups and third-party apps for finding them:
“WhatsApp does not provide a search function for people or groups – nor does WhatsApp encourage publication of invite links to private groups. WhatsApp regularly engages with Google and Apple to enforce their terms of service on apps that attempt to encourage abuse on WhatsApp. Following the reports earlier this week, WhatsApp asked Google to remove all known group link sharing apps. When apps are removed from Google Play store, they are also removed from Audience Network.”
An app with links for discovering illegal WhatsApp Groups runs an ad for Amazon Photos
Israeli NGOs Netivei Reshet and Screen Savers worked with AntiToxin to provide a report published by TechCrunch about the wide extent of child exploitation imagery they found on WhatsApp. Facebook and WhatsApp are still waiting on the groups to work with Israeli police to provide their full research so WhatsApp can delete illegal groups they discovered and terminate user accounts that joined them.
AntiToxin develops technologies for protecting online networks from harassment, bullying, shaming, predatory behavior and sexually explicit activity. It was co-founded by Zohar Levkovitz, who sold Amobee to SingTel for $400M, and Ron Porat, who was the CEO of ad-blocker Shine. [Disclosure: The company also employs Roi Carthy, who contributed to TechCrunch from 2007 to 2012.] “Online toxicity is at unprecedented levels, at unprecedented scale, with unprecedented risks for children, which is why completely new thinking has to be applied to technology solutions that help parents keep their children safe,” Levkovitz tells me. The company is pushing Apple to remove WhatsApp from the App Store until the problems are fixed, citing how Apple temporarily suspended Tumblr due to child pornography.
Ad Networks Must Be Monitored
Encryption has proven an impediment to WhatsApp preventing the spread of child exploitation imagery. WhatsApp can’t see what is shared inside of group chats. Instead it has to rely on the few pieces of public and unencrypted data such as group names and profile photos plus their members’ profile photos, looking for suspicious names or illegal images. The company matches those images to a PhotoDNA database of known child exploitation photos to administer bans, and has human moderators investigate if seemingly illegal images aren’t already on file. It then reports its findings to law enforcement and the National Center For Missing And Exploited Children. Strong encryption is important for protecting privacy and political dissent, but also thwarts some detection of illegal content and thereby necessitates more manual moderation.
With just 300 total employees and only a subset working on security or content moderation, WhatsApp seems understaffed to manage such a large user base. It’s tried to depend on AI to safeguard its community. However, that technology can’t yet perform the nuanced investigations necessary to combat exploitation. WhatsApp runs semi-independently of Facebook, but could hire more moderators to investigate group discovery apps that lead to child pornography if Facebook allocated more resources to its acquisition.
WhatsApp group discovery apps featured Adult sections that contained links to child exploitation imagery groups
Google and Facebook, with their vast headcounts and profit margins, are neglecting to properly police the apps that host their ad networks. The companies have sought to earn extra revenue by powering ads on other apps, yet failed to assume the necessary responsibility to ensure those apps aren’t facilitating crimes. Stricter examinations of in-app content should be administered before an app is accepted to app stores or ad networks, and periodically once they’re running. And when automated systems can’t be deployed, as can be the case with policing third-party apps, human staffers should be assigned despite the cost.
It’s becoming increasingly clear that social networks and ad networks that profit off of other people’s content can’t be low-maintenance cash cows. Companies should invest ample money and labor into safeguarding any property they run or monetize even if it makes the opportunities less lucrative. The strip-mining of the internet without regard for consequences must end.
Via Josh Constine https://techcrunch.com
0 notes
topdiyhub · 5 years
Link
WhatsApp chat groups are being used to spread illegal child pornography, cloaked by the app’s end-to-end encryption. Without the necessary number of human moderators, the disturbing content is slipping by WhatsApp’s automated systems. A report from two Israeli NGOs reviewed by TechCrunch details how third-party apps for discovering WhatsApp groups include “Adult” sections that offer invite links to join rings of users trading images of child exploitation. TechCrunch has reviewed materials showing many of these groups are currently active.
TechCrunch’s investigation shows that Facebook could do more to police WhatsApp and remove this kind of content. Even without technical solutions that would require a weakening of encryption, WhatsApp’s moderators should have been able to find these groups and put a stop to them. Groups with names like “child porn only no adv” and “child porn xvideos” found on the group discovery app “Group Links For Whats” by Lisa Studio don’t even attempt to hide their nature. And a screenshot provided by anti-exploitation startup AntiToxin reveals active WhatsApp groups with names like “Children” or “videos cp” — a known abbreviation for ‘child pornography’.
A screenshot from today of active child exploitation groups on WhatsApp. Phone numbers and photos redacted. Provided by AntiToxin.
Better manual investigation of these group discovery apps and WhatsApp itself should have immediately led these groups to be deleted and their members banned. While Facebook doubled its moderation staff from 10,000 to 20,000 in 2018 to crack down on election interference, bullying and other policy violations, that staff does not moderate WhatsApp content. With just 300 employees, WhatsApp runs semi-independently, and the company confirms it handles its own moderation efforts. That’s proving inadequate for policing a 1.5 billion-user community.
The findings from the NGOs Screen Savers and Netivei Reshet were written about today by the Financial Times, but TechCrunch is publishing the full report, their translated letter to Facebook, translated emails with Facebook, their police report, plus the names of child pornography groups on WhatsApp and of the group discovery apps listed above. A startup called AntiToxin Technologies that researches the topic has backed up the report, providing the screenshot above and saying it’s identified more than 1,300 videos and photographs of minors involved in sexual acts on WhatsApp groups. Given that Tumblr’s app was recently temporarily removed from the Apple App Store for allegedly harboring child pornography, we’ve asked Apple if it will temporarily suspend WhatsApp, but have not heard back.
Uncovering a nightmare
In July 2018, the NGOs became aware of the issue after a man reported to one of their hotlines that he’d seen hardcore pornography on WhatsApp. In October, they spent 20 days cataloging more than 10 of the child pornography groups, their content and the apps that allow people to find them.
The NGOs began contacting Facebook’s head of Policy, Jordana Cutler, starting September 4th. They requested a meeting four times to discuss their findings. Cutler asked for email evidence but did not agree to a meeting, instead following Israeli law enforcement’s guidance to instruct researchers to contact the authorities. The NGOs reported their findings to Israeli police but declined to provide Facebook with their research. WhatsApp only received their report and the screenshot of active child pornography groups today from TechCrunch.
Listings from a group discovery app of child exploitation groups on WhatsApp. URLs and photos have been redacted.
WhatsApp tells me it’s now investigating the groups visible from the research we provided. A Facebook spokesperson tells TechCrunch, “Keeping people safe on Facebook is fundamental to the work of our teams around the world. We offered to work together with police in Israel to launch an investigation to stop this abuse.” A statement from the Israeli Police’s head of the Child Online Protection Bureau, Meir Hayoun, notes that: “In past meetings with Jordana, I instructed her to always tell anyone who wanted to report any pedophile content to contact the Israeli police to report a complaint.”
A WhatsApp spokesperson tells me that while legal adult pornography is allowed on WhatsApp, it banned 130,000 accounts in a recent 10-day period for violating its policies against child exploitation. In a statement, WhatsApp wrote that:
WhatsApp has a zero-tolerance policy around child sexual abuse. We deploy our most advanced technology, including artificial intelligence, to scan profile photos and images in reported content, and actively ban accounts suspected of sharing this vile content. We also respond to law enforcement requests around the world and immediately report abuse to the National Center for Missing and Exploited Children. Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it.
But it’s that over-reliance on technology and subsequent under-staffing that seems to have allowed the problem to fester. AntiToxin’s CEO Zohar Levkovitz tells me, “Can it be argued that Facebook has unwittingly growth-hacked pedophilia? Yes. As parents and tech executives we cannot remain complacent to that.”
Automated moderation doesn’t cut it
WhatsApp introduced an invite link feature for groups in late 2016, making it much easier to discover and join groups without knowing any members. Competitors like Telegram had benefited as engagement in their public group chats rose. WhatsApp likely saw group invite links as an opportunity for growth, but didn’t allocate enough resources to monitor groups of strangers assembling around different topics. Apps sprang up to allow people to browse different groups by category. Some usage of these apps is legitimate, as people seek communities to discuss sports or entertainment. But many of these apps now feature “Adult” sections that can include invite links to both legal pornography-sharing groups as well as illegal child exploitation content.
A WhatsApp spokesperson tells me that it scans all unencrypted information on its network — basically anything outside of chat threads themselves — including user profile photos, group profile photos and group information. It seeks to match content against the PhotoDNA banks of indexed child pornography that many tech companies use to identify previously reported inappropriate imagery. If it finds a match, that account, or that group and all of its members, receive a lifetime ban from WhatsApp.
A WhatsApp group discovery app’s listings of child exploitation groups on WhatsApp
If imagery doesn’t match the database but is suspected of showing child exploitation, it’s manually reviewed. If found to be illegal, WhatsApp bans the accounts and/or groups, prevents it from being uploaded in the future and reports the content and accounts to the National Center for Missing and Exploited Children. The one example group reported to WhatsApp by Financial Times was already flagged for human review by its automated system, and was then banned along with all 256 members.
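As a rough illustration of that escalation path, the sketch below strings the steps together. Every function is a hypothetical stub; only the ordering comes from the description above: an automatic lifetime ban on a hash match, human review otherwise, and then banning, blocking re-uploads and reporting to NCMEC when content is confirmed illegal.

```python
# Hypothetical stubs illustrating the escalation path described above.
KNOWN_ILLEGAL_HASHES = set()


def ban_group_and_members(group_id: str) -> None:
    print(f"banning group {group_id} and all of its members")


def queue_for_human_review(image_hash: str, group_id: str) -> str:
    return "illegal"  # stub verdict; in reality a moderator makes this call


def report_to_ncmec(image_hash: str, group_id: str) -> None:
    print(f"reporting {image_hash} from group {group_id} to NCMEC")


def handle_suspect_image(image_hash: str, group_id: str) -> None:
    if image_hash in KNOWN_ILLEGAL_HASHES:
        ban_group_and_members(group_id)       # automatic lifetime ban on a known match
        return
    if queue_for_human_review(image_hash, group_id) == "illegal":
        ban_group_and_members(group_id)
        KNOWN_ILLEGAL_HASHES.add(image_hash)  # blocks future uploads of the same image
        report_to_ncmec(image_hash, group_id)
```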
To discourage abuse, WhatsApp says it limits groups to 256 members and purposefully does not provide a search function for people or groups within its app. It does not encourage the publication of group invite links, and the vast majority of groups have six or fewer members. It’s already working with Google and Apple to enforce its terms of service against apps like the child exploitation group discovery apps that abuse WhatsApp. Those kinds of groups already can’t be found in Apple’s App Store, but remain available on Google Play. We’ve contacted Google Play to ask how it addresses illegal content discovery apps and whether Group Links For Whats by Lisa Studio will remain available, and will update if we hear back. [Update 3pm PT: Google has not provided a comment but the Group Links For Whats app by Lisa Studio has been removed from Google Play. That’s a step in the right direction.]
But the larger question is: if WhatsApp was already aware of these group discovery apps, why wasn’t it using them to track down and ban groups that violate its policies? A spokesperson claimed that group names with “CP” or other indicators of child exploitation are some of the signals it uses to hunt these groups, and that names in group discovery apps don’t necessarily correlate to the group names on WhatsApp. But TechCrunch then provided a screenshot showing active groups within WhatsApp as of this morning, with names like “Children” or “videos cp”. That shows that WhatsApp’s automated systems and lean staff are not enough to prevent the spread of illegal imagery.
The situation also raises questions about the trade-offs of encryption as some governments like Australia seek to prevent its usage by messaging apps. The technology can protect free speech, improve the safety of political dissidents and prevent censorship by both governments and tech platforms. However, it also can make detecting crime more difficult, exacerbating the harm caused to victims.
WhatsApp’s spokesperson tells me that it stands behind strong end-to-end encryption that protects conversations with loved ones, doctors and more. They said there are plenty of good reasons for end-to-end encryption and it will continue to support it. Changing that in any way, even to aid catching those that exploit children, would require a significant change to the privacy guarantees it’s given users. They suggested that on-device scanning for illegal content would have to be implemented by phone makers to prevent its spread without hampering encryption.
But for now, WhatsApp needs more human moderators willing to use proactive and unscalable manual investigation to address its child pornography problem. With Facebook earning billions in profit per quarter and staffing up its own moderation ranks, there’s no reason WhatsApp’s supposed autonomy should prevent it from applying adequate resources to the issue. WhatsApp sought to grow through big public groups, but failed to implement the necessary precautions to ensure they didn’t become havens for child exploitation. Tech companies like WhatsApp need to stop assuming cheap and efficient technological solutions are sufficient. If they want to make money off huge user bases, they must be willing to pay to protect and police them.
from Social – TechCrunch https://tcrn.ch/2AdhQIG via social
0 notes
btreports · 5 years
Link
The app was removed from Apple's store after images showing child sexual abuse were discovered. from BBC News - Technology https://ift.tt/2OT12vi
0 notes
activitytips · 6 years
Link
0 notes
Photo
Tumblr removed from Apple app store over abuse images https://ift.tt/2FJ3GVc
0 notes
knowinng · 6 years
Text
Tumblr removed from Apple app store over abuse images
The app was removed from Apple's store after images showing child sexual abuse were discovered. from RSSMix.com Mix ID 8204427 https://ift.tt/2OT12vi via IFTTT
0 notes