Comment by ghoul2
20 days ago
A long time ago, I was operating a "social network" which allowed image uploads. (India-local; it didn't amount to much and was shut down.)
Immediately at launch, we started having a huge amount of (image) porn being uploaded into the pages. We put in some rate limits etc., but did not want to put any major restrictions on user signups, as that would hurt signup figures (important to the CEO!).
We already had some content review people through a temp agency on site, so we checked with them and they were fine doing this manual filtering of these images for us. All young (early 20s) women. While my team built a quick "dashboard" for them to be able to do this image filtering quickly and conveniently, I had a detailed conversation with them, as I was very concerned about having them review this kind of stuff for 8 hours a day, 5 days a week. Truly nasty stuff.
They were _perfectly_ clear that they had no issues with it, and they told me in so many words to not give it a second thought and to let them get back to work.
It was a surprise, but it was a point of realization: there are much worse things they could be doing. And looking at porn has shock value only the first time. I was underestimating these women and assuming they were some "snowflakes" who could not deal with something this silly and non-threatening.
Just my own personal anecdata.
It's good that you warned them in advance, but I don't think "there are much worse things they could be doing" is really a reason to dismiss those who are hurt by this as "snowflakes," or the content they're exposed to as "silly and non-threatening." From what you're saying, there's also a big difference between what you were dealing with and what this article is talking about. In your case it was only still images, and only "regular" porn (and I assume they had to monitor every image posted if it was "a long time ago," so overall they were not exposed only to that content). In this case they're reviewing video content including not only porn but also violence and sexual abuse, pre-flagged by an AI, so with far less rule-abiding content mixed in.
People butcher livestock for pay on a daily basis, rendering a live animal dead and cutting it to bits, on a massive scale to provide the food that we eat. Rescue and medical personnel deal with the injured, ill, and dying on a daily basis on a massive scale. Merely watching videos, even of disturbing material, doesn't even come close to being as bad as those and some other professions.
> Merely watching videos, even of disturbing material, doesn't even come close to being as bad as those and some other professions.
The research and reporting, which look at actual people's experiences, say otherwise. This issue has been coming up for years.
People live through war, murder, torture, rape, etc. We can always find something worse but that doesn't make the current situation better. Human experience of pain and trauma isn't scaled relative to the worst possible pain and trauma.
I think watching children being butchered or prisoners being tortured to death could be equally or more traumatic. Killing animals for processing is at least socially acceptable and (arguably) necessary for survival.
[flagged]
https://news.ycombinator.com/item?id=46921505
The alternative is… they don't have a job?
What else is going to happen?
For many people, the alternatives can literally be death, wading unprotected through human excrement, being sold into de facto slavery, etc.
AI can do some of it (now), but it largely isn't going to be more accurate, and it still needs humans to double-check its work.
I really don't think it's fair to minimize someone's struggles just because their situation could be worse. Is only the most miserable person on the planet (by what metric anyway?) allowed to complain about their condition?
I also don't think it's fair to exploit people who are in terrible situations by pushing jobs we don't want onto them, paying them a handful of crumbs, and then saying they should be happy with what they get because their neighbor who does another job gets half a handful of crumbs.
I actually had a conversation about this with my mom. We were talking about the hotel cleaners in Dubai walking around with toothbrushes to clean the shower which seemed mildly ridiculous to our European eyes.
But we came to the realisation that these folks were probably happy that they could send money back to their villages. And we left a nice tip.
There is plenty of productive work to do in the world that isn’t streaming through the dregs of the internet for hours on end.
This argument is a false dichotomy.
It's modern slavery.
Just because it's beneficial for them doesn't mean they aren't being exploited.
Looking at porn, or something similar or actually far worse, changes people, even if only by making them a lot more desensitized.
IMO it's only slavery if they truly have no other options. Which, if true, is also worth addressing.
We live in a world where most people do not do something which directly affects food production.
For it not to be modern slavery, it would/should come with above-average pay and benefits.
While one can blame corporations, the most blame lands on the Indian government(s). Decades and decades of corrupt local, state, and central governance has led to dire poverty and high levels of unemployment. The current and past leaders have had no care to fix it, and it’s only getting worse. Their incompetence is what creates these kinds of jobs as alternatives to abject poverty and death.
As an Indian living in India's Silicon Valley, I am calling BS on this.
"All young (early 20s) women" doing manual filtering of explicit content is not normal. No Indian family would allow their daughters (or sons) to take up such a job if they knew about it.
As the article itself points out, the workers are drawn in by non-explicit image annotation and then switched to psychologically harmful work once they cannot leave, due to financial need and contract enforcement.
This is a pure bait-and-switch scam which has severe psychological effects on the women (and men) involved, and it should not be blithely hand-waved away.
For a detailed understanding of the psychological harm of explicit image annotation, see this documentary (not specific to India): "The 'Modern Day Slaves' of the AI Tech World." An undercover reporter does annotation of explicit images for AI but is so traumatized by what he sees that he quits within two weeks. - https://www.youtube.com/watch?v=VPSZFUiElls
Would Indian families allow their daughters or sons to become part of tech support scams stealing the livelihood of elderly people?