If I remember correctly, the exact same message was delivered to Jordan Peterson a couple of weeks ago or so, before he sat down with the google memo guy. He was in the middle of a bible lecture series, and Google banned his account, and sent the exact same "we've determined that your video does violate our Community Guidelines and have upheld our original decision. We appreciate your understanding." message.
It seems tone deaf especially since in cases such as these there is no understanding to appreciate. Google will not tell you what you did to violate policy, only that they checked to ensure that they found you guilty, and then they snub you further with the HR speak. It's maddening.
The worst part of the information age is arbitration by unreasonable and impenetrable algorithms rather than humans with the capacity to make a judgement call when the rules clearly don't account for the situation at hand.
A transparent appeals process staffed by humans who can at least deliver a rationale, including what rule you broke, should be required by law. There's irreparable reputational damage associated with an algorithm libelously labeling something "extremist content" that isn't.
Whenever I see these stories, it reminds me of the phrase "the lights are on but nobody's home." Google gives the impression that nobody actually works there. That the whole thing is just an automated system and that sometimes people fall through the cracks.
His channel was re-instated which... was nice, I guess.
I have watched hours upon hours of his videos (I've been a fan long before his PC controversy - I love personality theory and the twist he adds to them).
I'm pretty centre left as far as I'm concerned and his videos do not in any way promote anything nasty. He's completely upstanding. I have no idea why they'd ban his channel unless there was a coordinated flagging.
It's a strange corporate culture, that's for sure.
To me, this is further evidence that Google needs to be put under regulation.
Hoping with the rollout of GDPR this will come to an end.
So Google's message to Jordan Peterson wasn't "Stop making sense, we have an agenda to push"?
There are 2 uses for that language.
1. It's to add insult to injury while attempting to soften the blow
2. It's an attempt to deny that they have power to do what they did.
For the insult to injury: this is a technique that falls under "thinking past the sale" (http://blog.dilbert.com/post/129433801521/thinking-past-the-...). The context preceding the "understanding" part has put you in the position of: "I've done something bad and now I'm being punished." The closing phrase "We appreciate your understanding." then puts you in the position of understanding that they (plural, an organization) would like you to accept their decision.
In short, you're no longer in a position where you can fight back directly on the issue at hand. You're reminded that you're fighting against an organization if you disagree. It's predatory and manipulative.
The second part: It's a manipulative attempt to prevent you from attributing ill will against the offending party. They're attempting to "soften the blow" because they appear to be reaching out.
On both of these possibilities, the biggest issue is that it's incredibly manipulative, and it's much more insulting once you notice it. The organizations and people who use those statements deserve to be constantly rejected in everything they do and want, in the same passive-aggressive style, for the rest of their lives. It's incredibly anti-social behavior.
Unfortunately, we don't have social punishments for shitty behaviors like this.
Last to note: This is the equivalent of the "apology" "I'm sorry that you feel that way" or "I'm sorry this didn't turn out the way you had hoped."
It's because every reason you give people is something they can use to sue you or harass your employees if they don't agree with the wording.
If you say nothing, there's nothing to grab on to.
It's like online dating, everything you say is something that someone will dislike about you.
> Does that kind of patronizing tone sound polite to the ears of a PR drone?
I don't know, but you're right it is very common, and infuriating.
"Your call is important to us. Please hold."
I think a passive-aggressive "fuck you" is an important element because it provides enjoyment and some satisfaction to the corporate drones you're interacting with.
I think allowing low-level minions with sadistic tendencies to express that sadism via a kthxbye-but-actually-fu here and there rewards them for an otherwise boring and unfulfilling task. In this case it's a bit indirect, because it's the developer who wrote the code, not necessarily a clerk at the counter or a customer service representative in a call center.
I don't remember the incident exactly; I think it was when someone was fired after a public incident at one of the tech conferences (PyCon, I believe), where HR, commenting on the firing, said something ridiculous like "we reached out to the developer and told them we'd be letting them go." I remembered it because it sounded like such a massive, rude, passive-aggressive "fuck you".
Sounds like everyday life in Japan.
I don’t think that Larry and Sergey yet fully grasp the consequences of allowing an echo chamber to form within google. One of the consequences of living in an echo chamber is that we rob ourselves of the feedback that we need to make good decisions. For example, many of us on the left who only listen to those who think like us are blissfully unaware of the severity of the storm that is building on the right and the consequences for the business.
Google’s whole future business model seems to be based on getting deeper into our lives, into our homes, into our vehicles and gathering ever more data about us so they can more effectively help others to market to us. Many of us have been completely okay with that in past because we trust Google. They’ve worked hard to earn that trust. But with the public shaming and firing of James Damore, the blacklisting of “non believers”, the demonetizing and deleting of YouTube videos that violate the “Code of Conduct”, etc. the bonds of trust have been shattered. And once trust has been shattered, it is nearly impossible to re-build.
Yes, it does. And they're not entirely wrong. Consider the video in which the Uber CEO talks with someone like a real person instead of "respectfully" brushing him off:
It was a PR disaster. If he'd just ignored what's-his-name as an insignificant peasant, nothing bad would have come of it.
The big secret is that YouTube, one of the most important sites in the history of the internet, has had a bunch of people who just don't care.
Word is at some point it became a dumping ground for Google employees who transferred in because they wanted an easy job where they could use the amenities of the YouTube offices in San Bruno.
> We appreciate your understanding.
It's their way of saying that they know you won't find their decision popular but they hope you won't pursue it any further.
It comes down to the second thing I remember learning in high-school-level programming: always use neutral language in your program text, and don't make jokes.
Google does have a habit of making jokes in its software (the Chrome crash page comes to mind), so I don't really find it surprising to see this kind of message.
To be clear, it's really only an insult when you don't know why your content was removed. If you know why, and can see their side, it makes sense and wouldn't seem so insulting. But in a case like this, it comes off as insulting given the actual situation.
Removing the text altogether and simply stating the facts would have sent the intended message perfectly.
And finally, it was likely just a developer, not a PR drone.
All the responses are ascribing a nefarious ulterior motive for that piece of text. It could well be true, as these orgs have enough resources and data to A/B test it to hell.
Or, it could be that, long long ago, that phrase actually conveyed sincerity that won customers.
Soon, it became the next big customer-retention tactic to apply. Then it became cliché. Now the fake sincerity echoing from these huge organizations just grates on our senses.
So I'm wondering whether this HN exposure will be enough for some sensible humans to review these takedowns. I mean, is it embarrassing enough? I recall that Facebook reacted rather quickly after its "AI" took down videos inappropriately.
Internal studies probably correlate such language with reduced pushback from users.
The wall of jello defense. Works well.
It's somewhat similar to emailing someone a request and signing off with "thanks in advance", implying not just the request but the expectation of honouring it, too.
In the past it's the kind of tone that would inspire me to greatly escalate the issue, and I doubt that I'm alone.
Youtube's response regarding one of these videos documenting abuses (emphasis mine):
> "we've determined that your video does violate our Community Guidelines and have upheld our original decision. We appreciate your understanding."
Can someone explain to me why corporations, when interacting with customers regarding complaints/appeals, seem to have "don't forget to add insult to injury" as one of their motto more often than not? Does that kind of patronizing tone sound polite to the ears of a PR drone?
That is a little insane... Makes me wish there were some sort of website that archived specific YouTube videos marked as historical or criminal evidence or under some similar qualification, as long as they don't abuse copyright, just to make sure that if places like YouTube delete them, they can remain available. Or a service that uploads to multiple video streaming sources at once (though I imagine these might violate YouTube's TOS for whatever reason).
It's kind of sad to see video evidence being deleted by YouTube. It would be nice if they offered some option for political videos like these, especially if the original uploader was killed, to be downloaded with metadata (upload date, YouTube username, etc.) so anyone could reupload them elsewhere.
Another case where I wish TPB had made their own YouTube clone already. I'm sure they would not have taken down these sorts of videos.
I wonder where WikiLeaks is in cases like these. Do they download these sorts of videos? Which raises the question: why don't they? It seems right up their alley. I don't always agree with them, nor do I consume their content, but at the very least, for a site like theirs, it would make sense to archive YouTube and other politically sensitive videos, no?
People really need to download videos from YouTube if they are that important; not doing so is, in my opinion, reckless. YouTube is not a service you can expect to actually archive important videos; if I look through my favorites or other playlists, a ton of the videos are deleted.
Use youtube-dl, download it to your own server and back them up yourself. Yes, this is awful and sucks on so many levels, but please, please, back up data.
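As a sketch of what that backup could look like (the flags below are real youtube-dl options; the `urls.txt` batch file and the output layout are just illustrative assumptions), something like this builds a resumable archiving command:

```python
def build_archive_command(batch_file, archive_file="archive.txt"):
    """Build a youtube-dl argv that mirrors a list of video URLs,
    keeps per-video metadata, and skips anything already fetched."""
    return [
        "youtube-dl",
        "--ignore-errors",                   # skip deleted/private videos instead of aborting
        "--download-archive", archive_file,  # record fetched IDs so reruns only grab new videos
        "--write-info-json",                 # save uploader, upload date, title alongside the file
        "-o", "%(uploader)s/%(upload_date)s-%(id)s.%(ext)s",
        "--batch-file", batch_file,          # one video URL per line
    ]

# To actually run it on your own server:
#   import subprocess
#   subprocess.run(build_archive_command("urls.txt"), check=True)
```

The `--download-archive` file is what makes this safe to rerun on a schedule: anything already downloaded is skipped, so you can keep the mirror current as channels add videos.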
In my opinion the important thing to remember is that Youtube could easily do something about this. It seems like there's a trend in a lot tech-centric companies to save money automating away actual content review as well as customer interaction. But humans are entirely capable of reviewing content and giving detailed explanations for why something has been judged a certain way, especially if you have them work alongside an automated system that simplifies the process for them.
I realize that minimizing human labor is a big part of how these sorts of business models achieve their profitability, but problems like this aren't going to go away as long as that's the norm. And I don't just mean poor explanations for policy decisions either. The core issue is bigger than that imo.
The information age has privatized a lot of the modern 'public' social/cultural spaces. For nations that value both the freedom of speech and the preservation of historically/culturally significant speech this is problematic. It reduces the public's ability to express itself but also their ability to look back on old expressions and learn about the history or cultural paradigms behind them.
This isn't really supposed to be a rant at Google specifically. They're just the topic at hand, so they're the easy punching bag. In general, customer service aside, I think they do good work, and more importantly they exercised the necessary foresight and resources to develop their products into what they are today. I'm by no means implying we should socialize social media... no pun intended. But I do think there needs to be more discourse about how these trends will affect the future of speech and historical censorship. Right now it's just a modern problem in its infancy, but decades from now, people who want to see visceral content depicting firsthand experiences from events like those happening in Syria, or the Arab Spring, are going to be getting censored history. What if China started pressuring foreign companies, via benevolent coercion such as financial incentives, to implement systems that made finding information about Tiananmen Square more difficult? The privatized nature of these platforms makes this sort of attack easier as well. I don't have a good solution, but that's why I think there needs to be more dialogue about the future of online media in general and what direction we want to steer it in.
It's ridiculously easy for corporations and governments to censor the web (yet for some reason people keep saying that once something is posted online, it will stay there forever)... which is why I use youtube-dl for the videos that I really want to keep...
Why wouldn't Youtube pass on any extremist material on to the FBI before making it unavailable for the wider public?
Also see this Twitter thread: https://twitter.com/EliotHiggins/status/896358097320636416
> Ironically, by deleting years old opposition channels YouTube is doing more damage to Syrian history than ISIS could ever hope to achieve
> Also gone are the dozens of playlists of videos from Syria I created, including dozens of chemical attacks in playlists by date
> Keep in mind in many cases these are the only copies of the videos, and in some the channel owner will have died, so nothing can stop it
Youtube used to be a distribution channel but it slowly became an ad delivery tool with content along for the ride. Like 99% of other free (and even some paid) internet sites.
Either way, totally agree that it's a tragic situation.
The only sympathy I have for Google is that trying to separate the good vs. "evil" (as in "Don't be evil.") content is a monumental task that machine learning will probably never be capable of performing. So they're left with the choice of spending an inordinate amount on human review and detailed research or just make wildly over-broad removals.
I'd rather they leave up more rather than less, but they tried this approach and it almost lost them every major advertiser. So continuing down that road would potentially lead to the whole platform losing viability. Maybe some would like that outcome but these historical videos would be just as lost.
Maybe we'll see the pendulum swing back in an effort to reach a more reasonable middle ground.
One issue with YouTube is archival status. Another is provenance.
At some point it will be as easy to create fake videos as it is to create fake text. It's unrealistic to expect that people who aren't information-literate about text will be literate about video, but I hope there's a way for journalists to move away from YouTube by then.
I doubt they're grieving lost footage. It's loss of access to that distribution channel.
Documentation is pointless if it can't be distributed and used to effect change.
>should have the video raws immediately turned over to a human rights organization for preservation
You're talking as if every human rights organization has some magical means of preservation other than uploading to YouTube.
Where else should they be stored?
It was folly to think that YouTube would be a safe place to document war crimes. YouTube is a distribution channel, not a preservation channel. Its ease of use certainly makes it an attractive option to upload things quickly, but anything of historical significance should have the video raws immediately turned over to a human rights organization for preservation.
I have (had) a channel with videos about missing people, their last sightings on CCTV, etc. The parents of a missing person even embedded one of the videos, CCTV footage, on their site. They emailed me asking if I still had the video because they needed it.
YouTube banned the whole channel for extremist/hateful content. Probably some of the videos/titles signaled to the AI that the footage was extremist or some sort of glorification.
I appealed via some form, but I don't even bother anymore.
I hope YouTube as a video platform (not streaming) gets a serious competitor.
Youtube isn't the only channel to upload videos. Check out webtorrent or bitchute for p2p based videos.
During the Arab Spring I suspected many police-violence videos would be deleted from YouTube. I downloaded them to my server and posted the links everywhere for people to mirror them. Not a single person has done so yet.
I've been amazed at how little importance people place on this kind of video. You have video evidence of crimes with faces appearing clearly. It can take 5 to 10 years for such events to calm down enough to reach a point where the crimes can be prosecuted.
And it's hard to blame YouTube for that. They're considered the channel for Lady Gaga and silly cat videos. Hell, I know three-year-old toddlers who browse YouTube unsupervised.
In many places YouTube is criticized for promoting violence and extremism by leaving these videos up. I feel bad for them; they're between a rock and a hard place.
I just hope the censored videos are not totally deleted from their servers. They should have someone reviewing criminal videos and keeping them at the disposal of judicial authorities, but even that opens a whole can of worms: do you obey only US authorities (who do not care about war crimes in other countries)? Do you obey all world authorities, including the Saudis and the Chinese?
Anyway, that's youtube's problem, not ours. Simply, helping prosecute war crime is not part of Youtube's mission, so do not trust them for it. To anyone who feels this is important content, use youtube-dl and keep backups. Make torrents of it, share it around, make sure it does not disappear.
And when some NGO finally realize that this content is precious, pump up your upload bandwidth and fill their servers.
> Such AI coupled with the inflexible policies of companies like Google and Amazon
I would argue that there is a key difference in customer support which makes me much more confident in Amazon than Google.
Google has non-existent customer support for the public and virtually non-existent customer support for paid customers. If something goes wrong with your Google product your best bet - even as a paid customer - is to contact someone you know at Google. Going through the official channels is a waste of time.
Amazon, on the other hand, will bend over backwards to make sure you're satisfied - even if it loses money in that transaction. Refund decisions are mostly automated at this point, although human support for both vendors and buyers is there if needed.
Such AI coupled with the inflexible policies of companies like Google and Amazon is already starting to be a problem and will only get worse as it's deployed more broadly. Accounts are closed without recourse for invalid reasons and their owners treated like violators. Short of a law requiring explanations and an appeal process, I don't see this situation getting better ever. Yet another reason not to trust these companies or use their services that require creating accounts and agreeing to their bullshit TOS.
True free speech also entails allowing private companies not to be compelled to host what they deem to be extremist material at their own expense.
(whilst simultaneously being expected to take down copyright infringements, because whilst people are prepared to defend to the death certain groups' right to incite people to genocide, an unauthorised Bieber video is clearly going too far)
That remains true even if their algorithm or human criteria for determining what is and isn't extremist material sucks.
Agree to disagree. This isn't really an issue of a lack of absolute free speech; it has much more to do with AI gone bad. Under any reasonable (i.e. limited) definition of free speech, it would've been wrong to remove footage documenting war crimes; hell, even under a highly restricted one.
Freedom of speech is not inherently the highest value there is. You'd have to be able to defend something like "Yeah, even Hitler had a right to say what he thought, and it's a good thing he had it, despite the consequences that ensued," because, in my opinion, I'd rather restrict the freedom of one genocidal maniac than see the death of 85 million people.
Using your definition of freedom of speech, we could easily justify not outlawing murder: "We should just educate people not to murder each other instead of banning it." Maybe banning can actually have a chilling effect on, e.g., hate speech or heroin abuse? (While I'm for banning heroin, I advocate providing services that ensure safe consumption (e.g. needle dispensaries, consumption rooms) and help prevent (further) abuse, instead of jailing users.)
You are right, but i don't think that's actually possible.
It's not easy to educate the general population. How would you teach critical thinking if most people just don't want to learn it?
The current solution is to just babysit the general population by essentially censoring information.
It seems to me history stays in replay until we eventually understand freedom is more important than whatever risks are attached.
> Imho the problem comes from the fact that corporations try to hard to be "democratic" about things and "please the majority".
Corporations aren't trying to please the majority. They don't care about the majority. Besides, the majority wants free speech.
Most Americans don't want jobs being sent to China, but the elite do. Which side do you think the corporations chose?
YouTube and the rest of social media are censoring because the WSJ, NYTimes, etc. have been pressuring them to. And the WSJ, NYTimes, and the rest of the traditional media don't represent the "majority"; they are the mouthpiece of the elite.
Think about it: for nearly 10 years social media was highly "pro-free speech". Then the WSJ, NYTimes, etc. ran hit pieces against social media and put pressure on them, and all of a sudden it's relentless censorship.
Maybe people should get their shit together and realize that true free speech includes allowing videos that seek to recruit people into despicable organizations to be available! Yeah, even Hitler had a right to say what he thought, and it's a good thing he had it, despite the consequences that ensued.
The problem that needs to be solved is how to educate people into not being lured into those organizations DESPITE having access to those materials... This kind of censorship is just as STUPID as banning drugs like heroin and cocaine (instead of just making them unavailable to children, or without a "license"), or the "war on drugs".
Imho the problem comes from the fact that corporations try too hard to be "democratic" about things and "please the majority". But this is not a good idea: sometimes a 99% majority is against freedom, and they are wrong, despite being the 99%. The majority should be opposed and freedom protected even when the cost is someone's blood. For me personally, there are these words from my native country's national anthem: "life in freedom or death [for all]"... and I will sure as hell fight, die, or kill for them.
>US soldier recruitment video and an ISIS soldier recruitment video
Somehow I doubt US recruitment videos feature Englishmen being decapitated as a job perk.
Exactly, but in a world where people want Facebook and others to have "truth-checking authorities", I don't expect things to get much better.
To me, if you want to regulate controversial opinions, you have to err strongly to the side of too-open.
Remember, before the Declaration of Independence our founding fathers were terrorists/rebels. I don't mean this as a snappy, hollow comparison. I'm saying that, fundamentally, you can't distinguish between a US soldier recruitment video and an ISIS soldier recruitment video without applying a moral context. How would an AI ever do this? And even if it could, whose moral retelling is the right one?
Better, in my mind, to stay out of the censorship game altogether and promote a forum that is inherently structured in a format that incentivizes accuracy over emotion.
They need to metamoderate: let people tag videos by content (nudity, violence, crime, death...), and as soon as a video missing a tag gets flagged, close the channel.
Users who enable viewing of certain tags can't complain then, and Google only needs to put in enough legalese when they enable that content.
They're already doing that to an extent with mature content, so there's that.
YouTube is balking at their own size. They're discovering what should have been obvious to anyone; the sheer amount of content entering their centralized system is impossible to moderate in any fair way. The only way they can manage is (A) prioritize quality moderation toward channels which are more popular, and (B) enforce the most bland, vanilla experience possible.
They need to moderate because they are centralized, and their revenue demands it. We, as a society, need to create a better option. Not just another YouTube, but a seamless decentralized solution.
Because YouTube doesn’t care about the videos; it cares about the advertisers. You can’t be a proponent of free speech (extremist propaganda) while trying to please advertisers. Also, with today’s political climate, people seem like they want anything that disagrees with them to be labeled as hate speech.
The whole point of the censorship is to stop Youtube from being a tool used for terrorist recruitment. If the terrorists can just check the box "all content" then it's useless.
> Why not create a setting that allows user to see YouTube as sanitized by their AI or all content?
Because it's about controlling narrative, controlling propaganda and giving the "media/news" space back to traditional media.
That's what people have been asking of sites like Reddit and HN for years now: give people the option to view the raw threads (uncensored) and the moderated ones (censored). But neither is interested or has indicated they will. Instead, on Reddit at least, there is more and more censorship.
> Advertisers can opt into certain bubble if they want, or opt out of certain content e.g. content deemed inappropriate by the AI?
Do you really think advertisers care? Do you really think corporate America cares? They don't have morals. China is a brutal dictatorship, and yet advertisers and corporations have no problem doing business with it.
It's simply a matter of control. Who gets to decide what you and I see. Do the masses get to decide for themselves and control what they see or do the small group of elites? Per usual, the elites won and they get to decide.
Why not create a setting that allows user to see YouTube as sanitized by their AI or all content?
Allow people to choose a content level, just like they choose a security level in browser settings.
1. Legal content. May include content that violates YouTube content policy, but is legal in USA, or the country of the viewer. Maximum freedom of speech and maximum ability to see content that you may find offending.
2. YouTube content policy met. Content that is legal and meets YouTube Content Policy.
3. Legal, Meets YouTube content policy, Meets a certain org's taste. Like when you can pick a charity that you can donate to when you shop on smile.amazon.com. You can select the org whose bubble you want to live in. ADL, Focus on Family, Skeptics etc. The org bans content and it only is banned for people who opt into that blacklist on youtube.
4. When users are not logged in, they get the AI-filtered list, but they can select the "all legal" or "all that meets content policy" filters even when logged out. All other bubbles are available to logged-in users only.
Advertisers can opt into certain bubble if they want, or opt out of certain content e.g. content deemed inappropriate by the AI?
How does that sound YouTube?
Don't the government security agencies want to know who is watching extremist content and who isn't interested in it? How would we know who the extremists are if they fall back to person-to-person, in-person communication?
The old internet is still there; you just have to not be too lazy to host your own content.
Yes, I could see how that classifies as "extremist material", but that's no reason to delete them...
IMHO the gradual increase of (self-)censorship in the popular Internet is worrying --- one of the most compelling things about the Internet as it existed was that, from the safety of your own home, you could see and experience things that would otherwise be impossible to access. Now it seems it's turned into a massively commercialised effort of "curating" content so that it doesn't offend anyone, and only results in more profits for advertisers.
I urge you to report this inhumane case and follow up with a "Show HN" ("I filed an ICC case, then THIS happened").
I am not an expert in international law, but you would still have to prove intent, I believe. It's hard to prove intent on an algorithm that very simply is incapable of understanding the significance of its actions.
Unlikely, but an interesting argument.
Google executives are overwhelmingly US citizens. The ICC has no jurisdiction over them.
I'm willing to bet that in their hundreds and hundreds of pages of terms and conditions there is a paragraph saying that by using their services you give up your right to sue Google for war crimes.
Since my understanding is that covering up a war crime is itself a war crime under Complicity doctrine, could Google executives get charged for this in The Hague?
I remember when I used to like - no, love - almost anything Google did.
That seems like such a long time ago. Since then my attitude has changed to being mostly hostile towards Google, with every such event.
Google should have never entered the "content game" and should have remained a neutral search and distribution (YouTube) platform. Once it went down the path of being a content company, it started "compromising" in all sorts of ways that were terrible for its users.
I wonder if the higher-ups have even noticed this change in attitude towards them, and if they did, then they've probably decided that making money is more important even if they become the Comcast of the internet (most hated company).
Unfortunately many of the original uploaders have since died in the war, and the deleted playlists were the only visible place the videos were accessible.
Have they checked with YouTube to see if the files are actually deleted?
Like just because their gateway won't give you access to it doesn't necessarily mean that the bits have been scrubbed on the back end.
Also: here's a project to archive this information.
> Google, Twitter and FB massively need to ramp up their customer service
This is not going to happen; the whole point of their businesses based on offering free massive online services is that they are dirt-cheap by being run mostly automatically.
No, the only way to fix the problem in those juggernauts, and protect the tiny individuals from getting caught and squashed in their wheels, is the mechanism that governments use to protect citizens from the worst effects of bureaucracy: having an ombudsman. A semi-independent service to receive complaints of severe abuse by the main service, and for which the primary goal is protecting users, not reducing costs.
In some sense, this is how their PR department already operates: they'll bring human attention and put in all the required effort to fix an unjust situation, in order to clean up the company's image. The difference is that today the unjust situation needs to become a scandal first, as you said, whereas an ombudsman would be required to examine all applications (either accepting or rejecting them) as part of their official mandate.
Once again, the only hope for customer service seems to be a (social) media shitstorm.
Seriously, Google, Twitter and FB massively need to ramp up their customer service and stop externalizing the costs of their lack of support onto society. And there are many "costs": people being actively harassed and intimidated, sometimes to the point that they are afraid to leave their house, due to hate speech or doxxing; a loss of historically relevant information, as in this case; people locked out of vital emails or their businesses (e.g. when their Gmail account gets closed due to copyright violations on YouTube)...
Mis-applying bad so-called “artificial intelligence” is still a prime example of natural stupidity.
Well almost everything we use these days belongs to a private organization. And we're fully subject to their whims.
Something is wrong about that.
They are non-technical people posting from a war zone.
Or were; some are dead now and have taken their videos with them.
If you use YouTube, you are subject to the whims of that private corporation, regardless of whether it's right or wrong.
They should find a way to host the content somewhere else.
If you are a rebel in a contested part of the world, submitting videos to YouTube takes literally two clicks on the cheapest Android phone, and then the entire world can watch them. Everything else requires at least a bit of technical expertise, which you might not have.
Why are people storing evidence on Youtube again?
Not blaming the victim, but at this point most Google services have not shown themselves to be reliable, especially if you need some kind of thinking human behind a decision.
I feel like YouTube uses its monopoly to create a walled garden focused on (in their own words) advertiser-friendly content.
The thing is, it makes perfect sense from their side - they will make people angry, but why would they bother if those people can't go anywhere else?
I'm starting to feel that a competitor providing the same quality of service while allowing all kinds of videos has a chance to succeed. It's OK to host children's videos, porn and Syrian war documentation side by side, as long as you can filter: maybe have some sort of "curiosity" slider with child-friendly content on one side, YouTube-like content in the middle and all content on the other side, plus some category toggles. If you're unhappy with the current selection, just take a few minutes of your time and change your preferences.
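A minimal sketch of how such a "curiosity" slider might work, assuming each video carries a coarse rating level (0 = kids-safe, 1 = mainstream, 2 = unrestricted) and a set of category tags; all names and the rating scale are illustrative, not any real platform's API:

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    rating: int                       # 0 = kids-safe, 1 = mainstream, 2 = unrestricted
    categories: set = field(default_factory=set)

def visible(videos, slider, blocked=frozenset()):
    """Keep videos at or below the slider level, minus toggled-off categories."""
    return [v.title for v in videos
            if v.rating <= slider and not (v.categories & blocked)]
```

Sliding to the middle would hide only the unrestricted tier, while the category toggles cut across all tiers, so a user could allow graphic war documentation but still hide, say, gore-for-entertainment channels.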
Given that all of the videos happen to be anti-ISIS... and YouTube happens to be owned by an evil empire in bed with American military industry which created ISIS... the AI must have figured out that the videos could be a threat to its masters.
We will have to get used to this - people hiding behind their AI's skirts saying 'Wasn't me - she did it'.
What did they train the AI on to deem something 'extremist'?
Should we get to see the training data used and labels?
Or is this the modern day equivalent of credit score algo, something that can have huge impact on lives, but you are not allowed to know what it is.
This is bad.
Yeah wow, an audience of billions having instant access to your content at the click of a button. So terrible for content creators. Most of these content creators wouldn't even exist without Youtube, they'd be working in a cubicle somewhere forwarding memos.
YouTube is a really horrible service for content creators. For this type of content, you're probably best off with LiveLeak (which, incidentally, seems to be a much better source of breaking news than YouTube these days). Ideally, we'd all switch to LBRY or some sort of IPFS video distribution, but that will take time.
Yeah, forget about beating Go and StarCraft 2 top players. How about making the takedown of YouTube videos actually fair for a change?
Screw YouTube's automation across the board. It's horrendous and lazy.
Google seems more and more like a regular, almost-evil corporation.
I miss the days of "don't be evil".
War crime evidence can also be extremist material. It is often repackaged as propaganda to rile up new troops.
Give evidence to the courts or police. Don't upload it to a video entertainment site and expect it to stay up, despite skirting their rules.
As I understand it, this is the result of Google itself having quite strong political opinions, at least recently. They have profiled themselves as leftist/progressive... their software just enforces this.
Very related to this article about Facebook.
Corporations control what information is passed to people and create their own version of reality by blocking what they don't agree with.
I know it's AI, but it seems that Google's appeal process simply agrees with the AI's decision.
People should read Noam Chomsky's book Manufacturing Consent; here's an interview about it from 1992.
It seems that we really need to find a new distributed/decentralised censorship-resistant way to distribute videos.
YouTube does not seem to me to be an appropriate medium for "war crimes evidence". Evidence needs documented provenance, chain-of-custody, storage integrity, affidavits, etc etc. Why does this evidence need a high-bandwidth publicly accessible and searchable interface? For what purpose?
To be honest, if you have evidence of a war crime, I hope your plan to seek justice doesn't depend on Youtube.
Douglas Adams's 'peril-sensitive sunglasses' are nearly here.
In case it's not already apparent, there's a business opportunity here for someone to automate "set up an S3 bucket and host videos in it" as an app that uses an API key, so that you simply provide the key to the app and it manages your video collection, gives you a UX to it, and charges you a fee per month.
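The "bring your own S3 key" idea above could be sketched roughly like this with boto3 (the AWS SDK for Python); the bucket, titles and helper names are purely illustrative, and the SDK is imported lazily so the pure key helper works without it installed:

```python
import re

def video_key(title):
    """Turn a human-readable title into a safe S3 object key under videos/."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return "videos/{}.mp4".format(slug)

def upload_video(bucket, path, title, expires=3600):
    """Upload a local file to the user's own bucket and return a time-limited link."""
    import boto3  # AWS SDK; imported here so video_key() has no dependency on it
    s3 = boto3.client("s3")
    key = video_key(title)
    s3.upload_file(path, bucket, key)
    # A presigned URL lets viewers fetch the file without AWS credentials.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )
```

The point of the design is that the user owns the bucket and pays AWS directly; the app only holds a key the user can revoke, so there is no platform that can delete the videos out from under them.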
Often there is no difference between war crime evidence and war crime glorification that machine learning could discern. Exactly the same content could be interpreted as "look at us do great things in defense of our noble ideals!" and "look at these monsters do horrific things for no justifiable reason!".
The difference is in the audience's mindset - which is only partially influenced by the uploader's intentions, and partially by how other pages and channels link to the video and present it, and partially by historical context (the same content can acquire a different interpretation five years down the road). Machine learning cannot be expected to emulate that.
I am very concerned about Google using AI to filter hoaxes from search results. Government testing syphilis on black population or selling drugs to fund terrorism? That must clearly be a hoax, right?
One of the most interesting developments in AI will be watching how we respond to human rationality detached from human morality. Programs that optimize for practical outcomes are going to come up with a whole host of solutions that we consider abhorrent, not least because the mere notion that that solution is a practical one riles our sensibilities.
The revolution will not be televised.
I find this interesting in comparison with the Google API that detects toxic comments. I suppose we'll be seeing the same sort of situation in comment sections (though less irritating).
Not an obligation, but their mission statement is
"To organize the world’s information and make it universally accessible and useful."
Guess they need to change it to "information that we and our advertisers agree with."
Yes, I know they are different companies under Alphabet, but it doesn't matter. Google has become a monster: too big, too powerful.
An organiziation should not be free of criticism simply because it does not have an obligation to do or not do something.
>To be fair, YouTube is under no obligation to some greater good;
Eeeeeeeeeeh....I don't know about that.
Lots of corporations today target "owning" a certain aspect of humanity. Facebook "owns social", Google "owns search", and LinkedIn is having a jolly good swing at "owning recruitment". Youtube wanted to "own video" and by and large it has succeeded. I'm not sure they get to have that position consequence free though.
I'm increasingly of the opinion that companies that manage to pin an entire market implicitly take on a social responsibility, and lots of them are not shouldering it appropriately.
You are of course right in theory, but it's not good enough to let that justify evidence of war crimes getting lost.
If you film people getting shot at in a demonstration and want to get the word out, chances are you use a popular social network. You might not have any further knowledge, or not be able to (imprisoned, fleeing, dead, like the majority of Syrians) put the video elsewhere.
Yes, but the more people realize what an awful platform YouTube is for them to keep their videos, the better.
And we are under no obligation to not hurt their PR over this.
To be fair, YouTube is under no obligation to some greater good; it's just a video hosting service. Expecting it to "preserve footage" and any footage at that, is a strange expectation.
Torrent-based YouTube alternative when? I think the technology is ready to move all of this content to a distributed system where it cannot be censored.
So does YouTube want to look like an ISIS supporter? Or at least, that it doesn't approve of criticizing ISIS?
Is it possible to host such videos on archive.org? Is that a valid option?
Hope YT did a soft-delete on those files...
We detached this subthread from https://news.ycombinator.com/item?id=14998738 and marked it off-topic.
"I really hate ads on YouTube."
Just use youtube-dl and you'll never see any ads.
I have no idea what you're saying. YouTube was "taken over"? By whom?
If this news was what changed your opinion, you were simply uninformed; nothing has actually changed. YouTube had done the same more than two years ago, and that's very unlikely to be the first case.
And to think they had me convinced that this was not going to happen for a few decades.
I think YouTube went down pretty fast and without a fight. The ideological takeover of Facebook and Twitter raged on for a few years; YouTube was taken over literally overnight. I remember being appreciative of YouTube just a few days back.
Guess it's time to cancel my $15 YouTube Red family membership. Ugh, I really hate ads on YouTube, and I was happy to give my $15 month over month. But I can't fund YouTube anymore given what they are doing. $15 to YouTube, $10 to Netflix, $10 to Amazon: with $35 a month, I can sponsor a ton of content on Patreon that I like. My subscription list on YouTube is not 35 people long, so I think it would work out.
Never ever I thought I would type these words... break up Google and Facebook and Amazon.
If you think they're making a bad business decision
No, that's not what people arguing for a law think, and if you don't understand that, you can't make an effective argument rebutting their position.
The implied position is that Google is doing something bad for society, even if it's good for business. You may disagree that this is bad for society, or that even if it is, it's still Google's prerogative, but you should at least understand the argument if you want to have a meaningful discussion.
And the government has the right to break up monopolies.
I now fully support breaking up Google, Amazon, Facebook and Microsoft; maybe Apple too.
Executives at publicly traded companies have no right to enforce their individual political views as company policy. They are not privately held companies; they are public companies held to higher standards of equal treatment for all.
What if Bic and Mead said you can't write an opinion they don't like using their pens and notebooks?
What if US Postal Service said you can't send a snail mail if it contains references to UPS or FedEx?
Yes, absolutely! We solved the problem!
No really, seriously, this idea is so flawed.
* You don't like the government? Move or start your own government!
* You don't like your house? Rebuild it or buy another house!
* You don't like the way Earth is being run? Move or RIP!
* You don't like the way the school is being run? Move to another town or build your own school!
* You don't like the fact oil is so expensive? Drill your own oil!
* You don't like this comment? Flag it or deal with it or build your own HN.
* You don't like the way anything is done or served? DIY all the way.
* You don't like your local Starbucks? Run one or go to Dunkin' Donuts.
* You don't like the way the hospital is run? Run one or don't go.
Something more extreme?
* You don't like your parents? Disown your relationship with your parents or make new parents!
* You don't like your child? Disown your child and make one again.
Why should we dedicate our lives to building alternatives every time we don't like something? Because we have better things to do. We are allowed to complain, we have every right to dislike a service, and we don't need to lower our expectations.
Users have wanted real-time PvP and exchange/trade in Pokémon Go, but after a year they are still not available. So why don't we build a new Pokémon Go? Because we'd need to rent servers, write the code, maintain the code, and strike a deal with the Pokémon rights owners (which is impossible for a small company). So we bend and yield to Google, because we have no better alternative.
This incident teaches us a few things:
1. Please do not think an AI/algorithm is smart enough to replace an actual human. Humans carry biases and are sloppy too, but algorithms are just as biased, if not more biased and more sloppy (especially at dealing with "abnormality" and edge cases), than a human being will ever be. As for the hype that "AI" will take over: we are nowhere near that.
2. YT is too popular to have to listen. Losing a few thousand users means nothing to YT. We will forget about this in five hours and go back to our own things.
"Go start your own business if you disagree" seems like a middlebrow dismissal unless you're actually offering to fund the endeavor.
Agree. If Google just wants Youtube to be an entertainment funnel, it has every right to do so.
If the service you describe existed, it wouldn't survive long. People would swarm it with videos rejected from YouTube, which you can foresee would be problematic and messy; in the end it might just turn into a 4chan where you can upload videos. Hardly an attractive business.
That addresses one issue posed by AI bans, but not the other issue I mention in the second paragraph.
> should be required by law
If your videos don't pass the algorithm, post them somewhere else rather than reaching for the government hammer.
Youtube/Google has every right to run their business of posting or denying video content the way they see fit without justifying it to you, free user of their service.
If you think they're making a bad business decision and that there's a need for a video service that gives great explanations when they deny your videos, start such a service.
This was to be expected. All history books are written this way: they are government propaganda, not documentation of the truth. History at school is nothing but learning government propaganda.
Thank you WSJ, NYTimes and the traditional media for pressuring youtube, facebook, reddit and social media to censor.
People aren't aware that for the past few years, traditional media and social media have been battling behind the scenes over content, narrative and censorship. It was a major war that the public was simply unaware of. Suffice it to say, traditional media won.
It is amazing how a select group of news organizations and their editors and journalists can use their bully pulpit to intimidate certain industries.
"Nationalize Google!" Okay, no, that would be turning the knob to 11.
I upvoted you because I was like, "Yeah, maybe some regulation might be good." Then I thought about who would be doing the regulating and now I'm much less enthusiastic. :-/
Still, it does seem like there should be a bit more, uh, community input into how these vast silos are administered.
Every company is regulated. You have to be more specific than that.
Perhaps it's time for Google and YouTube to be regulated.
Google needs to be broken up. They have too much power.
They don't care; better to lose legit content than advertising dollars.
YouTube is starting to die. People, move to different platforms!
Google, you're really starting to suck more than my socks!
This is a side effect of Google employing the ADL, who are only interested in political cleansing. "AI" lets them have plausible deniability.
What are you talking about? They are deleting the entire accounts of people who were filming their cities/districts being bombed by the Russians, Assad, or the US/Coalition (where there may not be any direct violence to any particular person visible).
Who will prosecute whom? Primary-source historical material is being removed wholesale by some shitty "AI". The record of recent history is being erased. Researchers who want to piece together an account of that history will have a harder job; they will not subpoena Google for material they don't even know Google has. People whose lives were destroyed by a dictator will see YouTube erasing evidence of what happened, often while leaving the regime's propaganda channels untouched. It's a disgrace.
I actually came here today to try again to post about this issue, after deleting my Google/YouTube account, because I wholeheartedly disagree with this whole fiasco that's been going on for the last month or so. So I'm glad it's finally being discussed.
Can't any prosecutor gain access to those videos via subpoena anyways?
YouTube is not a reliable video host, but that's okay. It's a company.
Fortunately, these videos don't really rely on people finding them through algorithmic recommendations, as they are merely evidence.
I don't see a problem and completely understand why YouTube (especially as it's getting as non-offensive as it can) doesn't want to show war crimes.