
View Full Version : Made a bad game? Just call copyright strikes on any bad reviews!


Aldurin
10-21-2013, 08:28 AM
Escapist Article (http://www.escapistmagazine.com/news/view/128860-Day-One-Garrys-Incident-Devs-Accused-of-Censoring-Bad-Review)

http://www.youtube.com/watch?v=QfgoDDh4kE0

Long story short, Wild Games Studio made Day One: Garry's Incident and managed to distribute it after a failed fraudulent Kickstarter (the CEO tried to inflate the pledge total to fake popularity) and after getting through Steam Greenlight by promising Steam keys to people who voted for it. People discovered that it was shit, TotalBiscuit was one of the people trashing it, and the studio issued a copyright strike to shut his video down, despite ignoring other reviews and LP footage and despite there being blatant documentation floating around of them giving the OK for people to upload videos of their content.

The big deal? Youtube shuts down channels that accumulate three copyright strikes, and strikes can be assigned automatically, even bypassing agreements with networks meant to prevent exactly this sort of problem. Removing the taint of a strike is more or less impossible, with only a few channels like TheSw1tcher recovering through the ungodly clout of the top networks, so suppressing a review this way is a very vicious abuse: it's a nail in a coffin that only has three nail holes.

It also suppresses internet speech and hurts consumers, since the hivemind-esque voice that promotes good products and warns about bad ones is an essential part of encouraging proper business practice and competition. Wild Games Studio is a company of entertainment hacks trying to silence the loudest voices warning the world that their product is just their used toilet paper.

I'm just disgusted, and I'm making a thread about it specifically to facilitate the spread of this news so that it becomes as common knowledge as EA being a garbage publisher. Don't just read this thread; never shut up about it elsewhere either, so that this kind of horrendous business practice gets what it deserves.

Grandmaster_Skweeb
10-21-2013, 02:52 PM
Read about this the other night. That CEO pulled the PR equivalent of curing a headache with a shotgun. Won't be surprised if his career is effectively titsup for good at this point.

Loyal
10-21-2013, 04:15 PM
How many times does this shit have to happen before Youtube finally fixes their broken copyright system?

Aerozord
10-21-2013, 04:49 PM
How many times does this shit have to happen before Youtube finally fixes their broken copyright system?

It's not really Youtube. If Youtube could get away with letting people post anything they wanted, they'd make a lot more money. It's the copyright holders that are so anal about it. That's why certain IPs are flagged more harshly than others.

Youtube's real issue is that this is an automated system with no human oversight. Sadly, that isn't really possible to get rid of, since far too many videos are being uploaded for a human being to verify every flag.

Loyal
10-21-2013, 04:54 PM
Youtube's real issue is that this is an automated system with no human oversight.

This is, in fact, the exact thing I was complaining about. An automated system that doesn't work has no business being used.

Aerozord
10-21-2013, 05:00 PM
This is, in fact, the exact thing I was complaining about. An automated system that doesn't work has no business being used.

And what would they use instead? Hundreds, if not thousands, of trained paralegals to authenticate every suspected violation?

I suppose there might be an alternative automated system, but nothing practical comes to mind.

Sithdarth
10-21-2013, 05:56 PM
We used to pay women to connect calls not more than 3-4 decades ago; I'd wager more than a million at the peak (there were still over 300,000 operators working in 1998). So it's probably physically possible, just expensive. Content ID (which I believe is the copyright system in question) has only flagged a total of a bit over 200 million things since it started (though I'm not sure how far over). That doesn't seem like too many claims a day to sift through given how long it's been running. Also, I'm sure it should be possible to extract just how close a match the content is to the reference. Anything that scores above a certain value is automatically processed, anything below a certain cutoff is ignored, and anything close to that cutoff is processed and reviewed by a human before any action is taken.

Now that I think about it, something like that is probably fairly close to what already happens. There are probably only a few minor changes to make (a rough sketch of the triage follows the list):

1) Fine-tune the cutoff for automatic processing.
2) Prevent automatic action in cases that are within a certain range of that cutoff.
3) Send the ambiguous cases to a human reviewer. (And I bet in most cases a simple read of the description, without watching the video, will let the human make the proper decision.)
4) Possibly have a tiered system where lower-skilled humans do quick reviews and decide whether the claim gets passed to someone with more experience.
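Purely as an illustration (the numbers and names below are made up, not anything YouTube actually exposes), the triage in points 1-3 could be as simple as this little Python sketch:

# Rough sketch of the threshold triage described in points 1-3 above.
# The cutoff values and function name are made up for illustration only.

AUTO_ACTION_CUTOFF = 0.90   # at or above this: process the claim automatically
IGNORE_CUTOFF = 0.40        # at or below this: ignore the match entirely
# anything in between is ambiguous and goes to a human reviewer

def triage(match_score):
    """Route a Content ID-style match score (0.0-1.0) to the right queue."""
    if match_score >= AUTO_ACTION_CUTOFF:
        return "process automatically"
    if match_score <= IGNORE_CUTOFF:
        return "ignore"
    return "send to human review"

The hard part obviously isn't the if-statements, it's picking the cutoffs and staffing the human review queue.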

None of that is particularly difficult, but it does represent an investment of time and resources that YouTube isn't willing to make until someone makes them. Heck, even spending some time on creating and training a neural network (a completely software neural network) rather than using a static algorithm would probably yield better results. After all, software neural networks are amazing at pattern recognition when properly trained. (Although creating a data set of sufficient size to train the network, and a second data set of sufficient size to test it, can be time consuming. Also, such a system should still be backed by humans when it finds an ambiguous case.)