
The Algorithmic Copyright Cops: Streaming Video's Robotic Overlords

samzenpus posted about 2 years ago | from the you-can't-show-that dept.

Censorship 194

thomst writes "Geeta Dayal of Wired's Threat Level blog posts an interesting report about bot-mediated automatic takedowns of streaming video. He mentions the interruption of Michelle Obama's speech at the DNC, and the blocking of NASA's coverage of Mars rover Curiosity's landing by a Scripps News Service bot, but the story really drills down on the abrupt disappearance of the Hugo Award's live stream of Neil Gaiman's acceptance speech for his Doctor Who script. (Apparently the trigger was a brief clip from the Doctor Who episode itself, despite the fact that it was clearly a case of fair use.) Dayal points the finger at Vobile, whose content-blocking technology was used by Ustream, which hosted the derailed coverage of the Hugos."


algorithmic frosty stream (-1, Offtopic)

Anonymous Coward | about 2 years ago | (#41255255)

in ya face

He's a she (5, Insightful)

Anonymous Coward | about 2 years ago | (#41255257)

Geeta Dayal is a she. Just sayin'.

VOmit + BILE (-1, Offtopic)

Anonymous Coward | about 2 years ago | (#41255259)

VO BILE
disgusting, ain't it?!

Geeta Dayal is female (5, Informative)

Anonymous Coward | about 2 years ago | (#41255267)

Might want to double check your pronouns.

Re:Geeta Dayal is female (-1)

Anonymous Coward | about 2 years ago | (#41256129)

Might want to double check your pronouns.

Funny that neither the submitter nor editors caught this, considering Indians practically 0wn the IT software market over here in the US.

Re:Geeta Dayal is female (2, Interesting)

formfeed (703859) | about 2 years ago | (#41256289)

female or legitimate female?

Re:Geeta Dayal is female (0, Insightful)

Anonymous Coward | about 2 years ago | (#41256675)

hey now.. in today's radiant socialist future, ANYONE can identify with female if they choose.. it's just when they identify with male that they must be considered wannabe oppressors/terrorists/misogynists..

'Fair Use' is not sufficiently well defined (4, Insightful)

djnanite (1979686) | about 2 years ago | (#41255297)

We have repeated cases of people going to court to dispute 'fair use', which shows that it is not well defined enough for humans to get right, let alone automated bots.

Lay down specific rules for 'fair use' and then you can write an algorithm to respect those rules.

(Just don't let RIAA/MPAA dictate the rules.)
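For illustration only: the "specific rules" would presumably start from the four statutory factors in 17 U.S.C. §107, and the crude, entirely hypothetical scoring below shows what encoding them might look like. It also shows the problem: courts weigh these factors qualitatively, case by case, which is exactly the context a bot cannot see.

```python
# Hypothetical sketch: the four fair-use factors of 17 U.S.C. §107 reduced
# to a crude score. Real courts do NOT reduce this to arithmetic.
from dataclasses import dataclass

@dataclass
class Use:
    transformative: bool   # factor 1: purpose and character of the use
    work_is_factual: bool  # factor 2: nature of the copyrighted work
    portion_used: float    # factor 3: amount used, 0.0-1.0
    harms_market: bool     # factor 4: effect on the market for the original

def crude_fair_use_score(u: Use) -> int:
    score = 0
    score += 1 if u.transformative else -1
    score += 1 if u.work_is_factual else -1
    score += 1 if u.portion_used < 0.1 else -1
    score += 1 if not u.harms_market else -1
    return score  # positive leans fair, negative leans infringing

# A brief, transformative clip that doesn't hurt the market leans fair:
clip = Use(transformative=True, work_is_factual=False,
           portion_used=0.02, harms_market=False)
```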

Re:'Fair Use' is not sufficiently well defined (5, Interesting)

Anonymous Coward | about 2 years ago | (#41255363)

But the RIAA/MPAA has already dictated the terms of fair use: Any use that brings us revenue is fair, and all others are not :)

Reason I think we should stuff a hot poker up their asses and make copyright a flat 18 years for individuals and 5 years for corporations, with no extensions and a one-year loss in term for each transferral of copyright (be it selling the copyright or merging/wholly owning the company).

That would solve the current issues with it and provide revenue over the primary useful life of the material. It would cut into residuals, sadly, but result in more long-term innovation, since not producing new material would mean bankruptcy rather than an endless stream of relicensing/remaking old material. If all actors/actresses got flat pay (same as 'staff'), however, it'd be no different than any modern non-IP-related job.

Re:'Fair Use' is not sufficiently well defined (5, Insightful)

cheesecake23 (1110663) | about 2 years ago | (#41255719)

But the RIAA/MPAA has already dictated the terms of fair use: Any use that brings us revenue is fair, and all others are not :)

Funny, but unfortunately the RIAA/MPAA aren't that clever. If they were, they wouldn't issue takedown notices against all the free advertising they get from fan videos on YouTube of movie scenes and teens dancing to pop songs, which would be deemed fair use in any sane legal universe.

Also, definitely not fair use but still remarkable: it's astonishing how they happily piss money away in their idiotic war on pirated films and music - imagine what this word-of-mouth marketing would cost them if they actually had to PAY for it.

Re:'Fair Use' is not sufficiently well defined (5, Informative)

symbolset (646467) | about 2 years ago | (#41256131)

Actually, Viacom sued Google for distributing on YouTube content uploaded by Viacom's own employees, both from their offices and their homes.

Re:'Fair Use' is not sufficiently well defined (1, Interesting)

tqk (413719) | about 2 years ago | (#41256299)

Reason I think we should stuff a hot poker up their asses and make copyright a flat 18 years for individuals and 5 years for corporations ...

Nope, too generous. They're both flawed concepts, and too easily gamed. Zero years for both. Compete on your merits, damnit! Don't expect us to help you with legislative crutches. These are the rules the rest of us are expected to go by. Welcome to reality. Suck it up.

Re:'Fair Use' is not sufficiently well defined (4, Insightful)

FirephoxRising (2033058) | about 2 years ago | (#41255373)

I agree; they shouldn't be allowed to run these bots until they have a perfect algorithm, which will be never. These high-profile examples should be enough for the EFF to do something. The bots should at the very least have to "request" a violation check, not perform an immediate take-down.

Re:'Fair Use' is not sufficiently well defined (4, Insightful)

KingMotley (944240) | about 2 years ago | (#41255973)

They should be allowed to run them; however, the consequence of being wrong should be a large enough deterrent that they won't WANT to unless the bots are extremely accurate.

Re:'Fair Use' is not sufficiently well defined (5, Insightful)

Frosty Piss (770223) | about 2 years ago | (#41255553)

We have repeated cases of people going to court to dispute 'fair use', which shows that it is not well defined enough for humans to get right...

Pretty much bullshit, "fair use" is indeed well defined.

But the bigger issue *is not* the "bots", it's the media outlets that accept "take down notices" from bots.

In other words, I'm questioning the legality of machine generated "take downs" that have no human interaction. I'm suggesting that it is not strictly legal and media outlets could certainly make an argument that such automated "take downs" constitute an unfair burden and so are invalid.

Seriously.

Lenz v. Universal (4, Informative)

tepples (727027) | about 2 years ago | (#41255881)

media outlets could certainly make an argument that such automated "take downs" constitute an unfair burden and so are invalid.

And the legal theory on this could develop from Lenz v. Universal: a copyright owner's representative must consider fair use and other defenses in good faith before filing a notice of claimed infringement under OCILLA.

Re:'Fair Use' is not sufficiently well defined (-1)

Anonymous Coward | about 2 years ago | (#41256179)

So do away with fair use.

Want to use something copyrighted? Pay the rights holder what is due for the amount of the copyrighted content to be used.

Again, just do away with fair use.

Re:'Fair Use' is not sufficiently well defined (2)

epyT-R (613989) | about 2 years ago | (#41256699)

noo thanks.. I don't want to live in a police state so people can prop up false scarcity.

Re:'Fair Use' is not sufficiently well defined (3, Informative)

Anonymous Coward | about 2 years ago | (#41255671)

A lot of people (especially online) have a big misunderstanding of what "fair use" is.
The misconception is that there's a set of rights that we have to use copyrighted material, and when the copyright holder doesn't respect those rights he's just being a jerk.
Actually, "fair use" is just a defense you can use when sued for copyright infringement.
And it's almost entirely up to the discretion of the court to decide whether or not you'll get off with that defense.
There's no actual set of rules, and that's the way the law is written.

Re:'Fair Use' is not sufficiently well defined (4, Informative)

mark-t (151149) | about 2 years ago | (#41256281)

Actually, "fair use" is just a defense you can use when sued for copyright infringement.

In Canada, the equivalent concept is not a defense to infringement; it creates an exemption from infringement in the first place. So, if your usage was fair, as determined by law, then your defense, should you happen to get sued, would simply be that you didn't infringe copyright in the first place.

Re:'Fair Use' is not sufficiently well defined (5, Informative)

jthill (303417) | about 2 years ago | (#41256521)

PP is, in every relevant way, false.

Fair use is, by statutory declaration, not copyright infringement at all. Copyright holders have no authority at all to forbid any fair use.

PP might as well have said a lot of people have a big misunderstanding of what "innocence" is, that "innocence" is just a defense you can use.

The criteria for fair use are laid out in statute law and have decades of case law to back them. Courts have the same discretion in finding the boundaries of fair use as they have in finding the boundaries of any other law, and the same responsibility. There's nothing at all remarkable about that discretion, it's why they're called "Judge".

Re:'Fair Use' is not sufficiently well defined (0)

Anonymous Coward | about 2 years ago | (#41255829)

The US definition of fair use is the best in the world with the exception of Israel. Israeli law copies US law and then adds it as a right. Currently, fair use is only a defense in the US. The solution, in my opinion, is to do what Israeli law has done. Continuing to use it only as a defense in this age is sort of ridiculous.

Wrong! (0)

Anonymous Coward | about 2 years ago | (#41256625)

No algorithm is going to be able to tell the difference in context, which is what defines fair use. I can take a single clip and weave it into a music mash-up or a parody. The parody is fair use, but the mash-up requires permission. How's the silly computer programmer going to accurately describe the context in a manner consistent with the distinctions enacted into law and adjudicated (keyword here is) 'interpreted' by the courts?

If the video of Michelle Obama's speech was taken down by Al the Gorithm, then Google ought to be ashamed, unless they were paid by the Republicans. In which case they both should be sued for interfering with the Fourth Estate.

The only way this situation will ever get better, if it's truly as described in the fine article, is for the companies that depend on it to be successfully sued. Then they'll implement a seeing-eye person to guide Al the Gorithm so it won't embarrass them or cost them money.

Remember: Fair use is like p()rn. A justice just knows it when he sees it because it's successfully argued as such in front of him in his courtroom. Algorithms are like moles, they don't see anything, they just keep digging til they find a grub. Then they eat it and start digging again.

Re:'Fair Use' is not sufficiently well defined (4, Interesting)

Mr. Shotgun (832121) | about 2 years ago | (#41256679)

Unfortunately, the examples from the summary are not fair-use cases; they're more like original producers vs. hangers-on. The content publishers are using bots without checking the results. They need to have some guy checking the flags and applying sanity testing to verify whether each flag is correct. I mean, come on: NASA vs. some newspaper in Cincinnati, who in the fuck is more likely to have produced footage from the Curiosity rover on Mars? Or DNC coverage: who has the copyright, the DNC or a news organization rebroadcasting what the DNC made? Some people accept what a program says as gospel truth, which leads to fuckups like this content flagging and Knight Capital. Computers are tools, not overlords, as someone else said.
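The "some guy checking the flags" idea amounts to a review queue sitting between the matcher and the takedown switch. A minimal sketch, with all names hypothetical and no resemblance to any vendor's actual API:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class MatchFlag:
    stream_id: str
    claimant: str   # who the bot says owns the matched content
    uploader: str   # who is actually broadcasting

@dataclass
class ReviewQueue:
    pending: List[MatchFlag] = field(default_factory=list)

    def flag(self, match: MatchFlag) -> None:
        # The bot only *requests* a violation check; nothing is blocked yet.
        self.pending.append(match)

    def review(self, approve: Callable[[MatchFlag], bool]) -> List[MatchFlag]:
        # A human reviewer decides which flags actually become takedowns.
        confirmed = [m for m in self.pending if approve(m)]
        self.pending.clear()
        return confirmed

queue = ReviewQueue()
queue.flag(MatchFlag("curiosity-landing", claimant="Scripps", uploader="NASA"))

# Sanity test in the spirit of the comment: a claim against the obvious
# original producer fails human review, so no takedown is issued.
takedowns = queue.review(lambda m: m.uploader != "NASA")
```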

The Solution (1)

Anonymous Coward | about 2 years ago | (#41255307)

Sue them into oblivion when they screw up.

Re:The Solution (1, Troll)

nurb432 (527695) | about 2 years ago | (#41255995)

No. The solution involves C4

Re:The Solution (0)

Anonymous Coward | about 2 years ago | (#41256243)

is that oblivion before or after the third party mods that fix all the game-breaking issues?

Re:The Solution (3, Interesting)

mark-t (151149) | about 2 years ago | (#41256301)

The problem could also be partially solved by simply instituting legal fines on corporations that falsely accuse somebody of infringing copyright. There'd be no particular benefit to anyone who was wrongly accused, but if the fines were heavy enough, there could be plenty of disincentive for companies to do that to people in the first place.

Outrage!??? (0)

Anonymous Coward | about 2 years ago | (#41255319)

Where's the public outrage over this? It's just "uhuhuh yeah, them bots aren't working properly yet uhuhuh". Why the F do these 'bots' have the power to block this stuff in the first place? Is everyone so beaten to death by the Copyright Industry that this is all acceptable collateral damage without need for immediate and harsh punishment?

Re:Outrage!??? (1)

kamapuaa (555446) | about 2 years ago | (#41255415)

Well, right now the #1 Google search result for "youtube" is the Michelle Obama story, so there is some outrage. But realistically, the video is perfectly watchable now, a couple of days later. The video was on TV and was probably available from numerous other online sources, so it's considered an accident that didn't really affect anybody. Who really cares if one particular video stream goes down temporarily, for no malicious purpose?

And with the UStream video, perhaps it was seen as karmic vengeance for the committee passing up Community's "Remedial Chaos Theory."

Re:Outrage!??? (4, Insightful)

fustakrakich (1673220) | about 2 years ago | (#41255487)

Make no mistake, takedowns are always malicious, by their very nature. And the law that permits/demands it is even more so. I still hold on to the hope that someday our communications systems (internet, telephone, broadcast, etc) become robust enough to make all censorship impossible. Must destroy central control. That is our obligation.

Re:Outrage!??? (0, Troll)

kamapuaa (555446) | about 2 years ago | (#41255779)

Why?

Takedowns have a legitimate purpose. As an extreme example, what if it's child porn? What if it's a bootleg of your favorite movie that just came out on DVD?

It already is robust enough to make censorship impossible. People choose not to implement this robustness, because of genuine concerns that you may not totally agree with but at least have a logical argument supporting them.

Re:Outrage!??? (1)

Anonymous Coward | about 2 years ago | (#41255867)

Even if its child porn, they should first prove that they own the video before its taken down.

Re:Outrage!??? (2)

Skapare (16644) | about 2 years ago | (#41255955)

The people who think THESE things are so important (and there should be a LOT of such people) need to do EVERYTHING they can to be sure the system works ABSOLUTELY CORRECTLY from here on out. Otherwise the internet WILL work around the flaw of these misprogrammed incompetent bots, and then their goals of blocking things like child porn will not be able to succeed. This is the dire warning they need to heed, and join in the effort to fix the seriously broken copyright enforcement system. ANY one of them that does not shout out against the RIAA and MPAA and others that are ruining things loses any right to expect the changes in the internet to consider their needs.

Re:Outrage!??? (2, Insightful)

Anonymous Coward | about 2 years ago | (#41256053)

Takedowns have a legitimate purpose.

No, they seem to be abused time and time again. Go through the courts if you have an issue. You can't take shortcuts.

As an extreme example, what if it's child porn?

I like how people mention this as if it's self-evident. Honestly, not everyone has an irrational, insatiable desire to go after people who look at images or websites hosting them; some people would rather the actual perpetrators get caught.

But what does this have to do with takedown notices?

What if it's a bootleg of your favorite movie that just came out on DVD?

Why did you even mention this? Why would I care? Incidentally, ask a judge to order the content removed.

Re:Outrage!??? (2)

fustakrakich (1673220) | about 2 years ago | (#41256505)

People choose not to implement this robustness...

And that is exactly what we need to circumvent... We have to remove control from these people who think they know best. We have to create the proverbial 'dumb pipe' that remains transparent to all content, no matter how offensive one may find it. You don't control kiddie porn by censorship; you do it by treating the desire. Allowing people to have normal, healthy sex would be a good first step. Censorship has the opposite effect. Sexual depravity is usually a consequence of the puritan rules created principally by powerful religious groups and their subservient governments. Sexual deprivation will make people crazy, literally, and the only result can be sexual perversion. Censorship is always evil and against our better interests. Takedowns have no legitimate purpose of any kind. There is no benefit to society; only its elites, with their mad desire for domination, can benefit.

As for the movies... fuck them. That discussion has already been hashed out.

Re:Outrage!??? (1)

Anonymous Coward | about 2 years ago | (#41256507)

How is child porn remotely relevant to copyright law? If you think it is then you are suggesting that creators of child porn should be able to enforce their copyright with takedown notices. I don't think that's very logical.

Re:Outrage!??? (2)

1u3hr (530656) | about 2 years ago | (#41256673)

Takedowns have a legitimate purpose. As an extreme example, what if it's child porn? What if it's a bootleg of your favorite movie that just came out on DVD?

The first case is nothing about copyright at all. It shouldn't be the same mechanism. Only a complete idiot would even try to put child porn on the open Internet now. It'd be like walking into an airport with a bomb vest. To suggest that it's somehow in the same category as streaming a Hollywood movie is really stupid and inflammatory, though I'm sure politicians and media companies would do so without blinking.

In any event, both would clearly be criminal acts, so they should not be "taken down" by a bot; the relevant agency (probably the FBI) should track down the source and prosecute, and blocking the content might make that harder.

Obama, repeal the DMCA! (4, Funny)

Nirvelli (851945) | about 2 years ago | (#41255331)

President Obama,

The DMCA has deleted your wife from the internet! You must repeal it immediately!

Sincerely,
A Concerned Internet Citizen

Re:Obama, repeal the DMCA! (1, Funny)

Mashiki (184564) | about 2 years ago | (#41255357)

In Soviet America, bot deletes wife!

Re:Obama, repeal the DMCA! (3, Insightful)

fm6 (162816) | about 2 years ago | (#41255467)

I'm sure he'll get right on it. All he has to do is get repeal past the Republican majority in the House and the permanent Republican filibuster [cnn.com] in the Senate.

Re:Obama, repeal the DMCA! (1)

Anonymous Coward | about 2 years ago | (#41255501)

LOL, you still think there is a difference in the political parties, cute.

Re:Obama, repeal the DMCA! (0)

Anonymous Coward | about 2 years ago | (#41255731)

there is a difference.

one party will have goosestepping lessons twice per week, the other will hold them three times weekly.

choices, choices...

Re:Obama, repeal the DMCA! (4, Insightful)

clarkkent09 (1104833) | about 2 years ago | (#41255843)

What makes you think that Obama wants to repeal the DMCA? With the amount of support and money he is getting from Hollywood it is not surprising he is not mentioning copyright at all.

Re:Obama, repeal the DMCA! (0)

Anonymous Coward | about 2 years ago | (#41255921)

If he is the hope and change he claimed to be in the first election he might just say fuck yall to Hollywood after reelection and do the right thing. ...

I'm not counting on it.

Let's get back to the 24/7 coverage of the epic electoral battle between Douchebag and Turd Sandwich.

Re:Obama, repeal the DMCA! (1)

fm6 (162816) | about 2 years ago | (#41256113)

I'm sorry, I can't always remember to use a smiley when I make a joke.

And why is the technology to blame? (4, Interesting)

c0lo (1497653) | about 2 years ago | (#41255333)

the trigger was a brief clip from the Doctor Who episode itself

In itself, the tech has shown an impressive quality if a brief clip was recognized in realtime.

Would anyone blame the hammer because it's an excellent tool to drive nails under one's... well... nails?

Re:And why is the technology to blame? (4, Insightful)

Required Snark (1702878) | about 2 years ago | (#41255627)

This technology was designed to find infringement. It was not designed to find cute images of puppies. There is nothing in the code to recognize fair use. The technology is intrinsically broken. Perhaps it could be fixed, but there is no incentive to make it work fairly.

A lot of technology is like OxyContin: it is very easy to abuse. The manufacturers/deployers make money and never suffer the negative effects. It's disingenuous to say that the technology is neutral and does not embody a business/political agenda. In this case, allowing fair use would make the system much more complex, and might render it useless. For example, if there were meaningful fines for false positives, then those using this technology would have to act differently. Hell will freeze over before that happens.

Re:And why is the technology to blame? (0)

Anonymous Coward | about 2 years ago | (#41255951)

Touche, it was in fact designed precisely TO find videos of cute puppies! There's probably a lot of cute-puppy-related content that is potentially infringing...

Re:And why is the technology to blame? (4, Insightful)

Skapare (16644) | about 2 years ago | (#41256033)

This technology was designed to find infringement.

I seriously doubt that. I think it was designed to find matching content and CLAIM it to be infringement while really having no means whatsoever to determine that.

Re:And why is the technology to blame? (4, Interesting)

c0lo (1497653) | about 2 years ago | (#41256101)

This technology was designed to find infringement. It was not designed to find cute images of puppies. There is nothing in the code to recognize fair use. The technology is intrinsically broken.

Correction: as demonstrated, the technology is excellent (in its recognition capabilities). Also as demonstrated, the use of the technology for certain purposes (police copyright infringement) is broken.
It doesn't mean that for other purposes (finding images of cute puppies included) the same technology cannot be excellent.

My point: don't blame the "robots", blame those who use them as "overlords". Otherwise, you'd only be adopting the same position as those who would very much like to ban/criminalize a technology (e.g. encryption, or the use of Tor) just because it can be used for copyright infringement or drug trafficking.

Re:And why is the technology to blame? (0)

Anonymous Coward | about 2 years ago | (#41256215)

RIAA/MPAA and Oxycontin: One is a habitual thief, money-grubber, and life-ruiner... the other is oxycodone.

Re:And why is the technology to blame? (2)

Skapare (16644) | about 2 years ago | (#41256001)

Whether something is infringing or NOT infringing, cannot be determined by matching content alone. Anything that only looks for matching content is intrinsically and fundamentally broken. Anyone who would design it that way and claim it to be correct is a liar and a fraud.

A hammer is not expected to detect if the target happens to be a screw. Claiming something merely detects that some content matches some other content is fine. Jumping to the conclusion that because it matches, it must therefore be infringing is stretching truth beyond the breaking point. I am not impressed by it at all because it clearly is a failure to determine whether content is, or is not, actually infringing.
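The parent's point can be shown in a few lines. Here is a matcher sketched with an exact hash standing in for a real perceptual fingerprint (all names and data hypothetical): it reports only that content matches a reference, and the match result carries no licensing or fair-use context at all.

```python
import hashlib

def fingerprint(clip: bytes) -> str:
    # Stand-in for a real perceptual hash; exact matching only.
    return hashlib.sha256(clip).hexdigest()

# Reference database of "protected" content.
reference_db = {fingerprint(b"doctor-who-episode-frames"): "Doctor Who"}

def matched_work(clip: bytes):
    # Returns the matched work's name, or None. Note what it does NOT return:
    # whether the use was licensed, fair use, or infringing.
    return reference_db.get(fingerprint(clip))

# A licensed awards-show excerpt and a pirated copy yield the identical result:
result = matched_work(b"doctor-who-episode-frames")
```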

Re:And why is the technology to blame? (2, Insightful)

c0lo (1497653) | about 2 years ago | (#41256135)

Whether something is infringing or NOT infringing, cannot be determined by matching content alone. Anything that only looks for matching content is intrinsically and fundamentally broken. Anyone who would design it that way and claim it to be correct is a liar and a fraud.

Correct. But that's exactly my point: the "anyone who would" etc. is to blame, and the reason for the blame is not that they designed the technology but the way they use it (or claim it can be used) - it's the second part of the logical conjunction that renders them liars.

Re:And why is the technology to blame? (0)

Anonymous Coward | about 2 years ago | (#41256393)

the trigger was a brief clip from the Doctor Who episode itself

In itself, the tech has shown an impressive quality if a brief clip was recognized in realtime...

I suppose you define "impressive" a bit differently than I do. I see nothing but overzealous idiots drunk on power and control abusing the shit out of any system that can react THAT fast...(sorry, it's a rather nasty side effect of being overexposed to Government abuse)

Re:And why is the technology to blame? (2)

c0lo (1497653) | about 2 years ago | (#41256553)

the trigger was a brief clip from the Doctor Who episode itself

In itself, the tech has shown an impressive quality if a brief clip was recognized in realtime...

I suppose you define "impressive" a bit differently than I do. I see nothing but overzealous idiots drunk on power and control abusing the shit out of any system that can react THAT fast...(sorry, it's a rather nasty side effect of being overexposed to Government abuse)

There are two aspects here (the "technology" and "the (ab)use of the technology"), and both can be rated on the "impressive" axis.
It just happens that I'd classify both as "impressive" in absolute value, but once I also add the sign of the value, things change:
1. the technology - a large positive value - detecting a short clip in realtime while streaming is a feat, in my opinion
2. the use of the technology - a large but negative value, as in "I'm exceptionally but unpleasantly impressed by the stupidity of using a powerful technology under the absolutely moronic assumption that any detection of a copyrighted sequence in a stream constitutes a copyright violation and must be terminated".

Last, but important enough to repeat myself: it is crucial to make the distinction between a technology and a particular use of that technology. Fail to make this distinction and you will "terminate" a technology instead of "terminating an immoral/illegitimate use of the technology".
E.g. the Internet is the main technology used nowadays for infringing copyright; how about banning the Internet in all its uses? What about encryption or Tor?

penalties required (3, Insightful)

Anonymous Coward | about 2 years ago | (#41255361)

The solution is to implement penalties for false takedown requests. Say, $25 per user per stream.

Re:penalties required (3, Interesting)

Drishmung (458368) | about 2 years ago | (#41255647)

The problem with that is that the *IAA don't, strictly, make a takedown request. This is a proactive service that Google/Ustream et al offer well above the DMCA requirements. So, there is no way to penalise them for what they will claim they didn't do.

Instead, make the takedown request cost up front. It costs Google/Ustream etc. to implement the bots. It seems reasonable that those benefiting from them should pay.

I suggest something like:

  • You put $x up front into our account, @$y per implemented block, sufficient to process n takedowns (n = $x/$y).
  • Any takedowns in excess of n will not be processed.
  • At the end of the month, you will be rebated m x $y, where m is the number of undisputed takedowns.
  • Disputed takedowns will not be reblocked. You must file a DMCA takedown if you wish to pursue the case.

Still not perfect, but if the studios don't like it, there is always the DMCA.

(What Google gets out of this is essentially the interest on the money for a month - not much, but enough to compensate them somewhat.)
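The scheme reduces to simple arithmetic. A hypothetical sketch, with all figures invented:

```python
def quota(deposit: float, fee_per_block: float) -> int:
    # n = $x / $y: how many takedowns the up-front deposit can fund.
    return int(deposit // fee_per_block)

def month_end(deposit: float, fee_per_block: float,
              requested: int, disputed: int):
    n = quota(deposit, fee_per_block)
    processed = min(requested, n)        # takedowns beyond n are not processed
    undisputed = processed - min(disputed, processed)
    rebate = undisputed * fee_per_block  # m x $y back to the claimant
    return processed, rebate

# $1000 at $10 per block funds 100 takedowns; 120 requested, 5 disputed.
processed, rebate = month_end(1000.0, 10.0, requested=120, disputed=5)
```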

Re:penalties required (2)

PPH (736903) | about 2 years ago | (#41255977)

$25? Too cheap.

If an unauthorized download can cost someone from $20,000 to as high as $80,000 [cnn.com] per song, let's make false takedown requests cost the requester the same.

Have Google/YouTube and others require requestors to post a bond for the amount prior to honoring it. If the request proves to be in error, the poster or actual copyright holder gets the proceeds. If not, the requestor gets the bond back.
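That bond mechanic is one line of logic; a minimal sketch (the figure and function are hypothetical):

```python
def settle_bond(bond: float, request_in_error: bool) -> dict:
    # Erroneous claim: the bond is forfeited to the uploader/actual rights
    # holder. Valid claim: the requester gets the bond back.
    if request_in_error:
        return {"requester": 0.0, "uploader": bond}
    return {"requester": bond, "uploader": 0.0}

payout = settle_bond(bond=20000.0, request_in_error=True)
```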

Hugo Weeps (1, Troll)

fm6 (162816) | about 2 years ago | (#41255433)

The Hugo Awards, he said, were not using the paid “pro” version of Ustream’s live streaming service. The paid version of Ustream does not use Vobile.

“The Hugo Awards were using the free ad-supported capability,” Hunstable said. “And unfortunately Ustream was not contacted ahead of the time about their use of the platform.”

I think the lesson we should take from that is this: if you're broadcasting copyrighted material, you need to contact the streaming vendor and work with them to make sure there's no interruption.

Which is not to defend the interruption. It seems pretty unfair to automatically take down a live stream just because it might have unauthorized content, though one can't really complain when using a free, ad-supported version of the service. Next time, the Hugo people will presumably do their homework, maybe spend a little money, and avoid this kind of glitch.

Re:Hugo Weeps (1)

rtb61 (674572) | about 2 years ago | (#41255491)

Ahh, so the new rule is that you must pay the copyright overlords to ensure the content you create isn't taken down by default. Something really, really stinks in that statement.

It stinks like the existing publishers are trying to enforce a system which necessitates paying them a percentage of your revenue, or else, like a protection racket run by organised crime, your content will suffer an accident. If you do not see the criminality implicit in your statement then you deserve to be arrested; ignorance of the law is no excuse. The rest of us, of course, are now seeing what is really happening: a direct attempt at extortion by the existing content cartel. Pay us or have your content disrupted.

Re:Hugo Weeps (3, Insightful)

fm6 (162816) | about 2 years ago | (#41255651)

You're not paying the copyright overlords. You're paying the video distribution system. If you're using the system for free, you can't expect them to take the lawsuit risk for you, so you shouldn't complain if they impose a stupid filter robot. So pay a fee (which probably gets rid of those ads people love to complain about) and show them you have permission to use the material you're broadcasting, so they can safely turn off the filter.

This is nothing new. "Clearance" has always been a major part of making movies and TV shows. (You know why the little kid in E.T. ate Reese's Pieces? They couldn't get permission to use M&M's.) Creative people have to work with the system, in part because it's the same system that allows them to profit from their work.

Mind you, I'm not defending the copyright overlords, with the legal sledgehammers and retroactive copyright extensions. But as fucked up as the system is, it's the one we've got, and the problems of dealing with it are nothing new.

Re:Hugo Weeps (0)

Anonymous Coward | about 2 years ago | (#41255787)

so you shouldn't complain if they impose a stupid filter robot.

What I should or should not do is up for me to decide. I very well can complain! Complaining and then taking action is one way to change something.

in part because it's the same system that allows them to profit from their work.

Not necessarily. Perhaps they're not using the more draconian 'features'.

Honestly, it's laws like the DMCA that are causing this mess. Remove first and ask questions later (or say goodbye to your safe harbor status).

Re:Hugo Weeps (2)

rtb61 (674572) | about 2 years ago | (#41255953)

Do you not understand how organisations like the Mafia implement a protection racket? The Don or Boss doesn't go out roughing people up, nor does the Consigliere; even the Caporegime doesn't go out, and rarely do the soldiers pay a visit (they tend to visit later in the piece). It's associates who go out and do the dirty work, those with the fewest links back to the Capo. Now that holds true for them all, whether it be the Triads, Yakuza or the RIAA/MPAA. Something else interesting: all of them at various times have had direct ties to government, just as the RIAA/MPAA does now.

Re:Hugo Weeps (2)

fm6 (162816) | about 2 years ago | (#41256103)

So, anybody who tries to get money out of somebody else for any purpose is a gangster? As the anarchists used to say, Property is Theft? Whatever. I don't think that's a concept that's going to catch on.

Re:Hugo Weeps (0)

Anonymous Coward | about 2 years ago | (#41255589)

The problem is, how will the bots know which copyrighted material in a given stream is licensed and which is not? There are no (repeat: NO) exceptions being built into these bots, just the "FLICK THE SWITCH NOW" code. So, if I properly license my stream, and the upstream service provider doesn't care and their bots turn it off anyway (it's not their liability if they turn it off, by the way; that's just part of their terms of service), the only recourse I have is to sue. If you're a little guy (read: have less than $10M in revenue annually), the best you will get is "Oh, sorry. Our bad".

The bigger question is: do we really want censorship included in the core of our communications systems? Listening is one thing, acting on it and cutting the line is another. This same technology can be used to drop connections on any number of criteria, not just copyright violations. This is just a bad idea from start to finish.

Re:Hugo Weeps (1)

bws111 (1216812) | about 2 years ago | (#41256465)

News flash: YouTube, UStream, et al. are not 'the core of our communication systems'. They are commercial entities whose current business model is allowing users to post/stream stuff for free. They are no different from a store that makes a choice about what kinds of items and what brands it wants to carry. They are not 'censoring' anything; they are simply deciding what they will and will not provide. You are perfectly free to take your hosting elsewhere or host it yourself.

Believe it or not, 'censor' does not mean 'makes it a little more difficult or expensive than I would like'.

Re:Hugo Weeps (0)

Anonymous Coward | about 2 years ago | (#41256501)

They are not 'censoring' anything

If they remove content that was already there, that is censorship. You might believe it's perfectly acceptable for them to do, but it's still censorship.

Re:Hugo Weeps (1)

FrangoAssado (561740) | about 2 years ago | (#41256533)

There are no (repeat: NO) exceptions being built into these bots, just the "FLICK THE SWITCH NOW" code.

Where did you get this information? Because it completely contradicts what UStream said a few days ago [ustream.tv]:

Users of our paid, ad-free Pro Broadcasting service [NOTE: UPDATED CLARIFICATION] "and those free broadcasters who notify Ustream in advance they have copyrights permissions (Ustream's messaging to our broadcaster community how this process works is inadequate. We are resolving this now)" are automatically white listed to avoid situations like this and receive hands-on client support.

Use BitTorrent (with encryption) (0)

Anonymous Coward | about 2 years ago | (#41255493)

Why do people still use censoring sites like YouTube?

Re:Use BitTorrent (with encryption) (0)

Anonymous Coward | about 2 years ago | (#41255543)

what are the alternatives?

Re:Use BitTorrent (with encryption) (1)

HornWumpus (783565) | about 2 years ago | (#41255727)

PornTube.

Re:Use BitTorrent (with encryption) (1)

lister king of smeg (2481612) | about 2 years ago | (#41256579)

Yeah, I'm sure that's what the Hugo Awards want to associate themselves with. Besides, I think that even they have limits on what type of content is posted, so as to ensure that people find what they want there, a.k.a. porn only. The Hugos would be best off hosting the stream on their own site, maybe on something along the lines of an Amazon EC2 instance so it can handle all of the requests.

Re:Use BitTorrent (with encryption) (0)

Anonymous Coward | about 2 years ago | (#41255755)

how about mediafire.com?

Re:Use BitTorrent (with encryption) (0)

Anonymous Coward | about 2 years ago | (#41256577)

liveleak.com

Ustream Boycott (0, Interesting)

Anonymous Coward | about 2 years ago | (#41255495)

From TFA:

Brad Hunstable, Ustream’s CEO, says the volume of content is overwhelming and content-blocking algorithms are key to keeping copyright holders happy.

Let's boycott Ustream for a week, starting October 1st. Perhaps some Redditors can give this some boost. Who does Ustream want to listen to, the MPAA or those that they serve their ads to?

It's not just the AI (1)

wbr1 (2538558) | about 2 years ago | (#41255535)

However the content is removed, be it by an AI skimmer, a human, or a copyright holder or troll sending takedown notices, we keep missing one key part of the equation.
That part is the fact that there is little to no recourse for those who have legitimate content taken down. In this case, apparently Ustream was silent and ignored things. How hard would it have been to get a human to look at the stream? Shouldn't your NOC have a few people on hand to do this at all times?
Another case: takedowns on YouTube. One troll can issue unlimited bogus takedown requests with no fear of any punishment or reprisal, even though they are clearly in the wrong. An individual YouTube channel, however, does not fare so well; too many takedowns, legit or not, and you are suspended or removed.
There are no real checks and balances in the system. Those in power, with money and lobbyists and pet politicians, sit at the top of the hill, and the shit rolls downhill to the common man.
Technology has made, and is making, it easier and easier for the common man to produce content. This we know, and it scares the behemoths as they see people slowly fleeing. This type of behavior is grasping, saying mine, mine, mine, when you already have more than enough. It is also another barrier to entry for competition, which poorly replaces the old barrier of high-cost content production and distribution.

Re:It's not just the AI (-1)

Anonymous Coward | about 2 years ago | (#41256221)

So produce content, without using theirs.

Distribute user created content only.

Re:It's not just the AI (5, Insightful)

James McGuigan (852772) | about 2 years ago | (#41256309)

We live in an economic system based primarily on the concept of scarcity. Money is valuable because nobody ever seems to have quite enough of it, just like diamonds and gold. The law of supply and demand: there is more demand than supply.

The internet and the information age change all that; their economics are based on abundance. For the cost of a broadband connection we can give every human child access to the entire digitized archive of the collected works of the human race. The cost of doing this even 100 years ago would have been astronomical and beyond the reach of even the richest philanthropists. The old business models, based upon everybody getting a percentage of the production costs, break down when the production costs become zero. We have so thoroughly eliminated the nature of scarcity with information that our scarcity-based economics, upon which we all depend for our basic survival, has deemed everything that abundance touches to be worthless. Rather than creating "utilitarian" value, it is seen as destroying "valuable" scarcity.

If you want economics based on abundance/gift-culture to actually work, then content effectively needs to be paid for in advance, with the risk of not receiving, rather than our debt-based culture of buy now, pay later, where we only buy the hits after they have proven popular.

Copyright is about trying to maintain the concept of artificial scarcity in an environment where it doesn't naturally exist. But assuming that copyright has its place, there are checks and balances; they just don't all operate at the cost level and the speed we are used to in internet time.

The old-school system of checks and balances is the courts and the legal system. You can spend a huge quantity of time, energy and money to argue your case and get a ruling from a professional judge. This works well when you have a small number of large, rich corporations with known fixed addresses and assets. Under the old-school system, for things like copyright, most private individual use tended to simply fly under the radar and nobody made a big fuss about it, regardless of the technicalities of the law. So, disregarding morals, we are now down to game theory.

Now Mr Joe Blogs with his modern broadband connection could easily broadcast more copyrighted music than half a dozen fully funded commercial radio stations from a decade ago. Mr Joe Blogs has very little in sue-able assets beyond a bankruptcy order, and by the time the cogs in the legal system have turned, he has managed to transfer more data than could fit on a supercomputer from 25 years ago. Joe Blogs wasn't an issue when all he had was a mix tape and 5 friends; now he has a mix computer and 5 billion friends. Plus there are now a hundred million Joe Blogs on the internet doing exactly the same thing. So we have innocent until proven guilty, with a very high cost and high standard of proof required to enforce copyright, with the burden resting fully on the copyright holders.

So the copyright holders go to the politicians and say they cannot effectively enforce their business model, and they ask for the risk model of the checks and balances to be changed. Hence the DMCA: we want to be able to just send a letter and then shift the burden of proof onto Joe Blogs to argue his case, plus we want legal liability to rest with registered companies with static addresses who have something to lose if we have to take them to court. Guilty until proven innocent, but at least with the right of appeal.

But still the DMCA only works at the speed of the postal system, and requires human interaction, which is still orders of magnitude slower and more costly than the "internet speed" we all now mostly take for granted. Hence the advent of AI copyright bots that can at least operate at "internet speed"; but the people who create them are on the payroll of the copyright holders, and their definition of "failsafe" is to block content first and ask questions later. The checks and balances then start to operate, but they can only proceed at "human speed" or, at worst, "legal speed", which is measured in days or months rather than the milliseconds of "internet speed". If everything checks out and we have a "false positive", then it means we get to watch the "live show" a few days later, which in "internet time" means the content is now "old news" and has thus lost most of its value.

The two sides to this argument have totally different perceptions and value systems around the risks, costs and time delays of both a "false positive" and a "false negative", in wanting to decide the relative level of certainty and speed in reaching a first decision on blocking potentially copyrighted content, and the tricky-to-define limits of what constitutes a "fair use" exception.

If a line has to be drawn, then it has to be drawn somewhere. And if you have to negotiate in an argument, it's often best to ask for twice as much as you actually want and then offer to meet half way.

Unfortunately (3, Insightful)

NoobixCube (1133473) | about 2 years ago | (#41255617)

Unfortunately, this has only agitated people who were already against automated copyright filtering and DRM. It's like telling Eskimos snow is cold. No, we'll have to wait until the MTV music awards are knocked offline by copyright bots before anybody who didn't already know about them gets wind of it.

Re:Unfortunately (1)

Chrontius (654879) | about 2 years ago | (#41255663)

Go post your idea (as a hypothetical, not a suggestion!) on 4chan. Then wait for 30,000 /b/tards, working independently, to report the MTV Video Music Awards as copyright infringement until it's automatically pulled by the googlebot...

I'm waiting for Amanda Palmer to write a song (1)

Drishmung (458368) | about 2 years ago | (#41255753)

I'm waiting for Amanda MacKinnon Gaiman Palmer [amandapalmer.net] to write a song about the incident. That would be cool. (I am assuming that Brad Hunstable [ustream.tv], who by the way has deleted all the comments on his blog, at least one of which was vaguely supportive, would be less happy to have her turning her attention to him.)

Business Solution is Required (2)

confuscan (2541066) | about 2 years ago | (#41255773)

The fundamental problem with the current situation is that there is no "pain" (e.g., financial penalty) for these erroneous takedowns, and that's the problem with the DMCA. I wonder what the online world would look like if there were an equivalent "3 strikes" rule for false takedowns. After that, escalating financial penalties would kick in, with damages going to the aggrieved party. Business understands money. Frame action and inaction in that context and business tends to behave in a predictable fashion, most of the time.

Re:Business Solution is Required (1)

bws111 (1216812) | about 2 years ago | (#41256263)

Surely the contract you signed with whoever is hosting your content states the penalties for failing to host your content, doesn't it? What's that, no contract? Well, maybe you can try a warranty claim, and at least get back the money you paid them to host your content. What's that, you didn't pay them anything? And you actually expect them to somehow owe YOU something?? That is a good one.

The problem is not the DMCA, or that there is no "pain"; the problem is that some people actually think they are entitled to literally get something for nothing.

Yes, business understands money. And if you actually PAY them money for a service, they understand you better.

Law Enforcement Will Be Doing This Next (1)

IonOtter (629215) | about 2 years ago | (#41255789)

Law enforcement in cities where protests are expected to take place, will pull out some of their internal training videos, then put them up on big screens around areas where they expect a protest.

Then, when a LiveStreamer catches some of that training video, the bots will automagically shut off the protester's live feed.

Censorship is so bad... (0)

symbolset (646467) | about 2 years ago | (#41255855)

Censorship is such a bad thing that here in the US it is one of the powers explicitly prohibited the government. "Congress shall make no law..." And yet apparently now we allow the courts to permit private corporations to require other private corporations to censor the Democratic National Convention, based on Congress' implementation of the Copyright clause and so get our censorship third hand but still enforced by the government - and we let that go. Interesting. It appears that Hollywood has "fixed" the First Amendment "glitch".

We are not going to respond to this in the reasonable, measured way that I think we should.

What a world (0)

Anonymous Coward | about 2 years ago | (#41255865)

We have physicists and engineers working to manipulate matter at the atomic level to make the electronics to allow this kind of software to run, so psychopaths can keep filling their pockets with money. Whatever happened to using technology to make life better for people??? We seem to work harder and longer than before, surrounded by more advanced technology than ever, and we still seem to cling to obsolete notions of work and value.

Don't Fight the DMCA, Countersue (3, Insightful)

Carcass666 (539381) | about 2 years ago | (#41255959)

The DMCA will probably never be overturned in the US, there is too much industry money behind it, and we know what feeds the political machine in the US.

If some third-party copyright trollbot interferes with the legitimate viewing of a webcast event, there has to be a law firm somewhere that, for the notoriety alone, would be willing to file a class action suit alleging damages of inconvenience and anguish on the behalf of thousands of viewers. Moreover, the broadcaster could sue for the costs of their broadcast that was interfered with. It costs real money to do a good quality webcast, trolls should be on the hook for diluting the value of a broadcaster's investment.

Re:Don't Fight the DMCA, Countersue (1)

wbr1 (2538558) | about 2 years ago | (#41256449)

Not when there is a one-way, no-class-action arbitration clause in the terms of service we so blithely click on.

Simple solution (0)

Anonymous Coward | about 2 years ago | (#41255971)

It seems very simple to me. If they have no legal right to block your stream, have them arrested/prosecuted for hacking.

Not just fair use... (1)

jnork (1307843) | about 2 years ago | (#41256055)

Not just fair use, but they sought and received permission to use the clips. The use was specifically authorized.

The real news here is ... (1, Interesting)

imp (7585) | about 2 years ago | (#41256181)

... skynet lives and it is testing its metal...

Re:The real news here is ... (0)

Anonymous Coward | about 2 years ago | (#41256313)

Please tell me you don't think the expression actually is "testing its metal"... Please?

A.I. Taking matters into its own digital hands (2, Insightful)

Anonymous Coward | about 2 years ago | (#41256305)

This is just a part of a much bigger problem, and that is: who is responsible for the behaviour of rogue A.I.? Soon we'll have to face this problem head-on. With military drones and robots becoming more and more autonomous, it's only a matter of time before DARPA comes up with the Terminator(TM), capable of autonomously deciding on the kill shot. Then it's a matter of time before this machine makes a mistake, killing a civilian or a war journalist, and we'll find ourselves in the deepest legal shit humanity has faced since the Nuremberg Trials. You can't send a robot to prison, you can't charge its maker, you can't lock up the author of the software. Yes, you CAN demand damages from the government/military, but the crime still goes unpunished. So, should we grant the owners of such A.I.s a license to kill/threaten/arrest in an autonomous manner and hope it doesn't turn ugly? And when it does, do we just say "shit happens, here's some money to make you feel better"?

I propose we make the people up-top personally responsible for such events. And there shall be peace on earth...

Accept only human-verified take-down requests? (1)

Anonymous Coward | about 2 years ago | (#41256315)

Can companies legitimately accept only human made take-down requests?

Automated reply to takedown request ...
"We received this takedown request. Please call this number to vouch for its validity."

or

"We received this takedown request. Please go to this URL and enter the captcha requested to vouch for its validity."
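The two auto-replies above amount to a human-in-the-loop gate: hold every automated notice in a queue and act only once a person answers a challenge. Here is a minimal sketch of that flow; the function names, data shapes, and in-memory queues are invented for illustration, and a real host would send the challenge by email or phone and attach a CAPTCHA.

```python
# Toy sketch of the human-verification gate proposed above: automated
# takedown notices are queued, not acted on, until a person vouches
# for them by answering a challenge. Not any real host's API.

import uuid

PENDING: dict[str, dict] = {}   # challenge_id -> held notice
CONFIRMED: list[dict] = []      # notices a human has vouched for


def receive_notice(notice: dict) -> str:
    """Hold the notice and return a challenge ID the sender must answer."""
    challenge_id = str(uuid.uuid4())
    PENDING[challenge_id] = notice
    # In a real system: mail/phone the sender a link carrying this ID
    # plus a CAPTCHA, so a bot can't complete the loop on its own.
    return challenge_id


def confirm_notice(challenge_id: str) -> bool:
    """Called only after a human solves the challenge; then act."""
    notice = PENDING.pop(challenge_id, None)
    if notice is None:
        return False          # unknown or already-used challenge
    CONFIRMED.append(notice)  # only now does the takedown proceed
    return True


cid = receive_notice({"work": "Doctor Who clip", "url": "example"})
assert CONFIRMED == []        # nothing happens automatically
assert confirm_notice(cid)    # takedown proceeds after a human vouches
```

Whether a host can legally insist on this under the DMCA's "expeditious removal" language is the open question the comment raises; the sketch only shows that the mechanism itself is trivial to build.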

The text on these takedowns (3, Insightful)

Punto (100573) | about 2 years ago | (#41256447)

If I remember correctly, these take down notices have a section where the issuer of the notice swears "under penalty of perjury" that the information on the notice is correct. When it turns out to be incorrect (or even when it isn't but no human ever checks the results from the bot), is that actionable? In a civil court? What is "penalty of perjury" exactly?

Not so bad (1)

bigdavex (155746) | about 2 years ago | (#41256455)

I would support my son taking an iPhone-building internship. Foxconn conditions might suck, but I'm not fundamentally opposed to students doing physical work.

Border crossing has a similar problem (1)

Anonymous Coward | about 2 years ago | (#41256555)

To cross some borders I need a visa for my wife. She has the right under EU law to cross the border, but the form you have to fill in online is restricted to very narrow conditions that prevent her ever submitting the application.

You can write or ring, but the letter is returned, with the reply that you must fill in the online form.

If you deliberately mis-complete the form in order to get to the next page of it, then that is grounds to reject the visa! Thus, by a simple algorithm on the form server, the computer defines your rights as narrower than they really are.

This is your future too. Today you don't need a visa, but you certainly will need some form of ID, for which some online form will be required, for which a set of rules on valid entries into the fields will be defined, for which some rules will be laid down. Those rules will be your rights, regardless of what the law says.

What's needed here (1)

Logger (9214) | about 2 years ago | (#41256677)

What's needed here is to turn this technology against those who are currently wielding it.

Since newspapers are making false claims of copyright ownership over material, we need to somehow submit false claims too, primarily on political material. We need politicians to be greatly inconvenienced by this. There's nothing like self-interest to create change. And if you can take down all the videos on YouTube that have ads with this method, that might get some favorable change as well.
