
YouTube Video-Fingerprinting Due in September

Zonk posted about 7 years ago | from the have-you-done-your-homework dept.

Google 115

Tech.Luver writes "The Register is reporting on Google's statement to a presiding judge that video-fingerprinting of YouTube material will be ready in September. The development is required to head off a three-headed suit against the company, currently being debated in a New York City courthouse. The system will, according to Google, 'be as sophisticated as fingerprinting technology used by the Federal Bureau of Investigation.' From the article: 'As Google told El Reg in an earlier conversation, the company already has two systems in place for policing infringing content - but neither are ideal. One system allows copyright holders to notify Google when they spot their videos on the company's sites. When notified, the company removes the offending videos, in compliance with the American Digital Millennium Copyright Act. A second system uses "hash" technology to automatically block repeated uploads of infringing material.'"


Hard AI ftw (4, Interesting)

UbuntuDupe (970646) | about 7 years ago | (#20057035)

Earlier I had joked [] about Google's claim to be nearing a system that lets them check for copyrighted works. I said that they're basically claiming to have solved a hard AI problem.

Others pointed out that, no, it's not a hard AI problem to just compare some kind of checksum of the video against a set of banned checksums. That's true. But what about once people know they're using this system? They can just trivially re-encode. Perhaps add a scene break here or there, and totally mess up the fingerprint. To prevent that, it seems, you would need to solve a hard-AI problem: that is, be able to determine if an arbitrarily-encoded video appears to a human to match some copyrighted work. It would have to be robust against minor scene shortenings and lengthenings, scene breakups, color gradients laid over the video, etc.

Anyone know how difficult this program is to circumvent? (Just hypothetically -- not advocating criminal activity here.)
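The fragility of the checksum approach the parent mentions is easy to demonstrate. A minimal sketch (not Google's actual system, just the naive scheme under discussion): a cryptographic hash over the raw bytes, where a re-encode, or even a single changed byte, yields a completely different fingerprint.

```python
import hashlib

def naive_fingerprint(data: bytes) -> str:
    """Whole-file fingerprint: a cryptographic hash of the raw bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"fake video bitstream"
# A re-encode rewrites essentially every byte; even one changed byte
# is enough to change the digest entirely.
tweaked = b"fake video bitstreaM"

print(naive_fingerprint(original) == naive_fingerprint(tweaked))  # False
```

This is exactly why a robust system has to fingerprint what the video *looks like*, not what its bytes happen to be.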

Amazon Mechanical Turk (5, Funny)

InvisblePinkUnicorn (1126837) | about 7 years ago | (#20057097)

That would be the ultimate mechanical turk - sitting around watching youtube videos all day and getting paid... in addition to what you are already being paid as you put off work to watch youtube videos all day.

Re:Amazon Mechanical Turk (0)

Anonymous Coward | about 7 years ago | (#20063353)

You'd want to watch videos like this video [] all day long? Warning, genuinely entertaining video contained therein. :)

Re:Amazon Mechanical Turk (1)

stonedcat (80201) | about 7 years ago | (#20065061)

I have to agree with the coward.
I saw this video near the end of last year and nearly pissed myself laughing.

Re:Hard AI ftw (2, Interesting)

Udderdude (257795) | about 7 years ago | (#20057131)

They could potentially scan the entire movie for a few keyframes that they know will be in there regardless of silly scene breaks, etc.

Nobody would know what the keyframes are, so it would be hard/impossible to black out that specific frame.

Re:Hard AI ftw (4, Interesting)

sholden (12227) | about 7 years ago | (#20057537)

So I guess it doesn't matter if it then blocks a video which has taken a couple of seconds of video from a TV show in a "Review of Episode X" video post, and just happens to grab one of those keyframes?

Of course, with a little bit of coding you have a program that takes that 10-minute video, splits it into ten 1-minute videos and uploads them. The ones that get rejected it splits into ten 6-second videos and uploads those. Rinse and repeat until you have however small a set of rejections you asked for. Then it cuts out just the necessary fragments of video (replacing them with the last good frame or something?).

Of course that can be countered at Google's end by adding a delay to the rejection report, and by banning those who rack up lots of rejections.
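The split-and-retry procedure described above can be sketched as a recursive search. Everything here is hypothetical: `is_blocked` stands in for the "upload this piece and see whether it gets rejected" oracle, and the sketch bisects rather than splitting ten ways, but the idea is the same.

```python
def find_blocked_spans(span, is_blocked, min_len=1):
    """Recursively split a clip to localize the segments a filter rejects.

    `span` is a (start, end) pair in seconds; `is_blocked` is a stand-in
    for the upload-and-check oracle described above.
    """
    start, end = span
    if not is_blocked(start, end):
        return []                      # whole piece passes: nothing to cut
    if end - start <= min_len:
        return [(start, end)]          # smallest rejectable piece: cut it out
    mid = (start + end) // 2
    return (find_blocked_spans((start, mid), is_blocked, min_len) +
            find_blocked_spans((mid, end), is_blocked, min_len))

# With a single flagged second at t=37, the search narrows right down to it:
flagged = {37}
oracle = lambda s, e: any(s <= t < e for t in flagged)
print(find_blocked_spans((0, 100), oracle))  # [(37, 38)]
```

A delay on the rejection report, as suggested, makes each oracle query expensive, which is exactly what breaks this kind of search.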

Re:Hard AI ftw (0)

Anonymous Coward | about 7 years ago | (#20058269)

With all that cutting, you could end up with something like this []

Re:Hard AI ftw (4, Interesting)

DDLKermit007 (911046) | about 7 years ago | (#20057595)

Sorry, but fingerprinting the video is damn near impossible. The closest any image recognition app can get right now is matching similar scenes. Some can do logos, but we're talking low-resolution videos. Most likely they'll do a file checksum, and that'll kill most re-uploads. For the determined, all that's needed is a simple re-encode, which completely changes any checksums involved. They won't seriously spend that much machine time checking whether they're allowed to make money off a video. They'll do just enough to say they did something, but not so much that you can't find a way around it.

Re:Hard AI ftw (1)

babyrat (314371) | about 7 years ago | (#20062785)

reminds me of the old saying:

"those that claim it is impossible should stay out of the way of those that are doing it"

Re:Hard AI ftw (1)

General Wesc (59919) | about 7 years ago | (#20063077)

Pretty much they'll do a file checksum, and that'll kill most reuploads.

Did you even read the summary? That's mentioned as one of the two already in place. YouTube is adding a different system.

fingerprinting video is trivial (2, Informative)

heinzkunz (1002570) | about 7 years ago | (#20064985)

Philips has a video fingerprinting system [] . From the site:

The system is robust against severe degradations like low bit rate video compression, scaling, rotation, cropping, noise addition, median filter and noise removal. [...]
A 5 second video fingerprint on any segment of video content is sufficient to uniquely identify that segment.

You obviously need more than a simple re-encode to get around that, and I'm sure Google's system won't be fooled by simple tricks either.

Re:Hard AI ftw (1)

digitalchinky (650880) | about 7 years ago | (#20065591)

If they use a simple file hash, or checksum, changing one bit (or a few bits) anywhere in the file will result in a different hash. A hex editor will do this in half a second. I agree it'll still kill the majority of copy-cat 're-uploads'.

The other way, I guess, would be to break files up into a series of chunks and hash those, then scan every new upload and look for some percentage of identical values, allowing or denying based on that. Still trivially simple to bypass, though.
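The chunk-hashing idea above can be sketched in a few lines. Assume (hypothetically) the site keeps a set of chunk hashes per banned work and blocks uploads that share too many of them.

```python
import hashlib

def chunk_hashes(data: bytes, chunk_size: int = 4096) -> set:
    """Hash fixed-size chunks of the stream rather than the whole file."""
    return {hashlib.md5(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)}

def match_fraction(upload: bytes, banned: bytes) -> float:
    """Fraction of the banned work's chunks that reappear verbatim."""
    banned_set = chunk_hashes(banned)
    return len(chunk_hashes(upload) & banned_set) / len(banned_set)

# An untouched re-upload matches completely; a re-encode (or even a
# one-byte insertion that misaligns every chunk boundary) matches almost
# nothing -- which is why this stays trivially simple to bypass.
print(match_fraction(b"x" * 8192, b"x" * 8192))  # 1.0
```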

Re:Hard AI ftw (1)

TapeCutter (624760) | about 7 years ago | (#20066751)

I don't think Google will try to claim their system is foolproof; I think they will claim it's as good as what the cops do with real fingerprints. I think they will claim it enforces "fair use", and if the complainant is not happy with the automated "fair use" they still have the right to serve a "takedown notice". I'm sure they will also point out that real fingerprints are auto-matched on partial information, and will claim this can be done by Google taking a few random hash strings and using them as the partial information.

If they have put any of those genius developers they have been hiring to good use, they will challenge their opponents to test the system under court supervision. Goes without saying that the system shown to the court will be a "beta version".

I smell a Firefox plugin (0)

Anonymous Coward | about 7 years ago | (#20067073)

That tweaks your video a bit before uploading it. Probably the simplest way around fingerprinting, even if it actually works as claimed (which is extremely doubtful), is to add a 5-second lead segment (titles, etc.) which makes its 5-second sampling invalid.

Re:Hard AI ftw (1)

badasscat (563442) | about 7 years ago | (#20061041)

Could digital watermarking actually be used against the copyright holders in this case?

Say an encoder inserts a unique watermark that can't be seen by eye but is part of the data stream. Google isn't looking for it and doesn't recognize it when the video's uploaded, so it allows the video. Somebody would then have to complain, and Google would take down the video and add it to their "banned" database. The problem is, they would then have added basically a garbage entry into their database, because it only applies to that particular encode of the video. The video could then just have a new watermark inserted and be uploaded again. Essentially they're back to square one: relying on people to report copyrighted videos, and on employees to then actually watch them and remove them by hand. The only difference is the worthless extra step of stuffing useless entries into a large and growing database filled with nothing but garbage.

I have no idea if this would work, but it seems like it or something like it would.

The problem with keyframe detection (1)

Anonymous McCartneyf (1037584) | about 7 years ago | (#20062727)

That could kill fair use. Any keyframes that would be in every reasonable edit of a film or TV show would be critical to many short non-infringing excerpts from that film or TV show.
Pity the MPAA doesn't believe in fair use....

Re:Hard AI ftw (4, Insightful)

MyLongNickName (822545) | about 7 years ago | (#20057135)

I think you miss the point.
1) Is there a way around the system? Yes.
2) Does that matter? No.
3) Why is that? This solution shows that Google is making reasonable efforts to comply with the legal issues.

The majority of folks aren't going to take the effort to circumvent these controls. Rates will drop significantly. Google can honestly say they are making every effort to comply with copyright protection. Lawsuits will go away.

Re:Hard AI ftw (1)

dnegreira (1135407) | about 7 years ago | (#20057211)

Will the lawsuits go away? Or will this prove that the only way to fight these copyright thingies is to shut down the YouTube service? I think it is almost impossible to fingerprint a video across all the changes that can be introduced by re-encoding it.

Re:Hard AI ftw (5, Insightful)

utopianfiat (774016) | about 7 years ago | (#20057681)

It's not going to get shut down because of copyright infringement. :/ That's like saying we should bomb Hong Kong because they sell copyrighted works there; just because something has an illegitimate use doesn't make it illegitimate on its face, ffs.
Note, this can also be applied to "kitchen knives can kill, so we should ban kitchen knives" and "people can die in cars, so we should ban motor vehicles"
and uh... "people who have killed a lot of people have played video games, so we should ban video games." The States needs to get over the damn prohibitionist culture that's removing any sense of personal responsibility from our great nation.

Re:Hard AI ftw (1)

dnegreira (1135407) | about 7 years ago | (#20057923)

Of course it won't shut down... I just said that the only way to get rid of all the copyrighted thingies is to shut down the entire site. Or to kill anyone who tries to upload a copyrighted video, even with a frame changed.

Re:Hard AI ftw (1)

jZnat (793348) | about 7 years ago | (#20058487)

Legally, all Google have to do is remove videos identified by DMCA take-down requests. The fact that they're going above and beyond that could end up biting them in the ass when it fails in any given situation. Any IP lawyers or law students here to clarify the matter?

Re:Hard AI ftw (1)

Orange Crush (934731) | about 7 years ago | (#20060183)

Legally, all Google have to do is remove videos identified by DMCA take-down requests. The fact that they're going above and beyond that could end up biting them in the ass when it fails in any given situation.

IANAL or law student, but I'd think the "above and beyond" would work in Google's favor in a court case. They can tell a copyright holder that is suing them: 1) You never bothered to use the existing laws and just ask us to take the offending material down and 2) We're making every practical effort we can think of to keep your material from being posted in the first place, so piss off. (Translated to legalese, of course.)

Not so fast, Slick. (1)

Safety Cap (253500) | about 7 years ago | (#20061871)

It WILL bite them in the ass for the very reason laid out in the DMCA itself:

512(g)(1) No liability for taking down generally.-- Subject to paragraph (2), a service provider shall not be liable to any person for any claim based on the service provider's good faith disabling of access to, or removal of, material or activity claimed to be infringing or based on facts or circumstances from which infringing activity is apparent, regardless of whether the material or activity is ultimately determined to be infringing.

(2) Exception.-- Paragraph (1) shall not apply with respect to material residing at the direction of a subscriber of the service provider on a system or network controlled or operated by or for the service provider that is removed, or to which access is disabled by the service provider, pursuant to a notice provided under subsection (c)(1)(C), unless the service provider--

(C) replaces the removed material and ceases disabling access to it not less than 10, nor more than 14, business days following receipt of the counter notice, unless its designated agent first receives notice from the person who submitted the notification under subsection (c)(1)(C) that such person has filed an action seeking a court order to restrain the subscriber from engaging in infringing activity relating to the material on the service provider's system or network.
(source []) [emphasis all mine]

You'll need to read the rest of the cited sections to get a complete understanding of the process, but here's how it goes:

  1. Sally Sharer posts a video on YouTube.
  2. B.I.G., Inc. sends a DMCA takedown notice to YouTube.
  3. YouTube removes the video.
  4. Sally sends a counter-claim to YouTube.
  5. YouTube MUST (see 512(g)(2)(C) above, baby!) restore Sally's video.
  6. B.I.G., Inc. must file in court to take any further action.

So when YouTube gets a takedown notice and the person counter-claims, 'Tube has a choice: either restore the video or be liable for a shiny new lawsuit.

Rates will drop significantly. (1)

nurb432 (527695) | about 7 years ago | (#20059935)

As will its userbase.

Re:Hard AI ftw (3, Insightful)

Nasarius (593729) | about 7 years ago | (#20057177)

There are already several decent systems [] for fingerprinting audio; it's not particularly surprising that Google researchers would be able to do something similar for video.

Re:Hard AI ftw (1)

packetmon (977047) | about 7 years ago | (#20057185)

Depending on what they release. If it's something akin to a facial recognition system, I would think that altering every few frames in ways unseen to the naked eye, or perhaps adding slight snow, could throw off any facial-recognition-like software Google could throw out. Another method, if this is the case, would be to throw a lens filter over the video. E.g., assume Google's software is set to compare a current video with known content. How would it achieve this? Perhaps light spectrums, image positioning... Shift every so many frames, throw in a slight tint, and it should be a wrap.

Re:Hard AI ftw (1)

Cedric Tsui (890887) | about 7 years ago | (#20057867)

Well, you could always start with the sound alone, which has already been done.

As for image recognition, usually this is done using Fourier transforms as an edge-finding algorithm. Basically, you can use a filter to throw out the bulk of the information and keep only the most visually identifying features in the video. Changes in tint or timing won't affect the shapes and movements of these predominant edges.

Umm. To visualize this, imagine you take a full-colour photo and trace it as best you can with a Sharpie. Much, much simpler, but still identifiable.

That's my guess anyhow. This would probably result in a number of false positives, but maybe they'll just flag certain videos and have a person make the final call if they are to be canned.
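A toy version of the edge idea above, hypothetical and much cruder than a real Fourier-based system: keep only the places where the brightness gradient is strong, so a uniform tint (which shifts every pixel equally) leaves the signature untouched.

```python
def edge_signature(frame, thresh=30):
    """Binary edge map of a grayscale frame (2-D list of 0-255 ints):
    1 wherever the horizontal brightness jump exceeds `thresh`."""
    return [[1 if abs(row[x + 1] - row[x]) > thresh else 0
             for x in range(len(row) - 1)]
            for row in frame]

def signature_distance(a, b):
    """Fraction of edge cells where two signatures disagree."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    return sum(x != y for x, y in zip(fa, fb)) / len(fa)

frame = [[0, 100, 100],
         [50, 50, 200]]
tinted = [[v + 20 for v in row] for row in frame]  # uniform brightening
print(edge_signature(frame) == edge_signature(tinted))  # True
```

As noted below, deliberately injected contrasting edges would pollute exactly this kind of signature.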

Re:Hard AI ftw (1)

digitalchinky (650880) | about 7 years ago | (#20065739)

The method you describe can still be defeated quite easily just by adding randomly placed contrasting edges throughout the video. These don't even need to be particularly visible, just enough to throw off the transform method used.

In the RADAR world, electronic warfare weenies spend their lives fingerprinting emitters. Coming up with seemingly different hardware can be as simple as tapping on the klystron with a screwdriver (or a hammer for those fun situations where you can safely say "it just fell apart in my hands chief").

Re:Hard AI ftw (1)

lilomar (1072448) | about 7 years ago | (#20057219)

Why does it have to be hard AI?
Wouldn't some sort of soft AI (expert-system, neural-net) do just as well?
I could be wrong, but doesn't "hard" AI refer to a system that is conscious?
Why would you have to be conscious to recognize movie clips?

Re:Hard AI ftw (3, Insightful)

Hatta (162192) | about 7 years ago | (#20057383)

Mostly it's an assumption. Since only humans have so far been able to recognize video, it must take something like a human to do it. Viewing clip A and clip B and abstracting the parts that make them the same is not the kind of problem computers are good at, and if you've done it you've probably solved the really hard problem in AI: how abstraction works. Of course this rests on the assumption that strong AI is fundamentally different from weak AI, and the difference is not just one of degree. Personally I think that's a specious distinction, much like the one between microevolution and macroevolution.

Re:Hard AI ftw (-1, Troll)

Anonymous Coward | about 7 years ago | (#20061535)

Personally I think that's a specious distinction much like that between microevolution and macroevolution.

But isn't *all* of evolution pretty much just a theory at this point, when you think about it on some level in any case?

Re:Hard AI ftw (2, Informative)

Hatta (162192) | about 7 years ago | (#20061793)

Sure it is. So is the heliocentric model of the solar system, Einstein's theory of special relativity, and quantum electrodynamics. It's just one of the best tested, best supported, and most theoretically fruitful theories we have.

Re:Hard AI ftw (1)

techiemikey (1126169) | about 7 years ago | (#20057369)

Or it could check the audio for a small segment (say, a 15-second clip), convert the sound waves to a checksum, and compare that to a database of "blacklisted" checksums. Then, if it is within an acceptable margin of error, have a few parts of the video it will look for, also within a margin of error. The margin of error can actually be quite large, to account for missing frames, changes in format, etc., depending on how much data they decide to judge by.

Re:Hard AI ftw (1)

porkThreeWays (895269) | about 7 years ago | (#20057591)

The article said Google is already doing checksums and they aren't ideal. Creating fingerprints wouldn't be impossible. Very difficult, yes, but not impossible. We should also note that Google employs some of the smartest and most creative people in the world.

Off the top of my head... I would throw out the whole notion of "checksums". They aren't really applicable, because their purpose is to compare exactly. Even with keyframes, all one would have to do is lighten/darken the video and the whole checksum changes. You'd need to create a "profile" for the video. Divide the frame into contiguous regions and get the area of each region. Store those values as fields in the profile. Find each region's color in relation to the other regions and give it a relative value (i.e., 8% darker than contiguous region #1). If everything is relative, it would defeat simple lightening/darkening of the video. Compare and contrast this with the frames before and after, and compare multiple frames before and after to handle frame skips. You want to completely ditch the idea of checksumming and embrace the idea of relativity.

Create a similar audio profile, except with frequency and amplitude, and again the key is to make sure everything is relative, i.e., the 440 Hz frequency is 10% louder than the 660 Hz one, etc. Once you have many relative variables to compare and contrast, you can come up with a hit score for how similar two videos are and filter based on a reasonable score. This was just some schmuck at work thinking out loud, so imagine what someone wayyyyy smarter than me, with a lot more time, paid a lot more, could come up with...
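The relative-profile idea reduces to dividing each region's brightness by a reference region. A minimal sketch, under the assumption that lightening/darkening acts multiplicatively on region brightness (for a purely additive shift you would compare differences instead):

```python
def relative_profile(region_means):
    """Express each region's mean brightness relative to the first region,
    so scaling every region by the same factor cancels out."""
    base = region_means[0]
    return [m / base for m in region_means]

# Darkening the whole frame to 80% brightness leaves the profile intact:
print(relative_profile([100, 80, 120]))  # [1.0, 0.8, 1.2]
print(relative_profile([80, 64, 96]))    # [1.0, 0.8, 1.2]
```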

Re:Hard AI ftw (1)

Hal_Porter (817932) | about 7 years ago | (#20057655)

Maybe you could see from the viewer pattern if it contains something interesting to a wide audience.

E.g. if it contains copyrighted material it should get posted to an indexing site. That should bring people from all over the world. Then you tag it and get a human to watch it and check if it is copyrighted. Whereas original material is probably viewed by a small circle of friends.

Just dumbly checking the popular stuff would help a bit, but I think you need to look at referrer information, or the location of the IP address. Another possibility would be check for patterns of cuts, pans and lighting to see if they look professional. Or if they match a finger print of a known copyrighted work.

Re:Hard AI ftw (1)

Kjella (173770) | about 7 years ago | (#20057701)

Google doesn't need to solve the hard problem. They can spend several decades "improving" the filter, as long as they can nab some new trivial changes, then go tell the content producers "hey, we're getting smarter but the pirates are too". Besides, if they can force circumvented videos to look at least somewhat crappy, that's already a win for the content producers... I certainly get annoyed by that quickly.

instant circumvention - just add random noise! (1)

simplerThanPossible (1056682) | about 7 years ago | (#20058735)

eg. in a gimp plug-in.

But yes, it will reduce it. By how much?
I don't know. Maybe a lot.

Re:instant circumvention - just add random noise! (1)

Anonymous McCartneyf (1037584) | about 7 years ago | (#20062817)

YouTube videos are already relatively low-res. It's a fine balance: if you circumvent with random noise, you've got to make sure that the resulting video is still watchable by human beings, or you are circumventing to no purpose.

Re:instant circumvention - just add noise! (1)

simplerThanPossible (1056682) | about 7 years ago | (#20062955)

Yes, you're right. Also, it's surely compressed, and random noise will hurt the compression (the compressed size goes up).

However, I think only a little noise would be necessary. E.g., +1 or -1 brightness on one pixel per frame.

Hmmm... or maybe just different compression parameters? This changes the artifacts introduced into the video, and so the uncompressed video would look slightly different.

It's not hard AI (1)

the_povinator (936048) | about 7 years ago | (#20058787)

It's not as hard a problem as you think. There are algorithms that can recognize a short segment of video even after re-encoding. It's like iris recognition: they can recognize your iris even though the new image may be rotated and scaled. It's all about transforming the image so that these kinds of trivial transformations disappear, and then comparing the result against a bank of pre-stored images.

Re:Hard AI ftw (1)

j1mmy (43634) | about 7 years ago | (#20059635)

The easiest way to circumvent such a system is to upload your video elsewhere.

Re:Hard AI ftw (1)

Paulrothrock (685079) | about 7 years ago | (#20060263)

This is trivially easy to work around. Just put a description of the video at the beginning or end, or use freely available video editing software to make a scrolling image across the bottom. Heck, you could just insert a couple black frames randomly across the film. Or scale it down and put it on a background of white noise.

Checksums are horrible ways of checking non-text data.

Re:Hard AI ftw (1)

Warbothong (905464) | about 7 years ago | (#20061595)

My AI lecturer keeps saying that AI is a tough area to work in: you get loaded with every problem that regular software developers can't figure out ("Sorry, you need AI for that"), but then after a solution is found you never get any credit ("That's not AI, that's just a database-backed checksum comparison and Fourier waveform analysis system"). Where the line around what counts as AI gets drawn depends on whether effort or praise is involved.

Re:Hard AI ftw (1)

UbuntuDupe (970646) | about 7 years ago | (#20061731)

Heh ... the former is a pet peeve of mine: people thinking that their job is more complicated than it really is. It's not limited to software developers.

As for the latter...

"Okay, okay, you *technically* passed a Turing Test, but only by having it basically ignore me and ridicule everything I did wrong."

"Sir, you were talking to your wife the whole time."

Re:Hard AI ftw (0)

Anonymous Coward | about 7 years ago | (#20061789)

Video fingerprinting technology does exist, and it's not an AI problem or a hashing problem. Philips has developed a robust system for this sort of application. Others have as well. YouTube is/was using Audible Magic's audio fingerprinting system to identify the audio track associated with a user submitted video and determine whether it was copyrighted and should be blocked. It is unclear whether Google is developing their own video fingerprinting technology or licensing technology from a third party.

Re:Hard AI ftw (0)

Anonymous Coward | about 7 years ago | (#20063757)

It's actually not as hard as you think to detect arbitrarily encoded music/video. Check out [] for a summary.

Re:Hard AI ftw (1)

heinzkunz (1002570) | about 7 years ago | (#20064537)

When you write "checksum", do you think of an MD5 hash?

Google could just re-encode uploaded videos with a very low resolution (say 8*8 pixels) and use the result as a fingerprint. This is trivial to implement and makes re-encoding useless. I guess that cropping, stretching and many other modifications are detectable as well without tackling any AI problems at all.

Google is certainly able to make uploading of banned videos at least very inconvenient.
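The low-resolution idea above is essentially a perceptual "average hash". A hypothetical pure-Python sketch (real systems use proper image libraries): block-average the frame down to 8x8, then keep one bit per cell depending on whether it is brighter than the thumbnail's mean, so the 64-bit result survives re-encoding noise that a byte-level checksum would not.

```python
def thumbnail_fingerprint(frame, size=8):
    """Average-hash-style fingerprint: shrink a grayscale frame (2-D list
    of 0-255 ints) to size x size by block-averaging, then emit 1 for each
    cell brighter than the thumbnail's overall mean, else 0."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // size, w // size
    cells = [sum(frame[y][x]
                 for y in range(by * bh, (by + 1) * bh)
                 for x in range(bx * bw, (bx + 1) * bw)) / (bh * bw)
             for by in range(size) for bx in range(size)]
    mean = sum(cells) / len(cells)
    return [1 if c > mean else 0 for c in cells]

# A 16x16 frame, dark on the left half and bright on the right, maps to
# eight rows of 0 0 0 0 1 1 1 1, and mild per-pixel noise doesn't move it.
frame = [[0] * 8 + [255] * 8 for _ in range(16)]
print(thumbnail_fingerprint(frame) == [0, 0, 0, 0, 1, 1, 1, 1] * 8)  # True
```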

Re:Hard AI ftw (1)

Anonymous Cowdog (154277) | about 7 years ago | (#20066345)

The really surprising thing about your post is it sounds like you think Google is stupid. I doubt you really believe that.

>They can just trivially re-encode.

No. You're thinking of cryptographic hashes where a one-bit change in the input leads to a totally different signature. This wouldn't be that kind of hash. It would most likely be a collection of a lot of hashes for each video, amalgamated into one or more signatures for each video.

If I explained the basics of the problem to my eight year old and pointed her in the right direction toward the solution, she would be able to figure it out.

>To prevent that, it seems, you would need to solve a hard-AI problem

Not really. There are simple techniques for doing robust signature creation and checking, without solving any hard AI problem.

Re:Hard AI ftw (1)

Anonymous Cowdog (154277) | about 7 years ago | (#20066361)

And before you say that a collection of hashes would be vulnerable to a re-encoding attack: no, they wouldn't be, if they were hashes of the right features of the video. You don't hash the bits of the file; you hash properties of the video. Properties being things like: is this pixel/group of pixels surrounded by darker or lighter pixels/groups of pixels in each compass direction. Similar approach with audio.

separation of the web (4, Insightful)

4solarisinfo (941037) | about 7 years ago | (#20057065)

As soon as Google stops indexing/posting material people want (legal or not), people will stop using Google. I believe they know what a fine line they're walking between 'do no evil' and survival here; I wonder which will prevail?

Re:separation of the web (-1, Offtopic)

Anonymous Coward | about 7 years ago | (#20057123)

Some people say cucumbers taste better pickled.

Re:separation of the web (1)

HitekHobo (1132869) | about 7 years ago | (#20057183)

I'm not sure I'd complain overly much if Google went back to being 'that obscure search engine that knows all about OSS'. I'm sure their stockholders would though.

Re:separation of the web (2, Informative)

drrck (959788) | about 7 years ago | (#20057203)

I've posted numerous video clips to Google Video and YouTube. I recently received 3 e-mails from Google telling me that I had been flagged by the copyright holder. Subsequently, I have stopped using Google Video.

Re:separation of the web (3, Insightful)

4solarisinfo (941037) | about 7 years ago | (#20057267)

For the unofficial record: are you the copyright holder of these clips, or could they be legitimate requests?

Re:separation of the web (1)

drrck (959788) | about 7 years ago | (#20057319)

I'm clearly not the copyright holder. But that's not the point I was attempting to make. They were clips of Robot Chicken from Adult Swim. I've just made my choice based on the reactions of the two services.

Re:separation of the web (2, Insightful)

4solarisinfo (941037) | about 7 years ago | (#20057411)

Soo... the system works, and "do no evil" gets a point.

Re:separation of the web (-1, Troll)

Anonymous Coward | about 7 years ago | (#20057469)

Upholding copyright law IS doing evil, jackass. []

Re:separation of the web (1)

CopaceticOpus (965603) | about 7 years ago | (#20057881)

But calling someone a jackass because they disagree with you is doing good?

I'm against long term copyrights, but I'm for showing basic respect to others. I think the most damaging thing to the cause of copyright reform is the childish behavior of its supporters.

Re:separation of the web (0)

Anonymous Coward | about 7 years ago | (#20057903)

Get a clue, dumbass. Once you leave mom's basement and need a job, you will realise how much effort goes into producing quality entertainment. Leeches and thieves like you belong in a cell.

Re:separation of the web (1)

QuoteMstr (55051) | about 7 years ago | (#20060711)

Rant all you want, but society has spoken. I'm 22. Damn near everybody in my generation doesn't consider file trading wrong. Assuming that we maintain our republic, the government will eventually respond to democratic opinion and weaken copyright. Corporate content creators will have to adapt or perish.

Really, the only thieves are the powerful corporations who maintain an ironclad, if weakening, grip on what ought to be in the public domain by now anyway.

Re:separation of the web (3, Interesting)

radish (98371) | about 7 years ago | (#20064899)

You think? Here's a history lesson. The generation currently in power were all pot-smoking hippies back in the 60's and 70's, but anti-drug laws just keep on getting stronger. Teenagers and 20-somethings have always had this wacky belief that they are heralding a new way of thinking, and eventually when all the old gas-bags die we'll have a utopia. Somehow it never seems to happen.

Re:separation of the web (1)

DreadPiratePizz (803402) | about 7 years ago | (#20058489)

I believe they know what a fine line they're walking between 'do no evil' and survival here; I wonder which will prevail?

How exactly is automatically removing copyrighted content that shouldn't be posted in the first place evil?

Re:separation of the web (1)

4solarisinfo (941037) | about 7 years ago | (#20058727)

How exactly is automatically removing copyrighted content that shouldn't be posted in the first place evil?

I believe you missed my point.

"Do no Evil" - company motto, remove all copyrighted content
Survival - being the engine people use to find things

I was trying to point out that these may be opposing forces in this case: the more they remove, the less people will use Google to search.

Re:separation of the web (1)

Sancho (17056) | about 7 years ago | (#20065161)

It depends upon how far they go. Don't forget Fair Use.

In Soviet Russia (0)

Anonymous Coward | about 7 years ago | (#20057091)

Gootube fingerprints you!!

Oh wait.....

Obfuscation? (3, Funny)

HitekHobo (1132869) | about 7 years ago | (#20057129)

Can I just filter the video to change the general shape and size of the content and scribble all over it until humans can't recognize it? Seems to work for websites that require a signup... I had one the other day that took 4 people and 5 attempts to actually sign up.

Two-part Protection (3, Interesting)

RancidPickle (160946) | about 7 years ago | (#20057195)

One part is the same -- someone spots "their" video, and Google takes it down immediately to avoid getting sued under the DMCA. Expect the takedown notices to continue, which will still kill parody videos, fair-use-compliant videos, and perfectly legal videos hit by bogus takedown notices, such as the ones that Viacom "accidentally" included in its original request.

The second part sounds more promising, but someone may be able to get around the hashing, such as by inserting random one-frame images (as in the movie Fight Club), adding overlay text, or applying effects. If they try to hash only a few selected time slices, someone will figure that out eventually. As with all digital protection, this just pushes off the inevitable. At least it will make Google look good in court, since they're attempting to comply with Viacom's and the other copyright holders' requests not to post their material.

In the end, it won't count for much. It would make more sense to add protections against false or malicious takedown notices, such as a $50K fine for false claims. That would at least make the big companies scrutinize the videos they're issuing takedown notices for.
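The fragility the parent describes is easy to demonstrate. Here's a toy Python sketch (an 8x8 grid of made-up pixel values; nothing to do with Google's actual system) contrasting an exact cryptographic hash, which any overlay or re-encode defeats, with a crude perceptual "average hash" that survives a uniform brightness tweak:

```python
import hashlib

# Toy 8x8 "frame" of grayscale values (0-255); purely illustrative.
frame = [(x * 7 + y * 13) % 256 for y in range(8) for x in range(8)]

def exact_hash(pixels):
    """Cryptographic hash of the raw bytes: any change alters it completely."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Tiny perceptual hash: one bit per pixel, set if above the frame's mean."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

# Simulate a mild overlay: brighten every pixel by 1 (invisible to a viewer).
shifted = [p + 1 for p in frame]

print(exact_hash(frame) == exact_hash(shifted))      # False: exact match broken
print(average_hash(frame) == average_hash(shifted))  # True: perceptual hash survives
```

Real fingerprinting systems extend this idea to features that track human perception (scene cuts, motion, color over time), so that re-encoding, cropping, or overlays still produce a near-match rather than a miss.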

Re:Two-part Protection (1)

kebes (861706) | about 7 years ago | (#20057615)

Yeah the users can probably circumvent it... but will they bother?

My impression is that most of the unauthorized clips on YouTube are put there by fans of the shows in question. They do it because it is easy and fun, and they want to share the thing they like with others. I don't think they will continue to bother if it becomes onerous to post the clips (requiring constant editing, posting, re-editing, re-posting, etc.). This is different from P2P networks, where once someone goes to the effort of making a nice copy, it can spread indefinitely. Because YouTube is a closed system, people will get bored with constantly over-coming it. (Or just get the content on P2P, where you can get full-length, full-quality versions.)

Of course, given that these clips are posted by enthusiastic fans, I really wonder why companies are so worried. It seems like they generate more ill-will than anything. From my random sampling, the number of full-length shows posted on YouTube has decreased significantly. What I see now are short clips from shows. I suppose someone who watches 20 random clips from Family Guy may be less inclined to buy the DVD ("I've seen all the best jokes!"), but I suspect the opposite: that it increases exposure and hence increases sales.

Anyways, my suspicion is that by making it annoying to post clips from shows, they will indeed succeed in decreasing the number of such clips. However I believe this will ultimately back-fire (from the point of view of the big media companies), because this will just make the huge YouTube viewer base switch to watching more "original YouTube" content, and the growing number of shows that actively post material on YouTube... with a corresponding decreased reliance on "old media" to fill our need for art and entertainment.

Re:Two-part Protection (0)

Anonymous Coward | about 7 years ago | (#20061395)

I'm guessing this system would prevent the uploading of such movies as A Fair(y) Use Tale [] , which is a legitimate original work made entirely out of snippets from Disney movies. Sure, it's a rare case, and Google is in no way obligated to make any effort whatsoever to allow such videos on their site (indeed, it's rather unlikely they can build a computer system that detects fair use), but it would still annoy me.

As "sophisticated" as FBI fingerprinting? (2, Insightful)

dpbsmith (263124) | about 7 years ago | (#20057199)

We don't want "sophistication," we want reliability.

And since they are making the comparison... just how reliable [] are fingerprints, really?

True, a character in Mark Twain's 1893 novel Pudd'n'head Wilson tells a court

"Every human being carries with him from his cradle to his grave certain physical marks which do not change their character, and by which he can always be identified -- and that without shade of doubt or question. These marks are his signature, his physiological autograph, so to speak, and this autograph cannot be counterfeited, nor can he disguise it or hide it away, nor can it become illegible by the wear and mutations of time. This signature is not his face -- age can change that beyond recognition; it is not his hair, for that can fall out; it is not his height, for duplicates of that exist; it is not his form, for duplicates of that exist also, whereas this signature is each man's very own -- there is no duplicate of it among the swarming populations of the globe! This autograph consists of the delicate lines or corrugations with which Nature marks the insides of the hands and the soles of the feet."

and ever since Mark Twain said so everyone has believed it, but that doesn't necessarily make it true.

Re:As "sophisticated" as FBI fingerprinting? (3, Informative)

westlake (615356) | about 7 years ago | (#20058261)

And since they are making the comparison... just how reliable are fingerprints, really?

The Newman link is from 2001.

The judge who decided the original Llera-Plaza motion, which is discussed and critiqued in the following article, reversed himself on March 13, 2002, holding that expert evidence of a "match" was admissible. Judge Pollak had granted the Government's motion for a reconsideration that is mentioned above, and he also reopened the record to hear additional testimony for the prosecution as well as for the defense. In reversing himself in a 60-page opinion, Judge Pollak stated, in part, "In short, I have changed my mind." The Reliability of Fingerprint Evidence: A Case Report []

You'll find links here to many articles on Identification Evidence. For example: Phenotype vs Genotype: Why Identical Twins Have Different Fingerprints []

Re:As "sophisticated" as FBI fingerprinting? (1)

Skreech (131543) | about 7 years ago | (#20058517)

We don't want "sophistication," we want reliability.
We want? Hell, I don't want this. If it satisfies the court case then fine; if it doesn't, it's still not my problem really. Heh.

Dumb. Really dumb. (4, Insightful)

Anonymous Coward | about 7 years ago | (#20057207)

The supposedly clever media moguls are missing a wealth-building opportunity. Lots of these "infringing videos" are short clips from longer presentations. If they had any smarts at all, they'd ask Google to set up a link on those pages where people could buy the programs/music on disk, or direct download them for a fee. Instead, the moguls want to get rid of what amounts to "free advertising" because they fear the new paradigm.

Re:Dumb. Really dumb. (1)

Opportunist (166417) | about 7 years ago | (#20057603)

Apparently you never had a boss who would shoot down every idea that isn't his own. Usually, businesses with a boss like that go under in pretty short order.

In other words, I have my hopes up that we might get rid of them pretty soon.

Re:Dumb. Really dumb. (1)

SCHecklerX (229973) | about 7 years ago | (#20057709)

Bingo! I already posted, or I would mod you up.

Why isn't Google fighting this out in court? (1)

TubeSteak (669689) | about 7 years ago | (#20057239)

When notified, the company removes the offending videos, in compliance with the American Digital Millennium Copyright Act.
The trouble with the first system is that neither Google nor the copyright holders can possibly keep up with the vast number of copyrighted videos uploaded each day.
What exactly is the compelling legal argument that spawned three lawsuits?
That GooTube isn't complying with the DMCA?
That compliance with the DMCA isn't enough?

Depending on your POV, the 'right' thing to do is either to create new filters (business), or to try and win the lawsuits (users).

Re:Why isn't Google fighting this out in court? (3, Insightful)

eclectro (227083) | about 7 years ago | (#20057601)

Well, it's the cost of fighting a copyright battle, and also the dark possibility that the judge would side with the copyright holders, as judges almost invariably do these days.

Take that, and the fact that Google is a big fat cash cow with a bulls-eye on the side of it, and it becomes obvious that the best strategy is one of accommodation, rather than a long drawn-out battle that would also hurt Google's stock price because of the uncertainty it creates.

So any way you cut it, this looks like the best route for them to take. Maybe Google could throw some lobbyists at Congress to address the abuse that copyright holders are getting away with.

Just the start (4, Funny)

sjonke (457707) | about 7 years ago | (#20057287)

YouTube Video Strip-Searching is due in January '08

Re:Just the start (1)

Opportunist (166417) | about 7 years ago | (#20057641)

Oh, I'm fairly sure searching YouTube for stripping videos has been going on for as long as the site has existed.

Anonymous uploads, here we come... (1)

Adeptus_Luminati (634274) | about 7 years ago | (#20057403)

Alright kiddies,

Step 1. Get out your laptop and your random MAC address generator toolkit.
Step 2. Drive down some random street until you find...
Step 3. A neighbour with unprotected WiFi (or just crack their non-WPA2 connection).
Step 4. Carry on and upload your Simpsons episodes to YouTube.
Step 5. Cause profit (loss) for the Simpsons' rights holders.


Re:Anonymous uploads, here we come... (1)

techiemikey (1126169) | about 7 years ago | (#20057461)

You forgot step 3.3: sign up for e-mail addresses and a YouTube account so those cannot be traced to you. Also, step 5 is not a proven profit loss by any stretch of the imagination. Oftentimes the teasers and clips people post on YouTube make more people want to watch the movie.

Re:Anonymous uploads, here we come... (1)

vigmeister (1112659) | about 7 years ago | (#20057541)

Step2. Drive down some random street until you find...
Step3. A neighbour with unprotected WIFI (or just crack their non-WPA2 secure connection)
Call me a pedant, but do you not find it disconcerting to have neighbours on random streets? Maybe you meant they were somebody else's neighbour...


'infringement' (2, Interesting)

SCHecklerX (229973) | about 7 years ago | (#20057677)

Same as with music. If people are going to buy it, they will. Just charge a fair price. Use youtube as advertisement for commercial interests (daily show, colbert report, robot chicken, anyone?)

But YouTube is a little different, in that many of the things people go there for are unique, one-time events: the only way you'll ever see them again is if you recorded them yourself, or somebody else did and you're lucky enough to find the clip online.

The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is Prince performing at half time for the Super Bowl. Now, not only are the videos gone from YouTube, but so are all of the comments about them (which IMHO are equally valuable to the community).

Taking things like this down erodes our culture and destroys valuable records of what has gone on in our lives.

Re:'infringement' (1)

tlhIngan (30335) | about 7 years ago | (#20058113)

The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is prince performing at half time for the superbowl. Now, not only are the videos gone from youtube, but also all of the comments (which IMHO are equally as valuable to the community) about the videos.

<stupid question>I seem to remember that one could buy DVD sets of the Super Bowl, no? Wouldn't said DVD sets include the half-time show?</stupid question>

I only ask because I have seen them for sale (including a humongous box set that included like every Super Bowl ever). However, not being an NFL fan (or liking football), and being in Canada (meaning we get crappy ads), means I don't watch it. I'm sure someone out there sells DVD sets of Super Bowl half-time shows as well. (On a limb here - does someone sell the ads?)

Re:'infringement' (1)

grassy_knoll (412409) | about 7 years ago | (#20058941)

The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is prince performing at half time for the superbowl.

Isn't not being able to see Prince perform a feature? ;)

Why is Google doing this? (1)

QuoteMstr (55051) | about 7 years ago | (#20057763)

Google is already complying with the letter of the DMCA. What right does any other organization have to compel Google to go beyond that and spend a fortune creating a video fingerprinting system? Google ought to be fighting this, not bending over and doing whatever large media companies demand.

Doing this is manifestly against the interest of the people who made Google what it is today. What happened to doing no evil?

Re:Why is Google doing this? (0)

Anonymous Coward | about 7 years ago | (#20058595)

Maybe Google realises that content providers spend billions making content that people like you expect to take for nothing, and that preventing copyright infringement is the exact opposite of evil.
Sorry to burst your communist bubble.

Just had my first experience with this on Soapbox (1)

rock603 (1074230) | about 7 years ago | (#20058121)

Funny- I can't post a home video with copyrighted background music on MSN Soapbox - they refused to publish it. But it works great in GooTube. A previous writer hit the nail on the head - when Google pulls down all of the illegal content, GooTube will turn into a GooGhostTown...

What next? (1)

wanderingknight (1103573) | about 7 years ago | (#20058163)

Are they gonna sue Google because its search engine aids in the acquisition of copyrighted content (like when you search for torrents)?

Google loses Common Carrier Protection (?) (1)

xmas2003 (739875) | about 7 years ago | (#20058199)

Phone companies and ISPs claim that since they are "Common Carriers" just carrying content from other people, without any knowledge or filtering of that content, they can't be sued if naughty/illegal stuff is carried on their wires. If Google has pro-active measures in place to filter submissions, then won't they lose that protection and become suable for anything that slips through?

Re:Google loses Common Carrier Protection (?) (1, Informative)

Anonymous Coward | about 7 years ago | (#20058471)

Um, no. A carrier passes traffic. YouTube hosts content. A big difference.

Faster than you can say Napster... (1)

Xian97 (714198) | about 7 years ago | (#20058457)

People will find another outlet. If the content that people are looking for is not on youtube then they will abandon the site for somewhere else.
The content industry should take a lesson from the past. Right now they have a large concentration of people looking at grainy, low-resolution video in one place. Remove that and the sites will go underground, maybe with even better quality video, which would be a real threat to their model. They should take the opportunity to promote their product instead: here is the low-resolution preview, click here for a clean high-res version for a small fee. Make a tie-in to portable device downloads. There are lots of possible marketing opportunities. However, once they shut down one of the primary reasons for visiting YouTube and the content becomes decentralized, all of that becomes a lot harder to do.

The Shape Of Things To Come (0)

Anonymous Coward | about 7 years ago | (#20058877)

Another popular media site, imeem [] , has been using audio fingerprinting on user content for a while now. They have deals with the usual clutch of indie labels and, more interestingly, a deal with Warner Brothers (who up until a month ago were suing imeem for copyright violations). The main draw of the site is 'YouTube for music', and I guess imeem has some process to detect what is licensed and what isn't, because some uploads are turned into 30-second previews if they're not covered by one of imeem's deals.

If you're the kind of person who finds themselves wasting all their time surfing youtube you had better stay away from imeem because it will suck you in forever.

Another Consideration (1)

Stanislav_J (947290) | about 7 years ago | (#20058949)

Thing is, the content owners will have to provide the "fingerprint," or at least the video from which the "fingerprint" will be made. Now, while I can see the bigger corporations taking the time, effort and money to do this for current and recent popular TV shows and movies, a lot of the more obscure stuff is still going to be around. Part of the fun and appeal of sites like YouTube is getting a chance to see or re-see more esoteric (i.e., not very commercially viable) clips -- old TV shows with a cult following, but no chance of being released on DVD; old commercials for no longer existent products; continuity clips (promos, station IDs, etc.); old local newscasts, etc. The rights to all of these things are still held by someone, somewhere, but much of it will still be up on YT -- just the stuff that no one (except for the small base of fanatics that appreciate it) cares about. I mean, is someone going to "fingerprint" every episode of "Hello, Larry" or some equally stinkeroo old show just in case someone uploads a few episodes that they recorded on their Betamax way back when?

Fair Use and the Justice Droid (1)

HTH NE1 (675604) | about 7 years ago | (#20059147)

Of course, such a fingerprinting scheme ignores fair use. If a match is found, the video will automatically be flagged as infringing without concern for context and prevented from being uploaded without any oversight.

It's like being found guilty for murder because your fingerprints were found at the scene... on some groceries you bagged for the victim at the supermarket a week earlier, but that is of no concern to the justice droid.

Who cares? (0)

Anonymous Coward | about 7 years ago | (#20059355)

Someone will just create another video site and host it from an oil derrick somewhere in international waters.

It's time for NEW-Tube!! (tm)

watermark with hue shift (1)

192939495969798999 (58312) | about 7 years ago | (#20059571)

You could easily beat any exact hash of a video with a watermark hue shift of, say, 1 value in the red channel. The video would look identical to the naked eye, but each and every frame would differ from the original, so a hash of the exact video data would no longer match. Is this not how it would work?
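For an exact byte-level hash, that's exactly how it would work. A minimal Python illustration, using a made-up 4x4 RGB frame as a stand-in for real video data:

```python
import hashlib

# Toy 4x4 "frame" of (R, G, B) pixels; a hypothetical stand-in for video data.
frame = [((x * 16 + y * 4) % 256, 128, 64) for y in range(4) for x in range(4)]

def frame_digest(pixels):
    """Exact hash over the raw pixel bytes."""
    flat = bytes(channel for pixel in pixels for channel in pixel)
    return hashlib.sha256(flat).hexdigest()

# "Watermark": shift the red channel of every pixel by a single value.
marked = [((r + 1) % 256, g, b) for (r, g, b) in frame]

# The frames are visually indistinguishable, yet the digests share nothing.
print(frame_digest(frame) != frame_digest(marked))  # True
```

Which is why any scheme meant to survive such tricks has to fingerprint perceptual features rather than raw bytes, matching approximately instead of exactly; whether Google's September system does that robustly is the open question.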

Wide open for abuse (2, Insightful)

AmiMoJo (196126) | about 7 years ago | (#20059579)

How is this going to work? Will Google process all copyrighted videos themselves and produce the necessary data to block them? If so, what is the backlog going to be when big media submits 90 years of video?

If Google are not going to check it, what is to stop me downloading a Quicktime trailer of a movie, generating the data and submitting it to Google for blocking? It will quickly become impossible for even sanctioned videos to appear. Cultists/Scientologists will be screwed too.

As usual, media companies are being idiots. They panicked about the VCR, they panicked about P2P, and they're panicking about DVRs and YouTube. In the end, new technology tends to do them good in the long run, and besides, you can't fight it.

Kadokawa holdings deal (1)

Frumply (999178) | about 7 years ago | (#20059679)

Wonder if it's got anything to do with Kadokawa Group's [] deal with Google. I found it amusing that they were helping develop this kind of system for Google, since they're probably one of the few groups that have really gotten some sales/publicity boost from user-generated content on Youtube and NiconicoVideo [] .