
"Digital Universe" Enters the Zettabyte Era

CmdrTaco posted more than 4 years ago | from the less-than-half-porn dept.

Data Storage 137

miller60 writes "In 2010 the volume of digital information created and duplicated in a year will reach 1.2 zettabytes, according to new data from IDC and EMC. The annual Digital Universe report is an effort to visualize the enormous amount of data being generated by our increasingly digital lives. The report's big numbers — a zettabyte is roughly a million petabytes — pose interesting questions about how the IT community will store and manage this firehose of data. Perhaps the biggest challenge isn't how much data we're creating — it's all the copies of it. Seventy-five percent of all the data in the Digital Universe is a copy, according to IDC. See additional analysis from TG Daily, The Guardian, and Search Storage."


Who cares? (0)

Anonymous Coward | more than 4 years ago | (#32085336)

A zettabyte is more data than you generate during your whole lifetime. It's pointless to have so much space.

Re:Who cares? (4, Insightful)

Stooshie (993666) | more than 4 years ago | (#32085396)

Maybe you won't, but then you are not CERN or the Hadron Collider.

Re:Who cares? (1)

wallyhall (665610) | more than 4 years ago | (#32085410)

A zettabyte is more data than you generate during your whole lifetime. It's pointless to have so much space.

For those wondering: 1,000,000,000,000,000,000,000-ish bytes, or some 10^21.

Re:Who cares? (1, Informative)

Anonymous Coward | more than 4 years ago | (#32086256)

Yes, or roughly a million petabytes, where a million is roughly 10^6 and peta is roughly 10^15.

I'll make roughly one post about this matter.

Re:Who cares? (1)

FreeUser (11483) | more than 4 years ago | (#32085424)

A zettabyte is more data than you generate during your whole lifetime. It's pointless to have so much space.

Speak for yourself.

Re:Who cares? (4, Funny)

natehoy (1608657) | more than 4 years ago | (#32085716)

Yes, 640 petabytes ought to be enough for anybody.

Re:Who cares? (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#32086044)

My hot, fresh-brewed coffee shot out of my nose. It still burns a little, but that joke was totally worth it.

Re:Who cares? (1)

natehoy (1608657) | more than 4 years ago | (#32086092)

Glad I could assist in the steam cleaning of your nasal cavities.

Just another service we offer in addition to sarcasm.

Re:Who cares? (4, Funny)

CODiNE (27417) | more than 4 years ago | (#32086292)

Don't you mean

Yes, 640 petabytes ought to be enough for everybody.

Re:Who cares? (1)

natehoy (1608657) | more than 4 years ago | (#32087758)

No, what I meant specifically was "(Score:5, Funny)".

Re:Who cares? (1)

SmackTheIgnorant (985978) | more than 4 years ago | (#32086350)

OK, I laughed when I read that, and then I wondered: how much of that would be porn? NOBODY needs a porn collection that big. And I considered making a Library of Congress-type comparison, but saying the LoC would be a single hair... it got creepy. Can we just, as a collective, hit the delete key a couple dozen more times a day? And keep the porn down to under a terabyte?

Re:Who cares? (0)

Anonymous Coward | more than 4 years ago | (#32086640)

NOBODY needs a porn collection that big... And keep the porn down to under a terabyte?

Blasphemy!

Re:Who cares? (0)

Anonymous Coward | more than 4 years ago | (#32086690)

NOBODY needs a porn collection that big.

Welcome. You must be new here.

Re:Who cares? (1)

natehoy (1608657) | more than 4 years ago | (#32088584)

NOBODY needs a porn collection that big.

You're forgetting backups.

Re:Who cares? (1)

insnprsn (1202137) | more than 4 years ago | (#32085820)

Didn't it say they're talking about created data (and assorted copies of it), not how much data we can store?

Re:Who cares? (1)

plover (150551) | more than 4 years ago | (#32085896)

A zettabyte is more data than you generate during your whole lifetime. It's pointless to have so much space.

Ooooh, it sounds like somebody's pr0n collection is a bit inadequate today.

Re:Who cares? (1)

Dthief (1700318) | more than 4 years ago | (#32085994)

I have 10^21 + 1 desktop shortcuts

Re:Who cares? (1)

natehoy (1608657) | more than 4 years ago | (#32088124)

Ah, so you've finally taken my advice and started cleaning those up. Thanks.

Re:Who cares? (1)

religious freak (1005821) | more than 4 years ago | (#32089226)

lol - now that's funny

Re:Who cares? (1)

ByOhTek (1181381) | more than 4 years ago | (#32086136)

That's everyone, not per person.

That's ~1/6 TB per person.

Man, am I over quota.

Where do I get a Zettabyte Drive? (1)

filesiteguy (695431) | more than 4 years ago | (#32085392)

Hmm - thinking that I'd like to pop over to cnet or tigerdirect or fry's and pick up a zettabyte drive. I'm sure that's "more than enough storage" for all my digital files...

Are they on sale for $149.00 yet?

Re:Where do I get a Zettabyte Drive? (1)

plover (150551) | more than 4 years ago | (#32085930)

I'm sure that's "more than enough storage" for all my digital files...

<oblig>640 zettabytes ought to be enough for anybody.</oblig>

Re:Where do I get a Zettabyte Drive? (1)

filesiteguy (695431) | more than 4 years ago | (#32086842)

LOL!

Re:Where do I get a Zettabyte Drive? (1)

IronDragon (74186) | more than 4 years ago | (#32090828)

Not yet, but soon.

Hardware: "Digital Universe" Enters the Zettabyte (2, Interesting)

Thanshin (1188877) | more than 4 years ago | (#32085400)

"In 2010 the volume of digital information created and duplicated in a year will reach 1.2 zettabytes, according to new data from IDC and EMC. The annual Digital Universe report is an effort to visualize the enormous amount of data being generated by our increasingly digital lives. The report's big numbers -- a zettabyte is roughly a million petabytes -- pose interesting questions about how the IT community will store and manage this firehose of data. Perhaps the biggest challenge isn't how much data we're creating -- it's all the copies of it. Seventy-five percent of all the data in the Digital Universe is a copy, according to IDC."

Re:Hardware: "Digital Universe" Enters the Zettaby (2, Insightful)

cgenman (325138) | more than 4 years ago | (#32085506)

Only 75%? Considering that all DVDs are copies and all local caches are copies, I wouldn't be surprised if that number were much larger.

Also, cutting out all the copies would only reduce the problem to 0.3 zettabytes. For day-to-day IT purposes, that's about the same number.

Re:Hardware: "Digital Universe" Enters the Zettaby (2, Insightful)

Rockoon (1252108) | more than 4 years ago | (#32085698)

In the world of home storage, 75% is definitely way too low. The average personal desktop probably has 20 to 40 gigabytes of used storage, with far less than 1 gigabyte of it being original content. If they also back up this data, the original fraction shrinks even further.

Nothing on their DVR is original, either.

Now, in the business world things are a bit different. Here you can expect the same 20 to 40 gigabytes of used storage on the median machine, but backed by a massive networked database of original uptime-critical content with at least a couple of mirrors.

It is this second category that is clearly driving their estimate.

Re:Hardware: "Digital Universe" Enters the Zettaby (2, Insightful)

iamhassi (659463) | more than 4 years ago | (#32086426)

HD home movies and photographs are far more than 1 GB.

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

Rockoon (1252108) | more than 4 years ago | (#32087368)

The average person doesn't have the ability to take HD home movies, because they don't even own the necessary equipment.

I've seen you project your geek lifestyle onto the world before.

Re:Hardware: "Digital Universe" Enters the Zettaby (0)

Anonymous Coward | more than 4 years ago | (#32088044)

Are you kidding? I bought, at retail, a Sony cam with a 60GB internal drive and HD resolution two years ago for $900 from Costco. Most people buying new camcorders have ready access to HD-quality cameras. Given the purchasing behavior of my cousin and brother-in-law, I think far more people cycle their personal tech more frequently than you suspect.

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

iamhassi (659463) | more than 4 years ago | (#32091240)

Where have you been, stuck in 2000? I said HD home movies and photographs. First, the average Joe can hardly find a camera that's not digital, since Walmart only sells 2 cameras that still use film. [walmart.com]

Second, you can't buy a camcorder that's not flash or hard disk. Yep, you heard me: Walmart only sells 2 camcorders that record directly to DVD; the other 150+ are all flash and hard drive [walmart.com]. The camcorder offering the smallest hard drive capacity is still 80GB, for a paltry $350 [walmart.com], and HD camcorders start at only $89. [walmart.com]

So it is not I who is projecting a geek lifestyle onto the world; it is you who is out of touch with modern consumer electronics.

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

masshuu (1260516) | more than 4 years ago | (#32087424)

The average personal desktop probably has 20 to 40 gigabytes of used storage

This is Slashdot; I doubt many here qualify as average. I myself fill my drives with 680GB of stuff, with 40-50GB easily being original.

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

Jenming (37265) | more than 4 years ago | (#32088524)

HD home movies and photographs are copies, even if only one digital copy exists.

Re:Hardware: "Digital Universe" Enters the Zettaby (3, Insightful)

PlusFiveTroll (754249) | more than 4 years ago | (#32085850)

Any piece of digital data that doesn't have a copy made of it is one hardware failure away from non-existence. Most of the storage space in the businesses I administer is not for the original data, but for multiple backup copies. Copies are not a bad thing; in the business we call them redundancy.

"Only wimps use tape backup: real men just upload their important stuff on ftp, and let the rest of the world mirror it ;)" -- Linus Torvalds (1996-07-20)

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

basso (230632) | more than 4 years ago | (#32086478)

The comment about all the duplication of storage makes me think of the current pop-culture obsession with hoarding.

I'd guess that all Slashdotters have known someone who obsessively downloads music, to the point that they've got more music stored than they could possibly listen to.

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

archshade (1276436) | more than 4 years ago | (#32089770)

My dad's probably got more music (excluding the stuff he downloads) than he will ever get around to listening to. He is almost 50, and I seriously doubt he will double that; his entire basement manages to fit two chairs + hi-fi + 90% of his record collection + 75% of his CD collection. If I go to stay, the rest of the records and some CDs make it impossible to get into my bed without knocking a stack over.

This is all after getting rid of the tapes/MDs and quite a lot of the vinyl/CDs. Sometimes I think he holds up the entire UK music scene on his own.

My point is you don't need to download to have too much music; just spend all your time in record shops.

Re:Hardware: "Digital Universe" Enters the Zettaby (0)

Anonymous Coward | more than 4 years ago | (#32086700)

I can't believe none of you caught this... mod funny please, not interesting. Too bad there's no +1 Redundant.

Re:Hardware: "Digital Universe" Enters the Zettaby (0)

Anonymous Coward | more than 4 years ago | (#32088120)

Seriously, what is up with the mods on this one?!?

Re:Hardware: "Digital Universe" Enters the Zettaby (0)

Anonymous Coward | more than 4 years ago | (#32087514)

a zettabyte isn't ROUGHLY a million petabytes - it is EXACTLY a million petabytes, that being its definition, and all.

Depends on whether you're SI & IEC or JEDEC (1)

Ungrounded Lightning (62228) | more than 4 years ago | (#32087946)

a zettabyte isn't ROUGHLY a million petabytes - it is EXACTLY a million petabytes, that being its definition, and all.

Depends on whether you're using SI & IEC units (Z/zetta- and Zi/zebi- for 1000^7 / 1024^7) or extrapolating JEDEC (which would come up with Z/zetta- for 1024^7).
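
To see how far those conventions drift at this scale, a quick Python sanity check (a sketch, not anything from the report):

    # Sanity check on the competing definitions at the zetta- scale.
    SI_ZETTABYTE = 1000 ** 7   # SI decimal zettabyte (ZB)
    IEC_ZEBIBYTE = 1024 ** 7   # IEC binary zebibyte (ZiB); an extrapolated
                               # JEDEC "ZB" would be this same value
    print(f"1 ZB  = {SI_ZETTABYTE:.4e} bytes")                            # 1.0000e+21
    print(f"1 ZiB = {IEC_ZEBIBYTE:.4e} bytes")                            # 1.1806e+21
    print(f"ZiB is {IEC_ZEBIBYTE / SI_ZETTABYTE - 1:.1%} larger")         # 18.1%
    print(f"difference: {(IEC_ZEBIBYTE - SI_ZETTABYTE) / 10**18:.0f} EB") # ~181 exabytes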

Re:Hardware: "Digital Universe" Enters the Zettaby (0, Redundant)

kybur (1002682) | more than 4 years ago | (#32088180)

"In 2010 the volume of digital information created and duplicated in a year will reach 1.2 zettabytes, according to new data from IDC and EMC. The annual Digital Universe report is an effort to visualize the enormous amount of data being generated by our increasingly digital lives. The report's big numbers -- a zettabyte is roughly a million petabytes -- pose interesting questions about how the IT community will store and manage this firehose of data. Perhaps the biggest challenge isn't how much data we're creating -- it's all the copies of it. Seventy-five percent of all the data in the Digital Universe is a copy, according to IDC."

Hardware: "Digital Universe" Enters the Zettabyte (0, Redundant)

Darric (1721334) | more than 4 years ago | (#32088530)

"In 2010 the volume of digital information created and duplicated in a year will reach 1.2 zettabytes, according to new data from IDC and EMC. The annual Digital Universe report is an effort to visualize the enormous amount of data being generated by our increasingly digital lives. The report's big numbers -- a zettabyte is roughly a million petabytes -- pose interesting questions about how the IT community will store and manage this firehose of data. Perhaps the biggest challenge isn't how much data we're creating -- it's all the copies of it. Seventy-five percent of all the data in the Digital Universe is a copy, according to IDC. "

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

ArsonSmith (13997) | more than 4 years ago | (#32089082)

So when this is duped in a few hours, will that be irony or just funny?

Re:Hardware: "Digital Universe" Enters the Zettaby (1)

d474 (695126) | more than 4 years ago | (#32091250)

Someone please mod parent +1 Funny. Did no one get the irony?

ZFS? (1)

CuriousKumar (1058312) | more than 4 years ago | (#32085450)

With the Z already in place and its [not so] recent inline deduplication [sun.com] feature, I think ZFS should do it.

Re:ZFS? (1)

Stooshie (993666) | more than 4 years ago | (#32085494)

Like Yotta

Kids know what KiB is just fine (0)

Anonymous Coward | more than 4 years ago | (#32085454)

"You talk to a kid these days and they have no idea what a kilobyte is. The speed things progress, we are going to need many words beyond zettabyte."

Sure they do. When their download speed drops under 100 kilobytes per second, they start sending "seed pls" comments to The Pirate Bay.

a kb is a lot of data (1)

jDeepbeep (913892) | more than 4 years ago | (#32085728)

640K ought to be enough for anybody

Re:a kb is a lot of data (1)

CecilPL (1258010) | more than 4 years ago | (#32090628)

I need that much every second, at least!

Annoying audio ad at the 'digital universe' link (0)

Anonymous Coward | more than 4 years ago | (#32085458)

I know some people here frown on those.

How do we have copies of all this data? (2, Insightful)

HockeyPuck (141947) | more than 4 years ago | (#32085476)

Since this is EMC, let me tell you...

EMC loves to tell you to use RAID1: 2 copies of your data.
If it's important, you should use TimeFinder (snapshots): 1 more copy of the data.
If you want DR, then you should implement SRDF: 1 more copy of the data (this one is remote).
If you want to do data warehousing on what you just replicated, you run TimeFinder on the remote copy: 1 more copy.

So that makes 5 copies of my data on disk.

Oh, and to protect myself from data corruption (or a deleted file) being replicated to all these copies, it's still recommended that I back up to tape/VTL/MAID.

Total of 6 copies of data. And that's if I'm using dedup on my VTL or TSM (which stores versions of a given file); if I'm using a traditional scheme (daily incrementals plus weekly fulls), I could have lots of duplication within my tape infrastructure.

Ever wonder why EMC stands for Endless Mirroring Company?
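
Tallying up that chain, as a toy Python sketch (the layer names and counts are just the scenario above, not an EMC sizing formula):

    # Toy tally of the copy chain above; each layer adds one more copy
    # of every byte of "real" data.
    layers = [
        ("RAID1 mirror (second on-disk copy)", 1),
        ("TimeFinder snapshot",                1),
        ("SRDF remote replica",                1),
        ("TimeFinder on the remote copy",      1),
        ("tape/VTL/MAID backup",               1),
    ]
    copies = 1  # the original
    for name, extra in layers:
        copies += extra
        print(f"after {name:<36} -> {copies} copies")
    # Ends at 6 copies of every byte, before counting any duplication
    # inside the tape rotation itself.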

Re:How do we have copies of all this data? (1)

Thanshin (1188877) | more than 4 years ago | (#32086072)

Ever wonder why EMC stands for Endless Mirroring Company.

And if the endless mirroring company starts moving really fast, until they couldn't possibly move faster, they'd represent the relationship between mass and energy.

Re:How do we have copies of all this data? (1)

ArsonSmith (13997) | more than 4 years ago | (#32089046)

I thought they were the Excessive Margins Company.

I'm happy to see (2, Funny)

ltning (143862) | more than 4 years ago | (#32085672)

That we have all become good citizens, backing up all our data. I presume the data recovery firms are all panicking now that all their potential customers have backups of everything, and thus no longer need their services.

Not bad to have a global backup ratio of >1:1

Personally I use RAIM (Redundant Array of Instant Messages) to back up all my important notes and communications. It only works as long as all my friends log everything too, of course.

Re:I'm happy to see (2, Funny)

natehoy (1608657) | more than 4 years ago | (#32086014)

Dude, that's so old-school. I use RAT (Redundant Array of Tweets). My data is backed up... 140 characters at a time.

I'm thinking of upgrading to a system with a larger packet size. RASC (Redundant Array of Slashdot Comments) might work, but I'm afraid of having my pr0n collection marked "insightful".

What is the data? (2, Interesting)

sadtrev (61519) | more than 4 years ago | (#32085674)

I was told about 10 years ago that "70% of the world's digital data is stored under MVS", which surprised me a bit, even then.
After some thought, when you consider that almost all commercial transactions (banks, telcos, etc.) would have been running on MVS, it may have been true.
SETI and CERN and other large scientific endeavours are small fry in comparison.

Re:What is the data? (1)

Yvan256 (722131) | more than 4 years ago | (#32086056)

10 years ago MVS [wikipedia.org] carts were considered huge compared to the other gaming systems of the era.

Challenge? (2, Insightful)

O('_')O_Bush (1162487) | more than 4 years ago | (#32085676)

" Perhaps the biggest challenge isn't how much data we're creating &mdash; it's all the copies of it. "

Why is that a challenge? Digital media is somewhat unique in that you can carefully craft information (reports, programs, videos) much the same way you'd carve a chair, yet risk instantly and nearly irrecoverably losing it (much unlike a chair).

Copies of data are a safeguard through redundancy. A website gets taken offline? Good thing there's a mirror. My camera breaks or my hard disk fails? Good thing I have an external backup or copies on my DVDs.

Re:Challenge? (1)

Menkhaf (627996) | more than 4 years ago | (#32086008)

" Perhaps the biggest challenge isn't how much data we're creating &mdash; it's all the copies of it. "

Slashdot apparently even manages to create new data while still backing up the old data...

In other news... (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#32085694)

In a related study, 59.3% of all data ever generated is estimated to be pr0n.

But it's pretty (1)

hierofalcon (1233282) | more than 4 years ago | (#32085848)

When I first started, information was hard to produce... punched cards and all that.
Information storage was expensive.
At some point we started word processing on the desktop.
Information storage was still expensive.
Files were still small, and the majority of the bytes in each file were information.

As time progressed and Microsoft Office permeated the workplace, the information content of each file hasn't changed much.
Each release seemed to take more space to store the same information.
Today, though, the portion of each file consumed in making it pretty has gone through the roof.

We could always go back to plain text files that are easy to search and cheap to store. Project Gutenberg has taken that approach for saving books. Keep it simple. Of course, if we did that, productivity might go through the roof and layoffs might be high.

Guess pretty isn't so bad, even if it is part of the zettabyte problem.

Re:But it's pretty (1)

maxume (22995) | more than 4 years ago | (#32086522)

That's a silly analysis of it; text markup and layout are some tiny fraction of the total. 150 pages of text layout information take up about the same amount of space as 2 crappy snapshots, or a few seconds of high-quality video.

Re:But it's pretty (1)

tendrousbeastie (961038) | more than 4 years ago | (#32088108)

I think Hierofalcon was probably referring more to the huge inefficiency of MS Word and co. in storing even a simple text-based document.

I have seen 70MB+ Word files which you can open, Ctrl-A, Ctrl-C, and then paste into a new empty doc; save that doc and you have a 50KB Word file.

Plain text doesn't really have the capability of being inefficient (unless, I suppose, you fill the file with crap, but then it is simply efficiently storing a load of crap).

Re:But it's pretty (1)

maxume (22995) | more than 4 years ago | (#32088266)

The part at the end where he talks about pretty favors my interpretation.

I don't know what's causing what you describe, but there is probably something tracking changes to the document. And maybe somebody pasted in a large bitmap (from what I have seen, people think that is a great idea), or perhaps a series of them, and then deleted them.

Re:But it's pretty (1)

tendrousbeastie (961038) | more than 4 years ago | (#32088390)

Yeah, Word has a lovely feature whereby, when you remove sections from a document, it doesn't so much delete that content from the file as just delete the references to it. So, if someone swaps one image in a doc for another, it will keep a copy of both images in the file but only show the new one.

It would be a good feature if it were actually made use of in some sort of revision-history system, but as far as I can tell its only effect is the increased file size of some docs.

I agree with you that the talk about making it pretty must refer to markup code of some kind, so you're right; my talk of plain text doesn't really address the issue.

Re:But it's pretty (1)

Smauler (915644) | more than 4 years ago | (#32089686)

GameFAQs still hosts all its FAQs as plain text, which is one of the reasons I use it pretty much exclusively. That, and it's easily the most authoritative FAQ site out there.

Library of congress (1)

nlann (1125759) | more than 4 years ago | (#32085940)

How many Libraries of Congress is that?

Re:Library of congress (0)

Anonymous Coward | more than 4 years ago | (#32086520)

About 100,000

Re:Library of congress (1)

iamhassi (659463) | more than 4 years ago | (#32086524)

And if it were in phonebooks stacked on top of each other, how far would it reach?

Re:Library of congress (2, Interesting)

OctaviusIII (969957) | more than 4 years ago | (#32086528)

According to Wikipedia, it's about 10^9 Libraries of Congress, not including images.

Of COURSE most of it consists of copies! (2, Insightful)

King_TJ (85913) | more than 4 years ago | (#32086150)

A typical individual wouldn't have a whole lot of unique information to store in the first place: a collection of photos and some video from a few vacation trips or holidays, some handwritten notes, maybe some artistic works (a few original songs or paintings, or ?) if he/she was interested in such endeavors. Oh, and your tax records and resume. But let's face it: most of us are FAR more content consumers than creators. And content creation usually results in mass redistribution of the original work, as others want to enjoy a copy of it.

I don't see any harm in this either, since duplication is the best way to protect against data loss. (When my parents were trying to trace their family history, they reached a dead end because the only known records of some of the people they needed to research were lost when a library burnt down. With so much data going digital, on media that's practically EXPECTED to fail after less than 10 years of regular use, you'd better believe we need lots of duplicates out there!)

I have often said.... (1)

hesaigo999ca (786966) | more than 4 years ago | (#32086240)

I have often spoken to many engineers from Gmail and Hotmail about the data they store and how they could improve their systems by having pointers to emails instead of actual copies per storage account. If someone sends a joke email from one Gmail account to all his friends, 80% of whom have Gmail accounts (so let's say 25 of 30), you would still have only one copy of that joke email sitting on their server, accessible by everyone who holds the pointer reference, yet it looks like they all have their own copy -- unless of course someone modifies that email, which then becomes its own version, splitting into many copies....

If you were to apply that to many more things, like cloud storage across the board, you would use a lot less bandwidth. Imagine being able to check a lookup table of pointers first: instead of downloading, you already know someone on your own network has the same service pack for application xxx, so the download goes quicker because you are not streaming the packets from outside the network. Etc., etc.

Anyway, I can't wait for the first zetta drive to be sold at a Future Shop near you; I will be the first in line to buy one!
In a few years, I think..... lol
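
What the poster is describing is single-instance storage with copy-on-write. A minimal Python sketch, with entirely hypothetical names:

    # Minimal sketch of a single-instance mail store: bodies stored once
    # by content hash, mailboxes holding pointers, and copy-on-write
    # forking when a message is modified. All names are hypothetical.
    import hashlib

    store = {}      # content hash -> message body (one physical copy each)
    mailboxes = {}  # user -> list of content hashes (the "pointers")

    def deliver(user, body):
        """Store the body once; the mailbox keeps only a pointer."""
        key = hashlib.sha256(body.encode()).hexdigest()
        store.setdefault(key, body)
        mailboxes.setdefault(user, []).append(key)

    def edit(user, index, new_body):
        """Copy-on-write: a modified message becomes its own version."""
        key = hashlib.sha256(new_body.encode()).hexdigest()
        store.setdefault(key, new_body)
        mailboxes[user][index] = key

    joke = "A zettabyte walks into a bar..."
    for i in range(25):
        deliver(f"friend{i}@example.com", joke)
    print(len(store), "stored copy,", len(mailboxes), "recipients")  # 1 copy, 25 recipients

    edit("friend0@example.com", 0, joke + " and orders a round of backups.")
    print(len(store), "stored copies after one edit")                # now 2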

Re:I have often said.... (1)

iamhassi (659463) | more than 4 years ago | (#32086646)

Or you could just have pointers to words in a dictionary. All you need is a few million words, and you could recreate any possible email written in English.

Re:I have often said.... (1)

CecilPL (1258010) | more than 4 years ago | (#32090670)

Or you could just have pointers to letters in the English alphabet! Then you can store all your emails in only 26 bytes (plus some overhead for the pointers).

Roughly? (0)

Anonymous Coward | more than 4 years ago | (#32086248)

"a zettabyte is roughly a million petabytes"
Surely that should read *exactly* ...?
/me ducks

1.21 zettabytes? (5, Funny)

non-registered (639880) | more than 4 years ago | (#32086324)

1.21 zettabytes? Great Scott!

Re:1.21 zettabytes? (1)

nebaz (453974) | more than 4 years ago | (#32089012)

To aptly apply the mispronounced "jigawatt" paradigm: <Doc Brown>1.21 settabytes? Great Scott!</Doc Brown>

Re:1.21 zettabytes? (0)

Anonymous Coward | more than 4 years ago | (#32089654)

Actually, "jigawatt" is the correct pronunciation.

http://www.merriam-webster.com/dictionary/gigawatt [merriam-webster.com]

Retarded IP (5, Insightful)

static416 (1002522) | more than 4 years ago | (#32086416)

This beautifully illustrates how idiotic the concept of "copy right" and IP in general is in the digital universe. When 75% of 1.2 zettabytes is mostly untracked copies of other information, just storing the licenses alone would be an impossible task.

How do you maintain a business model built on the exclusive right to copy information in a world where everything is infinitely copied and copyable? It's like trying to legislate and sell access to saltwater while floating on a raft in the middle of the Pacific.

Re:Retarded IP (1)

sexconker (1179573) | more than 4 years ago | (#32086774)

How do you maintain a business model built on the exclusive right to copy information in world where everything is a infinitely copied and copyable? It's like trying to legislate and sell access to saltwater while floating on a raft in the middle of the pacific.

It's more like trying to sell bottled saltwater while floating on a raft in the middle of the ocean. Anyone's free to take what they want from the ocean, but they risk being singled out by the sharks (lawyers, with freaking laser beams).

Headline... (0)

Anonymous Coward | more than 4 years ago | (#32086526)

kinda makes me wonder what it would take to store the non-digital universe.

It hurts my head.

Of course... (0)

Anonymous Coward | more than 4 years ago | (#32086564)

1.1ZB of that is porn.

Re:Of course... (1)

gmuslera (3436) | more than 4 years ago | (#32086848)

... and lolcats [wherestheanykey.co.uk]

Space Program (5, Funny)

Ukab the Great (87152) | more than 4 years ago | (#32086722)

- 1 zettabyte / 1.44 MB per floppy disk = approx. 694,444,444,444,444 floppy disks.

- 694,444,444,444,444 disks * 3.5 inches per disk = 2,430,555,555,555,554 inches if you laid the floppies end to end.

- 2,430,555,555,555,554 inches / 63,360 inches per mile = 38,361,040,965 miles.

- 38,361,040,965 miles / 2.7 billion miles to Pluto = approx. 7 round trips to Pluto via floppy disk.

In conclusion: don't kill NASA yet, President Obama. We've found a way to get to Pluto!
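
The arithmetic holds up; as a quick Python check using the same rough figures:

    # Re-running the floppy arithmetic with the figures from the comment.
    ZETTABYTE       = 10 ** 21   # bytes
    FLOPPY_BYTES    = 1.44e6     # "1.44 MB" taken at face value
    FLOPPY_INCHES   = 3.5
    INCHES_PER_MILE = 63360
    MILES_TO_PLUTO  = 2.7e9      # rough one-way distance used above

    disks = ZETTABYTE / FLOPPY_BYTES                 # ~6.94e14 disks
    miles = disks * FLOPPY_INCHES / INCHES_PER_MILE  # ~3.84e10 miles
    print(f"{disks:.3e} disks laid end to end = {miles:.3e} miles")
    print(f"~{miles / (2 * MILES_TO_PLUTO):.1f} round trips to Pluto")  # ~7.1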

Too many duplicates consuming disk space? (2, Insightful)

RhapsodyGuru (1250396) | more than 4 years ago | (#32086776)

No problem...

zfs set dedup=on tank

there... that should do the trick.

75% sounds about right (1)

SlippyToad (240532) | more than 4 years ago | (#32086806)

75% of everything I have on disk is a copy of something else, but unfortunately I usually have lost the copy somewhere in the process of moving from one machine to the next, or of trying to clear up disk space so I can download more stuff to leave on my disk.

A zettabyte is EXACTLY a million petabytes (0, Troll)

swillden (191260) | more than 4 years ago | (#32086926)

By definition. And since EMC is a storage company, they're almost certainly using the SI prefixes properly.

The author of the summary is, I think, confusing zettabytes and petabytes with the base-2 units, zebibytes and pebibytes. For all the binary-prefix haters: when you get up into these sizes, the difference between base-2 and base-10 units is more than big enough to justify the effort of using the correct terms. The difference between one zebibyte and one zettabyte is over 180 exabytes.

Re:A zettabyte is EXACTLY a million petabytes (0)

Anonymous Coward | more than 4 years ago | (#32088040)

And when you get to those kinds of numbers, they are the same order of magnitude so the difference is fairly irrelevant.

Re:A zettabyte is EXACTLY a million petabytes (0)

swillden (191260) | more than 4 years ago | (#32088472)

And when you get to those kinds of numbers, they are the same order of magnitude so the difference is fairly irrelevant.

Order of magnitude, yes, but a power of 10 is a pretty wide range.

Put it this way: A zebibyte is almost 20% larger than a zettabyte. That's a pretty big difference.

I would have expected 100% duplication (0)

Anonymous Coward | more than 4 years ago | (#32086968)

Everything should be backed up... right?!

Then I would expect 100% duplication, or higher, to be the norm.

Damn pirates (1)

Issarlk (1429361) | more than 4 years ago | (#32087158)

75% of 1.2 ZB = 9E14 megabytes ≈ 1.3 TeraCDs (at ~700 MB each). At around 10 tracks per CD, costing 22,000 USD each, that makes about 280 petadollars in lost sales for the music industry!

75% of all percent of all the data in the Digital (0)

Anonymous Coward | more than 4 years ago | (#32087392)

"75% percent of all the data in the Digital Universe is a copy"

Slashdot Editors unavailable for comment.

How do you know which one is the copy? (1)

Sir_Dill (218371) | more than 4 years ago | (#32087470)

If a digital copy is identical to the source file, how do you know which one is the copy?

Identical meaning everything, down to the creation date and last-modified date.

Volume of digital data, not information (1)

rcamans (252182) | more than 4 years ago | (#32087648)

Actually, information is useful stuff. The internet, and the world is saturated with useless stuff (data, noise). Also, the world's stuff is considerably smaller when de-duplicated. And then if you remove redundancies, and different ways of saying the same thing...
I am pretty sure that all the world's Information can be contained on a single petabyte. That would include all the world's literature, and all the newspapers, magazines, etc. If you include pictures, maybe significantly more.
Part of the Data problem is tons of data which has not been analyzed. Of the analysis could keep up with the data collection rates, then no problem. And no zetabytes needed, either.

questions about how to store and manage the data (1)

Kopretinka (97408) | more than 4 years ago | (#32087792)

"the big numbers pose interesting questions about how the IT community will store and manage this firehose of data" - just like the construction community will house and manage the firehose of over 6 billion people, so to speak.

Everybody takes care of their own bit(s) & backups; there is no single entity dealing with managing 1.2ZB.

Questions not so interesting. Move on.

The scary part (1)

jgreco (1542031) | more than 4 years ago | (#32088084)

Is what percentage of it ISN'T backed up AND should be (which will be something less than 25% but much greater than 0%).

Roughly? (0)

Anonymous Coward | more than 4 years ago | (#32089124)

a zettabyte is roughly a million petabytes

No, a zettabyte is exactly a million petabytes.

A zebibyte is roughly a million pebibytes.

Sheesh.

Need some smart software (0)

Anonymous Coward | more than 4 years ago | (#32089632)

I have a few new terabytes sitting around the house, plus backups on various computers and USB sticks here and there. I have also tried to find a nice unified storage device, say RAID 5, that could collect it all and remove all duplication (but keep a kind of symbolic link to the one true copy, and only fork it if it is changed). So if I have two files A and B in two different directories, but they are the exact same file, they could be stored as one file on the disk. If I decide to change A, I should be asked whether to update it or fork a copy of it, leaving B alone.
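
What the poster wants is essentially hash-based deduplication. A minimal Python sketch of the scan (hypothetical path; note that plain hardlinks do not fork on change, so the ask-before-edit step would need a copy-on-write filesystem like ZFS or btrfs underneath):

    # Sketch of the deduplicating scan described above: hash every file,
    # keep one true copy, hardlink duplicates to it.
    import hashlib, os

    def file_digest(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def dedup_tree(root):
        seen = {}  # digest -> path of the one true copy
        for dirpath, _, names in os.walk(root):
            for name in names:
                path = os.path.join(dirpath, name)
                digest = file_digest(path)
                if digest in seen:
                    os.remove(path)              # drop the duplicate...
                    os.link(seen[digest], path)  # ...and point it at the original
                else:
                    seen[digest] = path

    dedup_tree("/srv/storage")  # hypothetical mount point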

Seriously? They included duplication? (1)

pclminion (145572) | more than 4 years ago | (#32090044)

The number is meaningless, because "duplication" is arbitrary. Where do you draw the line? If duplication means "copying data from one place to another" then data is duplicated every time function parameters are pushed on the stack, every time memcpy() is called, every time something is loaded from disk into RAM. I could write a simple loop that copies a 32-bit quantity from EAX to EBX three billion times per second. If you include all that shit going on, I bet their number would be higher by a factor of a thousand or more. What the fuck is the use of this metric? How do you justify the arbitrary stopping point?
