
ZeoSync Makes Claim of Compression Breakthrough

michael posted more than 12 years ago | from the claims-easy-proof-hard dept.

dsb42 writes: "Reuters is reporting that ZeoSync has announced a breakthrough in data compression that allows for 100:1 lossless compression of random data. If this is true, our bandwidth problems just got a lot smaller (or our streaming video just became a lot clearer)..." This story has been submitted many times due to the astounding claims: ZeoSync explicitly claims that they've superseded Claude Shannon's work. The "technical description" from their website is less than impressive. I think the odds of this being true are slim to none, but here you go, math majors and EEs, something to liven up your drab, dull existence today. Update: 01/08 13:18 GMT by M: I should include a link to their press release.


989 comments



100:1? I don't think so (0)

ryanh50 (534202) | more than 12 years ago | (#2803121)

While it sounds nice, it's implausible. That amount of data cannot be compressed beyond each unique character plus the compression metadata, unless long repeating strings appear. Of course, then it would not be random. P.S. FIRST POST WOOHOOO!!!

Current ratio? (2, Interesting)

L-Wave (515413) | more than 12 years ago | (#2803126)

Excuse my lack of compression knowledge, but what's the current ratio? I'm assuming 100:1 is pretty damn good. =) BTW... even though this *might* be a good compression algorithm and all that, how long would it take to decompress a file using your average Joe computer?

Re:Current ratio? (0, Offtopic)

MrFredBloggs (529276) | more than 12 years ago | (#2803140)

"how long would it take"

Who cares? If you can download a whole CD's worth of music as a 7 MB file, who cares if it takes 1, 2, 5, 10, 20, or 100 minutes to unpack it?

Re:Current ratio? (2)

skroz (7870) | more than 12 years ago | (#2803166)

Umm... I care. If your compression/decompression time exceeds the amount of time it would take to transfer the file uncompressed, you're really not gaining anything.

The mathematical implications alone of such a breakthrough would be impressive. 100:1 compression of truly random data? Wow.

Re:Current ratio? (2, Insightful)

Sobrique (543255) | more than 12 years ago | (#2803173)

Of course, given that cpu speed increases faster than bandwidth, even if it is an issue now, it won't be in a year.

Re:Current ratio? (1)

cide1 (126814) | more than 12 years ago | (#2803188)

About 1:2, sometimes more, often times less. It depends on the nature of the material being encoded, and the algorithm being used.

Re:Current ratio? (1)

cide1 (126814) | more than 12 years ago | (#2803219)

2:1 is what I meant, 1:2 would be increasing the size of the data.

Re:Current ratio? (3, Informative)

CaseyB (1105) | more than 12 years ago | (#2803213)

but whats the current ratio?

For truly random data? 1:1 at the absolute best.

Re:Current ratio? (2, Redundant)

CaseyB (1105) | more than 12 years ago | (#2803249)

That's not right. A 1:1 average for a large sample of random data is the best you can ever do. On a case by case basis, you can get lucky and do better, but no algorithm can compress arbitrary random data at better than 1:1 in the long run.

Re:Current ratio? (5, Informative)

radish (98371) | more than 12 years ago | (#2803216)


For lossless (e.g. zip; not jpg, mpg, divx, mp3, etc.) you are looking at about 2:1 for 8-bit random, and much better (50:1?) for ASCII text (i.e. 7-bit, non-random).

If you're willing to accept loss, then the sky's the limit; MP3 at 128 kbps is about 12:1 compared to a 44 kHz 16-bit WAV.
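
A quick sanity check on that figure (a minimal Python sketch; the CD-audio parameters are standard values, not the parent poster's numbers):

bitrate_wav = 44_100 * 16 * 2     # samples/s * bits/sample * channels = 1,411,200 bit/s
bitrate_mp3 = 128_000             # 128 kbps
print(bitrate_wav / bitrate_mp3)  # prints ~11.0, so "about 12:1" is in the right ballpark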

Re:Current ratio? (1)

dannyspanner (135912) | more than 12 years ago | (#2803232)

Data compression is based on finding redundant data, i.e. repetition, that can be factored out. With random data this tends toward 1:1 as you increase the size of the data. Sorry, but I haven't got time to point you in the right direction; just hit Google with "random data compression theory" or the like.

It all boils down to the fact that you just plain can't compress random data in the general case.

Re: Information theory says 1 (2, Interesting)

Dada (31909) | more than 12 years ago | (#2803240)

The maximum compression ratio for random data is 1. That's no compression at all.

No way (-1)

Fucky the troll (528068) | more than 12 years ago | (#2803127)

"something to liven up your drab dull existence today."

Surely you're not saying Slashdot readers are dull, are you?

how can this be? (3, Informative)

posmon (516207) | more than 12 years ago | (#2803128)

Even lossless compression still relies on redundancy within the data, normally repeating patterns of data. Surely 100:1 on TRUE random data is impossible?

Re:how can this be? (4, Insightful)

jrockway (229604) | more than 12 years ago | (#2803159)

I'm going to agree with you here. If there's no pattern in the data, how can you find one and compress it? The reason things like gzip work well on C files (for instance) is because C code is far from random. How many times do you use void or int in a C file? A lot :)

Try compressing a WAV or MPEG file with gzip. It doesn't work too well, because the data is "random", at least in the sense of the raw numbers. When you look at the patterns that the data forms (i.e. pictures, and relative motion), then you can "compress" that.
Here's my test for random compression :)

$ dd if=/dev/urandom of=random bs=1M count=10
$ du random
11M random
11M total
$ gzip -9 random
$ du random.gz
11M random.gz
11M total
$

no pattern == no compression
prove me wrong, please :)
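
The same experiment in Python rather than at the shell (a minimal sketch, with zlib standing in for gzip -9):

import os, zlib

# Random bytes should not compress; the output is actually slightly
# *larger* than the input because of the compressor's framing overhead.
data = os.urandom(10 * 1024 * 1024)   # 10 MiB of OS-supplied random bytes
packed = zlib.compress(data, 9)       # level 9, comparable to gzip -9
print(len(data), len(packed))         # packed comes out marginally bigger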

Re:how can this be? (2)

skroz (7870) | more than 12 years ago | (#2803184)

They just threw out information theory entirely... too restrictive. They came up with their own theory... disinformation theory! Everyone seems to be jumping on the bandwagon, too... these guys [wired.com] even compiled a list of the pioneers!

Re:how can this be? (5, Funny)

Rentar (168939) | more than 12 years ago | (#2803185)

I'm going to agree with you here. If there's no pattern in the data, how can you find one and compress it? The reason things like gzip work well on C files (for instance) is because C code is far from random. How many times do you use void or int in a C file? A lot :)

So a Perl program can't be compressed?

Re:how can this be? (2)

sprag (38460) | more than 12 years ago | (#2803246)

Well, I can think of two ways that "random" data might be compressed without an obvious pattern:

* If the data were represented a different way (say, using bits instead of byte-sized data), then patterns might emerge, which would then be compressible. Of course, the $64k question is: will it be smaller than the original data?

* If the set of data doesn't cover all possibilities of the encoding (i.e. only 50 characters out of 256 are actually present), then a recoding might be able to compress the data using a smaller "byte" size: in this case, 6 bits per character instead of 8. The problem with this one is that you have to scan through all of the data before you can determine the optimal byte size... and then it may still end up being 8.
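
The second idea is a few lines of Python (a minimal sketch; the helper name is mine, not the poster's):

import math

def bits_per_symbol(data: bytes) -> int:
    # If only k distinct byte values occur, each symbol needs
    # ceil(log2(k)) bits rather than a full 8.
    k = len(set(data))
    return math.ceil(math.log2(k)) if k > 1 else 1

sample = b"only a handful of distinct characters appear in this text"
print(bits_per_symbol(sample))  # well under 8 for this low-variety sample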

Re:how can this be? (3, Interesting)

Shimbo (100005) | more than 12 years ago | (#2803197)

They don't claim they can compress TRUE random data, only 'practically random' data. Now, the digits of Pi are a good source of 'practically random' data, for some definition of the phrase 'practically random'.

Re:how can this be? (2, Informative)

mccalli (323026) | more than 12 years ago | (#2803214)

even lossless compression still relies on... normally repeating patterns of data. Surely 100:1 on TRUE random data is impossible?

However, in truly random data such patterns will exist from time to time. For example, I'm going to randomly type on my keyboard now (I promise this isn't fixed...):

oqierg qjn.amdn vpaoef oqleafv z

Look at the data. No patterns. Again....

oejgkjnfv,cm v;aslek [p'wk/v,c

Now look: two occurrences of 'v,c'. Patterns have occurred in truly random data.

Personally, I'd tend to agree with you and consider this not possible. But I can see how patterns might crop up in random data, given a sufficiently large amount of source data to work with.

Cheers,
Ian

Re:how can this be? (1)

Sobrique (543255) | more than 12 years ago | (#2803215)

It's easy.
If it's truly random, it's meaningless, so you can just cat an equivalent number of bytes out of /dev/random to replace it.

100:1 ? I don't think so... (5, Insightful)

Mr Thinly Sliced (73041) | more than 12 years ago | (#2803129)

They claim 100:1 compression for random data. The thing is, if that's true, then let's say we have data A of size 1,000,000.

compress(A) = B

Now B is 1/100th the size of A (size 10,000), but it too is random, right?

On we go:
compress(B) = C (size is now 100)
compress(C) = D (size 1).

So everything compresses down to a single byte.

Or am I missing something?

Mr Thinly Sliced
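
The reductio is easy to mechanize (a minimal Python sketch; claimed_size() is hypothetical and merely models the advertised guaranteed 100:1 ratio):

def claimed_size(n_bytes: int) -> int:
    return max(1, n_bytes // 100)  # the advertised 100:1 shrink on any input

size = 1_000_000
while size > 1:
    size = claimed_size(size)
    print(size)  # 10000, then 100, then 1

# Every input collapses toward a single byte, which is absurd:
# only 256 distinct one-byte outputs exist to decompress from.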

Re:100:1 ? I don't think so... (1, Redundant)

RareHeintz (244414) | more than 12 years ago | (#2803146)

I think you've hit on one of a few arguments showing that their claim is bullshit.


Re:100:1 ? I don't think so... (5, Insightful)

MikeTheYak (123496) | more than 12 years ago | (#2803182)

It goes beyond bullshit into the realm of humor:

ZeoSync has developed the TunerAccelerator(TM) in conjunction with some traditional state-of-the-art compression methodologies. This work includes the advancement of Fractals, Wavelets, DCT, FFT, Subband Coding, and Acoustic Compression that utilizes synthetic instruments. These are methods that are derived from classical physics and statistical mechanics and quantum theory, and at the highest level, this mathematical breakthrough has enabled two classical scientific methods to be improved, Huffman Compression and Arithmetic Compression, both industry standards for the past fifty years.

They just threw in a bunch of compression buzzwords without even bothering to check whether they have anything to do with lossless compression...

Re:100:1 ? I don't think so... (1)

Sobrique (543255) | more than 12 years ago | (#2803189)

Full house!
I just won buzzword Bingo!

Re:100:1 ? I don't think so... (5, Funny)

oyenstikker (536040) | more than 12 years ago | (#2803147)

Maybe they'll be able to compress their debt to $1 when they go under.

Re:100:1 ? I don't think so... (3, Informative)

Xentax (201517) | more than 12 years ago | (#2803151)

No... the compressed data is almost certainly NOT random, so it couldn't be compressed the same way. It's also highly unlikely any other compression scheme could reduce it either.

I'm very, very skeptical of 100:1 claims on "random" data: either the data is large enough that, even being random, there are lots of repeated sequences, or the test data is rigged.

Or, of course, it could all be a big pile of BS designed to encourage some funding/publicity.

Xentax

Re:100:1 ? I don't think so... (0)

Anonymous Coward | more than 12 years ago | (#2803244)

No... the compressed data is almost certainly NOT random, so it couldn't be compressed the same way. It's also highly unlikely any other compression scheme could reduce it either.

Any data stream can be considered "random" in that it has an equal probability of occurring as any other stream. I guess it depends on what the company means by random: do they mean it can take any input and compress it 100-fold, or do they mean that the data must be sufficiently random, i.e. there is specifically no redundancy in the stream?

Re:100:1 ? I don't think so... (0)

Anonymous Coward | more than 12 years ago | (#2803157)

But the output would not be random.
Its information density (per byte) would be higher.

Although I still think 100:1 compression is very unlikely.

Re:100:1 ? I don't think so... (5, Insightful)

arkanes (521690) | more than 12 years ago | (#2803161)

I suspect that when they say "random" data, they are using marketing-speak random, not math-speak random. Therefore, by 'random', they mean "data with lots of repetition like music or video files, which we'll CALL random because none of you copyright-infringing IP thieving pirates will know the difference"

Re:100:1 ? I don't think so... (1)

pointym5 (128908) | more than 12 years ago | (#2803205)

No, that's not right: read their press release. They explicitly claim to be able to compress the output of the compressor.

I agree about random (1)

a random streaker (538956) | more than 12 years ago | (#2803224)

To marketing, "random data" means a .jpeg, an .mpeg, an audio file, an .exe file, a text file, a .doc, etc. I.e., the algorithms apply to general data, as opposed to schemes that compress specific kinds of data (e.g. .jpeg for pictures, .mpeg for a series of similar pictures, etc.)

Re:100:1 ? I don't think so... (3, Interesting)

Rentar (168939) | more than 12 years ago | (#2803165)

This is a proof (though I doubt it is a scientifically rigorous one) that you can't get lossless compression with a constant compression factor! What they claim would be theoretically possible if 100:1 were an average, but I still don't think this is possible.

Re:100:1 ? I don't think so... (1)

White Shade (57215) | more than 12 years ago | (#2803237)

A simple way of thinking about it is this:

A compression algorithm makes a smaller number of bytes "equal" to a larger number of bytes.

However, there are fewer short strings of bytes than there are long strings of bytes.

Consequently, not all longer strings of bytes can be represented by shorter strings of bytes.

Therefore, while it is possible to reduce MANY long strings of bytes to extremely short ones (i.e. most data can compress very well), there are, logically and obviously, strings of bytes which simply cannot be represented by a unique shorter series of bytes within the realm of any given compression algorithm (some data compresses like crap, e.g. already-compressed data).

A simple concept, I hope; it just takes a little bit of writing to explain.

The result of this is, then, exactly what you say: you can never get a constant compression factor if you want it lossless!

I hope this helps clear up any compression mysteries...
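
The counting argument is small enough to verify directly (a minimal Python sketch):

n = 16
longer = 2 ** n                          # bit strings of exactly n bits
shorter = sum(2 ** k for k in range(n))  # all strings of length 0 through n-1
print(longer, shorter)                   # 65536 vs 65535: one pigeon too many

# No lossless scheme can map every length-n string to a strictly
# shorter one; there aren't enough shorter strings to go around.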

Re:100:1 ? I don't think so... (1)

Klaruz (734) | more than 12 years ago | (#2803169)

I'm pretty sure they'd mean random raw data.

What you're saying is like saying gzip can compress random text by x amount, compressing it, and then expecting it to compress by x amount again. Once you've compressed the data, it's no longer random.

That said, I'm pretty sure it's not real as well, or is something less than what they've said it is. It is possible that somebody came up with a completely new way of thinking about things, though.

Re:100:1 ? I don't think so... (1)

Klaruz (734) | more than 12 years ago | (#2803187)

Eeek scratch out my comment, it's too early, I'm dumb.

TRUE random data would be much harder to compress than your typical text file full of Slashdot troll stories.

Re:100:1 ? I don't think so... (1)

grazzy (56382) | more than 12 years ago | (#2803190)

Err, this really doesn't have anything to do with anything; the same argument would work for PKZIP or any other compression technique.

A = 100 bytes.
B = compress(A)

B ~ 40% less data = 60 bytes.

And you can't compress B any further.

It's natural to believe that they haven't developed a technique to compress 100 bytes to 1 byte. But if they have managed to compress a 6 GB AVI movie down to 600 MB, that's not a 'whoa whoa whoa' thing, since DivX already does this.

But then again, DivX doesn't compress random data...

Conclusion: this compression won't help us broadcast movies or sounds, but it'll help with data that doesn't fit in those categories, so God save us from the pirates.

Yes you are... (2)

radish (98371) | more than 12 years ago | (#2803193)


B is not random. It is a description (in some format) of A.

But what you say does have merit, and this is why compressing a ZIP doesn't do much: there is a limit on repeated compression, because the particular algorithm will output data which it itself is very bad at compressing further (if it weren't, why not iterate once more and produce a smaller file internally?).

Re:100:1 ? I don't think so... (0)

Anonymous Coward | more than 12 years ago | (#2803218)

When you compress random data, the result is no longer random. And the possibilities for representing 1,000,000 bytes of data with a mere 10,000 bytes must be quite slim!

Re:100:1 ? I don't think so... (4, Funny)

Mr Thinly Sliced (73041) | more than 12 years ago | (#2803248)

Not only that, but I just hacked their site, and downloaded the entire source tree here it is:

01101011

Pop that baby in an executable shell script. It's a self-extracting
./configure
./make
./make install

Shh. Don't tell anyone.

Mr Thinly Sliced

Conserve Bandwidth? (2, Funny)

Atzanteol (99067) | more than 12 years ago | (#2803130)

Maybe they just needed more bandwidth for their terrible site?

Re:Conserve Bandwidth? (0)

Anonymous Coward | more than 12 years ago | (#2803152)

Why do sites insist on using Flash anyway? I tried to get by as long as possible without it but browsing many sites brings up a constant prompt to install Flash. In IE there doesn't seem to be any "never ask me this question again" setting for not downloading the flash plugin. 99% of the sites that use Flash could have just as easily presented their information as text and a few simple gifs or png images but instead they choose to make this huge animated monstrosity that takes 2 minutes to download over broadband. Now with all the flash animated ads I'm REALLY trying to avoid using it altogether. Have you guys seen these god damn ads that swoop down and fly across the page you're trying to read and block the content? Is that supposed to be cute? The Internet is really going to hell.

Re:Conserve Bandwidth? (0)

Anonymous Coward | more than 12 years ago | (#2803206)

A vector diagram in Flash is much smaller than a GIF, it prints properly, and it will scale to the window size. Those are all very good reasons to use Flash in place of GIF or PNG or JPG for many types of images.

However, no one on the web does...

Time for a new law of information theory? (5, Funny)

Anonymous Coward | more than 12 years ago | (#2803134)

The odds on a compression claim turning out to be true are always identical to the compression ratio claimed?

Tech details from the crappy Flash-only website (5, Informative)

bleeeeck (190906) | more than 12 years ago | (#2803135)

ZeoSynch's Technical Process: The Pigeonhole Principle and Data Encoding Dr. Claude Shannon's dissertation on Information Theory in 1948 and his following work on run-length encoding confidently established the understanding that compression technologies are "all" predisposed to limitation. With this foundation behind us we can conclude that the effort to accelerate the transmission of information past the permutation load capacity of the binary system, and past the naturally occurring singular-bit-variances of nature can not be accomplished through compression. Rather, this problem can only be successfully resolved through the solution of what is commonly understood within the mathematical community as the "Pigeonhole Principle."

Given a number of pigeons within a sealed room that has a single hole, and which allows only one pigeon at a time to escape the room, how many unique markers are required to individually mark all of the pigeons as each escapes, one pigeon at a time?

After some time a person will reasonably conclude that:
"One unique marker is required for each pigeon that flies through the hole, if there are one hundred pigeons in the group then the answer is one hundred markers". In our three dimensional world we can visualize an example. If we were to take a three-dimensional cube and collapse it into a two-dimensional edge, and then again reduce it into a one-dimensional point, and believe that we are going to successfully recover either the square or cube from the single edge, we would be sorely mistaken.

This three-dimensional world limitation can however be resolved in higher dimensional space. In higher, multi-dimensional projective theory, it is possible to create string nodes that describe significant components of simultaneously identically yet different mathematical entities. Within this space it is possible and is not a theoretical impossibility to create a point that is simultaneously a square and also a cube. In our example all three substantially exist as unique entities yet are linked together. This simultaneous yet differentiated occurrence is the foundation of ZeoSync's Relational Differentiation Encoding(TM) (RDE(TM)) technology. This proprietary methodology is capable of intentionally introducing a multi-dimensional patterning so that the nodes of a target binary string simultaneously and/or substantially occupy the space of a Low Kolmogorov Complexity construct. The difference between these occurrences is so small that we will have for all intents and purposes successfully encoded lossley universal compression. The limitation to this Pigeonhole Principle circumvention is that the multi-dimensional space can never be super saturated, and that all of the pigeons can not be simultaneously present at which point our multi-dimensional circumvention of the pigeonhole problem breaks down.

Re:Tech details from the crappy Flash-only website (0)

Anonymous Coward | more than 12 years ago | (#2803227)

The limitation to this Pigeonhole Principle circumvention is that the multi-dimensional space can never be super saturated, and that all of the pigeons can not be simultaneously present at which point our multi-dimensional circumvention of the pigeonhole problem breaks down.

So you can compress 100Mb of data to 1Mb, but only if it's not all there. Ground-breaking stuff.

Is this April 1st? (3, Informative)

tshoppa (513863) | more than 12 years ago | (#2803136)

This has *long* been an April 1st joke published in such hallowed rags as BYTE and Datamation for at least as long as I've been reading them (20 years).

The punchline to the joke was always along the lines of

Of course, since this compression works on random data, you can repeatedly apply it to previously compressed data. So if you get 100:1 on the first compression, you get 10000:1 on the second and 1000000:1 on the third.

Re:Is this April 1st? (1)

Johannes K. (27905) | more than 12 years ago | (#2803238)

Of course, since this compression works on random data, you can repeatedly apply it to previously compressed data. So if you get 100:1 on the first compression, you get 10000:1 on the second and 1000000:1 on the third.

I quote from their press release [zeosync.com]:

Existing compression technologies are [...] limited in application to single or several pass reduction. ZeoSync's approach to the encoding of practically random sequences is expected to evolve into the reduction of already reduced information across many reduction iterations, producing a previously unattainable reduction capability.

Not so far from that April first joke, is it?

randomness (2)

Derwen (219179) | more than 12 years ago | (#2803141)

a breakthrough in data compression that allows for 100:1 lossless compression of random data.
That's fine if you only have random [tuxedo.org] data - but a lot of mine is non-random ;o)
- Derwen

Yet Another Fantastic Compression Scheme (1)

pointym5 (128908) | more than 12 years ago | (#2803143)

"Breakthrough" compression schemes are the perpetual motion machines of the 21st century. Any technological claim that's introduced with the statement that they've broken through the boundaries of information theory falls way on the wrong side of Occam's razor for me.

Think about it: 100-to-1 compression of random data? Just think in terms of first principles: How many bit strings are there of a given length? How would you reduce the size of a binary description that identifies a particular one? And note that the random data thing is straight from their press release!


No Way... (2, Redundant)

tonywestonuk (261622) | more than 12 years ago | (#2803149)

Pure random data is impossible to compress. If you compress 1 MB of random data (proper random data, not pseudo-random) and you get, say, 100 KB's worth of compressed output, what's stopping you feeding this 100 KB back through the algorithm and reducing it down even more... again and again, until the whole 1 MB is squashed into a byte! (Which, obviously, is a load of rubbish.)

Re:No Way... (1)

Sobrique (543255) | more than 12 years ago | (#2803242)

Erm, cos the output wouldn't be random?
Always assuming you're interested in reconstructing the initial values of course.

The proofs in the pudding. (5, Funny)

neo (4625) | more than 12 years ago | (#2803150)

ZeoSync said its scientific team had succeeded on a small scale in compressing random information sequences in such a way as to allow the same data to be compressed more than 100 times over -- with no data loss. That would be at least an order of magnitude beyond current known algorithms for compacting data.

ZeoSync announced today that the "random data" they were referencing is a string of all zeros. Technically this could be produced randomly, and our algorithm reduces it to just a couple of characters: a 100-times compression!!

The pressrelease (4, Informative)

grazzy (56382) | more than 12 years ago | (#2803153)

ZEOSYNC'S MATHEMATICAL BREAKTHROUGH OVERCOMES LIMITATIONS OF DATA COMPRESSION THEORY

International Team of Scientists Have Discovered
How to Reduce the Expression of Practically Random Information Sequences

WEST PALM BEACH, Fla. - January 7, 2001 - ZeoSync Corp., a Florida-based scientific research company, today announced that it has succeeded in reducing the expression of practically random information sequences. Although currently demonstrating its technology on very small bit strings, ZeoSync expects to overcome the existing temporal restraints of its technology and optimize its algorithms to lead to significant changes in how data is stored and transmitted.

Existing compression technologies are currently dependent upon the mapping and encoding of redundantly occurring mathematical structures, which are limited in application to single or several pass reduction. ZeoSync's approach to the encoding of practically random sequences is expected to evolve into the reduction of already reduced information across many reduction iterations, producing a previously unattainable reduction capability. ZeoSync intentionally randomizes naturally occurring patterns to form entropy-like random sequences through its patent pending technology known as Zero Space Tuner(TM). Once randomized, ZeoSync's BinaryAccelerator(TM) encodes these singular-bit-variance strings within complex combinatorial series to result in massively reduced BitPerfect(TM) equivalents. The combined TunerAccelerator(TM) is expected to be commercially available during 2003.

According to Peter St. George, founder and CEO of ZeoSync and lead developer of the technology: "What we've developed is a new plateau in communications theory. Through the manipulation of binary information and translation to complex multidimensional mathematical entities, we are expecting to produce the enormous capacity of analogue signaling, with the benefit of the noise free integrity of digital communications. We perceive this advancement as a significant breakthrough to the historical limitations of digital communications as it was originally detailed by Dr. Claude Shannon in his treatise on Information Theory." [C.E. Shannon. A Mathematical Theory of Communication. Bell System Technical Journal, 27:379-423, 623-656, 1948]

"There are potentially fantastic ramifications of this new approach in both communications and storage," St. George continued. "By significantly reducing the size of data strings, we can envision products that will reduce the cost of communications and, more importantly, improve the quality of life for people around the world regardless of where they live."

Current technologies that enable the compression of data for transmission and storage are generally limited to compression ratios of ten-to-one. ZeoSync's Zero Space Tuner(TM) and BinaryAccelerator(TM) solutions, once fully developed, will offer compression ratios that are anticipated to approach the hundreds-to-one range.

Many types of digital communications channels and computing systems could benefit from this discovery. The technology could enable the telecommunications industry to massively reduce huge amounts of information for delivery over limited bandwidth channels while preserving perfect quality of information.

ZeoSync has developed the TunerAccelerator(TM) in conjunction with some traditional state-of-the-art compression methodologies. This work includes the advancement of Fractals, Wavelets, DCT, FFT, Subband Coding, and Acoustic Compression that utilizes synthetic instruments. These are methods that are derived from classical physics and statistical mechanics and quantum theory, and at the highest level, this mathematical breakthrough has enabled two classical scientific methods to be improved, Huffman Compression and Arithmetic Compression, both industry standards for the past fifty years.

All of these traditional methods are being enhanced by ZeoSync through collaboration with top experts from Harvard University, MIT, University of California at Berkley, Stanford University, University of Florida, University of Michigan, Florida Atlantic University, Warsaw Polytechnic, Moscow State University and Nankin and Peking Universities in China, Johannes Kepler University in Lintz Austria, and the University of Arkansas, among others.

Dr. Piotr Blass, chief technology advisor at ZeoSync, said "Our recent accomplishment is so significant that highly randomized information sequences, which were once considered non-reducible by the scientific community, are now massively reducible using advanced single-bit- variance encoding and supporting technologies."

"The technologies that are being developed at ZeoSync are anticipated to ultimately provide a means to perform multi-pass data encoding and compression on practically random data sets with applicability to nearly every industry," said Jim Slemp, president of Radical Systems, Inc. "The evaluation of the complex algorithms is currently being performed with small practically random data sets due to the analysis times on standard computers. Based on our internally validated test results of these components, we have demonstrated a single-point-variance when encoding random data into a smaller data set. The ability to encode single-point-variance data is expected to yield multi-pass capable systems after temporal issues are addressed."

"We would like to invite additional members of the scientific community to join us in our efforts to revolutionize digital technology," said St. George. "There is a lot of exciting work to be done."

About ZeoSync

Headquartered in West Palm Beach, Florida, ZeoSync is a scientific research company dedicated to advancements in communications theory and application. Additional information can be found on the company's Web site at www.ZeoSync.com or can be obtained from the company at +1 (561) 640-8464.

This press release may contain forward-looking statements. Investors are cautioned that such forward-looking statements involve risks and uncertainties, including, without limitation, financing, completion of technology development, product demand, competition, and other risks and uncertainties.

Vaporware 2002 (1, Funny)

Anonymous Coward | more than 12 years ago | (#2803154)

Looks like Wired has a start to their top 10 list for 2002.

Buzzwordtastic (2, Interesting)

Steve Cox (207680) | more than 12 years ago | (#2803155)

I got bored reading the press release after finding the fourth trademarked buzzword in the second paragraph.


I simply can't believe that this method of compression/encoding is so new that it requires a completely new dictionary (of words we presumably are not allowed to use).

I can do better than that! (2, Funny)

Sobrique (543255) | more than 12 years ago | (#2803158)

100 to 1? Bah, that's only 99%.
The _real_ trick is getting 100% compression. It's actually really easy, there's a module built in to do it on your average unix.
Simply run all your backups to the New Universal Logical Loader and perfect compression is achieved. The device driver is, of course, loaded as /dev/null.

Practically Random (1)

sfraggle (212671) | more than 12 years ago | (#2803164)

Quote from the site:

WEST PALM BEACH, Fla. - January 7, 2001 - ZeoSync Corp., a Florida-based scientific research company, today announced that it has succeeded in reducing the expression of practically random information sequences. Although currently demonstrating its technology on very small bit strings, ZeoSync expects to overcome the existing temporal restraints of its technology and optimize its algorithms to lead to significant changes in how data is stored and transmitted.

Note the wording: "Practically Random", not "Random". This of course does throw some doubt on this claim, as "Practically Random" could mean anything...

Re:Practically Random (1)

Isao (153092) | more than 12 years ago | (#2803202)

Yes, I was thinking about that. I can think of two interpretations:

Practically Random - Functionally random
Practically Random - Nearly random

Big difference, but there are so many other nits to pick, why start here?

They'd better pony up with some more details and/or expert testimony, or they'll be labelled as cranks, even if they DO eventually come up with something.

Hmmmmm. Dial Up! (0)

Anonymous Coward | more than 12 years ago | (#2803167)

I doubt this compression thing is true, but if it is... I'm gonna tell @Home to eat its cable modem and enjoy a blazing dial-up connection.

Uhh... (1)

praxim (117485) | more than 12 years ago | (#2803168)

Here're a few nebulous bits from the site to keep the skeptics going:

ZeoSync intentionally randomizes naturally occurring patterns to form entropy-like random sequences through its patent pending technology known as Zero Space Tuner(TM). Once randomized, ZeoSync's BinaryAccelerator(TM) encodes these singular-bit-variance strings within complex combinatorial series to result in massively reduced BitPerfect(TM) equivalents. The combined TunerAccelerator(TM) is expected to be commercially available during 2003.
According to Peter St. George, founder and CEO of ZeoSync and lead developer of the technology: "What we've developed is a new plateau in communications theory. Through the manipulation of binary information and translation to complex multidimensional mathematical entities, we are expecting to produce the enormous capacity of analogue signaling, with the benefit of the noise free integrity of digital communications. We perceive this advancement as a significant breakthrough to the historical limitations of digital communications as it was originally detailed by Dr. Claude Shannon in his treatise on Information Theory." [C.E. Shannon. A Mathematical Theory of Communication. Bell System Technical Journal, 27:379-423, 623-656, 1948]
"There are potentially fantastic ramifications of this new approach in both communications and storage," St. George continued. "By significantly reducing the size of data strings, we can envision products that will reduce the cost of communications and, more importantly, improve the quality of life for people around the world regardless of where they live."

[note - It appears to cure cancer and solve the issue of world hunger]

...These are methods that are derived from classical physics and statistical mechanics and quantum theory, and at the highest level, this mathematical breakthrough has enabled two classical scientific methods to be improved, Huffman Compression and Arithmetic Compression, both industry standards for the past fifty years...

scientific method, fact... goes out the window, r (1)

Anonymous Coward | more than 12 years ago | (#2803170)

Science is based on fact, news is based on fact.

"The company's claims, which are yet to be demonstrated in any public forum, could vastly boost the ability of computer disks to store, text, music and video -- if ZeoSync's formulae succeed in scaling up to handle massive amounts of data."

Make a prediction, test it, and make a new prediction on the results if applicable.

Whatever happened to the idea that test results need to be duplicated by others before they're accepted as fact (see: news)?

I'm just a lowly mechanical engineer, so I'll take this to mean one thing.

Being labeled "renowned" must mean you are not bound by the scientific method, and that "journalists" are not bound by "fact" in reporting "news".

Re:scientific method, fact... goes out the window, (2, Funny)

Anonymous Coward | more than 12 years ago | (#2803239)

Screw ZeoSync, I've built a compression algorithm that is 1000:1 and is completely lossless. I have yet to demonstrate it in public, though, so please give me venture capital. Thank you.

Impossible (0)

Anonymous Coward | more than 12 years ago | (#2803172)

If the data is truly random, there is absolutely no way possible to compress it. This is bollocks.

That was... (0)

Anonymous Coward | more than 12 years ago | (#2803176)

...the funniest thing i've read in a while.

Re:That was... (0)

Anonymous Coward | more than 12 years ago | (#2803222)

I agree, must have been designed to get a response out of slashdot. Some people have a lot of time on their hands.

bull! (1)

bob@dB.org (89920) | more than 12 years ago | (#2803177)

Compress all possible sequences of 1000 bytes down to 10 bytes. If none of the "compressed" sequences are the same, the method works; if not, these guys are just blowing smoke.

Any first-year CS student knows it can't be done.

random (1)

blank_coil (543644) | more than 12 years ago | (#2803178)

I didn't read the entire press release, but I did notice the subtitle:

"International Team of Scientists Have Discovered How to Reduce the Expression of Practically [emphasis mine] Random Information Sequences "

So I guess the data does have at least some redundancy in it. I'm not an expert, so I don't know if this makes their claim more likely to be true, but I thought it should be pointed out.

Why must you be so jealous Michael? (-1, Troll)

Uttles (324447) | more than 12 years ago | (#2803179)

...something to liven up your drab dull existence today.

Look, us Engineers and math types can't help it if we make more money than you CS and CIS people, it's just the way the world works. Go back to school and get a real degree.

Unlikely (1)

Travelr9 (514162) | more than 12 years ago | (#2803180)

From the splash screen:
"with costumer service agents providing chat assistance."

Let's see, I prepare a press release guaranteed to garner my website tens, if not hundreds, of thousands of hits, and I leave an egregious typo as my first impression?

Not!

In this house we obey the 2nd law of thermodynamic (3, Insightful)

tshoppa (513863) | more than 12 years ago | (#2803183)

From the Press Release [zeosync.com]:
This press release may contain forward-looking statements. Investors are cautioned that such forward-looking statements involve risks and uncertainties, including, without limitation, financing, completion of technology development, product demand, competition, and other risks and uncertainties.
They left out Disobeying the 2nd law of Thermodynamics! [secondlaw.com]

there's (1)

Zephy (539060) | more than 12 years ago | (#2803194)

excrement, bovine at that, in the air.

Fairly safe to assume that this is a hoax (1)

CharlesDarwin (163099) | more than 12 years ago | (#2803195)

From their website:

"ZeoSync's new "Binary Accellerator (TM)" is not a compression technology, rather it encodes digital information into fast and dependable muti-dimensional mathematical entities that the company calls "Gems (TM)". We have chosen the name "Gems" as an acronym for Multi-Dimenstional Mathematical Reduction (MDMR) does not clearly define the condensation process, as successfully as does the mental image of the crystallization of nature that transforms rough materials into precious stones."

"Once crystallized, Gems are able to move rapidy on a fixed set of binary carriers through existing digital transmission devices, breaking all known transmission barriers. This MindSpeed velocity affects the complete global communications infrastructure by sending more data accross less bandwidth while saving time..."

MindSpeed velocity? You've got to be kidding me!


Re:Fairly safe to assume that this is... (0)

Anonymous Coward | more than 12 years ago | (#2803243)

It is fairly safe to assume that this is (supposed to be) an advertisement of a company that provides "Strategic Business Communications" services...
http://www.wilsonmchenry.com/ [wilsonmchenry.com]

comms compression, not data compression (-1)

Fucky the troll (528068) | more than 12 years ago | (#2803200)

It's not like zip files, where you compress the data. It's more like modem compression, which compresses the signal rather than the data itself. Read the article.

I could see it working in a specific context (2)

SerpentMage (13390) | more than 12 years ago | (#2803203)

Many people may say this is bull, but think of it another way.

Instead of assuming that the data is static, think of it as constantly moving. Even random data can be organized as it streams past. It is sort of like when a crowd of people files into a hall: everyone is unique, but you could organize them and call out, "Hey, five red shirts now", "ten blue shirts now".

And I think that is what they are trying to achieve: move the dimensions into a different plane. However, here is what I wonder about: how fast will it actually be? I am not referring to the mathematical requirements, but the data will stream in and you will attempt to organize it on the fly. Does that organization mean that some bytes have to wait?
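
The grouping described above ("five red shirts now") is essentially run-length encoding applied to a stream. A minimal sketch of that idea, assuming nothing about ZeoSync's actual method:

<ecode>
# Hypothetical illustration only -- plain run-length encoding of a
# stream, i.e. the "five red shirts now" grouping described above.
# This is not ZeoSync's algorithm.

from itertools import groupby

def rle_stream(stream):
    """Yield (count, symbol) pairs for each run of identical symbols."""
    for symbol, run in groupby(stream):
        yield sum(1 for _ in run), symbol

shirts = ["red"] * 5 + ["blue"] * 10 + ["red"] * 2
print(list(rle_stream(shirts)))  # [(5, 'red'), (10, 'blue'), (2, 'red')]
</ecode>

Note that on truly random input the runs average barely one symbol long, so this kind of grouping expands the data rather than compressing it; and because a run must end before it can be emitted, some symbols do indeed have to wait.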

Compression to one bit (1)

BlueWonder (130989) | more than 12 years ago | (#2803204)

Hey, if their algorithm works on random data, re-apply it to the output, and it will be compressed again. You can do this again and again until only one bit is left!

Now, let's decompress a 0 bit and a 1 bit. All software ever written, and all software yet to be written, must come out, since everything would have to compress down to either a 0 or a 1.

Seriously though, the comp.compression FAQ [faqs.org] is really worth a read, especially question #9 [faqs.org] .
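
The reductio above is easy to check empirically: feeding a real compressor its own output stops paying off almost immediately. A small sketch with zlib (any lossless compressor behaves the same way):

<ecode>
# Demonstrates the counting argument: re-applying a lossless compressor
# to its own output cannot keep shrinking it. On random input, zlib's
# output actually grows, due to header and bookkeeping overhead.

import os
import zlib

data = os.urandom(4096)  # random bytes: incompressible with high probability
for i in range(5):
    out = zlib.compress(data, 9)
    print(f"pass {i}: {len(data)} -> {len(out)} bytes")
    data = out
</ecode>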

1/7/2001? (1)

NFNNMIDATA (449069) | more than 12 years ago | (#2803212)

It's either a year old or they are just that lame...

What's random? (2)

Moderation abuser (184013) | more than 12 years ago | (#2803221)

What're they talking about? 20Gb of rand() output?

If so, they're a bunch of twits.
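
There is a real point buried here: rand()-style output is not actually random. Any amount of it is losslessly represented by the generator plus a seed and a length; "decompression" is just rerunning the generator. A sketch, using Python's built-in PRNG purely for illustration:

<ecode>
# PRNG output is deterministic: a few bytes (seed + length) reproduce
# megabytes of "random" data exactly. Illustration only, with Python's
# Mersenne Twister standing in for rand().

import random

def regenerate(seed, n):
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

one_mb = regenerate(seed=42, n=1_000_000)
assert one_mb == regenerate(42, 1_000_000)  # fully determined by (42, 1000000)
</ecode>

True random data, say from a hardware source, offers no such shortcut, which is exactly where Shannon's limit bites.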

Been there, done that... (4, Informative)

color of static (16129) | more than 12 years ago | (#2803226)

There seems to be a company claiming to exceed, go around, or obliterate Shannon every few years. In the early 90's there was a company called WEB (a year or so before the WWW was really around). They made claims of compressing any data, even data that had already been compressed. It is a sad story that you should be able to find in either the comp.compression FAQ or the revived Deja archives. It basically boils down to this: as they got closer to market, they found some problems... you can guess the rest.
This isn't limited to the field of compression, of course. There are people who come up with "unbreakable" encryption, infinite-gain amplifiers (is that gain in V and I?), and all sorts of perpetual motion machines. The sad fact is that compression and encryption are not well enough understood for these ideas to be killed before a company is started on the strength of the claims.

license? (1)

Karmageddon (186836) | more than 12 years ago | (#2803228)

at the risk of starting a licensing flame war, what license do you think they'll use when they release this? GPL? BSD? Or will it be a regular old patent-with-royalty sort of thing?

Think really hard...

Why Flash? (1)

Isao (153092) | more than 12 years ago | (#2803231)

You'd think they'd create a Java application to present their site compressed with their own methodology.

Heck, even Sun ponied up a streaming video-on-demand Java applet for their CEO speeches, just to show there weren't performance issues involved.

Blah! (2, Funny)

jsse (254124) | more than 12 years ago | (#2803233)

We already have lzip [slashdot.org] to compress files down to 0% of their original size. ZeoSync doesn't keep up with the latest technologies on /., it seems.

compressing the story 100:1 (0)

Anonymous Coward | more than 12 years ago | (#2803235)

claim = crap

It's about /practically/ random data (2)

Telcontar (819) | more than 12 years ago | (#2803236)

If you read the press release carefully, they claim to be able to compress practically random data, such as pictures of green grass, 100:1. They never claim to be able to do the same with truly random data, since that is impossible.

There may be something to that. However, there are also many points that make me sceptical; maybe the press release has simply not been reviewed carefully enough.
This new algorithm cannot break Shannon's limit (nothing can), so the phrase about the "historical limitations" is a hoax...
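
Shannon's limit can be stated concretely: for a memoryless source, no lossless code can average fewer bits per symbol than the entropy H = -sum(p * log2(p)). A quick sketch of why a skewed "mostly grass" source leaves room for compression while uniform random data leaves none:

<ecode>
# Shannon's bound in miniature: entropy is the floor on average bits
# per symbol for a memoryless source. Skewed distributions compress;
# uniform random ones cannot.

from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

skewed  = [0.90] + [0.10 / 255] * 255  # one byte value dominates
uniform = [1 / 256] * 256              # truly random bytes

print(f"skewed : {entropy(skewed):.2f} bits/byte")   # ~1.27 -> roughly 6:1
print(f"uniform: {entropy(uniform):.2f} bits/byte")  # 8.00 -> no gain at all
</ecode>

Even the grass picture only reaches 100:1 if its per-symbol (or per-block) statistics are extremely skewed; nothing about the bound itself is superseded.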

Buzz-word ALERT! (2)

Hougaard (163563) | more than 12 years ago | (#2803241)

ZeoSync intentionally randomizes naturally occurring patterns to form entropy-like random sequences through its patent pending technology known as Zero Space Tuner(TM). Once randomized, ZeoSync's BinaryAccelerator(TM) encodes these singular-bit-variance strings within complex combinatorial series to result in massively reduced BitPerfect(TM) equivalents. The combined TunerAccelerator(TM) is expected to be commercially available during 2003.

I think they have made a buzz-word compression routine; even our sales people have difficulty putting this many buzz-words into a press release :)

Some background reading: (5, Interesting)

Quixote (154172) | more than 12 years ago | (#2803245)

Section 1.9 of the comp.compression FAQ [faqs.org] is good background reading on this stuff. In particular, read the "WEB story".

I wonder if... (2)

mirko (198274) | more than 12 years ago | (#2803247)

Most random generation uses bytes as its unit.
Now, what if they looked for bit sequences (not only 8-bit sequences, but maybe odd lengths) in order to find patterns?
I guess this could be a way to compress data significantly, but it would imply a huge number of reads over the data to achieve the best possible result.
Note they may also do this in more than one pass, but then their compression would be really lengthy.
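
One way to make that speculation concrete: slide a window across the bit string at every offset and count how often each k-bit pattern recurs. Purely a guess at the idea, not anything ZeoSync has described:

<ecode>
# Speculative sketch of bit-level (not byte-aligned) pattern hunting:
# count every k-bit window at every bit offset. Exhaustive and slow,
# which matches the worry above about the number of reads required.

from collections import Counter

def bit_patterns(data, k):
    bits = "".join(f"{byte:08b}" for byte in data)
    return Counter(bits[i:i + k] for i in range(len(bits) - k + 1))

counts = bit_patterns(b"abracadabra", k=5)
print(counts.most_common(3))  # the most frequently recurring 5-bit windows
</ecode>

Repeating this for every candidate length k is one full pass per length, which is why the multi-pass variant would indeed be really lengthy.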

Reminder to Self... (2)

kramer (19951) | more than 12 years ago | (#2803250)

Never, *EVER* accept any advice from the Aberdeen Group. Apparently their analysts don't know shit.

"Either this research is the next 'Cold Fusion' scam that dies away or it's the foundation for a Nobel Prize. I don't have an answer to which one it is yet," said David Hill, a data storage analyst with Boston-based Aberdeen Group.

Wonder which category he expects them to win in...

Physics, Chemistry, Economics, Physiology / Medicine, Peace or Literature

There is no Nobel category for pure mathematics or computing theory.