
IBM Optical Chip Zips Huge Files Using Little Power

ScuttleMonkey posted more than 6 years ago | from the zip-without-the-zap dept.


An anonymous reader wrote to mention that IBM has unveiled a new prototype chip that can transmit data at up to 8 TB/sec, or about 5,000 high-def video streams. While this might not be entirely amazing, the fact that they did it using the same amount of juice required to light a 100-watt lightbulb, is. "The resulting total bi-directional data transfer rate is 300 Gb/s, nearly doubling the performance of a version IBM introduced last year. Compared to current commercial optical modules the transceiver provides 10-fold greater bandwidth in 1/10 the volume while consuming comparable power, IBM said."


95 comments


So that would make it use about... (5, Funny)

Anonymous Coward | more than 6 years ago | (#22608624)

100 watts? Anybody want to check my math?

Re:So that would make it use about... (1)

RichiH (749257) | more than 6 years ago | (#22609276)

Using advanced techniques, I was able to determine that you are wrong.

It is 200 Watts.

Re:So that would make it use about... (2, Interesting)

that this is not und (1026860) | more than 6 years ago | (#22609290)

They said the amount of juice to light a 100 watt bulb. Check it some time. You can reduce the voltage to half and get light out of a 100 watt bulb. So do they mean 100 watts? Or are they maybe driving with enough voltage that it consumes 800 watts for a short time? These clever journalistic/marketing phrases are vague.

Re:So that would make it use about... (2, Funny)

SickHumour (928514) | more than 6 years ago | (#22611276)

Odd. I usually measure juice in litres, not light bulbs.

Re:So that would make it use about... (1)

that this is not und (1026860) | more than 6 years ago | (#22611494)

I measure it in june bugs. Or in oranges, of course.

Re:So that would make it use about... (4, Funny)

Chapter80 (926879) | more than 6 years ago | (#22609502)

Proper conversion:

Start with a 100 Watt bulb

Divide by 7, which is the number of watts necessary to properly illuminate a square foot of floor space.
This gives roughly 14.28 square feet able to be illuminated.

Divide this into 2.1 million sq ft [nps.gov] , the amount of square feet in the Library of Congress (USLOC).
This tells us that 147,000 watts are necessary to illuminate the US Library of Congress.

Divide by 1.09951163 × 10^13 bytes [google.com] , the amount of storage per unit of USLOC. [jamesshuggins.com]
This tells us that 1.33 x 10^-8 bytes are illuminated per watt

Multiply by 7GB (7,516,192,768 bytes), which is the number of gigabytes of printed material that can be properly illuminated by a 1-watt bulb.

Answer: 100.4

So you were close.

Re:So that would make it use about... (4, Funny)

Chapter80 (926879) | more than 6 years ago | (#22609528)

Answer: 100.4

So you were close.

Oops, I used Excel, so don't trust my calculations. Never mind.

Re:So that would make it use about... (1)

b96miata (620163) | more than 6 years ago | (#22609504)

someone needs to check the article's math. How do they go from 8TB/s down to 300GBps?

Re:So that would make it use about... (1)

maxwell demon (590494) | more than 6 years ago | (#22609688)

Extremely lossy compression. :-)

Re:So that would make it use about... (4, Funny)

maxwell demon (590494) | more than 6 years ago | (#22609670)

No, they are speaking about amounts of juice, so the proper unit would be liter/second (or gallons/second for you Americans). So how much juice would you need? Well, they didn't say what type of juice they would need, so I just assumed apple juice. I've found this table [kalorien-tabelle.de] [German], which tells me that 0.2 liters of apple juice with sugar has 175 kJ. That is, we have an energy of 875 kJ/l. Now a 100 W lightbulb needs 0.1 kJ/s, therefore to power a 100 W lightbulb, we need about 10 liters per day.
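For anyone who wants to double-check the parent's arithmetic, here is the same calculation as a short Python sketch (it takes the quoted 175 kJ per 0.2 l figure at face value; the rest is unit conversion):

# Back-of-the-envelope check of the apple-juice-per-lightbulb figure above.
energy_density = 175 / 0.2                            # kJ per litre of apple juice (~875 kJ/l), per the cited table
bulb_power_kj_per_s = 0.1                             # a 100 W bulb draws 0.1 kJ/s
daily_energy = bulb_power_kj_per_s * 24 * 60 * 60     # ~8640 kJ per day
litres_per_day = daily_energy / energy_density
print(f"{litres_per_day:.1f} litres of apple juice per day")   # ~9.9, i.e. about 10 litres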

Re:So that would make it use about... (3, Funny)

ScrewMaster (602015) | more than 6 years ago | (#22610640)

Well, they didn't say what type of juice they would need, so I just assumed apple juice.

You're a Mac user, aren't you.

Re:So that would make it use about... (1)

superflyguy (910550) | more than 6 years ago | (#22610332)

Actually it's 200*cos^2(120*pi*t+theta) watts as a function of time, assuming 60 Hz mains and that a lightbulb is an ideal resistor.
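A quick numerical sanity check of that expression, assuming 60 Hz mains, an ideal resistive bulb, and theta = 0 (this sketch is not part of the parent's post):

import numpy as np

# P(t) = 200 * cos^2(120*pi*t) for a 100 W bulb on 60 Hz mains; the time average should be 100 W.
t = np.linspace(0, 1 / 60, 100_000, endpoint=False)   # one full mains cycle
p = 200 * np.cos(120 * np.pi * t) ** 2
print(round(p.mean(), 1))   # 100.0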

Juice! (5, Funny)

kevmatic (1133523) | more than 6 years ago | (#22608638)

The same amount of juice to power a 100-watt light bulb!?

That's like... 100 Watts!

Unless you go compact fluorescent. Then it's like 15 watts.

Re:Juice! (0, Funny)

Anonymous Coward | more than 6 years ago | (#22608646)

My calculations reveal a power usage of 100 watts!

Re:Juice! (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22608710)

I have never used juice to power anything except for animals. I also have an anus the size of god, and it is full of unnecessary particles. That's lowercase god - he's God's younger unisex sibling.

Re:Juice! (5, Funny)

Anonymous Coward | more than 6 years ago | (#22608894)

Maybe it was written by one of these people: [upenn.edu] . (Or maybe I just wanted an excuse to post the link).

"He was as tall as a six-foot-three-inch tree"

"The red brick wall was the color of a brick-red Crayola crayon."

"John and Mary had never met. They were like two hummingbirds who had also never met."

Re:Juice! (5, Funny)

JustOK (667959) | more than 6 years ago | (#22609022)

The parent post is as off topic as an off-topic parent post.

Re:Juice! (1)

mabinogi (74033) | more than 6 years ago | (#22609100)

Disease and deprivation stalk the land like two...giant stalking things!

Re:Juice! (1)

VVrath (542962) | more than 6 years ago | (#22611140)

<pedant class="VocabNazi">

Those "Metaphors" are all similes.

</pedant>

Re:Juice! (2, Funny)

JustOK (667959) | more than 6 years ago | (#22612742)

Well, a metaphor is like a metaphor.

Re:Juice! (1)

JustOK (667959) | more than 6 years ago | (#22612754)

And making a dumb typo even when using the preview button is like making a dumb typo without using the preview button, but worse.

Re:Juice! (1)

autocracy (192714) | more than 6 years ago | (#22609414)

Measured from the power company... :)

*waits for the comment "it's the same 100 watts..."*

Re:Juice! (1)

TooTechy (191509) | more than 6 years ago | (#22609632)

Yep. I can see one of these puppies in my laptop.

The 160W power supply might cost a pretty penny. But then, if we need the PSU for Blu Ray? What's the difference?

Re:Juice! (1)

vanyel (28049) | more than 6 years ago | (#22610874)

...which explains why optical computing hasn't caught on if it's that power hungry!

Re:Juice! (1)

dulridge (454779) | more than 6 years ago | (#22611870)

Stuff that!
CFL marketing claims are one thing, light meters are another...

Say 25W and you get closer, but not there. 30W CFLs probably exceed 100W incandescent lightbulbs, lower wattage ones don't. Speaking as the guy who still tries to find 200W incandescent bulbs which used to be easy to find and now aren't findable at all.

Re:Juice! (1)

danwat1234 (942579) | more than 5 years ago | (#22614260)

No, the 100 W fluorescent equivalent is 25 watts (General Electric), but whatev

Third post (0)

Anonymous Coward | more than 6 years ago | (#22608644)

Third post!

How much power? (5, Funny)

l00sr (266426) | more than 6 years ago | (#22608658)

Let's see... how much power does it take to power a 100-Watt light bulb... hm... Well, according to Wikipedia, a 100-Watt incandescent lightbulb outputs about 1700 lumens. A quick googling reveals that the average incandescent bulb achieves a lighting efficiency of roughly 15.75 lumens per Watt. A simple calculation then yields that the power used by a 100-Watt light bulb is roughly 107.93 Watts. Q.E.D.
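The same joke calculation as a short Python sketch, taking the parent's 1700 lm and 15.75 lm/W figures as given:

lumens_out = 1700      # lm, quoted output of a 100-Watt incandescent bulb
efficacy = 15.75       # lm/W, quoted average incandescent efficacy
print(f"{lumens_out / efficacy:.2f} W")   # ~107.9 W, matching the parent's figure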

Re:How much power? (0)

Anonymous Coward | more than 6 years ago | (#22608764)

Of all the wattage jokes in this thread so far, yours is tops. Bravo!

Re:How much power? (1)

Whiteox (919863) | more than 6 years ago | (#22609252)

Not really, I mean no one mentioned that they could use one of those energy efficient bulbs and reduce the power consumption to about 20 watts.

Re:How much power? (2, Funny)

that this is not und (1026860) | more than 6 years ago | (#22609300)

I can probably get detectable amounts of light out of a 100 watt bulb with the bulb consuming maybe 4-7 watts. If you insist we can take this matter to the lab and find out.

Re:How much power? (4, Funny)

thrillseeker (518224) | more than 6 years ago | (#22609476)

If you insist we can take this matter to the lab and find out.

Ah, the never tiring slashdotter pick up line ...

Re:How much power? (2, Funny)

dreamchaser (49529) | more than 6 years ago | (#22609920)

That wasn't a pickup line. It was calling someone out.

"Want to see my rare collection of Star Trek memorabillia" would be a more typical pickup line.

Server use? (1)

cloakable (885764) | more than 6 years ago | (#22608690)

I dunno - power usage aside, 300 Gb/s isn't something to be sniffed at. I could see this device taking pride of place in some geek's fileserver, for example - I'd imagine being able to max out a few 1 Gbps links to the server would be very handy for things like a MythTV server, or ripping hi-def video directly to the network server.

I know I'm interested!

What's with the summary? (2, Insightful)

macslas'hole (1173441) | more than 6 years ago | (#22608700)

A hundred watts, that's all good and well, but what does it have to do with zipping huge files? Or am I reading impaired?

Re:What's with the summary? (3, Informative)

UncleTogie (1004853) | more than 6 years ago | (#22608822)

A hundred watts, that's all good and well, but what does it have to do with zipping huge files? Or am I reading impaired?

My bet is on someone using the word "zip" somewhat like Ballmer used "squirt", i.e., to send quickly...

Re:What's with the summary? (4, Funny)

macslas'hole (1173441) | more than 6 years ago | (#22608838)

Ballmer used "squirt"

Please, Mr. Ballmer, don't squirt me!...

YUCK!

The mental image of Sweaty MonkeyBoy and the word squirt should never be in the vicinity of each other. It is an abomination.

Re:What's with the summary? (1)

SoulRider (148285) | more than 6 years ago | (#22613288)

Christ, I'm going to have nightmares for a week now. Thanks a lot guys.

Sheesh!

Re:What's with the summary? (1)

that this is not und (1026860) | more than 6 years ago | (#22609314)

I was worried this was going to turn into an issue, with the usual trolls espousing one or another of the annoying compression formats like lharc or 7zip. Or whatever is the newest and greatest 'hot-dog' compression format the warez cowboys are using to distinguish themselves as superior to the rest of us. I ran into a new one just the other week. Thank goodness pkgsrc rescued me.

Re:What's with the summary? (0)

Anonymous Coward | more than 6 years ago | (#22609378)

Close, but not quite. They are not the same but are connected to each other.
The order is usually "zip"->"squirt"->"zip". The "zip" operations are enable and disable signals for the "squirt" operation.

Things can get messy if you forget to call the first "zip". And your peers might make fun of you if you forget the last "zip".

Re:What's with the summary? (3, Interesting)

evanbd (210358) | more than 6 years ago | (#22608828)

High-speed networking takes a non-trivial amount of power to drive the signals, be they electrical or optical. Especially for optical devices, the efficiency in getting that power onto the transmission medium is low. At high enough speeds, there are also a lot of high speed transistors switching in the control logic that use power for the same reasons as your CPU. So, they've improved the power consumption in these and other areas.

Re:What's with the summary? (1)

macslas'hole (1173441) | more than 6 years ago | (#22608856)

High-speed networking takes a non-trivial amount of power to drive the signals, be they electrical or optical. Especially for optical devices, the efficiency in getting that power onto the transmission medium is low. At high enough speeds, there are also a lot of high speed transistors switching in the control logic that use power for the same reasons as your CPU. So, they've improved the power consumption in these and other areas.

My quibble is with "IBM Optical Chip Zips Huge Files". Made me think IBM had some specialized compression photonic chips.

Re:What's with the summary? (1)

twiddlingbits (707452) | more than 6 years ago | (#22609792)

Say what? The losses for light propagating in fibers are about 0.2 dB/km for modern single-mode silica fibers. That's pretty darn efficient. And typically, the reflected power is about 4% of the incident power at a glass-to-air interface. Many lasers for (current) optical transmission are under 10 watts. If IBM has gotten the speed they got for 100 watts, that's impressive.

As for equipment, optical switches use a couple of watts and optical routers aren't that bad; I can spec a system of about 2.5 Tbps (Juniper T-series) that uses about 4500 watts for chassis, cards, control logic, hard drives, etc. By the way, the logic is all in ASICs, with very little discrete circuitry.
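For anyone curious what 0.2 dB/km works out to over distance, a rough sketch of just that attenuation arithmetic (not a claim about IBM's transceiver):

def power_remaining(km, loss_db_per_km=0.2):
    # Fraction of launched optical power left after `km` of single-mode fibre.
    return 10 ** (-loss_db_per_km * km / 10)

for km in (1, 10, 50, 100):
    print(f"{km:>3} km: {power_remaining(km) * 100:.1f}% of launch power")
# 1 km ~95.5%, 10 km ~63.1%, 50 km ~10.0%, 100 km ~1.0%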

Re:What's with the summary? (1)

evanbd (210358) | more than 6 years ago | (#22611130)

The inefficiency, AIUI, is in the electrical -> optical energy conversion, not anything in the purely optical domain. If you need 100mW of optical power, it takes a lot more electrical power to get it. I don't know exact numbers off hand, or even how much power is typical (I'm not a networking guy, but I've looked at diode lasers for other purposes before).

I assumed there wasn't any discrete circuitry beyond perhaps the optical component and a temperature controller if there is one. It's the ASIC I was talking about using power, though I confess I have no idea what a modern power budget would be for it.

Re:What's with the summary? (1)

Plutonite (999141) | more than 6 years ago | (#22609086)

Yeah, I was about to say that if they can zip huge files while using power "like a 100-watt lightbulb" then why the hell can't they unzip them just as effortlessly? Or even better, why can't they play Crysis or run Vista while using the said likeliness of lightbulb consumption? Or do you automatically need more power (lots of lightbulb-watt stuff) for Vista? Is running Vista like running a Christmas tree compared to the lightbulb of file zipping? This is just so unfair, and I hope Ballmer squirts some education into the summary writer's mailbox. I am so not going to RTFA.

Re:What's with the summary? (1)

macslas'hole (1173441) | more than 6 years ago | (#22609114)

I hope Ballmer squirts some education into the summary writer's mail box. I am so not going to RTFA.

LOL LOL LOL ROFL
You, Sir, have made beer come out my nose. I bow before your exquisite hilarity.

Re:What's with the summary? (1)

that this is not und (1026860) | more than 6 years ago | (#22609318)

Well, I can zip files with an 8088 processor, or even a pathetic little 6502. I'm not sure the algorithm would fit onto a PIC10F200 though. It's all a matter of how slow you want the compression to run. It could probably be done with a homebrew computer that uses vacuum tubes, so that it would consume 8000 watts and take 40 minutes to 'zip' a 532-byte text file.

If Ballmer squirts on that one, though, he risks shattering the glass envelopes on the tubes. In particular, keep whatever you squirt with away from the very hot gas reference tube. The reference tube doesn't have a filament but is still the hottest tube on the chassis.

Re:What's with the summary? (1)

mysticgoat (582871) | more than 6 years ago | (#22610412)

This story was one of the first things I looked at this morning, and I misread the title as "IBM optical chip zips huge flies" which sounded kind of weird in a cyborg sort of way.

And now parent post talks of zipping huge flies, which sounds kind of weird with a hint of goatse.

<!--
note to self:
Don't read slashdot before coffee
-->

Re:What's with the summary? (1)

Dun Malg (230075) | more than 6 years ago | (#22612148)

A hundred watts, that's all good and well, but what does it have to do with zipping huge files? Or am I reading impaired?
No, you're reading fine. The problem is that ScuttleMonkey (or whomever wrote the headline) is a fucking illiterate moron. They used the verb "to zip" to mean "send", when it has no such meaning in regular speech. Now, it's usually acceptable to bend meanings for words and use their context as a clue, but in this case the verb "to zip" already has a common meaning in the computer world (i.e. to compress using the ZIP format developed by Phil Katz), so it's contextual meaning is overridden by its common meaning.

Re:What's with the summary? (0)

Anonymous Coward | more than 5 years ago | (#22616088)

"is a fucking illiterate moron" ... "so it's contextual meaning is overridden by its common meaning."


hmmmm..... glass houses....

Scuttlemonkey (2, Funny)

Anonymous Coward | more than 6 years ago | (#22608706)

What the hell are you doing to the language of Shakespeare? Did you translate the submission from German?

While this might not be entirely amazing, the fact that they did it using the same amount of juice required to light a 100-watt lightbulb, is.

IBM Rocks! (4, Insightful)

The_Dougster (308194) | more than 6 years ago | (#22608708)

Even though a lowly peon like myself can barely aspire to ever own much real IBM hardware, I have to say they really make some great stuff. Since my P20 monitor finally died, all I have now is an IBM Z50 Workpad, which is a pretty sweet little thing.

I had an RS/6000 briefly and experimented with running Debian on it. It was some impressive metal, but AIX ran circles around Debian and the graphics were unsupported in Linux. I sold it for more than I paid for it and kept the P20 monitor for free. I ran that monitor for about 5 years.

IBM hardware has always been esoteric, fantastically expensive, and of supreme quality; however, they are just a bit out of touch with regular lusers. For instance, why can't we buy a workstation with a CELL chip even now? We know it could run Linux, easily. Why are we forced to fool around with PS3 consoles when Big Blue could be making the next best thing since the IBM PC?

I'd seriously consider spending $5k for a spiffy IBM cell box running AIX or Linux as long as it could run a PCIe OpenGL card. Heck, I'd take it if it came with OS/2 even!

Re:IBM Rocks! (1)

kestasjk (933987) | more than 6 years ago | (#22609066)

Software that's custom made for the cell chip barely exploits all the parallel compute power, so I doubt gcc would compile Linux to make use of it (if it can even compile to cell, which I'm not sure of).

Re:IBM Rocks! (4, Insightful)

BiggerIsBetter (682164) | more than 6 years ago | (#22609188)

Software that's custom made for the cell chip barely exploits all the parallel compute power, so I doubt gcc would compile Linux to make use of it (if it can even compile to cell, which I'm not sure of).
IOW, you have no idea what you're talking about. :-) There are out-of-the-box distributions of Linux for Cell platforms (Yellow Dog, Fedora, Ubuntu even), and the gcc supplied in IBM's SDK is quite happy to compile for the PPE and SPE cores. Yes, I've played with all this on the PS/3. PCI and Blade hardware is available from Mercury and IBM, but it's pricey... you could drop one of Mercury's Cell PCI cards into a small IBM xSeries...

Anyway, I agree with the OP, this is a killer chip for many many of the applications we use today, and IBM should talk Lenovo (or, oh please, SUN) into selling a Cell-based Linux (or, oh please, Solaris) workstation. That would be ridiculous for software development if it had a Java SDK to go with it.

Re:IBM Rocks! (1)

Garridan (597129) | more than 6 years ago | (#22610902)

Workstation? How bout a Cell-based thinkpad? Especially the little one they want to pit against the Air. You'd need an external battery, but it would be *so* worth it. I'm still annoyed by the complete lack of quad-core laptops (except for the $3000, 50lb XTremeNotebook) out there. A Cell-based mini-thinkpad would save the frikkin' day!

Re:IBM Rocks! (2, Insightful)

Oddster (628633) | more than 6 years ago | (#22611956)

Software that's custom made for the cell chip barely exploits all the parallel compute power, so I doubt gcc would compile Linux to make use of it (if it can even compile to cell, which I'm not sure of).
IOW, you have no idea what you're talking about. :-) There are out-of-the-box distributions of Linux for Cell platforms (Yellow Dog, Fedora, Ubuntu even), and the gcc supplied in IBM's SDK is quite happy to compile for the PPE and SPE cores. Yes, I've played with all this on the PS/3. PCI and Blade hardware is available from Mercury and IBM, but it's pricey... you could drop one of Mercury's Cell PCI cards into a small IBM xSeries...

Anyway, I agree with the OP, this is a killer chip for many many of the applications we use today, and IBM should talk Lenovo (or, oh please, SUN) into selling a Cell-based Linux (or, oh please, Solaris) workstation. That would be ridiculous for software development if it had a Java SDK to go with it.
You're both half-wrong. Yes, you can get Linux running on a Cell just fine. No, software that was not written specifically with the Cell in mind (read: almost everything) does not use the co-processors (SPU/SPE, whatever) anywhere near capacity. And in fact, almost all of the software that is written for the Cell processor still doesn't use the co-processors anywhere near capacity. It is a very difficult platform to program for, and because of the inherent design of the Cell, it simply performs poorly compared to SMP for a large class of problems. And by difficult, I mean that you have to sit there trying to figure out how to fit your 17k of code and 512k of data into a unified 256k buffer (information theory comes in handy), because going outside the local buffer and using DMA is not only a huge pain in the ass to code up, it is also a huge performance hit. Programming for the Cell is a step backwards from the ideal computer science goal of abstracting the hardware from the code.

I work in the games industry, and I recently saw the performance timer graphs of a very popular racing game that was very recently released for the PS3, a second-generation PS3 title. It was using the co-processors at about 10% of their capacity, and that only came in regular-interval spikes. And this is a piece of software that the Cell was specifically designed to run.

Trust me, you do not want to be programming for the Cell. Stay happy with SMP on the desktop and above, leave the Cell to die in the PS3 as it should. Yes, it's a bit of a technical marvel, but quite frankly, it is not worth all the secondary costs.

Re:IBM Rocks! (1)

BiggerIsBetter (682164) | more than 6 years ago | (#22613746)

You're both half-wrong. Yes, you can get Linux running on a Cell just fine. No, software that was not written specifically with the Cell in mind (read: almost everything) does not use the co-processors (SPU/SPE, whatever) anywhere near capacity. And in fact, almost all of the software that is written for the Cell processor still doesn't use the co-processors anywhere near capacity. It is a very difficult platform to program for, and because of the inherent design of the Cell, it simply performs poorly compared to SMP for a large class of problems. And by difficult, I mean that you have to sit there trying to figure out how to fit your 17k of code and 512k of data into a unified 256k buffer (information theory comes in handy), because going outside the local buffer and using DMA is not only a huge pain in the ass to code up, it is also a huge performance hit. Programming for the Cell is a step backwards from the ideal computer science goal of abstracting the hardware from the code.

I work in the games industry, and I recently saw the performance timer graphs of a very popular racing game that was very recently released for the PS3, a second-generation PS3 title. It was using the co-processors at about 10% of their capacity, and that only came in regular-interval spikes. And this is a piece of software that the Cell was specifically designed to run.
Interesting to hear a perspective from the gaming industry. Disappointing results though, as someone who's played with the Cell in an academic environment.

Re:IBM Rocks! (1)

DrSkwid (118965) | more than 6 years ago | (#22609246)

Yeah, MCA was the dogs, XGA is amazing

Re:IBM Rocks! (0)

Anonymous Coward | more than 6 years ago | (#22609308)

Every time IBM tries to enter the personal computer business, they fail horribly.

IBM PC? Got cloned and gave us the Wintel monopoly.

PS/2? Tried to imitate the Macintosh proprietary business model... and failed, badly.

Then IBM tried to get back into the PC business, as sorta a competitor to Dell, Compaq, and the like. I don't remember what the venture was called this time around, but it didn't make very much money, and was eventually scrapped.

Then everything got transferred over to Lenovo et al.

Along the way were various experiments with OS/2, OpenDoc (not to be confused with ODF), PowerPC, etc... none of which were ultimately successful.

Final analysis: The desktop computer business is low margin, commodity hardware that absolutely doesn't benefit from what IBM is good at, which is consulting and putting together huge mainframe systems. IBM spun off all that useless desktop crap, and is a much better company today for it.

And that, my friend, is why IBM isn't in the desktop business, and never will be. They make more money not even caring about Joe Blow User.

Re:IBM Rocks! (1)

that this is not und (1026860) | more than 6 years ago | (#22609326)

My RS/6000 box is a Power 1 system with the microchannel bus. Rumor has it that it was fantastically expensive once. I'd shy away from saying it is 'impressive metal' at this point. Nice heavy steal chassis, but it sure can't outcrank my SparcServer 1000 (4 cpus!).

I also have an IBM PC Convertible. It does a fairly unimpressive job of running Minix, if you really want.

Re:IBM Rocks! (0)

Anonymous Coward | more than 5 years ago | (#22616206)

"Nice heavy steal chassis"

When were you planning to return it?

Re:IBM Rocks! (1)

TooTechy (191509) | more than 6 years ago | (#22609652)

I agree. I have enjoyed using lots of IBM HW since 1992. There was a little junk in that time frame and some seriously bad HDs. But, on the whole, SOLID!

There have also been some RS6K laptops, and some PA-RISC ones from HP. They sold, but not many.

The market just ain't big enough!

Re:IBM Rocks! (1)

Hollinger (16202) | more than 6 years ago | (#22610314)

I'd seriously consider spending $5k for a spiffy IBM cell box running AIX or Linux as long as it could run a PCIe OpenGL card. Heck, I'd take it if it came with OS/2 even!


Actually, you just haven't looked hard enough. :-) You can buy a QS21 "Cell Blade" for $9,995 [ibm.com] . Add $1949 for a BladeCenter S Chassis [ibm.com] and you're set.

You can also purchase these items used for a significant discount if you look in the right places [google.com] .

ARRRR! (3, Funny)

eggman9713 (714915) | more than 6 years ago | (#22608718)

Let's just hope the pirate bay doesn't get a hold of this puppy.

Re:ARRRR! (0, Offtopic)

macslas'hole (1173441) | more than 6 years ago | (#22608766)

Let's just hope the pirate bay doesn't get a hold of this puppy.

What's with the moderation in this article? The parent [slashdot.org] is at least funny.

Crack-smokin Mods (0, Offtopic)

macslas'hole (1173441) | more than 6 years ago | (#22608784)

Off-topic, my drunken ass! There are, at this moment, five other posts at zero or less that should be at one or more. Do you late-nighters just dislike AC's that much?

Re:Crack-smokin Mods (1)

macslas'hole (1173441) | more than 6 years ago | (#22609056)

That's hilarious [slashdot.org] . I wonder what took you guys so long.
Back on topic...

I really expected, when I clicked on this article, to see something about a photonic data-compression chip. That would have been interesting. I am usually the first person to mod down BS posts as over-rated, but this article is just disappointing. A more efficient interconnect is useful, but I'm sure that, in a few months, an even more efficient one will come along. This article is barely news, even for nerds like me.

How to stops the retarted submissions? (3, Insightful)

Anonymous Coward | more than 6 years ago | (#22608756)

Someone who confuses "TB/s" vs "Tb/s" and "zips" vs "transmits" doesn't deserve to be posted.

I read (0)

Anonymous Coward | more than 6 years ago | (#22608790)

"... Chip Zips Huge FLIES Using Little Power ..."

I have to admit I'm a bit disappointed now.

First psot! (0)

Anonymous Coward | more than 6 years ago | (#22608812)

Frist post!!!

'Zips huge files' (5, Insightful)

malakai (136531) | more than 6 years ago | (#22608840)

I can't be the only one that clicked on this expecting some sort of hardware based compression acceleration. I expected some sort of optical take on compression.

Re:Abracadabra compression (0)

Anonymous Coward | more than 6 years ago | (#22608974)

Now you see it, now you don't.

Re:'Zips huge files' (0)

Anonymous Coward | more than 6 years ago | (#22609164)

No you aren't.

Quite disappointing in fact.

Re:'Zips huge files' (1)

Katatsumuri (1137173) | more than 6 years ago | (#22609278)

Maybe you simply need a paradigm shift to grok the leverage of power words in headlines...

Re:'Zips huge files' (1)

that this is not und (1026860) | more than 6 years ago | (#22609336)

The word 'grok' is now copyrighted by that pajamas legal clerk chick. You're gonna have to make a visit to your lawyer now, dude.

Re:'Zips huge files' (1)

argStyopa (232550) | more than 6 years ago | (#22609302)

/seconded.
This is a TECH news site, where the readers can be assumed to have a certain minimal technical acumen. The editors MUST have a basic familiarity with the jargon and glossary of technical terms of the computer field. For example, in computerese "ZIP" doesn't specifically mean 'go fast' - it refers to a specific method of data compression or the idea of compressing data generally.

To not understand that suggests a level of editorial competence in this field below that of, oh, say Us magazine. Not that Us is badly edited, not at all. But the audience there accepts a certain level of technical uncertainty that this audience wouldn't.

Re:'Zips huge files' (0)

Anonymous Coward | more than 6 years ago | (#22610356)

There's already lots of chips out there for zip and gzip compression/decompression in hardware.

Am I the only one... (1)

Yvanhoe (564877) | more than 6 years ago | (#22609120)

... to think that 100 W for a chip is still a lot?
Maybe we would need a point of comparison.

Re:Am I the only one... (1)

imsabbel (611519) | more than 6 years ago | (#22609270)

Calculate it down:
This would be equivalent to 125mW per Gbit ethernet port.
IIRC, current implementations are a factor of 10-30 above that value.

I, for one (1)

LecheryJesus (1245812) | more than 6 years ago | (#22609364)

I for one welcome our new illuminati(ng) overlords...

This Story Raises The Burning Question (4, Funny)

SkyDude (919251) | more than 6 years ago | (#22609610)

How many /.ers does it take to change a lightbulb?



No one knows. Those who try keep getting electrocuted when their tinfoil hats make contact with the socket.

Re:This Story Raises The Burning Question (3, Funny)

maxwell demon (590494) | more than 6 years ago | (#22609726)

But they will be able to tell you amazing things about changing light bulbs.
You'll learn that
  • in Soviet Russia, light bulbs change YOU
  • In Korea, only old people change light bulbs
  • Netcraft confirms: light bulbs are dying
  • A reasonable business plan might look like
    1. Change light bulb
    2. ???
    3. Profit!

Also, they'll want to know if the light bulb runs Linux, and ask you to imagine a Beowulf cluster of them.

Re:This Story Raises The Burning Question (1)

svartrev (1057480) | more than 6 years ago | (#22622218)

Well, I for one welcome our /. changing light bulb Overlords.

Re:This Story Raises The Burning Question (0)

Anonymous Coward | more than 6 years ago | (#22684152)

"Also, they'll want to know if the light bulb runs Linux, and ask you to imagine a Beowulf cluster of them"

    I once had a Beowulf cluster of light bulbs, but my mom put it on the Christmas tree and our golden retriever FANG knocked it over... I lost $2.34

Re:How many /.ers does it take to change a lightbu (1)

Derf_X (651876) | more than 6 years ago | (#22609846)

How many /.ers does it take to change a lightbulb?
The correct answer is none. /.ers don't need any lightbulb, the only light they need is the one coming from their computer monitor, and maybe from the LEDs of all their electronic gadgets.

Re:How many /.ers does it take to change a lightbu (1)

SkyDude (919251) | more than 5 years ago | (#22615204)

How many /.ers does it take to change a lightbulb?
The correct answer is none. /.ers don't need any lightbulb, the only light they need is the one coming from their computer monitor, and maybe from the LEDs of all their electronic gadgets.

Ahhh.... true, it is so.

Re:This Story Raises The Burning Question (0)

Anonymous Coward | more than 5 years ago | (#22614380)

How many /.ers does it take to change a lightbulb?


I've been trying to find out, but these compact fluorescent and LED bulbs never seem to burn out. Ask again later.

Watts that in Joules per Library of Congress? (1)

davidwr (791652) | more than 6 years ago | (#22609746)

Let X = number of Libraries of Congress per second.

100 Watts = 100 Joules per second

Answer: 100/X

Plug in X and you have your answer.
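Purely for illustration, plugging in numbers that already appear in this thread (TFA's 8 terabit/sec and the ~1.1 x 10^13 bytes-per-LoC figure cited upthread; both are borrowed from other posts, not new data):

bytes_per_loc = 1.09951163e13            # bytes per Library of Congress, per the figure cited upthread
link_rate_bits_per_s = 8e12              # 8 terabit/sec, per TFA
X = (link_rate_bits_per_s / 8) / bytes_per_loc    # Libraries of Congress per second, ~0.09
print(f"X = {X:.3f} LoC/s, so 100/X = {100 / X:.0f} J per Library of Congress")   # ~1100 J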

Finally! (0)

Anonymous Coward | more than 6 years ago | (#22609944)

Now Lotus Notes might be as fast and snappy as other applications!

90's Flashback (3, Interesting)

Fieryphoenix (1161565) | more than 6 years ago | (#22610012)

Anyone else remember the dramatic claims of special chips that would "soon" allow insane levels of compression in data storage using fractal algorithms? 135 times compression, back when Stacker was the app that saved your bacon when you ran low on disk space? That's the sort of thing I thought of when I read the headline.

Re:90's Flashback (1)

TapeCutter (624760) | more than 6 years ago | (#22612276)

There are at least two good reasons fractal compression 'failed' as a general compression algorithm...

1. It's not lossless.
2. Someone patented it, tried to gloss over the first problem and priced it out of the market.

Not entirely amazing (1)

NormalVisual (565491) | more than 6 years ago | (#22610030)

While this might not be entirely amazing, the fact that they did it using the same amount of juice required to light a 100-watt lightbulb, is.

Dad: "That's 100 watts to you and me, Billy"
Billy: "Wow!"

Bad Measurement.. (1)

LingNoi (1066278) | more than 6 years ago | (#22612226)

about 5,000 high-def video streams.
That's great but I have to ask the question that everyone wants to know... How many libraries of congress is this?

Well dang. (1)

pclminion (145572) | more than 6 years ago | (#22612512)

I first read that headline as, "IBM Optical Chip Zaps Huge Flies." How disappointing.

bits != bytes (1)

Mikeeee84 (969805) | more than 5 years ago | (#22614888)

From TFS

can transmit data at up to 8 TB/sec
From TFA

could transmit up to 8 terabit/sec