
Intel Kills Consumer Larrabee Plans

Soulskill posted more than 4 years ago | from the vapor-what dept.


An anonymous reader tips news that Intel has canceled plans for a consumer version of their long-awaited and oft-delayed Larrabee chip, opting instead to use it as a development platform product. From VentureBeat: "'Larrabee silicon and software development are behind where we had hoped to be at this point in the project,' said Nick Knupffer, a spokesman for Intel in Santa Clara, Calif. 'Larrabee will not be a consumer product.' In other words, it's not entirely dead. It's mostly dead. Instead of launching the chip in the consumer market, Intel will make it available as a software development platform for both internal and external developers. Those developers can use it to develop software that can run in high-performance computers. But Knupffer said that Intel will continue to work on stand-alone graphics chip designs. He said the company would have more to say about that in 2010."


166 comments

Oh rats (1)

cheesybagel (670288) | more than 4 years ago | (#30331570)

No consumer version means this will turn into another i860. I guess ATI will remain the only viable competitor to NVIDIA then.

Re:Oh rats (0)

Anonymous Coward | more than 4 years ago | (#30331620)

More likely this means the chip was a gigantic failure. More dedicated chips offer superior performance; no surprise.

On a sad note, I was hoping for such a chip to allow better 'hardware acceleration' for videos on Linux, as well as better performance for standard APIs that hardware makers don't support.

the performance is there (3, Interesting)

Blue Shifted (1078715) | more than 4 years ago | (#30331696)

Re:the performance is there (1, Informative)

Anonymous Coward | more than 4 years ago | (#30331790)

Read the comments. It looks like a lopsided comparison, with other folks getting higher results from e.g. ATI 4800, 5800.

Re:the performance is there (5, Insightful)

Foredecker (161844) | more than 4 years ago | (#30331934)

Vaporware is not faster than existing products.

Re:the performance is there (1)

eggnoglatte (1047660) | more than 4 years ago | (#30332346)

To be fair, the hardware exists, it is just not commercially available (and now we have learned that it won't be available in the future either, unless you are in a research group). My guess is that the hardware works just fine, but the programming model makes it much harder than anticipated to reach the nominal performance for practical problems. Kind of like the i860 as cheesybagel points out.

Of course this is what a lot of graphics researchers thought might happen ever since Intel presented the Larrabee paper at SIGGRAPH. There is going to be a lot of "told you so" going on over the next few days.

Re:the performance is there (2, Interesting)

guardiangod (880192) | more than 4 years ago | (#30332434)

> So what we have here is Itanium: looks good on paper but impossible to fully utilize.

> That constitutes a failure if you ask me.
 
Actually I hold the exact opposite view. The hardware isn't ready, and by not ready I mean the performance isn't as high as expected, due to design issues.

If I am correct, Intel doesn't want a repeat of the 1st-gen Itanium, where the brand name was blemished at release by lower-than-expected performance. The perception that IA64 is slow continues to haunt Intel to this day. So by delaying Larrabee, Intel will have time to improve the CPU to the point where, on release, it will be a killer product (i.e. hyped).

It's not as if Intel needs Larrabee in the near future anyway. AMD doesn't have anything significant in the near future either; even if they do, with Intel's brute engineering capability, they will just pull a Core 2 again.

Another possibility is that no game company is able to support Larrabee's architecture. Rather than releasing a product that (1) nothing old can run efficiently on and (2) nothing new is designed for, Intel is delaying the release until more developers hop on the gravy train. When that happens, Intel can release the chip, and consumers will immediately be awed by the chip's performance in the newest games.

Re:the performance is there (2, Insightful)

Foredecker (161844) | more than 4 years ago | (#30332636)

I have no insider knowledge, but I strongly suspect they had problems with both the HW and software. I suspect the hardware actually worked pretty well (it is Intel - they don't suck at all) but the problem was costs. Both ATI and NVIDIA have been at this a long time, and producing cost-effective graphics silicon is quite difficult. The software is also quite complex. The rendering model is -very- different, and nobody is going to re-write all their software to accommodate something alien. So, they had to make it work with existing models. This is expensive, both in terms of run-time efficiency and engineering calendar time. I suspect they figured out they simply couldn't compete in the mass PC graphics market. I suspect Jen-Hsun Huang at NVIDIA is having a very, very good day.

Re:the performance is there (1)

symbolset (646467) | more than 4 years ago | (#30332870)

No, Intel is very good but sometimes the best laid plans of mice and men aft gang agley and all that.

It's cool that they're not afraid to hang themselves out there like that. If you want to see something new you gotta scratch your feet on a new road.

sounds more like gma video good on paper but the d (1)

Joe The Dragon (967727) | more than 4 years ago | (#30332676)

Sounds more like GMA video: good on paper, but the drivers / performance are just not there.

Re:the performance is there (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30333198)

If I am correct, Intel doesn't want a repeat of the 1st-gen Itanium, where the brand name was blemished at release by lower-than-expected performance. ...
It's not as if Intel needs Larrabee in the near future anyway. AMD doesn't have anything significant in the near future either; even if they do, with Intel's brute engineering capability, they will just pull a Core 2 again. ...
Another possibility is that no game company is able to support Larrabee's architecture.

Intel is great at manufacturing and CPUs, but they couldn't make a decent GPU and driver if their life depended on it. Until Intel can produce a GPU that is competitive with ATI / nVidia, any pie-in-the-sky talk (like Larrabee) is just vaporware and should be largely ignored.

Re:the performance is there (3, Insightful)

Kjella (173770) | more than 4 years ago | (#30332472)

Vaporware is not faster than existing products.

Vaporware is always faster than existing products.

Re:the performance is there (1)

Foredecker (161844) | more than 4 years ago | (#30332608)

Well! You of course are perfectly correct! Ahahahahah! I haven't been hanging around the marketing folks for a while...

Re:the performance is there (0)

Anonymous Coward | more than 4 years ago | (#30332138)

For throughput computing, Larrabee is great, and that is what the article talks about. It's not the same as rendering a video game... which is what they were going for...

Re:Oh rats (-1)

Anonymous Coward | more than 4 years ago | (#30331624)

One year ago I said the opposite, but right now it looks like ATI is the only high-end consumer graphics card supplier. I wonder if nvidia is about to throw in the towel in the high-end consumer segment and concentrate on low end + HPC instead.

Re:Oh rats (0, Flamebait)

chriso11 (254041) | more than 4 years ago | (#30331810)

The simple truth is Intel can't do anything but CPUs (and maybe chipsets). Anytime they go outside of their comfort zone, they get smacked around.

Re:Oh rats (4, Insightful)

QuantumRiff (120817) | more than 4 years ago | (#30331672)

I would say ATI/AMD is about to become the leader. Intel is making it more difficult to ship mobile systems without the craptastic Intel graphics. Larrabee was supposed to be a decent-performance GPU that would almost be like a co-processor.

AMD has slightly slower CPUs, but their integrated graphics blow the snot out of the Intel ones, and are getting even better. What good is a super-fast CPU if you can't play any games, or even do basic stuff without leaning on the power-hungry CPU?

Future is Fusion (1)

sanman2 (928866) | more than 4 years ago | (#30332432)

I like the Fusion concept, and feel that Intel will ultimately be forced to imitate it as well. Their abandonment of Larrabee is consistent with that. Hell, I even hope that Scorpius will become the foundation for Nintendo's Wii-2 or Wii-HD.

Re:Oh rats (2, Interesting)

alvinrod (889928) | more than 4 years ago | (#30332442)

I don't know about that. Intel's offerings that are slated to come out 1Q - 1H of 2010 could give AMD some problems. Right now AMD has the performance advantage in the server space, but Gulftown [wikipedia.org] will likely trump their offerings. Arrandale [wikipedia.org] also looks quite impressive, especially the quad core i7 with an 18 watt TDP. The cores only run at 1.2 GHz, but with their Turbo boost the chip can clock up to 2.2 GHz. That will offer some amazing battery life for laptops and still provide good performance. I do believe some of the Arrandale processors will have a GPU on die as well. Granted it's an Intel GPU, but it offers some great power and cost savings over having to include a discrete card.

AMD doesn't look to have anything great coming out until late 2010 or early 2011 based on their roadmap [anandtech.com]. It helps that ATI is kicking ass in the graphics space. Right now they're winning on price and power. If they can get more of their 5800 series out in the market and release the mobile versions of those cards sooner rather than later, they'll be able to push a lot of hardware that way. However, they're not a real threat to Intel until they can get their SOC products out the door and offer a really compelling reason to go with their products.

Settling their legal issues with Intel will also help them a lot in the long run, but they're not out of the woods yet. They're still having financial problems, but if they can get through the next 18 months they'll be in great shape. The fact that they've been ahead of schedule on a lot of their new chips in the last year has probably helped substantially as well. AMD is in a good position for the long term, but they need decent sales in the coming quarters, which may be difficult with Intel releasing a lot of great new chips, especially in the mobile market where AMD hasn't been particularly strong recently.

Re:Oh rats (1)

Lemming Mark (849014) | more than 4 years ago | (#30332474)

Craptastic as the Intel cards may be in overall performance terms, I could happily take any of the integrated parts by Intel that have decent Linux support on my next desktop, even if that meant a massive reduction in performance. I have an Xbox 360 for playing games, and I would love for my desktop to Just Work as well as my Eee does with Linux. That said, with ATI cards getting better and better support under Linux, it is quite possible that they'll be the best option by the time I upgrade again...

I disagree (1, Informative)

Sycraft-fu (314770) | more than 4 years ago | (#30332734)

Many people really don't care about their graphics card. If you don't do games, an Intel chipset graphics unit works fine. It accelerates the shiny interface in Windows 7 and everything is nice and responsive. For business uses, this is plenty.

Ok well if you do care about games, then you want a discreet graphics solution. Integrated solutions will just never do well. Big reason is memory. You can make your card as fast as you like; if it shares system memory, it is severely bottlenecked. Graphics cards need their own dedicated high-speed memory to perform well.

As such, I just don't see ATi having a slightly better integrated solution as something people will care much about. The bigger question is who makes the better CPUs, and that is firmly in Intel's arena. Their CPUs are faster, and can be lower power. So regardless of whether you want power savings or performance, they've got a good chip.

AMD really has to get their chips up to snuff before they'll start competing with Intel more. They don't have to beat Intel at everything, but they need at least one area where they're better, and they really don't seem to have one. Also they need to do better with chipsets and motherboards. A big advantage Intel has with regards to the reseller market is that they do their own solutions. Intel will sell you a CPU, chipset and motherboard and they all work together well. OEMs like this; it cuts down on supply-chain problems and on vendors blaming each other when there's trouble.

This has also historically been a weakpoint for AMD. I remember when their Athlons came out and there was no question, they beat the P3's price/performance ratio. They were the kings of the hill. I bought one... and returned it two weeks later. The reason? Chipsets. I could not get a chipset that would work with my GeForce 256 properly. They had poor regulation of the AGP signal and it just wouldn't work. Bought an Intel chip/board and it worked flawlessly the first time.

So when AMD has a good CPU/chipset/mobo combo and CPUs competitive with Intel in at least one arena, I think maybe they'll make gains. Until then, I think they'll mainly be relegated to "cheap brands" and to enthusiast BYO systems.

Re:I disagree (1)

TheLink (130905) | more than 4 years ago | (#30332864)

> Ok well if you do care about games, then you want a discreet graphics solution.

The graphics hardware for games tends to be rather indiscreet: big rapidly-spinning fans, hot, noisy, lots of shiny/glossy metal, and big.

See the second pic:
http://techreport.com/articles.x/17986 [techreport.com]

Integrated graphics solutions (which are nondiscrete) tend to be way more discreet. Just one small chip (or even just part of another chip), quiet, fanless, small.

Re:I disagree (0)

Anonymous Coward | more than 4 years ago | (#30333118)

It accelerates the shiny interface in Windows 7 and everything is nice and responsive. For business uses, this is plenty.

Photoshop and various transcoding applications already support GPU acceleration. Can other apps be far behind? Where will Intel be with such a backwards product line? This type of behavior is what allowed AMD to define the 64-bit instruction set and left Intel lagging behind.

Ok well if you do care about games, then you want a discreet graphics solution. Integrated solutions will just never do well. Big reason is memory. You can make your card as fast as you like; if it shares system memory, it is severely bottlenecked.

That problem has already been solved. Also, even motherboards with IGPs now have dedicated frame-buffer memory, called sideport memory, which allows for (some) gaming. Since this is the first generation of chipsets with dedicated memory, I'm sure this technology will only get better and allow for an improved gaming experience. Finally, many laptops already have dedicated video memory.

Also they need to do better with chipsets and motherboards. A big advantage Intel has with regards to the reseller market is that they do their own solutions. Intel will sell you a CPU, chipset and motherboard and they all work together well.

How is this different from AMD and ATI? You can get an integrated solution from AMD as well that includes CPU, chipset, and an IGP. There are other reasons why AMD is not competitive in a lot of markets, but having a working, integrated solution isn't one of them.

mod parent up (1)

Billly Gates (198444) | more than 4 years ago | (#30333252)

My wife and I play WoW, but most users prefer a Wii or PS3 if they want to play games.

It's frustrating, and I agree that the Intel chipsets and integrated chips (not true video cards) put desktops 5-6 years behind and piss off game developers, forcing them to port only to consoles.

The netbook phenomenon shows this trend for slim, boring graphics that are cheap, cheap and, uh, cheap.

Most game developers have left the PC as a result, thanks to angry kids whose parents get them a nice i945-graphics-chipset computer and then wonder why Crysis is a slide show.

Re:Oh rats (-1, Troll)

KillShill (877105) | more than 4 years ago | (#30331740)

Barely.

ATi is only as viable as the monopolist Nvidia will let them be.

ATi has been very fortunate so far but they've just squeaked by in some years.

The problem is an uneducated populace... they keep rewarding anti-competitive and unethical behavior by continuing to purchase products from the monopolist.

Hopefully the thrice-convicted monopolist (intel) will beat the snot out of nvidia (recent lawsuit) and thereby reduce their effectiveness as a bully in the video card market.

Fighting fire with fire sometimes is the only way when no other solution is provided.

Re:Oh rats (5, Interesting)

chriso11 (254041) | more than 4 years ago | (#30331988)

NVidia hasn't let ATI do anything. Actually, NVidia is dealing with a series of problems - from serious packaging problems last year to TSMC yield issues now. ATI/AMD has been really effective lately; NVidia historically had a dominant position, but definitely not a monopoly, and I'd say they have slipped a lot recently. Things change fast in the GPU race, so NVidia may recover quickly. But ATI/AMD have a solid amount of momentum, and the only real execution mistake I've seen them make in the last few months in GPUs has been relying on TSMC.

Take a look at the Dell Zino HD - it combines AMD's 'just enough CPU' with a top-end GPU to make a very compelling system. Intel has cut NVidia out of the chipsets, so they don't get the synergy that AMD has with ATI.

AMD is definitely better situated for the long haul than NVidia, and actually may be better off than Intel for complete systems.

Re:Oh rats (1, Informative)

Anonymous Coward | more than 4 years ago | (#30332306)

NVidia historically had a dominant position

I suppose "historically" is a relative term. I remember when just about EVERY graphics card was ATI.
ATI had the OEM market in the bag for quite a while.

From 1999: [findarticles.com]

What this also does is put a dent in the armor of ATI Technologies Inc., Toronto, Canada. ATI is the PC graphics market share leader with revenues close to $1 billion and has been steam rolling over the competition in the PC space for the past year or so. This includes S3, Trident Microsystems, 3Dfx, 3Dlabs and even Intel. The only companies to put up much of a fight was Nvidia, which is much smaller than ATI, and Montreal, Canada-based Matrox Graphics Inc., which has a similar business model to ATI.

Until the nVidia juggernaut took off [zdnetasia.com] in 2000:

Nvidia has overtaken ATI Technologies as the biggest maker of chips to enhance graphics on desktop computers, according to a new study by industry consultant Mercury Research.
In the third quarter, Nvidia chips were in 48 percent of all desktop computers, more than doubling its market share from 20 percent in the third quarter last year, Mercury said. ATI slipped to 34 percent from 39 percent.

Re:Oh rats (1)

eggnoglatte (1047660) | more than 4 years ago | (#30332354)

I might agree with you if ATI/AMD would finally get serious about producing drivers that aren't complete crap. Their hardware is fine, but Linux drivers, as well as OpenGL drivers on Windows just plain suck.

Re:Oh rats (1)

Maian (887886) | more than 4 years ago | (#30333044)

I don't think AMD cares that much if Linux gets subpar drivers, given its low market share and audience. Macs might be a different story, and their reliance on OpenGL could get AMD to care more about it.

Re:Oh rats (2, Informative)

TeXMaster (593524) | more than 4 years ago | (#30333370)

I might agree with you if ATI/AMD would finally get serious about producing drivers that aren't complete crap. Their hardware is fine, but Linux drivers, as well as OpenGL drivers on Windows just plain suck.

It's not just the video drivers. ATI also has a horrible software stack (SDK, runtime, compiler and documentation) for their Stream GPGPU computing architecture, which is why everybody uses NVIDIA and its excellent CUDA. Generally speaking, ATI has excellent hardware, but such hardware is useless if you don't have matching software to exploit it.

Lol at the idiots (2, Funny)

Anonymous Coward | more than 4 years ago | (#30331580)

So they intend to take a product, whose chief advantage was that it could run old x86 code, and only sell it to people who are designing new software? Am I the only one who sees a problem with this?

In other words... (4, Insightful)

sznupi (719324) | more than 4 years ago | (#30331586)

A nicer way of saying:

Uhm, guys, remember how we were supposed to ship a year ago, and how we said recently that we'd ship a year from now? Well, add 5 to that now... but we will deliver and totally kick ass, promise.

Re:In other words... (2, Funny)

symbolset (646467) | more than 4 years ago | (#30331834)

An Itanium-class part, then.

Re:In other words... (2, Insightful)

sznupi (719324) | more than 4 years ago | (#30331874)

Hm, yeah... a variant of FUD; spreading wonderful stories about a future product just to stall / eradicate the competition; just so potential clients will wait.

What doesn't add up in this case is that Intel, at this point in time, seems quite cautious in their claims about Larrabee - they hardly claim anything / are themselves very skeptical about it, even in the face of major delay & re-engineering?

Re:In other words... (4, Interesting)

ppanon (16583) | more than 4 years ago | (#30333612)

Yeah, I've been wondering about that. For the last year I've heard people parrot how great Larrabee was going to be, and it reminded me a lot of the hype about how the Pentium IV (or even Itanium, for that matter) was going to kick ass. I couldn't see Intel all of a sudden going from dead last in graphics performance to top of the heap. They would have needed some top graphics system designers on both the h/w and s/w sides, and those people just haven't been at Intel.

I can't help but wonder if Larrabee FUD and the chipset disputes with NVidia might have been a one-two punch plan to knock NVidia's market capitalization down a peg or two to make it cheaper to buy out. Then Intel's in the driver's seat to get NVidia's expertise and patents for a song instead of paying top dollar for them. Intel could have been planning this from the moment AMD bought out ATI two years ago, or even earlier when the latter two were still in preliminary talks. I doubt there would be any email smoking guns over it though; Intel's where the paranoid survive, after all.

But if I'm right, then I would expect Intel to make a play for NVidia inside of two years. To wait much longer would give AMD/ATI too much of a headstart in a market increasingly dominated by laptops. Somehow, 18 months after Intel buys NVidia, Larrabee II will show up with graphics performance slightly better than NVidia's last GPU (and those suckers doing Larrabee development are going to find the pipeline/rendering model significantly changed to look a lot like NVidia's).

FIRST POST!!! (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30331596)

So long, suckers!

Larrabee = Graphics Chip competing w nVidia (5, Informative)

billstewart (78916) | more than 4 years ago | (#30331648)

In case you've forgotten what a Larrabee was (like I had), it was Intel's planned graphics / vector processing chip, competing with nVidia and AMD / ATI graphics systems. Here's the Wikipedia article [wikipedia.org].

Re:Larrabee = Graphics Chip competing w nVidia (2, Insightful)

segedunum (883035) | more than 4 years ago | (#30332036)

I certainly had forgotten, thanks. What I certainly haven't forgotten is the pain of marrying a powerful Intel processor with anything like acceptable integrated graphics.

I guess this means that the only option we have for half-decent graphics with an Intel processor is an nVidia chipset. However, that relationship looks a bit rocky, and very soon we'll probably only be left with the incredibly shitty Intel integrated graphics systems that work passably (i.e. you can display a Vista/7 desktop and that's it) until you actually want them to do anything even remotely... graphical. Their acceleration performance for video isn't too hot either.

Either that, or you move to AMD/ATI if you want a decent processor/chipset/integrated-graphics combination. AMD must be pleased. This is the best news they've had in quite a while, and their purchase of ATI looks to be paying off. If Intel can't get Larrabee working, then I don't know where they go from here, apart from trying again and actually getting it working, or starting to be nice to nVidia.

Re:Larrabee = Graphics Chip competing w nVidia (0)

Anonymous Coward | more than 4 years ago | (#30332136)

I find an excellent way of getting "half-decent graphics" is to buy a discrete graphics card. They aren't at all expensive, if you're only looking for half-decent.

Re:Larrabee = Graphics Chip competing w nVidia (2, Insightful)

dwinks616 (1536791) | more than 4 years ago | (#30332256)

Oh, well please show me where I can buy this discrete card for my laptop?

Re:Larrabee = Graphics Chip competing w nVidia (0)

Anonymous Coward | more than 4 years ago | (#30332684)

Buy a laptop with a discrete card. Is it really such a difficult concept to grasp? I had no problem with it.

Re:Larrabee = Graphics Chip competing w nVidia (1)

moosesocks (264553) | more than 4 years ago | (#30332600)

I'm not one for conspiracy theories, although I wouldn't be terribly shocked if Intel surprised everybody and launched Larrabee a few months after AMD released a competing product.

In the past, Intel has deliberately stifled product development and engaged in anticompetitive behaviors that would make even Microsoft look twice (and has been found guilty and forced to pay up for it). Remember how quickly Intel brought consumer x86-64 chips to market after AMD proved that the platform was technically and commercially viable?

Of course, this may be giving Intel too much credit -- the success of the 'Core' series was essentially a whole lot of luck -- Itanium and Pentium 4 were always planned to be "the way forward" for the company. When neither panned out, the company was able to fall back on its low-power mobile platform, which turned out to scale remarkably well, despite having its origins in a much older architecture.

Re:Larrabee = Graphics Chip competing w nVidia (1)

qazadex (1378043) | more than 4 years ago | (#30333560)

Why doesn't Intel just stop trying to create GPUs? I've got a new laptop with one of their graphics chipsets, and it absolutely sucks. Seems like Intel should stick to 'normal' processors.

Re:Larrabee = Graphics Chip competing w nVidia (0)

Anonymous Coward | more than 4 years ago | (#30333664)

The GP somewhat misrepresents what Larrabee is (was going to be?). It wasn't a graphics processor so much as an extension to the x86 instruction set (with a corresponding addition of a lot more floating-point units), such that Intel could make a processor that showed up as a bunch of regular x86 cores, each with special instructions that would make software rendering on them really fast.
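
To make that concrete, here is a minimal sketch of the kind of per-pixel inner loop such wide vector hardware was meant to chew through - plain scalar C that a vectorizing compiler could map onto 16-wide SIMD, with all names made up for illustration; nothing here is Larrabee-specific:

    #include <stdint.h>

    /* Alpha-blend one scanline of src over dst. Branch-free integer
       math per pixel; a 16-wide vector unit could retire 16 of these
       iterations per instruction. (Divides by 256 as a cheap stand-in
       for the exact divide-by-255 blend.) */
    void blend_scanline(uint8_t *dst, const uint8_t *src, int n, int alpha)
    {
        for (int i = 0; i < n; i++) {
            int d = dst[i], s = src[i];
            dst[i] = (uint8_t)(d + (alpha * (s - d)) / 256);
        }
    }

The point is only the shape of the code: software rendering on many small x86 cores with wide vectors would be built out of loops like this.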

Does Sarah O'Connor have anything to do with it? (0)

Anonymous Coward | more than 4 years ago | (#30331658)

Too bad, Larrabee looked like the next thing.

Re:Does Sarah O'Connor have anything to do with it (1)

ettlz (639203) | more than 4 years ago | (#30331764)

Sarah O'Connor? Would she be the mother of Jonny O'Connor from County Cork, who'll lead humanity to victory in the war against the machines and the English?

Re:Does Sarah O'Connor have anything to do with it (0)

Anonymous Coward | more than 4 years ago | (#30333594)

You laugh now, but you'll be calling for his help once Sky O'Net blows up the British nuclear stockpiles of up to 50 warheads.

Great, just in time for Duke Nukem Forever! (4, Funny)

WoTG (610710) | more than 4 years ago | (#30331662)

Hmm... I think Intel's plan is for Larrabee GPUs to launch at the same time as Duke Nukem Forever! :)

Re:Great, just in time for Duke Nukem Forever! (0)

Anonymous Coward | more than 4 years ago | (#30331772)

It's old, dude. Give it a rest.

Re:Great, just in time for Duke Nukem Forever! (5, Funny)

Shikaku (1129753) | more than 4 years ago | (#30331804)

Imagine a beowulf cluster of old memes. Oh wait, I don't have to, it's Slashdot.

Re:Great, just in time for Duke Nukem Forever! (1)

Kjella (173770) | more than 4 years ago | (#30332482)

Imagine a beowulf cluster of old memes. Oh wait, I don't have to, it's Slashdot.

If Intel has one-chip cloud computers, then slashdot has one-post beowulf clusters.

So the next mini, low end imac and 13" macbook's w (1, Interesting)

Joe The Dragon (967727) | more than 4 years ago | (#30331784)

So the next Mini, low-end iMac, and 13" MacBooks will be stuck with shit video, and the Mac Pro will start at $3000 with 6-core CPUs.

Will Apple move to AMD just to get better video in low-end systems?

Re:So the next mini, low end imac and 13" macbook' (1, Informative)

Anonymous Coward | more than 4 years ago | (#30331902)

Apple already dropped GMA for the low-end stuff; they're using the GeForce 9400M instead. They're also using Radeons on most iMac models.

Re:So the next mini, low end imac and 13" macbook' (1)

jasonwc (939262) | more than 4 years ago | (#30331986)

I'm not sure what you're referring to. MacBooks and MacBook Pros are configured with Nvidia 9400M or 9600M chipsets. They may not be powerful but at least they are dedicated graphics solutions. Far superior to Intel integrated graphics, and they provide working hardware acceleration for H.264/VC-1. The Intel G45 chipset does too - but only with MPC-HC, not for commercial Blu-ray playback - and it had some corruption last I checked.

i3/i5 cut off nvidia and the low end cpus have gma (1)

Joe The Dragon (967727) | more than 4 years ago | (#30332624)

The i3/i5 cut off nvidia, and the low-end CPUs have GMA built in; Apple will likely put an i3 in the Mini and stick it with crap video at $800 as well.

Re:i3/i5 cut off nvidia and the low end cpus have (1)

jasonwc (939262) | more than 4 years ago | (#30332720)

Ah, I thought you were talking about their current rather than future offerings.

Re:So the next mini, low end imac and 13" macbook' (2, Interesting)

willy_me (212994) | more than 4 years ago | (#30333568)

but at least they are dedicated graphics solutions

Actually, the 9400M is not. It uses system memory, but does a much better job than Intel. It also acts as the memory controller and does system IO. The reason for the parent's comments is that all future Intel CPUs will have integrated memory controllers (like the i7 and i5) and an integrated GPU. Performance will suck, but it will make for cheap systems. This will make it difficult for system builders to make a low-end system with good graphics performance, as the market for such systems will be small. The smaller market will reduce the quality/performance of available parts for those system builders - one of which is Apple.

Re:So the next mini, low end imac and 13" macbook' (0)

Anonymous Coward | more than 4 years ago | (#30332030)

Marketroids, not engineers, buy parts for systems. Apple has a contract with Intel, and they will continue to buy from Intel until the profit margins shrink.

Re:So the next mini, low end imac and 13" macbook' (0)

Anonymous Coward | more than 4 years ago | (#30333630)

Marketroids? Leo, is that you?

Re:So the next mini, low end imac and 13" macbook' (0)

Anonymous Coward | more than 4 years ago | (#30333690)

Whoops, my mistake. It was Mike Smithwick who coined the term [uni-bielefeld.de]? Though I don't know if Mike is on Slashdot, and I do know Leo was posting to comp.sys.amiga back when Mike coined the term, only a few years after the release of its inspiration: Robotron.

Re:So the next mini, low end imac and 13" macbook' (0)

Anonymous Coward | more than 4 years ago | (#30332296)

A better question is when will AMD come out with a competitive mobile platform, because Apple sure as hell would never use their current stuff.

Heterogeneous Processors Are Doomed (0, Interesting)

Anonymous Coward | more than 4 years ago | (#30331868)

The idea that the future of parallel processing somehow rests on the use of a bunch of hybrid cores built on the same die was wrong right out of the gate. If parallel CPU cores are a pain in the ass to program, what makes them think that it will be easier by combining them with a non-compatible type of parallel hardware? The CPU/GPU marriage is a match made in hell and, deep down, Intel knows it. Larrabee was just so much puffery and chest beating, king of the jungle and all that jazz.

The way to solve the parallel programming crisis is by first acknowledging that last century's computing paradigms are completely inadequate in the age of massive parallelism. It is time to change to the true computing religion and abandon the outmoded worship of the hopelessly flawed Turing Machine.

Next in line for destruction: AMD's Fusion. You read it here first.

How to Solve the Parallel Programming Crisis [blogspot.com]

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332172)

Louis, you're describing Erlang's model of OO programming in your page. Erlang is nice. And tools like it will continue to gain industrial support.

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332330)

Nope. Erlang is not the solution [blogspot.com] either.

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332388)

Nope. Erlang is not the solution [blogspot.com] either.

Apparently Erlang is incompatible with the tiny angels that push electricity around in the computers of the future...

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332180)

Because a fringe pseudoscience blogger has completely pwned the thousands of engineers at Intel. Yep.

Oh, the post above *sounds* reasonable. But then you poke around and find things like this:

http://rebelscience.blogspot.com/2009/11/lattice-propulsion-one-more-clue.html

From that article:

The electrostatic field between two charged parallel surfaces consists of opposite-facing seraphim being emitted by the plates. The seraphim reaching the plates interact with the plate particles.

Clearly somebody needs their meds adjusted...

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332414)

Let's see. "I'll ignore his message on parallel programming and go straight for the ad hominem, because I know he's right and I can't stand it." You're a paid Intel shill and you know it. LOL.

Re:Heterogeneous Processors Are Doomed (2, Insightful)

ThatMegathronDude (1189203) | more than 4 years ago | (#30333072)

I have a 4-year CS degree and I can tell you with certainty that that blogger is full of shit. The problems that are already parallelizable are easily multithreaded with current technology. The problems with serial dependencies are not, and never will be, easily multithreaded.

Rendering graphics is already done, because it's easy to split the task of rendering a bunch of pixels into pixel-sized chunks. Each small thread can read from the same shared memory (the scene graph and textures, etc.) and write to a distinct location (its own pixel in the frame buffer).
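
A minimal sketch of why that pattern parallelizes so cleanly - plain C with POSIX threads, all names hypothetical: every thread reads the same shared scene data but writes only its own band of the framebuffer, so no locking is needed:

    #include <pthread.h>
    #include <stdint.h>

    #define WIDTH    640
    #define HEIGHT   480
    #define NTHREADS 4

    static uint32_t framebuffer[WIDTH * HEIGHT];

    /* Placeholder for real shading: reads only immutable scene data
       and its own coordinates, writes nothing shared. */
    static uint32_t shade_pixel(int x, int y) { return (uint32_t)(x ^ y); }

    struct band { int row_begin, row_end; };

    static void *render_band(void *arg) {
        const struct band *b = arg;
        /* Each thread owns a disjoint band of rows, so writes never
           overlap and no locks are required. */
        for (int y = b->row_begin; y < b->row_end; y++)
            for (int x = 0; x < WIDTH; x++)
                framebuffer[y * WIDTH + x] = shade_pixel(x, y);
        return NULL;
    }

    int main(void) {
        pthread_t tid[NTHREADS];
        struct band bands[NTHREADS];
        int rows = HEIGHT / NTHREADS;
        for (int i = 0; i < NTHREADS; i++) {
            bands[i].row_begin = i * rows;
            bands[i].row_end = (i == NTHREADS - 1) ? HEIGHT : (i + 1) * rows;
            pthread_create(&tid[i], NULL, render_band, &bands[i]);
        }
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(tid[i], NULL);
        return 0;
    }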

Encoding video using motion-compensation techniques (basically all modern video codecs) will never be satisfactorily parallelizable, because the best bang/bitrate can only be achieved when frames are processed serially. Frames need to be processed as a whole to optimize for panning and other full-scene motion, and the results of the previous frame's motion analysis are typically needed to compute the next delta. You can break the processing up into multiple threads easily enough, but you miss out on opportunities to make the output more efficient or better looking.
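
The loop-carried dependency is easy to see in a toy sketch (the codec stages here are made-up stand-ins, with one int per "frame"): iteration t consumes recon[t-1], which is only produced by iteration t-1, so the iterations cannot be reordered or run concurrently without changing the output:

    #include <stdio.h>

    #define NFRAMES 10

    /* Toy stand-ins for real codec stages. */
    static int motion_estimate(int cur, int prev_recon) { return cur - prev_recon; }
    static int reconstruct(int prev_recon, int residual) { return prev_recon + residual; }

    int main(void) {
        int frame[NFRAMES], recon[NFRAMES];
        for (int t = 0; t < NFRAMES; t++)
            frame[t] = t * t;            /* fake input video */

        recon[0] = frame[0];             /* intra frame: no dependency */
        for (int t = 1; t < NFRAMES; t++) {
            /* Serial chain: needs the reconstruction of frame t-1. */
            int residual = motion_estimate(frame[t], recon[t - 1]);
            recon[t] = reconstruct(recon[t - 1], residual);
            printf("frame %d: residual %d\n", t, residual);
        }
        return 0;
    }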

When Mr. PseudoScience blogger can parallelize the video encoding problem without so many dependencies that it's essentially a serial process, then he should get some credit; not before then.

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30333420)

I have a 4-year CS degree

You're just a moron.

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30333666)

So, just how deterministic is that parallel Seraphim computer there?

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332206)

Thanks. Ever since I quit reading sci.math I have an occasional yearning to read the ill-informed ramblings of a crank.

Re:Heterogeneous Processors Are Doomed (0)

Anonymous Coward | more than 4 years ago | (#30332578)

Except that Larrabee was supposed to be a grid of x86-64 CPUs. Pretty homogeneous compared with the host CPU. All that was needed was OS support.

Wow... shock horror (5, Funny)

Plasmoid2000ad (1004859) | more than 4 years ago | (#30331890)

I spent most of my internship at Intel arguing with people hyping Larrabee as the second coming of Jesus that it would never happen... And now I can finally say HAH!

Re:Wow... shock horror (1)

FatdogHaiku (978357) | more than 4 years ago | (#30332240)

So when the big guy does show up we will know what kind of a processor he'll be rockin', cool.
Just remember:

"Thou shalt NOT rootkit The Lord thy Admin."

Re:Wow... shock horror (0)

Anonymous Coward | more than 4 years ago | (#30332372)

Oh, you are one of those smart interns who come in and tell the architects who have been running simulations and collecting results for years that, hey, this is not going to work - without any data of your own, just your intuition?
Are you, instead of Larrabee, the second coming of Jesus?

Re:Wow... shock horror (0)

Anonymous Coward | more than 4 years ago | (#30333238)

Congrats on your fantastic insight, going against decades of experience even as an intern and turning out to be right. You must be really smart. Or at least as lucky as a coin tosser.

Re:Wow... shock horror (1)

robbiedo (553308) | more than 4 years ago | (#30333682)

If I worked at Intel in the group developing a product, I would keep my mouth shut, even if I were an intern. There is likely a large group of smart, dedicated people trying to make this happen.

Intel Inside... (0)

Anonymous Coward | more than 4 years ago | (#30331970)

Intel insiders saw this coming. Dadi [intel.com] won. Three strikes and you're out for Pat [emc.com]:
1. Itanium
2. Pentium 4
3. Larrabee

Fortunately for the guys in Hillsboro, Nehalem is a glowing success.

Mis-reported, I think. (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30332072)

This is being misreported, or miscommunicated by Intel, I believe.
The first version of Larrabee silicon isn't going to consumers, that's all.
From the consumer's perspective, it's a delay. It remains to be seen whether it's fatal.
Otherwise, who'd want to use it to develop software?

oh lord why! (0)

Anonymous Coward | more than 4 years ago | (#30332082)

This is wholly depressing. The open source graphics stack is disheartening to say the least, and the KMS/Gallium architecture is probably 1.5 years from delivering on its promise to optimize open graphics. I was hoping that Larrabee would at least motivate ATI to put some real manpower behind their half-hearted support for the xf86-video-ati driver.

This is almost sad enough to make me run to nvidia with my wallet wide open!

Bad for Linux (1)

Tailhook (98486) | more than 4 years ago | (#30332280)

Intel has shown real commitment to supporting their video hardware on Linux, with full-time staff [intellinuxgraphics.org] employed to produce high-quality open source drivers, in addition to providing open specifications for (most of) their contemporary hardware. Unfortunately this hardware provides only limited 3D acceleration. I was hoping that Larrabee would combine the two and provide vendor-supported, open, high-performance accelerated 3D for Linux.

So much for that happening anytime soon...

I can't understand why Intel cedes the GPU market to its competitors. Have I been getting duped into paying hundreds while everyone else gets free GPUs? People are paying good money for these chips, right? NVidia's got the PlayStation 3 and Apple. ATI got the 360. Intel has nothing in the discrete GPU market at all. Why? What blocker within Intel prevents them from taking a piece of that pie?

Re:Bad for Linux (1)

TheKidWho (705796) | more than 4 years ago | (#30332418)

They don't have the experience, and all the good computer graphics engineers are at Nvidia and ATI.

I wonder if Bangalore has anything to do with it. (4, Interesting)

bertok (226922) | more than 4 years ago | (#30332358)

I think the recent announcement of the 48-core Intel 'Bangalore' chip [slashdot.org] is not a coincidence.

When I first read about the Larrabee chip, I thought the decision to make it a cache-coherent SMP chip was simply insane - architectures like that are very difficult to scale, as the inter-core chatter grows roughly as the square of the number of cores. Remember how Larrabee was designed around a really wide 1024-bit ring bus? I bet that was required because otherwise the cores would spend all of their time trying to synchronize with each other.

So, Larrabee is effectively cancelled, but only a day or two before Intel announced an almost identical-sounding part without cache coherence! It sounds to me like they've given up on 100% x86 compatibility, and realised that a chip with some extra instructions for explicit software-controlled memory synchronization and message passing would scale far better. Without cache coherence, a "many-core" chip is basically just an independent unit repeated over and over, so scalability should be almost unlimited, and wouldn't require design changes for different sizes. That sounds like a much better match for a graphics processor.

While Intel kept their cards relatively close to their chest, from all of the presentations I've seen, no first-gen Larrabee chip could scale beyond 24 cores even with a 1024-bit bus, while the new Bangalore chip starts at 48 cores. There's no public info on how many lanes Bangalore has in its on-chip bus, but based on the bandwidth of its 80-core experimental predecessor, I'm guessing it's either 32-bit or 64-bit (per core).
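
For what it's worth, here is a speculative sketch of what that explicit message-passing model could look like from software - plain C11 with made-up names, not Intel's actual API; on real non-coherent hardware each mailbox would live in a core's local memory rather than in ordinary shared DRAM:

    #include <stdatomic.h>
    #include <string.h>

    #define NCORES    48
    #define MSG_BYTES 64

    /* One single-slot mailbox per destination core. The 'full' flag is
       the only synchronization point; there is no coherent shared data
       structure beyond it. */
    struct mailbox {
        _Atomic int full;              /* 0 = empty, 1 = message waiting */
        char payload[MSG_BYTES];
    };

    static struct mailbox mbox[NCORES];

    int send_msg(int dest_core, const void *data, int len) {
        struct mailbox *m = &mbox[dest_core];
        if (len > MSG_BYTES) return -1;
        if (atomic_load(&m->full)) return -1;   /* slot still occupied */
        memcpy(m->payload, data, len);          /* copy into receiver's memory */
        atomic_store(&m->full, 1);              /* publish the message */
        return 0;
    }

    int recv_msg(int my_core, void *out, int len) {
        struct mailbox *m = &mbox[my_core];
        if (len > MSG_BYTES) return -1;
        if (!atomic_load(&m->full)) return -1;  /* nothing waiting */
        memcpy(out, m->payload, len);
        atomic_store(&m->full, 0);              /* free the slot */
        return 0;
    }

The point of the sketch is only that every data transfer becomes an explicit copy plus a flag update, instead of the hardware silently keeping every core's cache in agreement.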

Re:I wonder if Bangalore has anything to do with i (0)

Anonymous Coward | more than 4 years ago | (#30332536)

Wow... thanks for your insight! Should have known Intel would be logical even about their failures, and roll them over to something that has a chance of applicability. The only thing I wish they would do is skip the 64-bit crap and make 128-bit architectures that are compatible with both 32- and 64-bit predecessors. It would ease the development of new applications since the life time of 128-bit archs would be decades as opposed to developing all 64-bit apps to only have 128-bit archs appear in 5-10 years.

Re:I wonder if Bangalore has anything to do with i (2, Insightful)

bertok (226922) | more than 4 years ago | (#30332652)

Wow... thanks for your insight! Should have known Intel would be logical even about their failures, and roll them over to something that has a chance of applicability. The only thing I wish they would do is skip the 64-bit crap and make 128-bit architectures that are compatible with both 32- and 64-bit predecessors. It would ease the development of new applications since the life time of 128-bit archs would be decades as opposed to developing all 64-bit apps to only have 128-bit archs appear in 5-10 years.

I'm not sure if you're trolling or not, but 64-bit memory capacity is not "twice" as big as 32-bit, it's 4.3 billion times as big (2^64 / 2^32 = 2^32 ≈ 4.3 billion). That's more than just 5 to 10 years of Moore's law: at one capacity doubling roughly every 18 months, those 32 doublings work out to more like 50 years. Physical bus widths have nothing to do with architecture bitness either; there are memory buses for 64-bit architectures that only have a few pins.

Re:I wonder if Bangalore has anything to do with i (1)

RzUpAnmsCwrds (262647) | more than 4 years ago | (#30333500)

The problem is, a many-core, non-cache-coherent, x86-like system isn't particularly interesting. The big advantage of Larrabee was that you could treat it like a normal SMP system, including (presumably) running standard multithreaded C code on it. Once you have to deal with memory synchronization explicitly, Larrabee starts to look a lot more (from a programming standpoint) like Fermi, Cypress, or whatever other Nvidia/ATI GPUs are out at the time.

There's nothing magic about x86/AMD64 in the HPC world. It's attractive because it is cheap and has good performance. Clusters can be, have been, and still are built using POWER and other architectures.

Re:I wonder if Bangalore has anything to do with i (0)

Anonymous Coward | more than 4 years ago | (#30333532)

Intel realized a big problem in computing that they cannot solve easily: cache coherency is a real problem that cannot be solved simply. But their platform is good for evolutionary computing, and I barely see a hardware solution other than Larrabee. The only solution is that each core goes with its own memory and you have a distributed approach, like grid computing. Many computational optimization problems can be solved with evolutionary computing and not with a central memory. Data is processed on the agent. Moving small data back and forth from main memory is overkill. You need PCIe clusters of quad-core CPUs (much like Atom) with their own memory to keep clock speeds low. For example, MPEG-4 decoding can be done this way: you send chunks of consecutive frames to clusters for decoding and collect back the results. But you need a high-speed interconnect - though I think PCIe is enough.

If you are told that you .... (0)

Anonymous Coward | more than 4 years ago | (#30332404)

will be working on a graphics chip project at Intel then:

- You know someone in management hates you
- You need to move your cube to a different floor
- You don't go to any meetings, and if you do, you look like shit and fall asleep often
- Your career will be forever tarnished
- You will never get those 18 months back

and last but not least -- You know you shouldn't have put that whoopee cushion on Paul Otellini's chair

nvidia + intel?? (0)

Anonymous Coward | more than 4 years ago | (#30332668)

Is this a precursor to some nvidia/intel alliance?

It's a shame this isn't going to happen. If anything, this would have kicked the video market wide open with a known GPU instruction set. We may well be doomed to proprietary-driver hell, with system stability becoming more and more reliant on the proficiency of nvidia/amd. For Linux users, they are the weak point in system stability.

AMD Patents (0)

Anonymous Coward | more than 4 years ago | (#30332738)

Now that Intel has full use of AMD's ATI graphics patents, I'm not surprised they have dumped Larrabee... I would expect to see a new GPU product announcement from them next year that is similar to AMD's offerings...

Likely the Intel PowerVR partnership (1)

Criton (605617) | more than 4 years ago | (#30333212)

I suspect the Intel and PowerVR partnership may have something to do with there being no consumer Larrabee plans. This partnership has already resulted in the 3100ce, and PowerVR has been working on some 1080p media accelerators. Larrabee uses a lot of power for the level of performance it would offer as a 3D chipset; perhaps Intel and PowerVR have come up with something that does not use 160 watts.