
Intel Discrete Graphics Chips Confirmed

kdawson posted more than 7 years ago | from the ready-or-not dept.


Arun Demeure writes "There have been rumors of Intel's re-entry into discrete graphics for months. Now Beyond3D reports that Intel has copped to the project on their own site. They describe it as a 'many-core' architecture aimed at 'high-end client platforms,' but also extending to other market segments in the future, with 'plans for accelerated CPU integration.' This might also encourage others to follow Intel's strategy of open-sourcing their Linux drivers. So, better watch out NVIDIA and AMD/ATI — there's new competition on the horizon."


Predictable... (0)

Anonymous Coward | more than 7 years ago | (#17722300)

Nothing for you to see here. Please move along.

Re:Predictable... (5, Funny)

ozmanjusri (601766) | more than 7 years ago | (#17722310)

Nothing for you to see here. Please move along.

Try reinstalling the drivers.

Re:Predictable... (1, Funny)

jaweekes (938376) | more than 7 years ago | (#17722640)

Pity I have no mod points; that was funny! :D

More competition (5, Insightful)

GreenEnvy22 (1046790) | more than 7 years ago | (#17722334)

Competition is almost always good, so I look forward to this. I'd like to see Intel push ATI and Nvidia to create more power efficient chips, as it's quite rediculous right now.

Mostly they are efficient (5, Insightful)

Moraelin (679338) | more than 7 years ago | (#17722446)

If you look at the vast majority of chips either ATI or nVidia sell, they're actually pretty efficient.

But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200. Which is the lowest end of two generations behind the ATI, or 3 behind that 8800 GTX.

Both ATI and nVidia even went through times of not even trying to produce or sell much of their headline-grabbing card. And at least ATI always introduces their latest technology in their mid-range cards first, and they tend to be reasonably energy efficient cards too. But it's like a chicken contest: the one who pulls out loses. The moment one of them gave up on having an ultra-high end card at all, the benchmark sites and willy-waver forums would proclaim "company X loses the high performance graphics battle!"

I don't think Intel will manage to restore sanity in that arena, sadly. Most likely Intel will end up playing the same game, with one overclocked noisy card to grab the headlines for their saner cards.

Re:Mostly they are efficient (2, Insightful)

Kjella (173770) | more than 7 years ago | (#17722814)

That sorta assumes you can have one without having the other - can you really have a damn good midrange card that wouldn't perform as a high-end card if you jacked up the GPU frequency, RAM speed and added a huge noisy fan? Trying to measure the midrange gets too complicated though, too many variables like noise and power consumption. Let's just have an all-out pissing contest and assume that it scales down.

Technologically, it does. But then there's the part about market economics: you charge what the market will pay and pocket the margin. That's why they're mostly close anyway. Take Intel's Core 2 introduction: before, AMD vs Intel was close. Intel introduces a damn good new processor, AMD slashes prices 50%, and again they're close. Who had the best technology before and after? Good question, but most of the difference doesn't show up in the market.

Re:Mostly they are efficient (-1, Redundant)

Pulzar (81031) | more than 7 years ago | (#17722942)

can you really have a damn good midrange card that wouldn't perform as a high-end card if you jacked up the GPU frequency, RAM speed and added a huge noisy fan?

That's simply how the technology works... If you take almost any processing chip, and jack up its frequency and the supporting RAM speed, you'll get more processing power out of it (until you reach the point where it doesn't work). And, with more power, you need a bigger fan.

But, it definitely works the other way, too -- you can dial it all down, and it'll be cool and quiet, and performing slower. Everybody gets to pick what they want, and everybody should be happy :).
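As a back-of-the-envelope illustration of that scaling: dynamic power in CMOS goes roughly as P = C * V^2 * f, so jacking up the frequency (and the voltage needed to sustain it) grows power superlinearly. A quick sketch, with made-up numbers rather than any real GPU's specs:

    # Rough sketch of CMOS dynamic power scaling: P ~ C * V^2 * f.
    # All values here are illustrative placeholders, not real GPU specs.
    def dynamic_power(c_eff, voltage, freq_hz):
        return c_eff * voltage ** 2 * freq_hz

    base = dynamic_power(2e-9, 1.20, 400e6)  # hypothetical midrange clocks
    oc = dynamic_power(2e-9, 1.35, 600e6)    # jacked-up frequency and voltage
    print(f"power rises ~{oc / base:.1f}x")  # ~1.9x -- hence the huge noisy fan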

Re:Mostly they are efficient (3, Insightful)

danpsmith (922127) | more than 7 years ago | (#17723100)

But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200. Which is the lowest end of two generations behind the ATI, or 3 behind that 8800 GTX.

Maybe I'm in the minority of people here, but I've always gone to sites that have actual reviews of the card I will potentially be buying. Companies have different models and each one of those models of product has its own advantages and disadvantages. I think a lot of the people that do a lot of shopping comparison online (i.e. most of the market that's actually going to be buying/installing their own graphics card) know this and do the same. ATI and Nvidia cards are only going to sell to a certain section of the market other than OEMs, and I doubt very severely that this is the approach that the type of people upgrading video cards would use in determining which card to purchase. I know I usually check out anandtech.com and look for benchmarks on the price range that I'm in.

This is like saying "Alpine stereos are better" and buying the lowest model level alpine without comparing it to anything else in the price range, nobody who is going to be installing it themselves can be that stupid, unless they were fanboys looking for a reason to hype up their favorite company anyway. Either way it doesn't look like a real market strategy to me.

Re:Mostly they are efficient (1)

renoX (11677) | more than 7 years ago | (#17723820)

> Maybe I'm in the minority of people here, but I've always gone to sites that have actual reviews of the card I will potentially be buying.

I doubt that a majority of Slashdot readers do otherwise. That said, I don't know if you've noticed, but these sites usually compare boards in the same range without providing the prices of the boards, so you have to build the performance/price graph yourself.
So it's not so easy to do the comparison.
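Building that graph yourself is trivial once you've typed in the numbers; a minimal sketch (the card names, frame rates and prices below are placeholders, not real benchmark data):

    # Rank cards by performance per dollar; all figures are made up.
    cards = {
        "Card A": {"fps": 85, "price": 399},
        "Card B": {"fps": 60, "price": 199},
        "Card C": {"fps": 40, "price": 99},
    }
    ranked = sorted(cards.items(),
                    key=lambda kv: kv[1]["fps"] / kv[1]["price"], reverse=True)
    for name, d in ranked:
        print(f"{name}: {d['fps'] / d['price']:.2f} fps per dollar")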

Re:Mostly they are efficient (1)

osu-neko (2604) | more than 7 years ago | (#17724466)

It should also be noted that some of us have to have fairly large monitors to do our jobs, and what might pump out acceptable performance on an average-sized monitor with 768k pixels just doesn't cut it when trying to push around 3M pixels (and running your LCD monitor at anything other than its native resolution looks like crap, so changing your resolution isn't an acceptable option).

So it's not like the only point of the high-end card is to win benchmarks and sell more low-end cards; the point is to have a useful product for those of us who actually need that powerful a card just to get the same performance on our monitors that you're getting with that 5200 on yours.

Restore sanity? Please. There's nothing insane going on here. Insanity would be abandoning any customer who isn't "mainstream". Your competition will naturally and intelligently pounce on any market you choose to ignore. A smaller piece of a huge market is frequently itself a multi-million dollar market. Companies don't score points by turning down money. This is particularly true when the market segment you're turning your back on is the one that spends the most bucks (about 70% of the cost of any computer system I buy is devoted to the graphics -- my most expensive component is always the monitor, and the next most expensive is the card to run it. The entire rest of the system generally accounts for less than 30% of the cost).
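For concreteness, the arithmetic behind those pixel counts (assuming 1024x768 for the "768k" display and 2048x1536 for the "3M" one; the poster doesn't name exact resolutions):

    # Pixels per frame at two resolutions; fill work scales linearly with this.
    base_w, base_h = 1024, 768   # ~768k pixels
    big_w, big_h = 2048, 1536    # one plausible "3M pixel" monitor
    base_px = base_w * base_h
    big_px = big_w * big_h
    print(f"{base_px:,} vs {big_px:,} pixels "
          f"({big_px / base_px:.0f}x the per-frame fill work)")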

Re:Mostly they are efficient (1)

gstoddart (321705) | more than 7 years ago | (#17724476)

But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast".

And, for some of us, we don't even need to know if it's all that fast. As long as it's properly supported by Xorg.

The nVidia video card on my FreeBSD box died the other week. I went to my local PC shop, got the cheapest nVidia they had ($50 CDN or so), re-ran the configuration for Xorg to set the new amount of memory, and I was back in X windows with no problems.

I'll settle for the knowledge that with the generic nVidia driver, I'm assured to get a working 1600x1200 at a good refresh rate. I don't game on either of my machines, so uber video performance isn't really a requirement.

If Intel wants to get my business, I'll wait until the OSS people catch up and have drivers before I'll even consider looking at it. Maybe, if they're smart, they'll find some OSS people and give them demo hardware.

Cheers
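For reference, the reconfiguration described above amounts to a few lines in the Device section of xorg.conf; a minimal sketch (the identifier and VideoRam value are hypothetical, and VideoRam, given in kilobytes, is normally auto-detected anyway):

    Section "Device"
        Identifier "Card0"       # hypothetical name
        Driver     "nv"          # the open-source nVidia driver shipped with Xorg
        VideoRam   131072        # 128 MB, stated in KB; usually auto-detected
    EndSection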

Re:More competition (1)

canuck57 (662392) | more than 7 years ago | (#17722578)

Competition is almost always good, so I look forward to this. I'd like to see Intel push ATI and Nvidia to create more power efficient chips, as it's quite rediculous right now.

No kidding! Looking at video cards to get away from dreaded shared memory, I couldn't believe what they want for anything decent that wouldn't burn a hole in anything it touched (heat/cost). And given Intel's history with Open Source drivers for their wireless chips, I am not holding my breath waiting for them. As for AMD/ATI, I hope AMD makes ATI management do a turnabout, as getting drivers for some other items is impossible and they do not respond to inquiries about what is needed to write the drivers. NVIDIA I have had good luck with since switching from ATI some 6-7 years ago.

And there is no reason not to integrate a real graphics chipset into the core, without the shared-memory garbage we see today. But I suspect this is posturing by both AMD and Intel... we will all just have to wait and see how this shakes out.

Re:More competition (-1, Flamebait)

Anonymous Coward | more than 7 years ago | (#17723270)

RIDICULOUS you fucking retard.

Jesus fucking Christ on a bike, will you Americans EVER learn to fucking spell?

Re:More competition (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17723538)

That's funny, your mom asked me the same thing as I was writing "Your-a-peon" in semen on her flabby pancake tits after she rubbing me off. I just smacked the bitch and told her to OMFGSTFU!!!1111ONE

Re:More competition (0)

Anonymous Coward | more than 7 years ago | (#17724508)

You said what, exactly?

Discrete? (0)

Anonymous Coward | more than 7 years ago | (#17722346)

What is so discrete about their graphics? They're telling everyone about it in a press release! Or does this mean that they no longer restrict themselves to the continuous graphics market?

Re:Discrete? (2, Insightful)

Shard013 (530636) | more than 7 years ago | (#17722400)

I am guessing in this context "discrete" means seperate from the motherboard.

Re:Discrete? (1)

asliarun (636603) | more than 7 years ago | (#17723296)

"I am guessing in this context "discrete" means seperate [sic] from the motherboard."

It means that the graphics chip won't get in your way when you're setting up your motherboard, unlike some video cards I know. :-)

Re:Discrete? (0)

Anonymous Coward | more than 7 years ago | (#17723266)

You should be more discreet with showing your stupidity.

Intel Video hardware is just nice... (3, Informative)

vhogemann (797994) | more than 7 years ago | (#17722356)

And if they enter the gaming video market, I can assure you that my next video board will be an Intel one.

Intel drivers for Linux Just Work(TM). I installed Ubuntu 6.10 on my Acer notebook, with a i915g video adapter, and everything worked without any extra effort. And I'm even able to use Beryl/Compiz as my default window manager, without any stability issues.

Both nVidia and ATI should learn from Intel.

What desktop motherboard? (1)

Ed Avis (5917) | more than 7 years ago | (#17722416)

I want to get a motherboard with Intel onboard graphics (that has free Linux drivers). I've heard of the G965 chipset; is that the one to go for? I would prefer to buy a 'workstation' rather than 'consumer' motherboard but they tend not to have integrated graphics, no?

Are Intel's own-brand motherboards worth it? In the past I've bought Asus but that was for AMD-based systems.

Re:What desktop motherboard? (1)

i_should_be_working (720372) | more than 7 years ago | (#17722730)

I've heard good things about that one and how it does fine with the modern eye-candy like xgl etc. Apparently there's newer and better already out from Intel. See here [wikipedia.org].

I've looked around for boards with these chips and found them in several brands including Asus and Intel. A search for "gma" on your favourite computer store's site should find something.

Re:What desktop motherboard? (0)

Anonymous Coward | more than 7 years ago | (#17722962)

and the fact that the 965 chipset is fully Trusted Computing enabled has nothing to do with Intel's decision...

The fact is, Intel no longer needs to control the source to control the user. With a TPM (in the 965), Intel can control what binary you use... and at the same time, get the P.R. from "open sourcing" their drivers.

Re:What desktop motherboard? (2, Informative)

chrish (4714) | more than 7 years ago | (#17723154)

The Intel GMA950 is the one Apple's using in the Mac Mini and MacBook laptops, and doesn't seem too horrible for an integrated shared-memory GPU; it runs all the spiffy OS X eye-candy nicely, and I've had people tell me that playing games (World of Warcraft natively, or City of Heroes after installing BootCamp and XP) on it is fine.

Since gaming isn't really your focus if you're running Linux ;-), I imagine the GMA950 chipset (or something newer) would be great for KDE/GNOME/etc. even when they start using OpenGL.

Maybe something like Asus' P5B motherboards (P5B-VM [asus.com] and P5B-V [asus.com])?

(Not an Asus employee or stockholder, just a happy customer of an ASUS P4G8X Deluxe.)

Re:Intel Video hardware is just nice... (1)

should_be_linear (779431) | more than 7 years ago | (#17722484)

Yes, if your Linux machine is used for serious business (server, workstation), just forget the crappy "mainstream" options (ATI, nVidia). Intel graphics, while not providing enough horsepower for gaming-on-the-edge (yet), is the real deal when it comes to stability, usability and simplicity of installation. If they manage to bring performance into nVidia's range, I for one will welcome our new Linux-graphics-kick-ass-overlords.

Re:Intel Video hardware is just nice... (2, Insightful)

Lonewolf666 (259450) | more than 7 years ago | (#17722510)

Intel drivers for Linux Just Work(TM)
That might have to do with their drivers being Open Source, which has been recommended by the Linux community for a long time. According to all the statements from kernel developers I've read, Open Source drivers are much easier to maintain.

Re:Intel Video hardware is just nice... (4, Interesting)

Andy Dodd (701) | more than 7 years ago | (#17722760)

"Intel drivers for Linux Just Work(TM). I installed Ubuntu 6.10 on my Acer notebook, with a i915g video adapter, and everything worked without any extra effort. And I'm even able to use Beryl/Compiz as my default window manager, without any stability issues."

This is because Intel's graphics chipsets are crippled and don't implement any of the features covered by other companies' patents which force ATI and NVidia to go closed-source.

You seem to forget that ATI had fully open-source drivers until they were forced to "go closed" due to licensing another company's IP for their chipsets. In that particular case, the first incident was S3 Texture Compression, a feature essentially required by all modern games, and apparently with patent licensing agreements that prohibit closed-source drivers. For a few months, S3TC was why Unreal Tournament 2003 (or was it 2k4?) only ran on NVidia cards under Linux - it wasn't until ATI released binary drivers that supported S3TC that UT2k3 would run on ATI cards under Linux.

The end result is that ultimately, the choice will not be Intel's as to whether to go open-source or not for full functionality, just as ATI had no choice but to "go closed" or simply leave certain critical features disabled/unsupported under Linux.
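For context on what's being licensed: an S3TC (DXT1) texture stores each 4x4 texel block in 8 bytes - two RGB565 endpoint colors plus sixteen 2-bit indices into a small interpolated palette. A decode sketch based on the published format (an illustration, not any vendor's actual code):

    import struct

    def rgb565(v):
        # Expand a packed 5:6:5 color to 8 bits per channel.
        return ((v >> 11) * 255 // 31, ((v >> 5) & 0x3F) * 255 // 63,
                (v & 0x1F) * 255 // 31)

    def decode_dxt1_block(block):
        # block: 8 bytes = two 16-bit endpoint colors + 32 bits of indices.
        c0, c1, bits = struct.unpack("<HHI", block)
        p0, p1 = rgb565(c0), rgb565(c1)
        if c0 > c1:  # 4-color mode: two interpolated intermediates
            pal = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
        else:        # 3-color mode plus black/transparent
            pal = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)), (0, 0, 0)]
        return [[pal[(bits >> 2 * (4 * y + x)) & 3] for x in range(4)]
                for y in range(4)]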

I call FUD (1)

oliverthered (187439) | more than 7 years ago | (#17723160)

When you write a patent you have to describe exactly how it works (well here in the UK anyway). That makes all patents open source, so if I can just look up the S3TC patent and get the algorithm why would it force ATI/NVidia to close source their drivers? That doesn't make sense.

Re:I call FUD (1)

Andy Dodd (701) | more than 7 years ago | (#17723338)

Possibly because their patent agreement with the patent owner requires the implementation to be closed source? Or the easiest way to obtain a license to use the patent was to purchase someone else's implementation?

While the patent description is itself "open", nothing says that the licensing agreements to use that patent can't force closed source.

Re:I call FUD (1)

dpilot (134227) | more than 7 years ago | (#17724054)

I would think it more likely that the "IP agreement" is not just a patent license. It's most likely a patent license plus some Verilog netlist plus some driver code. It's that last part that likely precludes open source. While no doubt none of that driver code has actually survived into the final driver, it's all that "derived works" type of stuff. No doubt a cautious lawyer would consider the driver code to be partially derived from the licensed snippet. Beyond that, if they've licensed S3TC, it's not even clear that they can document their texture compression hardware, since it is most likely considered a "derived work."

Re:Intel Video hardware is just nice... (1)

value_added (719364) | more than 7 years ago | (#17723434)

You seem to forget that ATI had fully open-source drivers until they were forced to "go closed" due to licensing another company's IP for their chipsets. In that particular case, the first incident was S3 Texture Compression, a feature essentially required by all modern games, and apparently with patent licensing agreements that prohibit closed-source drivers. For a few months, ...

So this is all the fault of gamers ... who use Windows ...?

I don't know whether to laugh, cry or punch someone in the face.

Re:Intel Video hardware is just nice... (2, Insightful)

MartinG (52587) | more than 7 years ago | (#17723990)

Why can't they release open source drivers that cover as much functionality as possible, and optionally provide a closed-source version that includes the parts that can't be released as OSS?

Re:Intel Video hardware is just nice... (1)

PzyCrow (560903) | more than 7 years ago | (#17724112)

patents which force ATI and NVidia to go closed-source.

Yeah, because you can't go around revealing patents, they're supposed to be secret...

Re:Intel Video hardware is just nice... (4, Interesting)

realnowhereman (263389) | more than 7 years ago | (#17724114)

This is because Intel's graphics chipsets are crippled and don't implement any of the features covered by other companies' patents which force ATI and NVidia to go closed-source.

And I should care about that why?

Intel cards are not bleeding edge. However, if all you want is a reasonably powerful, 3D supporting card for your open source desktop, then they are perfect. I don't require a huge framerate in $LATEST_GAME, because I don't play it. If I did, then an Intel card would obviously not be for me.

My Intel-based graphics work perfectly and don't give a moment's trouble. I can run 3D applications if I want, and a flashy eye-candy-full desktop too. I previously had an nVidia card, and it was nothing but a fight - is my card supported with this release of the driver? Is it crashing my computer? Is it going to compile with the latest kernel?

Nowadays, I do nothing but apt-get upgrade to keep my graphics in order and I am a lot happier for it.

Not due to patents (1)

spitzak (4019) | more than 7 years ago | (#17724184)

Patents would not require the code to be closed source.

You are probably confusing patents with copyrights on the submitted code.

The only way patents make code closed is when a company thinks they may be violating a patent and wants to hide it.

Re:Intel Video hardware is just nice... (1)

Cius (918707) | more than 7 years ago | (#17724394)

How does this theory hold up with the paradigm switch we're seeing toward unified shader architectures? With Intel's clout and their experience with multi-core design, I'm sure they could cook up a decent general purpose "stream processing engine" suitable for graphics work. In the move to general purpose GPUs, I would expect to see less emphasis on the specialized algorithms supported by these add-ins as graphics processors and more emphasis upon the ability to implement your own algorithms.
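In that model, a "stream processing engine" is essentially hardware that maps one small kernel function independently over huge arrays of elements - which is exactly what makes it parallelize across many cores. A toy sketch of the programming model (not any real GPU API):

    # Toy stream processing: one kernel applied per element, with no
    # cross-element dependencies, so every call could run on its own core.
    def brighten(pixel, gain=1.2):
        return tuple(min(1.0, max(0.0, c * gain)) for c in pixel)

    stream = [(0.2, 0.5, 0.9), (0.8, 0.8, 0.1)]  # stand-in for a framebuffer
    result = list(map(brighten, stream))
    print(result)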

Re:Intel Video hardware is just nice... (1)

xenocide2 (231786) | more than 7 years ago | (#17724422)

Why does licensing a patent require closed source drivers? If I recall correctly, the patent system requires disclosure as reciprocation for limited monopolies. I believe there are open source 3D drivers for the Savage3 chipset, which is strange given the circumstances claimed.

Re:Intel Video hardware is just nice... (1)

cciRRus (889392) | more than 7 years ago | (#17723412)

Intel drivers for Linux Just Work(TM).
Just to add on, Intel drivers for Mac OS X 10.4.8 on the x86 platform work just fine. I get Quartz Extreme and Core Image the moment my HP D7600 (onboard Intel graphics chip) boots into OS X. However, things weren't so smooth on my other boxes with Radeon and GeForce cards.

Intel link (0)

Anonymous Coward | more than 7 years ago | (#17722358)

The summary doesn't link to the proof, which can be found on the careers section [intel.com] of Intel's website.

My money is on NVidia (3, Interesting)

LaughingCoder (914424) | more than 7 years ago | (#17722362)

Intel is years behind in this market. And they tried this once before, with dismal results: http://news.com.com/Intel+retreats+from+graphics+chips/2100-1001_3-230019.html [com.com]

If anything the graphics market has gotten even more specialized since then. I don't know why they think they can succeed this time.

Re:My money is on NVidia (3, Insightful)

CastrTroy (595695) | more than 7 years ago | (#17722606)

But most people don't buy the top end. There's still a lot of computers being sold with Intel graphics chipsets, right on the motherboard, because most people could care less about which graphics card they have. They'd rather be playing games on their big TV with their console. As long as they can play Tetris variation #349 and freecell, they don't really care which graphics card they have.

Re:My money is on NVidia (2, Informative)

thue (121682) | more than 7 years ago | (#17723446)

most people could care less about which graphics card they have

They could care less? It would only be possible to care less if you actually cared.

http://www.impleader.com/photos/blog/caringcontinuum.jpg [impleader.com]

Re:My money is on NVidia (1)

CastrTroy (595695) | more than 7 years ago | (#17724172)

Yes, could care less is correct, because it's short for the phrase:

I suppose I could care less, but I'm not sure how.

Re:My money is on NVidia (4, Insightful)

thue (121682) | more than 7 years ago | (#17724372)

Yes, could care less is correct, because it's short for the phrase:
I suppose I could care less, but I'm not sure how.


I agree with you, and concede the point.*

*Here "I agree with you, and concede the point" is actually short for the phrase "I could agree with you, and concede the point, but I consider using words which mean the opposite of what you are trying to say in normal conversation to be extremely silly.".

Re:My money is on NVidia (1)

Kjella (173770) | more than 7 years ago | (#17722656)

I think it's quite simply because AMD/ATI has been touting combined solutions, which means Intel and nVidia either need to team up or roll their own. This might be just as much strategic: "nVidia, you need us more than we need you". nVidia has proven they're no slouch when it comes to business; for example, by refusing to license SLI they've muscled in on the high-end motherboard market. Intel certainly has greater ambitions than to deliver Intel CPUs for an nVidia system, and this might be a way of saying "you push us out of the high-end mobo market, we'll push you out of the mid-range gfx market". I doubt Intel will have a GF8800-killer anytime soon.

Re:My money is on NVidia (2, Interesting)

should_be_linear (779431) | more than 7 years ago | (#17722764)

If I am reading this article right, "multi-core" and "high-end" graphics probably means that Intel is going after realtime ray-tracing HW support, which is seen as the natural successor of current z-buffered graphics. There are university projects already proving that ray-tracing hardware support works fine and brings far better graphics than what is available from ATI/nVidia. The battle for the best ray-tracing HW will start soon among all 3 key players (ATI/Intel/nVidia), and Intel probably thinks this is the right time to enter the graphics HW business again, now that all previous graphics HW patents, research and know-how are more or less obsolete.

Re:My money is on NVidia (4, Insightful)

suv4x4 (956391) | more than 7 years ago | (#17722806)

I don't know why they think they can succeed this time.

Remember when AMD made Intel clones down to the very chip architecture and it didn't matter which manufacturer you bought from?

Remember how the AMD K5 sucked and people started leaning towards Intel? And then Pentium 4 happened, and AMD's new architecture was much superior? And then Core turned things on their head again?

Things change. I don't think we're using 3dfx cards anymore either. They used to be ahead of everyone.

Intel is the only one... (2, Interesting)

Rastignac (1014569) | more than 7 years ago | (#17722442)

...who can compete with ATI and Nvidia.
Intel has technology, has brains, has money, has plants. They can do something "as good as" the two others. Competition is a good thing (prices falling, etc); only two main actors for video cards is a bad thing.
S3 can't compete. Matrox can't compete. 3dfx can't compete (they're dead). Others can't compete. Intel is our only hope.

Re:Intel is the only one... (3, Interesting)

nbannerman (974715) | more than 7 years ago | (#17722516)

Well, SONICblue (formerly S3 / Diamond) are essentially dead as well (Chapter 11, most product lines sold off), but Matrox still survive with a 3-5% share of the market, and they're doing fairly well in niche markets - scientific, medical, military and financial. As for 3dfx, their assets (intellectual property and staff) were purchased by NVIDIA, so any innovation from their prime years is probably still alive and well (to a degree).

Re:Intel is the only one... (3, Informative)

guy-in-corner (614138) | more than 7 years ago | (#17723578)

The graphics part of S3 was sold to VIA at about the same time as it transformed to SONIC|blue. So the Chapter 11 thing is irrelevant.

Re:Intel is the only one... (1)

sjf (3790) | more than 7 years ago | (#17722756)

Intel has technology, has brains, has money, has plants.

What they don't have, though, is ATI(AMD) and NVIDIA's patent portfolios.

Re:Intel is the only one... (0)

Anonymous Coward | more than 7 years ago | (#17724094)

Right. Poor, patentless Intel won't stand a chance. If only they had patents of their own, they could negotiate a cross-licensing deal, but alas...

Re:Intel is the only one... (0)

Anonymous Coward | more than 7 years ago | (#17722832)

ATI bought 3dfx, thus providing us with only two major video card manufacturers.

Close... (1)

Junta (36770) | more than 7 years ago | (#17723140)

But nVidia bought 3dfx, not ati...

If they really do this, I am sold (1)

jonwil (467024) | more than 7 years ago | (#17722468)

If Intel can make a graphics card that is better than my current GeForce FX 5700LE in all areas (including shader performance) I am sold.
Especially if they have open source Linux drivers for the thing :)

Re:If they really do this, I am sold (1)

bmgoau (801508) | more than 7 years ago | (#17722594)

Right now there are two separate generations of cards on the market (even an emerging 3rd), some very, very affordable, which can best that card and provide (in a roundabout way) support for Linux. The utility cost of waiting for Intel to bring out their chip may be disproportionate to simply buying today's budget cards.

Re:If they really do this, I am sold (3, Insightful)

jonwil (467024) | more than 7 years ago | (#17722742)

If I wanted to run binary kernel modules I would just buy a 6xxx series NVIDIA and be done with it.
I specifically said "Open Source" :)

Re:If they really do this, I am sold (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17723290)

lol 5700.... You could buy a better card than that piece of junk for $80.

And you didn't specifically say "open source," you said "especially open source." That implies a secondary importance to it, that you would buy it without open source if you had to.

Besides, what's so damn important that your graphics card drivers need to be open source? It's not like you pay for them, or can't tweak the performance if it's closed source. :/

----

you should read a few comments up, about the texture compression thing. It's impossible to write open source drivers that do many of today's graphics features unless you coded them all yourself from scratch. This is Intel we're talking about; they're hardly likely to invest millions of dollars and reinvent the past 15 years of graphics drivers just so some geek can soil himself over it being "open source."

Re:If they really do this, I am sold (1)

Jesus_666 (702802) | more than 7 years ago | (#17724346)

Mutually exclusive. You get either new features or open source drivers. Many of the more interesting features newer GPUs offer are covered by patents etc., which require Intel to license the stuff. The license agreements will most probably contain clauses that prohibit the disclosure of implementation details; for example, because the license comes with example code and Intel probably doesn't want to go to great lengths to prove in court that their implementation is not derivative of the example code - thus their entire system becomes tainted and they can't release the driver code or specs to the OSS crowd. That's why ATI and NVidia drivers are closed-source, too.

So either you get something akin to Intel's current integrated GPUs or something with a closed driver. A powerful OSS-friendly card is very highly improbable.

I wonder if this will change onboard graphics... (2, Interesting)

jimstapleton (999106) | more than 7 years ago | (#17722470)

Will Intel be clever enough and innovative enough to have a "GPU" socket on such motherboards? Maybe even GPU-specific memory sockets rather than shared memory?

One can always hope.

Re:I wonder if this will change onboard graphics.. (1)

ZachPruckowski (918562) | more than 7 years ago | (#17722552)

I doubt this will eliminate onboard graphics. At the low-end price range and in the light-weight mobile market, they're simply necessary. But if Intel could produce an onboard graphics chip that would compete with the 300-series (low-end discrete) from Nvidia and ATi, that could change the game.

It's also unlikely Intel boards would have a GPU slot that's not PCIe (or PCIe 2.0), since no one would buy a motherboard that locks them into only Intel. Even Crossfire/SLI boards allow you to have one of the other guy's card.

Re:I wonder if this will change onboard graphics.. (1)

Don_dumb (927108) | more than 7 years ago | (#17723444)

Considering most motherboards are sold already in PCs, I doubt most people even know or care that they are locked into Intel only. And anyway, motherboards don't lock you in that much; you can just get a new motherboard without worrying too much. Intel boards don't allow you to plug in an AMD processor after all, and I don't hear many people complaining about being locked into a processor manufacturer. The performance, efficiency and manufacturing gains might well be worth producing powerful onboard graphics; the wheel of reincarnation revolves onwards.

Re:I wonder if this will change onboard graphics.. (4, Interesting)

Lonewolf666 (259450) | more than 7 years ago | (#17722560)

That socket is usually called a "PCIe slot" these days. If you use a socket instead of just integrating the graphics chip into one that is onboard anyway, you might as well use the established solution.
Another interesting approach (albeit not for high end machines and somewhat OT here) is AMD's plan to integrate the GPU with the CPU. That way, you might have some more choice than with a soldered in chip, and GPU cooling could profit from the availability of decent CPU coolers.

Re:I wonder if this will change onboard graphics.. (1)

TheThiefMaster (992038) | more than 7 years ago | (#17722714)

Don't forget that AMD was also talking about opening up the HyperTransport spec and their CPU socket (from the 4x4 line?) so that you could use a CPU and another chip, either graphics or specialist, in a dual-CPU board. Would be interesting to see a Socket 940 graphics processor, that's for sure.

Re:I wonder if this will change onboard graphics.. (1)

Joe The Dragon (967727) | more than 7 years ago | (#17724424)

A HTX slot video card is better.

Re:I wonder if this will change onboard graphics.. (1)

jimstapleton (999106) | more than 7 years ago | (#17722718)

you missed the point entirely.

This doesn't take up one of your expansion slots, since you already have the graphic-out ports on the motherboard in such solutions. Meaning in a small-form-factor machine, you have one more option for tweaking the system to what you want/need.

Re:I wonder if this will change onboard graphics.. (0)

Anonymous Coward | more than 7 years ago | (#17722918)

And you missed the point entirely as well.

Unless a specialized connector offers some particular advantage (e.g. like AGP did over PCI), it's best to stick to the general-purpose connector, due to manufacturer familiarity and economies of scale. Sure, you lose an expansion slot, but that slot has motherboard real estate available to it, because it's not being used up by some special graphics socket.

Re:I wonder if this will change onboard graphics.. (0)

Anonymous Coward | more than 7 years ago | (#17723874)

Muppet

Re:I wonder if this will change onboard graphics.. (1)

specific (963862) | more than 7 years ago | (#17723294)

This would be nice in a laptop. Nvidia's effort to standardize MXM [wikipedia.org] (Mobile PCI Express Module) hasn't been widely received by ODM's, or it was dropped, or it wasn't properly adhered to, etc... Having a solution in the iMac, Mini, or small form factor would make sense. I think engineers could design a board with this kind of integration without increasing its 'footprint' so to speak.

Intel can interface with their CPUs (2, Interesting)

majortom1981 (949402) | more than 7 years ago | (#17722636)

Intel has a chance. Intel has the experience with CPUs. Intel can also interface with their new processors. I think Intel could at least put up a good fight. Why do you think AMD bought ATI? They know that Intel can do GPUs, and really good ones if they tried, and the only way AMD would be able to compete would be buying a GPU maker, which they did.

Re:Intel can interface with their CPUs (1)

dusanv (256645) | more than 7 years ago | (#17723372)

They know that Intel can do GPUs, and really good ones if they tried

Then I guess Intel hasn't really tried yet because they haven't yet produced a half decent graphics chip in their history (current integrated graphics lineup *stinks* compared to ATI/NVidia integrated stuff). I don't think anyone is afraid of discrete Intel graphics cards.

Re:Intel can interface with their CPUs (0)

Anonymous Coward | more than 7 years ago | (#17724276)

I used to think that based on previous experience with older cards, but I'm changing my mind based on my new laptop. I thought I'd be forced to get a separate machine with a 'real' graphics card for gaming, but so far I've been getting all-around acceptable performance. (I'll admit I play mostly older games.)
It's by no means high performance, but from what I can tell the Intel integrated chipsets have been pretty good low-end graphics cards since they got hardware shaders, the lack of which used to be the main reason nothing good would run on them.

So, yeah. We all know Intel doesn't make high-end graphics cards, but it's not like it's because they're incompetent. They are half decent, and they do not stink.

I'll believe it when I see it (1)

tedgyz (515156) | more than 7 years ago | (#17722716)

I've never met an Intel graphics solution that could play anything more intense than Solitaire. Is the G965 any good?

I just upgraded my sister's mobo + CPU. It had embedded graphics, so I figured it would be comparable to her 2 year old nVidia AGP card. Nope. I had to buy a new PCIe nVidia card to handle Sims 2.

On a side note: Has anyone noticed that the extremely popular family-friendly 3D games are the worst performers? Sims 2 and RCT3 both take eons to load - much slower than Q4 or HL2.

Passable. (1)

Svartalf (2997) | more than 7 years ago | (#17722866)

Its performance is on par with ATI's IGPs from about 3 iterations back - mostly due to immature drivers. It's closer in performance to the previous generation of integrated graphics (which happens to be a chip from the previous era of GPUs, with vastly lower power consumption due to process shrink and logic improvements) - in some things it bests ATI's chips; in other things, ATI's chip with its current drivers pastes it all over the place. The chip's capable of quite a bit more, but it's hampered by an immature driver (gee, this sounds familiar...), so it's a mixed bag right at the moment. The Larrabee group, depending on what they do, might actually give the other players a run for their money.

Re:I'll believe it when I see it (0)

Anonymous Coward | more than 7 years ago | (#17722896)

It has a lot to do with the quality of the programmers at the company. The Sims is made by EA, which treats programmers as cattle and trusts that if you throw enough programmers at a task it will be completed and work as envisioned, just not with optimal performance. As a recent example we can take Star Trek: Legacy; with all settings on high the game runs bog slow even on a fast machine, even though the graphics are not that advanced. Now look at THQ, who made Dawn of War: that game loads faster than any other game I have, looks awesome and shows the brilliance of the programmers who made it. Why some companies like EA fail to write good code, I dunno. THQ, Valve, id and Blizzard are all good examples of companies that can write well-optimized code.

Re:I'll believe it when I see it (1)

Eideewt (603267) | more than 7 years ago | (#17723384)

Intel isn't that bad. A 915 plays Halo fine, and it *is* a budget chip. Point taken, Intel chips don't do what the brawnier ones do, but I don't think Intel has demonstrated incompetence in the field yet.

Real-time raytracing from Intel ? (3, Interesting)

Rastignac (1014569) | more than 7 years ago | (#17722724)

I've been waiting for years for such kickass video cards. I've seen running prototypes in labs/universities; quite impressive videos. After a few years, the technology should now be ready for the mass market. Pixar-like technology at home!
Real-time raytracing needs a lot of power, so a multicore video card is a great idea! With raytracing, each core can compute one part of each picture. Better than SLI.
Using their knowledge, Intel can build a very fast multicore real-time raytracing video card. It will be "something different", and it will compete with ATI and Nvidia in a new, innovative way...
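That per-core split works because in a ray tracer every pixel's primary ray is computed independently. A toy illustration - one hard-coded sphere, no shading, purely schematic:

    import math

    def hits_sphere(origin, d, center, radius):
        # Solve |origin + t*d - center|^2 = radius^2; d is unit length (a == 1).
        oc = [o - c for o, c in zip(origin, center)]
        b = 2 * sum(o * di for o, di in zip(oc, d))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * c
        return disc >= 0 and (-b - math.sqrt(disc)) / 2 > 0

    def render(width=40, height=20):
        # Every pixel is independent: these loops could be split across as
        # many cores as the hardware has, with zero communication between them.
        lines = []
        for y in range(height):
            row = ""
            for x in range(width):
                d = (x / width - 0.5, y / height - 0.5, 1.0)
                n = math.sqrt(sum(v * v for v in d))
                d = tuple(v / n for v in d)
                row += "#" if hits_sphere((0, 0, 0), d, (0, 0, 3), 1.0) else "."
            lines.append(row)
        return "\n".join(lines)

    print(render())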

Re:Real-time raytracing from Intel ? (1)

pipatron (966506) | more than 7 years ago | (#17723104)

But no games would work. :)

Re:Real-time raytracing from Intel ? (1)

Rastignac (1014569) | more than 7 years ago | (#17723414)

Games WILL work. I've seen existing demos of OpenGL games (Doom 3, etc.) running on real-time raytracing video card prototypes!
The games just call OpenGL or DirectX or whatever. You just need good drivers to translate the OpenGL (or DirectX) calls into the hardware's own instructions. The drivers make the translation. Good drivers = no problems.

Re:Real-time raytracing from Intel ? (1)

Eideewt (603267) | more than 7 years ago | (#17723460)

But no games would work without driver support for OpenGL/DirectX, you mean. If the chips can handle rendering duties, then they can presumably render something like the average card's output as well.

Documentation (0)

Anonymous Coward | more than 7 years ago | (#17722794)

I wish this one comes with complete documentation so I can fully use it when I make my own OS. I couldn't find one for G945. :(

Thanks but no thanks (4, Funny)

markov_chain (202465) | more than 7 years ago | (#17722876)

Until this new hardware will let me display fractional polygons I'm sticking to my continuous graphics board.

But will they do DVI? (1)

kimvette (919543) | more than 7 years ago | (#17722972)

But will they include DVI? Better yet, dual DVI for those who run either dual monitors or really large monitors which require dual link?

Re:But will they do DVI? (5, Insightful)

Slashcrap (869349) | more than 7 years ago | (#17723566)

But will they include DVI? Better yet, dual DVI for those who run either dual monitors or really large monitors which require dual link?

No, in fact they aren't even going to include DSUB outputs. They are going to use modulated RF outputs like you got on the ATARI ST and AMIGA. They will be capable of displaying NTSC resolutions at anything up to 60Hz refresh rate.

What the fuck do you think?

Re:But will they do DVI? (1)

Jesus_666 (702802) | more than 7 years ago | (#17724578)

This comes on the heels of the news about Creative ditching the Audigy brand in favor of the new "SIDmeister" series, which will be based around the MOS Technology 6581 Sound Interface Device, offering stunning three-voice 16-bit 2.0 surround sound*.

Creative is confident it can bring the first 100,000 units to market in fall 2007. "We have already managed to find 80 SIDs," a spokesperson said, "and we're pretty sure we can get the other 99,920 in the next couple of months."


* Sound will appear to come from everywhere if listener is wearing headphones.

Welcome back, Intel (4, Insightful)

BillGatesLoveChild (1046184) | more than 7 years ago | (#17722988)

Intel's previous foray into the discrete graphics market was the Intel i740. I got one, agreeing with the PC salesman: "Hey, you can't go wrong with Intel, can you?" It was quite a decent chip for its time, and the driver was very stable. I don't ever recall the graphics hanging once! It was disappointing when Intel bailed out of the 3D market, but to their credit they continued to update the drivers whenever a new version of DirectX rolled out.

Intel have already made a return of sorts to 3D with their Media Accelerator 9XX series chips, which you'll find in many Intel laptops. It's funny, because you'd expect an embedded chipset to be lame; lowest common denominator, shared RAM and all. But this lappie has it and the graphics scream. It's faster than my nVidia 5700, which is two years old. The driver is stable too; it's never crashed. If they can do this with embedded 3D, imagine what they can do when they really put their mind to it?

nVidia and ATI have the market to themselves these days. nVidia has got pretty lax regarding driver stability, and it's damned near impossible to get support out of them. They've fobbed off support to OEMs, who slap electronics onto cards and are in no position to help with driver problems. That's the sort of thing that happens when a company dominates a market.

If Intel can come out with some high performance electronics and stable drivers, well, Welcome back, Intel! I for one welcome you as my new Overlord!

Question about Intel Media Accelerator 9XX (2, Interesting)

StressGuy (472374) | more than 7 years ago | (#17723374)

I've been monitoring this thread with some interest. I'm looking to build a new home computer that will run Linux exclusively (most likely Kubuntu). Mostly, it will be my personal workstation but I do plan to install some games - mostly 1st person shooter types. While I don't require "cutting edge", I would like decent performance. Can this chipset handle things like the latest UT or Doom III on Linux?

I mean, I like nVidia, but if Intel is supported out-of-the-box with open source drivers, then that works for me as well.

Re:Question about Intel Media Accelerator 9XX (1)

BillGatesLoveChild (1046184) | more than 7 years ago | (#17723894)

Intel have a Game Compatibility List on their website. Not sure what your situation is, but my lappie uses the i945. DOOM III is fine, but Quake IV isn't. If Intel want to really penetrate the graphics market, obviously their next list will have to be all green spots. "Intel: The way it's meant to be played!" ;-)

http://www.intel.com/support/graphics/intel945gm/sb/CS-021400.htm [intel.com]

Serious gamers bitch that the 9XX series is low end. Maybe it is. But it whips my nVidia 5700 and that's good enough for me!

Here's a list of supported Linuxes. Bizarrely, Ubuntu isn't mentioned, but some folks seem to have it running. Mandriva works out of the box. Google is your friend; you may be able to find more up-to-date information than these. Good luck!

http://www.intel.com/cd/channel/reseller/asmo-na/eng/products/linux/feature/279817.htm [intel.com]
http://www.ubuntuforums.org/showthread.php?p=2028783 [ubuntuforums.org]
http://www.linuxquestions.org/questions/showthread.php?t=435050 [linuxquestions.org]

Re:Welcome back, Intel (1)

RulerOf (975607) | more than 7 years ago | (#17723556)

...imagine what they can do when they really put their mind to it?

Read: Cash.

Re:Welcome back, Intel (1)

Slashcrap (869349) | more than 7 years ago | (#17723638)

It's faster than my nVidia 5700 which is two years old.

Most people on Slashdot these days are uncritical and accepting enough to believe that without querying it, just because some random person wrote it in a not obviously troll-like manner.

I think you're full of shit. Prove otherwise.

And as for the i740 being quite a decent chip for its time, that's just fucking hilarious. Again, most people on Slashdot probably don't remember that far back.

You haven't provided any data either... (1)

StressGuy (472374) | more than 7 years ago | (#17723898)

So, what we really have is one person saying "it's better than my nVidia 5700" in a non-troll-like manner vs. you saying "you're full of shit" in a... well... troll-like manner.

So, we have two opposing opinions, one with only anecdotal evidence and one with....bupkis.

Why don't you do the "uncritical and accepting" masses on Slashdot a favor and point us to some data we can use?

In fact, maybe you should, otherwise, you might start to look like the one who's "full of shit".

Re:You haven't provided any data either... (1)

siDDis (961791) | more than 7 years ago | (#17724388)

I have a Core Duo laptop with GMA 950 (or is it 945?) and I can confirm that in *cough*... Wintendo it has fast enough 3D acceleration. I've not tried 3DMark, but I can play Quake 4 at low settings perfectly smoothly, with an average of 30 frames per second (I believe it's most likely because of its excellent support for two CPU cores). Today I have two games that I usually play on this laptop: Warcraft 3, which runs smooth as silk (that means at least 60 frames per second), and Civilization 4, which is a bit slow (like 15 fps) but fully playable. In Linux/Ubuntu I can't play a single 3D accelerated game at all, though some of the 3D screensavers seem to work fine.

The i740 *was* a decent chip (1)

meosborne (8640) | more than 7 years ago | (#17723904)

The i740 was a very decent chip for corporate use. It was inexpensive, had stable drivers for both Windows *and* Linux, and had decent performance for everyday tasks. It worked well without muss or fuss. I would definitely call that decent.

It wasn't the greatest gamer's chip ever made for sure, but most tasks on a computer aren't games, especially in a business.

Re:Welcome back, Intel (1)

nxtw (866177) | more than 7 years ago | (#17723720)

My laptop's GMA 950 and my HTPC's GMA 900 have never been the cause of a driver crash. I can't say that about the ATI 9600 in my old desktop or the X1400 in my old laptop. The GMA 900 outputs 1080p (1920x1080) to my TV, and the GMA 950 in my laptop outputs two 1280x1024 displays (2560x1024 total).

My current laptop is rated for 3/4 the battery life when fitted with an ATI X1300 GPU.

Driver Open Sourcing (4, Interesting)

Midnight Warrior (32619) | more than 7 years ago | (#17723040)

Has anyone considered that the reason ATI/NVidia won't open source their drivers/firmware is because there are blatant copyright and patent violations in their code? I'm not saying there are violations, but if there are, then I would expect each to vigorously defend against anyone seeing their source code. To date, the best argument heard is that access to the code would provide their competitors an unfair advantage into their optimization techniques, which most of us recognize to be hogwash. At worst [zdnet.com], they wrap it up in "we have licensed proprietary algorithms" declarations and refuse to give the community a chance to work around those algorithms.

There is only one way forward. NVidia should fund the effort to rewrite their firmware/drivers, providing only the hardware register descriptions and nuances. I'm quite sure others have asked NVidia to do this already, but Intel moving forward with this plan should force the others' hands. I'm surprised that Microsoft hasn't chimed in here, because every open specification we get in the OSS world, they also get. That's where all those Microsoft drivers come from, and only on occasion is a vendor-supplied driver better than the Microsoft one. Open sourcing any drivers also helps Microsoft support more hardware out of the box, without a multitude of licensing agreements and royalty schemes.

And of course, NVidia (and now ATI) have been adding more treasure to their war chests with the PCIe motherboards. I just bought a new motherboard and it's extremely hard to find a new board with PCI-Express that doesn't have an nForce or ATI chipset.

It's going to be a tough game for Intel because it's not just graphics drivers. AMD could play into this game if they took a decisive maneuver with their GPU integration into the CPU. Remember that AMD now owns ATI.

Re:Driver Open Sourcing (2, Insightful)

Cheesey (70139) | more than 7 years ago | (#17723674)

Has anyone considered that the reason ATI/NVidia won't open source their drivers/firmware is because there are blatant copyright and patent violations in their code? I'm not saying there are violations, but if there are, then I would expect each to violently defend against anyone seeing their source code.

Yes, this has been suggested before. These violations, if they exist, may not be deliberate though.

Remember that software patents are often very broad. It is hard to write any software at all without violating some patent or other. If you write software, and you have a lot of money, the patent trolls will come knocking. Giving away source code makes the troll's job much easier. Perhaps that is part of what NVIDIA and ATI want to avoid.

Another problem is that they've used other people's code under NDA in their drivers. There is a similar problem with Windows - Microsoft could not release the source as free software without removing a lot of third-party components.

Re:Driver Open Sourcing (1)

NSash (711724) | more than 7 years ago | (#17724408)

There is only one way forward. NVIDIA should fund the effort to rewrite their firmware/drivers, providing only the hardware register descriptions and nuances.

What is NVIDIA's incentive to do this?

Industry Benefit (2, Interesting)

Darkryft (1054812) | more than 7 years ago | (#17723452)

I believe in competition being good, but I'm not sure this is all about just competition. This could well be the move that saves PC gaming as a whole. Technology-wise, PCs will always have superiority over consoles, but it's hard to defend the economics of top-end gaming PCs against consoles. Microsoft and Sony take huge losses to push their hardware, and slowly but surely it pays off - Gears of War on the Xbox 360 has sold 3 million copies in just a hair over 60 days. Name one PC title that uses every bleeding-edge technology and has sold that many copies that fast. You won't find one, because the segment of people who will pay between $2500 and $5000 for a PC to play those kinds of games (Crysis, Oblivion) is so small you can't hope to sell that many copies.

Intel knows how to make computer chips quickly, and on the cheap. That is what I feel they are bringing to this contest. I think Intel believes they can make a graphics platform just as powerful or more powerful than Nvidia/ATI, and can do it for less cost. That is how you generate competition not just in the graphics sector; you also make PCs more competitive against the consoles. The PC has endless amounts of good games to sell; the problem is there aren't cheap PCs that will play them with the slickness that consoles provide. Ultimately this move should make the top-end PC cheaper, which is good for everyone, because the inherent competition will force Nvidia/ATI to lower prices. I like this move. Go Intel!

compete against a decade of experience (1)

peter303 (12292) | more than 7 years ago | (#17723534)

Even Intel doesn't get it perfect in the first chips.
Another consideration is that they may have to emulate someone else's API. There's too much software out there to introduce a new one.