
Intel To Integrate DirectX 11 In Ivy Bridge Chips

kdawson posted more than 3 years ago | from the keeping-up-with-the-jonses dept.


angry tapir writes "Intel will integrate DirectX 11 graphics technology in its next generation of laptop and desktop chips based on the Ivy Bridge architecture, a company executive revealed at CES. AMD has already implemented DirectX 11 in its Fusion low-power chips. Intel expects to start shipping Ivy Bridge chips with DirectX 11 support to PC makers late this year. Ivy Bridge will succeed the recently announced Core i3, i5, and i7 chips, which are based on Intel's Sandy Bridge microarchitecture."


199 comments


also includes DRM ? (5, Insightful)

Anonymous Coward | more than 3 years ago | (#34824864)

Does it still contain the DRM restrictions capability?

Because Intel can forget all about CPU sales from us and from any of our customers until it's removed.

I don't care if it promises a free pony.
contains DRM==No sale

period

Re:also includes DRM ? (0)

Anonymous Coward | more than 3 years ago | (#34824956)

contains DRM==No sale

period

Until there are no comparable alternatives anymore... what would prevent AMD from implementing DRM, too?

Re:also includes DRM ? (1)

trum4n (982031) | more than 3 years ago | (#34825320)

Sales.

Re:also includes DRM ? (1)

Anonymous Coward | more than 3 years ago | (#34825432)

Yeah right.

They won't care about the few lost sales from the /. crowd. The rest of the world will still buy their products. There may be a small outcry among IT people, but because they are so dependent on the technology, they will have no choice but to buy it. After a while, the whole thing will fade into oblivion.

Re:also includes DRM ? (1, Funny)

fnj (64210) | more than 3 years ago | (#34825016)

What the heck are you babbling about? Do you have the slightest idea?

Re:also includes DRM ? (5, Informative)

supersloshy (1273442) | more than 3 years ago | (#34825218)

What the heck are you babbling about? Do you have the slightest idea?

I believe he's babbling about this [techdirt.com]. Sandy Bridge will have DRM in it (though they don't call it that, for some weird reason), and Sandy Bridge is directly related to Ivy Bridge [wikimedia.org], so Ivy Bridge could inherit its DRM features.

Disclaimer: I am a total n00b when it comes to discussing processor architectures, so I could be wrong about something.

Re:also includes DRM ? (3, Interesting)

fnj (64210) | more than 3 years ago | (#34825472)

At least that is a coherent discussion, which I haven't seen elsewhere. But when idiots talk about DRM, they lose contact with reality. Content producers want true end to end DRM for obvious reasons. This just gives them a way to realize that. It can't encumber anything that presently exists. It just allows some new DRM'ed protocol to be developed; one that only works on recent Intel processors.

So what? If you don't like closed content, just don't use it!

Re:also includes DRM ? (4, Insightful)

julesh (229690) | more than 3 years ago | (#34825792)

So what? If you don't like closed content, just don't use it!

Widespread deployment of systems that allow closed content is likely to encourage content providers currently releasing content through unprotected or insecure systems to switch to a more secure closed system. This reduces the utility of open source software, which is almost universally unable to take advantage of this kind of system, because the protection measures typically require signed, trusted code. Hence, it is something that should be discouraged.

That said, boycotting closed media is likely to be just as effective as boycotting hardware that supports it; probably more so, as it is somewhat more direct.
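
The lock-out julesh describes can be sketched in miniature: a protected playback path refuses to run code whose signature doesn't verify, which is exactly what excludes unsigned open-source players. This is an illustrative Python sketch only; real schemes use asymmetric signatures and a hardware root of trust, not a shared HMAC secret, and `SIGNING_KEY`, `sign`, and `load_player` are all hypothetical names.

```python
import hashlib
import hmac

# Hypothetical key baked into the platform (illustrative only).
SIGNING_KEY = b"device-secret"

def sign(code: bytes) -> bytes:
    """What a vendor's signing server would produce for approved code."""
    return hmac.new(SIGNING_KEY, code, hashlib.sha256).digest()

def load_player(code: bytes, signature: bytes) -> bool:
    """Run playback code only if its signature verifies; unsigned or
    modified (e.g. open-source) players fail this check."""
    return hmac.compare_digest(sign(code), signature)
```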

Re:also includes DRM ? (1)

fnj (64210) | more than 3 years ago | (#34825854)

Almost all boycotts are quixotic.

Re:also includes DRM ? (1)

mdielmann (514750) | more than 3 years ago | (#34826284)

It can't encumber anything that presently exists.

So you can't imagine a scenario where the content playback system will only play properly signed media? Sure, it's a broken system (for something that can play purchased media), but it's been tried before, and will be tried again. I'm hoping none of the future implementations succeed.

Netflix falls into this category. While I don't like DRM in principle, I can accept it in certain areas. Media rentals is one of them.

and if... (1)

aeoo (568706) | more than 3 years ago | (#34826392)

So what? If you don't like closed content, just don't use it!

And if you don't like the CPUs that support the creation of the closed content, just don't buy them!

Re:also includes DRM ? (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34825224)

He's babbling about DRM.

What that has to do with this Intel Chip? I don't know. But at least I have a SLIGHT idea what he's ranting about.

Re:also includes DRM ? (2)

Joce640k (829181) | more than 3 years ago | (#34826036)

Anything with an HDMI output has to support DRM so people can't record the signal.

(We have the master key so, yes, it's a waste of time but Intel is contractually bound to support the DRM if they want to have HDMI output)

Re:also includes DRM ? (0, Redundant)

Shikaku (1129753) | more than 3 years ago | (#34825148)

http://en.wikipedia.org/wiki/Trusted_Platform_Module [wikipedia.org]

Are you talking about hardware DRM? Or software DRM? Intel has nothing to do with DRM unless it's TPM, which for now is used almost nowhere except on some laptops at companies with critical data. DRM is almost all software.

How about you explain yourself because your post is cryptic FUD.

Note: the most DRM I'd advocate ever is Steam facsimile DRM.

Re:also includes DRM ? (1)

JonySuede (1908576) | more than 3 years ago | (#34825290)

go read the slashdot [slashdot.org] article on sandy bridge

Other OSes ? (5, Interesting)

SirGeek (120712) | more than 3 years ago | (#34824866)

Will Intel provide documentation so that other OSes will be able to make use of this feature ?

Re:Other OSes ? (3, Insightful)

Surt (22457) | more than 3 years ago | (#34825144)

Almost certainly. They want to sell hardware, and being a full generation or more behind their competitors, have no reason to hold back any secrets of their implementation.

Re:Other OSes ? (1)

Threni (635302) | more than 3 years ago | (#34825160)

What about OpenGL? Does this hardware help OpenGL support?

Re:Other OSes ? (3, Insightful)

Surt (22457) | more than 3 years ago | (#34825212)

Yes. Assuming someone writes the driver. DX11 is a bit ahead of OGL in hardware requirements/capabilities, so full support for dx11 means it has everything OGL needs also.

Re:Other OSes ? (5, Informative)

kyz (225372) | more than 3 years ago | (#34825556)

Better than that. In OpenGL, if you say "give me this vendor-specific feature", you get it. Programmers have used this to get at the latest features of chipsets long before they're standardized.

OpenGL programmers are always ahead of DirectX, even in this case where the hardware directly targets future DirectX specs.

It's like using -moz-border-radius, -webkit-border-radius and -khtml-border-radius to get CSS3 rounded borders long before CSS3 is officially released, and yet CSS3 won't be beholden to any one browser's implementation.
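
The probe-then-fall-back pattern kyz describes looks roughly like this in Python (the property names are real CSS vendor prefixes from the era; the function itself is illustrative, not any browser's actual code):

```python
def pick_border_radius(supported_properties):
    """Prefer the standard property, then vendor prefixes: the same
    probe-and-fall-back dance GL code does with its extension string."""
    for prop in ("border-radius", "-moz-border-radius",
                 "-webkit-border-radius", "-khtml-border-radius"):
        if prop in supported_properties:
            return prop
    return None  # no rounded corners on this engine
```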

Re:Other OSes ? (2)

Surt (22457) | more than 3 years ago | (#34825616)

You can get to the vendor specific features in directx also. But in either case, that's definitely the ugly way to write code.

Re:Other OSes ? (2)

SplashMyBandit (1543257) | more than 3 years ago | (#34826414)

> You can get to the vendor specific features in directx also. But in either case, that's definitely the ugly way to write code.

lol. Some folks still don't get it. DirectX is 'vendor specific' no matter what manufacturer's chipset is supported. That's why the guys doing OpenGL (ES) can write for Android, and iPhone/iPad, and Linux, and Solaris, and Mac OS X, *AND* Windows.

Incidentally, your "DX11 is a bit ahead of OGL in hardware requirements/capabilities" is incorrect (it used to be true for a while, but not anymore). Suggest you check out the latest and greatest OpenGL spec. Oh yeah, OpenGL can do what DX10 & 11 do (if you have the graphics hardware) on Windows XP too. Enjoy your homework reading the OpenGL spec.

Re:Other OSes ? (2)

citizenr (871508) | more than 3 years ago | (#34825978)

Almost certainly. They want to sell hardware, and being a full generation or more behind their competitors, have no reason to hold back any secrets of their implementation.

sure, just like GMA 500

Re:Other OSes ? (0)

Anonymous Coward | more than 3 years ago | (#34825202)

Direct X API is Direct X API.

They don't need to include any more documentation about the API than ATi or NVIDIA do, since it's the exact same thing.

Oh, also, why would Intel need to provide anything for other OSes? DirectX is a Microsoft product, not an Intel product.

http://en.wikipedia.org/wiki/DirectX

Re:Other OSes ? (4, Informative)

petermgreen (876956) | more than 3 years ago | (#34825518)

Direct X is a Microsoft product
Direct X isn't really a product (you can't buy it, and never have been able to). DirectX itself is a set of interfaces supplied by Windows for various gaming-related things, most significantly, these days, 3D graphics.

These days each version of DirectX specifies a set of required features. A "DirectX 11 card" means a card that implements all the features required by DirectX 11. In this context it's perfectly reasonable to ask whether those features will be exposed to other operating systems.
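
petermgreen's point, that a "DirectX 11 card" is simply one implementing a required feature set, can be modeled as a set-containment check. The feature names below are illustrative, not Microsoft's actual feature-level definitions:

```python
# Hypothetical required feature sets per API level (illustrative names).
REQUIRED_FEATURES = {
    "dx10": {"unified_shaders", "geometry_shaders"},
    "dx11": {"unified_shaders", "geometry_shaders",
             "tessellation", "compute_shaders"},
}

def qualifies_as(level, hardware_features):
    """A card 'is' a DX-level card iff it implements every required feature."""
    return REQUIRED_FEATURES[level] <= hardware_features
```

Exposing the same features to another OS is then just a matter of a driver advertising the same set through that OS's API.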

Re:Other OSes ? (1)

Jeremy Erwin (2054) | more than 3 years ago | (#34826254)

And just how were you planning to write the drivers without documentation?

RISC please (-1)

Anonymous Coward | more than 3 years ago | (#34824880)

Can we switch back to a RISC architecture, please? Or is it too late?

Re:RISC please (1)

Surt (22457) | more than 3 years ago | (#34825042)

For the foreseeable future you can have your pick of ARM and x86.
On the plus side, x86 has been pretty much RISC internally for a long time now, and a lot of the ISA has been changed over too. Once they tack on one or two more ISA extensions you'll be able to have 100% of your code avoid the x86 path.

Re:RISC please (2)

sexconker (1179573) | more than 3 years ago | (#34825572)

Nvidia is making ARM CPUs.
The next version of Windows will run on ARM.

So, yes.

And if you're a Linux zealot, you can compile your kernel for whatever target hardware you want.

Re:RISC please (4, Insightful)

the linux geek (799780) | more than 3 years ago | (#34826060)

Why? What RISC architecture provides the same price/power/performance ratio that x86 provides?

POWER is fast and has an excellent power/performance, but entry-level systems cost ~$3500 after discounts.
Itanium is fast, but expensive and power-hungry.
MIPS is fast and power-efficient, but none of the players in the high-performance MIPS market have any interest in anything but network processors.
SPARC gives you two options - SPARC64 (slow, expensive, power-inefficient) and SPARC T-series (fast, but only for throughput-driven workloads; expensive; fairly power-hungry)
ARM has good power and price characteristics, but is slow compared to any production x86 chip except the Atoms and ULV stuff.

Basically, I'm not seeing a credible alternative to x86 for the market that it thrives in. If you want to pay up and get a nice fast RISC system, they're out there; alternatively, if you want a somewhat slower one for cheap, ARM is always available.

Hard-wired DirectX? (0)

Wowsers (1151731) | more than 3 years ago | (#34824892)

And what use is this to those who do not use Microsoft Windows? And what use is it when a bug is found in DirectX? You can change software, but hardware?

Re:Hard-wired DirectX? (0)

peragrin (659227) | more than 3 years ago | (#34824954)

Worse, what happens when DirectX 12 comes along? Is the hardware useless? Can the hardware be upgraded? DirectX 10 is only 4 years old; that isn't a lot of time for hardware, not when people are getting 6+ years out of a given machine.

Re:Hard-wired DirectX? (4, Insightful)

Anonymous Coward | more than 3 years ago | (#34825072)

Worse, what happens when DirectX 12 comes along? Is the hardware useless? Can the hardware be upgraded?

1) The same thing that happens when you install DirectX 10 on a DX9 card: the DX9 subset of DX10 is hardware accelerated, the DX10 parts are run in software.

2) No. It's not useless. It will still accelerate everything it was accelerating before.

3) Probably not. But who cares? Either replace it, or live with a subset of current functionality.
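
The AC's point 1, accelerate the subset the chip supports and fall back to software for the rest, is the standard dispatch pattern. A minimal sketch, with illustrative feature names:

```python
# Features the older chip accelerates (illustrative).
HW_FEATURES = {"vertex_shaders", "pixel_shaders"}

def dispatch(call_feature):
    """Route each API call to the hardware path if the chip supports it,
    otherwise to a much slower software rasterizer path."""
    return "hardware" if call_feature in HW_FEATURES else "software"
```

So a DX11 runtime on DX9-class hardware still accelerates everything the old card accelerated; only the newer features take the slow path.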

Re:Hard-wired DirectX? (2)

Surt (22457) | more than 3 years ago | (#34825100)

What happens to your nvidia 580 card when dx 12 comes along? Exactly the same thing happens with these cpus. Either you live with the reduced functionality, or you put in a new video card, assuming your motherboard has a graphics card slot.

Re:Hard-wired DirectX? (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34824988)

And what use is it when a bug is found in DirectX, you can change software, but hardware?

Well, considering DX11 has been out for a while and has generally been tested for bugs already - the idea is that you won't HAVE a bug if it's in the hardware - there's nowhere for the variables to change values based on a different CPU build or other factors if the calculations are specifically designed to run on that piece of hardware. At least, that's the theory.

But yeah - this does nothing if you typically aren't running Windows. Though I'm more concerned about what this will do to the future of DirectX. Where will the push be to improve things for a DX12 if everything is neatly designed for DX11? We've got enough backwards compatibility issues with old games requiring DX3 not working anymore.

This seems like a prime time for OpenGL to pick up speed. Specific hardware to meet DX11 makes it sound like the DX development process is becoming stagnant. Otherwise, why would you bother?

DirectX isn't open (1)

bussdriver (620565) | more than 3 years ago | (#34825400)

OpenGL has to please a large group with more uses than just games; it is developed with input from the wide range of developers that use it. It's open, more democratic.
The DirectX dictatorship is faster and likely more efficient (in a way), but it comes at a price that wiser people are not willing to pay.

I'll take slow freedom.

If they could do everything ass-backwards without a speed loss just to make it extremely hard to port to/from OpenGL, DirectX would do that. If they really just wanted to move faster, they could add layers above OpenGL and hack new features into OpenGL; essentially fork it. They don't, because their intention is not merely to be ahead of the curve; it's about power.

Re:Hard-wired DirectX? (1)

0123456 (636235) | more than 3 years ago | (#34825812)

the idea is that you won't HAVE a bug if it's in the hardware

I can tell you've never developed graphics hardware or drivers... I'm sure the people I know who do that will be glad to know that they won't have to work around chip bugs anymore.

Re:Hard-wired DirectX? (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34826342)

I've worked with DirectX at a low level a bit, but no, I've never actually had to develop the hardware or the drivers for such devices.

What I was getting at is that if the chip is designed specifically for DirectX 11, you shouldn't have DirectX 11 bugs. Yes, chip bugs definitely do exist, but I would think (though I have no proof) that when a piece of hardware is designed for a specific task, it generally performs that one task better and has issues elsewhere.

Re:Hard-wired DirectX? (1)

Joce640k (829181) | more than 3 years ago | (#34826098)

It's not really hard-wired hardware these days. The graphics chip runs code which is uploaded when the machine boots. Fixing a bug is usually just a driver update.

Re:Hard-wired DirectX? (1)

LWATCDR (28044) | more than 3 years ago | (#34825028)

Ahhh... no.
DirectX has certain hardware requirements. They are not going to hardwire in DirectX, but will instead support all the hardware features that DirectX 11 needs.
I hope they support OpenCL as well.
I am not a gamer, but I would love to see more programs use the GPU for transcoding and other non-gameplay uses.
DX11 does support GPGPU, but I use OS X, Linux, and Windows, so I want standards support.
 

Re:Hard-wired DirectX? (1)

Surt (22457) | more than 3 years ago | (#34825066)

It's not what you think. It's a built-in graphics card on the CPU. That graphics card has all the hardware necessary to support the directx 11 api. If they change the directx API, intel changes the driver.

Re:Hard-wired DirectX? (2)

peragrin (659227) | more than 3 years ago | (#34825162)

So why not do it generically? IBM's Cell chips integrate a vector unit on the CPU. Intel and AMD both have video chips integrated into the CPU. So why not integrate a vector co-processor, like the old AltiVec on PPC?

Why not use a generic chip designed for that type of instruction set? That way you're not limited to software versions for your hardware.

Re:Hard-wired DirectX? (4, Informative)

Surt (22457) | more than 3 years ago | (#34825252)

So why not do it generically? IBM's Cell chips integrate a vector unit on the CPU. Intel and AMD both have video chips integrated into the CPU. So why not integrate a vector co-processor, like the old AltiVec on PPC?

Why not use a generic chip designed for that type of instruction set? That way you're not limited to software versions for your hardware.

Because sufficiently generic hardware is not sufficiently fast at the desired task, graphics computation. Even with the optimization intel has put into this, they'll be MORE than an order of magnitude of graphics performance behind the dedicated solutions of their competitors.

Re:Hard-wired DirectX? (1)

Joce640k (829181) | more than 3 years ago | (#34826140)

The chip's instruction set will be designed around the shading languages used in 3D graphics, it won't be very generic.

Re:Hard-wired DirectX? (1)

Surt (22457) | more than 3 years ago | (#34826422)

Yeah exactly ... it wasn't at all clear how 'generic' the grandparent wanted ... so I actually replied twice depending on which level of generic they wanted.

Re:Hard-wired DirectX? (1)

Surt (22457) | more than 3 years ago | (#34825266)

Actually, on rereading your post ... I think it may actually meet your definition. It isn't hard-wired for dx11. There will be a driver. That driver can be modified/optimized later. The hardware is, in fact, generic graphics hardware, at least in the sense I think you mean.

Re:Hard-wired DirectX? (1)

sgt scrub (869860) | more than 3 years ago | (#34825306)

because DirectX sounds cooler to marketing?

Re:Hard-wired DirectX? (1)

Shikaku (1129753) | more than 3 years ago | (#34825208)

Use Linux?

http://intellinuxgraphics.org/ [intellinuxgraphics.org]

All Intel drivers are open source on Linux. I have no idea about code quality or upkeep, so I will say nothing except that I know they contribute regularly.

What other kind of DirectX do you think there is? (2)

fnj (64210) | more than 3 years ago | (#34825358)

Do you know some other way to do it? All graphics cards incorporate "hard-wired DirectX". If you are going to have graphics accelerators, they have to accelerate graphics. You can't meaningfully accelerate blits to frame buffers any faster than they already are; you have to accelerate higher-level graphics abstractions. That's all DirectX is - an abstraction of higher-level graphics operations. Any software, such as OpenGL, can (and does) tap into the better-chosen of those abstractions.

DirectX (4, Funny)

Anonymous Coward | more than 3 years ago | (#34824910)

Goes to 11!

(I'm sorry)

Intel integrated graphics (2, Insightful)

node 3 (115640) | more than 3 years ago | (#34824922)

I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

Re:Intel integrated graphics (1)

Shikaku (1129753) | more than 3 years ago | (#34825238)

The main point of Intel graphics is it is cheap. If you want a barebones low graphics computer you buy integrated, which Intel regularly develops, mostly for use in laptops (which add the bonus of power savings).

Re:Intel integrated graphics (1)

UnknowingFool (672806) | more than 3 years ago | (#34826000)

Yes, but my understanding is you don't get that choice for some models of the CPU. For the current mobile i3, i5, and i7 series, the Intel GPU is integrated into the chipset. So if you get a new i7 and a discrete GPU, the Intel GPU is just disabled. Apple has done some work so that both the Intel GPU and the discrete GPU operate, depending on the on-demand video requirements. In Ivy Bridge the GPU will be integrated into the CPU, not just the chipset.

Re:Intel integrated graphics (1)

Late Adopter (1492849) | more than 3 years ago | (#34825260)

Then you don't want Intel graphics. The point of their hardware is to make it cheap: low power usage and low die size. Features are just engineering time, and that's something Intel has a lot of.

Re:Intel integrated graphics (1)

blair1q (305137) | more than 3 years ago | (#34825268)

That's what "support" means when talking about graphics. Graphics processing is all about taking some piece of over-used software and putting it in hardware so that it consumes a few hundred picoseconds instead of several dozen nanoseconds per iteration. It makes common algorithms run faster.

DirectX is a standard for a set of common algorithms. It makes sense to implement as many of them in hardware as you can. DirectX11 is merely the latest iteration of DirectX, and the first to get consideration as part of the CPU die itself*.

* - Sony's Cell processor integrates GPU and CPU functionality, but I don't expect that it was designed with DirectX in mind at all.
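
Taking blair1q's figures at face value, the implied speedup from moving an operation into hardware is roughly two orders of magnitude. The numbers below are his illustrative estimates, not measurements:

```python
# "Several dozen nanoseconds" per iteration in software, expressed in ps.
software_ps_per_iter = 40_000
# "A few hundred picoseconds" per iteration in hardware.
hardware_ps_per_iter = 400

# Roughly a 100x win, which is why common algorithms go into silicon.
speedup = software_ps_per_iter / hardware_ps_per_iter
```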

Re:Intel integrated graphics (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34825286)

I suppose it's easier to implement something than it is to implement it well.

80/20 rule.

Re:Intel integrated graphics (2)

divisionbyzero (300681) | more than 3 years ago | (#34825292)

I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

It will include DirectX 11 *and* theoretically be twice as fast as Sandy Bridge. Not much to complain about there.

P.S. By theoretically I mean it will have twice as many stream processors.

Intel integrated graphics at anandtech.com (4, Informative)

IYagami (136831) | more than 3 years ago | (#34825300)

You can find Sandy Bridge GPU benchmarks at http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]

"Intel's HD Graphics 3000 makes today's $40-$50 discrete GPUs redundant. The problem there is we've never been happy with $40-$50 discrete GPUs for anything but HTPC use. What I really want to see from Ivy Bridge and beyond is the ability to compete with $70 GPUs. Give us that level of performance and then I'll be happy.

The HD Graphics 2000 is not as impressive. It's generally faster than what we had with Clarkdale, but it's not exactly moving the industry forward. Intel should just do away with the 6 EU version, or at least give more desktop SKUs the 3000 GPU. The lack of DX11 is acceptable for SNB consumers but it's—again—not moving the industry forward. I believe Intel does want to take graphics seriously, but I need to see more going forward."

Note: all Sandy Bridge laptop CPUs have Intel HD Graphics 3000.

Re:Intel integrated graphics at anandtech.com (1)

CrashNBrn (1143981) | more than 3 years ago | (#34826432)

Yet you still need an i7 + Intel integrated graphics and an i7-compatible motherboard to get the performance of a ~$50 dedicated GPU. Price-wise, you could go with an AMD solution and a dedicated GPU in the $75-$100 range from Nvidia or AMD and still pay half as much for better 3D performance.

The numbers look even worse for Intel if you grab an "off-the-shelf" dedicated GPU that's one generation older, e.g. a 1GB Radeon 4670 for ~$65.

AMD also has Hybrid graphics, first introduced with the Puma or Spider platform:

Hybrid Graphics [realworldtech.com]
The 780 chipset is the first product to use a "hybrid" multi-GPU set up, aptly named, Hybrid Crossfire. Hybrid Crossfire operates a discrete GPU (HD 34xx) in tandem with the IGP to boost performance above what either could achieve separately.

Re:Intel integrated graphics (3, Insightful)

TheTyrannyOfForcedRe (1186313) | more than 3 years ago | (#34825368)

I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

Have you seen performance numbers for Sandy Bridge's on chip graphics? The "Intel graphics are slow" meme is dead. Sandy Bridge's integrated gpu beats most discrete graphics cards under $50. The Ivy Bridge solution will be even faster.

http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]

Re:Intel integrated graphics (4, Insightful)

0123456 (636235) | more than 3 years ago | (#34825868)

The "Intel graphics are slow" meme is dead.

For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.

Re:Intel integrated graphics (0)

Anonymous Coward | more than 3 years ago | (#34826112)

I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

It's an attempt to get lock-in between Microsoft operating systems and the hardware; otherwise Intel would implement OpenGL in the chip as well. Plus Intel is incorporating DRM in the chip to keep Hollywood happy, locking out any FOSS use of the chip's features.

In essence, it becomes a Win-chip, similar to winmodems and winprinters, ergo "junk".

Intel has never been on the forefront of DX dev. (1)

Lashat (1041424) | more than 3 years ago | (#34824944)

They have always waited for other graphics companies to lead this charge. It's a huge effort on the hardware side to be Microsoft's partner on this. The benefit is that you get out the door first, but Intel has never pushed for that leadership position in graphics (so long as you don't count volume).

Two Questions (3, Interesting)

chill (34294) | more than 3 years ago | (#34824960)

1. Will this in any way benefit OpenGL?

2. Will this hinder future versions of DirectX or are they backwards compatible in a way that there would be large chunks in hardware and new changes made as firmware revisions or software implementations?

Re:Two Questions (2)

Surt (22457) | more than 3 years ago | (#34825110)

The hardware has all the features necessary to support dx11. dx11 is generally a superset of what opengl can do. So yes, opengl should be fully supported, assuming someone writes the driver.

Re:Two Questions (1)

Burnhard (1031106) | more than 3 years ago | (#34825156)

I read that Intel's drivers are notoriously shite for OpenGL. Indeed, my own experimentation showed them to be shite at D3D as well. The device I was using claimed to support PS 3.0 (in its caps), but point-blank refused to run some of my shaders (they ran ok with ATI and NVIDIA cards). I won't be supporting Intel Graphics, that's for sure.

Re:Two Questions (1)

Surt (22457) | more than 3 years ago | (#34825380)

Yeah, that's exactly why I had to put in the qualifier about the driver, unfortunately.

Re:Two Questions (2)

Tr3vin (1220548) | more than 3 years ago | (#34825116)

In theory, OpenGL 4 could take advantage of the new hardware, but Intel would have to write good OpenGL drivers. Future versions of DirectX may require new hardware. We won't know until there is a spec. If it does require new hardware, then people would have to replace their DX11 cards anyway.

First Intel CPU + GPU on die? (2)

TeknoHog (164938) | more than 3 years ago | (#34824992)

FTA:

The Sandy Bridge chips are the first in which Intel has combined a graphics processor and CPU on a single piece of silicon.

I thought Intel already did this a while ago with the newer Atom chips:

http://en.wikipedia.org/wiki/Intel_atom#Second_generation_cores [wikipedia.org]

Re:First Intel CPU + GPU on die? (1)

Surt (22457) | more than 3 years ago | (#34825178)

I'm sure the article meant the mainstream x86 line, but failed to say it. Or more likely, it was written by someone who doesn't care about the platforms Atom is aimed at, and therefore didn't know.

Re:First Intel CPU + GPU on die? (1)

blair1q (305137) | more than 3 years ago | (#34825356)

They had. The news here is that (more of) the DirectX 11 API will be in HW.

Who cares? (1)

dicobalt (1536225) | more than 3 years ago | (#34824996)

It will still be too slow to use it for anything that is DirectX 11. Why do they even bother?

Re:Who cares? (1)

blair1q (305137) | more than 3 years ago | (#34825372)

In what way do you mean?

Putting graphics processing in HW instead of doing it in SW is always better, and Intel currently rules in HW speed for mainstream chips.

So it's hard to tell what you're saying.

Re:Who cares? (1)

Surt (22457) | more than 3 years ago | (#34825494)

DX11 titles are so high-end that no one would find them playable with the capabilities of Intel HW. Intel HW indeed rules integrated graphics (until Fusion is on the street), but no one plays high-end DX10 titles, much less DX11 titles, on such hardware. So why bother implementing DX11 at all (instead of, for example, making DX10 faster, possibly fast enough to play high-end DX10 titles), when it won't be usably fast for any actual DX11 software? The answer, of course, is marketing.

Re:Who cares? (1)

blair1q (305137) | more than 3 years ago | (#34825568)

If you're buying high-end software, why are you expecting to play it on low-end hardware?

Integrated GPU/CPU will always be lower performance than discrete. If you want bleeding-edge, open your wallet.

Re:Who cares? (1)

Surt (22457) | more than 3 years ago | (#34825786)

Precisely. So why is Intel bothering to support dx11? That's high-end only, and won't be playable on their hardware, even though it's 'supported'.

Re:Who cares? (1)

blair1q (305137) | more than 3 years ago | (#34825876)

Because things change, and DX11 will soon enough be the low end.

Re:Who cares? (1)

Surt (22457) | more than 3 years ago | (#34825986)

Then (if it weren't for marketing) maybe it would make sense to implement DirectX 11 in the next generation, or the one after that, when they can actually make DirectX 11 content usable.

Re:Who cares? (1)

0123456 (636235) | more than 3 years ago | (#34825830)

If you're buying high-end software, why are you expecting to play it on low-end hardware?

What's the point of supporting DX11 if the game is unplayable?

My laptop's graphics card supports DX10, but if I enable the DX10 engine in any game I own that has one then the frame rate halves. So why bother?

Re:Who cares? (1)

blair1q (305137) | more than 3 years ago | (#34825894)

You shouldn't bother paying for something that doesn't work for you. If you bought that laptop for the DX10 you should return it and get one that works.

Great! (4, Funny)

TechyImmigrant (175943) | more than 3 years ago | (#34825000)

Those new texture mapping algorithms will really make outlook load fast.

Re:Great! (1)

Surt (22457) | more than 3 years ago | (#34825128)

The 3d text mode in outlook 2012 is pretty cool. The words are practically poking you in the eyeballs!

Re:Great! (2)

sgt scrub (869860) | more than 3 years ago | (#34825330)

cool! using outlook always felt like someone was poking me in the eye. now maybe others will be able to relate.

Re:Great! (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34825448)

I love the way it bump-mapped the bumped post on 4chan.

They actually may (1)

melted (227442) | more than 3 years ago | (#34825456)

They actually may, seeing that the entire GUI frontend of EVERYTHING in Vista and Windows 7 is basically a multithreaded version of Direct3D. Those "reflections" on the edges of the window frame? They're textures. And textures require mapping.

But will it improve Minecraft's graphics? (5, Funny)

digitaldc (879047) | more than 3 years ago | (#34825032)

That's what I am worried about, I want my Minecraft landscapes to be rendered better.

Re:But will it improve Minecraft's graphics? (2)

Surt (22457) | more than 3 years ago | (#34825188)

No. That's a problem in the minecraft client, not in the hardware that displays it.

Re:But will it improve Minecraft's graphics? (2)

kyz (225372) | more than 3 years ago | (#34825598)

Minecraft uses LWJGL, the lightweight Java game library, which in turn uses OpenGL.

A better graphics card, or better graphics driver, will render Minecraft better.

Re:But will it improve Minecraft's graphics? (1)

Surt (22457) | more than 3 years ago | (#34825776)

Not unless minecraft improves the features they are using. It's a really primitive design, there's almost no way any existing card isn't rendering what minecraft puts out at maximum quality.

Re:But will it improve Minecraft's graphics? (0)

Anonymous Coward | more than 3 years ago | (#34826162)

Pretty sure the issue is that Minecraft is displayed using voxels (pixels with volume) rather than textured polygons.

He did this because it is easier, but the graphics quality suffers some... a lot.

Re:But will it improve Minecraft's graphics? (1)

Tolkien (664315) | more than 3 years ago | (#34825312)

Are you saying you want the blocks to be less blocky or more blocky?

Re:But will it improve Minecraft's graphics? (1)

TheL0ser (1955440) | more than 3 years ago | (#34825552)

The blocks should be more blocky but look less blocky.

Re:But will it improve Minecraft's graphics? (2)

Colonel Korn (1258968) | more than 3 years ago | (#34825798)

The blocks should be more blocky but look less blocky.

I want tessellated blocks. The entire Minecraft world should be a dynamic fractal, with the shape of each individual block mirroring the structure of the whole.

AMD has already implemented DirectX 11 in its F... (1)

drinkypoo (153816) | more than 3 years ago | (#34825222)

AMD has already implemented DirectX 11 in its Fusion low-power chips.

As has nVidia in GTX 400 [wikipedia.org].

Re:AMD has already implemented DirectX 11 in its F (2)

Surt (22457) | more than 3 years ago | (#34825514)

gtx 400 isn't integrated onto a cpu, which I think was the point.


DirectX who? (0, Troll)

mutherhacker (638199) | more than 3 years ago | (#34825366)

Does anybody really care about DirectX anymore? Linux, Android and Apple are picking up pace. OpenGL all the way.

Re:DirectX who? (4, Informative)

Burnhard (1031106) | more than 3 years ago | (#34825390)

Given that DX is driving innovation in graphics cards at the moment and that GL is playing catch-up, the answer has to be "yes".

Re:DirectX who? (1)

Joce640k (829181) | more than 3 years ago | (#34826232)

None of these chips execute 'Direct3D' or 'OpenGL' directly, they remap the functions to an internal 3D API.

OpenGL and Direct3D do mostly the same things so it's not much of a hardship for the driver writers.
