Slashdot: News for Nerds


DirectX 10 Hardware Is Now Obsolete

Zonk posted more than 6 years ago | from the shouldn't-have-blinked dept.

ela_gervaise writes "SIGGRAPH 2007 was the stage where Microsoft dropped the bomb, informing gamers that the currently available DirectX 10 hardware will not support the upcoming DirectX 10.1 in Vista SP1. In essence, all current DX10 hardware is now obsolete. But don't get too upset just yet: 'Gamers shouldn't fret too much - 10.1 adds virtually nothing that they will care about and, more to the point, adds almost nothing that developers are likely to care about. The spec revision basically makes a number of things that are optional in DX10 compulsory under the new standard - such as 32-bit floating point filtering, as opposed to the 16-bit current. 4xAA is a compulsory standard to support in 10.1, whereas graphics vendors can pick and choose their anti-aliasing support currently. We suspect that the spec is likely to be ill-received. Not only does it require brand new hardware, immediately creating a minuscule sub-set of DX10 owners, but it also requires Vista SP1, and also requires developer implementation.'"

373 comments

More juice! (5, Funny)

JosefAssad (1138611) | more than 6 years ago | (#20193891)

"4xAA is a compulsory standard to support in 10.1"

That would seem to me to be the biggest change: that it requires batteries now.

Re:More juice! (0)

Anonymous Coward | more than 6 years ago | (#20194039)

Joking aside, making AA mandatory is a pretty good thing. I'll have to go check if they specifically define the kosher method as multisampling -- recycling pixel shader and texture data for all pixel subsamples and just resampling the underlying geometry for each subsample, instead of resampling everything for each subsample (supersampling) -- as one of the problems in game engines is that some do post-processing tricks that are incompatible with multisampling (having subsamples in the back buffer). Developers actually need a bit of hand holding and arm twisting here, for the benefit of gamers.

Re:More juice! (0)

Anonymous Coward | more than 6 years ago | (#20194123)

If by "good thing" you mean "stupid thing", then I totally agree. The user should always have the choice to either use it or not. On many games, I can run at a much higher resolution without AA than with. In my opinion, high resolution > antialiasing.

For other games (primarily online), I want game speed above all else. I don't care how "pretty" it looks if framerates suffer.

Re:More juice! (5, Interesting)

Solra Bizna (716281) | more than 6 years ago | (#20194493)

Support of the feature by the video card is mandatory. Use of the feature by the game is not.

At least, that's how I understand it.

That aside, am I the only person who remembers reading this "bomb" months back? The plan was that instead of checking for individual features (and coding around their lack case-by-case, like we will still get to do with OpenGL) the developer would check for a DirectX version, leaving fewer opportunities for wonky bugs from weird support combinations.

-:sigma.SB

(Disclaimer: I am a game developer who exclusively uses OpenGL for hardware 3D and I fully intend never to write a single line of DirectX code. Ever.)

Re:More juice! (1)

plague3106 (71849) | more than 6 years ago | (#20194565)

That aside, am I the only person who remembers reading this "bomb" months back? The plan was that instead of checking for individual features (and coding around their lack case-by-case, like we will still get to do with OpenGL) the developer would check for a DirectX version, leaving fewer opportunities for wonky bugs from weird support combinations.

Seems like the DX way would be a good thing to me; your game will display more properly and you should have less support issues. The less exceptions you have in your code the easier it is to write, debug and maintain.

Re:More juice! (3, Funny)

mpe (36238) | more than 6 years ago | (#20194115)

"4xAA is a compulsory standard to support in 10.1"

That would seem to me to be the biggest change: that it requires batteries now.

Presumably Microsoft will be calling one of the new features "EverReady Boost" :)

Re:More juice! (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#20194265)

Makes sense. NiMH-AAs can deliver serious peak current.

How about virtual memory? (-1, Troll)

TheLink (130905) | more than 6 years ago | (#20193893)

In graphics cards that is. Is that required in 10 or 10.1?

Wait... (3, Funny)

Draconix (653959) | more than 6 years ago | (#20193895)

You mean developers are actually using DirectX 10?

Re:Wait... (-1, Troll)

Anonymous Coward | more than 6 years ago | (#20193995)

i take it you're not a game developer and/but a linux user yes?

Re:Wait... (0)

Anonymous Coward | more than 6 years ago | (#20194035)

Don't know about him but I thought that you needed Vista for Direct X 10. That seems like rather a small market to be developing for. At least, compared to XP. Is there really any benefit that makes up for the lost market share?

Re:Wait... (4, Insightful)

Ilgaz (86384) | more than 6 years ago | (#20194275)

i take it you're not a game developer and/but a linux user yes?
All serious gamers are happily running Windows XP with the latest service pack. I have not yet seen a single gamer who likes Vista unless he/she has a true monster machine on which you can't tell the difference whatever you do. Some game companies have the guts to say "We do NOT support Vista at least until SP1 ships".

I am running OS X here and all my games are OS X native but you don't need DX 10 enabled Vista to browse game forums :)

The absolute need for Vista to run DX 10 killed them both from the beginning, DX 10 and Vista respectively. I am sure lots of game developers who coded Direct3D-only stuff questioned their choice and started to look at recent OpenGL advancements.

I am hoping they have finally started to figure out the risks of using an MS-only technology rather than platform-independent, documented frameworks such as OpenGL and OpenAL.

Did MS care to explain what kind of undocumented, hidden quantum computing (!) routines in Vista are needed to run DX 10? :) Or did they simply state "We can't sell Vista otherwise, those FPS racing teens will buy it for DX10"? I think they underestimated the gaming community; gamers weren't that stupid.

You think that "Linux user" wouldn't have a clue, but you forget the WINE factor. If I had a problem with a missing DLL in DirectX, I would talk to the WINE people :)

Re:Wait... (1)

fastest fascist (1086001) | more than 6 years ago | (#20194453)

I don't see how Vista or DX10 has been killed. It'll just take a few years to become the standard. Anyone who buys a new branded computer will be getting Vista, unless they specifically jump through hoops to get XP instead, and that will slowly but surely bring the majority of PCs into the Vista camp. By then, it's a moot point whether or not games on XP perform better.

Where is OpenGL when we need it? (5, Interesting)

imbaczek (690596) | more than 6 years ago | (#20193897)

This seems like a window of opportunity for a new OpenGL standard. Anybody know when it's due?

Re:Where is OpenGL when we need it? (5, Informative)

MrCoke (445461) | more than 6 years ago | (#20193933)
OpenGL 3 is announced recently: http://www.opengl.org/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic;f=3;t=015351;p=0 [opengl.org]

Re:Where is OpenGL when we need it? (1, Funny)

tksh (816129) | more than 6 years ago | (#20194279)

Uh oh, it looks like my current OpenGL hardware is now obsolete.

Re:Where is OpenGL when we need it? (3, Insightful)

Ilgaz (86384) | more than 6 years ago | (#20194313)

No, nothing can become obsolete with open industry standards like OpenGL. As a last resort, your OpenGL layer would software-render the OpenGL 3 content instead of telling the GPU to draw it. It would be dead slow, but it would still work. The same goes for backwards compatibility: I actually have a game coded against OpenGL 1.1 ages ago still running on my Quad G5, which supports OpenGL 2.

Nobody would dare claim "Upgrade your OS so you can run OpenGL 3 on your compliant hardware".

MS spent billions on DirectX and on converting some naive/beginner developers exactly for this reason: to control. Companies/developers like id Software and Blizzard spent extra millions as an answer. They are using OpenGL and OpenAL not because "they are 133t"; they use them to minimise the effects of exactly this kind of crap by MS. They don't want MS dictating to users which OS to run, using their millions of man-hours as the excuse.

This should be a clue for those .NET and upcoming SilverLight lovers too.

The extra price of OpenGL and OpenAL comes from the fact that they are intended for real developers, not people pointing and clicking in Visual Studio and claiming they are game developers.

Re:Where is OpenGL when we need it? (2, Insightful)

QunaLop (861366) | more than 6 years ago | (#20194387)

I think what you say makes a lot of sense, except the last phrase. If games are easier to write (skipping over the effectiveness/perceived effectiveness of any 'platform'), then there are more people writing games and becoming developers, which would make the game market more competitive, and thus we would have better games!

I can't see any reason why game development should not be point and click, if they made something like OpenGL easier to write for, I think it would be a positive for the game market, and might bring a viable alternative to Microsoft

Re:Where is OpenGL when we need it? (3, Interesting)

Ilgaz (86384) | more than 6 years ago | (#20194417)

I think what you say makes a lot of sense, except the last phrase. If games are easier to write (skipping over the effectiveness/perceived effectiveness of any 'platform'), then there are more people writing games and becoming developers, which would make the game market more competitive, and thus we would have better games!



I can't see any reason why game development should not be point and click, if they made something like OpenGL easier to write for, I think it would be a positive for the game market, and might bring a viable alternative to Microsoft

Open standards have some side effects. MS can make everything "click and run", but the OpenGL ARB can't, since OpenGL may also end up on some military plane's screen. MS can say "Let's drop this, it makes coding complex, nobody would use it in a game", but OpenGL can't, since it could be in use. Even some high-end phones run a stripped version of OpenGL.

I think a developer coding for multiple platforms using open standards must be far more trained/advanced than a guy firing up Visual Studio and running some "Wizards", so he/she actually deserves the extra money. I have heard OpenGL called "expensive" in many places, so I was trying to explain why.

Windows-only developers and game developers are already politely bribed by MS. Making OpenGL the easiest technology ever to code for won't change their Direct3D obsession, nor will they magically ship a native OS X/Linux game as a result.

Re:Where is OpenGL when we need it? (0)

neokushan (932374) | more than 6 years ago | (#20194471)

The fact that you don't know the difference between OpenGL, DirectX and Direct3D just shows how ignorant you are. DirectX is an entire library that allows a developer to handle nearly everything: graphics, sound, input, netplay, etc. OpenGL and OpenAL only account for graphics and sound. And also, I can tell you now that OpenGL/AL are a LOT easier to use than their DirectX equivalents. It's not this drag-and-drop fantasy you have in mind; OpenGL does a lot of work for you that you don't have to do in Direct3D, so if it's really that inferior, why would so many developers use it?

The reason Blizzard and co use OpenGL is because OpenGL is cross-platform. To them, it makes sense to write only one rendering engine for easy porting to other platforms (notice how all of the companies you list also release their software on Mac and/or Linux?), but a professional developer wouldn't be worried anyway; it's not that difficult to design your applications to have multiple renderers, so really there's no excuse.

The reason many developers have abandoned OpenGL is because it takes so long to get any new features added to it. How long were we stuck on OpenGL 1.4? Too long! Even today, effects that have been standard in Direct3D are only part of semi-official extensions for OpenGL and require a bit of fiddling to implement. This is where Microsoft has the advantage: they don't need to argue with a board of directors about what features to implement, they can just go ahead and do it and then tell ATI/Nvidia to support it or gtfo. So yeah, there's a reason D3D is so popular, and it's not just because Microsoft has managed to manipulate 90% of the games developers out there.

Re:Where is OpenGL when we need it? (0)

Anonymous Coward | more than 6 years ago | (#20194341)

>>> OpenGL 3 is announced recently: http://www.opengl.org/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic;f=3;t=015351;p=0 [opengl.org]

Is it just me or does that page give you folks a nostalgic déjà vu-ish feeling?

Re:Where is OpenGL when we need it? (3, Informative)

baadger (764884) | more than 6 years ago | (#20193937)

According to the OpenGL homepage...

The OpenGL 3 specification is on track to be finalized at the next face-to-face meeting of the OpenGL ARB, at the end of August

Re:Where is OpenGL when we need it? (1)

elFarto the 2nd (709099) | more than 6 years ago | (#20194017)

Unfortunately, it will only be released at the end of September due to a 30-day review period Khronos requires.

Re:Where is OpenGL when we need it? (0)

Anonymous Coward | more than 6 years ago | (#20193943)

:) There is a new OpenGL standard - OpenGL 3, and it sounds like it pwns. It's already done and it's going to be released officially as soon as the end of September.

Re:Where is OpenGL when we need it? (0)

Anonymous Coward | more than 6 years ago | (#20193955)

The OpenGL ARB officially announced OpenGL 3 on August 8th 2007 at the Siggraph Birds of a Feather (BOF) in San Diego, CA. OpenGL 3 is the official name for what has previously been called OpenGL Longs Peak.
I believe Longs Peak was originally meant to be 2.x, however, with Mount Evans as OpenGL 3.0.

Re:Where is OpenGL when we need it? (1)

jeevesbond (1066726) | more than 6 years ago | (#20194165)

This seems like a window of opportunity for a new OpenGL standard.

Or--even better--a window of opportunity for a new SDL [libsdl.org] version. SDL is comparable to DirectX as it offers control over sound, graphics, mouse/keyboard/joystick. OpenGL is just for graphics so comparing it with DirectX isn't really fair. :)

Re:Where is OpenGL when we need it? (-1, Flamebait)

Dwedit (232252) | more than 6 years ago | (#20194393)

Is this a joke? SDL is not comparable to DirectX in any way, especially since it runs on top of DirectX in Windows.

Buy a Mac (-1, Flamebait)

Anonymous Coward | more than 6 years ago | (#20193903)

Seriously.. the new iMac is fun to use and you can put Microsoft's psychotic mood swings on its standards behind you forever.

Re:Buy a Mac (3, Informative)

Dogtanian (588974) | more than 6 years ago | (#20194183)

Seriously.. the new iMac is fun to use and you can put Microsoft's psychotic mood swings on its standards behind you forever.
Well, you entirely missed the point... DirectX is primarily a games-oriented technology, and the graphics cards this issue affects will be mostly expensive, leading-edge ones. The type that will be mainly purchased by "hardcore" gamers.

Macs may be nice machines in many respects, but let's be honest- the range and quality of Mac games is poor in comparison with that available for Windows PCs. And then to imagine that hardcore gamers are going to replace their massively-powered PCs and $600 graphics cards with an off-the-shelf iMac and be happy with its performance...?

Seriously, get real. Nice computers, but no-one ever bought a Mac as a games machine.

Re:Buy a Mac (1)

Yahweh Doesn't Exist (906833) | more than 6 years ago | (#20194351)

>the range and quality of Mac games is poor in comparison

I don't understand why this argument still comes up.

surely the range of mac games is ALL games, whereas PCs can run almost, but not quite, all games (no linux or mac-only games).

or is the current iMac's limitation of 2.8GHz Intel Core 2 Extreme and ATI Radeon HD 2600 PRO 256MB GDDR3 just too bad to be of any use? I don't know, it's been years since I've followed "hardcore" games and specs. any good benchmarks would be appreciated.

Re:Buy a Mac (1)

Dogtanian (588974) | more than 6 years ago | (#20194539)

I don't understand why this argument still comes up.
Because the arguments I made still apply.

surely the range of mac games is ALL games,
If you mean that Macs can run Windows games via Boot Camp, that misses the point I was replying to. They said "Seriously.. the new iMac is fun to use and you can put Microsoft's psychotic mood swings on its standards behind you forever." Well, you still need to run Windows to play Windows games. Even if it's on an x86 Mac.

or is the current iMac's limitation of 2.8GHz Intel Core 2 Extreme and ATI Radeon HD 2600 PRO 256MB GDDR3 just too bad to be of any use?
Again, this misses the point. It's not whether it's "good enough" for Joe Average to play new games. Since we were discussing DirectX 10 cards (i.e. expensive leading edge ones), it's whether the off-the-shelf iMac graphics gives performance comparable to *them* and whether the hardcore fanboys who buy them would be happy with the iMac instead. Personally, I don't think that's likely.

I don't know, it's been years since I've followed "hardcore" games and specs. any good benchmarks would be appreciated.
I don't consider myself a gamer either. However, given that this is an "off-the-shelf" configuration for a mainstream Mac, a range of computers that has never been focused on games, it seems highly improbable that it's going to compete with a very expensive, cutting-edge graphics card.

I'm assuming that even if (in general) it's possible to connect PC graphics cards to an Intel Mac, this wouldn't be possible in an iMac due to lack of internal space/expansion, so that's not an option, even if you were prepared to buy an iMac just to have it running Windows 80% of the time (which seems kind of pointless).

Re:Buy a Mac (1)

Ilgaz (86384) | more than 6 years ago | (#20194371)

I still wonder what would happen if half of the people monkeying with Wine and breaking 100s of evil MS license terms or dual booting went and bought Linux native games instead.

Loki is dead so we would never know.

I see the same attitude in OS X-only users: instead of pushing Apple to fix its issues, they go and actually purchase Windows XP to run via Boot Camp. More ammo for the Windows/DirectX monopoly, which affects everything.

I just hope I won't be typing "xxxx company, the last OS X native game development company, is dead, so we will never know" two years from now.

Re:Buy a Mac (0)

Anonymous Coward | more than 6 years ago | (#20194389)

How dare you! I'm a hardcore player of Tetris, Bubblets and Solitaire. You Windows users should stop bashing Macs!

Re:Buy a Mac (1)

Ilgaz (86384) | more than 6 years ago | (#20194349)

Seriously.. the new iMac is fun to use an you can put behind Microsoft's psychotic mood swings on its standards forever.
Some games announced by Apple after the Mactel move are actually Windows games using commercial Wine-like frameworks.

If anything ships saying "Intel Only", that's your clue, since there are very powerful PPC Macs out there (e.g. dual-core G5s), so CPU "speed" can't be the excuse.

As a result, they are bound to DirectX policies set by MSFT. Let's say MS is not happy about the exploding Mac/Intel share of the market; they could do a couple of tricks even in the license text, and you would say bye to your next Need for Speed version running under OS X, because DirectX 11 may require running under "pure Windows" without "any kind of emulation". We wouldn't care about it as end users, but billion-dollar corps like Electronic Arts sure would.

This is why people should _still_ support OS X native game development using open standards if they dislike the MS way of doing things. That especially includes Linux people. Once a game uses OpenGL and OpenAL, there is little work left to convert it to a true Linux game.

Such a disappointment (5, Funny)

zdude255 (1013257) | more than 6 years ago | (#20193905)

I'm sure the two developers using DX10 are gonna be pissed.

Re:Such a disappointment (5, Funny)

harry666t (1062422) | more than 6 years ago | (#20194007)

> I'm sure the two developers using DX10 are gonna be pissed.

I have dissociative identity disorder, you insensitive clod!

Re:Such a disappointment (1)

tsjaikdus (940791) | more than 6 years ago | (#20194301)

I never liked Mike and Mitch anyway

M$ fractures the DX10 community! (4, Funny)

someone1234 (830754) | more than 6 years ago | (#20194339)

Yeah, this move surely fractured the DX10 developer community.

Are they TRYING to shoot themselves in the foot!? (1, Interesting)

Anonymous Coward | more than 6 years ago | (#20193921)

So let's recap:

1. Introduce DX10, but only for Vista
2. Gamers buy new DX10-compatible hardware and Vista to play new games
3. Introduce DX10.1, only for Vista, and incompatible with original DX10-compliant hardware
4. ???
5. Shoot self in foot
6. Profit?

Re:Are they TRYING to shoot themselves in the foot (4, Informative)

Macthorpe (960048) | more than 6 years ago | (#20194191)

The summary and the Inquirer article are, well, wrong.

Microsoft announced 10.1 as a side-by-side update - DirectX 10 is not obsolete, they are both fully supported. Developers and manufacturers have the option of coding for 10.1 or sticking with 10. The real quote:

Direct3D 10.1 is an incremental, side-by-side update to Direct3D 10.0 that provides a series of new rendering features that will be available in an upcoming generation of graphics hardware.

Re:Are they TRYING to shoot themselves in the foot (2, Funny)

somersault (912633) | more than 6 years ago | (#20194221)

But it's NEW man! NEW!!! YOU MUST SUPPORT THE NEW!!! :o EVERYONE GO OUT AND BUY VISTA AND DX10.1 COMPATIBLE GRAPHICS CARDS... NEW!!! Everyone is obsolete!1 The world will soon be out of date ._.

Re:Are they TRYING to shoot themselves in the foot (1)

Spikeles (972972) | more than 6 years ago | (#20194459)

No, they aren't wrong. They both spell out the worry. As a game developer you now have four choices: OpenGL, Direct3D 9, Direct3D 10, or Direct3D 10.1. Which one do you choose? Which one has the largest market? Do you choose Direct3D 9 and not be able to take full advantage of the hardware? Do you choose Direct3D 10.1 and hope everyone upgrades to new video cards? Do you choose Direct3D 10 and hope people have Vista? Do you choose OpenGL? Do you choose all of them and spend hundreds of thousands of dollars doing something that may not even turn a profit? I would posit that it's a scary time right now for game studios.

Re:Are they TRYING to shoot themselves in the foot (1)

Macthorpe (960048) | more than 6 years ago | (#20194481)

Let's see. Lost Planet supports both DirectX 9 and 10. I see no reason why games companies can't support more than one at a time. I imagine game studios will be busier, rather than 'scared'.

You can tell Microsoft is ignoring customers.... (2, Interesting)

erareno (1103509) | more than 6 years ago | (#20194227)

Have they EVER heard of Net Promoter Scores (http://netpromoter.com/ [netpromoter.com])? I don't think they have.
Microsoft must be too busy counting their cash to consider consumer satisfaction right now.
All they're doing is getting everyone who uses DirectX to hate them with a passion.

I wonder if they've realized what they've done?

Re:Are they TRYING to shoot themselves in the foot (0)

Anonymous Coward | more than 6 years ago | (#20194277)

>So let's recap:
>4.???
>5.Shoot self in foot

As the saying goes, stupidity cannot be concealed.

Minor version change (2, Insightful)

Z00L00K (682162) | more than 6 years ago | (#20193925)

and major requirement change - so why not call it DirectX 11 instead? Or maybe that's X11?

Anyway, the whole business here seems to be forcing hardware upgrades with one hand and software upgrades with the other, just to make sure the flow of money is ensured. How long will it take until video drivers are Vista-only, just to force an upgrade to Vista?

Re:Minor version change (1)

Yokaze (70883) | more than 6 years ago | (#20194053)

> and major requirement change

It is not a major requirement change because, contrary to the statement of The Inquirer, the previously optional and now mandatory features are provided by NVidia (source [nvidia.com]) and ATI DX10 cards (source [amd.com]).
Both have 32-bit FP unified shaders and 4xAA.

Re:Minor version change (1)

ardor (673957) | more than 6 years ago | (#20194159)

But not 32bit floating point texture filtering.

Re:Minor version change (0)

Anonymous Coward | more than 6 years ago | (#20194081)

Maybe they want to eventually rename it "DirectX X 10.1 Puma", DirectX X 10.2 Jaguar", and so on.

After all, we know how it is Microsoft's delight to innovate.

(Okay okay, admitted that Direct3D has taken the spearhead from OpenGL, but back in the day D3D was an abomination that shouldn't have been created, a burden on Windows developers worldwide. A bona fide case of NIH syndrome and, indeed, vendor lock-in. And even the latest whizz-bang Shader Model 4.0 isn't original innovation -- all the ideas are lifted from RenderMan.

Aww what the heck, I want to see Alan Wake in action already...)

Re:Minor version change (1)

QunaLop (861366) | more than 6 years ago | (#20194397)

please indicate the 'forcing' part of this

Once again, early adopters take it in the shorts.. (4, Interesting)

tech10171968 (955149) | more than 6 years ago | (#20193931)

The article makes it seem as if Microsoft rushed DX10 out before it was truly ready; when you consider that this is what they often seem to do with their OS's, this should probably come as no surprise. Of course, we're seeing this news on the Inquirer, often considered to be a slightly less-than-reliable source of tech news. Maybe I'll reserve judgement until I hear another explanation from some other source.

Since when is DirectX a standard? (4, Insightful)

Dracos (107777) | more than 6 years ago | (#20193977)

Once again, those seven little letters get left out of a "standards" article: d-e f-a-c-t-o.

Re:Since when is DirectX a standard? (5, Funny)

Anonymous Coward | more than 6 years ago | (#20194445)

DirectX is a standard and de facto standards are a subset of standards: the minority that are actually used.

A standard is just a set of rules. If I wrote a blog article "Rules for wiping one's arse", that would be a standard. In the unlikely event it became widely accepted, it would be a de facto standard. If the international community became concerned about global arse-wiping inconsistency, it could ultimately become an ISO standard.

does vista SP1 support Direct X 10.0 ? (1)

Alain Williams (2972) | more than 6 years ago | (#20193979)

Does this mean that moving to SP1 makes old hardware unusable ? So will people be able to upgrade to SP1 and still keep their current hardware and games ?

I also wonder if there is a license change; charge hardware vendors more or make it unusable with FLOSS or something.

Re:does vista SP1 support Direct X 10.0 ? (0)

Anonymous Coward | more than 6 years ago | (#20194043)

Does this mean that moving to SP1 makes old hardware unusable ?
Uh, no.

So will people be able to upgrade to SP1 and still keep their current hardware and games ?
Yes, of course they will.

I also wonder if there is a license change; charge hardware vendors more or make it unusable with FLOSS or something.
How about wondering less and educating yourself more? The changes are clearly documented and readily available. There is no vast MS conspiracy here. It's just a minor revision to their DirectX API. Most APIs receive updates and revisions, you know.

Re:does vista SP1 support Direct X 10.0 ? (1)

Jugalator (259273) | more than 6 years ago | (#20194257)

Does this mean that moving to SP1 makes old hardware unusable ?

Nah..

So will people be able to upgrade to SP1 and still keep their current hardware and games ?

Yep

Why (3, Interesting)

Unixfreak31 (634088) | more than 6 years ago | (#20193989)

Why this sudden change? DX10 has not even caught on with hardcore gamers, let alone the mainstream. Is MS going to make DX 10.2 or 11 so radical that developers have no options as well? If so, I think it's time to move back to OpenGL. No freedom for developers. And I want to be able to set my own AA levels.

Re:Why (1)

fastest fascist (1086001) | more than 6 years ago | (#20194473)

The fact it hasn't caught on yet, and won't for a while, is probably precisely why they're willing to make changes that render current hardware obsolete. I see it as an admission that DX10 will not really be kicking off for another generation or two of graphics cards.

Catchy title but... (5, Insightful)

Taagehornet (984739) | more than 6 years ago | (#20194003)

"Now" is probably an exaggeration, considering that we're talking about Vista SP1.

"Obsolete" ...I guess my DX9 card has been obsolete for a few years now, it still ticks on nicely though. Heck, all my hardware is probably obsolete.

You could sum up TFA in a single line: "Microsoft discusses future extensions to the DirectX API. The current generation of hardware won't support those."

Is anyone really surprised? Is this newsworthy?

Re:Catchy title but... (4, Insightful)

Sycraft-fu (314770) | more than 6 years ago | (#20194171)

You hear about it for a few reasons:

1) Some people (like many on Slashdot) hate MS and want them to fail, thus look for anything that makes them look bad and make sure it gets page time.

2) For some reason, some people had the perception that because DX10 was launched with Vista, that made it special and thus it wouldn't be changed for a long time. Never mind that MS has released a version of DirectX that added a significant feature (as in, something that needs new hardware) every 1-2 years in the past.

3) Perhaps because of this, many people bought into DX10 cards expecting them to be "futureproof". Again, no idea why anyone would think that, given graphics cards are the components that evolve the fastest and thus obsolesce the fastest.

Also, I'm not so sure they said current hardware wouldn't support it. Maybe I misread their slides, but all I saw was that "upcoming hardware" will support it. That statement doesn't mean that current hardware won't.

Either way, much ado about nothing. Games will continue to be made to support whatever hardware is common on the market. Game companies love all the flashy new toys, but they are in business to make money, and you do that by selling games that run on the actual systems that are out there. That means that so long as most people don't have cards capable of using a new standard, games won't require it (though they may support it to give more eye candy to the early adopters).

Heck, right now you'll discover that a great number of games require nothing more than a DirectX 8 accelerator. That's a card like a GeForce4 Ti, for example. Basically that means Shader Model 1.1 hardware. While many games support 2.0 and 3.0 (DX 9.0 and 9.0c respectively), you'll find that a good number don't require 2.0, and very few require 3.0. The reason is that there are still a lot of people using older cards. Not everyone upgrades every year. Thus game makers have to take that into account.

It's not like the second 10.1 comes out developers are going to say "Ok, everyone better upgrade because this is all we support!" They could try, and they'd just go out of business and other, smarter, developers would support the hardware that more people have.

Heck, it is a pretty recent phenomenon that developers have stopped supporting Windows ME for games, and some still do. Why? Enough people still used it.

I'm not really sure this matters all that much (-1, Flamebait)

wamerocity (1106155) | more than 6 years ago | (#20194009)

I haven't purchased Vista and I don't believe I ever will. In my view, Vista is the new Millennium Edition. One reason among about 75 reasons I don't want it is how Microsoft is trying to get people to play games through their Live subscriptions, just like for the Xbox 360. I believe I'm in the minority when I say that online gaming is a stupid feature for me. I prefer to play my games on a laptop at LAN parties with people I know, so I can see their dismay when I blow their heads apart. So, AFAIK there are only 2 Vista-only games now, and both have been cracked to work on XP, so from the hardware perspective it is irrelevant. Thinking back, when DirectX went from 9.0a -> 9.0b -> 9.0c, yes, there were changes, but it didn't affect any of the games I wanted because the games were backwards compatible, so nothing was lost. All this is really going to do is piss off early adopters as well as nVidia and ATI/AMD, because M$ has locked them into a rut where they have to make cards with new specifications for Windows, since M$ said OpenGL would not be supported under Vista.

Re:I'm not really sure this matters all that much (1)

MBMarduk (607040) | more than 6 years ago | (#20194041)

Say what?
Windows Vista DOESN'T support OpenGL? At all?? (serious question)
Breaking news to me, this.

Re:I'm not really sure this matters all that much (0, Troll)

Creepy Crawler (680178) | more than 6 years ago | (#20194089)

OGL calls are forced through DirectX, and are therefore never faster than DX.

Re:I'm not really sure this matters all that much (5, Informative)

Anonymous Coward | more than 6 years ago | (#20194185)

Wow, what a load of FUD. OpenGL is completely supported under Vista and is in no way routed through DX:

http://www.opengl.org/pipeline/article/vol003_9/ [opengl.org]

MOD PARENT UP, GP OP DOWN (1)

sid0 (1062444) | more than 6 years ago | (#20194333)

Next we'll hear Vista eats children.

What a troll (0, Troll)

node159 (636992) | more than 6 years ago | (#20194411)

What a troll

Re:I'm not really sure this matters all that much (2, Informative)

Macthorpe (960048) | more than 6 years ago | (#20194517)

since M$ said OpenGL would not be supported under Vista.
That's odd, seeing as I just finished a fairly long game of City of Heroes on Vista Home Premium.

Pierre Bernard says (3, Funny)

jadin (65295) | more than 6 years ago | (#20194015)

Conan - Are you comfortable and angry Pierre?

Pierre - Comfortable and furious Conan.

Conan - So what are you upset about today?

Pierre - I've been a fan of PC games for ages, Conan. Playing the latest and greatest games requires me to continually upgrade my computer. Recently I upgraded to Windows Vista by Microsoft in order to play their newest game, "Shadowrun". My PC could handle it, although there wasn't much benefit over using Windows XP. It did, however, require a lot more RAM and a faster CPU in order to run smoothly. The game itself required the best video card I could afford. This was a serious investment; the video card alone set me back about the price of a new "non-gaming" PC. All this new hardware also required a bigger power supply, which wound up adding to my expenses. I wound up replacing my entire PC in order to save money. And since I was upgrading for only one game, it was difficult to justify, but I did so knowing my investment would last a year or two. Now Microsoft has announced DirectX 10.1, which makes all DirectX 10 hardware obsolete. That made my investment from a month ago worthless already. To rub salt in my wounds, most of the features of 10.1 were optional and do nothing to improve the product. PC gaming is an enjoyable experience, although an expensive one. Hardware should stay cutting-edge for a minimum of 6 months, and playable, if not the best, for about a year.

Bottom line America? Microsoft needs to realize that features need to be worthwhile and should always be optional. If they are truly worth it, they will be adopted as standard by the general public very quickly.

Conan - Thank you Pierre, I'm sure two or three people across America know exactly what you're feeling like.

Good. (1)

siyavash (677724) | more than 6 years ago | (#20194019)

"Hardware is now obsolete"... jeez, calm down buddy boy... /. is sooo turning into Murdoch-style sensationalist media.

Good, let technology move forward; for once Microsoft actually has the balls to force it on people. I'm hoping for the day they break backward compatibility with old Windows apps too and virtualize them, saying "Take it or leave it".

People should not get upset about this. Computers and the stuff around them have never been like wine (the drink) anyhow. They are more like tomatoes and onions: once you've bought them, they are "old news" and worth nothing. Better eat 'em up!

In short, nothing much to see here, business as usual.

Re:Good. (1)

ardor (673957) | more than 6 years ago | (#20194211)

This is NOT a justifiable move forward. A few specs are now mandatory, and that is enough to render expensive hardware obsolete. Now imagine Game X builds upon D3D10.1. Where is the actual progress?
D3D10.1 would be fine if it had actual new features.

Re:Good. (1)

kaos07 (1113443) | more than 6 years ago | (#20194285)

Just because a few features have been made compulsory by no means renders your 8800GT or 2900 'obsolete'. It will still be able to run every game on the market right now, and probably 95% of them for the next few years (the other 5% being the few made by developers who actually want to embrace DX10.1). So no, your fancy-schmancy hardware is not obsolete, and DX10.1 is not an excuse to dump it in the landfill and purchase another $800 video card.

So DX10.0 Hardware doesn't support 10.1? (3, Insightful)

Val314 (219766) | more than 6 years ago | (#20194067)

How can this be surprising?

You have 10.0 hardware and want it to support 10.1?

Please stop posting such nonsense. Or would you cry foul if your SSE3 CPU didn't support SSE4 when it's available?

Re:So DX10.0 Hardware doesn't support 10.1? (0)

Anonymous Coward | more than 6 years ago | (#20194267)

The point of the article is that all early adopters of the whole Vista/DirectX 10 hype have been royally fucked in the ass by MS. But indeed, how should this be surprising?

Re:So DX10.0 Hardware doesn't support 10.1? (0)

Val314 (219766) | more than 6 years ago | (#20194437)

No, it said "currently available DirectX 10 hardware will not support the upcoming DirectX 10.1"

And that can hardly surprise anyone.

DirectX 9.0b hardware didn't support DirectX 9.0c. There is no reason to believe that, e.g., DX 10.1 hardware will support DX 10.2 (or whatever it will be called then), but I'd bet that some will complain then, too.

Oh no! (4, Informative)

mikkelm (1000451) | more than 6 years ago | (#20194073)

Is that.. is that progress? New technology requiring new hardware?! BURN IT! BURN THE WITCH!

I didn't think I'd live to see the day when new technology would be unwelcome to the Slashdot crowd. I guess it isn't surprising, though, it being a Microsoft product and Slashdot degenerating into a zealot sandbox.

DirectX 10.1 is going to be released about a year after DirectX 10. DirectX 9.0c was released about a year after DirectX 9.0b, and DirectX 9.0b hardware was also incompatible with DirectX 9.0c spec. That didn't create a whole lot of mainstream uproar, as people are generally positive towards new technology. I guess this being Vista and all, people can ignore pesky facts like those and continue their circle jerking unabated.

Re:Oh no! (2, Informative)

ardor (673957) | more than 6 years ago | (#20194141)

The point is that D3D10.1 mainly just enforces stuff that was optional in 10.0. There are no new killer features. So a game requiring 10.1 will make your shiny new 8800 obsolete with absolutely no gain. 9.0b -> 9.0c saw the addition of stream frequencies, among other things, which are essential for instancing (D3D10 redesigned the entire instancing thing again). Also, 9.0c was largely compatible with 9.0b: it was mostly a bugfix release with added samples and a couple of new features (which were optional).

Re:Oh no! (1)

mikkelm (1000451) | more than 6 years ago | (#20194391)

The title of the article is "DirectX 10 Hardware Is Now Obsolete". If you want to talk about the features making it obsolete, you'll be wanting "DirectX 10.1 Ships With No New Noteworthy Features". The fact of the matter is that it's nothing new that new standards supersede older ones, and that's what the summary and the people posting comments are complaining about.

Re:Oh no! (1)

weicco (645927) | more than 6 years ago | (#20194477)

I can still play my old games using my old GF 7900 GTX graphics card even if MS releases DirectX 15.7. And new games won't be going DX 10.1-only any time soon. So there is basically no point. And if, as you put it, DX 10.1 doesn't bring anything new to the table, DX 10.0-compatible cards may already support it.

Re:Oh no! (1)

fastest fascist (1086001) | more than 6 years ago | (#20194515)

If it brings no new killer features, it's a non-issue. Any developer choosing whether to code for DX10 or DX10.1 will be balancing the losses from cutting out the crowd with DirectX 10 cards against the benefits of using DirectX 10.1. If the benefits are nonexistent, then no one will code for DX10.1.

Re:Oh no! (4, Insightful)

DrEldarion (114072) | more than 6 years ago | (#20194291)

I didn't think I'd live to see the day where new technology would be unwelcome to the slashdot crowd.
That's the general trend of Slashdot nowadays. The realization hit me when everyone started bashing the PS3, which contains a very impressive processor, allows installation of linux, has built-in media streaming, uses standard USB and Bluetooth hardware, runs folding@home, upscales DVDs and old games, etc. etc. All anyone here says, though, is "OMG SONY I BET THERE'S A ROOTKIT ON IT LOL".

This isn't a tech site anymore, it's a political site. Witness all the anti-RIAA/MPAA stories, global warming stories, election stories...

Re:Oh no! (3, Insightful)

marcello_dl (667940) | more than 6 years ago | (#20194409)

I don't buy a PS3 exactly because of the rootkit. But I criticized the PS3 mainly because Linux has no access to the whole hardware, the lack of RAM expansion options, and the braindead HD partition scheme. If new tech is crippled because of corporate strategies, don't expect techies (either on Slashdot or elsewhere) to like it.

wow, just wow. (0)

farkus888 (1103903) | more than 6 years ago | (#20194075)

This is just plain amazingly cruel to everyone who gave them the benefit of the doubt and took the risk of being an early adopter. Things like this are the reason Windows has been relegated from my primary OS to a dual-boot with Linux on only one of my 4 computers over the last few years. [The rest run only Linux.]

I've been considering trying out Apple, a MacBook Pro to be specific, in the near future. Being as I don't really care for Digg, though, I don't know that I'd like it.

DirectX 10.1 is irrelevant (0)

Anonymous Coward | more than 6 years ago | (#20194083)

You'd have to be nuts to rely on a technology that can only be used by a minuscule subset of the market, namely people running Vista SP1. Move along people, nothing to see here.

Other Culprits (0)

Anonymous Coward | more than 6 years ago | (#20194127)

The video card industry (i.e. ATI and NVidia) is just as much to blame for this as Microsoft; it's not like MS comes up with random designs and the video card companies follow them without any of their own input making its way into the plans.

Is the developers tipping point reached? (2, Interesting)

bomanbot (980297) | more than 6 years ago | (#20194135)

Developers already have difficulties justifying DirectX 10 support because Vista marketshare is still so low and most gamers are perfectly fine with XP and DirectX 9. Also, DirectX 10 lacks the backwards compatibility of the older versions.

But at least the new unified shaders seemed to be useful for developers, so there were at least some advantages to it. Now, though, DirectX 10.1 only seems to make certain features compulsory, removing choice from developers without adding new features that would make it compelling to use.

So when do developers say "Screw this, DirectX 9 will suffice for the immediate future and works well; we will eschew DirectX 10 and beyond, serve our XP-using customers, and use OpenGL for future development"? Especially since the big advantage DirectX had (until version 9), universal availability on the Windows platform, is gone with DirectX 10 and beyond?

Re:Is the developers tipping point reached? (2, Insightful)

ardor (673957) | more than 6 years ago | (#20194177)

Yes, game developers are conservative nowadays, and always have been regarding support of new APIs. So many studios will continue using D3D9. But for the same reason, many studios still won't switch to OpenGL. In both cases (D3D9 -> D3D10, or D3D9 -> OpenGL 2.x or even the coming 3.x) the codebase has to be largely rewritten, so when studios MUST upgrade, they will probably prefer OpenGL this time...

A Bit Late To Notice? (5, Informative)

Zephiris (788562) | more than 6 years ago | (#20194223)

That DirectX 10.1 is incompatible with 10.0 (along with the new WDDM interface) has been known for at least a year now. It's a bit late [elitebastards.com] for people to be shocked about it.

Slashdot even covered it before [slashdot.org] .

Just because Microsoft officially announced it at a conference doesn't *exactly* make it news: they made it very clear on roadmaps and everywhere else exactly what was going to happen, and why it wasn't the best idea ever to adopt DirectX 10.0 hardware rather than hardware capable of 10.1 (or 10.2) and whatever the new superset of OpenGL happened to be (3.0, as it turns out).

Also, the reason to bother [elitebastards.com] with DirectX 10.1 isn't so much that it offers "brand new super features" to games, but the WDDM 2.1 bits, which allow for far finer-grained context switching and task management. Being able to immediately switch from rendering one small bit to starting to render something else would theoretically let all of the Compiz/Aero type stuff run much more smoothly in conjunction with real 3D rendering (i.e., games, CAD).

It all seems an exercise in futility to me, as far as the "DirectX 10" hardware goes. I like faster, I like more features, but there just seems no real reason to upgrade beyond my Geforce 6800 for the price point (which I got 18 months ago). Not to a 7800-series or comparable, and certainly not to an 8x00 or upcoming 9x00 Geforce, unless driver stability improves dramatically, and they can add more real-world-useful features, particularly without the need for Windows Vista. I'm back using WinXP "for a while" again, but I generally won't buy hardware anymore unless it's a notable and drastic improvement in Windows, Linux, and FreeBSD.

I digress, but the point is, the news has already been covered before. If it apparently wasn't that attention-worthy a year ago, is it now? New DirectX versions *always* require brand new hardware, whereas most minor OpenGL revisions have almost always included new features that also work on old hardware (OpenGL 1.5's Vertex Buffer Objects humming along happily on a GeForce 256, for instance). And while full compliance is best, all you really need to care about is whether something implements certain clearly defined extensions, rather than wondering whether Nvidia or ATI have 'misinterpreted' the specifications, as with DirectX. Both have been panned in the past for 'creative' adoption of pixel shader standards and bizarre interpretations of DirectX 9.

I'd just hope that eventually there's real competition again, that both companies (and new companies) actually respect and care about standards compliance, and that both they and the standards bodies start to care about what customers are actually doing with their hardware.

Cranky old... (1)

swokm (1140623) | more than 6 years ago | (#20194271)

Phbbbtt!! Whatever. Screw DirectX and OpenGL. Where is my real-time ray-tracing graphics card?

The Nvidia Renderman 9000 FTW.

How Bill makes money on the stock market.. (0, Flamebait)

3seas (184403) | more than 6 years ago | (#20194273)

make people need you...make them need to upgrade, while you invest in those who supply the upgrade.

Another nail in the very fat and big coffin of Microsoft....

Mandatory 4xAA is this a joke? (0)

AbRASiON (589899) | more than 6 years ago | (#20194321)

Good lord!

That's ridiculous.
Some people out there (gasp!) actually don't care for AA at all!
I'm a CRT user with a 22" screen. I use it at 1280x960 or 1600x1200, and I can tell you now that in a good many games the resources wasted on AA could be better spent on lighting, textures, etc. I'm very happy with my image quality at that resolution and others.

LCDs are similar: if you run your game at its native resolution, you're likely using 1280x1024, 1680x1050, or 1920x1200. Again, where is AA really 'needed', or needed enough to be mandatory?

All this is going to do is force developers to lower their graphical targets as they are wasting frames on something which should CLEARLY be optional.

If, for example, I did one day find a game looked too blocky, I could up the resolution from 1152 to 1280, or 1280 to 1600, or even 1600 to 2048. And in the case of LCDs, well, most LCDs sold nowadays really are running at quite high resolutions for their sizes as it is (in my opinion).

Oh, and I do realise some people like AA; in fact, I'm sure some people are horrified by my choice, but at least currently it is an option for me.

For some reason this makes me think something more sinister is at hand. Maybe I'm wrong, but I do know the ATI-developed Xbox 360 GPU has 'free' 4xAA due to the way the chip was designed; there's some spare EDRAM on the GPU specifically for this. It's fine if you have one of these chips, but what if you don't? Is this simply some kind of conspiracy? (Unlikely, but worth mentioning.)

I love it: first they release Vista and try to force Shadowrun and Halo 2 to be locked to it, desperate for gamers to use their 'new and improved' (ugly as shit) operating system; now they are trying to force other nonsense on us.
I really, really hope that Sony at least comes to a draw with MS this console round so that gaming isn't totally fucked in the future. If these guys were the sole game operators, I think I'd get a new hobby.* /rant over.

* Sony haters, this does not imply Sony is flawless, this implies competition is good.

Re:Mandatory 4xAA is this a joke? (4, Informative)

Aladrin (926209) | more than 6 years ago | (#20194407)

You're reading it wrong. The -games- aren't required to support 4xAA; the -hardware- is. It's great that you hate AA and all, but there are plenty of others who insist on it. By requiring the hardware to support it in order to be '10.1 compatible', they are merely pandering to the majority of gamers out there.

They haven't forced you to do anything, and they haven't forced developers either.

Re:Mandatory 4xAA is this a joke? (1)

node159 (636992) | more than 6 years ago | (#20194557)

Hear hear, AA is for pansies with too small penises^D^D^D^D^D^D monitors.

Honestly though, AA has always seemed pointless except for those games that max out the FPS no matter what the settings are (in which case a high-res texture pack seems to make much more of an impact anyway).

Why render at 1600x1200 and then display it at 640x480 when you can just display it at 1600x1200? (I know, a rough analogy, but it still stands.) It's one of the features that should get bumped up only once all the others are maxed (including resolution).

Has Microsoft gone mad? (1)

Eternal Annoyance (815010) | more than 6 years ago | (#20194369)

The one and only real advantage of Microsoft is games (there aren't as many games for other OSes as there are for Windows). Now they're making the gamer community angry AND they're making the game development community angry. I can't help but wonder what id Software, Valve, and Blizzard think of this move. Remember, Microsoft: there's a very annoying competitor, called Linux, out there.

The Death of PC Gaming (0)

ErMurazor (799115) | more than 6 years ago | (#20194415)

Is this the final blow to the PC? How can anyone keep up with all these GFX card upgrades? I think both the PS3 and the Xbox 360 will gain from this as more and more developers turn to consoles.

Trust MS, get screwed.... (-1, Troll)

gweihir (88907) | more than 6 years ago | (#20194427)

And not just the users. Developers must be pretty pissed too by now. I hope they all move to OpenGL (which does prettier graphics anyways) and leave this substandard MS technology behind. Of course, then games would run on OS X and Linux as well, with small changes.

IBM then, Microsoft now (1)

erc (38443) | more than 6 years ago | (#20194461)

In a way, Microsoft is trying to emulate IBM when it tried to jam MCA down the throat of the PC world back in the mid-'80s. What happened to IBM then should happen to Microsoft now, too.

DirectX 10 like Vista is skippable (1, Informative)

BillGatesLoveChild (1046184) | more than 6 years ago | (#20194535)

> 'Gamers shouldn't fret too much - 10.1 adds virtually nothing that they will care about and,
> more to the point, adds almost nothing that developers are likely to care about.

Actually it's even better: DirectX 10.0 doesn't add anything you will care about either. Game developers are finding Shader 3.0 (DirectX 9.0c) gives them more than enough to do, so there's no need to move to DirectX 10.0 for quite some time. Now add to that DirectX 10 only running under Vista, because someone at Microsoft marketing thought it'd help Vista sales (it hasn't). Well, why would you bother? Here's an interview with John Carmack (DOOM, Quake) on many things, including why DirectX 10 is a big bore:

http://www.gameinformer.com/News/Story/200701/N07.0109.1737.15034.htm?Page=1 [gameinformer.com]

Known Roadmap (4, Interesting)

Anonymous Coward | more than 6 years ago | (#20194551)

It's funny watching everyone who is shocked. Those are the people who have no idea what DirectX 10 is and why the model has shifted so much from OpenGL and earlier versions of DirectX.

DirectX 10 and up is not just an accelerated video API; it is also a standard. Microsoft has completely eliminated the capability bits, or "capbits", concept in order to guarantee developers that if they program against a specific version of the standard, all of the functionality made mandatory by that standard will be supported by the graphics hardware. No longer will a developer target DirectX 9 or OpenGL 2, have to ask the hardware whether or not it supports a plethora of options, and then have to branch their development umpteen ways to support the different varieties. If a game targets DirectX 10.1, then 4xAA is guaranteed to be there, period. If a game does not require 4xAA, then it doesn't have to target DirectX 10.1.

So get used to it otherwise you'll be shitting yourself for every single DirectX release going forward. This is how it works now.