
DirectX Architect — Consoles as We Know Them Are Gone

ScuttleMonkey posted more than 5 years ago | from the something-about-a-nomad dept.

Games 434

ThinSkin writes "DirectX architect Alex St. John swims against the current and predicts the demise not of PC gaming, but of game consoles, in an exclusive two-part interview at ExtremeTech. In part one, Alex blasts Intel for pushing its inferior onboard graphics technology to OEMs, insists that fighting piracy is the main reason for the existence of gaming consoles, and explains how the convergence of the GPU and the CPU is the next big thing in gaming. Alex continues in part two with more thoughts on retail and 3D games, and discusses in detail why he feels 'Vista blows' and what's to become of DirectX 10."




Go figure... (4, Insightful)

Anonymous Coward | more than 5 years ago | (#22825106)

A DirectX architect says that console games are on the way out, and PC games are coming back. Surprise, surprise.

Re:Go figure... (5, Funny)

Naughty Bob (1004174) | more than 5 years ago | (#22825188)

DirectX architect Alex St. John swims against the current...
He is clearly making his way back to his birthplace, in order to spawn.

His mind is clearly abuzz with hormones; let's not be too cruel.

Re:Go figure... (4, Insightful)

aleph42 (1082389) | more than 6 years ago | (#22825458)

Not that I have any faith in that guy either, but I sure would love PC gaming to win out over consoles.

I mean, consoles really are like cell phones: a product line whose whole logic is consumer lock-in. They sell the console without a profit (the way cell phones are sometimes sold for zero), and make it up on future expenses which you are forced to pay to the same company (through the licensing cost on the games).

What do you get in exchange for that? A PC (complete with hard drive, internet connection, support for USB, etc.), except you can't use it like a PC. If the same games were made for the PC directly, you would simply win on all fronts (even on price; it's true that you save on the console, but you lose that through the lack of competition on games).

The hardware design of the PS3 could be sold as CPUs and GPUs (6 cores - why not, if some games support it?).
I shouldn't have to wait an extra year for GTA4 to be available on PC, only to inevitably find that it's laggy even on recent hardware, being a port.
People who get locked in to a console, only to buy games made for four different consoles and thus completely unoptimised, are being ripped off.

How many players per PC? (1)

tepples (727027) | more than 6 years ago | (#22825584)

If the same games where made for PC directly, you would simply win on all fronts (even on the price; it's true that you save on the console, but you lose that by the lack of competition on games).
Not necessarily. A lot of games for consoles are designed to let four players play on one big screen. (Several genres other than first-person shooters and real-time war sims don't need the screen to be split.) PC games, on the other hand, tend to assume that each player owns a separate computer and a separate 17-inch monitor. This can get cost-prohibitive if multiple players live in one household.

Re:Go figure... (4, Insightful)

Naughty Bob (1004174) | more than 6 years ago | (#22825644)

Though no Urophage, I love my Wii. When I play with my kids (or even drunken buddies), I think back to my C64 roots and lo, I am thankful.

I am not convinced that a PC analog could have replicated, in the given timescale, the user experience there.

I do think that the PC, once fully integrated into everyday entertainment, will compete in this regard, but the console is/has been a vital stepping stone to what is clearly a fun PC-based future.

The main benefit of consoles is supposed to be ease of development. From what I understand, PC game developers are rather hamstrung by the need to factor in the thousands of potential hardware configurations their products might encounter.

I see all of these problems as a consequence of the immaturity of the field, a short-term hassle to be stomached until the way ahead (open, common standards) is clear and obvious to all the major players.

More M$ PR. Is anyone dumb enough to listen? (1, Troll)

twitter (104583) | more than 5 years ago | (#22825322)

No mention is made of how XP driver support was yanked from Vista at the last minute either. Instead, it's the hardware that sucks, yeah, that's the ticket. The hardware you have been using and know is good dies on Vista because Intel slipped some chips on you, see?

What a crock this man is pushing. Is there anyone who's going to believe him? I hope OpenGL and pledges from Intel, Nvidia and ATI/AMD give him nightmares.

Xbox uses DirectX (4, Insightful)

tepples (727027) | more than 6 years ago | (#22825618)

A DirectX architect says that console games are on the way out, and PC games are coming back. Surprise, surprise.
If you're trying to make a "consider the source" argument, please let me remind you that Xbox and Xbox 360 game consoles use DirectX.

Brendan Mitchell Edwards - Interview (1)

Lord Haw Haw (1248410) | more than 6 years ago | (#22825656)

There's a fascinating interview with Edwards, one of the "forgotten founders" of JS. You can download it here [] - it's a 16 MB WMV but seemed to work fine streaming for me. Very interesting.

Re:Go figure... (0, Redundant)

El Lobo (994537) | more than 6 years ago | (#22825782)

Why is that a surprise? Remember that the Xbox uses... surprise: DirectX. DirectX is used on both a PC and a console. The guy is just stating an opinion that you, as a good slashdotter, are trying to over-analyze.

fighting piracy is the main reason (1)

Threni (635302) | more than 5 years ago | (#22825110)

No, it's making money. I used to be more into consoles, even though I've always had a PC, because when I'm not being paid to use computers I don't like struggling with installing software, finding drivers, testing patches, etc. But now that PCs are more competitive on price, and less of a faff to do gaming with, it's less of an issue. Still not ever going to pay more than £100 for a graphics card, though, which usually limits me to either slightly older games or running current games with less eye candy. Still, not my loss.

Re:fighting piracy is the main reason (2, Interesting)

Naughty Bob (1004174) | more than 6 years ago | (#22825494)

fighting piracy is the main reason...
For me, the most insightful part of the first article is where he points out that Warcraft has a new paradigm in DRM: the community. If you construct a game wherein the community is a key aspect of gameplay (and why not? I'd rather frag real people, whose pride will sting with every death, than some dumb bot), you can't then steal the game. Clever.

Competitive on price for how many players? (1)

tepples (727027) | more than 6 years ago | (#22825650)

but now PCs are more competitive on price
In what way? If you want to play a 4-player game with other people who live with you or are visiting your home, you can play a console game or a PC game. To do so on a console, you need one console, one TV, and one copy of the game. To do so on a PC, you need four PCs, four monitors, and four copies of the game, because commercial PC games tend not to support HTPC use cases.

Coming soon... DirectseX (0)

Anonymous Coward | more than 5 years ago | (#22825118)

Consolations as you know them, gone! No more trouble with your joystick -- just 100% pure hard disk action!

Consoles... (5, Insightful)

i_liek_turtles (1110703) | more than 5 years ago | (#22825124)

For gaming, consoles are about as "Just Works" (no Xbox jokes, thanks) as you get. For people who lack computer expertise, but like playing games, how can PCs beat that for the time being?

For games.... (2, Interesting)

iknownuttin (1099999) | more than 6 years ago | (#22825338)

...how can PCs beat that for the time being?

Why should they? What I'm saying is PCs for work and consoles for games. I think it's good that there's a specialty computer for games. That'll relieve some of the pressure on PC makers from having to make these boxes "for everybody". I don't know about you, but most of the graphics capability of my PCs goes unused. And the only reason I can think of is that Intel or whoever designs them that way so that these things "fit all". I'd like an even cheaper motherboard for just business-type applications - I don't need the sound card, super-duper video, etc. for email, web browsing, Word, Excel, or any of the server apps when I'm running Linux on the board.

Re:For games.... (1)

schon (31600) | more than 6 years ago | (#22825404)

...how can PCs beat that for the time being?
Why should they?
Hi there, I'd like to introduce you to this thing called the article. [] Perhaps you might like to read at least the summary, so that you can partake in this discussion without looking like a complete and utter moron.

Re:For games.... (2, Funny)

BlueCollarCamel (884092) | more than 6 years ago | (#22825648)

This your first time to Slashdot?

Re:For games.... (1)

Lemmy Caution (8378) | more than 6 years ago | (#22825556)

I think that there is going to be convergence here, between home PC, DVR/Media Center, and game system. The differences between all of these things are really just interface, as more of the connections become wireless. I can imagine the last "wired bits" being that between CPU/GPU and display. Think of a local "cloud" of interface and display devices.

The problem is one of developing interfaces that make it feel "console simple" to sit down and play a game that is being displayed on your main display (the one in front of your couch) even though the processors used are those used for your home PC, as well.

I give game consoles one more iteration. I expect a PS4, but not a PS5.

...for whose games? (1)

tepples (727027) | more than 6 years ago | (#22825704)

What I'm saying is PCs for work and consoles for games.
For major-label games, you might have a point, but what about independent games?

I think it's good that there's a specialty computer for games.
I think it's bad that smaller developers have historically been excluded from them.

Re:Consoles... (1)

DeadDecoy (877617) | more than 6 years ago | (#22825450)

Not to mention it's probably easier for developers to hack out a bug-free(er?) game on a console, due to a lot of standardization, than it would be to make something compatible with almost every single computer configuration out there.

Re:Consoles... (1)

HappyDrgn (142428) | more than 6 years ago | (#22825858)

For gaming, consoles are about as "Just Works" (no Xbox jokes, thanks) as you get. For people who lack computer expertise, but like playing games, how can PCs beat that for the time being?

It "Just Works" is not just people who lack computer experience. There is a growing group of us that are technical and have abandoned the PC as a gaming platform. For me it came down to two major issues. 1) I could not stand buying hardware every six months just for video games. A top of the line video card here, and some RAM there, for what? So I can play some new fancy game? At some point it just stopped making sense. I don't need an uber powerful computer to do my day to day work related tasks. All I need is a browser, email and a dozen or so terminal windows. For the little time I do spend playing games it's just not worth it. 2) *Mostly* Windows only. I hate it. It started becoming a blocker to doing day to day work. After years of blue screens, random data loss, faulty drivers, the horrible transition to Windows98 I gave up and have never looked back. Games should be fun! Fixing / updating / upgrading a computer is not fun when I just spent all day doing that at work. Yeah, I use the "just works" excuse, but not for lack of experience.

If He Thinks "Vista Blows"... (1)

NeverVotedBush (1041088) | more than 5 years ago | (#22825128)

No argument there, of course, but how does he think game consoles are dead/will die and regular computers will win back the gaming scene, if the savior OS for Windows is so dead in the water?

Something else he is missing is that game consoles have introduced lots of people who aren't computer savvy to gaming. I think they will tend to stick to consoles especially when consoles don't have all the problems with malware and viruses that PCs do.

Re:If He Thinks "Vista Blows"... (2, Interesting)

ScrewMaster (602015) | more than 5 years ago | (#22825210)

I think they will tend to stick to consoles especially when consoles don't have all the problems with malware and viruses that PCs do.

That, I suspect, will change as online gaming becomes ever more popular. Furthermore, if the "convergence" that Microsoft is always harping on comes about (with consoles being used for more and more computer-like functions) you'll see consoles becoming targets as well. Hell, even the handhelds have resident Web browsers and WiFi capability, and probably a metric fuckton of security holes just waiting for the right blackhat to take advantage of them. Gaming systems are sophisticated network-aware computers in their own right, and are regularly being plugged into home networks which also contain PCs and other IP-based devices. That's a potential risk in and of itself, and I'm sure it will eventually be exploited.

Re:If He Thinks "Vista Blows"... (1)

Captain Splendid (673276) | more than 5 years ago | (#22825254)

That, I suspect, will change as online gaming becomes ever more popular.

Change back to PCs, you mean? Doubt it. Even Sony's managed to finally figure out the online angle, and both the Xbox and the Wii already rock in that department. If anything, decent online compatibility on the consoles will swing things completely the other way.

you'll see consoles becoming targets as well.

Even if it does get to be the problem it is on PCs (very doubtful), it won't matter. It's a trivial task to reset a console to factory defaults and get up and gaming again. Your worst case scenario will probably be loss of saved games, and if you're that anal about it, you've already bought a memory card for safety anyway.

Re:If He Thinks "Vista Blows"... (2, Insightful)

ScrewMaster (602015) | more than 6 years ago | (#22825358)

Change back to PCs, you mean?

I specifically used the term "convergence", which is what Microsoft (and Sony) would like to see happen. That's where the "console" turns into an entertainment center and a home computer. IF (and that's a big 'if') that actually happens, you will see consoles become malware targets. Furthermore, if the convergence between PC and console does happen, you'll find that it won't be so easy or desirable to "reset" your console, for much the same reason that "resetting" a PC is such a pain.

Re:If He Thinks "Vista Blows"... (1)

NeverVotedBush (1041088) | more than 6 years ago | (#22825454)

In spite of all of the predictions of that convergence, I think it won't happen for many.

The reason is that PCs will always be good for doing actual work. For that, people will always want a desk, a nice monitor, and an upright chair. A gaming console just does not provide that experience.

I have a nice big screen TV in my living room, a comfy couch, a coffee table, and end tables. There is nothing about that setup conducive to writing reports, coding software, or doing graphics stuff. And I think it is that kind of environment that is best for consoles.

For the very few who only have a desk and monitor, that convergence thing might work.

Re:If He Thinks "Vista Blows"... (1)

darksith69 (812076) | more than 6 years ago | (#22825562)

The point is there is not much reward in hacking into a console. Most of them are read-only devices, with recent exceptions like flash drives or hard disks to either cache games or save user preferences. What good is it for a hacker to 0wn your PS3? Will he sell your Tetris preferences on the black market? Maybe delete them? There's not much personal information there, like documents or Visa credit card information, the way there is on a PC (I guess you could get the cache of the browser if the console stores one, but then again, that's easy to flush after each usage).

What do we have left then? CPU. You could use the CPU of the console for a bot network. The point is, consoles may not be online all the time like a PC might, especially if we consider that most of the market of consoles is for the casual gamer who doesn't spend more than one hour every day playing games. So again, not much to exploit there really.

Maybe hack the firmware upgrade to deploy a trojan? This could get you somewhere, installing a network sniffer for other, more valuable PCs' communications. But then again, not everybody networks their console, especially if people are unaware of the possibility (wow, so you can browse the internet on a TV with the Wii? cool!). Another reduced percentage of the whole market. Plus, firmware may be easier to restore than a real OS on a PC once you know you have a virus; it tends to be a matter of pressing the reset button for longer than usual.

All in all, consoles surely can be hacked, but in general Windowses are so much easier targets that consoles are just like Macs and other minority OSes. Hackers know they exist, but don't pay much attention to them because they are not the low-hanging fruit.


tepples (727027) | more than 6 years ago | (#22825862)

What good is it for a hacker to 0wn your PS3?

Brickers. The new consoles have firmware that updates itself over the Internet. A computer vandal could corrupt the firmware so that the console no longer shows its system menu.

But that's not nearly as profitable as spam. Lots and lots of spam. The consoles of the PS3 generation do a lot more on standby than the previous consoles did. Nintendo even advertises its "WiiConnect24" as a feature of its Wii console: games can install channels that update themselves while the console is sleeping. What if all those sleeping consoles were sending unsolicited advertisements?

Re:If He Thinks "Vista Blows"... (1)

LrdDimwit (1133419) | more than 6 years ago | (#22825624)

People have been trying to make that convergence happen for years with limited-to-no success. But there are a host of reasons the console market is what it is today, and difficulty of piracy is only one aspect. Write-once-run-anywhere is really true for consoles: your game runs on any PS2, XBox 360, NES, whatever console it was originally designed for ... it just works.

What I see as the most overriding factor is the hardware subsidy. The economics of a $300 device are wildly different from a $2500 multipurpose tool. Many people won't spend more. Suppose the average person buys 10 games over the life of the machine, and suppose Sony's cut is $25 per game. Would you recommend Sony charge $700 for the console and $35 for the games, or $450 per PS3 and $60 per game?
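Plugging the comment's hypothetical numbers into a quick check shows why the subsidized scheme wins at retail: the buyer's lifetime cost comes out the same either way, and the subsidy only moves money off the sticker price (a sketch using only the figures quoted above, nothing more):

```python
# Hypothetical figures from the comment above: the average owner buys
# 10 games, and the platform holder's per-game cut is $25 in the
# subsidized scheme.
games_per_owner = 10

# Scheme A: unsubsidized console at $700, games at $35
lifetime_cost_a = 700 + games_per_owner * 35

# Scheme B: subsidized console at $450, games at $60 (the $25/game cut
# recoups the $250 hardware subsidy over 10 games)
lifetime_cost_b = 450 + games_per_owner * 60

print(lifetime_cost_a, lifetime_cost_b)  # 1050 1050
```

Either way the average buyer pays $1050 over the machine's life, but a $450 console is much closer to an impulse buy than a $700 one - which is exactly the marketplace distortion the comment describes.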

This to me is the real advantage of a console. The marketplace is distorted by a subsidy that reduces a console to an impulse buy. (Comparing to a PC, mind.) The machines are more powerful (better game product) and sell much more than they would without this distortion. So game budgets can be higher, since you will have a huge customer base. These kind of hit-driven markets are subject to network effects, and so this kind of draw can have a profound impact.

Re:If He Thinks "Vista Blows"... (1)

TheNetAvenger (624455) | more than 6 years ago | (#22825402)

I think this guy needs to be taken with the grain of salt he deserves.

You realize the whole PC vs Console argument for HIM is because PCs don't have enough DRM to ensure games don't get pirated. His viewpoint on DRM makes Microsoft look like the anti-DRM corporation.

PS Anyone that installed Wild Tangent knows to just run from anything this guy touches. The best thing that happened to Microsoft and DirectX was this guy leaving.

BTW, HE DID NOT WORK ON THE 3D ASPECTS of DirectX as much as he would like people to believe; he worked on DirectDraw, the 2D portion of DirectX. Strange, huh?

Re:If He Thinks "Vista Blows"... (1)

F-3582 (996772) | more than 6 years ago | (#22825602)

And I guess he got severely depressed from the experience that 2D console games always looked better and ran smoother than games developed for his beloved DirectDraw architecture. Looks like a lot of things were left unspoken between those consoles and him. I can literally feel his grudge...

Er, no (-1)

Anonymous Coward | more than 5 years ago | (#22825148)

People buy game consoles because they are specifically designed to play video games. They connect to your television easily, they come with controllers for which the available games were designed in advance, and they do not have loud annoying fans, capacitors that blow easily, easily-tipped-over cases, or a host of other problems a computer can (and often does) have in comparison.* Oh, oops. The above assumes the audience intended for this article is consumers. Ah, but this is Microsoft talking.

They make what? The xbox? Huh.

* The viruses and malware found on computers running the Windows operating system are another example of why a console is preferable to a computer for the average consumer.

"secure" (1)

Cajun Hell (725246) | more than 5 years ago | (#22825152)

You keep using that word. I do not think it means what you think it means.

Re:"secure" (0)

Anonymous Coward | more than 5 years ago | (#22825306)

Secure means not hacked yet.

Why Microsoft Dislikes Intel Graphics (5, Informative)

Bruce Perens (3872) | more than 5 years ago | (#22825154)

Microsoft dislikes Intel graphics because they're publicly documented for full 3D use by Linux and other Free Software. Intel has put a tremendous amount of time into developing X for them, employing many of the key X developers. I use them on a laptop and a desktop, and they work excellently. They are not yet as fast as some other graphics chips. But then again, they are better than anything we had at Pixar when I was there :-) Time flies.


Re:Why Microsoft Dislikes Intel Graphics (0)

Anonymous Coward | more than 5 years ago | (#22825284)

Did you RTFA? This guy WORKED for Microsoft until 1998 (10 years ago). How could you infer from the article Microsoft dislikes Intel graphics?

Re:Why Microsoft Dislikes Intel Graphics (5, Informative)

JanusFury (452699) | more than 6 years ago | (#22825382)

That's interesting, but this article is about someone who doesn't work for Microsoft anymore, and hates Intel graphics chips for the same reason any other game developer hates them: They're utter garbage.

I'll enumerate the primary reasons quickly, since I don't expect you to be intimately familiar with the relationship between graphics programmers and graphics driver developers (it's drastically different from Intel's relationship with the X developers):

1) Intel graphics drivers are possibly the most inconsistent drivers on the market. Any given user with a particular Intel chipset might have one of a hundred different driver configurations, as a result of the fact that the chips are bundled with different motherboards which then come with their own driver package... and when you add pre-built machine vendors into the mix the situation is only worse. If their driver quality was extremely high across the board, this wouldn't be an issue, but...

2) Intel graphics drivers have a bad stability track record, at least on Windows. They have a tendency to return invalid/nonsensical error codes from driver calls that shouldn't be able to fail, or to silently fail out inside a driver call instead of returning the error code they're supposed to... resulting in graphics programmers having to special-case handling of individual Intel graphics chipsets (and even driver revisions). In my case, I ended up just having to shut off entire blocks of my hardware-accelerated pipeline on Intel chipsets and replace them with custom software implementations to avoid the incredible hassle involved in coming up with specific fixes. (The wide variety of chipsets and drivers out there meant that for my particular project - an indie game - it was impossible to ensure that I had worked around every bug a user was likely to hit, so I had to just opt out of hardware accel in problem areas entirely).

3) Intel graphics chipsets have sub-par performance across the board, despite marketing claims otherwise. This is mostly problematic for people developing 'cutting-edge' games software, where it creates a 'he-said-she-said' situation with a game developer/publisher claiming that a user's video chipset is insufficient to run a game while Intel claims the complete opposite. (in most cases, Intel is lying.) This is particularly troublesome in areas like support for cutting-edge shader technology, where an Intel chipset may 'support' a feature like Pixel Shader Model 3.0 but implement it in such a way to make it completely unusable. Users don't benefit from this, and neither do developers.

4) Intel graphics chipsets harm the add-on graphics market by discouraging users from picking up a (significantly better) bargain video card from NVidia/ATI for $50 and dropping it into their machine. This hurts everyone because even though that bargain card is significantly better (and most likely more reliable), the user already 'paid' for the integrated chipset on their motherboard, and the documentation that comes with it attempts to make them believe that they don't need a video card. I consider this a dramatic step backward compared to the situation years ago, when integrated graphics chipsets were unheard of and people instead had the option of 'bargain 2d' video cards like Trident or Matrox that would do everything needed for desktop 2D, but also had the option of fairly affordable 3D accelerator cards if they wanted to play games occasionally.
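The special-casing described in point 2 amounts to a vendor check at startup. A minimal sketch of the idea (a hypothetical illustration, not the poster's actual code; in a real engine the vendor string would come from something like glGetString(GL_VENDOR) or the Direct3D adapter description):

```python
# Hypothetical sketch of the workaround described above: detect a
# problematic chipset from the driver's reported vendor string and
# opt out of the hardware-accelerated path entirely, rather than
# trying to special-case every buggy driver revision.
def choose_render_path(vendor_string: str) -> str:
    """Return 'hardware' or 'software' based on the GPU vendor string."""
    problematic_vendors = ("intel",)  # chipsets to blanket-disable
    if any(v in vendor_string.lower() for v in problematic_vendors):
        # Replace shaky hardware paths with a software implementation.
        return "software"
    return "hardware"

print(choose_render_path("Intel Corporation"))   # software
print(choose_render_path("NVIDIA Corporation"))  # hardware
```

The blanket fallback trades performance on those chipsets for predictability, which is exactly the calculation the poster describes making for an indie game that can't be tested against every driver revision in the wild.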

On the bright side, most integrated ATI/NVidia GPUs these days are mature enough to be able to run games acceptably and meet the needs of a typical user. The only thing really holding the market back here, in my opinion, is Intel's insistence on marketing inferior products instead of partnering with ATI or NVidia to please their customers.

Of course, this is unrelated to your point that their Linux/Free Software support is superb, as is their documentation - I'm inclined to agree with you here, but it unfortunately doesn't do much to outweigh their other grievous sins.

Re:Why Microsoft Dislikes Intel Graphics (1, Funny)

Anonymous Coward | more than 6 years ago | (#22825442)

Your mistake is in thinking Bruce read the article, or cared. He saw "Microsoft" and hit reply.

Re:Why Microsoft Dislikes Intel Graphics (1)

Bruce Perens (3872) | more than 6 years ago | (#22825646)

Hi AC,

Your mistake is in thinking Bruce read the article, or cared. He saw "Microsoft" and hit reply.

I did read the part about Intel, and looked for why he thought Vista sucked. I am assuming that part of what he's complaining about could be Microsoft-implementation-specific. Others have stepped up to explain the relationship between DirectX developers and MS better.


Re:Why Microsoft Dislikes Intel Graphics (1)

TheNetAvenger (624455) | more than 6 years ago | (#22825598)

Don't forget that several generations of Intel GPU chipsets use CPU operations to function fully.

Remember the big Intel argument for the 915 chipset to be 'Vista Ready', promising Aero (WDDM) drivers, and then realizing there was technically no way to implement this, since the 915 chipset shoved too many of the DirectX features through the CPU/SSE instead of handling the operations natively.

Intel's bastard stepchildren, from the 915 to the current X3100, STILL shove operations through the CPU and use SSE optimization to try to get real GPU performance, and then people wonder why their gaming performance with an Intel-based video card is 10x slower than the cheapest native GPU technology from ATI and NVidia.

To the GP...
I'm sure Intel does give good X support, but this is because they are a business-class company providing business-class GPU products that perform horribly outside of an X-protocol bitmap world - you know, like with gaming...

Anyone who would argue that Intel GPUs are good for gaming, or only perceived as bad because they have good X support and MS doesn't like that, is 'really' digging for a reason to hate MS beyond the normal Slashdot kneejerk response. GP, Microsoft's dislike of the Intel GPU has nothing to do with X support and everything to do with how much the Intel GPU technologies SUCK!

I can't believe someone would really be out to defend Intel's graphics technology; this is just a bit scary...

Great explanation! (4, Interesting)

Bruce Perens (3872) | more than 6 years ago | (#22825746)

Thanks. This makes more sense now. It is radically different from our experience on Linux, though. I once came to the Intel X developers with a rather obtuse problem in the i965 driver that made it run at half speed. It turned out to be related to the MTRRs (memory type range registers) and a BIOS bug. Believe it or not, the problem is activated by a BIOS fan setting!

Now, on the mailing list for this driver, I immediately got access to the lead developers. OK, they knew I was Bruce, but it looked like they were treating all callers the same way. They connected me with Intel BIOS programmers, etc.

Now, imagine having this problem in the Windows world. You would be routed to a call-center employee in India who would go through a script with you.

I am using the same driver with i915 in an old Sony laptop and i965 in a new duo motherboard. Both seem to work fine. I don't know how much lower-level DirectX is than GL.


Re:Why Microsoft Dislikes Intel Graphics (1)

twitter (104583) | more than 6 years ago | (#22825400)

Does Intel make aftermarket cards yet? I'd really like to have one or two.

Re:Why Microsoft Dislikes Intel Graphics (1)

Hatta (162192) | more than 6 years ago | (#22825412)

How many games do you play on your intel 3d accelerator?

Re:Why Microsoft Dislikes Intel Graphics (2, Interesting)

Bruce Perens (3872) | more than 6 years ago | (#22825796)

How many games do you play on your intel 3d accelerator?

My 8-year-old son and I play FlightGear. We have two 1280x1024 monitors, both displaying different rectangles of the same graphics plane, and we sometimes pull the window wide so that it displays across both screens at around 2500x1000. The driver still delivers full acceleration when we do that. It gets about 14 frames per second in 2500x1000 mode. We have the CH yoke, pedals, and quadrant. We've played some of the other OpenGL games that come with Debian.


What about Vista? (0)

Anonymous Coward | more than 6 years ago | (#22825418)

> In part one, Alex blasts Intel for pushing its inferior onboard graphics technology to OEMs

Remember, this also hurt them when Microsoft rewrote the "Vista Capable" specs to include some of the low-end chipsets. Intel's chips may not be very powerful, but the fact that they're so well documented makes them a lot more useful to free software people.

I like Intel G35 on board (0, Offtopic)

Mike Zilva (785109) | more than 6 years ago | (#22825772)

Recently bought an Asus P5E-VM HDMI and I'm happy running Ubuntu 8.04 beta (with an E8200@3.2GHz Intel CPU); it's MUCH faster at everything (including Google Earth's OpenGL at 1600x1200) than my previous system with an Athlon XP 3000+ and an nVidia FX5200.

I just chose this board because Intel released specs and open documentation for programming the graphics processor; unfortunately there is currently no separate graphics card from Intel.

I'm also planning to buy an Eee PC, which uses integrated Intel graphics and seems to be more than adequate even for Xgl.

The appeal of console gaming (4, Insightful)

Johnny Fusion (658094) | more than 5 years ago | (#22825180)

I use a console when I want to step away from the computer. Console games have some advantages over computer games; for one, you never have to check the system requirements.

As to the demise, lots of people (me included) are still playing vintage game consoles. Heck, I've got an Atari paddle set that works off AA batteries that I still play. But perhaps that says more about the timelessness of Breakout and Pong than about consoles...

WildTangent has been a dead end since 2001 (3, Informative)

Qbertino (265505) | more than 5 years ago | (#22825226)

WildTangent actually gained some attention back in 2001, when they offered a web 3D plugin and a dev environment that didn't cost a bazillion dollars. They dragged their heels, only kept offering their platform for Windows, and basically ignored the opinion leaders in multimedia and VM-based gaming & 3D. WildTangent today is next to insignificant, and their 'Orb' VM console (which afaict only runs on MS OSes) is nothing but a pimped WildTangent Plugin/Player and won't gain any traction beyond some niche group that wants to play a console game on the PC. For whatever reasons there may be.

Bottom line: Nothing to see here, move along.

well, I don't know, but (1)

rucs_hack (784150) | more than 5 years ago | (#22825234)

Shit, anyone who commissions a huge model penis for the initial launch of his product can't be all bad.

Seriously, check your history, am I right or am I right?

St John is under the delusion that (2, Insightful)

joeflies (529536) | more than 5 years ago | (#22825280)

OEM video is for gamers in the first place. OEM video is just fine for what it is - people who use computers at work on office documents, presentations, and web browsing.

No matter what GPU is on the on-board video, it won't be enough for gamers.

Re:St John is under the delusion that (1)

Koiu Lpoi (632570) | more than 6 years ago | (#22825730)

OEM video is for gamers in the first place. OEM video is just fine for what it is - people who use computers at work on office documents, presentations, and web browsing.

Ever used a laptop? There's a plethora of people (college students and young working adults, mostly) who would love to play a 3d game, but can't because their 1300 dollar laptop has an Intel chip in it.

No matter what GPU is on the on-board video, it won't be enough for gamers.

You're mistaking "gamers" for those people on the cutting edge who actually spent money on SLI video cards. Enthusiasts. Gamers just want to play games. For instance, I'm a gamer. My laptop has a GeForce 8600M GT in it. It can play Team Fortress 2, cranked. It's enough for me, and I suspect, far more than enough for those people who would love to do anything in 3D but can't.

And guess how much this laptop cost? 1300 USD. OEM video can exist in a form that doesn't suck, and without too much extra cost.

Re:St John is under the delusion that (1)

Bill, Shooter of Bul (629286) | more than 6 years ago | (#22825762)

No, he's just wishing that there were more systems that would play newer games. The more computers that are put out there that can't run games released two years ago, the fewer potential customers of pc games there are. He's arguing that if there were more gamer level pcs, there would be more gamers.

Washes self with holy water to remove the stain of having used the word gamer. shudder

infinium phantom (2, Funny)

fred fleenblat (463628) | more than 5 years ago | (#22825294)

That worked out well.

Popular comments (1)

kramulous (977841) | more than 5 years ago | (#22825296)

convergence of the GPU and the CPU is the next big thing
Wow! That's really insightful. Given that I've clocked my GPU at 182 GFlop/s for FFTs, who would have known this was going to happen?

insists that fighting piracy is the main reason for the existence of gaming consoles
Honestly, can you blame the console makers for this? I had my old hacked Xbox and was pirating content. I have my 360 and have bought plenty of content for it. The game makers are getting their money, pushing newer, more advanced games, and will require more advanced hardware. Making

pushing its inferior onboard graphics technology to OEMs
future redundant

Piracy? (1, Interesting)

Anonymous Coward | more than 5 years ago | (#22825316)

You're having a laugh. The two top-selling current game units, the Wii and the NDS, are both pirate magnets and trivial to mod accordingly. Nintendo left the Wii open to get the masses buying the boxes, knowing most people will get them modded and download "backup" games; the NDS requires nothing more than a plug-in cartridge. The 360 is a bit more difficult to mod, but the full library is available. Clearly the big-selling consoles are not doing anything against piracy. Whereas the PS3, still a long way from being hacked, doesn't sell as well. Go figure.

Re:Piracy? (2, Informative)

Jeff DeMaagd (2015) | more than 6 years ago | (#22825738)

Whereas the PS3, still a long way from being hacked, doesn't sell as well. Go figure.

First, correlation is not causation.

Second, NPD showed that PS3 has been outselling 360 in Jan '08 and Feb '08.

Overexcited (1)

SnoopJeDi (859765) | more than 5 years ago | (#22825318)

He's obviously very very excited about this glorious WildTangent Orb business, which I (as a somewhat-in-the-know gamer) have never heard of. Ever.

I gave up around the time he started talking about booting up an HP or Toshiba or Gateway and doing something with Orb. I was just getting nothing out of this article.

Curious question though. As far as I knew, the 'future' of gaming is all about more specialization in chips. He's talking about merging the GPU and CPU, but the big things I keep hearing about include more specialization (PPU and PhysX anybody?). What gives?

This guy is on crack (5, Insightful)

SilverBlade2k (1005695) | more than 6 years ago | (#22825346)

Console gaming will eventually kill PC gaming. It is cheaper for developers, since they don't have to make the game work on 20 million PC configurations, only one console configuration. Plus, consumers have to spend a fortune to upgrade their systems to play the newest games. Even some video cards alone are pricier than a whole console system.

Re:This guy is on crack (1)

WilyCoder (736280) | more than 6 years ago | (#22825472)

As a graphics programmer, I can back up your statement with lots of headaches caused solely by the bajillion (technical term) different hardware configurations out there. You'd think following the standards would be enough, but then you'd be shortsighted...

Re:This guy is on crack (0, Flamebait)

Kenoli (934612) | more than 6 years ago | (#22825526)

Ever heard of DirectX? How about SDL?

P.S. Not all new games require cutting-edge technology to play. Some do, but I imagine they are at a severe disadvantage because, as you said, that stuff can be expensive.

Re:This guy is on crack (1)

tepples (727027) | more than 6 years ago | (#22825776)

Ever heard of DirectX?
Just because OpenGL or DirectX graphics exists doesn't mean that every video driver that claims to conform to the specification actually conforms in a useful way, especially Intel drivers.

Microstudios (2, Interesting)

tepples (727027) | more than 6 years ago | (#22825744)

Console gaming will eventually kill PC gaming. It is cheaper for developers since they don't have to make the game to work on 20 million PC configurations, only 1 console configuration.
A lot of microstudios develop for PC because they are too small to qualify for console development licenses. What do you suggest for them?

Netcraft confirms it, PC/console gaming are dead.. (1)

analog_line (465182) | more than 6 years ago | (#22825348)

Break out the board games ladies and gentlemen.

Why consoles will win (2, Insightful)

Kohath (38547) | more than 6 years ago | (#22825354)

Consoles are winning and will eventually win. The reason is simple:

Updating your video driver (or other drivers) is not a fun part of gaming. But for PC games, it's usually the first level you have to play.

Now that consoles have comparable graphics and sound to a mid-level PC, there's little advantage to using a PC over a console for games. And there are often large disadvantages.

Re:Why consoles will win (1)

dreamchaser (49529) | more than 6 years ago | (#22825380)

Consoles will have similar graphics and sound capabilities to PCs for about six months to a year. Neither platform is going to go away anytime soon.

Re:Why consoles will win (1)

Kohath (38547) | more than 6 years ago | (#22825446)

Games take 2-3 years to develop. And there'll be another new generation of consoles in a few years.

Also, there's a price consideration. It doesn't matter, on average, what the new graphics cards can do. It matters what the $70 graphics card can do. It matters what the reasonably-priced laptop graphics systems can do.

Your post underscores another big problem with PC gaming -- the compulsion to upgrade your system every six months or every year. I paid a lot for my console, but it'll save me 5 times over in frustrating PC upgrades.

PC gaming won't be going away, but it will shrink until it occupies a mostly-MMO niche. It's already well on the way there.

Re:Why consoles will win (1)

F-3582 (996772) | more than 6 years ago | (#22825780)

If you looked at the launch titles and measured the time PC games took to achieve similar results, you might be correct. But that would be a somewhat unfair comparison. Console games are evolving as well, and therefore you should take landmark titles for this comparison.

For example, take Conker: Live and Reloaded. When did PC games start to look like that awesome title? Or God of War. Metroid Prime is a good example, too. Final Fantasy XII or Shadow of the Colossus probably not, because they had pretty low frame rates. Anyway, your estimate can surely be extended quite a lot.

Re:Why consoles will win (2, Interesting)

amn108 (1231606) | more than 6 years ago | (#22825420)

People who, for one reason or another, like or know enough to do a driver update without smashing their machine to pieces will always prefer PCs, because PCs were and will remain the bleeding edge of the hardware that drives all these games today. It is perhaps appropriate to call the whole of PC gaming a sort of testing grounds for the future of gaming, and every 5 years or so, some manufacturer or another (MS, Sony, Nintendo at this time) decides to cement the testing grounds into a stable, non-volatile gaming platform that one can own for more than a year and play games on without thinking about, at the least, graphics driver updates. Nevertheless, the testing grounds that PCs are will remain, because there is a purpose to them. Another advantage is that since it is all testing, it is all bleeding edge, and most hardcore gamers breathe bleeding edge. Ever seen a 15-year-old who knows everything about NVidia's roadmap for two years ahead, yet has hardly ever been intimate with a female? I have.

You just can't expect computers to die as a gaming platform. No matter how nice it is to have a non-changing console development platform that you don't have to update drivers for, and with which you can just have fun developing games without worrying about drivers, funky crashes, version conflicts, etc., it is still not an option to expect the gaming hardware market (which, as most historians of the field know, kick-started and fueled 3D mathematics and algorithms, plus GPU design, since about 1995 with the advent of the 3dfx Voodoo, Riva, and Rage chips) to freeze every 5 years so that little kids can play the shiny little white PSx that sits under their TV.

It is simply two parallel markets, and the only thing they share is the game industry.

Re:Why consoles will win (1)

Kohath (38547) | more than 6 years ago | (#22825554)

But the game developers don't see it that way. They are making games that perform about the same across 2-3 platforms, including consoles and medium-end PCs. They don't make bleeding-edge games because the bulk of the game-buying public can't afford the computer needed to play a bleeding-edge game.

Try listing the bleeding-edge games. I'll start: Crysis. Are there any more?

High production-values games need to sell a LOT of copies to make money. And you just can't sell that many to guys who have $400 graphics cards. The money and the games are on consoles and low to medium end PCs, including laptops.

It's not a performance race any more. Gaming is a mass market now.

Re:Why consoles will win (1)

halycon404 (1101109) | more than 6 years ago | (#22825754)

That's nothing new -- only one bleeding-edge game to push the market forward. The last time I went through a serious hardware change was Doom 3, summer 2004. I updated everything to top of the line to play that game, because to get the most out of it, you had to. And for the last 3 years, I was happy with it. It wasn't bleeding edge anymore and it was starting to show its age, but it played everything acceptably well without downgrading below the default setup. Then Crysis came out, and I've spent a few grand upgrading again. For the next year, it'll play nearly everything at max graphics, and for the next 2 years at default graphics levels. At the end of that time I'll have to upgrade again, because some new game that pushes the bar higher will come out. It's the nature of the business: every 2-3 years, you upgrade.

Re:Why consoles will win (5, Insightful)

lycono (173768) | more than 6 years ago | (#22825590)

I happen to like FPS games. I also happen to hate FPS games on consoles because I much prefer using a mouse over a joystick to aim. Chalk it up to my inability to learn how to use the console controller correctly or chalk it up to the inadequacy of the controller for these kinds of games. Either way, I still prefer playing with a mouse. This is a huge reason I don't play many console games.

Re:Why consoles will win (1)

SanityInAnarchy (655584) | more than 6 years ago | (#22825652)

Show me where I can buy World of Warcraft for a console.

Oh, and while I don't often bring this up, updating my drivers is:

apt-get update
apt-get dist-upgrade

That's all drivers, on the entire system. On OS X, it's even simpler: When your computer asks you if you want to update now, type your password and click OK. When it finishes, it asks you to reboot; click "reboot".

Only on Windows is this kind of thing a chore.

Re:Why consoles will win (1)

Jeff DeMaagd (2015) | more than 6 years ago | (#22825756)

I'd say that WoW is an outlier, it's a popular game but it's just one game.

Re:Why consoles will win (1)

Kohath (38547) | more than 6 years ago | (#22825766)

Arguing exceptions like MMOs, Linux machines and Macs doesn't really advance things.

The point is that consoles are going to take more and more of the game market share away from PCs. The point was NOT that PCs would eventually stop being used for games.

Consoles have the upper hand now. That will continue for at least 2 more years. PCs won't catch up and take over like the Wild Tangent guy says. He's wrong -- for the foreseeable future.

Re:Why consoles will win (1)

stone2020 (123807) | more than 6 years ago | (#22825728)

Except consoles are becoming more like PCs. Don't most of the consoles have system updates now that are no different from updating video card drivers?

Lockout chip business model (3, Insightful)

tepples (727027) | more than 6 years ago | (#22825788)

there's little advantage to using a PC over a console for games.
Other than the fact that PC users can download and run games released as free software, freeware, or shareware, produced by any developer with a copy of Windows and a copy of GCC? Consoles such as the Wii are restricted to developers that are established businesses with actual office space (see for details), and the game cannot include copylefted free software because the console makers outright refuse to allow developers to provide Installation Information.

Re:Lockout chip business model (1)

Kohath (38547) | more than 6 years ago | (#22825848)

And this is a big selling point to a very tiny fraction of the public with an even tinier fraction of game-buying dollars.

Consoles won't be your choice if you want your games to be "free". They are a good option for folks who prioritize fun in their gaming though.

Agreed (1)

rsilvergun (571051) | more than 6 years ago | (#22825794)

I just tried to install Peter Jackson's King Kong on my PC, only to yank it off because the StarForce copy protection causes all sorts of problems. Plus, I don't like having to upgrade my hardware every 2 years to keep playing. I'm basically lazy, and I like just popping in the disc and playing a damn game.

Just bought a console (4, Insightful)

chicago_scott (458445) | more than 6 years ago | (#22825388)

I just recently bought a console. The main reason was that I was tired of needing to buy a new graphics card every year in order to display the best graphics and have the best performance for the newest games, and the only reason I needed to upgrade was for games. I did this when I went from PCI to AGP many years ago, thereby needing to buy a new motherboard, new processor, memory, etc. (I have also upgraded the motherboard several times since then in order to have a faster processor and memory.)

I didn't want to do that again in order to upgrade to PCI-E, so I bought a 360 console for less than half the price, and I don't intend to upgrade my PC again for at least two or three years. I think a 3.2 GHz processor and 2 GB of memory will be fine for software development for at least that long.

I also wanted to play games on a large screen and not have to sit in the same chair where I work all day when I'm relaxing.

Intel's crappy chip (0)

Anonymous Coward | more than 6 years ago | (#22825392)

Intel may have a "crappy graphics chip" for games, but I bought a new computer a few months ago, and after spending a lot of time reading about hardware so I could make sure I was getting the parts that were right for me, Intel graphics was the first thing I knew I wanted. Crap or not, if I wanted free drivers, that was the best choice. ATI had just started with their open source thing, but I needed something that worked right away, not sometime in the future. The next computer I buy might have an ATI card, but who knows; for now I'm very happy with what I have, and I reckon I'll keep it for a long time.

It sure beats the ATI card that I had in my old computer; I had trouble switching from Windows to GNU/Linux because everything felt so sluggish. And as if that wasn't enough, I remember having to install proprietary drivers for the motherboard, which was a real PITA. I'll never again buy another piece of hardware that doesn't come with good free drivers.

"Vista Blows" (1)

ObjetDart (700355) | more than 6 years ago | (#22825460)

I read the whole portion of the interview under the "Vista Blows" link in the summary.

Unless I missed something, nowhere did he even explain why Vista blows, other than a vague reference to DirectX 10 being "bloated". I would have sure appreciated just a little more elaboration here.

Not in my house (1)

Tsiangkun (746511) | more than 6 years ago | (#22825464)

Computer gaming is always going to slant the playing field in favor of the gamer with the biggest budget.

I predict this guy is wrong, if only because some of us don't care to perpetually upgrade a machine so we can play games with our friends.

Console end of life (1)

tepples (727027) | more than 6 years ago | (#22825800)

I predict this guy is wrong, if only because some of us don't care to perpetually upgrade a machine so we can play games with our friends.
Nintendo doesn't publish (or license third parties to publish) GameCube games anymore. So you still have to upgrade in order to play new games with your friends when they visit your home, and that'll be $400 for a Wii on eBay.

OK, some facts now... (4, Interesting)

TheNetAvenger (624455) | more than 6 years ago | (#22825514)

1) He claims to be a 3D expert, but for some reason he only worked on the 2D aspects of DirectX while he was at Microsoft. (DirectDraw, etc)

2) His current software and games are very much NOT 3D, so he is commenting on the 3D market why again?

3) His argument about PCs not being good gaming platforms is that they don't contain enough DRM? Truly, go back and read it again. What the hell does he want, a gun pointed at people's faces if their mouse gets near the rip or copy button?

4) Throughout the article they keep talking about WildTangent Orb, which is a program that competes DIRECTLY with Windows Vista, Windows Marketplace, and Games for Windows in rating games based on system performance and providing a consistent expectation for the gamer.

5) WildTangent, huh... Anyone who has installed this software, or has removed it from a friend's computer, would shudder to think that this guy has any insight when it comes to programming at all, let alone 3D gaming. (WildTangent is borderline spyware, and the games are kludges, slow, etc.)

6) He thinks DirectX is bad and Vista is bad, but argues that they're the best that can be done with 3D gaming. Hmm..

7) He talks about the DirectX hardware abstraction levels and implies DirectX 10 is further from the hardware than previous versions. This is really, really inaccurate, as DirectX 10 even opens a new direct pipeline for shoving calculations and physics to the GPU. The only place DirectX 10 is 'further' from the hardware is the removal of DirectSound, but this has been replaced in 10.1 with a new hardware layer that is compatible with the new Vista sound subsystem. This stuff makes me think the guy is insane, has a chip on his shoulder, or both.

8) He argues that current 3D technology is tricks, but raytracing is real 3D? Um, raytracing is also freaking tricks, especially if you work to get any performance out of it. (And this is just in studio-level rendering we are talking about, let alone gaming.) Moving raytracing to games or adding it to current 3D technologies would be great, but it is going to take more 'tricks' for good performance and STILL WILL NOT BE REAL 3D, any more than current gaming technologies are. He is an expert and yet doesn't understand this? Holy cow...

9) The only thing I can agree with in the article is the portion about onboard video being a bane of the gaming industry, and Intel being a horrible proponent of bad entry-level 3D chipsets that can't even run Flight Sim 98, let alone a current game at more than 15fps.

Re:OK, some facts now... (1)

n0dna (939092) | more than 6 years ago | (#22825870)

Alex St. John is the single reason I stopped reading CPU Magazine.

He has exactly 2 articles that he resubmits monthly. The first one is about how stupid Microsoft is about everything, and how if they would just call and ask him, he'd explain it to them. The second one is about how awesome WildTangent is, and how confused he is that there isn't a single person on earth smart enough to realize this but him -- oh, and also that it's not really spyware, despite the data it phones home.

Occasionally he'll get cornered in an OpEd and have to say "Really? I'm sure I write about other stuff."

He may be a blow-hard, but at least he's irrelevant.

PC and Consoles, not PC vs Consoles (1)

Enderandrew (866215) | more than 6 years ago | (#22825532)

Each can coexist and have their own niche, and perhaps that is the way it should be. Some games you absolutely need a keyboard for. That being said, I had zero interest in reading TFA until I saw that he admits "Vista blows".

The article sounded interesting (0)

Anonymous Coward | more than 6 years ago | (#22825536)

and then I found out this guy did WildTangent, after which I immediately closed the window. WildTangent, if I remember correctly, only did really crappy online games that bundled spyware, and one of their most popular products was some computer-animated stripper screensaver, or something similar.

Anything he has to say isn't worth listening to.

And here's why we need raytracing... (3, Interesting)

argent (18001) | more than 6 years ago | (#22825582)

And so, what you see, is one of the reasons that games that have 40 million dollar budgets and that too close to 80 percent of the cost of the game is art now, is that art replaces, or fakes, the absence of good 3D or realistic 3D and physics. Because instead of having a realistic interaction with the [game] world, what I do instead is create a lot more animations. For every possible scenario in the game....
This is why we need real-time raytracing and real-time physics.

Getting great graphics from the next generation of raster engines is going to cost even more. Sure, you can sit there and micromanage every goddamn thing on the screen and get graphics that look good enough that you can't tell them from optically correct rendering at a glance. But that costs you five times as much as building a model, telling the graphics engine to render it, and letting the software figure out where you need shadows and highlights and bloom.

The other side of this is the Myst problem. Remember Myst? Remember how you could only go where they'd rendered the scenes? Now in many modern games, guess what, you can only go where they've prepared the scenes. You can't even walk across a flowerbed and around the back of the tavern, because they haven't prepared the back of the tavern. You get puzzles that involve figuring out what rope to grab to climb up a 45-degree slope, and if they haven't decided that you're going to be able to climb that slope, you can't... even if you've got elf boots and a magic rope.

Why? Because it's so damned expensive to get them looking good.

Let the computer do the stuff that we know how to make a computer do... simulation... and let the humans worry about making the simulation fun.

Re:And here's why we need raytracing... (1)

Koiu Lpoi (632570) | more than 6 years ago | (#22825700)

A lot of people like the straightforward, game-tells-you-where-to-go style gameplay. See Final Fantasy and its popularity.

For you, try Morrowind.

input devices or online community (2, Interesting)

m0llusk (789903) | more than 6 years ago | (#22825588)

How about some improved software? Why do NPCs in supposedly advanced games often just stand around or walk back and forth continuously for the entire game? When are simulated game realities going to become interesting enough that interacting with virtual elements is as interesting as shooting them?

Short version (1)

3vi1 (544505) | more than 6 years ago | (#22825632)

This guy's on crack. Nothing will change from the way it's been the last decade or so. There will always be console gaming for the economics/simplicity factor, and there will always be PC gaming where the latest 3D card blows consoles away... at the expense of economics/simplicity.


The future of gaming is simple (1)

MickDownUnder (627418) | more than 6 years ago | (#22825658)

Simplicity is beautiful.

10 years from now the biggest gaming platform will be the mobile phone.

As long as pcs have free online play and user mods (3, Insightful)

Joe The Dragon (967727) | more than 6 years ago | (#22825694)

As long as PCs have free online play, and user mods and maps that are free, consoles will still be behind.

There are some paid-for mods on the consoles, but they are not the same as the free stuff on the PC.

Also, who would want to pay for Live and for the game, as well as paying a monthly fee, for something like WoW?

There are also a lot of cool free and open PC games that will never be on consoles.

There are also games that work better with a mouse, and mice are not used that much on consoles.

Gamers also like to use the web and other stuff on the same system that they game on.

Hardware and more... (2, Interesting)

toejam13 (958243) | more than 6 years ago | (#22825714)

FTA: The first one is that, from many points of view, Microsoft and Intel come from an enterprise background. They're enterprise-centric. So in many respects the consumer market, from their point of view, is an after market for stuff really designed for the enterprise

This is because enterprise customers have a higher rate of legitimate purchases than home consumers (what is the rate of Windows piracy in China and India?). Furthermore, while enterprise customers may receive deeper discounts on their bulk OEM licenses than home consumers, they counter that by purchasing more lucrative packages (how many home users are running XP Server or Advanced Datacenter?).

FTA: So certainly Intel is producing a new generation of chips that have CPU and GPU on the same die which share access to the cache--the L1 cache--coming out in maybe 2009.

You know, Cyrix tried something similar back in the late 1990s with their MediaGX 5x86 processor. Granted, the MediaGX did not have the level of integration that Intel is proposing, but one has to ask: is this really a good thing? Will the video run as a separate core, with a level of autonomy, or will it be more tightly coupled? Will this cause contention between the VPUs and ALUs on die?

Also, how many video cards does the average person have before they toss a system? My current K8/3800+ is on its second video card (upgraded from dual 6600GTs to a single 8600GTS). I'll most likely keep this system for another two years. Although I doubt it'll be my primary system by then, I do bet that it'll have a new video card.

Since the days of Cyrix and AMD keeping "outdated" sockets alive are over (remember the Am5x86 for Socket 3/5, or the K6-2/500 for Super7?), I suspect that the life cycles of sockets will get shorter (I think Socket A's longevity was a fluke). So, if GPUs/VPUs are integrated on-die, how can we keep systems updated when they are 3 or 4 years old? Will HyperTransport direct add-on GPUs be in our future?

FTA: [Nintendo] shipped off the shelf, cheapo, ATI video chips! And they're killing it! ... Nintendo correctly observes that graphics is no longer a differentiating feature; it's a commodity

The use of off-the-shelf components for consoles is nothing new. As an example, the Texas Instruments TMS9918 (and variants) were used in an arse-load of consoles during the mid-1980s (including the ColecoVision, Sega Master System, Sega Genesis, Sega Game Gear and others). It did quite well versus Nintendo's semi-custom chipsets at the time.

So, is it the same game, just with higher-end gear and more expensive R&D budgets? As ray-tracing takes over from current 3D technology, will new coprocessors designed specifically for that task be utilized? Yes, you could use more generalized processors (such as POWER, Cell or x64), but then, the original Voodoo cards could have been equipped with an MC68020, too. Right?

The social aspect (0)

Anonymous Coward | more than 6 years ago | (#22825722)

I think one thing the author is seriously forgetting here is the social aspect of console gaming. I've never seen 4-8 people crowded around a single computer laughing, talking shit, and having a great time. Consoles are far superior to the PC in this respect; you may still have online play, but nothing compares to playing Madden, Mario Kart, or any other group game on a big screen with a few friends.

If only for that one reason, consoles are here to stay.

yeah right (0)

Anonymous Coward | more than 6 years ago | (#22825770)

game consoles will go away??

No, just the opposite. Soon we'll see the PC replaced by appliances/consoles.

How about stand alone games? (0)

Anonymous Coward | more than 6 years ago | (#22825790)

There is a third alternative. We have consoles. We have PCs. We also used to have stand alone games. We could have them again.

Given that computing power keeps getting cheaper and smaller, it is reasonable that game cartridges could also contain the CPU and GPU. That solution would completely clear up the piracy problem. (Well, OK, it would make it hard for casual users to pirate games. There would still be a steady stream of pirated games, Rolex watches and Gucci handbags.)

Many posters talk about the problem with GPUs, video cards, etc. At some point, those problems will go away. Remember sound cards? Given ever-increasing computing power, the necessity for high-end video cards will eventually go away too. The other thing people seem to be ignoring is that many game players (of the LAN-party type) reduce their graphics quality to cut latency and gain a competitive advantage. Amazing graphics only go so far toward making a game more fun.

So, consoles or PCs? Maybe neither.

Console gaming and PC gaming - neither will die (1)

Werthless5 (1116649) | more than 6 years ago | (#22825808)

Both modes of gaming have their merits. PC gaming has free online play and better third-party support (mods, etc.; although there are some fine console modding communities, the PC mod community has always been and will always be bigger and better). Consoles are more appealing to the masses (you practically don't need a brain to run a console). Both platforms will have exclusive titles, and both have their fanboys. Some games are better played with a mouse and keyboard; other games are better played with a controller.

There are too many blithering idiots on the internet claiming that console or PC gaming is in jeopardy. It would be better for the rest of us to ignore the morons and realize that both forms of gaming have their merits, and that there is no compelling reason for either to die.

Yeah, BS on the DirectX guy... (1)

erroneus (253617) | more than 6 years ago | (#22825840)

The single most important thing for gaming, IMHO, is a consistent operating environment. PCs cannot deliver that. There will always be a faster, more capable machine out there. Consoles definitely have the right idea as one is the same as the next. This offers a more level playing field when competing. (Of course, the superior network connection ends up playing a role in cases where that's relevant, but it can't be helped.)

This guy is just doing what they did at Initech -- interviewing for his own job and justifying his existence.

"You don't understand! I'm good with people! I have people skills!"

Sigh... (1)

V!NCENT (1105021) | more than 6 years ago | (#22825856)

Only Microsoft could come up with such a conclusion... I mean, seriously: nothing compares to the power of a PlayStation 3. PCs as we know them today suck at games because the mainstream PC architecture simply isn't made for them.