
AMD Launches World's First Mobile DirectX 11 GPUs

kdawson posted more than 4 years ago | from the forty-nm-of-fast dept.


J. Dzhugashvili writes "Less than 4 months after releasing the first DX11 desktop graphics card, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture. The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems. AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint. The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates."


169 comments

And? (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#30686964)

Better rush out to get this so you can play the whopping 3 games that support DirectX 11.

Re:And? (0, Redundant)

symes (835608) | more than 4 years ago | (#30687340)

OK, so that's Minesweeper and Hearts... what's the third? Seriously though, I don't even have those games on my work computers. I'm more interested in knowing whether they work well with my software, MATLAB and so forth. There are other uses for computers.

Re:And? (1)

Jeremy Erwin (2054) | more than 4 years ago | (#30687454)

Are these cards fast enough to run the games in DX11 mode?

Re:And? (1)

Nadaka (224565) | more than 4 years ago | (#30687768)

At least the higher-end models will. They have 800, 400, and 80 stream processors respectively.

Re:And? (1)

Jeremy Erwin (2054) | more than 4 years ago | (#30687836)

Some of the reviews of "Dirt 2" had suggested that ATI 5xxx cards were up to 50% faster in DX9 mode.

Re:And? (1)

Nadaka (224565) | more than 4 years ago | (#30687886)

I wouldn't know, but the 800-stream-processor mobile card looks like it has very similar performance to the desktop 5xxx cards. Even at 75% speed, it should still be playable. Besides, DX11 is brand spanking new; I would expect it to take some time for the drivers to mature.

Re:And? (0, Troll)

Barny (103770) | more than 4 years ago | (#30688274)

This is ATI; their drivers don't mature, they ferment :)

As for the original article, I could have sworn I had adverts turned off on this site...

Driver Quality? (0, Troll)

statusbar (314703) | more than 4 years ago | (#30686970)

Perhaps this will improve the actual quality of the drivers, which have historically been so bad?

--jeffk++

Re:Driver Quality? (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#30687190)

You have no idea what you are talking about. Looks like you just tried to sneak in a first post...

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687284)

Perhaps this will improve the actual quality of the drivers, which have historically been so bad?

--jeffk++

Came here to say the same thing that you said so I'm out!

Re:Driver Quality? (3, Insightful)

stimpleton (732392) | more than 4 years ago | (#30687296)

1995 called and wants their "ATI drivers are crap" comment back.

Re:Driver Quality? (3, Insightful)

Anonymous Coward | more than 4 years ago | (#30687348)

2010 called and wants their ATI cards to run stably and stop crashing in any number of PC games: Borderlands, Saboteur, etc. There have been publicly known issues with the 5xxx line of cards causing system locks because of poor drivers and incompatibilities. http://www.joystiq.com/2009/11/03/borderlands-glitch-watch-2009-radeon-powered-pc-crashes/ [joystiq.com] http://www.evilavatar.com/forums/showthread.php?t=101665 [evilavatar.com] etc. etc.

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687502)

boredomlands works fine on my 5870 bro, sure you dont just have a shitty pc?

Re:Driver Quality? (1, Informative)

Anonymous Coward | more than 4 years ago | (#30687616)

Yes, it was a major issue when the game released, i.e. essentially every 5xxx series card getting hardlocks; the recent drivers appear to have more or less fixed it after Gearbox just ignored the issue for a couple of months. There was also a huge issue with Catalyst AI breaking textures, but that was also fixed in the last Catalyst update. I guess it's less of an ATI driver issue and more developers botching it these days; ATI is definitely doing better lately. See: http://gbxforums.gearboxsoftware.com/showthread.php?t=78815&highlight=radeon [gearboxsoftware.com]

Re:Driver Quality? (5, Insightful)

joshtheitguy (1205998) | more than 4 years ago | (#30687352)

1995 called and wants their "ATI drivers are crap" comment back.

Obviously you have never tried running Linux on a system with an ATI graphics card.

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687458)

I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687608)

I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.

The shitty AMD-supported fglrx Linux drivers do not support the 9800 on the latest kernels and versions of X.org. The Mesa open-source drivers don't count when it comes to highlighting the shittiness of AMD's Linux drivers; sure, a 9800 will work with Linux, but not by using AMD's drivers.

Re:Driver Quality? (2, Insightful)

PitaBred (632671) | more than 4 years ago | (#30688928)

AMD is developing the open-source drivers. It's paying people to work on them. Does that make them not AMD's drivers?

Re:Driver Quality? (2, Informative)

PixetaledPikachu (1007305) | more than 4 years ago | (#30689438)

I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.

I have a BenQ Joybook with an X300 and a Toshiba Satellite with an HD3470, and I have been running Ubuntu from 7.10 through 9.10 with ATI drivers on these machines. Issues such as flickering video and incompatibility between 3D acceleration and Compiz do exist, you know. I've only been able to run Google Earth on top of Compiz recently (9.04 & 9.10), if I'm not mistaken. Xinerama support, which was excellent in 8.xx, became unusable in 9.04. I couldn't hook the notebook up to a projector during the 8.xx series if Compiz was running.

Re:Driver Quality? (4, Interesting)

MostAwesomeDude (980382) | more than 4 years ago | (#30687564)

I have three in my system. :3

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687620)

I have three in my system. :3

You righteously deserve your nick. Congrats, you magnificent bastard.

Re:Driver Quality? (1)

armanox (826486) | more than 4 years ago | (#30687644)

Recently I've not seen an issue: HD 4550 in my desktop, HD 3200 in my lappy, Xpress 200M in my old lappy (which did have major driver issues in 2006).

Re:Driver Quality? (1)

cpicon92 (1157705) | more than 4 years ago | (#30688256)

Obviously you have never tried running Linux on a system with an ATI graphics card.

It works fine for me... I've never built an Nvidia system, and ATI graphics drivers have always come through for me.

Re:Driver Quality? (1, Informative)

BikeHelmet (1437881) | more than 4 years ago | (#30688586)

Obviously you have never tried running Linux on a system with an ATI graphics card.

Obviously you have never tried running Linux on a system with an Nvidia graphics card.

It's seriously a PITA to get new drivers working on a new kernel with an old card. Anything pre-GeForce 8 may have annoying issues. Not a problem for desktop Linux with a new video card, but if you were setting up a MythTV box on that old Athlon XP with a 6600GT, you may be in for a headache.

Avoid distros like Ubuntu with automatic kernel updates. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.

P.S. I've been jaded by automatic updates.

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687360)

Agreed. It's been some time since ATI/AMD has produced bad graphics drivers.

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687542)

Last week?

Re:Driver Quality? (0)

Anonymous Coward | more than 4 years ago | (#30687632)

You're joking, right? Not that Nvidia's are wonderful, but ATI's are still loaded with unhandled exceptions (read: BSOD) for applications that are NOT the top 3 latest games. Try running some 3D workstation apps or even demoscene stuff. ATI has a long way to go.

Re:Driver Quality? (2, Insightful)

cynyr (703126) | more than 4 years ago | (#30688458)

Hmm, XRandR support? New kernel support? Can I run against git sources, or just 1-2 releases back? Does it support the 57xx and 58xx cards yet? How about TV-out? Also, can I use the card "hard" (WoW raids) for 4+ hours and maintain uptimes of weeks? How about the current release of X.org? All of the above applies only to Linux.

Anyway, until then I'll be sticking with Nvidia cards.

Re:Driver Quality? (1)

perrin (891) | more than 4 years ago | (#30687892)

It was only 3 years ago that I gave up on ATI and switched to Nvidia, because ATI's drivers could not handle bad inputs and would crash the entire system. So I had to write my own abstraction layer to ensure that no bad point coordinates and so on could be sent to the driver. I also filed kernel crash bugs with ATI that took forever to get fixed. Since I switched to Nvidia, I have yet to see a single kernel failure due to programming mistakes. Their drivers are just rock solid, so much better to develop on that it would take a lot to go back. I also had much the same bad experience with the open source Intel drivers.

People Still Use DirectX??? (2, Interesting)

Anonymous Coward | more than 4 years ago | (#30687044)

Who the hell, other than the poor sods still doing x86 Windows-only game/graphics development, uses that turd of an API, DirectX?

Let's just go over the platforms I work on:

PC graphics development - OpenGL
Linux graphics development - OpenGL
Mac graphics development - OpenGL
Android graphics development - OpenGL ES
iPhone graphics development - OpenGL ES
Embedded ARM based system development - OpenGL ES

even some OpenGL for console development.

Re:People Still Use DirectX??? (1, Insightful)

Nabeel_co (1045054) | more than 4 years ago | (#30687106)

I think the vast majority of games are x86 Windows-based games.

You have to remember that x86 itself is a shitty architecture, and is only used because of Windows' dominance.
A RISC-based architecture would be much better suited to today's computers.

Re:People Still Use DirectX??? (4, Funny)

WaroDaBeast (1211048) | more than 4 years ago | (#30687176)

Maybe one of the big names over at Microsoft said at some point he wanted his employees to adopt the "lack of risk" mantra, but instead they all understood "lack of RISC." ;-)

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30687212)

What an inane comment.

You have to remember that the "vast majority" of games are using specialized instruction sets on both the CPU and the GPU, which long ago replaced the "shitty architecture" that was x86.

Re:People Still Use DirectX??? (3, Insightful)

Anonymous Coward | more than 4 years ago | (#30687336)

Is this 1990 again? Are we back to RISC vs. CISC? Intel and AMD showed that decoding CISC into RISC micro-ops can be just as fast as pure RISC. They gain some performance advantage in instruction cache hit rate vs. pure RISC, at the expense of some hardware logic. (This only comes into play when compared to very low-power devices.)

Re:People Still Use DirectX??? (2, Insightful)

sznupi (719324) | more than 4 years ago | (#30688558)

(This only comes into play when compared to very low power devices)

Which of course means "this only comes into play when looking at the most widespread devices, shipping at least an order of magnitude more units than x86."

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30688696)

The context of the article is still desktop.

Re:People Still Use DirectX??? (1)

sznupi (719324) | more than 4 years ago | (#30689136)

Well, laptop, actually ;p

But even in desktops, there are possibly more ARM cores than x86 ones: something in the monitor, something in the optical drive, perhaps the HDD controller, or the WiFi controller.

Re:People Still Use DirectX??? (2, Insightful)

Lunix Nutcase (1092239) | more than 4 years ago | (#30687562)

A RISC based architecture would be much better suited for todays computers.

Is this ignoring the fact that modern x86 chips from Intel are basically RISC chips with a CISC to RISC interpreter bolted on?

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30687824)

No, he just watched 'Hackers', so he's a CPU design specialist now.

Re:People Still Use DirectX??? (1)

Atzanteol (99067) | more than 4 years ago | (#30687842)

1997 called. They want their troll back.

Re:People Still Use DirectX??? (2, Informative)

Nabeel_co (1045054) | more than 4 years ago | (#30688052)

OK, so...
Xbox 360: RISC
PS3: RISC
PS2: RISC
iPhone: RISC
Most, if not all, mobile devices: RISC
Wii: RISC
Sun systems: RISC
Need I go on?

If you need power and efficiency, you use RISC. Always. Try to come up with anywhere near as many examples for CISC.

Re:People Still Use DirectX??? (2, Informative)

Lunix Nutcase (1092239) | more than 4 years ago | (#30688114)

And that's why modern x86 processors are basically RISC processors with a decoder on them for legacy x86 instructions. Your comments haven't been insightful for quite some time now.

Re:People Still Use DirectX??? (2, Informative)

Nabeel_co (1045054) | more than 4 years ago | (#30688450)

So then -- and this is a genuine question -- why are RISC-based devices so much more powerful while using a lower clock speed and consuming less power?

For example, this video was recently referenced in a /. post a few days ago: http://www.youtube.com/watch?v=W4W6lVQl3QA [youtube.com]

where an Atom processor at 1.6 GHz was just about on par with a 500 MHz ARM-based processor.

Re:People Still Use DirectX??? (0, Troll)

sexconker (1179573) | more than 4 years ago | (#30688508)

link to article on microcode
something something
turn in your geek card
something something

Re:People Still Use DirectX??? (1)

Nadaka (224565) | more than 4 years ago | (#30688352)

That is debatable. But the competition is good. RISC isn't a miracle cure, but I like where ARM has been going the last few years. Hopefully the next year will see some Cortex-A8 or Cortex-A9 chips approach the performance of x86 chips (Atom, at least).

Qualcomm's Snapdragon is based on the Cortex-A8 with a ton of custom development work. I have not really seen much in high-performance Cortex-A9 chips yet, but they are supposed to be on the way.

Re:People Still Use DirectX??? (2, Informative)

confused one (671304) | more than 4 years ago | (#30687330)

And that makes perfect sense if you're targeting all those different platforms. There may even be perfectly reasonable reasons to use OpenGL over DirectX based on your coding requirements and the APIs. However, if your target audience is Windows and Windows Embedded only, and there are no requirements that are better served by OpenGL, there's no reason not to use DirectX.

It's just a tool.

Re:People Still Use DirectX??? (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#30687588)

It's just a tool.

And I reserve the right to call you a Microsoft tool, too. No, seriously: when you target your application for Windows ONLY, you are a Microsoft tool. (Yes, it's the Damn the Man card.)

DirectX, like any Microsoft technology, is synonymous with vendor lock-in.

Re:People Still Use DirectX??? (3, Insightful)

kestasjk (933987) | more than 4 years ago | (#30688028)

The Microsoft tool's dilemma: Should I stop making money selling software, or risk being called a Microsoft tool by an anonymous coward on /. (who writes iPhone apps, no vendor lock-in there of course).

Re:People Still Use DirectX??? (3, Interesting)

Lodragandraoidh (639696) | more than 4 years ago | (#30688888)

I read your post and it occurred to me that it illustrates perfectly a key problem with software development today: short sightedness.

In an age of fast multiprocessing, it only makes sense to do everything you can to create abstraction layers that will ensure:

1. My software will have the widest possible audience regardless of platform. $$$

2. I will be able to extend the application, or create a new one with minimal effort by reusing modules I've already created to do hard things well/fast. $$$ (in form of turn-around time/effort)

3. If a vendor decides to break something in their firmware/hardware - I only have to fix one module that drives the given hardware - *NOT* the application itself. $$$ (ditto)

Flexibility, resiliency, more cash in your pocket... I don't see a downside to taking this approach. On modern gaming rigs in particular, there is no reason NOT to use OpenGL, for all its perceived limitations compared to a tweaked-out DirectX x86 app.

As a gamer myself, I look at it from another angle: I have Linux and Mac machines as well as a high-end Windows game rig. To host games cost-efficiently (I like to create and share my own maps/scenarios in some games), I prefer to use the Linux server and play on my Windows box, using and tweaking WINE in order to run the game (I'm not made of money and can't cost-justify a full complement of Windows servers, which would also waste resources since I am a *nix developer too). Getting WINE to work with some of the niche games I play is a royal pain. If the developers of said games took my advice, I would be running their games natively under Linux with minimal headaches.

Flexibility and choice are good for the widest audience. Vendor lock-in is bad, and only serves a few types of people (the corporation$$$ and the simple gamer$$$). The funny thing is, these companies stand to make more money than they would under their lock-in strategy if they would think long term and build flexible, extensible applications that benefit the largest audience. Luckily for me, most of the titles I currently enjoy have taken this approach; I will continue to gravitate to those that do, and deny $$$ to those that won't.

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30687518)

You don't play many Windows games, do you? Most of them use DirectX, IIRC.

Re:People Still Use DirectX??? (1)

jgtg32a (1173373) | more than 4 years ago | (#30687930)

Most?
Besides id Tech, are there really any engines that use OpenGL?

Re:People Still Use DirectX??? (1)

Ash-Fox (726320) | more than 4 years ago | (#30688416)

Besides id Tech, are there really any engines that use OpenGL?

Off the top of my head... APOCALYX, Irrlicht, Espresso3D, AGE3D, Vortex3D, Qube, Cube, Cube 2, HeroEngine, Aleph One, Unreal Engine 3, Axiom Engine, Crystal Space, DarkPlaces, Allegro, Exult...

And if you want more, you'll have to search for it yourself.

Re:People Still Use DirectX??? (2, Insightful)

MBGMorden (803437) | more than 4 years ago | (#30687754)

Yeah, those "poor sods" making multi-million-dollar-grossing titles. Seriously, I'm all for OpenGL. I like it because it does make ports easier, and I'd like to see more games available on Linux and Mac.

The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying, though.

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30687998)

Market leader != automatic best choice/win in every context. I'm sick of business types, or nerds pretending to be business types because they happened to make a bit of money, pretending this falsehood is true. It's just another version of the popularity fallacy.

Re:People Still Use DirectX??? (1)

Lodragandraoidh (639696) | more than 4 years ago | (#30689046)

As I point out at length in my post above, ironically they would make more money if they did cater to a wider audience. Of course, that would require 2 things:

1. Long Term Thought.

2. Abandoning unmaintainable, hard-coded, monolithic program cores that spin on tweaked-out, low-level DirectX hardware APIs...

Unfortunately, everyone wants to be a rock star, so everyone is more concerned about the size of their bank account without consideration for the size it would be if they made titles that endured and morphed quickly with the widest possible audience. "But it's fast... and I make a lot of money from it...", they respond.

Sometimes it is like speaking to a brick wall.

Re:People Still Use DirectX??? (1)

TheKidWho (705796) | more than 4 years ago | (#30689322)

Your "obvious" ideas require a significantly higher level of investment which 99% of the time won't pay off.

Sometimes it's like listening to an idiot harp on.

Re:People Still Use DirectX??? (0, Troll)

sexconker (1179573) | more than 4 years ago | (#30688536)

The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying though.

Are people STILL being snide on the internet?

Most of the game world (3, Interesting)

Sycraft-fu (314770) | more than 4 years ago | (#30687810)

As well as a good deal of other Windows graphics programs. You can stick your head in the sand and pretend that Microsoft Windows isn't a major player, but you are only fooling yourself. Windows development matters a whole lot, and DX is the native API, so many use it.

However, in this case the reference is to features of the card. See, OpenGL is really bad about staying up to date with hardware. They are always playing catch-up, and often their "support" is just to have the vendors implement their own extensions. So when a new card comes out, talking about it in terms of OpenGL features isn't useful.

New versions of DirectX, on the other hand, map neatly onto new hardware features. The reason is that MS works with the card vendors: they tell the vendors what they'd like to see, the vendors tell them what they are working on for their next-gen chips, and so on. So a "DX11" card means "a card that supports the full DirectX 11 feature set." This implies many things, like 64-bit FP support, support for new shader models, and so on, and it can be conveniently summed up as DX11. This sets it apart from a DX10 card like the 8800: while that can run with DX11 APIs, it doesn't support the features. Calling it DX10 means it supports the full DX10 feature set.

So that's the reason. If you want to yell and scream how OpenGL should rule the world, you can go right ahead, however the simple fact of the matter is DirectX is a major, major player in the graphics market.

Re:Most of the game world (1)

RAMMS+EIN (578166) | more than 4 years ago | (#30688958)

``However, in this case the reference is to features of the card. See OpenGL is really bad about staying up to date with hardware.''

How can that be, when it allows vendors to add their own extensions? Add a feature to your hardware, add an extension to OpenGL so programmers can use it. No need for delays.

``They are always playing catchup and often their "support" is just to have the vendors implement their own extensions.''

Is there a problem with that? I mean, yes, it would be nicer if features were immediately also included in a standard that was also supported by other vendors, but you can't have it both ways. If progress comes from the vendors, the standard is going to be behind. If progress comes from the standard, the vendors are going to be behind. At least OpenGL defines an extension mechanism that is used by vendors to provide access to new features, while otherwise remaining compatible with standard OpenGL.

``So when a new card comes out, talking about it in terms of OpenGL features isn't useful.''

Perhaps, but, personally, "DirectX 11.0" doesn't tell me anything, either. I suppose it's great to know the DirectX level if you code to DirectX, though. For me, it would be more useful to know which OpenGL version and extensions were supported.

Now, don't get me wrong. I think Microsoft is doing some great work with Direct3D. I doubt there would be as much progress and coherence if it wasn't for Direct3D. It's just a pity they had to go and make their own, incompatible API when an open standard already existed.

X [] O (1)

tepples (727027) | more than 4 years ago | (#30687814)

Xbox 360 graphics development - DirectX
XNA (Xbox 360 indie games) graphics development - a managed API based on DirectX

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30687888)

Ever hear of the Xbox? What underlying code do you think it uses? It's called the Xbox, with the X meaning DirectX. It is not entirely the same as on the PC, but it shares some of the same underpinnings.

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30688050)

I still remember Half-Life 1 / Counter-Strike, beta through 1.6, and getting much better results with OpenGL than with the DirectX crap at the time (was it 5 or 6?). Since then, it seems most modern games require DirectX and no longer offer the OpenGL option, which is a true shame.

Re:People Still Use DirectX??? (0, Flamebait)

TheKidWho (705796) | more than 4 years ago | (#30688408)

Why is it a shame?

Because it's using M$'s technology? lol.

Re:People Still Use DirectX??? (0)

Anonymous Coward | more than 4 years ago | (#30688998)

Why is it a shame?

Because it's using M$'s technology? lol.

Perhaps it's a shame because Microsoft doesn't support DirectX on the Mac, Linux, iPhones, the Wii, or the PlayStation. While it's great to use it in a Microsoft-only world, there *are* other systems out there, and Microsoft has no plans to support DirectX on such platforms.

On the other hand, OpenGL does have Windows support, for the sanity of developers not in game companies with billions of dollars who can write their code twice: once for Microsoft platforms and once for non-Microsoft platforms.

Re:People Still Use DirectX??? (4, Insightful)

Kjella (173770) | more than 4 years ago | (#30688232)

Only all the AAA games on Windows, but clearly you are far more important than them.

Re:People Still Use DirectX??? (2, Informative)

shutdown -p now (807394) | more than 4 years ago | (#30689214)

Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?

I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:

"DX9 is really quite a good API level. Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with."

(the original interview that contained that quote seems to be offline, sadly, so I cannot give you the primary source, but googling for that phrase should give plenty of secondary sources)

Starry Starry Night (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30687184)

If there was a big tech shocker this Christmas, it was the fact that the Kindle was the top selling gift of all time, according to multiple news sources. Really? If so, did the entire industry miss the mark, leaving the space open to a guy whose company specializes in online books sales? A guy who has never really entered the hardware space in any meaningful way before, except as an investor?

How did this happen?

The kicker here is the fact that numerous failed attempts at a pad machine have been made, beginning with the imaginary Dynabook in the 1970s and including various WinPads and other tablets right up to the Microsoft announcement of a tablet platform a few years ago. You remember that, right? This was the platform that was set to dominate all computing by 2000 or 2004 or whatever. In the meantime, a slew of "convertible" laptops evolved and subsequently ended up in the trash heap of innovation.

Apparently all anyone actually wanted was a device they could read books on. Of course, this may only be the beginning, should Apple come out with its iSlate or iPad--what I prefer to just call a Giant iPod Touch.

So, what changed? Why are we making this sort of platform shift, all of the sudden?

How's this for an idea: This would have happened a long time ago, were it not for Microsoft. The same holds true for the smartphone. It would have become an object of desire a lot sooner, had Microsoft laid off.

The software giant has been a hindrance to progress ever since it transformed from a subversive company to a kind of IBM-clone to a lackey for big business. There's no connection between the company and the end-user anymore.

To understand this, we need to travel back to the introduction of Excel. The dominant spreadsheet app used to be Lotus 1-2-3. The program put Lotus at the top of the software heap. The company was bigger than Microsoft--by a lot. Microsoft had a crummy spreadsheet program called Multiplan at the time, but it discovered a guy who was working on a Lotus killer. His app became Excel and Microsoft ran a series of humorous ads featuring people skulking around an office, showing each other Excel in a manner that suggested their fictitious company wouldn't approve of the use of the unauthorized product. All of the workers continued to use it on the sly, because they could get so much more work done.

The ad was totally subversive. It fit right in with the mid-1970s PC revolution, where attendees at those early computer conferences would boo IBM's name. The entire "micro-computer" scene was very subversive in that way. But Bill Gates said that he'd like Microsoft to become the software version of IBM. It's since managed that feat, and now Microsoft is as likely to get booed as IBM was back in 1976.

And you know what? Judging from the company's recent history, it deserves it. Microsoft is the post-modern IBM, serving as the same sort of hindrance to the scene that IBM was 50 years ago. Microsoft is hindering the industry with its vision, stumbling into places it doesn't belong. The company is like the big, dumb rich kid you don't want at your party. He comes anyway and knocks over the punchbowl--again. The company ruins markets just by showing up.

Microsoft entered the smartphone space early on and slowed things down in the process because nobody wanted to compete with its money and fickle approach. It's not worth risking that the company would take your good idea, re-brand your idea, and ruin the whole thing for everyone. It's a scorched earth policy. Examples are everywhere. Look at FrontPage, a very functional HTML editor bought and branded as a Microsoft product. The company kept mucking with the app until it was useless. The product line was eventually shuttered. HTML and page editing gravitated to the Mac in order to avoid Microsoft (the aforementioned big, dumb rich kid). That Mac software was eventually re-coded for Windows.

Microsoft's tablet was doomed from the beginning--actually, it's been doomed numerous times over the years. The company was lording it over everyone, while making bogus predictions about the potential of its vision. The smartphone story is the same. We would have no smartphone industry without Steve Jobs's iPhone.

I'm not writing this to slam Microsoft. The company is amazing, but once it lost its subversive edge, it lost its ability to be anything more than a functionary company--that big, dumb uninvited rich kid--doing the bidding of big business. Microsoft is a well-paid lackey. Gates himself used to go to user group meetings. The company pulled the plug on all of that, and the scene rapidly declined as the corporate shills took over.

Now we're seeing the emergence of the next Microsoft-free zone--the Kindle and the Apple iDunno. Finally, a touchscreen computer that people actually want. The Kindle was beyond Microsoft's grasp. This became apparent when the company decided to stop scanning books to compete with Google. Someone asked why they stopped, and I guess the answer was, "I don't know. Does anyone even read anymore?" It was also beyond them because it has no relation to the big corporations, amongst which Microsoft has settled. I'm sure it's comfortable there, but it doesn't do me or anyone else outside of the corporate structure any good, does it? Microsoft should just get out of the smartphone business and the tablet game. It wasn't invited in the first place.

Blue and Gold, or is that Silver, Vincent?

Re:Starry Starry Night (1)

Lodragandraoidh (639696) | more than 4 years ago | (#30689340)

You forgot to mention the Apple Newton. There are people STILL using their Newtons today. It had handwriting recognition years ahead of anything comparable, plus communication capabilities. When Steve Jobs went to NeXT, Mr Sculley(sic) (another corporate droid - in the theme of your post) shut it down.

Over the years, Newton enthusiasts have asked the company many times to release the code so they could port their beloved operating system to newer hardware as their Newtons died of old age. Apple always refused. I can only speculate, but I think this new device may leverage the experience of the Newton - if not the code - to resurrect the tablet PC concept.

I would rather eat a bag of computer screws... (0)

Anonymous Coward | more than 4 years ago | (#30687402)

Than buy another graphics card powered by ATI. Seriously, my X1300xt was only supported for like 3 years. Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.

To ATI: Support your products for at least 5 years (like NVidia does), or I will *never* buy from you again!

Re:I would rather eat a bag of computer screws... (1)

StayFrosty (1521445) | more than 4 years ago | (#30687586)

ATI dropped support from their binary drivers because these cards are supported by the OSS driver. They aren't good enough to game on, so why would anyone want to run the proprietary ATI driver with this card anyway?

Re:I would rather eat a bag of computer screws... (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30687736)

The older cards are still capable of the minimal acceleration needed for desktop effects, yet the MESA drivers seem incapable of providing reliable 3D acceleration for said desktop effects.

Re:I would rather eat a bag of computer screws... (1)

StayFrosty (1521445) | more than 4 years ago | (#30688344)

I know both of our arguments here are anecdotal, but 3D acceleration and compiz-fusion are running great on my laptop with a mobile x1270 with the MESA drivers. 3D just worked right out of the (figurative) box. The motherboard that used to be in my HTPC had an x200 onboard and 3D acceleration worked fine with minimal effort on that machine as well.

Re:I would rather eat a bag of computer screws... (0)

nxtw (866177) | more than 4 years ago | (#30688044)

Than buy another graphics card powered by ATI. Seriously, my X1300xt was only supported for like 3 years. Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.

This sounds more like a Linux/X.org problem than an ATI problem. Users of old graphics cards in Windows can keep using the old drivers, even in newer operating systems. Even Windows 2000/XP drivers continue to work in Windows 7/2008 R2, although without the features made possible with newer drivers.

Re:I would rather eat a bag of computer screws... (1)

PitaBred (632671) | more than 4 years ago | (#30688898)

The architecture changed significantly. Not to mention that it actually has more functionality under the open-source driver than it ever did under the closed-source one. What in the hell are you bitching about?

Re:I would rather eat a bag of computer screws... (0)

Anonymous Coward | more than 4 years ago | (#30689050)

Original poster here. I'm bitching because the open source drivers don't provide the same level of performance for games that the proprietary drivers do. If they're going to drop support for the proprietary drivers, the open ones should be *just as fast* as the closed ones were.

Although the X1300xt is an older card, it is still quite powerful and able to run games from a year or two ago pretty well. Take Penumbra or Sauerbraten or Nexuiz, for example.

Innovationz!!!! (3, Funny)

NotBorg (829820) | more than 4 years ago | (#30687500)

DirectX 11 in a mobile device? So the device doubles as a hairdryer?

Re:Innovationz!!!! (2, Informative)

mikael (484) | more than 4 years ago | (#30687672)

Embedded systems may only be using a screen resolution of 640x480 or 800x600 rather than dual monitor 2048x1536. That's one energy/time saving. Then there won't be 900+ stream processors like the high-end gaming cards, there might just be 128 or 256. There's another saving. Anti-aliasing will be disabled as well, so that saves some processing time and power as well.

You will still have texture mapping and shadowing effects using fragment shaders, just not as many triangles as the current gaming engines push with all the effects turned on.
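As a back-of-the-envelope illustration of the parent's point (the resolutions and sample counts below are just the examples from the comment, not measured figures), the raw number of samples the GPU has to shade drops dramatically at embedded resolutions with anti-aliasing off:

```python
# Illustrative only: raw shaded-sample counts for the scenarios above.
def samples(width, height, aa_samples=1):
    """Pixels per frame times anti-aliasing samples per pixel."""
    return width * height * aa_samples

desktop = samples(2048, 1536, aa_samples=4)  # high-end desktop res with 4x AA
mobile = samples(800, 600)                   # embedded panel, AA disabled

print(desktop // mobile)  # the mobile part shades roughly 26x fewer samples
```

That ignores triangle counts and shader complexity entirely, but it shows why resolution and AA alone buy a mobile part so much headroom.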

Re:Innovationz!!!! (1)

WilyCoder (736280) | more than 4 years ago | (#30687712)

Um, the mobility 5870 has 800 Stream Processing Units and utilizes 1.04 billion 40nm transistors.

http://www.amd.com/us/products/notebook/graphics/ati-mobility-hd-5800/Pages/hd-5870-specs.aspx [amd.com]

Re:Innovationz!!!! (1)

mikael (484) | more than 4 years ago | (#30688572)

Thanks for that link - I'm amazed that the energy demand for all of that is only 50 watts, compared to the 300 watts required in the past for other cards. You could have a cluster of those and still use a standard electric socket.
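The "standard socket" claim checks out with simple arithmetic (assuming a North American 15 A / 120 V circuit; the exact budget depends on your wiring and on the rest of each system):

```python
# Illustrative: how many 50 W GPUs a typical household circuit could feed.
circuit_watts = 15 * 120           # 15 A at 120 V = 1800 W of capacity
gpu_watts = 50                     # quoted power draw per mobile GPU

print(circuit_watts // gpu_watts)  # 36 GPUs of headroom, GPUs alone
```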

Linux support is coming, we promise! (4, Informative)

MostAwesomeDude (980382) | more than 4 years ago | (#30687572)

Support in the open-source drivers is being written as fast as ATI can verify and declassify docs. Also the r600/r700 3D code should be mostly reusable for these GPUs.

Re:Linux support is coming, we promise! (2, Interesting)

owlstead (636356) | more than 4 years ago | (#30687828)

How many years was it again that they promised to produce open source graphic drivers for Linux? I've lost count and have ordered a new motherboard with a silent Nvidia based graphics card because I just *HAD* it with ATI on Linux. My AMD chipset motherboard also had a lot of SATA instability under Linux and I had all kinds of problems letting the system know how to read any of the CPU's censors (X2 Phenom based CPU). So I have just ordered an Intel based CPU/chipset as well.

I've no doubt that AMD is slowly working with the community to get better support, but their current binary offering sucks balls; there is no other way to describe it. Having a discrete graphics chip with video decoding capabilities should not mean you can use only one or the other, never both at the same time. Turning my monitor 90 degrees? Forget it, greyed out. And that's just the start of things.

And don't tell me how to do things, I've been running Linux since my first slackware CD's and even now I have not a single good idea on how to fix these issues, even after googling for hours on end. If I can't get this right, then only very hard core Linux programmers can.

Re:Linux support is coming, we promise! (3, Interesting)

Kjella (173770) | more than 4 years ago | (#30688030)

How many years was it again that they promised to produce open source graphic drivers for Linux?

Announced: September 7th, 2007: press release [amd.com]

Since then they've been catching up more and more; the HD5xxx/Evergreen/R800 instruction set was posted before Christmas, so the docs are almost up to date, minus a few things like UVD2. Also, AMD promised to help the open source community, not write the whole thing themselves. It's making big strides, but there's also a lot of rework going on in X.org to support a modern desktop.

Re:Linux support is coming, we promise! (1)

Randle_Revar (229304) | more than 4 years ago | (#30689102)

Are you that guy from the Debian mailing list? You sound like him - unjustly bashing ATI/AMD, misrepresenting their statements, and exaggerating the problems ATI on Linux has.

The fact is that even the binary drivers (yuck) are much better than they used to be, and the Free drivers are moving along by leaps and bounds. AMD has done very well with their promise to deliver documentation, and the Xorg guys are improving drivers as fast as they can, given limited manpower and a rather large amount of (needed) churn in Xorg (DRI2, KMS, TTM/GEM, Gallium3D) that they need to keep up with.

I currently have Intel, but in 2007/2008 I had an r300, and it worked very well (free drivers). I have a lot of confidence in the Radeon driver, and sometime soonish I will probably get an r700 or r800.

Re:Linux support is coming, we promise! (1)

owlstead (636356) | more than 4 years ago | (#30689326)

Are you that guy from the Debian mailing list? You sound like him - unjustly bashing ATI/AMD, misrepresenting their statements, and exaggerating the problems ATI on Linux has.

No, for a normal user they are unusable. Any less advanced user would not have spent that many hours configuring a graphics card.

The fact is that even the binary drivers (yuck) are much better than they used to be, and the Free drivers are moving along by leaps and bounds. AMD has done very well with their promise to deliver documentation, and the Xorg guys are improving drivers as fast as they can, given limited manpower and a rather large amount of (needed) churn in Xorg (DRI2, KMS, TTM/GEM, Gallium3D) that they need to keep up with.

Oh, I'm not bashing anyone, trust me on this.

I'm just this guy waiting on some kind of normal display/sound drivers on my Linux computers. Currently, doing anything even slightly beyond running vesa or nvidia for graphics and very basic sound stuff sucks on Linux (or at least on the last 4 Ubuntu versions I tried). Don't mistake this comment for "Linux sucks". I love the way many things like package mgmt and source control work in Linux. But also don't forget that good touchpad, display and sound card support are paramount for any real "Linux on the desktop". After so much time, any promise of better support would get anyone down.

I currently have Intel, but in 2007/2008 I had an r300, and it worked very well (free drivers). I have a lot of confidence in the Radeon driver, and sometime soonish I will probably get an r700 or r800.

Well, I'm looking at a screen driven by my laptop with Intel graphics now (karmic koala), and if Windows gave me this kind of shit I would SERIOUSLY complain to Microsoft. It's a good thing compiling the OpenJDK works so much better under Linux, because seriously, the amount of time it takes to get things even slightly right is bugging the hell out of me.

It's amazing how far Linux has come, and it is at least as amazing how far it still needs to go.

Re:Linux support is coming, we promise! (0)

Anonymous Coward | more than 4 years ago | (#30689226)

and I had all kinds of problems letting the system know how to read any of the CPU's censors

I knew DRM had gotten bad but this crosses the line. Do you know what your chip censors? I hope it's not the XOR operations.

And don't tell me how to do things, I've been running Linux since my first slackware CD

And here I thought you had been running linux since you created a mirror of linus' source! I've never heard of anyone installing this linux thing from a distributions install disc! Amazing!

Re:Linux support is coming, we promise! (0)

Anonymous Coward | more than 4 years ago | (#30689234)

The point of free software and open specifications is that YOU can write the code and YOU can share it with the world. If you believe that FLOSS means everyone else does the leg work while you just sit back and wait for them to hand it to you, then you have it all fundamentally, profoundly wrong.

Re:Linux support is coming, we promise! (2, Insightful)

Sir_Lewk (967686) | more than 4 years ago | (#30687840)

At first glance, from the subject line, I thought your post was a snide comment about the state of official ATI drivers on linux. I must say though, you guys are doing an excellent job at picking up ATI's slack.

Re:Linux support is coming, we promise! (1)

MemoryDragon (544441) | more than 4 years ago | (#30687992)

Not fast enough. I dumped my perfectly fine Radeon 4850 in favor of a somewhat slower NVidia card. The reason was that X support was hit and miss: half the 3D functions crashed X, others worked. I then dropped in my NVidia card and everything worked out of the box.
I do not care how many years we've gotten promises; the Linux drivers suck donkey balls, and probably will forever.
Wake me up when the stability is up to NVidia's offerings, or, shock, the Intel open-source drivers.

Re:Linux support is coming, we promise! (1)

Ash-Fox (726320) | more than 4 years ago | (#30688162)

Support in the open-source drivers is being written as fast as ATI can verify and declassify docs.

Personally, I've not been impressed with the 'correct' opensource effort when it comes to 3D acceleration support, see my blog entry [livejournal.com] for more details.

ATI at it again... (1, Informative)

GooberToo (74388) | more than 4 years ago | (#30688724)

Just upgraded my brother's laptop over the holiday. Seems ATI dropped support for his GPU in their proprietary driver so now he has a choice. Option one, use the open source drivers which provide no 3d acceleration. Basically 3D is completely unusable. Option two, use an older distribution which has the required version of X, kernel support, and all dependent software. And with the second option comes all the associated security issues of running an old and unsupported distro. He chose to run a current distro and be stuck with 2d-only acceleration. All of the 3d games he had on his laptop are now completely unplayable; measured in fractions of frames per second.

It turns out ATI decided they would simply stop supporting his GPU and AFAIK, they have not released any 3D documentation on it. This is exactly the reason I've gone out of my way to never buy ATI. They drop support of cards like crazy leaving users completely stuck. And in something like laptops, which is exactly what this article is about, that means your entire laptop is now obsolete.

I don't care how many ATI fanboys want to bash NVIDIA for providing binary blobs - the fact is, their stuff works and works well and, best of all, they don't leave their users high and dry. The only problems I've had with NVIDIA were years ago when they first started providing 64-bit Linux drivers. So say what you will to support ATI; at the end of the day, they are still doing the same old thing and hurting their customers. Case in point: I have an nvidia video card which is older than my brother's laptop and is still supported by NVIDIA's drivers.

So what do you want as a user? Stuff that works year after year or a company (ATI) telling you when your equipment is obsolete and that you need to replace the entire computer?

For Linux there is still only one 3D option - NVIDIA. Period.

Re:ATI at it again... (1)

Randle_Revar (229304) | more than 4 years ago | (#30689166)

>Seems ATI dropped support for his GPU in their proprietary driver so now he has a choice. Option one, use the open source drivers which provide no 3d acceleration.

Bullpucky. Any/all cards that are not supported by the binary drivers do have 3D support from the OSS drivers.

>For Linux there is still only one 3D option - NVIDIA. Period.

Funny, my experience with 3D with both Intel and ATI has been great

Just in time...? (1)

billsayswow (1681722) | more than 4 years ago | (#30688254)

Good thing they got these out on the market. They were about to kill off DX10, just like they killed off DX9... oh wait...

What good is great hardware ... (0)

Anonymous Coward | more than 4 years ago | (#30688936)

...without great drivers?

Seriously.

I know everyone says ATi is so much better now, but as frequently as they drop support for older hardware (and as badly as I was "burned" by my last two cards from them), I still have no desire to purchase their hardware.

nVidia delivers drivers for platforms that have an even smaller market than GNU/Linux does for 3D such as Solaris, FreeBSD, etc. which greatly benefits me as a user of those operating systems.

What motivation do I have to buy ATi cards when their support for non-Windows operating systems has been so crap for so long?
